ESPC Air-Ocean-Land-Ice Global Coupled Prediction


1 ESPC Air-Ocean-Land-Ice Global Coupled Prediction
ESPC Infrastructure
- HYCOM in CESM (Fei Liu/NRL, Alex Bozec/FSU, Gerhard Theurich/NRL, Walter Spector/ESRL, Mat Rothstein/CU, Mariana Vertenstein/NCAR, Jim Edwards/NCAR, …)
- ESMF integration with MOAB (Bob Oehmke/CU)
- Accelerator-aware ESMF (Jayesh Krishna/ANL, Gerhard Theurich/NRL)
- Earth System Prediction Suite (Cecelia DeLuca/CU, Gerhard Theurich/NRL, Rocky Dunlap/CU, …)
ESPC Air-Ocean-Land-Ice Global Coupled Prediction Project Meeting, Dec. 2-3, 2015, Arlington, VA

2 HYCOM in CESM
Technical status:
- Completed HYCOM technical integration into CESM using NUOPC interfaces (winter-spring 2015).
- CESM “BHY” compsets include the Community Atmosphere Model (CAM), Community Land Model (CLM), CICE sea ice, HYCOM, and the River Transport Model (RTM).
- The NUOPC version validates bit-for-bit against the original CESM when using the POP ocean.
- Updated to recent versions of HYCOM (2.2.98) and ESMF/NUOPC (v7 beta snapshot 60).

3 CESM-HYCOM Optimization
- Performance of the CESM NUOPC-based code was compared to the original CESM code (summer 2015).
- Initial measurements showed an unacceptable ~20-25% slowdown for the NUOPC-based version in an all-active configuration.
- Moved to simpler configurations (e.g. one processor, dead models) for evaluation and optimization.
- A first set of optimizations (a change to field dictionary lookup, a fix to control flow) reduced the difference to 15%.
- A second set of optimizations (minimizing the need for character comparisons) reduced the slowdown in the NUOPC version to 5%.
- Additional evaluation and optimization may be needed.
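As an illustration of the second optimization, the sketch below shows the general pattern of resolving string-keyed lookups once during initialization and then using cached integer indices in the run loop. It is not the NUOPC Layer code; the dictionary contents and all names are made up.

```fortran
! Illustrative only: replace repeated character comparisons in a per-step loop
! with integer indices resolved once at initialization.
program cache_field_lookup
  implicit none

  ! Hypothetical stand-in for a string-keyed field dictionary.
  character(len=16), parameter :: dictNames(4) = [character(len=16) :: &
       "sea_surface_temp", "air_pressure", "u_wind", "v_wind"]
  character(len=16), parameter :: wanted(2) = [character(len=16) :: &
       "u_wind", "v_wind"]
  integer :: cachedIndex(size(wanted))
  integer :: i, step

  ! Initialization: do the string matching once and remember integer indices.
  do i = 1, size(wanted)
    cachedIndex(i) = findByName(trim(wanted(i)))
  end do

  ! Run loop: only integer indexing here, no per-step character comparisons.
  do step = 1, 3
    do i = 1, size(wanted)
      print '(a,i0,a,a)', "step ", step, ": exchanging ", trim(dictNames(cachedIndex(i)))
    end do
  end do

contains

  integer function findByName(name) result(idx)
    character(len=*), intent(in) :: name
    integer :: k
    idx = -1
    do k = 1, size(dictNames)
      if (trim(dictNames(k)) == name) then
        idx = k
        exit
      end if
    end do
  end function findByName

end program cache_field_lookup
```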

4 HYCOM Memory Optimization
- Memory issues in ESMF StateReconcile(), which is used to set up data transfers when using concurrent components, caused problems for running HYCOM on larger processor counts.
- This was a showstopper for the upcoming Initial Operating Capability of the ESPC system at NRL.
- Changes were implemented in the StateReconcile() algorithms during fall 2015.
- Result: associated memory on 1024 processors was reduced from 1.4 GB to ~190 MB, enabling larger processor count configurations to run.
- The new ESMF/NUOPC version (v7 beta snapshot 60) is currently being tested by James Chen at NRL.
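For context, the sketch below shows roughly where ESMF_StateReconcile() sits in a driver that runs components concurrently. The subroutine and variable names are illustrative, not the ESPC driver code.

```fortran
! Minimal sketch: after a component running on a subset of PETs has added Fields
! to its export State, the driver reconciles that State across the full VM so
! every PET holds a proxy for objects it did not create itself.
subroutine reconcile_ocean_export(oceanExport, driverVM, rc)
  use ESMF
  implicit none
  type(ESMF_State), intent(inout) :: oceanExport  ! export State filled by ocean PETs only
  type(ESMF_VM),    intent(in)    :: driverVM     ! VM spanning all driver PETs
  integer,          intent(out)   :: rc

  ! PETs that did not run the ocean receive lightweight proxies of the Fields in
  ! oceanExport, so regrid/redistribution route handles can be computed globally.
  ! (The fall 2015 work reduced the memory this call needs at high PET counts.)
  call ESMF_StateReconcile(oceanExport, vm=driverVM, rc=rc)
end subroutine reconcile_ocean_export
```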

5 ESMF/NUOPC v7 Release
The last public, non-patch release was 1/31/14 (ESMF v6.3.0r). ESMF v7 is now frozen in preparation for release (at ~beta snapshot 60), scheduled before the end of the year. It includes the performance and memory optimizations for ESMF/NUOPC developed under this NOPP award, along with:
- Parallel I/O (DOE/NCAR PIO) on by default, and newly developed NUOPC asynchronous I/O prototypes
- Framework awareness of accelerator resources
- Grid remapping with a point cloud as destination, for data assimilation applications
- Representation of general n-gons, for hydrology and other applications
- 3D grid remapping of a spherical shell with vertical thickness, for space weather applications
- MOAB (DOE finite element mesh framework) option for conservative grid remapping, to better align with future DOE development
- Review and standardization of NUOPC interfaces, led by Theurich and Campbell/NRL

6 Future Work
Currently both ESMF/NUOPC and MCT coexist in the CESM coupled architecture developed under this NOPP program. Future work will examine simplifications to the architecture, potentially creating a NUOPC-only coupler. This work is likely to be informed by the architecture of other NUOPC-based couplers at NOAA, NRL, and NASA. Significant future technical modifications, including the full implementation of during-run grid remapping, will be deferred to the simplified architecture.

7 MOAB Integration
MOAB: A Mesh-Oriented datABase
- Library for representing an unstructured finite-element mesh
- Department of Energy/University of Wisconsin project
ESMF's internal Mesh representation
- Part of the ESMF source for representing an unstructured finite-element mesh
- Custom ESMF code developed and maintained by the ESMF developers
Goal: To compare ESMF's internal Mesh and MOAB, and switch to MOAB if it offers a better combination of performance, features, and development path.
Status: Conservative interpolation using both mesh frameworks is supported; a performance and feature evaluation still needs to be done.
Now I'm going to talk about the part of the project where we're bringing MOAB in under ESMF.
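Because the mesh backend is an internal choice, user code for conservative interpolation looks the same whether ESMF's native Mesh or MOAB does the work. A minimal sketch, assuming srcField and dstField are ESMF Fields already created on their source and destination grids or meshes:

```fortran
! Sketch of conservative regridding through the public ESMF API.
subroutine conservative_remap(srcField, dstField, rc)
  use ESMF
  implicit none
  type(ESMF_Field), intent(inout) :: srcField, dstField
  integer,          intent(out)   :: rc
  type(ESMF_RouteHandle) :: rh

  ! Precompute the first-order conservative interpolation weights.
  call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
       regridmethod=ESMF_REGRIDMETHOD_CONSERVE, routehandle=rh, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  ! Apply the precomputed weights (typically done every coupling step).
  call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  ! Release the route handle when the weights are no longer needed.
  call ESMF_FieldRegridRelease(routehandle=rh, rc=rc)
end subroutine conservative_remap
```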

8 Accelerator-Aware ESMF
Emerging computer architectures
- CPUs + GPGPUs + MICs
- Threading and SIMD parallelism on CPU and device cores
- Multiple programming models
ESMF support
- Provide users information about the underlying hardware
- Enhance the existing interfaces to help users leverage the underlying hardware
Goal: To support migration of optimization strategies to infrastructure packages and coupled model applications, and to provide support for coupling of optimized components in the ESPC program.

9 Accelerator Support
General structure of an ESMF application:
- Some accelerated components (Comp1): leverage GPUs, MICs
- Some non-accelerated components: run on CPU cores
Querying the number of accelerator devices with ESMF_VMGet():
  ESMF_VMGet(vm, localpet, accDeviceCount=accDeviceCount, rc=rc)
Support for multiple software stacks:
- OpenCL
- OpenACC
- Intel MIC
- OpenMP
Several prototypes developed; available in master.
More info available at:
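Expanding the call shown above into a compilable form, the sketch below queries the accelerator device count alongside the usual PET information. The accDeviceCount argument is the one introduced on this slide; the subroutine name is illustrative.

```fortran
! Query and report the accelerator devices visible to each PET.
subroutine report_accelerators(rc)
  use ESMF
  implicit none
  integer, intent(out) :: rc
  type(ESMF_VM) :: vm
  integer :: localPet, petCount, accDeviceCount

  ! Get the VM of the current component (or the global VM in a simple driver).
  call ESMF_VMGetCurrent(vm, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  ! Query this PET's rank, the total PET count, and how many accelerator
  ! devices (GPUs/MICs) are visible to this PET.
  call ESMF_VMGet(vm, localPet=localPet, petCount=petCount, &
       accDeviceCount=accDeviceCount, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  print '(a,i0,a,i0,a,i0,a)', "PET ", localPet, " of ", petCount, &
       " sees ", accDeviceCount, " accelerator device(s)"
end subroutine report_accelerators
```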

10 ESMF Accelerator Utilities
New utilities added to create and manipulate PET lists:
- Value iterators allow representing value ranges
  - Specified using a start index, end index, and stride
  - Specified using a user-defined vector of values
- Algorithms that work on PET lists
  - Split, join, add, subtract PET lists
  - The splitting algorithm allows users to split a PET list in parallel based on “colors” specified by the PETs
Developed a prototype to create a PET list for accelerated components (see the sketch below):
- Two components (accelerated, non-accelerated)
- Query ESMF to get the number of accelerator devices
- Based on a user policy, use the split and other algorithms above to split the global PET list into:
  - A PET list containing only accelerated PETs (used to create the accelerated component)
  - A PET list containing only non-accelerated PETs (used to create the non-accelerated component)
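A rough sketch of that prototype's logic is given below using only core ESMF calls; the actual split/join utilities live in the 1d_vec_algos branch and are not reproduced here. The single "has a device" policy and the component names are illustrative assumptions.

```fortran
! Split the global PET list by accelerator availability and create one component
! on each resulting list. Illustrative only, not the 1d_vec_algos utilities.
subroutine split_pets_by_accelerator(accComp, cpuComp, rc)
  use ESMF
  implicit none
  type(ESMF_GridComp), intent(out) :: accComp, cpuComp
  integer,             intent(out) :: rc

  type(ESMF_VM) :: vm
  integer :: petCount, accDeviceCount, p
  integer :: myCount(1)
  integer, allocatable :: allCounts(:), accPets(:), cpuPets(:)

  call ESMF_VMGetCurrent(vm, rc=rc)
  if (rc /= ESMF_SUCCESS) return
  call ESMF_VMGet(vm, petCount=petCount, accDeviceCount=accDeviceCount, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  ! Gather every PET's device count so all PETs compute the same split.
  allocate(allCounts(petCount))
  myCount(1) = accDeviceCount
  call ESMF_VMAllGather(vm, sendData=myCount, recvData=allCounts, count=1, rc=rc)
  if (rc /= ESMF_SUCCESS) return

  ! Policy (illustrative "color"): PETs that see at least one device are accelerated.
  accPets = pack([(p, p=0, petCount-1)], allCounts > 0)
  cpuPets = pack([(p, p=0, petCount-1)], allCounts == 0)

  ! Create one component on each PET list (component names are hypothetical).
  accComp = ESMF_GridCompCreate(name="acceleratedComp", petList=accPets, rc=rc)
  if (rc /= ESMF_SUCCESS) return
  cpuComp = ESMF_GridCompCreate(name="cpuComp", petList=cpuPets, rc=rc)
end subroutine split_pets_by_accelerator
```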

11 ESMF Accelerator Utilities (cont.)
- Accelerator utilities may be more suited for developers than end users
- Code is currently available in a development branch: 1d_vec_algos contains the utilities
- More info available at

12 Future Work
Explore and implement resource negotiation among different components:
- Allow negotiation between the driver and the different components for resources
- Modify the NUOPC mediator and connector to support resource negotiation
- Leverage the newly added accelerator utilities
Simplify the user interfaces required to manipulate PET lists.
Add more features to the split algorithm:
- Allow splitting based on multiple “colors” (each color representing a different user constraint)
- A more general representation of the resources required by a PET; this would simplify the interface to the splitting algorithm. A regular expression for resources?

13 Earth System Prediction Suite
The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are instrumented to conform to interoperability conventions, documented following metadata standards, and available either under open source terms or to credentialed users.
Inclusion criteria:
- ESPS components and coupled modeling systems are NUOPC-compliant.
- ESPS codes are versioned.
- A minimal, prescribed set of model documentation is provided for each version of the ESPS component or modeling system.
- ESPS codes must have clear terms of use (e.g. public domain statement, open source license, proprietary status), and must have a way for credentialed ESPC collaborators to request access.
- Regression tests are provided for each component and modeling system.
- There is a commitment to continued NUOPC compliance and ESPS participation for new versions of the code.

14 Next Steps/Future Work
Slowly advancing ESPS code usability/access and website:
- Posted “compliance reports” generated by running NUOPC components through compliance checkers; these provide information on init and run phases, import and export fields, and other metadata.
- More ESPS components need to have reports generated (currently only HYCOM, MOM5, CICE).
- BAMS paper in press: The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability. Latest version:
- Started a follow-on white paper, Building on Technical Interoperability to Achieve a More Effective U.S. Earth System Modeling Capability.

