State of ESMF
Cecelia DeLuca, NOAA CIRES / NESII
ESMF Executive Board Meeting, February 7, 2017
Image: Hurricane Irene, NASA GOES-13 satellite, August 26, 2011
Outline
Software Status
Team and Financial Status
Application Status
Highlights of ESMF and NUOPC Layer Capabilities
Trends and Challenges
ESMF, the NUOPC Layer, and Custom Coupling
ESMF metrics: ~7,000 downloads; ~150 components in use; ~4,000 individuals on the info mailing list; ~40 platform/compiler combinations regression tested nightly; ~8,000 regression tests; ~1M SLOC
Multiple layers provide both code reuse and flexibility:
The Earth System Modeling Framework (ESMF) is community-developed, community-governed software for building and coupling model components.
The National Unified Operational Prediction Capability (NUOPC) Layer is a set of extensions to ESMF that increases component interoperability.
The NOAA Environmental Modeling System (NEMS) is one of the U.S. modeling systems that are using ESMF and the NUOPC Layer.
ESMF Software Status
Releases: January 2016 and December 2016
Simplification and reorganization of the NUOPC interface, plus conveniences: namespaces for field exchange, clearer naming of phases
ESMF grid remapping additions:
  point cloud (observational data) representation
  first-order conservative regridding of concave quadrilateral cells
  non-conservative regridding on mesh elements with more than four sides
  use of great circle lines when computing 3D bilinear interpolation weights on a sphere
A new command-line tool, ESMF_Regrid, generates interpolation weights and applies them to an input data field
The ESMPy (Python) grid remapping interface allows a Field to be filled with data from a NetCDF file, and Anaconda packages are available for easy installation of ESMPy on Linux and Darwin systems
Optimizations and fixes for running at >10,000 processors
Parallel I/O (PIO) is on by default, and there are new asynchronous I/O prototypes
ESMF Software Status
Included in the next release (anticipated May 2017):
Implementation of higher-order conservative grid remapping (Bob Oehmke/CIRES)
Three new shortcuts for cubed-sphere grid creation (Bob Oehmke/CIRES)
Support for hierarchies in the NUOPC Layer (Gerhard Theurich/NRL)
Completion of DOE's MOAB (Mesh-Oriented datABase) as an alternative to the native ESMF finite element mesh implementation underlying grid remapping (Ryan O'Kuinghttons and Bob Oehmke/CIRES)
Extension of the ESMF virtual machine to recognize, allocate to, and begin to negotiate for heterogeneous resources such as accelerators (Jayesh Krishna/ANL)
Implementation of dynamic masking during the application of interpolation weights (Gerhard Theurich/NRL)
Extrapolation of destination points that lie outside the source grid (Bob Oehmke/CIRES)
ESMF Support
Challenging to keep up with support requests
ESMF Team
Stable since ~2005, with some changes coming up
Working toward a team of ~10 staff members (down from ~12). Current roles:
Manager: Cecelia DeLuca (NOAA CPO)
Architectural lead: Gerhard Theurich (NRL)
Training and project lead: Rocky Dunlap (NRL/NASA)
Project and development lead: Raffaele Montuoro (NOAA CPO)
Grid and grid remapping lead: Bob Oehmke (NOAA SWPC)
Testing and release management lead: Silverio Vasquez (NASA)
Model applications and development: Fei Liu (NRL), Dan Rosen (NRL/NASA), Ryan O'Kuinghttons (NWS)
Coupler construction: Anthony Craig (NWS, as needed)
Grid support and GIS projects: Ben Koziol (NOAA CPO)
Porting, support, and I/O: Walter Spector (NWS)
Open position: system administrator (GSD, once overheads start), to be merged with the test lead position
Positions recently released (all NOAA CPO): 1 FTE administrative support (to be replaced by GSD administrative support), 1 FTE CMIP6 support, 0.5 FTE CoG development
NESII Move to Global Systems Division
Opportunities and Costs
Starting in October 2016, the NOAA Environmental Software Infrastructure and Interoperability (NESII) team moved from the ESRL Director's Office to the Global Systems Division (GSD).
The move has the advantage of providing infrastructure support (system administration, administrative assistance, financial tracking, etc.) and advocacy at NOAA.
The move has the disadvantage of adding a 29% additional overhead to incoming funds; existing projects were grandfathered.
Currently the NESII team consists of CIRES (University of Colorado) employees and federal contractors; it has no base-funded positions and no federal positions.
ESMF Projects
Current support

Title | Start | End | Per year | Sponsor
An Integrated, Observation-Driven Hydrological Modeling System Using LIS and WRF-Hydro Enabled by ESMF | 12/1/2015 | 11/30/2019 | $80,000 | NASA MAP
Extending Interoperability of ESMF-Based Models at NASA | | | $175,000 |
Cupid: An Integrated Development Environment for Earth System Models | 10/20/2015 | 10/19/2017 | $155,000 | NASA CMAQ
NOAA Environmental Software Infrastructure and Interoperability Team | 5/1/2015 | 4/30/2018 | $600,000 | NWS
Coupling Between the Whole Atmosphere Model (WAM) and the Ionosphere-Plasmasphere Electrodynamics (IPE) Model | 8/1/2016 | 7/31/2017 | $240,000 | NOAA SWPC
Modeling and Data Infrastructure in Support of NOAA's Global Models | 8/1/2015 | 7/31/2018 | $840,000 | NOAA CPO
An Integration and Evaluation Framework for ESPC Coupled Models | 8/1/2013 | 11/30/2018 | $40,000 | ONR NOPP
An Integrated Hydrological Modeling System for High-Resolution Coastal Applications | 4/1/2015 | 3/31/2018 | | NRL
Earth System Modeling Framework and NUOPC Layer Development | 10/1/2015 | 9/30/2018 | $500,000 |
TOTAL | | | $2,710,000 |
ESMF Applications
Interoperable components
The Earth System Prediction Suite (ESPS)
The ESPS is a collection of federal and community weather and climate model components that use the Earth System Modeling Framework (ESMF) with interoperability conventions called the National Unified Operational Prediction Capability (NUOPC) Layer.
Model components are more easily shared across systems. The standard component interfaces enable major modeling centers to assemble systems with components from different organizations, and to test a variety of components more easily.
Table legend: ● NUOPC-compliant, ● in progress
Coupled modeling systems (table columns): NEMS, COAMPS and COAMPS-TC, NESPC, GEOS-5, GISS ModelE, CESM
Component rows: Driver(s) and Coupler(s)
Atmosphere models: CAM, COAMPS atmosphere, GEOS-5 FV atmosphere, GSM, ModelE atmosphere, NavGEM, NEPTUNE, NMMB
Ocean models: HYCOM, MOM, NCOM, POP
(Additional rows not shown)
Complete table available at:
ESMF Capstone Publication
Bulletin of the American Meteorological Society, July 2016: "The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability"
Topics:
History of infrastructure development
Description of ESMF and the NUOPC Layer
The adoption process and tools for training and testing compliance
Model applications and advances using ESMF and the NUOPC Layer
Limits of interoperability
Authors:
G. Theurich, Science Applications International Corporation, McLean, Virginia
C. DeLuca, NOAA/ESRL and CIRES, Boulder, Colorado
T. Campbell, Naval Research Laboratory, Stennis Space Center, Mississippi
F. Liu and K. Saint, Cherokee Services Group, Fort Collins, Colorado
M. Vertenstein, National Center for Atmospheric Research, Boulder, Colorado
J. Chen, Science Applications International Corporation, McLean, Virginia
R. Oehmke, NOAA/ESRL and CIRES, Boulder, Colorado
J. Doyle and T. Whitcomb, Naval Research Laboratory, Monterey, California
A. Wallcraft, Naval Research Laboratory, Stennis Space Center, Mississippi
M. Iredell and T. Black, NOAA/NWS/NCEP Environmental Modeling Center, College Park, Maryland
A. M. Da Silva and T. Clune, NASA Goddard Space Flight Center, Greenbelt, Maryland
R. Ferraro and P. Li, NASA Jet Propulsion Laboratory, Pasadena, California
M. Kelley and I. Aleinov, NASA Goddard Institute for Space Studies, New York, New York
V. Balaji, Geophysical Fluid Dynamics Laboratory and Princeton University, Princeton, New Jersey
N. Zadeh, Geophysical Fluid Dynamics Laboratory and Engility, Inc., Princeton, New Jersey
R. Jacob, Argonne National Laboratory, Lemont, Illinois
B. Kirtman, University of Miami, Miami, Florida
F. Giraldo, Naval Postgraduate School, Monterey, California
D. McCarren, Naval Meteorology and Oceanography Command, Silver Spring, Maryland
S. Sandgathe, Applied Physics Laboratory, University of Washington, Seattle, Washington
S. Peckham, University of Colorado Boulder, Boulder, Colorado
R. Dunlap, NOAA/ESRL and CIRES, Boulder, Colorado
NUOPC Layer Additions to ESMF
NUOPC Layer interoperability rules are implemented in ESMF applications using a set of generic components that represent the major structural pieces needed to build coupled models.
NUOPC generic components: Driver, Model, Mediator, Connector
Highlight: Flexible Configuration
Some simplified examples:
Example 1 (Driver: SIMPLE, with Model: ATM and Model: WAVE): a coupled system with a Driver, two Model components, and two Connectors. This configuration creates a coupled system that allows a two-way feedback loop between ATM and WAVE.
Example 2 (Driver: COUPLED, with Model: ATM, Model: ICE, Model: OCN, Model: WAVE, and a Mediator): a Driver with four Models and a Mediator. The OCN and WAVE components communicate directly while other components receive data only after processing by the Mediator. The OCN component is hierarchical, with an embedded driver for components representing subprocesses.
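For illustration, a minimal Fortran sketch of how the first configuration could be assembled from the NUOPC generic components (a hedged example, not taken from NEMS or any specific system; the ATM_CAP and WAVE_CAP modules and their SetServices routines stand in for hypothetical model caps, and error checking is omitted):

module simple_driver
  use ESMF
  use NUOPC
  use NUOPC_Driver, &
    driverSS                      => SetServices, &
    driver_label_SetModelServices => label_SetModelServices
  ! Hypothetical cap modules providing NUOPC-compliant SetServices routines
  use ATM_CAP,  only: ATM_SetServices  => SetServices
  use WAVE_CAP, only: WAVE_SetServices => SetServices
  use NUOPC_Connector, only: cplSS => SetServices

  implicit none
  private
  public SetServices

contains

  subroutine SetServices(driver, rc)
    type(ESMF_GridComp)  :: driver
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS
    ! Derive this component from the generic NUOPC Driver...
    call NUOPC_CompDerive(driver, driverSS, rc=rc)
    ! ...and specialize the point where child components are added
    call NUOPC_CompSpecialize(driver, specLabel=driver_label_SetModelServices, &
      specRoutine=SetModelServices, rc=rc)
  end subroutine

  subroutine SetModelServices(driver, rc)
    type(ESMF_GridComp)  :: driver
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS
    ! Two Model components...
    call NUOPC_DriverAddComp(driver, "ATM",  ATM_SetServices,  rc=rc)
    call NUOPC_DriverAddComp(driver, "WAVE", WAVE_SetServices, rc=rc)
    ! ...and two Connectors for the two-way ATM <-> WAVE feedback loop
    call NUOPC_DriverAddComp(driver, srcCompLabel="ATM", dstCompLabel="WAVE", &
      compSetServicesRoutine=cplSS, rc=rc)
    call NUOPC_DriverAddComp(driver, srcCompLabel="WAVE", dstCompLabel="ATM", &
      compSetServicesRoutine=cplSS, rc=rc)
  end subroutine

end module

The second configuration follows the same pattern, with additional NUOPC_DriverAddComp calls for the Mediator and the remaining Models.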
Highlight: Flexible, Parameterized Run Sequences
Easy to change layout, components, and order of operations.
Colors show actions performed by Connectors (->), the Mediator (MED), and Models; @ indicates the coupling interval.

#####################################
#   Run Time Configuration File     #
#####################################
# EARTH #
EARTH_component_list: MED ATM OCN ICE
# MED #
med_model:          nems
med_petlist_bounds: 60 65
# ATM #
atm_model:          fv3
atm_petlist_bounds: 0 31
# OCN #
ocn_model:          mom5
ocn_petlist_bounds: 32 55
# ICE #
ice_model:          cice
ice_petlist_bounds: 56 59
# Run Sequence #
runSeq::
  @1800.0
    MED MedPhase_prep_ocn
    MED -> OCN :remapMethod=redist
    OCN
    @600.0
      MED MedPhase_prep_ice
      MED MedPhase_prep_atm
      MED -> ATM :remapMethod=redist
      MED -> ICE :remapMethod=redist
      ATM
      ICE
      ATM -> MED :remapMethod=redist
      ICE -> MED :remapMethod=redist
      MED MedPhase_atm_ocn_flux
      MED MedPhase_accum_fast
    @
    OCN -> MED :remapMethod=redist
  @
::

Diagram: Driver SEASONAL, with ATM, OCN, and ICE Models and a Mediator.
Highlight: High Performance
Key features:
Do no harm: overhead of ESMF/NUOPC component interfaces is small (for ESMF, ~μs)
A set of NUOPC prototypes demonstrates preservation of accelerator and other component-specific optimizations
Scalable: key methods (e.g., sparse matrix multiply) tested to ~16K processors by the ESMF team, ~100K processors by customers (e.g., NASA)
Component interfaces and sequential/concurrent modes support increasing task parallelism and optimized mappings to hardware
Data communications between components can preserve locality:
  Components with the same grid and decomposition: direct reference sharing, local memory-to-memory copy
  Components on disjoint processor sets: redistribution, parallel grid remapping
Performance studies available at:
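To make the redistribution path concrete, here is a minimal Fortran sketch (an illustrative example with an assumed 100x100 index space, not from any particular coupled system; error checking omitted) that moves a Field between two decompositions of the same grid. The communication pattern is precomputed once into a RouteHandle and then applied repeatedly, which is where the performance comes from:

program redist_example
  use ESMF
  implicit none
  type(ESMF_Grid)        :: gridA, gridB
  type(ESMF_Field)       :: fieldA, fieldB
  type(ESMF_RouteHandle) :: rh
  integer                :: rc

  call ESMF_Initialize(defaultLogFileName="redist.log", rc=rc)

  ! Same 100x100 index space, decomposed two different ways
  ! (run on 4 PETs, e.g. mpirun -np 4)
  gridA = ESMF_GridCreateNoPeriDim(maxIndex=(/100,100/), regDecomp=(/4,1/), rc=rc)
  gridB = ESMF_GridCreateNoPeriDim(maxIndex=(/100,100/), regDecomp=(/1,4/), rc=rc)

  fieldA = ESMF_FieldCreate(gridA, typekind=ESMF_TYPEKIND_R8, rc=rc)
  fieldB = ESMF_FieldCreate(gridB, typekind=ESMF_TYPEKIND_R8, rc=rc)

  ! Precompute the communication pattern once...
  call ESMF_FieldRedistStore(fieldA, fieldB, routehandle=rh, rc=rc)

  ! ...then apply it every coupling step at low cost
  call ESMF_FieldRedist(fieldA, fieldB, routehandle=rh, rc=rc)

  call ESMF_FieldRedistRelease(rh, rc=rc)
  call ESMF_Finalize(rc=rc)
end program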
Highlight: Grid Remapping Tools
Fast, flexible interpolation of discretized data
High performance: the interpolation weight matrix is generated in parallel in 3D space and applied in parallel
Wide range of supported grids: logically rectangular connected tiles, unstructured meshes, observational data streams (point cloud); 2D and 3D; global and regional grids; Cartesian and spherical coordinates
Multiple interpolation methods: bilinear, higher-order patch recovery, first-order conservative, nearest neighbor, higher-order conservative in the next release
Options: masking, multiple pole treatments, straight or great circle distance measure
Multiple interfaces:
  Fortran API - generate and apply weights during a model run
  Python API - generate and apply weights using ESMPy
  File-based - generate and apply weights from grid files using ESMF command-line utilities
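As an illustration of the Fortran API workflow (a minimal sketch; the grid file names and the SCRIP format choice are assumptions for the example, and error checking is omitted), weights are computed once with ESMF_FieldRegridStore and then applied as often as needed with ESMF_FieldRegrid:

program regrid_example
  use ESMF
  implicit none
  type(ESMF_Grid)        :: srcGrid, dstGrid
  type(ESMF_Field)       :: srcField, dstField
  type(ESMF_RouteHandle) :: rh
  integer                :: rc

  call ESMF_Initialize(defaultLogFileName="regrid.log", rc=rc)

  ! Create grids from SCRIP-format grid files (assumed names); corner
  ! coordinates are needed for conservative regridding
  srcGrid = ESMF_GridCreate(filename="src_grid.nc", &
              fileformat=ESMF_FILEFORMAT_SCRIP, addCornerStagger=.true., rc=rc)
  dstGrid = ESMF_GridCreate(filename="dst_grid.nc", &
              fileformat=ESMF_FILEFORMAT_SCRIP, addCornerStagger=.true., rc=rc)

  srcField = ESMF_FieldCreate(srcGrid, typekind=ESMF_TYPEKIND_R8, rc=rc)
  dstField = ESMF_FieldCreate(dstGrid, typekind=ESMF_TYPEKIND_R8, rc=rc)
  ! (fill srcField with data here)

  ! Generate the interpolation weights in parallel (done once)...
  call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
         regridmethod=ESMF_REGRIDMETHOD_CONSERVE, &
         unmappedaction=ESMF_UNMAPPEDACTION_IGNORE, &
         routehandle=rh, rc=rc)

  ! ...and apply them to the field data as often as needed during the run
  call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)

  call ESMF_FieldRegridRelease(rh, rc=rc)
  call ESMF_Finalize(rc=rc)
end program

The Python (ESMPy) and file-based interfaces follow the same precompute-then-apply pattern.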
Highlight: Non-intrusive Adoption
Adoption via model "caps"
In a NUOPC application, all model components interact with the NUOPC infrastructure (Driver, Mediator, Connector) and with other components through NUOPC "caps." Caps are wrappers that translate native model time, grids, and memory layouts into standard forms that the coupling infrastructure can understand. Caps can usually be built with few or no changes to the native model code, and the same cap source code can be used in multiple applications.
NUOPC "cap" side: Import State (e.g., sea_surface_temperature, ocn_current_zonal, ocn_current_merid), Export State (e.g., sea_ice_temperature), Clock (start, current, stop, timestep), ESMF Grid/Mesh, and NUOPC execution phases and specialization points: AdvertiseFields(), RealizeFields(), ModelAdvance()
Physical model side (Fortran code): Model Input Arrays, Model Output Arrays, Model Clock, Model Grid/Mesh, and model execution subroutines: Model_Init(), Model_Run(), Model_Finalize()
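A skeletal cap in Fortran might look like the following (a hedged sketch built on the generic NUOPC Model component; the OCEAN_Run placeholder stands for a hypothetical native model routine, and the Advertise/Realize field setup and error checking are omitted):

module ocn_cap
  use ESMF
  use NUOPC
  use NUOPC_Model, &
    modelSS             => SetServices, &
    model_label_Advance => label_Advance

  implicit none
  private
  public SetServices

contains

  subroutine SetServices(model, rc)
    type(ESMF_GridComp)  :: model
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS
    ! Derive from the generic NUOPC Model component...
    call NUOPC_CompDerive(model, modelSS, rc=rc)
    ! ...and plug the native model's time step into the Advance phase.
    ! (Specializations for advertising and realizing the import/export
    !  fields would be registered here as well.)
    call NUOPC_CompSpecialize(model, specLabel=model_label_Advance, &
      specRoutine=ModelAdvance, rc=rc)
  end subroutine

  subroutine ModelAdvance(model, rc)
    type(ESMF_GridComp)  :: model
    integer, intent(out) :: rc
    type(ESMF_Clock) :: clock
    type(ESMF_State) :: importState, exportState
    rc = ESMF_SUCCESS
    ! Query the driver-controlled clock and the exchange states
    call NUOPC_ModelGet(model, modelClock=clock, &
      importState=importState, exportState=exportState, rc=rc)
    ! Translate import fields to native arrays, step the native model one
    ! coupling interval, and copy native output back to the export fields.
    ! call OCEAN_Run(...)   ! placeholder for the native model's run routine
  end subroutine

end module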
Specific Coupled System Examples
Examples across agencies – more during the afternoon session
NUOPC software has been implemented in most major federal modeling systems, for example:
The new Navy global coupled system, including NAVGEM, HYCOM, CICE, LIS land, and WAVEWATCH III
The GEOS-5 coupled modeling system, structured as a hierarchy with 50+ components, used for weather prediction through decadal time scales and for reanalyses
NUOPC CESM, which supports high-resolution ocean coupling at a range of time scales with a focus on climate, with separate atmosphere, HYCOM ocean, sea ice, land, and other components
The NOAA NUOPC-based prototype coupled seasonal system, with GSM (soon FV3), MOM5, and CICE
Components can be shared with minimal or no code changes. ESMF and NUOPC capabilities allow coupling techniques to be customized for specific problems.
Trends
Observations and opportunities:
Growing interest in extended weather prediction and unified modeling across scales, leading to greater interest in coupled systems that can be flexibly configured
Interest in hierarchies, and in support for subcomponent concurrency and grid remapping (e.g., parallel radiation)
Continued interest in emerging computing architectures, and in optimizations related to preserving locality and increasing task parallelism
Continued interest in all forms of ESMF grid remapping, and frequent requests for new features
Growing demand for ESMF and NUOPC training
Challenges
Post-interoperability, many challenges are about management of shared codes:
Where and how do we create repositories for shared components and couplers?
How can we support agencies that require some codes to be proprietary, but benefit from aspects of community interaction?
Where do we store model "caps," and who is responsible for updating, documenting, and testing them?
How do we coordinate changes across components and coupled systems that are increasingly community-developed?
Other challenges:
Staying ahead of development and training requests while supporting a national and international user base
Establishing a new "battle rhythm" (meetings and training schedule) for a framework that is close to fully deployed