The Earth System Modeling Framework
Robert Oehmke, Gerhard Theurich, Cecelia DeLuca
NOAA Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder
Gung Ho Meeting, Bath, England, November 2, 2014

Outline
– ESMF Overview
– ESMF Infrastructure
– ESMF Superstructure
– New Directions: NUOPC/ESPS and Cupid
– Current Status and Future Work

Motivation
In climate research and numerical weather prediction… increased emphasis on detailed representation of individual physical processes; this requires many teams of specialists to contribute components (e.g. atmosphere, ocean, sea ice) to an overall “coupled” modeling system.
In computing technology… increasing hardware and software complexity in high-performance computing, as we shift toward scalable and now accelerator-based computing architectures.
In software… the emergence of frameworks to promote code reuse and interoperability.
The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.

What is ESMF?
ESMF provides:
– Tools for building models
– Tools for coupling models
ESMF is:
– Portable: Unix/Linux and Windows (Cygwin/MinGW) systems; tested on >40 different OS/compiler/MPI combinations every night
– Well tested: more than 6000 unit and system tests
– Parallel: based on MPI (with “mpiuni” bypass mode available); OpenMP and Pthreads support
– Flexible: interfaces to multiple languages; a wide range of functionality and supported options

Interfaces
Complete F95 API:
– use ESMF
– Derived types and methods
– Investigating moving to Fortran 2003
Limited C API:
– #include "ESMC.h"
– Structs and methods
Limited Python API:
– import ESMPy
– Classes with methods
Applications:
– File-based interpolation weight generation: mpirun -np … ESMF_RegridWeightGen -s …
– File-based weight generation AND application of weights: mpirun -np … ESMF_Regrid (coming next release)

The ESMF Sandwich
Superstructure: Component data structures and methods for coupling model components
Infrastructure: Field data structures and methods for building model components, and utilities for coupling
YOU DON'T NEED TO USE BOTH!

Outline
– ESMF Overview
– ESMF Infrastructure
– ESMF Superstructure
– New Directions: NUOPC/ESPS and Cupid
– Current Status and Future Work

ESMF Infrastructure
Distributed data classes: used to hold data spread across a set of processors
– Represent data so ESMF can perform operations on it
– Provide a standard representation to be passed between components
– Can reference user memory (usually)
– Consist of two kinds of structures: index space classes (Arrays) and physical space classes (Fields)
Utilities:
– Time Manager: classes to represent time, time intervals, alarms, … Used in ESMF for passing time info between models, time loops, etc. Also useful for doing calculations with time, conversions, etc.
– Attributes: allow metadata to be attached to ESMF classes; instrument models to be more self-describing; can be written to various file formats, e.g. CIM-compliant XML
– Others: Logging (LogError), Virtual Machine (VM), Config
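The Time Manager concept above (a model clock with a timestep and alarms) can be sketched in plain Python. The real ESMF classes are ESMF_Clock, ESMF_Time, ESMF_TimeInterval, and ESMF_Alarm; the names and structure below are illustrative only, not the ESMF API:

```python
from datetime import datetime, timedelta

# Toy model clock: a start/stop time, a one-hour timestep, and an
# alarm that rings every six hours of simulated time.
start = datetime(2014, 11, 2, 0, 0)
stop = start + timedelta(days=1)
step = timedelta(hours=1)
alarm_interval = timedelta(hours=6)

current = start
alarms = []
while current < stop:          # the model's time loop
    current += step
    elapsed = (current - start).total_seconds()
    if elapsed % alarm_interval.total_seconds() == 0:
        alarms.append(current)  # alarm rings: e.g. time to write output

print(len(alarms))  # rings at 06, 12, 18, and 24 hours -> 4
```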

Index Space Distributed Data
Distgrid: represents the index space and its distribution across processors
– Supports multiple index space representations: from arbitrary 1-D sequences, to N-D tiles, to N-D tiles connected together along their edges
– Supports multiple distribution options
Array: a distributed index space data container
– Array = data + Distgrid
– Supports different data types: integer 4, real 4, real 8, …
– Other options: halos, undistributed dimensions, …
ArrayBundle: a set of Arrays which can be operated on at one time.
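As an illustration of the kind of block decomposition a Distgrid describes, here is a minimal Python sketch splitting a 1-D global index space as evenly as possible across a set of processors. The `decompose` helper is hypothetical, not an ESMF call:

```python
def decompose(global_count, pe_count):
    """Split indices 0..global_count-1 into pe_count contiguous blocks."""
    base, extra = divmod(global_count, pe_count)
    blocks, start = [], 0
    for pe in range(pe_count):
        # The first `extra` PEs get one additional index each
        count = base + (1 if pe < extra else 0)
        blocks.append(range(start, start + count))
        start += count
    return blocks

# 10 global indices over 3 PEs: sizes 4, 3, 3
blocks = decompose(10, 3)
print([list(b) for b in blocks])  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```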

Physical Space Grid Representation Classes
Grid:
– Structured representation of a region
– Consists of logically rectangular tiles
Mesh:
– Unstructured representation of a region
– In 2D: polygons with any number of sides (N-gon support added for MetOffice users)
– In 3D: tetrahedrons & hexahedrons
Xgrid (Exchange Grid):
– Represents the boundary layer between two regions
– Represented by a custom constructed Mesh
LocStream (Location Stream):
– Set of disconnected points
– E.g. locations of observations

Physical Space Distributed Data
Field: a distributed physical space data container
– Field = data + grid representation class (e.g. Grid, Mesh, …)
– Based on Array/Distgrid, so supports those index/distribution options
– Can get corresponding coordinates from the grid representation class
– Supports different data types: integer 4, real 4, real 8, …
– Other options: halos, undistributed dimensions, …
FieldBundle: a set of Fields which can be operated on at one time.

Distributed Data Class Operations
Sparse Matrix Multiply:
– Applies coefficients (weights) in parallel to distributed data
– Highly tuned for efficiency; auto-tunes for optimal execution
– Underlies most of ESMF's distributed data operations
Redistribution:
– Moves data between distributions without changing values
– Useful in cases where the grid doesn't change, but the distribution does
Halo:
– Fills “halo” cells which hold data from another processor
– Useful during computations on distributed data
Regridding:
– Moves data from one grid to a different one
– Useful when moving data between models with different grids
– Only available on physical space data classes
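The sparse matrix multiply that underlies these operations can be sketched serially in a few lines of Python: each destination value is a weighted sum of source values, with the weights stored as (destination index, source index, weight) triples. The weights here are made up for illustration; in ESMF they would come from a regrid or redistribution store call, and the sum would run in parallel:

```python
# Hypothetical weights: a 3-point destination interpolated from a
# 4-point source, stored as (dst_index, src_index, weight) triples.
weights = [
    (0, 0, 0.75), (0, 1, 0.25),   # dst 0 blends src 0 and src 1
    (1, 1, 0.50), (1, 2, 0.50),   # dst 1 blends src 1 and src 2
    (2, 2, 0.25), (2, 3, 0.75),   # dst 2 blends src 2 and src 3
]
src = [10.0, 20.0, 30.0, 40.0]
dst = [0.0] * 3

# The sparse matrix multiply: dst[d] = sum over j of W[d, j] * src[j]
for d, s, w in weights:
    dst[d] += w * src[s]

print(dst)  # [12.5, 25.0, 37.5]
```

Regridding, redistribution, and halo updates all reduce to this pattern; they differ only in how the weight triples are generated.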

Basic Computation

! Create grid representation class
grid = ESMF_GridCreate(…)

! Create Field on grid representation class
field = ESMF_FieldCreate(grid, …)

! Create halo communication structure
call ESMF_FieldHaloStore(field, …, routehandle)

! Loop over time
do t = 1, …
   ! Get Field data
   call ESMF_FieldGet(field, farrayPtr=ptr_to_data, …)
   ! Loop over memory doing computations on data
   do i = 1, …
      do j = 1, …
         ptr_to_data(i,j) = …
      enddo
   enddo
   ! Update halo
   call ESMF_FieldHalo(field, …)
enddo

Sparse Matrix Performance Example
(Chart: performance of sparse matrix multiply in CESM (1_1_beta09) CPL7, run on a Cray XK6 (jaguarpf))

Regrid: Features
Interfaces: F90, C, Python, and ESMF_RegridWeightGen (a separate file-based application)
Multiple interpolation types:
– Bilinear
– Higher-order patch recovery: yields better derivatives/smoother results than bilinear; based on the “patch recovery” technique used in finite element modeling [1][2]
– Nearest neighbor
– First-order conservative
Path between points in bilinear: options for straight line or great circle
– Added for MetOffice customers because of “accuracy” questions
Normalization options for conservative: destination area or fraction
– Added for MetOffice customers
Pole options for global spherical logically rectangular grids:
– Full-circle average, N-point average, teeth, no pole
Other: masking, user-supplied areas, ignore unmapped points, …
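To make the bilinear option concrete: each destination point gets four weights from the corners of the enclosing source cell, and those weights feed the sparse matrix multiply. The sketch below computes the standard bilinear weights for a point inside a unit cell; it illustrates the textbook formula, not ESMF's internal implementation:

```python
def bilinear_weights(x, y):
    """Bilinear weights for point (x, y) in the unit cell with corners
    (0,0), (1,0), (0,1), (1,1); 0 <= x, y <= 1."""
    return [(1 - x) * (1 - y),  # weight for corner (0, 0)
            x * (1 - y),        # weight for corner (1, 0)
            (1 - x) * y,        # weight for corner (0, 1)
            x * y]              # weight for corner (1, 1)

w = bilinear_weights(0.25, 0.5)
print(w)       # [0.375, 0.125, 0.375, 0.125]
print(sum(w))  # weights sum to 1, so constant fields are reproduced exactly
```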

Regrid: Spherical Grids
Supports grids with spherical (lon, lat, rad) coordinates
Mix and match pairs of:
– Global 2D logically rectangular Grids
– Regional 2D logically rectangular Grids
– 2D unstructured Meshes composed of polygons with any number of sides (ESMF internally represents these as triangles and quadrilaterals; supported elements: triangles, quadrilaterals, pentagons, hexagons, …)
– Multi-patch grids (e.g. cubed spheres), currently supported via Meshes
(Figures: HOMME cubed sphere grid with pentagons, courtesy Mark Taylor of Sandia; FIM unstructured grid; regional grid)

Regrid: Cartesian Grids
In addition, regridding supports Cartesian (x, y, z) coordinates:
– Regridding between any pair of: 2D Meshes composed of polygons with any number of sides; 2D logically rectangular Grids composed of a single patch
– Bilinear or conservative regridding between any pair of: 3D Meshes composed of hexahedrons; 3D logically rectangular Grids composed of a single patch
(Figures: 2D unstructured Mesh; 3D Grid; 3D unstructured Mesh)

Regrid Weight Calculation Performance
Platform: IBM iDataPlex cluster (Yellowstone at NCAR)

Outline
– ESMF Overview
– ESMF Infrastructure
– ESMF Superstructure
– New Directions: NUOPC/ESPS and Cupid
– Current Status and Future Work

ESMF Superstructure
State: structure for transferring data between models in a standard way. Can contain Arrays, Fields, Bundles, other States, etc.
Gridded Component: wraps a model and allows it to be called in a standard way.
Coupler Component: wraps user code for translating data between models and allows it to be called in a standard way.
(Diagram: Gridded Components exchanging States through a Coupler Component)
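The State / Gridded Component / Coupler Component pattern can be sketched in Python: every component exposes the same initialize/run interface, and data flows only through State objects. All class names, field names, and formulas below are illustrative toys, not the ESMF API, and the coupler simply copies fields where a real Coupler Component would regrid or transform them:

```python
class State(dict):
    """Named fields exchanged between components."""

class AtmComponent:
    def initialize(self, export_state):
        export_state["sea_surface_wind"] = 5.0

    def run(self, import_state, export_state):
        # Toy formula: wind responds to imported sea surface temperature
        export_state["sea_surface_wind"] = 5.0 + 0.1 * import_state.get("sst", 0.0)

class OcnComponent:
    def initialize(self, export_state):
        export_state["sst"] = 290.0

    def run(self, import_state, export_state):
        # Toy formula: SST responds to imported wind
        export_state["sst"] = 290.0 + 0.01 * import_state.get("sea_surface_wind", 0.0)

class CouplerComponent:
    def run(self, src_state, dst_state):
        # A real coupler would regrid/translate here; this one just copies
        dst_state.update(src_state)

# Driver: standard calling sequence; components never touch each other directly
atm, ocn, cpl = AtmComponent(), OcnComponent(), CouplerComponent()
atm_im, atm_ex, ocn_im, ocn_ex = State(), State(), State(), State()
atm.initialize(atm_ex)
ocn.initialize(ocn_ex)
for _ in range(2):  # time loop
    cpl.run(ocn_ex, atm_im)
    cpl.run(atm_ex, ocn_im)
    atm.run(atm_im, atm_ex)
    ocn.run(ocn_im, ocn_ex)
print(round(ocn_ex["sst"], 3))
```

Because each component only sees States, the driver can swap in a different atmosphere or ocean without changing the calling sequence, which is the point of the standard interface.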

Component Hierarchy
Using the superstructure, components can be arranged hierarchically, helping to organize complex models.
Different groups may create different kinds or levels of components.
(Figure: ESMF components in the GEOS-5 atmospheric GCM)

Component Overhead
Overhead of the ESMF component wrapper around a native CCSM4 component: no significant performance overhead (<3% is typical).
Platform: IBM Power 575 (bluefire) at NCAR

Outline
– ESMF Overview
– ESMF Infrastructure
– ESMF Superstructure
– New Directions: NUOPC/ESPS and Cupid
– Current Status and Future Work

New Directions
The initial ESMF software fell short of the vision for common infrastructure in several ways:
1. Implementations of ESMF could vary widely and did not guarantee a minimum level of technical interoperability among sites – creation of the NUOPC Layer
2. It was difficult to track who was using ESMF and how they were using it – initiation of the Earth System Prediction Suite
3. There was a significant learning curve for implementing ESMF in a modeling code – Cupid Integrated Development Environment
New development directions address these gaps…

NUOPC Layer
1. Implementations of ESMF could vary widely and did not guarantee a minimum level of technical interoperability among sites
National Unified Operational Prediction Capability: a consortium of U.S. operational weather and water prediction centers.
Participants: NOAA, Navy, Air Force, NASA, and other associated modeling groups.
Overall goals:
– Improve collaboration among agencies
– Accelerate the transition of new technology into the operational centers
Technical goal: increase interoperability of ESMF-based applications

The NUOPC Layer
An interoperability layer on top of ESMF that adds:
– Definitions for the component interactions during Initialize, Run, and Finalize
– Generic components that provide a standard implementation of interoperable components
– A field dictionary, based on Climate & Forecast (CF) conventions, as the basis for standard identification of fields between components
– Mechanisms to report component incompatibilities detected during run time
– A compliance checker option that serves as a development and debugging tool
– A collection of example applications
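The field dictionary idea above can be sketched in a few lines: components advertise fields by standard name, and a connector pairs advertised exports with requested imports, rejecting names that are not in the shared dictionary and reporting unmatched imports. The dictionary entries, field names, and `connect` helper below are illustrative, not the NUOPC API or the full CF vocabulary:

```python
# Toy field dictionary: standard name -> canonical units
FIELD_DICTIONARY = {
    "sea_surface_temperature": "K",
    "surface_eastward_wind": "m s-1",
}

def connect(export_names, import_names):
    """Pair exported fields with imported ones by standard name."""
    # Every advertised name must be a known standard name
    for name in export_names | import_names:
        if name not in FIELD_DICTIONARY:
            raise ValueError(f"'{name}' is not in the field dictionary")
    matched = export_names & import_names    # fields the connector will carry
    unmatched = import_names - export_names  # a reportable incompatibility
    return matched, unmatched

matched, unmatched = connect(
    {"sea_surface_temperature"},
    {"sea_surface_temperature", "surface_eastward_wind"})
print(sorted(matched), sorted(unmatched))
```

Matching on a controlled vocabulary, rather than on whatever names each model uses internally, is what lets independently developed components discover at run time whether they can actually be coupled.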

NUOPC Layer Generic Components
Model: implements a specific physical domain, e.g. atmosphere, ocean, wave, ice.
Mediator: scientific coupling code (flux calculations, accumulation, averaging, etc.) between (potentially multiple) Models.
Connector: connects pairs of components in one direction, e.g. Model to/from Model, or Model to/from Mediator. Executes simple transforms (Regrid/Redist, units).
Driver: provides a harness for Models, Mediators, and Connectors (supporting hierarchies). Coordinates initialize and run sequences.

NUOPC Layer Examples

The Earth System Prediction Suite
2. It was difficult to track who was using ESMF and how they were using it
The Earth System Prediction Suite (ESPS) is a collection of weather and climate modeling codes that use ESMF with the NUOPC conventions. The ESPS makes clear which codes are available as ESMF components and modeling systems.
Inclusion criteria:
– ESPS components and coupled modeling systems are NUOPC-compliant.
– A minimal, prescribed set of model documentation that conforms to the Common Information Model standard is provided for each version of the ESPS component or modeling system.
– ESPS codes must have clear terms of use (e.g. public domain statement, open source license, proprietary status), and must have a way for credentialed ESPC collaborators to request access.
– Regression tests are provided for each component and modeling system.
– There is a commitment to continued NUOPC compliance and ESPS participation for new versions of the code.

Model Codes in the ESPS
Currently, components in the ESPS can be of the following types: coupled system, atmosphere, ocean, wave, sea ice.
Target codes include:
– The Community Earth System Model (CESM) and its constituent components
– The NOAA Environmental Modeling System (NEMS), including the new Climate Forecast System
– The MOM5 and HYCOM ocean models
– The SWAN and WaveWatch III wave models
– The Navy Global Environmental Model (NavGEM)-HYCOM-CICE coupled system
– The Navy Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) and COAMPS Tropical Cyclone (COAMPS-TC)
– NASA GEOS-5
– NASA ModelE

Cupid Development and Training Environment
3. There was a significant learning curve for implementing ESMF in a modeling code
Cupid goal: make ESMF training and development simpler and more appealing
– A NOAA CIRES, Georgia Institute of Technology, and NASA GISS/GSFC collaboration
– An Eclipse-based “Integrated Development Environment,” or IDE
– Customized for ESMF applications with NUOPC conventions
– Cupid is a working prototype, expected to be ready for first public release in …
Cupid tutorial: ide/cupid/blob/master/org.earthsystemcurator.cupid.nuopc.fsml/doc/latex/cupid.pdf?raw=true

Cupid Development and Training Environment
– Select sample code or model: pick a training problem (or coupled model)
– Generate a framework-aware outline of the source code
– Navigate around the source code using the outline
– Use an editor to modify the source code
– Automatically generate code needed for NUOPC compliance
– Compile and run locally or on a cloud (currently Amazon Web Services)
(Screenshot: source code editor, console for viewing output, project explorer, NUOPC outline)

Outline
– ESMF Overview
– ESMF Infrastructure
– ESMF Superstructure
– New Directions: NUOPC/ESPS and Cupid
– Current Status and Future Work

Current Status
Released ESMF 6.3.0r in January. Highlights:
– Added support for n-gons in Mesh
– Great circle paths for bilinear
Released ESMF 6.3.0rp1 in July. Highlights:
– Python interface (ESMPy) brought into the ESMF source
– Fraction normalization for conservative regridding: allows regridding to partial destination cells without user normalization

Scheduled for Upcoming Releases
– Support for 4-sided concave cells in regridding; now all cases work correctly (7.0.0). Implemented and available as a snapshot.
– For ESMPy, removed the requirement that cell centers of Fields be defined, even for operations where they are not needed (7.0.0). Implemented and available as a snapshot.
– Higher-order conservative regridding (7.0.0)
– Breaking up grid files to increase the maximum grid size possible for interpolation weight generation (7.0.0)
– MOAB finite element library integration (7.0.0 & 7.1.0): introducing the MOAB finite element library in addition to ESMF's native finite element library; will be testing to see whether the native library should be replaced with MOAB; would bring in support for higher-order elements
– Extrapolation of points that lie outside the source grid (7.0.0)

References
Patch interpolation:
1. Khoei S.A., Gharehbaghi A.R. The superconvergent patch recovery technique and data transfer operators in 3D plasticity problems. Finite Elements in Analysis and Design, 43(8).
2. Hung K.C., Gu H., Zong Z. A modified superconvergent patch recovery method and its application to large deformation problems. Finite Elements in Analysis and Design, 40(5-6).
If you have questions or requests, come talk to me.