ESMF Town Hall Meeting AGU Fall Meeting 2010 San Francisco Gerhard Theurich, Fei Liu, Peggy Li, Cecelia DeLuca NOAA/CIRES December 15, 2010


ESMF Town Hall Meeting AGU Fall Meeting 2010 San Francisco Gerhard Theurich, Fei Liu, Peggy Li, Cecelia DeLuca NOAA/CIRES December 15, 2010

Outline
– Overview and Architecture
– Support and Extras
– Closer Look at Features
– Projects and Applications

Motivation
– In climate research and numerical weather prediction: increased emphasis on detailed representation of individual physical processes requires many teams of specialists to contribute components to an overall modeling system
– In computing technology: increasing hardware and software complexity in high-performance computing, as we shift toward scalable computing architectures
– In software: emergence of frameworks to promote code reuse and interoperability
The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.

Evolution
Phase 1: NASA's Earth Science Technology Office ran a solicitation to develop an Earth System Modeling Framework (ESMF). A multi-agency collaboration (NASA/NSF/DOE/NOAA) won the award. The core development team was located at NCAR. A prototype ESMF software package (version 2r) demonstrated feasibility.
Phase 2: New sponsors included the Department of Defense and NOAA. Many new applications and requirements were brought into the project, motivating a complete redesign of framework data structures (version 3r).
Phase 3: The core development team moved to NOAA/CIRES for closer alignment with federal models. Basic framework development will be complete with version 5r (ports, bugs, feature requests, user support, etc. still require resources). The focus is on increasing adoption and creating a community of interoperable codes.

Components
– ESMF is based on the idea of components – sections of code that are wrapped in standard interfaces
– Components can be arranged hierarchically, helping to organize the structure of complex models
– Different modeling groups may create different kinds or levels of components
[Figure: ESMF components in the GEOS-5 atmospheric GCM]

Architecture
[Diagram: layered ESMF architecture – the ESMF superstructure (components layer: gridded components and coupler components) sits above user code (model layer), which sits above the ESMF infrastructure (fields and grids layer, low-level utilities), with external libraries such as MPI and NetCDF underneath.]
ESMF provides a superstructure for assembling geophysical components into applications.
ESMF provides an infrastructure that modelers use to
– Generate and apply interpolation weights
– Handle metadata, time management, data I/O and communications, and other functions
– Access third party libraries

Standard Interfaces
All ESMF components have the same three standard methods:
– Initialize
– Run
– Finalize
Each standard method has the same simple interface:
call ESMF_GridCompRun (myComp, importState, exportState, clock, …)
Where:
– myComp points to the component
– importState is a structure containing input fields
– exportState is a structure containing output fields
– clock contains timestepping information
Steps to adopting ESMF:
1. Divide the application into components (without ESMF)
2. Copy or reference component input and output data into ESMF data structures
3. Register components with ESMF
4. Set up ESMF couplers for data exchange
Interfaces are wrappers and can often be set up in a non-intrusive way
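The three-method component pattern can be sketched in a few lines of Python. This is a hypothetical analogue, not the ESMF API (ESMF itself is Fortran/C); the component class, field names, and state dictionaries are illustrative stand-ins for ESMF components and States.

```python
# Sketch of the Initialize/Run/Finalize component pattern.
# The class, method bodies, and field names are illustrative only.
class GriddedComponent:
    """A model component exposing the three standard methods."""

    def initialize(self, import_state, export_state, clock):
        # set up internal model fields
        self.temp = 273.15

    def run(self, import_state, export_state, clock):
        # read inputs from the import state, advance the model,
        # then publish outputs through the export state
        forcing = import_state.get("surface_flux", 0.0)
        self.temp += 0.01 * forcing
        export_state["surface_temperature"] = self.temp

    def finalize(self, import_state, export_state, clock):
        pass  # release resources

comp = GriddedComponent()
imp, exp, clock = {"surface_flux": 2.0}, {}, None
comp.initialize(imp, exp, clock)
comp.run(imp, exp, clock)
print(round(exp["surface_temperature"], 2))  # 273.17
```

The point of the pattern is that a driver only ever touches the three standard methods and the import/export states, never the component's internals.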

Component Overhead
[Chart: overhead for an ESMF-wrapped native CCSM4 component]
– For this example, ESMF wrapping required NO code changes to scientific modules
– No significant performance overhead (< 3% is typical)
– Few code changes for codes that are modular
Platform: IBM Power 575, bluefire, at NCAR
Model: Community Climate System Model (CCSM)
Versions: CCSM_4_0_0_beta42 and ESMF_5_0_0_beta_snapshot_01
Resolution: 1.25 degree x 0.9 degree global grid with 17 vertical levels for both the atmosphere and land models, i.e. a 288x192x17 grid. The data resolution for the ocean model is 320x384x60.

Data Representation Options
1. Representation in index space (Arrays)
– One or more tiles store indices and topology
– Sparse matrix multiply for remapping with user-supplied interpolation weights
– Highly scalable: no global information is held locally; a distributed directory approach (Devine 2002) provides access to randomly distributed objects in an efficient, scalable way
2. Representation in physical space (Fields)
– Built on Arrays plus some form of Grid
– Grids may be logically rectangular, unstructured mesh, or observational
– Remapping using parallel interpolation weight generation
Also: ArrayBundles and FieldBundles, which group data for convenience and performance optimization
[Figure: Supported Array distributions]
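The "sparse matrix multiply" used for remapping is conceptually simple: each destination point is a weighted sum of a few source points. A minimal sketch, with made-up weights rather than ESMF-generated ones:

```python
# Remapping as a sparse matrix multiply: each destination value is a
# weighted sum of a few source values. Weights are illustrative only.
def apply_weights(weights, src):
    """weights: list of (dst_index, src_index, w) triplets."""
    dst = {}
    for d, s, w in weights:
        dst[d] = dst.get(d, 0.0) + w * src[s]
    return dst

src_field = [1.0, 2.0, 3.0, 4.0]
# destination point 0 averages source points 0 and 1;
# destination point 1 copies source point 2
weights = [(0, 0, 0.5), (0, 1, 0.5), (1, 2, 1.0)]
print(apply_weights(weights, src_field))  # {0: 1.5, 1: 3.0}
```

Because each triplet touches only local data, the operation parallelizes well, which is why ESMF separates weight generation from this cheap application step.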

Metadata Handling and Usage
Documentation of codes and data is critical as Earth system models are employed for decision making!
Metadata is represented by the Attribute class as name/value pairs
– Document data provenance
– Automate some aspects of model execution and coupling
Standard metadata is organized by Attribute packages
– Aggregate, store, and output in XML and other formats
Attribute packages include the following conventions
– Climate and Forecast (CF)
– Select ISO standards
– METAFOR Common Information Model (CIM)
– These can be linked and nested
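The name/value Attribute idea can be sketched as a small Python example that groups attributes into a package and serializes it to XML. This is an illustration of the concept, not the ESMF Attribute API; the element names and the sample attributes (which follow CF naming) are assumptions.

```python
# Sketch: metadata as name/value pairs grouped into a package and
# written out as XML. Element and attribute names are illustrative.
import xml.etree.ElementTree as ET

attrs = {"standard_name": "air_temperature", "units": "K"}

pkg = ET.Element("AttributePackage", convention="CF")
for name, value in attrs.items():
    ET.SubElement(pkg, "Attribute", name=name).text = value

xml = ET.tostring(pkg, encoding="unicode")
print(xml)
```

Nesting packages (e.g. a CIM package containing CF packages for each field) falls out naturally from the tree structure.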

Building an Information and Interoperability Layer
[Diagram: native model data structures (modules, fields, grids, timekeeping) are referenced or copied into standard ESMF data structures (Component, Field, Grid, Clock), with standard metadata carried as Attributes following CF conventions, ISO standards, and the METAFOR Common Information Model.]
Applications of the information layer:
– Parallel generation and application of interpolation weights
– Run-time compliance checking of metadata and time behavior
– Fast parallel I/O
– Redistribution and other parallel communications
– Automated documentation of models and simulations (new)
– Ability to run components in workflows and as web services (new)

Outline
– Overview and Architecture
– Support and Extras
– Closer Look at Features
– Projects and Applications

Portability and Testing
ESMF is comprehensively tested and extremely portable!
Many tests and examples are bundled with the software:
– About 4000 unit tests
– An additional automated test harness to cover the many options related to grids and distributions
– Dozens of examples
– Dozens of system tests
– External demonstrations showing ESMF linked to applications
Users can separately download "use test cases", with more realistic problem and data sizes
Regression tests run nightly on 24+ platform/compiler combinations

Backwards Compatibility
Following the next public release, ESMF 5r, ESMF interfaces will be backwards compatible
– This will provide a solid platform for application development
Some newer interfaces will not be included, for example:
– Location streams
– Exchange grids
Backwards compatibility will require the use of keywords (for example, rc=localrc) for optional arguments
– Set up so users will know at compile time if this was not done

Where to Get Help
Documentation and training materials:
– Users Guide and comprehensive Reference Manuals
– Many examples and system tests
Coming with public release 5r:
– Updated demonstration program
– New web-based, user-friendly tutorial format
If you're stuck, write the support line. If you're really stuck, we can usually arrange a call!

Outline
– Overview and Architecture
– Support and Extras
– Closer Look at Features
– Projects and Applications

Coupling options in ESMF
– Fortran or C components
– Single executable
– Multiple executables
  – Web service option
  – Top-level MPMD
– Coupling communications can be called either from within a coupler or directly from a gridded component – useful when it is inconvenient to return from a component in order to perform a coupling operation
– Recursive components for nesting higher resolution regions
– Ensemble management with either concurrent or sequential execution of ensemble members

Grid Remapping
– Fast parallel computation of interpolation weights
– Weight generation is separate from weight application (sparse matrix multiply) for flexibility
– Supports grids that can be represented as combinations of triangular or rectangular elements, in 2D or 3D
– Bilinear, higher order finite element patch recovery (see below), or conservative interpolation options
– Pole options: n-point pole, full circle average, no pole
Higher order method:
– Khoei, S.A., Gharehbaghi, A.R. The superconvergent patch recovery technique and data transfer operators in 3D plasticity problems. Finite Elements in Analysis and Design, 43(8).
– Hung, K.C., Gu, H., Zong, Z. A modified superconvergent patch recovery method and its application to large deformation problems. Finite Elements in Analysis and Design, 40(5-6), 2004.
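As a concrete illustration of the simplest option, here is the textbook bilinear weight formula for a destination point inside a unit rectangular source cell. This is a generic sketch of bilinear interpolation, not ESMF's internal weight generator, which also handles curvilinear and unstructured cells.

```python
# Bilinear weights for a point (x, y) inside a unit cell with corners
# ordered (0,0), (1,0), (0,1), (1,1). A generic sketch, not ESMF code.
def bilinear_weights(x, y):
    return [(1 - x) * (1 - y),  # corner (0,0)
            x * (1 - y),        # corner (1,0)
            (1 - x) * y,        # corner (0,1)
            x * y]              # corner (1,1)

w = bilinear_weights(0.25, 0.5)
corner_values = [10.0, 20.0, 30.0, 40.0]
value = sum(wi * vi for wi, vi in zip(w, corner_values))
print(round(value, 6))  # 22.5
```

Note the weights always sum to 1, so a constant field is reproduced exactly; conservative weights instead preserve area-weighted integrals, which is why they are a separate option.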

Remapping Performance
ESMF parallel conservative remapping is scalable and accurate
– All ESMF interpolation weights are generated using an unstructured finite element mesh
  – Increases flexibility with 2D and 3D grids
  – Adds overhead to bilinear interpolation
– Greatly improved performance over existing conservative methods
Platform: Cray XT4, jaguar, at ORNL
Versions: ESMF_5_2_0_beta_snapshot_07 and SCRIP 1.4
Resolution:
– fv0.47x0.63: CAM Finite Volume grid, 576x384
– ne60np4: 0.5 degree cubed sphere grid, 180x180x6

Weight Generation Options
Ways to generate interpolation weights:
– Online: subroutine calls that calculate weights during the run; the weights can be returned to the user or fed directly into the ESMF sparse matrix multiply
– Offline: an application that generates a netCDF weight file from two netCDF grid files
A summary of grid remapping options is posted at: /esmf_5_1_0_regridding_status.html

Performance of ESMF sparse matrix multiply
The plot shows the ESMF sparse matrix multiply used in the Community Climate System Model (CCSM) for atmosphere-to-ocean grid remapping
Comparable performance to native code, with slightly better scaling at higher processor counts
Versions: ESMF 400rp2, CCSM ccsm4_0_rel08
Resolution: f05_t12 (fv 0.47x0.63 atmosphere/land, tripole 0.1 ocean, i.e. 576x384 atmosphere/land and 3600x2400 ocean)

Noise reduction in CCSM transport
[Figure: interpolation noise in a derivative of the zonal wind stress, plotted against grid index in the latitudinal direction. Black = bilinear; red = higher-order, ESMF v3.1.1; green = higher-order, ESMF v4.0.0.]
– ESMF higher order interpolation weights were used to map from a 2-degree Community Atmosphere Model (CAM) grid to an irregularly spaced POP ocean grid (384x320)
– dTAUx/dy was computed using interpolated fields – this is closely related to the curl of the wind stress, which drives the upper ocean circulation
– Noise is calculated as the deviation of a point from the sum of itself plus its four neighbors
– 33% reduction in noise globally compared to the original bilinear interpolation
(Image generated by the NCAR CGD Oceanography Section)
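The noise metric can be sketched in Python. One caveat: the slide's phrasing is compressed, so this sketch interprets "deviation of a point from the sum of itself plus four neighbors" as deviation from the five-point local mean; the exact CCSM diagnostic may differ.

```python
# Sketch of a five-point local-noise metric, assuming "deviation from
# itself plus four neighbors" means deviation from the 5-point mean.
# This interpretation is an assumption, not the exact CCSM diagnostic.
def noise(field, i, j):
    stencil = [field[i][j], field[i - 1][j], field[i + 1][j],
               field[i][j - 1], field[i][j + 1]]
    return abs(field[i][j] - sum(stencil) / 5.0)

smooth = [[1.0] * 3 for _ in range(3)]
bumpy = [[1.0] * 3 for _ in range(3)]
bumpy[1][1] = 2.0  # a single-point spike

print(noise(smooth, 1, 1), round(noise(bumpy, 1, 1), 6))  # 0.0 0.8
```

A smooth field scores zero; a grid-scale spike scores high, which is exactly the signature interpolation noise leaves in a derivative field.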

Parallel I/O
– I/O is increasingly a bottleneck in high resolution simulations
– ESMF parallel I/O is based on the PIO library developed by NCAR/DOE
– Integrated so that the user only sees ESMF data types:
  – ESMF_ArrayRead(), ESMF_ArrayWrite()
  – ESMF_FieldRead(), ESMF_FieldWrite()
– NetCDF and binary formats
See: PIO User's Guide

Timekeeping
Clocks
– startTime, stopTime, runDuration, timeStep, and many other properties
– Forward and reverse modes for running clocks
Alarms
– Unsticky (turn themselves off after ringing) or sticky alarms
TimeInterval and Time data types with many operators (+, -, /, ==, and more)
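The sticky/unsticky distinction can be made concrete with a small sketch. This is illustrative Python, not the ESMF Fortran API; the class names and integer time steps are simplifications.

```python
# Sketch of a clock driving sticky vs. unsticky alarms.
# Illustrative only; ESMF's Clock/Alarm classes are far richer.
class Alarm:
    def __init__(self, ring_time, sticky):
        self.ring_time, self.sticky, self.ringing = ring_time, sticky, False

    def check(self, now):
        if now == self.ring_time:
            self.ringing = True
        elif not self.sticky:
            self.ringing = False  # unsticky alarms turn themselves off

class Clock:
    def __init__(self, start, stop, step):
        self.now, self.stop, self.step = start, stop, step
        self.alarms = []

    def advance(self):
        self.now += self.step
        for alarm in self.alarms:
            alarm.check(self.now)

clock = Clock(start=0, stop=10, step=1)
sticky, unsticky = Alarm(3, sticky=True), Alarm(3, sticky=False)
clock.alarms = [sticky, unsticky]
for _ in range(5):  # advance to t = 5, past the ring time t = 3
    clock.advance()
print(sticky.ringing, unsticky.ringing)  # True False
```

After the ring time passes, the sticky alarm stays on until explicitly cleared, while the unsticky one has already reset itself.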

Summary of Features
– Fast parallel remapping: unstructured or logically rectangular grids, 2D and 3D, using bilinear, higher order, or conservative methods, integrated (during runtime) or offline (from files)
– Core methods are scalable to tens of thousands of processors
– Supports hybrid (threaded/distributed) programming for optimal performance on many computer architectures
– Multiple coupling and execution modes for flexibility
– Time management utility with many calendars, forward/reverse time operations, alarms, and other features
– Metadata utility that enables comprehensive metadata to be written out in standard formats
– Runs on 24+ platform/compiler combinations, with an exhaustive test suite and documentation
– Couples Fortran or C-based model components

ESMF 5r
– Represents completion of basic ESMF functions
– Expected early next year; alpha versions are out

Outline
– Overview and Architecture
– Support and Extras
– Closer Look at Features
– Projects and Applications

A Common Model Architecture
Increasingly, models in the U.S. follow a common architecture
– Atmosphere, ocean, sea ice, land, and/or wave models are components called by a top-level driver/coupler
– Components use ESMF or ESMF-like interfaces (see left)
Many major U.S. weather and climate models either follow this architecture (CCSM/CESM, COAMPS, NEMS), want to follow it for future coupled systems (NOGAPS), or have a different style of driver but could provide components to this architecture (GEOS-5, FMS)
Even non-ESMF codes now look like ESMF:
– ESMF: ESMF_GridCompRun(gridcomp, importState, exportState, clock, phase, blockingFlag, rc)
– CESM (non-ESMF version): atm_run_mct(clock, gridcomp, importState, exportState)
(argument names changed to show equivalence)
[Figure: ESMF Model Map 2010 – ovals show ESMF components (thin lines) and models (thick lines) at the working prototype level or beyond, color-coded by sponsor (NOAA, Department of Defense, university, NASA, Department of Energy, National Science Foundation), with completed ESMF couplings marked. Models and components shown include CCSM4/CESM (CAM, CLM, POP, CICE, tracer advection), NEMS (NMM-B and GFS atmosphere physics, dynamics, history, and I/O), GEOS-5 and its component hierarchy (FV cubed-sphere dycore, GSI, moist processes, turbulence, radiation, chemistry including GOCART, ocean and land components), WRF, HYCOM, MOM4, FIM, Land Info System, HAF, GAIM, SWAN, ADCIRC, pWASH123, COAMPS, WWIII, NCOM, and NOGAPS.]

A Common Model Architecture
The U.S. Earth system modeling community is converging on a common modeling architecture
– Atmosphere, ocean, sea ice, land, wave, and other models are ESMF or ESMF-like components called by a top-level driver or coupler
– Some models are modularizing further with nested components

Common Model Architecture in Climate Metadata
[Figure: CMIP5 metadata display in the Earth System Grid, developed by the Earth System Curator project in collaboration with E.U. METAFOR]

From Common Model Architecture to Interoperability
ESMF component interfaces alone do not guarantee technical interoperability
– ESMF can be implemented in multiple ways
Also needed:
– A common physical architecture – the scope and relationships of physical components (e.g., is the land surface a subroutine or a component?)
– Metadata conventions and usage conventions (e.g., who can modify component data?)
The next steps for modeling infrastructure involve encoding these conventions in software tools and templates

National Unified Operational Prediction Capability
The National Unified Operational Prediction Capability (NUOPC) is a consortium of operational weather prediction centers
– Developing a standard implementation of ESMF across NASA, NOAA, Navy, Air Force, and other modeling applications
– Defining a target level of interoperability involving multiple aspects of code
EXAMPLES:
– Component interface. Components have a standard calling interface to facilitate generic drivers and communication protocols. Standardization does not include specification of which fields are actually in the import and export states.
– Timekeeping. Metadata and conventions for timekeeping enable modelers to understand, without code inspection, whether components can be coupled together.
From: Final Report from the National Unified Operational Prediction Capability (NUOPC) Interim Committee on Common Model Architecture (CMA), June 18, 2009.

NUOPC Compliance Checker
Designed as a way to encode and check conventions; can be linked in or not at run time
Checks include:
– Presence of the standard ESMF Initialize, Run, and Finalize methods, and the number of phases in each
– Timekeeping conforms to NUOPC conventions
– Fields or FieldBundles (not Arrays/ArrayBundles) are passed between Components
– Which Fields are passed through import States and export States
– Required Component and Field metadata is present
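The first of those checks, presence of the standard methods, can be sketched in a few lines. This is an illustrative Python sketch of the idea, not the real NUOPC Compliance Checker, which is distributed with ESMF and operates on Fortran components at run time.

```python
# Sketch of one compliance check: does a component define the three
# standard methods? Illustrative only; not the real NUOPC checker.
REQUIRED_METHODS = ("initialize", "run", "finalize")

def check_component(comp):
    """Return the list of missing standard methods (empty = compliant)."""
    return [m for m in REQUIRED_METHODS
            if not callable(getattr(comp, m, None))]

class GoodComp:
    def initialize(self): pass
    def run(self): pass
    def finalize(self): pass

class BadComp:
    def run(self): pass  # missing initialize and finalize

print(check_component(GoodComp()))  # []
print(check_component(BadComp()))   # ['initialize', 'finalize']
```

Checking metadata and timekeeping conventions follows the same pattern: interrogate the component through its standard interface and report deviations rather than failing hard.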

Global Interoperability Program
The Global Interoperability Program (sponsor: NOAA)
– Supports multi-agency projects that cross domain boundaries and integrate along modeling workflows
– Supports ESMF (and other) development and applications
– New work with ESMF includes exploration of:
  – self-documenting, end-to-end workflows
  – integration and interfacing with other frameworks
  – increasing usability
  – new computing platforms and algorithms
[Figure: ESMF-enabled CCSM workflow implemented using Kepler]