
CSEG Update
Mariana Vertenstein
CCSM Software Engineering Group

Improvements to CCSM scripts
- Addition of a timing tool that provides automated information to help determine the load balance, throughput, and cost of a run.
- Improvements made to the CCSM script infrastructure:
  - Provide the ability to define new "component sets/modes" as command-line options (a sketch of the idea follows this slide). This makes it easier to run and test new CCSM science (such as the addition of component biogeochemistry and atmospheric chemistry).
  - Simplified user specification of non-default task/thread settings.
  - These changes improve the extensibility and robustness of the scripts.
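To make the component-set idea concrete, here is a minimal sketch, assuming hypothetical set names and component labels (the actual CCSM script options and component identifiers may differ):

```python
# Minimal sketch of mapping a "component set" name, given on the command
# line, to a choice of active vs. data components. Set names and component
# labels here are illustrative, not the actual CCSM script options.
import argparse

COMPONENT_SETS = {
    "fully_coupled": {"atm": "cam", "lnd": "clm", "ocn": "pop2",  "ice": "cice"},
    "atm_only":      {"atm": "cam", "lnd": "clm", "ocn": "docn7", "ice": "dice7"},
    "lnd_only":      {"atm": "datm7", "lnd": "clm", "ocn": "docn7", "ice": "dice7"},
}

parser = argparse.ArgumentParser()
parser.add_argument("--compset", choices=COMPONENT_SETS, default="fully_coupled")
args = parser.parse_args()
print(COMPONENT_SETS[args.compset])
```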

Improvements to CCSM scripts (cont.)
- The CCSM test framework has been rewritten using the new functionality.
- Introduced new testing functionality (e.g., an auto-promotion test).
- Introduced new "test suites" that now permit different testing levels:
  - development, pre-tag, post-tag, monthly
  - We are currently in the process of defining the contributions to each test suite (a sketch of the idea follows this slide).
- Benefits:
  - Regression testing will be easier to perform.
  - Straightforward addition of tests for new CCSM scenarios.
  - Easier testing of new scenarios (e.g., CLM-CN with CO2 exchange).
  - More frequent testing of new science.
  - A more robust CCSM code base.
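One natural way to express tiered test suites is as a configuration mapping each level to the tests it runs; the test identifiers below are invented placeholders, since the actual contributions to each suite were still being defined at the time of this talk:

```python
# Illustrative sketch of tiered test suites. Test names are made up for
# the example; they are not the actual CCSM test list.
TEST_SUITES = {
    "development": ["smoke_test"],
    "pre_tag":     ["smoke_test", "exact_restart_test"],
    "post_tag":    ["smoke_test", "exact_restart_test", "multi_resolution_test"],
    "monthly":     ["smoke_test", "exact_restart_test", "multi_resolution_test",
                    "long_run_climate_test"],
}

def tests_for(level: str) -> list:
    """Each higher level runs everything the lower levels run, plus more."""
    return TEST_SUITES[level]

print(tests_for("pre_tag"))
```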

Porting and Performance
- ORNL Cray X1 (phoenix)
  - Both the release and development code bases have been ported. The CAM FV dycore has been ported in the development code base; porting this dycore in CCSM proved to be very challenging.
  - 30 years/day on 200 processors (ignoring performance fluctuations), versus 13 years/day on 108 processors on NCAR's bluevista.
  - A 400-year FV control simulation has been carried out.
- ORNL Cray XT3 (jaguar)
  - The port of the development code base is starting now. The release code base is running, but will require a validation.
- NCAR IBM (bluevista)
  - A major new OS upgrade will permit over-subscription of nodes. This provides an opportunity for major performance improvements, but will also result in a significantly larger parameter space for performance optimization.
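For cross-machine comparison, the quoted throughputs can be restated as processor-hours per simulated year; the processor counts and years/day come from the slide above, and the conversion is simple arithmetic:

```python
# Back-of-the-envelope cost comparison from the throughput numbers quoted
# above (years/day and processor counts are from the slide; the cost
# metric itself is a standard convention, not from the source).
def proc_hours_per_sim_year(procs: int, years_per_day: float) -> float:
    """Processor-hours consumed per simulated year."""
    return procs * 24.0 / years_per_day

for machine, procs, ypd in [("ORNL Cray X1 (phoenix)", 200, 30.0),
                            ("NCAR IBM (bluevista)", 108, 13.0)]:
    print(f"{machine}: {proc_hours_per_sim_year(procs, ypd):.0f} proc-hrs/yr")
# phoenix: 160 proc-hrs per simulated year; bluevista: ~199
```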

Porting and Performance (future)
- NCAR IBM Bluegene
  - Will require a single-executable concurrent system.
  - Will require parallel I/O (NetCDF and binary).

Data Model Project
- The serial rewrite of all data models (datm7, dlnd7, dice7, docn7) is complete!
  - datm7 can now duplicate stand-alone CLM functionality (in serial mode).
  - docn7 now has both DOM and SOM functionality (SOM scientific verification is close to complete).
- All data components now have the same functionality:
  - Can perform spatial interpolation from the input-data resolution to the model resolution.
  - Can cycle input data over a subset of years (see the sketch after this slide).
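The year-cycling behavior amounts to a modular mapping from model years onto the window of available input-data years; this is a minimal sketch with illustrative parameter names, not the actual data-model namelist interface:

```python
# A minimal sketch of "cycle over a subset of years". The parameter names
# (year_first, year_last, year_align) are illustrative assumptions; the
# actual data-model configuration variables may differ.

def forcing_year(model_year: int, year_first: int, year_last: int,
                 year_align: int) -> int:
    """Map a model year onto the cycling window of input-data years."""
    n = year_last - year_first + 1
    return year_first + (model_year - year_align) % n

# e.g. cycle 1948-1972 forcing data while the model runs for centuries
assert forcing_year(2001, 1948, 1972, 2001) == 1948
assert forcing_year(2010, 1948, 1972, 2001) == 1957
assert forcing_year(2026, 1948, 1972, 2001) == 1948  # wraps after 25 years
```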

Data Model Project (what is next)
- Parallelize the new serial data models.
- Incorporate the parallel data models into the sequential CCSM currently under development.
- Replace component-specific data models with the CCSM data models. As examples:
  - Replace SOM/DOM in CAM with DOCN7.
  - Replace the forcing atm driver in CLM with DATM7.
- Using only one set of data models will result in more consistent science and remove existing code duplication.

Single Executable Concurrent CCSM
- CSEG is leveraging the work that Helen He and Chris Ding (SciDAC) have done in order to create a single-executable concurrent implementation of the CCSM development code.
- A somewhat different implementation will be produced to satisfy current CCSM requirements.
- Aim to have a development CCSM tag (including all CCSM components) by mid-July.
- Aim to have a CCSM3.0-release-based version released later in the summer.
- The existence of a single-executable system should improve CCSM portability and debugging.
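Although the slide does not describe the implementation, single-executable concurrent systems conventionally split one MPI job into per-component communicators; the following is a generic sketch of that pattern with a made-up processor layout, not the CCSM code:

```python
# Generic sketch (not the CCSM implementation) of the usual single-
# executable concurrent pattern: one MPI job is split into disjoint
# communicators, one per component, and each rank runs only its component.
from mpi4py import MPI

world = MPI.COMM_WORLD
rank = world.Get_rank()

# Hypothetical static layout: ranks 0-63 atm, 64-95 ocn, 96-111 ice, rest lnd.
def component_for(rank: int) -> str:
    if rank < 64:  return "atm"
    if rank < 96:  return "ocn"
    if rank < 112: return "ice"
    return "lnd"

color = ["atm", "ocn", "ice", "lnd"].index(component_for(rank))
comp_comm = world.Split(color, rank)  # each component gets its own communicator
# ... each component's init/run/final would then be driven on comp_comm,
# with coupling traffic between components handled via the world group.
print(f"rank {rank} runs {component_for(rank)} on a communicator of size "
      f"{comp_comm.Get_size()}")
```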

Single Executable Sequential CCSM
- Major progress has been made in replacing stand-alone CAM with a "pseudo-sequential CCSM".
- Both ESMF and MCT will be examined as coupling frameworks. We are committed to creating a sequential, interoperable CCSM utilizing the ESMF framework.
- ESMF superstructure coding for the Stage 1 Evaluation Plan is beginning now. The superstructure design is already in place. The estimated completion date is the end of summer. The plan is to incorporate the ESMF coupling framework on the CAM trunk upon successful completion of the Stage 1 evaluation effort.

Original stand-alone CAM architecture (CAM3.0)
[Diagram: a top-level CAM Driver calls Physics and Dynamics; the surface models CLM, CAM-ICE, and CAM-OCN hang off Physics.]
Note that the surface models are invoked from within CAM physics, not from a top-level driver.

Pseudo-Sequential CCSM Top-Level Architecture
[Diagram: a framework-dependent (FD) Application Driver at the top; beneath it, FD couplers with ATM, OCN, LND, and ICE mergers; thin FD coupling layers connect to CLM, CAM-ICE, CAM-OCN, and CAM (Physics/Dynamics).]
- Introduce a top-level framework-dependent (FD) application driver to replace the CAM driver.
- Introduce a top-level ESMF clock to coordinate the time evolution of all components.
- Introduce a new flexible and extensible "thin" coupling-layer design.

Sequential CCSM (cont.)
- New top-level application driver (see the sketch after this slide):
  - Independent of CAM data structures.
  - Time evolution based upon the ESMF general time-management utilities.
  - Utilizes CCSM share code for reading input.
  - Initial implementation: MCT. Near-term implementation: ESMF.
- New surface coupling layer.
  - Initial implementation: MCT. Near-term implementation: ESMF.
- New directory structure for sequential CCSM.
  - Applies to both MCT and ESMF.
- Inter-component domain checking in the coupling layer.
  - Initial implementation: MCT. Near-term implementation: ESMF.
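A conceptual sketch of such a sequential driver, with illustrative component and method names rather than the actual CCSM/ESMF API:

```python
# Conceptual sketch of a sequential top-level driver: a single clock
# coordinates components that each expose init/run/final methods.
# Names are illustrative, not the actual CCSM or ESMF interfaces.
from dataclasses import dataclass

@dataclass
class Clock:
    """Stand-in for the ESMF clock that coordinates all components."""
    current_step: int
    stop_step: int
    def is_stop_time(self) -> bool: return self.current_step >= self.stop_step
    def advance(self) -> None: self.current_step += 1

class Component:
    def __init__(self, name: str): self.name = name
    def init(self, clock):  print(f"init  {self.name}")
    def run(self, clock):   print(f"run   {self.name} step {clock.current_step}")
    def final(self, clock): print(f"final {self.name}")

def driver():
    clock = Clock(current_step=0, stop_step=3)
    components = [Component(n) for n in ("atm", "lnd", "ice", "ocn")]
    for c in components: c.init(clock)
    while not clock.is_stop_time():          # sequential execution: one
        for c in components: c.run(clock)    # component after another
        clock.advance()
    for c in components: c.final(clock)

driver()
```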

CAM Update
- Implemented support for non-lat-lon grids (Jim Edwards and Pat Worley). Non-lat-lon support was added to:
  - the boundary-data and aerosol-data interpolation code;
  - the physics/dynamics coupling layer.
  - Only aqua-planet mode is currently supported.
- Significant FV dycore interface refactoring has been implemented:
  - Only data on the XY decomposition is now exposed outside the portable dycore.
  - Added dynamics import/export states, and a dynamics component module with init, run, and final methods.
- The CAM testing framework was extended to add new platforms (e.g., phoenix) and to perform overnight regression testing.
- New features were added to CAM tropospheric MOZART.

CAM Update (what is next)
- Refactoring of CAM to run with new dycores.
- Incorporation of the HOMME dycore into CAM.
- Refactoring CAM's history module to encompass non-lat-lon output.
- Ability to run pseudo-sequential CCSM (not just aqua-planet) with non-lat-lon CAM.
- Incorporation of parallel I/O into CAM.
  - Replace all binary I/O with NetCDF I/O.
- Creation of a new tool to generate CAM namelists (the tool should extend easily to other models if desired); a sketch of the idea follows this slide.
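A namelist-generation tool essentially serializes a table of settings into Fortran namelist syntax; the sketch below is illustrative, and the group and variable names are hypothetical rather than CAM's actual namelist contents:

```python
# Minimal sketch of what a namelist-generation tool does: turn a mapping
# of settings into Fortran namelist text. The group and variable names
# below are hypothetical, not CAM's actual namelist.

def write_namelist(group: str, settings: dict) -> str:
    def fmt(v):
        if isinstance(v, bool): return ".true." if v else ".false."
        if isinstance(v, str):  return f"'{v}'"
        return str(v)
    body = "\n".join(f"  {k} = {fmt(v)}" for k, v in settings.items())
    return f"&{group}\n{body}\n/\n"

print(write_namelist("cam_inparm", {
    "ncdata": "cami_0000-01-01.nc",   # hypothetical initial-conditions file
    "dtime": 1800,                    # hypothetical timestep in seconds
    "print_energy_errors": False,
}))
```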

CAM Update (what is next)
- SCAM (Single Column Atmosphere Model) refactoring
  - The goal is to produce more flexible and maintainable interfaces.
- Adoption of new components
  - Incorporation of the new ice-sheet model (GLIMMER).
    - The primary work will be done by Bill Lipscomb.
    - GLIMMER has already been incorporated into the CAM build system.
  - Replacement of CAM-CSIM with CICE, and of CAM-SOM/DOM with DOCN7.

CLM Update
- Finemesh grids have been implemented in CLM. This gives CLM the new capability to run on its own independent grid. The implementation follows the scheme of Hahmann and Dickinson.
  - CLM still couples to CAM or CCSM via the CAM coarse grid. The mapping is done within the CLM code base (see the upscaling sketch after this slide).
  - The implementation does not change answers when the CLM fine grid is identical to the CAM grid.
- Stand-alone CAM runs have been made with a T42 coarse grid and a half-degree finemesh grid. Results are encouraging.
- New downscaling and upscaling interactions are being implemented.
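The upscaling half of such a scheme is, at its core, area-weighted averaging of fine-grid values onto coarse cells; this sketch assumes a uniform refinement factor purely for illustration, whereas the real scheme must handle general grid overlaps:

```python
# Conceptual sketch of the upscaling step in a finemesh scheme:
# area-weighted averaging of fine-grid land values onto the coarse
# (atmosphere) grid. A uniform refinement factor is assumed purely for
# illustration; this is not the CLM implementation.
import numpy as np

def upscale(fine: np.ndarray, area: np.ndarray, factor: int) -> np.ndarray:
    """Average factor x factor blocks of fine cells, weighted by cell area."""
    ny, nx = fine.shape
    f = fine.reshape(ny // factor, factor, nx // factor, factor)
    a = area.reshape(ny // factor, factor, nx // factor, factor)
    return (f * a).sum(axis=(1, 3)) / a.sum(axis=(1, 3))

fine = np.random.rand(8, 16)        # e.g. a half-degree land field
area = np.ones_like(fine)           # equal-area cells for simplicity
coarse = upscale(fine, area, 4)     # -> 2 x 4 coarse field
print(coarse.shape)
```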

CLM Update (what is next)
- A nested-grid capability is being added to CLM, so the model can be run at spatially varying resolution to optimize cost.
- A new prognostic canopy air space scheme is being implemented.
  - The scientific formulation is being finalized.
  - The software implementation (Forrest Hoffman) will start as soon as the scientific formulation is completed.
- The CLM offline testing framework will be rewritten in order to test the numerous new features that are being introduced into the system.
- Urban code will be incorporated into the main line of development.

POP2 Update
- Completed the incorporation of major CCSM features into the LANL pop2.1.alpha code. The CCSM POP2 code is now in the CCSM SVN repository and the current CCSM scripts.
- Added:
  - A new tracer-advection scheme (1D Lax-Wendroff with 1D flux limiters); an illustrative implementation follows this slide.
  - Near-surface eddy flux and enhanced deep mixing.
  - Support for a new 1-degree grid with Galapagos Islands topography (gx1v4).
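For reference, a textbook 1D flux-limited Lax-Wendroff advection step looks like the following; this is an illustrative implementation (with a van Leer limiter chosen arbitrarily, since the slide does not specify one), not POP2's code:

```python
# Illustrative 1-D flux-limited Lax-Wendroff advection step for
# q_t + u q_x = 0 with u > 0 and periodic boundaries. Textbook scheme,
# not POP2 code; the limiter choice (van Leer) is an assumption.
import numpy as np

def van_leer(r: np.ndarray) -> np.ndarray:
    return (r + np.abs(r)) / (1.0 + np.abs(r))

def step(q: np.ndarray, u: float, dx: float, dt: float) -> np.ndarray:
    c = u * dt / dx                                  # Courant number
    dq = np.roll(q, -1) - q                          # q[i+1] - q[i]
    with np.errstate(divide="ignore", invalid="ignore"):
        r = np.where(dq != 0, np.roll(dq, 1) / dq, 0.0)  # upwind smoothness ratio
    phi = van_leer(r)
    # upwind flux plus limited anti-diffusive (Lax-Wendroff) correction,
    # evaluated at the i+1/2 cell faces
    flux = u * q + 0.5 * u * (1.0 - c) * phi * dq
    return q - (dt / dx) * (flux - np.roll(flux, 1))

x = np.linspace(0.0, 1.0, 200, endpoint=False)
q = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)        # square pulse
for _ in range(100):
    q = step(q, u=1.0, dx=x[1] - x[0], dt=0.4 * (x[1] - x[0]))
print(q.min(), q.max())  # limiter keeps the solution essentially bounded
```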

POP2 Update (what is next)
- Inclusion of a more efficient barotropic solver designed by John Dennis.
- Reduced equatorial viscosity.
- A new vertical grid.
- Systematic exploration of the above features.
- Plans for incorporating the ecosystem model into POP2 will be developed during the CCSM workshop.

20 CICE Update  PCWG has decided to incorporate CICE as the new standard CCSM ice model. New name will be “ Community Ice CodE ”.  Testing has been done using CICE 3.1 to establish main differences with CSIM 5.0.  CICE 3.1 changes answers significantly, but still within the realm of the same climate.  CICE 3.14 has additional bug fixes and a few new physical parameterizations. This is the version that will be adopt after more testing is completed.  Goal is to incorporate CICE 4 (with new data structures) into CCSM and into pseudo-sequential CCSM.  PCWG has decided to incorporate CICE as the new standard CCSM ice model. New name will be “ Community Ice CodE ”.  Testing has been done using CICE 3.1 to establish main differences with CSIM 5.0.  CICE 3.1 changes answers significantly, but still within the realm of the same climate.  CICE 3.14 has additional bug fixes and a few new physical parameterizations. This is the version that will be adopt after more testing is completed.  Goal is to incorporate CICE 4 (with new data structures) into CCSM and into pseudo-sequential CCSM.