
Introduction to the Earth System Modeling Framework
V. Balaji (GFDL), Cecelia DeLuca (NCAR), Chris Hill (MIT)
Coastal Inundation Workshop, 25 January 2006

2 OUTLINE
• Overview
• ESMF and the Community
• Development Status
• Design and Principles of ESMF
• Resources

3 Motivation and Context
• In climate research and NWP: increased emphasis on detailed representation of individual physical processes, which requires many teams of specialists to contribute components to an overall modeling system.
• In computing technology: increasing hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures.
• In software: development of first-generation frameworks, such as FMS, GEMS, CCA and WRF, that encourage software reuse and interoperability.

4 What is ESMF?
• ESMF provides tools for turning model codes into components with standard interfaces and standard drivers (a sketch of such an interface follows this slide).
• ESMF provides data structures and common utilities that components use for routine services such as data communications, regridding, time management and message logging.
ESMF GOALS
1. Increase scientific productivity by making model components much easier to build, combine, and exchange, and by enabling modelers to take full advantage of high-end computers.
2. Promote new scientific opportunities and services through community building and increased interoperability of codes, with impacts on collaboration, code validation and tuning, teaching, and migration from research to operations.
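For concreteness, here is a minimal sketch of the standard component interface. The module and routine names (atm_component, AtmSetServices, AtmInit, AtmRun, AtmFinal) are hypothetical, and the calls follow the Fortran API of recent ESMF releases; releases from the era of this talk spell some names differently (e.g. "use ESMF_Mod" and entry-point constants such as ESMF_SETINIT).

    ! Minimal sketch: every component exposes the same standard methods,
    ! registered through a public "set services" routine.
    module atm_component
      use ESMF
      implicit none
    contains
      subroutine AtmSetServices(gcomp, rc)
        type(ESMF_GridComp)  :: gcomp
        integer, intent(out) :: rc
        ! Register the user routines behind the three standard methods.
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
                                        userRoutine=AtmInit, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
                                        userRoutine=AtmRun, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, &
                                        userRoutine=AtmFinal, rc=rc)
      end subroutine AtmSetServices

      ! All standard methods share one signature; data enters and leaves
      ! only through the import and export States.
      subroutine AtmInit(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS    ! create grids, fields, internal state here
      end subroutine AtmInit

      subroutine AtmRun(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS    ! advance the model one coupling interval
      end subroutine AtmRun

      subroutine AtmFinal(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS    ! release resources
      end subroutine AtmFinal
    end module atm_component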

5 Application Example: GEOS-5 AGCM
• Each box is an ESMF component.
• Every component has a standard interface so that it is swappable.
• Data in and out of components are packaged as state types with user-defined fields.
• New components can easily be added to the hierarchical system.
• Coupling tools include regridding and redistribution methods.

6 Why Should I Adopt ESMF If I Already Have a Working Model?
• There is an emerging pool of other ESMF-based science components that you will be able to interoperate with to create applications; a framework for interoperability is only as valuable as the set of groups that use it.
• It will reduce the amount of infrastructure code that you need to write and maintain, and allow you to focus more resources on science development.
• ESMF provides solutions to two of the hardest problems in model development: structuring large, multi-component applications so that they are easy to use and extend, and achieving performance portability on a wide variety of parallel architectures.
• It may be better software (better features, better performance portability, better tested, better documented and better funded into the future) than the infrastructure software that you are currently using.
• Community development and use means that the ESMF software is widely reviewed and tested, and that you can leverage contributions from other groups.

7 New ESMF-Based Programs: Funding for Science, Adoption, and Core Development

Modeling, Analysis and Prediction Program for Climate Variability and Change
Sponsor: NASA. Partners: University of Colorado at Boulder, University of Maryland, Duke University, NASA Goddard Space Flight Center, NASA Langley, NASA Jet Propulsion Laboratory, Georgia Institute of Technology, Portland State University, University of North Dakota, Johns Hopkins University, Goddard Institute for Space Studies, University of Wisconsin, Harvard University, and more.
The NASA Modeling, Analysis and Prediction Program will develop an ESMF-based modeling and analysis environment to study climate variability and change.

Battlespace Environments Institute
Sponsor: Department of Defense. Partners: DoD Naval Research Laboratory, DoD Fleet Numerical, DoD Army ERDC, DoD Air Force Weather Agency.
The Battlespace Environments Institute is developing integrated Earth and space forecasting systems that use ESMF as a standard for component coupling.

Integrated Dynamics through Earth's Atmosphere and Space Weather Initiatives
Sponsors: NASA, NSF. Partners: University of Michigan/SWMF, Boston University/CISM, University of Maryland, NASA Goddard Space Flight Center, NOAA CIRES.
ESMF developers are working with the University of Michigan and others to develop the capability to couple together Earth and space software components.

Spanning the Gap Between Models and Datasets: Earth System Curator
Sponsor: NSF. Partners: Princeton University, Georgia Institute of Technology, Massachusetts Institute of Technology, PCMDI, NOAA GFDL, NOAA PMEL, DOE ESG.
The ESMF team is working with data specialists to extend and unify climate model and dataset descriptors, and to create, based on this metadata, an end-to-end knowledge environment.

8 ESMF Impacts
ESMF impacts a very broad set of research and operational areas that require high-performance, multi-component modeling and data assimilation systems, including:
• Climate prediction
• Weather forecasting
• Seasonal prediction
• Basic Earth and planetary system research at various time and spatial scales
• Emergency response
• Ecosystem modeling
• Battlespace simulation and integrated Earth/space forecasting
• Space weather (through coordination with related space weather frameworks)
• Other HPC domains, through migration of non-domain-specific capabilities from ESMF, facilitated by ESMF interoperability with generic frameworks, e.g. CCA

9 OUTLINE
• Overview
• ESMF and the Community
• Development Status
• Design and Principles of ESMF
• Resources

10 Collaborative Development
• Users define development priorities via a Change Review Board.
• Users contribute to the framework design through public design reviews.
• Users help to test the framework implementation.
• ~15% of ESMF source code is currently from user contributions:
◦ IO from WRF
◦ Resource file manager from GMAO
◦ Regridding from Los Alamos
◦ 3D grids from University of Michigan
◦ C++ component interfaces from NRL
◦ More contributions in progress

11 Open Source Development
• Open source license (GPL)
• Open source environment (SourceForge)
• Open repositories: web-browsable CVS repositories accessible from the ESMF website
◦ for source code
◦ for contributions (currently porting contributions and performance testing)
• Open testing: tests are bundled with the ESMF distribution and can be run by users
• Open port status: results of nightly tests on many platforms are web-browsable
• Open metrics: test coverage, lines of code, and requirements status are updated regularly and are web-browsable

12 Open Source Constraints
• ESMF does not allow unmoderated check-ins to its main source CVS repository (though there is minimal check-in oversight for the contributions repository).
• ESMF has a co-located, line-managed Core Team whose members are dedicated to framework implementation and support; it does not rely on volunteer labor.
• ESMF actively sets priorities based on user needs and feedback.
• ESMF requires that contributions follow project conventions and standards for code and documentation.
• ESMF schedules regular releases and meetings.
The above are necessary for development to proceed at the pace desired by sponsors and users, and to provide the level of quality and customer support necessary for codes in this domain.

13 Related Projects
PRISM
• An ongoing European Earth system modeling infrastructure project involving current state-of-the-art atmosphere, ocean, sea-ice, atmospheric chemistry, land-surface and ocean-biogeochemistry models.
• 22 partners: leading climate researchers and computer vendors, including MPI, KNMI, UK Met Office, CERFACS, ECMWF, DMI.
• ESMF and PRISM are working together through a NASA MAP grant to merge frameworks and develop common conventions.
• ESMF and PRISM lead a WCRP/WMP task force on strategies for developing international common modeling infrastructure.
CCA
• The Common Component Architecture is creating a minimal interface and sets of tools for linking high-performance components. CCA can be used to implement frameworks and standards developed in specific domains (such as ESMF).
• Collaborators include LANL, ANL, LLNL, ORNL, Sandia, University of Tennessee, and many more.
• There is ongoing ESMF collaboration with CCA/LANL on language interoperability; a working prototype demonstrating CCA/ESMF interoperability was presented at SC2003.

14 OUTLINE
• Overview
• ESMF and the Community
• Development Status
• Design and Principles of ESMF
• Resources

15 ESMF Development Status
• Overall architecture well-defined and well-accepted
• Components and low-level communications stable
• Rectilinear grids with regular and arbitrary distributions implemented
• On-line parallel regridding (bilinear, 1st-order conservative) completed and optimized
• Other parallel methods, e.g. halo, redistribution, low-level comms, implemented
• Utilities such as time manager, logging, and configuration manager usable and adding features
• Virtual machine with interface to shared / distributed memory implemented; hooks for load balancing implemented

16 ESMF Distribution Summary
• Fortran interfaces and complete documentation
• Many C++ interfaces, no manuals yet
• Serial or parallel execution (mpiuni stub library)
• Sequential or concurrent execution
• Single executable (SPMD) and limited multiple executable (MPMD) support

17 ESMF Platform Support
• IBM AIX (32 and 64 bit addressing)
• SGI IRIX64 (32 and 64 bit addressing)
• SGI Altix (64 bit addressing)
• Cray X1 (64 bit addressing)
• Compaq OSF1 (64 bit addressing)
• Linux Intel (32 and 64 bit addressing, with mpich and lam)
• Linux PGI (32 and 64 bit addressing, with mpich)
• Linux NAG (32 bit addressing, with mpich)
• Linux Absoft (32 bit addressing, with mpich)
• Linux Lahey (32 bit addressing, with mpich)
• Mac OS X with xlf (32 bit addressing, with lam)
• Mac OS X with absoft (32 bit addressing, with lam)
• Mac OS X with NAG (32 bit addressing, with lam)
• User-contributed g95 support

18 Some Metrics…
• The test suite currently consists of ~1800 unit tests, ~15 system tests, and ~35 examples, and runs every night on ~12 platforms.
• ~291 ESMF interfaces implemented; ~278 of them (~95%) fully or partially tested.
• ~170,000 source lines of code (SLOC).
• ~1000 downloads.

19 ESMF Near-Term Priorities, FY06
• Read/write interpolation weights and more flexible interfaces for regridding
• Support for general curvilinear coordinates
• Reworked design and implementation of array/grid/field interfaces and array-level communications
• Grid masks and merges
• Unstructured grids
• Asynchronous I/O

20 Planned ESMF Extensions
1. Looser couplings: support for multiple-executable and Grid-enabled versions of ESMF
2. Support for representing, partitioning, communicating with, and regridding unstructured and semi-structured grids
3. Support for advanced I/O, including asynchronous I/O, checkpoint/restart, and multiple archival mechanisms (e.g. NetCDF, HDF5, binary)
4. Support for data assimilation systems, including data structures for observational data and adjoints for ESMF methods
5. Support for nested, moving grids and adaptive grids
6. Support for regridding in three dimensions and between different coordinate systems
7. Ongoing optimization and load balancing

21 OUTLINE
• Overview
• ESMF and the Community
• Development Status
• Design and Principles of ESMF
• Resources

22 Computational Characteristics of Weather/Climate
• Mix of global transforms and local communications.
• Load balancing for diurnal cycle and event (e.g. storm) tracking.
• Applications typically require 10s of GFLOPS and 100s of PEs, but can go to 10s of TFLOPS and 1000s of PEs.
• Required Unix/Linux platforms span laptop to Earth Simulator.
• Multi-component applications: component hierarchies, ensembles, and exchanges; components in multiple contexts.
• Data and grid transformations between components.
• Applications may be MPMD/SPMD, concurrent/sequential, or combinations.
• Parallelization via MPI, OpenMP, shmem, or combinations.
• Large applications (typically 100,000+ lines of source code).
[Figure: a seasonal forecast application mapped onto platforms, with a coupler connecting assim_atm (assim; atm with physics and dycore), atm-land, sea ice, and ocean components]

23 Design Strategy: Hierarchical Applications
Since each ESMF application is also a Gridded Component, entire ESMF applications can be nested within larger applications. This strategy can be used to systematically compose very large, multi-component codes (see the sketch below).
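For illustration, a sketch of a parent component's initialize method creating a complete child component inside itself. The names are hypothetical (ChildSetServices is assumed to be a registration routine like AtmSetServices in the earlier sketch), and the calls follow recent ESMF releases:

    ! Sketch: a parent Gridded Component nests a child Gridded Component.
    subroutine ParentInit(gcomp, importState, exportState, clock, rc)
      use ESMF
      use child_component, only: ChildSetServices   ! hypothetical module
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      type(ESMF_GridComp), save :: child
      type(ESMF_State),    save :: childImp, childExp

      ! The child is a full ESMF application in its own right.
      child = ESMF_GridCompCreate(name="child model", rc=rc)
      call ESMF_GridCompSetServices(child, userRoutine=ChildSetServices, rc=rc)
      childImp = ESMF_StateCreate(name="child import", rc=rc)
      childExp = ESMF_StateCreate(name="child export", rc=rc)
      call ESMF_GridCompInitialize(child, importState=childImp, &
                                   exportState=childExp, clock=clock, rc=rc)
    end subroutine ParentInit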

24 Design Strategy: Modularity
• Gridded Components don't have access to the internals of other Gridded Components, and don't store any coupling information.
• Gridded Components pass their States to other components through their argument list.
• Since components are not hard-wired into particular configurations and do not carry coupling information, they can be used more easily in multiple contexts: the same atm_comp might appear in an NWP application, in seasonal prediction, or standalone for basic research.
A sketch of this rule in code follows.
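Fleshing out the AtmRun stub from the earlier sketch shows the rule in practice; the field name "sst" is hypothetical, and the calls follow recent ESMF releases:

    ! Sketch: a component sees only its own import and export States.
    subroutine AtmRun(gcomp, importState, exportState, clock, rc)
      use ESMF
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      type(ESMF_Field)     :: sst

      ! Consume whatever the driver placed in the import State; this code
      ! never names, or links against, the component that produced it.
      call ESMF_StateGet(importState, itemName="sst", field=sst, rc=rc)
      ! ... advance the model, then write results to exportState ...
      rc = ESMF_SUCCESS
    end subroutine AtmRun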

25 Design Strategy: Flexibility
• Users write their own drivers as well as their own Gridded Components and Coupler Components.
• Users decide on their own control flow, e.g. hub-and-spokes coupling or pairwise coupling (a driver sketch follows).
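A structural sketch of a user-written, pairwise-coupled driver. The component names are hypothetical; SetServices, State and Clock creation, and the Initialize/Finalize calls are elided for brevity; recent-release Fortran API:

    ! Sketch: the user, not the framework, owns the control flow.
    program driver
      use ESMF
      implicit none
      type(ESMF_GridComp) :: atm, ocn
      type(ESMF_CplComp)  :: a2o
      type(ESMF_State)    :: atmExp, ocnImp
      type(ESMF_Clock)    :: clock
      integer             :: rc

      call ESMF_Initialize(rc=rc)
      atm = ESMF_GridCompCreate(name="atm", rc=rc)
      ocn = ESMF_GridCompCreate(name="ocn", rc=rc)
      a2o = ESMF_CplCompCreate(name="atm2ocn", rc=rc)
      ! ... SetServices, State/Clock creation, Initialize calls ...
      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
        ! Pairwise coupling sequence chosen by the user:
        call ESMF_GridCompRun(atm, exportState=atmExp, clock=clock, rc=rc)
        call ESMF_CplCompRun(a2o, importState=atmExp, exportState=ocnImp, &
                             clock=clock, rc=rc)
        call ESMF_GridCompRun(ocn, importState=ocnImp, clock=clock, rc=rc)
        call ESMF_ClockAdvance(clock, rc=rc)
      end do
      call ESMF_Finalize(rc=rc)
    end program driver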

26 Design Strategy: Communication Within Components
All communication in ESMF is handled within components. This means that if an atmosphere is coupled to an ocean, then the Coupler Component is defined on both the atmosphere and ocean processors (see the sketch below).
[Figure: an atm2ocn_coupler spanning the processors of atm_comp and ocn_comp]
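One way to express such a layout (the PET ranges are hypothetical; recent-release Fortran API): each model component is created on a subset of PETs (persistent execution threads), and the coupler on their union.

    ! Sketch: atm on PETs 0-3, ocn on PETs 4-7, coupler on the union.
    subroutine create_components()
      use ESMF
      type(ESMF_GridComp) :: atm, ocn
      type(ESMF_CplComp)  :: a2o
      integer :: rc, i

      atm = ESMF_GridCompCreate(name="atm", petList=(/0,1,2,3/), rc=rc)
      ocn = ESMF_GridCompCreate(name="ocn", petList=(/4,5,6,7/), rc=rc)
      ! The coupler must run wherever either side's data lives.
      a2o = ESMF_CplCompCreate(name="atm2ocn", petList=(/(i, i=0,7)/), rc=rc)
    end subroutine create_components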

27 Design Strategy: Uniform Communication API
• The same programming interface is used for shared memory, distributed memory, and combinations thereof. This buffers the user from variations and changes in the underlying platforms.
• The idea is to create interfaces that are performance-sensitive to machine architectures without being discouragingly complicated.
• Users can use their own OpenMP and MPI directives together with ESMF communications.
• ESMF sets up communications in a way that is sensitive to the computing platform and the application structure (see the Virtual Machine sketch below).
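The ESMF Virtual Machine (VM) is the abstraction behind this uniformity; a minimal sketch of querying it (recent-release Fortran API):

    ! Sketch: query the global VM instead of calling MPI or a threading
    ! library directly; the same code runs on shared or distributed memory.
    subroutine report_layout()
      use ESMF
      type(ESMF_VM) :: vm
      integer :: localPet, petCount, rc

      call ESMF_VMGetGlobal(vm, rc=rc)
      call ESMF_VMGet(vm, localPet=localPet, petCount=petCount, rc=rc)
      print *, "PET", localPet, "of", petCount
    end subroutine report_layout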

28 ESMF Class Structure
Superstructure (F90):
• GridComp: land, ocean, atm, … model
• CplComp: transfers between GridComps
• State: data imported or exported
Infrastructure data classes (hybrid F90/C++):
• Bundle: collection of Fields
• Field: physical field, e.g. pressure
• Array: hybrid F90/C++ arrays
• Grid: LogRect, Unstruct, etc., built on DistGrid (grid decomposition) and PhysGrid (math description)
Infrastructure communication classes (C++):
• Regrid: computes interpolation weights
• Route: stores communication paths
• DELayout: maps decompositions to resources for communications
Utilities: Virtual Machine, TimeMgr, LogErr, IO, ConfigAttr, Base, etc.

29 ESMF Superstructure Classes
• Gridded Component: models, data assimilation systems ("real code")
• Coupler Component: data transformations and transfers between Gridded Components
• State: packages of data sent between Components
• Application Driver: generic driver

30 ESMF Infrastructure Data Classes
Model data is contained in a hierarchy of multi-use classes. The user can reference a Fortran array to an Array or Field, or retrieve a Fortran array out of an Array or Field.
• Array: holds a Fortran array (with other info, such as halo size)
• Field: holds an Array, an associated Grid, and metadata
• Bundle: a collection of Fields on the same Grid, bundled together for convenience, data locality, and latency reduction during communications
Supporting these data classes is the Grid class, which represents a numerical grid. A sketch of the Grid/Field/array relationship follows.
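A sketch of creating a Grid and a Field and retrieving the underlying Fortran array. Sizes and names are hypothetical, and the calls follow recent ESMF releases, which differ in detail from the 2006-era grid interfaces:

    ! Sketch: a Field ties a Fortran data array to a Grid plus metadata.
    subroutine make_field()
      use ESMF
      type(ESMF_Grid)  :: grid
      type(ESMF_Field) :: pressure
      real(ESMF_KIND_R8), pointer :: p(:,:)
      integer :: rc

      grid = ESMF_GridCreateNoPeriDim(maxIndex=(/180, 90/), rc=rc)
      pressure = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, &
                                  name="pressure", rc=rc)
      ! Retrieve the local piece of the data as a plain Fortran pointer.
      call ESMF_FieldGet(pressure, farrayPtr=p, rc=rc)
      p = 1013.25_ESMF_KIND_R8
    end subroutine make_field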

31 ESMF Communications
• Halo: updates edge data for consistency between partitions
• Redistribution: no interpolation, only changes how the data is decomposed
• Regrid: based on the SCRIP package from Los Alamos; methods include bilinear and conservative
• Bundle-, Field-, and Array-level interfaces
A regridding sketch follows.
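A sketch of the precompute-then-apply regridding pattern, as spelled in recent ESMF releases (the 2006-era interface differs in detail); srcField and dstField are Fields on two different Grids, built as in the earlier Field sketch:

    ! Sketch: compute interpolation weights once, apply them every step.
    subroutine couple_fields(srcField, dstField)
      use ESMF
      type(ESMF_Field)       :: srcField, dstField  ! on two different Grids
      type(ESMF_RouteHandle) :: rh
      integer                :: rc

      ! Compute and store interpolation weights once...
      call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
                                 regridmethod=ESMF_REGRIDMETHOD_BILINEAR, &
                                 routehandle=rh, rc=rc)
      ! ...then apply them cheaply on every coupling step.
      call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)
      call ESMF_FieldRegridRelease(routehandle=rh, rc=rc)
    end subroutine couple_fields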

32 ESMF Utilities
• Time Manager
• Configuration Attributes (replaces namelists; see the sketch below)
• Message logging
• Communication libraries
• Regridding library (parallelized, on-line SCRIP)
• IO (barely implemented)
• Performance profiling (not implemented yet; may simply use Tau)
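A sketch of the Configuration Attributes utility in place of a namelist; the file and label names are hypothetical, and the calls follow recent ESMF releases:

    ! Sketch: read run-time parameters from a resource file.
    subroutine read_config(dt, nsteps)
      use ESMF
      real(ESMF_KIND_R8), intent(out) :: dt
      integer, intent(out) :: nsteps
      type(ESMF_Config) :: cf
      integer :: rc

      cf = ESMF_ConfigCreate(rc=rc)
      call ESMF_ConfigLoadFile(cf, "model.rc", rc=rc)
      ! model.rc contains lines such as:   dt: 1800.   and   nsteps: 48
      call ESMF_ConfigGetAttribute(cf, dt, label="dt:", rc=rc)
      call ESMF_ConfigGetAttribute(cf, nsteps, label="nsteps:", rc=rc)
      call ESMF_ConfigDestroy(cf, rc=rc)
    end subroutine read_config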

33 Adoption Strategies: Top Down
1. Decide how to organize the application as discrete Gridded and Coupler Components. The developer might need to reorganize code so that individual components are cleanly separated and their interactions consist of a minimal number of data exchanges.
2. Divide the code for each component into initialize, run, and finalize methods. These methods can be multi-phase, e.g. init_1, init_2.
3. Pack any data that will be transferred between components into ESMF Import and Export States in the form of ESMF Bundles, Fields, and Arrays. User data must match its ESMF descriptions exactly.
4. Describe the distribution of grids over resources on a parallel computer via the VM and DELayout.
5. Pack time information into ESMF time management data structures.
6. Using code templates provided in the ESMF distribution, create ESMF Gridded and Coupler Components to represent each component in the user code.
7. Write a set services routine that sets ESMF entry points for each user component's initialize, run, and finalize methods (as in the sketch under slide 4).
8. Run the application using an ESMF Application Driver.

34 Adoption Strategies: Bottom Up
Adoption of infrastructure utilities and data structures can follow many different paths. The calendar management utility is a popular place to start, since there is enough functionality in the ESMF time manager to merit the effort required to integrate it into codes and bundle it with an application. A small, self-contained example follows.
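For instance, a complete little program exercising only the time manager (recent-release Fortran API; older releases differ in detail):

    ! Sketch: use only the ESMF time manager, no components at all.
    program clock_demo
      use ESMF
      implicit none
      type(ESMF_Clock)        :: clock
      type(ESMF_Time)         :: startTime, stopTime
      type(ESMF_TimeInterval) :: step
      integer :: rc

      call ESMF_Initialize(defaultCalKind=ESMF_CALKIND_GREGORIAN, rc=rc)
      call ESMF_TimeSet(startTime, yy=2006, mm=1, dd=25, rc=rc)
      call ESMF_TimeSet(stopTime,  yy=2006, mm=1, dd=26, rc=rc)
      call ESMF_TimeIntervalSet(step, h=6, rc=rc)
      clock = ESMF_ClockCreate(timeStep=step, startTime=startTime, &
                               stopTime=stopTime, name="demo clock", rc=rc)
      ! Step the clock from start to stop in 6-hour increments.
      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
        call ESMF_ClockAdvance(clock, rc=rc)
      end do
      call ESMF_ClockDestroy(clock, rc=rc)
      call ESMF_Finalize(rc=rc)
    end program clock_demo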

35 OUTLINE
• Overview
• ESMF and the Community
• Development Status
• Design and Principles of ESMF
• Resources

36 Documentation
• Users Guide: installation, quick start and demo, architectural overview, glossary
• Reference Manual: overall framework rules and behavior; method interfaces, usage, examples, and restrictions; design and implementation notes
• Developers Guide: documentation and code conventions; definition of compliance
• Requirements Document
• Implementation Report: C++/Fortran interoperation strategy (draft)
• Project Plan: goals, organizational structure, activities

37 User Support
• All requests go through the support list so that they can be archived.
• The support policy is posted on the ESMF website.
• Support archives and bug reports are on the ESMF website, under the Development link: bug reports are under Bugs, and support requests are under Lists.

38 Testing and Validation Pages
Accessible from the Development link on the ESMF website:
• Detailed explanations of system tests
• Supported platforms and information about each
• Links to regression test archives
• Weekly regression test schedule

39 Latest Information
For scheduling and release information, see the Development link on the ESMF website. This includes latest releases, known bugs, and supported platforms. Task lists, bug reports, and support requests are tracked on the ESMF SourceForge site.