A High-Performance Framework for Earth Science Modeling and Data Assimilation
V. Balaji (vb@gfdl.gov), SGI/GFDL, Princeton University
NASA/GSFC
First PRISM Project Meeting, Toulouse, 22 May 2002
Who are we?
– NSF NCAR: Tim Killeen (PI), Byron Boville, Cecelia DeLuca, Roberta Johnson, John Michalakes
– MIT: John Marshall (PI), Chris Hill
– NASA DAO: Arlindo da Silva (PI), Leonid Zaslavsky, Will Sawyer
– NASA NSIPP: Max Suarez, Michele Rienecker, Christian Keppenne
– NOAA GFDL: Ants Leetmaa, V. Balaji, Robert Hallberg, Jeff Anderson
– NOAA NCEP: Stephen Lord, Mark Iredell, Mike Young, John Derber
– DOE Los Alamos National Lab: Phil Jones
– DOE Argonne National Lab: Jay Larson, Rob Jacob
– University of Michigan: Quentin Stout
Project Organization (NASA ESS)
– Part I: Core Framework Development (NSF NCAR PI), with Part I proposal-specific milestones
– Part II: Prognostic Model Deployment (MIT PI), with Part II proposal-specific milestones
– Part III: Data Assimilation Deployment (NASA DAO PI), with Part III proposal-specific milestones
– Joint milestones across all three parts
– Joint Specification Team: Requirements Analysis, System Architecture, API Specification
Outline
1. Background
2. ESMF Objectives and Scientific Benefits
3. ESMF Overview
4. Final Milestones
5. Development Plan
6. Beyond 2004: ESMF Evolution
Technological Trends
– In climate research and NWP: increased emphasis on detailed representation of individual physical processes; requires many teams of specialists to contribute components to an overall coupled system
– In computing technology: increase in hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures
Community Response
– Modernization of modeling software:
  – Abstraction of underlying hardware to provide a uniform programming model across vector, uniprocessor and scalable architectures
  – Distributed development model characterized by many contributing authors; use of high-level language features for abstraction to facilitate the development process
  – Modular design for interchangeable dynamical cores and physical parameterizations; development of community-wide standards for components
– Development of prototype frameworks: GFDL (FMS), NASA/GSFC (GEMS)
– Other framework-ready packages: NCAR/NCEP (WRF), NCAR/DOE (MCT)
The ESMF aims to unify and extend these efforts.
Objectives of the ESMF
1. Facilitate the exchange of scientific codes (interoperability)
2. Promote the reuse of standardized technical software while preserving computational efficiency
3. Focus community resources to deal with changes in computer architecture
4. Present the computer industry and computer scientists with a unified and well-defined task
5. Share overhead costs of the housekeeping aspects of software development
6. Provide greater institutional continuity to model development efforts
Scientific Benefits
ESMF accelerates advances in Earth System Science:
1. Eliminates software barriers to collaboration among organizations
– Easy exchange of model components accelerates progress in NWP and climate modeling
– Independently developed models and data assimilation methods can be combined and tested
– Coupled model development becomes a truly distributed process
– Advances from smaller academic groups are easily adopted by large modeling centers
Scientific Benefits, cont.
ESMF accelerates advances in Earth System Science:
2. Facilitates development of new interdisciplinary collaborations
– Simplifies extension of climate models to the upper atmosphere
– Accelerates inclusion of advanced biogeochemical components into climate models
– Develops a clear path for many other communities to use, improve, and extend climate models
– Many new model components gain easy access to the power of data assimilation
Outline
1. Background
2. ESMF Objectives and Scientific Benefits
3. ESMF Overview
4. Final Milestones
5. Development Plan
6. Beyond 2004: ESMF Evolution
Design Principles
– Modularity: data-hiding, encapsulation, self-sufficiency
– Portability: adhere to official language standards, use community-standard software packages, comply with internal standards
– Performance: minimize abstraction penalties of using a framework
– Flexibility: address a wide variety of climate issues by configuring particular models out of a wide choice of available components and modules
– Extensibility: design to anticipate and accommodate future needs
– Community: encourage users to contribute components; develop in an open-source environment
Framework Architecture
– Components and Coupling: gridded component interface; collective data transfers/coupling; collective I/O
– Fields and Grids:
  – Fields: field description/metadata; field and field-set data; field I/O
  – Grids: grid description/metadata; grid decomposition
– Parallel Utilities: transpose, halo, etc.; abstracted machine layout
– Low-Level Utilities: event alarms; performance profiling; I/O primitives; communication primitives, etc.
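To make the layering concrete, here is a minimal Python sketch of how a gridded component, fields, and a coupler might fit together. All names here (Field, GriddedComponent, Coupler, drive) are hypothetical illustrations of the architecture described above, not the ESMF interfaces themselves.

    # Minimal sketch of the component/coupler layering; all names are
    # hypothetical illustrations, not ESMF interfaces.

    class Field:
        """A named data array plus metadata, defined on some grid."""
        def __init__(self, name, grid, data):
            self.name, self.grid, self.data = name, grid, data

    class GriddedComponent:
        """A model exposing a standard lifecycle to the superstructure."""
        def initialize(self, imports): ...
        def run(self, imports):
            """Advance one coupling interval; return exported Fields."""
            raise NotImplementedError
        def finalize(self): ...

    class Coupler:
        """Moves Fields between component grids."""
        def transfer(self, fields, dst_grid):
            # A real coupler would interpolate between grids and
            # redistribute data across processors; this stub passes
            # the fields through untouched.
            return fields

    def drive(atm, ocn, coupler, nsteps):
        """Toy driver: step two components in turn, exchanging fields."""
        sst = ocn.run([])
        for _ in range(nsteps):
            fluxes = atm.run(coupler.transfer(sst, "atm_grid"))
            sst = ocn.run(coupler.transfer(fluxes, "ocn_grid"))

The point of the sketch is the separation of concerns: the components know nothing about each other, and all inter-component data motion goes through the coupling layer.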
Application Architecture (layers, top to bottom)
– ESMF Superstructure: Coupling Layer
– User Code: Model Layer
– ESMF Infrastructure: Fields and Grids Layer; Low-Level Utilities
– External Libraries: BLAS, MPI, NetCDF, …
Functionality Classes
ESMF Infrastructure:
– DISTGRID: distributed grid operations (transpose, halo, etc.)
– PHYSGRID: physical grid specification, metric operations
– REGRID: interpolation of data between grids, ungridded data
– IO: I/O on distributed data
– KERN: management of distributed memory, data-sharing for shared and distributed memory
– TMGR: time management, alarms, time and calendar utilities (see the alarm sketch below)
– PROF: performance profiling and logging, adaptive load-balancing
– ERROR: error handling
ESMF Superstructure:
– CONTROL: assignment of components to processor sets, scheduling of components and inter-component exchange; inter-component signals, including checkpointing of complete model configurations
– COUPLER: validation of exchange packets; blocking and non-blocking transfer of boundary data between component models; conservation verification
– COMPONENT: specification of required interfaces for components
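As one small illustration, the kind of service the TMGR class describes (alarms driving periodic events such as coupling) can be mocked up as follows. This is a hedged Python sketch; the Alarm class and its methods are invented for illustration, not the TMGR interface.

    # Sketch of a TMGR-style alarm; names and methods are illustrative.
    from datetime import datetime, timedelta

    class Alarm:
        """Rings at a fixed interval of model time (e.g. for coupling)."""
        def __init__(self, start, interval):
            self.next_ring = start + interval
            self.interval = interval

        def is_ringing(self, now):
            if now >= self.next_ring:
                self.next_ring += self.interval
                return True
            return False

    # Usage: fire a coupling event every 6 model hours of a 24-hour run.
    clock = datetime(2002, 5, 22)
    coupling = Alarm(clock, timedelta(hours=6))
    for _ in range(24):
        clock += timedelta(hours=1)
        if coupling.is_ringing(clock):
            print("couple at", clock)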
Other features
– ESMF will be usable by models written in F90/C/C++.
– ESMF will be usable by models requiring adjoint capability.
– ESMF will support SPMD and MPMD coupling.
– ESMF will support several I/O formats (principally netCDF).
– ESMF will have uniform syntax across platforms.
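The SPMD/MPMD distinction can be illustrated with an MPI communicator split. The sketch below uses mpi4py, assumes a run with at least two ranks, and picks an arbitrary half-and-half processor assignment purely for illustration; none of this is prescribed by ESMF.

    # Illustrative SPMD vs. MPMD coupling layouts via a communicator
    # split (assumes mpi4py and >= 2 ranks; assignment is arbitrary).
    from mpi4py import MPI

    world = MPI.COMM_WORLD
    # SPMD: every rank runs the same executable, stepping each
    # component in turn on the full communicator.
    # MPMD-style: partition the ranks so that components run
    # concurrently, each on its own sub-communicator.
    color = 0 if world.rank < world.size // 2 else 1
    comp_comm = world.Split(color=color, key=world.rank)

    if color == 0:
        pass  # run the atmosphere component on comp_comm
    else:
        pass  # run the ocean component on comp_comm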
Development Metrics
Adoption metric E_x: measures the fraction of framework functionality classes used by an application:
  E_x = N_U / N_C
where N_U is the number of functionality classes used by the model and N_C is the number of functionality classes applicable to the model.
Guidelines for measuring adoption and other metrics will appear in the Software Developer's Guide.
Development Metrics, cont.
Levels of ESMF compliance:
– Compliance: E_x ≥ 0.4 (extensive use of infrastructure and standardized coupling interface)
– Partial compliance: E_x ≥ 0.2 (partial use of infrastructure and coupling services)
Ease-of-adoption metric: measures the effort required to achieve a given adoption metric.
Performance metrics: throughput, time to solution, efficiency of scaling.
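For instance, using the eleven functionality classes listed earlier, the metric and thresholds work out as in this small sketch; the particular set of classes the model is assumed to use here is invented for illustration.

    # Worked example of Ex = NU / NC and the compliance thresholds
    # above. The 'used' set is hypothetical; NC counts only the
    # classes applicable to the model in question.
    applicable = {"DISTGRID", "PHYSGRID", "REGRID", "IO", "KERN", "TMGR",
                  "PROF", "ERROR", "CONTROL", "COUPLER", "COMPONENT"}
    used = {"DISTGRID", "IO", "TMGR", "COUPLER", "COMPONENT"}

    ex = len(used & applicable) / len(applicable)   # 5/11, about 0.45
    level = ("compliant" if ex >= 0.4 else
             "partially compliant" if ex >= 0.2 else "non-compliant")
    print(f"Ex = {ex:.2f}: {level}")                # Ex = 0.45: compliant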
Target Platforms
ESMF will target a broad range of platforms:
– Major center hardware, e.g. SMP nodes (SP, SGI O3K, Alpha), 1000+ processors
– Commodity hardware, e.g. Linux clusters and desktops: x86 (P4, Athlon) + interconnect; 64 processors for ~$140K, 10–30 GFlop/s
Commodity System Development
ESMF distribution for commodity hardware:
– Full ESMF superstructure and infrastructure
– Optimized/specialized backend software and library code
Selection of pre-configured, documented JMC codes running on commodity clusters.
On-line documentation:
– Downloads, on-line installation and construction instructions
Outline
1. Background
2. ESMF Objectives and Scientific Benefits
3. ESMF Overview
4. Final Milestones
5. Development Plan
6. Beyond 2004: ESMF Evolution
Joint Milestone Codeset I (Part I JMC: EVA Suite)
– a: spectral simulation at T42
– b: spectral simulation at T170
– c: gridpoint simulation at 1/4° x 1/4° or equivalent
– d: component based on a physical domain other than the ocean or atmosphere, 2° x 2.5° or equivalent
– e: simplified 3D-VAR system with 200K observations/day
– fc: synthetic coupled SPMD system
– gc: synthetic coupled MPMD system
Joint Milestone Codeset II (Part II JMC: Modeling Applications)
– GFDL h: FMS B-grid atmosphere at N45L18
– GFDL i: FMS spectral atmosphere at T63L18
– GFDL j: FMS MOM4 ocean model at 2°x2°xL40
– GFDL k: FMS HIM isopycnal C-language ocean model at 1/6°x1/6°L22
– MIT lc: MITgcm coupled atmosphere/ocean at 2.8°x2.8°, atmosphere L5, ocean L15
– MIT m: MITgcm regional and global ocean at 15km L30
– NSIPP nc: NSIPP atmospheric GCM at 2°x2.5°xL34 coupled with NSIPP ocean GCM at 2/3°x1.25°L20
– NCAR/LANL oc: CCSM2 including CAM with Eulerian spectral dynamics and CLM at T42L26, coupled with POP ocean and data ice model at 1°x1°L40
Joint Milestone Codeset III (Part III JMC: Data Assimilation Applications)
– DAO p: PSAS-based analysis system with 200K observations/day
– DAO qc: CAM with finite-volume dynamics at 2°x2.5°L55, including CLM
– NCEP r: global atmospheric spectral model at T170L42
– NCEP s: SSI analysis system with 250K observations/day, 2 tracers
– NCEP t: WRF regional atmospheric model at 22km resolution, CONUS forecast, 345x569L50
– NSIPP uc: ODAS with OI analysis system at 1.25°x1.25°L20 resolution with ~10K observations/day
– MIT v: MITgcm 2.8° century/millennium adjoint sensitivity
Final Milestones
– Tested, optimized core ESMF software on many platforms, including commodity clusters
– All JMC codes will achieve full ESMF compliance:
  – Major modeling efforts: CCSM, FMS, MIT, NCEP, GEMS, WRF
  – Major data assimilation efforts: atmospheric (NCEP, DAO); oceanic (NSIPP, MIT)
– ESMF interoperability demonstrations: demonstrated by running JMC codes using ESMF coupling services, including models that have never been coupled before
Demonstration Experiments
– Code, documentation, input data:
  – Available through the Web
  – Step-by-step instructions on how to reproduce experiments
– Experiments to span several time scales; may include:
  – NWP, mid-range forecast experiments
  – Coupled seasonal forecasts
  – Interannual/decadal variability
  – Centennial simulations
– Standard periods: forecasts and simulations to be performed on commonly established periods
Interoperability Demo (models coupled, and science impact)
1. GFDL FMS B-grid atm + MITgcm ocean: global biogeochemistry (CO2, O2), SI timescales
2. GFDL FMS MOM4 + NCEP forecast: NCEP seasonal forecasting system
3. NSIPP ocean + LANL CICE: sea ice model for SI; allows extension of SI system to centennial time scales
4. NSIPP atm + DAO analysis: assimilated initial state for SI
5. DAO analysis + NCEP model: intercomparison of systems for NASA/NOAA joint center for satellite data assimilation
6. DAO CAM-fv + NCEP analysis: intercomparison of systems for NASA/NOAA joint center for satellite data assimilation
7. NCAR CAM Eul + MITgcm ocean: improved climate predictive capability; climate sensitivity to large component interchange, optimized initial conditions
8. NCEP WRF + GFDL MOM4: development of hurricane prediction capability
3 interoperability experiments completed at Milestone I, 5 more at Milestone J.
Outline
1. Background
2. ESMF Objectives and Scientific Benefits
3. ESMF Overview
4. Final Milestones
5. Development Plan
6. Beyond 2004: ESMF Evolution
Development Plan
– Requirements:
  – Will involve intensive interaction between the Core ESMF team and a dispersed set of collaborating scientists and software engineers
  – High-level results presented in the General Requirements Document; detailed requirements documents for specific components prepared later
– Design:
  – Design study includes examination of FMS, GEMS, and other existing Earth science frameworks
  – Results presented in the Architecture Report; detailed design documents for specific components prepared later
Development Plan, cont.
– Implementation:
  – Implementation study will determine implementation language, language interoperability strategy, programming model
  – Examine viability of supporting software tools, e.g. CCA, Cactus
  – Results presented in the Implementation Report
– Best Practices:
  – Early distribution of the Software Developer's Guide will encourage consistent standards, conventions and practices throughout the project
Development Strategy
Collaborative, organized, efficient:
– Modular, object-oriented software design
– Open, collaborative web development environment based on SourceForge
– Communication via teleconferences, mailing lists and the ESMF website
– Reviews for requirements, design, code and documentation
– ProTeX, LaTeX and LaTeX2HTML tools to support integrated and easy-to-maintain documentation
– Defect tracking
– CVS source code control
Management Structure
– Executive Committee and Advisory Board
– Joint Specification Team: Requirements Analysis, Architecture, API Specification
– Core Software Development (NSF NCAR PI): Core Software Development Team with Technical Lead
– Prognostic Model Deployment (MIT PI): Modeling Applications Team
– Data Assimilation Deployment (NASA DAO PI): Data Assimilation Applications Team with Technical Lead
– E/PO Director
– Technical Oversight Teams: Low-Level Infrastructure; Fields and Grids; Coupling Superstructure
Community Involvement
– Review by the Earth science community twice during framework development
– More frequent Requests for Comments
– Closer and more frequent interaction with other ESS-funded projects:
  – University of Michigan: A High-Performance Adaptive Simulation Framework for Space-Weather Modeling (SWMF)
  – UCLA/LANL: Increasing Interoperability of an Earth System Model: Atmosphere-Ocean Dynamics and Tracer Transports
  – GSFC HSB/COLA: Land Information System (LIS)
Beyond 2004: ESMF Evolution
– Maintenance, support and management:
  – NCAR commitment to maintain and support core ESMF software
  – NCAR commitment to develop ESMF, with focus and level contingent on outside funding
  – Persistence of Executive Committee and Advisory Board
– Technical evolution:
  – Functional extension: support for advanced data assimilation algorithms (error covariance operators, infrastructure for generic variational algorithms, etc.); additional grids; new domains
  – Earth System Modeling Environment, including web/GUI interface, databases of components and experiments, links to GRID services