1
Overview of the SciDAC Project: Collaborative Design and Development of the CCSM for Terascale Computers
PIs: Malone (LANL), Drake (ORNL)
Co-Is (DOE): Ding (LBL), Duffy (LLNL), Erickson (ORNL), Foster (ANL), Ghan (PNL), Jacob (ANL), Jones (LANL), Larson (ANL), Mirin (LLNL), Rotman (LLNL), Taylor (ANL), Worley (ORNL)
Co-Is (NCAR & NASA): Bettge (NCAR), Kiehl (NCAR), Craig (NCAR), Deluca (NCAR), Lin (DAO), Washington (NCAR), Williamson (NCAR)
2
Goals for SciDAC CCSM Collaborative Development
Comprehensive treatment of physical and chemical processes
– High resolution ocean and atmosphere support
– Hybrid vertical coordinate in ocean code
– Tropospheric chemistry package
– Biogeochemistry
Modular "packages" with well defined interfaces and testing procedures
– Atm, Coupler (Avant Garde), Ice, Ocn, Lnd
Performance optimized yet portable and adaptable for utilization of emerging architectural features of terascale computers
Ready for DOE, NSF and NASA applications
– High resolution historical and climate change scenario studies
– Carbon cycle studies and data assimilation for climate
3
Climate Science Enabled
Science, 13 April 2001: "Detection of Anthropogenic Climate Change in the World's Oceans," Barnett, Pierce, Schnur
Method: Ensemble simulations of the DOE Parallel Climate Model (PCM)
Results:
– Detection of anthropogenic climate change in the world's oceans
– Ensembles establish 95% confidence intervals on model predictions
– Simulated ocean heat storage matches the historical record of rising ocean temperatures
Enabling technology:
– Parallel Climate Model, developed in a collaborative effort led by Warren Washington (NCAR)
– Terascale computing resources
Firsts:
– Ensemble study with a US model and computers
– Coupled model reproducing the ocean response
– Establishing a new level of US model quality
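As a worked illustration of the ensemble confidence-interval idea (hypothetical numbers, numpy assumed; this is not the Barnett et al. analysis):

```python
import numpy as np

# Hypothetical ensemble of ocean heat-storage anomalies (arbitrary units),
# one value per ensemble member for a single year.
members = np.array([4.1, 3.8, 4.5, 4.0, 3.7])

mean = members.mean()
# Standard error of the ensemble mean.
stderr = members.std(ddof=1) / np.sqrt(members.size)
# Two-sided 95% interval using the normal approximation (a t-interval
# would be slightly wider for this few members).
lo, hi = mean - 1.96 * stderr, mean + 1.96 * stderr

print(f"ensemble mean = {mean:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```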
4
Software Infrastructure and Applied Math Challenges
Challenge: Project management, version control, and testing infrastructure for competing, distributed development teams
– Existing solutions: CVS with monitoring procedures
– Enhanced solutions: BitKeeper? SourceForge?
Challenge: Variety of systems and institutions; evolving three-layer architecture: Library (e.g. FFT), Utility (e.g. orbital calendar, data transpose), and Model (e.g. radiation physics)
– Existing solutions: F90 modules, vendor math libraries, customized data movement, PILGRIM, MCT, …
– Enhanced solutions: Open design process, optimized math libraries, performance monitors, CCA, NASA Earth System Modeling Framework (ESMF)
Challenge: Accurate, fast dynamical methods: atm, ocn, ice
– Existing solutions: Spectral horizontal, finite volume, conservative semi-Lagrangian advection, hybrid vertical coordinates, fast Helmholtz solvers, two-time-level monotone advection
– Enhanced solutions: Scalable algorithms, grid technologies, nonlinear solvers, new discretization techniques, subcycling explicit barotropic modes, new formulations
Challenge: Incorporating new model components and improving coupled model climate simulation
– Existing solutions: Control simulations, component working group activities, scientific steering and review of non-linear interactions
– Enhanced solutions: Available cycles, improved analysis capabilities and extensive data handling capabilities, collaborative technologies
Key (source of enhanced solutions): unknown source / ISIC and external / internal to project
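A minimal sketch of the three-layer (library / utility / model) separation named above; the function names are purely illustrative, not CCSM interfaces:

```python
import numpy as np

# Library layer: generic numerical kernel (stands in for an optimized FFT or math library).
def spectral_transform(field):
    return np.fft.rfft(field)

# Utility layer: shared infrastructure (stands in for a data transpose or calendar utility)
# that wraps library calls behind a stable interface.
def to_spectral_space(field):
    return spectral_transform(field)

# Model layer: science code (stands in for radiation physics or a dycore step) that
# calls only the utility layer, never the library directly.
def atmosphere_step(temperature):
    return to_spectral_space(temperature)

print(atmosphere_step(np.linspace(250.0, 300.0, 16)).shape)
```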
5
Software Engineering Challenges
6
Software Tasks
– Develop comprehensive design documents for each component of the model
– Implement performance-portable, run-time configurable CCSM on target machines
– Incorporate emerging programming paradigms and software design practices
– Develop testing and validation procedures for all component models
– Incorporate new dynamical components for ocean, atmosphere and sea ice
– Incorporate new modular physical and chemical process models
7
Task List Excerpt and Schedule
Date | Component | Milestone | Deliverable | Lead Lab
02Q1 | Ocean | Reviewed requirements document | Report | LANL
02Q1 | Land | Complete requirements document | Report | NCAR
02Q2 | Ocean | Hybrid programming model completed in POP | Released code | LANL
02Q2 | Ocean | Complete blending of orthobaric surfaces with z-levels in HYPOP | Progress report | LANL
02Q2 | Coupler | Load balancing in MCT | New release of MCT | ANL
02Q3 | Atmosphere | Demonstrate tuned and optimized model at T85 | Validated code | ORNL
02Q3 | Atmosphere | Implement 3D block decomposition in all dycores | Benchmarked code | ORNL
02Q3 | Sea ice | Complete requirements document | Report | LANL
02Q3 | Atmosphere | Subgrid topography scheme applied to dycores | Progress report | PNNL
02Q4 | Atmospheric chemistry | Complete development of chemistry solver | Benchmarked code | LLNL
03Q1 | Atmosphere | Separate dynamics and transport | New release of dynamical core | ORNL
03Q1 | Atmosphere | Demonstrate T31 performance optimization | Validated code | ANL
03Q1 | Land | Complete cache-friendly decomposition | Benchmarked code | ORNL
03Q1 | Sea ice | Use automatic differentiation to tune parameters in CICE | Improved values of model parameters | ANL
03Q2 | Ocean | MLP and dynamic load balancing in POP | Benchmark report, released code | LANL
03Q2 | Sea ice | Complete hybrid programming model in CICE with subblocked decomposition and load balancing | Benchmarked code | LANL
03Q4 | Atmospheric chemistry | Test GHG distributions with tropospheric chemistry forcing | Progress report | LLNL
04Q1 | Ocean | Finish tracer validation tests in HYPOP (ready for biogeochemistry) | | LANL
04Q2 | Coupler | New data types and structures | New release of MCT | ANL
04Q3 | All | Parallel I/O | Release of I/O library | LBNL
05Q1 | Ocean | Complete validation of HYPOP in coupled model tests | Validated code | LANL
05Q1 | Atmospheric chemistry | Test GHG distributions with stratospheric chemistry forcing added | Progress report | LLNL
05Q3 | Coupler | Integration of coupler with CCSM utility and machine-specific layers | New release of MCT | ANL
06Q2 | Atmospheric chemistry | Complete addition of aerosol physics | Validated code | NCAR
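Several milestones above involve load balancing (in MCT, POP, and CICE). A minimal sketch of one common approach, greedy assignment of unevenly costed sub-blocks to the least-loaded processor; this is an illustration, not the project's algorithm:

```python
import heapq

def balance(block_costs, nprocs):
    """Greedily assign blocks (e.g., ocean sub-blocks with varying numbers of
    active points) to whichever processor currently has the smallest load."""
    # Heap of (accumulated cost, processor rank, assigned block ids).
    heap = [(0.0, rank, []) for rank in range(nprocs)]
    heapq.heapify(heap)
    # Place the most expensive blocks first for a tighter balance.
    for block_id, cost in sorted(enumerate(block_costs), key=lambda x: -x[1]):
        load, rank, blocks = heapq.heappop(heap)
        heapq.heappush(heap, (load + cost, rank, blocks + [block_id]))
    return sorted(heap, key=lambda x: x[1])

# Hypothetical per-block costs (e.g., fraction of ocean points in each block).
for load, rank, blocks in balance([0.9, 0.1, 0.5, 0.7, 0.3, 0.6], nprocs=3):
    print(f"proc {rank}: blocks {blocks}, load {load:.1f}")
```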
8
– Chemical solver technologies
– GHG with transport
– Interactive ozone: sulfur cycle and aerosols
– Upper ocean dimethyl sulfide
– Load balancing
– Off-line chemical simulations (DAM)
– Support C4MIP
– Multiple-resolution atm model
  – T31, T42, T85 and T170
  – Sub-grid orography precipitation scheme
– Scalable dycores
  – 2D, 3D blocks, MLP tests
  – Optimized and load-balanced physics
– Design documents
– Chunking for cache performance
– Load-balanced, m-to-n transfers
– Land Surface Modeling Toolkit as part of the utility layer
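A minimal sketch of the "chunking for cache performance" item above, under the assumption that chunking here means processing physics columns in small fixed-size blocks so working data stays cache-resident; the sizes and names are hypothetical:

```python
import numpy as np

CHUNK = 16  # columns per chunk; in practice tuned to the target machine's cache

def physics(columns):
    """Stand-in column physics applied to one small contiguous chunk at a time."""
    return columns * 1.01  # placeholder computation

def run_physics(all_columns):
    out = np.empty_like(all_columns)
    ncol = all_columns.shape[0]
    # Sweep the domain chunk by chunk instead of streaming every column at once.
    for start in range(0, ncol, CHUNK):
        stop = min(start + CHUNK, ncol)
        out[start:stop] = physics(all_columns[start:stop])
    return out

state = np.random.rand(1000, 26)   # 1000 columns x 26 levels, hypothetical sizes
print(run_physics(state).shape)
```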
9
Next Generation Couplers
– MCT version 2
  – Higher-level abstractions, component model registry
  – Scalability to thousands of processors
  – Dynamic load balancing
  – 3D fields
– Interoperation of climate, weather and data assimilation functions
– MCT as CCA-compliant prototype for the Earth System Modeling Framework
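A minimal, hypothetical sketch of what a component model registry in a coupler can look like; the names and interfaces are illustrative only and are not the MCT API:

```python
class Coupler:
    """Toy registry: components declare the fields they export and import,
    and the coupler routes each field from its producer to its consumers."""
    def __init__(self):
        self.exports = {}   # field name -> producing component
        self.imports = {}   # field name -> list of consuming components

    def register(self, name, exports=(), imports=()):
        for field in exports:
            self.exports[field] = name
        for field in imports:
            self.imports.setdefault(field, []).append(name)

    def routes(self):
        return [(self.exports[f], f, c)
                for f, consumers in self.imports.items()
                if f in self.exports
                for c in consumers]

cpl = Coupler()
cpl.register("atm", exports=["precip", "wind_stress"], imports=["sst"])
cpl.register("ocn", exports=["sst"], imports=["precip", "wind_stress"])
for src, field, dst in cpl.routes():
    print(f"{field}: {src} -> {dst}")
```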
10
SciDAC CCSM – ISIC Collaboration Goals
– Improve performance characterizations
– Accelerate development of mathematical software for climate modeling and analysis
– Explore novel methods and advance the theory and simulation of geophysical flows
– Improve software practices for the scientific endeavor of climate change prediction
– Enhance analysis and data handling methods
11
ISIC Projects
– Performance Evaluation (D. Bailey), 02Q1
  – PCTM, CCSM instrumentation – P. Worley (ORNL)
– Earth System Grid (Ian Foster), 02Q3
  – Data archive and analysis grid – Bernholdt (ORNL)
– Data Access (Arie Shoshani), 02Q3
  – Improved efficiencies and monitoring – Samatova (ORNL)
12
ISIC Projects
– Grids (Glimm, Brown), 03Q1
  – Ocean nested grids – Smith (LANL), Brown (LLNL)
  – Smooth grid transformations – Drake (ORNL), Khamayseh (ORNL)
  – Spectral element refinement strategies – Fournier (NCAR, U. Md), Fisher (ANL), Taylor & Wingate (LANL)
  – Adaptive models – Joyce Penner (U. Mich)
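As an illustration of the smooth grid transformation idea, a generic tanh-based point-clustering map (an assumed example for illustration, not the project's method):

```python
import numpy as np

def stretched_grid(n, beta=2.0):
    """Map a uniform grid on [0, 1] to one smoothly clustered near both
    boundaries; beta controls the stretching strength."""
    s = np.linspace(-1.0, 1.0, n)
    return 0.5 * (1.0 + np.tanh(beta * s) / np.tanh(beta))

x = stretched_grid(9)
print(np.round(np.diff(x), 3))   # spacing shrinks smoothly toward the ends
```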
13
ISIC Projects
– PDE methods (Phil Colella), 03Q1
  – Adaptive diagnostics (Drake, Williamson)
  – Static adapted western boundary currents – Malone (LANL)
– Solvers (David Keyes), 03Q4
  – Baroclinic model (a new discretization)
  – Krylov methods – Balu Nadiga (LANL)
– Common Component Architecture (Rob Armstrong), 04Q1
  – Model Coupling Toolkit – Larson (ANL)
  – CCA-compliant CCSM
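For readers unfamiliar with Krylov methods, a minimal conjugate-gradient sketch for a symmetric positive-definite system; it illustrates the solver class only, not the baroclinic discretization itself:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxiter=200):
    """Solve A x = b for symmetric positive-definite A by building the
    solution in the Krylov subspace span{b, A b, A^2 b, ...}."""
    x = np.zeros_like(b)
    r = b - A @ x           # residual
    p = r.copy()            # search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Small SPD test problem (hypothetical; stands in for an implicit solve).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approximately [0.0909, 0.6364]
```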
14
SciDAC Climate Projects