Presentation transcript:

1 OASIS3-MCT_3.0
Outline: OASIS overview; OASIS3-MCT_3.0; Some recent performance results; Summary and future efforts
A. Craig, S. Valcke, L. Coquart, CERFACS
April 20-22, 2015

2 What is OASIS?
Coupling software used by at least 35 projects world-wide.
Based on an "inline" send/recv approach (get and put calls inside components): a few OASIS calls are added during component initialization, and fields are then coupled via get and put calls during the run phase.
Components provide information to the OASIS coupling layer:
- grids
- decompositions (partitions)
- coupling variable names
Coupling interactions are defined in an input file, "namcouple", which:
- associates variable names between components
- defines mapping (regridding) operations
- specifies lags, sequencing, and coupling frequency
- includes basic math functions (average, min, max, etc.)
Has I/O capability:
- OASIS can read forcing data and pass it to components
- OASIS can write coupling data for analysis and diagnosis

3 OASIS3-MCT: component interfaces
Use statement:                call use mod_oasis
Initialization:               call oasis_init_comp(...)
Grid definition:              call oasis_write_grid(...)
Local partition definition:   call oasis_def_partition(...)
Coupling field declaration:   call oasis_def_var(...)
End of definition phase:      call oasis_enddef(...)
Coupling field exchange, in the model time-stepping loop:
    call oasis_put(..., date, var_array, ...)
    call oasis_get(..., date, var_array, ...)
- user-defined source or target (end-point communication)
- sending or receiving at the appropriate time only
- automatic averaging/accumulation if requested
- automatic writing of coupling restart files at end of run
Termination:                  call oasis_terminate(...)
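To make the calling sequence concrete, here is a minimal Fortran sketch of a send-side component. It is illustrative only, not code from the presentation: the component name, the field name FSENDOCN, the partition offsets/sizes, and the coupling period are hypothetical, grid writing is omitted, and exact argument lists should be checked against the OASIS3-MCT User Guide.

    program toy_component
      use mod_oasis              ! OASIS3-MCT library module
      implicit none
      integer :: comp_id, local_comm, part_id, var_id, ierr, date
      integer :: kparal(3), var_nodims(2), var_shape(2)
      real(kind=8) :: var_array(4032)

      ! Initialization: register this component with OASIS
      call oasis_init_comp(comp_id, 'toyatm', ierr)
      call oasis_get_localcomm(local_comm, ierr)  ! MPI communicator of this component

      ! Local partition definition ("apple" style: one contiguous segment);
      ! the offset and size are hypothetical and differ per process
      kparal = (/ 1, 0, 4032 /)
      call oasis_def_partition(part_id, kparal, ierr)

      ! Coupling field declaration; the name must match the namcouple
      var_nodims = (/ 1, 1 /)       ! rank 1, one field bundle
      var_shape  = (/ 1, 4032 /)    ! index bounds of the local array
      call oasis_def_var(var_id, 'FSENDOCN', part_id, var_nodims, OASIS_Out, &
                         var_shape, OASIS_Real, ierr)
      call oasis_enddef(ierr)       ! end of definition phase

      ! Coupling exchange in the time loop; date = seconds since run start.
      ! The put is actually sent only at coupling times set in the namcouple.
      do date = 0, 86400-3600, 3600
         var_array = 0.0d0          ! ... fill with model data ...
         call oasis_put(var_id, date, var_array, ierr)
      end do

      call oasis_terminate(ierr)
    end program toy_component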

4 OASIS3-MCT: namcouple input file

    $NFIELDS
    20
    $END
    $RUNTIME
    $END
    $NLOGPRT
    10
    $END
    $STRINGS
    FSENDOCN FRECVATM fdocn.nc EXPOUT
    torc lmdz LAG=+3600
    P 2 P 0
    MAPPING
    my_remapping_file_bilinear.nc src
    FSENDATM FRECVOCN fdatm.nc EXPOUT
    lmdz torc LAG=+1800
    P 0 P 2
    SCRIPR
    BILINEAR LR SCALAR LATLON 1
    $END

Annotations on the slide: $NFIELDS = number of coupling fields; $RUNTIME = run length; $NLOGPRT = debug level; $STRINGS = list of coupling interactions (field names, coupling periods, grid names, coupling lags, transforms).
** can now be generated with the OASIS GUI based on CERFACS OPENTEA **
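As a guide to reading the $STRINGS block above (some numeric fields were lost in the slide capture and are left out there), a generic entry has roughly the following shape; the names and values here are placeholders following the layout documented in the OASIS3-MCT User Guide:

    SRC_FIELD TGT_FIELD 1 3600 1 rst.nc EXPOUT
    srcgrd tgtgrd LAG=+3600
    P 2 P 0
    MAPPING
    weights_file.nc src

The first line gives the source and target symbolic field names, a field index, the coupling period in seconds, the number of transformations, the coupling restart file, and the field status (EXPORTED, or EXPOUT to also write the exchanged field to file for diagnosis). The second line gives the source and target grid prefixes and an optional lag; the "P ... P ..." line describes grid periodicity and overlapping points; the remaining lines name each transformation (here MAPPING, i.e. applying precomputed regridding weights) and its parameters.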

5 OASIS historical overview
[Timeline diagram: OASIS1 -> OASIS2 -> OASIS3 -> OASIS4 / OASIS3-MCT, developed under the PRISM and then IS-ENES projects, with schematics of a low-resolution ocean-atmosphere coupling (v1-v3) and of parallel atmosphere and chemistry components spread over many processes.]
OASIS1 -> OASIS2 -> OASIS3: 2D ocean-atmosphere coupling at low resolution and low frequency; flexibility, modularity, 2D interpolations.
OASIS4 / OASIS3-MCT: 2D/3D coupling of high-resolution parallel components on massively parallel platforms; parallelism, efficiency, performance.
OASIS3-MCT_1.0 released July 2012
OASIS3-MCT_2.0 released May 2013
OASIS3-MCT_3.0 release imminent (May 2015)

6 OASIS3-MCT_3.0 New Features
- Ability to define grids, partitions, and variables across, and on subsets of, component processes.
- Ability to couple within a component and on overlapping processes (see the sketch after this list):
  - component processes can overlap or partly overlap (in a single executable)
  - fields can be coupled within a component (i.e. physics to dynamics) on the same or different grids, decompositions, or processes
  - the user needs to avoid deadlocks: sends are non-blocking, receives are blocking; use namcouple lags to allow 2-way coupling sequentially
  - provides greater flexibility with respect to process layout for science and load balance
- Memory and performance upgrades, particularly in the initialization phase (interaction with C. Goodyer, NAG, EXA2CT EU project).
- New LUCIA load-balancing tool and new memory-tracking tool (gptl).
- Improved error checking and error messages.
- Expanded test cases and testing automation.
- Testing at high resolution (> 1M gridpoints), high processor counts (32k pes), and with large variable counts (> 1k coupling fields).
- Doxygen documentation.
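A minimal sketch of the within-component case mentioned above, reusing the partition and shape declarations from the earlier sketch. The field names PHY2DYN_S/PHY2DYN_R are hypothetical and assume a matching namcouple entry whose LAG equals the (hypothetical) 3600 s coupling period:

    ! Illustrative: one component couples its physics to its dynamics
    ! on the same set of processes.
    integer :: vid_src, vid_tgt
    real(kind=8) :: phy_field(4032), dyn_field(4032)

    ! Both the source and target sides are declared by the same component
    call oasis_def_var(vid_src, 'PHY2DYN_S', part_id, var_nodims, OASIS_Out, &
                       var_shape, OASIS_Real, ierr)
    call oasis_def_var(vid_tgt, 'PHY2DYN_R', part_id, var_nodims, OASIS_In, &
                       var_shape, OASIS_Real, ierr)
    call oasis_enddef(ierr)

    do date = 0, 86400-3600, 3600
       ! Non-blocking send: buffered by the coupling layer
       call oasis_put(vid_src, date, phy_field, ierr)
       ! Blocking receive: with LAG=+3600 in the namcouple this returns the
       ! field put at date-3600 (from the coupling restart file at date 0),
       ! so the processes do not deadlock waiting on their own put
       call oasis_get(vid_tgt, date, dyn_field, ierr)
    end do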

7 OASIS3 -> OASIS3-MCT -> OASIS3-MCT_3.0
[Diagram: with OASIS3, Model1 and Model2 exchange fields through a separate OASIS coupler executable; with OASIS3-MCT_1.0, the coupler is a library and Model1 and Model2 communicate directly; with OASIS3-MCT_3.0, coupling is also possible within a component, e.g. between the Model2p and Model2d parts of Model2.]

8 OASIS3-MCT_3.0 coupling capability
The example system has 2 executables: exe1 (atm) and exe2 (ocn_ice).
Executable 1 has 1 component and 1 grid (atm).
Executable 2 has 3 components: comp2 (ice), comp3 (ocn), and comp4.
comp2 has 1 grid, grid2 (ice), on all comp2 processes.
comp3 has 3 grids or parts (ocn_phy, ocn_dyn, ocn_io) on varying processes.
This layout supports many coupling combinations (labeled in the slide figure).
Prior to OASIS3-MCT_3.0, only coupling "A" was supported.
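One plausible way such a layout is set up (a sketch under assumptions, not the presenters' code; the component names and rank split are hypothetical): each MPI process of exe2 registers the component it belongs to, and OASIS groups processes passing the same name into one component.

    program exe2_toy
      use mpi
      use mod_oasis
      implicit none
      integer :: ierr, world_rank, comp_id, local_comm
      character(len=8) :: comp_name

      ! The model initializes MPI itself so it can pick a component per rank
      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, world_rank, ierr)

      ! Hypothetical split of one executable's ranks into three components
      if (world_rank < 64) then
         comp_name = 'ice'      ! comp2
      else if (world_rank < 192) then
         comp_name = 'ocn'      ! comp3
      else
         comp_name = 'io'       ! comp4
      end if

      ! Processes that pass the same name become one OASIS component
      call oasis_init_comp(comp_id, trim(comp_name), ierr)
      call oasis_get_localcomm(local_comm, ierr)  ! communicator of this component only

      ! ... per-component grid, partition, and variable definitions, time loop ...

      call oasis_terminate(ierr)
      call MPI_Finalize(ierr)   ! finalized by the model since it called MPI_Init
    end program exe2_toy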

9 OASIS3-MCT performance
Toy coupled model: ping-pong exchanges between the NEMO ORCA025 grid (1021x1442) and a Gaussian Reduced T799 grid.
Platforms: Bullx Curie thin nodes (Intel Sandy Bridge EP processors, IFort, Bullx MPI) and IBM MareNostrum3 (Intel Sandy Bridge processors, Intel MPI).

10 OASIS3-MCT performance (new results)
Toy coupled model: ping-pong exchanges between the NEMO ORCA025 grid (1021x1442) and a Gaussian Reduced T799 grid.
Platforms: Bullx Curie thin nodes (Intel Sandy Bridge EP processors, IFort, Bullx MPI) and Bullx Beaufix (Intel Xeon Ivy Bridge processors, Infiniband FDR, IFort, Bullx MPI).
Results to 32k processes (16k processes for each component).
[Plots: "Initialization Time" (measured via cpl_setup) and "Ping-Pong Time", in seconds, versus cores per component. PRELIMINARY RESULTS.]

11 OASIS3-MCT Memory Use
Toy coupled model: ping-pong exchanges between the NEMO ORCA025 grid (1021x1442) and a Gaussian Reduced T799 grid.
Platform: Bullx Curie thin nodes (Intel Sandy Bridge EP processors, IFort, Bullx MPI).
Measured using the gptl tool.
[Plot: memory use in MB/process. PRELIMINARY RESULTS.]

12 OASIS3-MCT_3.0 Summary
OASIS3-MCT_3.0 release imminent (May 2015).
Much greater flexibility to couple components:
- in a single executable
- across disparate overlapping, non-overlapping, or partly overlapping processes
- within a single component
- on different grids and/or different decompositions
- for IO
- for nested grids
Better performance and more robust.
Significant effort made to tune and demonstrate viability at high resolution, high core counts, and large numbers of coupling fields.

13 OASIS3-MCT Future Work
- Features like "bundled" fields (multiple 2D fields, e.g. multiple-category ice).
- Continued evaluation of performance and scaling.
- Review of community usage.
- Continued evaluation of ESMF for off-line precomputing of interpolation weights.
- IS-ENES2: coupling technology benchmark + International Working Committee on Coupling Technologies (IWCCT).
- Performance of OASIS3-MCT for icosahedral grids.
- Evaluation of Open-PALM (including the ONERA CWIPI library) and XIOS (the IPSL I/O server).
- Evaluation of ESMF.

14 The end
