PRISM: An Infrastructure Project for Climate Research in Europe. Presented by Nils (ECMWF). Contributions by A. Caubel, P. Constanza, D. Declat, J. Latour, V. Gayler, E. Guilyardi, C. Larsson, S. Legutke, R. Redler, H. Ritzdorf, T. Schoenemeyer, S. Valcke, R. Vogelsang and many others in the PRISM community. ESMF 3rd Community Meeting, Boulder, July 15, 2004.

Overview
- What is PRISM?
- European interest, partners
- PRISM objectives
- PRISM model components
- PRISM approach
- Architecture and user interface
- Current status and perspective

What is PRISM?
- PRogram for Integrated Earth System Modelling
- A European project for climate modelling involving 22 partners, 12/2001 – 12/2004
- Funded by the European Commission (4.8 M€)
- Involves state-of-the-art atmosphere, ocean, sea-ice, atmospheric chemistry, land-surface and ocean-biogeochemistry models

PRISM partners: MPI-M, Germany (Guy Brasseur, coordinator); KNMI, The Netherlands (Gerbrand Komen, co-coordinator); MPI-M&D, Germany; MetOffice, United Kingdom; UREADMY, United Kingdom; IPSL, France; Météo-France, France; CERFACS, France; DMI, Denmark; SMHI, Sweden; NERSC, Norway; CSCS/ETH, Switzerland; INGV, Italy; MPI-BGC, Germany; PIK, Germany; ECMWF; UCL-ASTR, Belgium; NEC-ESS, Germany; FECIT/Fujitsu, France; SGI, Germany; SUN, Germany; NEC-CCRLE, Germany.

Help scientists to spend more time on science:
- Provide software infrastructure to easily:
  - assemble Earth system coupled models based on existing state-of-the-art component models
  - launch and monitor complex or ensemble Earth system simulations
  - access, analyse and share results across a wide community
- Define and promote technical and scientific standards for Earth System modelling

Technical and scientific standards
Scientific:
- Global parameters
- Physical interfaces
Technical:
- Coupler and I/O
- Data format and grids
- Architecture and user interface
- Diagnostics and visualization
- Coding and quality

PRISM model components
- Atmosphere: Météo-France (ARPEGE), MPG-IMET (ECHAM), IPSL (LMDZ), MetOffice (Unified Model), UREADMY, INGV
- Atmospheric chemistry: MPG-IMET, UREADMY, IPSL, MetOffice, Météo-France, KNMI
- Land surface: IPSL (Orchidée), MetOffice, MPG-IMET, UREADMY, Météo-France (ISBA)
- Sea ice: NERSC, UCL-ASTR, MetOffice, IPSL, MPG-IMET
- Ocean biogeochemistry: MPI-BGC, IPSL, MPG-IMET, MetOffice
- Ocean: UREADMY, MetOffice (FOAM), MPI-M (HOPE), IPSL (OPA/ORCA)
- Regional climate: SMHI, DMI, MetOffice
- Coupler: CERFACS, NEC, CCRLE, FECIT, SGI, MPI-MAD

ESMF – PRISM (layer-comparison diagram): PRISM – running environment / user code / code; ESMF – superstructure / user code / infrastructure.

Coupling software and its evolution in PRISM
- Coupler: OASIS 3.0/4.0 (~10 years of experience)
- PRISM System Model Interface Library: PSMILe
  - MPI1 or MPI2 direct communication between models with the same grid, otherwise repartitioning through a Transformer
  - modularity: prism_put() and prism_get() calls to implement in existing models
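The put/get pattern is worth making concrete. The real PSMILe interface is a Fortran library (prism_put()/prism_get() plus initialisation and definition calls); the Python sketch below is only a toy illustration of the control flow a component model follows at coupling instants, and every name in it is hypothetical.

```python
# Toy illustration of the PSMILe put/get coupling pattern (all names hypothetical;
# a real PRISM component calls the Fortran routines prism_put()/prism_get(), and
# data moves via MPI or through the OASIS Transformer rather than a dictionary).

class ToyCoupler:
    """Stands in for the coupler: remembers the latest value of each named field."""
    def __init__(self):
        self.fields = {}

    def put(self, name, step, value):
        # In OASIS the call is date-based: data is actually sent only when the
        # user-defined coupling frequency for this field is reached.
        self.fields[name] = value

    def get(self, name, step, default):
        return self.fields.get(name, default)


def run_coupled(n_steps=6, coupling_period=2):
    coupler = ToyCoupler()
    sst, wind_stress = 290.0, 0.1             # toy scalar "coupling fields"
    for step in range(n_steps):
        if step % coupling_period == 0:       # a coupling instant
            wind_in = coupler.get("wind_stress", step, wind_stress)
            sst_in = coupler.get("sst", step, sst)
            sst = sst_in + 0.01 * wind_in               # trivial "ocean" update
            wind_stress = 0.1 + 0.01 * (sst_in - 290.0)  # trivial "atmosphere" update
            coupler.put("sst", step, sst)
            coupler.put("wind_stress", step, wind_stress)
    return sst, wind_stress


if __name__ == "__main__":
    print(run_coupled())
```

The point of the design is that the model code only names its fields and coupling instants; whether a field goes to another model, through the Transformer, or to a file is decided outside the model, in the coupling configuration.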

OASIS coupler: Ocean Atmosphere Sea Ice Soil
Historical review: developed since 1991 at CERFACS to couple existing GCMs. At the time:
- models at relatively low resolution (~ pts)
- small number of 2D coupling fields (~10)
- low coupling frequency (~once/day)
=> flexibility was very important, efficiency not so much!
Timeline: OASIS 1 → OASIS 2 → OASIS 3 → OASIS 4, the most recent versions developed within the PRISM period.

OASIS community
- CGAM-Reading (UK): HadAM3 – ORCA2
- Southampton University (UK): Inter. Atm – OCCAM lite
- UCL (Belgium): LMDz – CLIO
- SMHI (Sweden): ECHAM – RCA, RCA (regional) – RCO (regional)
- U. of Bergen (Norway): MM5 – ROMS
- KNMI (Netherlands): ECHAM5 – MPI-OM
- DMI (Denmark): ECHAM – HIRLAM
- INGV (Italy): ECHAM5 – MPI-OM
- IRI (USA): ECHAM4 – MOM3
- JAMSTEC (Japan): ECHAM4 – OPA 8.2
- BMRC (Australia): BAM – MOM4, BAM3 – ACOM2
- U. of Tasmania (Australia): Data Atm. – MOM4
- CAS, IIT Delhi (India): MM5 – POM

OASIS community (continued)
- CERFACS (France): ARPEGE3 – ORCA2LIM, ARPEGE3 – OPA 8.1, ARPEGE3 – OPAICE
- METEO-FRANCE (France): ARPEGE4 – ORCA2, ARPEGE medias – OPAmed, ARPEGE3 – OPA 8.1, ARPEGE2 – OPA TDH
- IPSL – LODYC, LMD, LSCE (France): LMDz – ORCA2LIM, LMDz – ORCA4, LMDz – OPA ATL3/ATL1, IFS – OPA 8.1, ECHAM4 – ORCA2
- MERCATOR (France): PAM (OPA)
- MPI-M&D (Germany): ECHAM5 – MPI-OM, ECHAM5 – C-HOPE, PUMA – C-HOPE, EMAD – E-HOPE, ECHAM5 – E-HOPE, ECHAM4 – E-HOPE
- ECMWF (UK): IFS Cy23r4 – E-HOPE, IFS Cy15r8 – E-HOPE

Oasis2 and Oasis3
Flexibility, modularity: the coupler and PSMILe act according to a user-defined coupling configuration (text file) specifying:
- the number of models and coupling fields
- coupling frequencies and transformations for each field
- I/O or coupling mode (transparent for the model)
Mono-process coupler; 2D scalar coupling fields; interpolation (SCRIP 1.4).
PRISM System Model Interface Library (PSMILe): coupling field exchange (MPI1 & MPI2) and I/O actions (GFDL mpp_io).
(Slide diagram: atmosphere (A) and ocean (O) processes exchanging coupling fields through Oasis3 or via files.)
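To make the idea of the user-defined configuration concrete, here is a hypothetical sketch of the kind of information such a text file carries, rendered as a Python dictionary. It is not the actual OASIS3 configuration syntax, and the model, field and transformation names are invented.

```python
# Sketch of the information carried by a user-defined coupling configuration
# (the real OASIS3 file is plain text; this dictionary is only a hypothetical,
# language-neutral rendering of its content).

coupling_configuration = {
    "models": ["toyatm", "toyoce"],               # number and names of components
    "fields": [
        {
            "name": "sst",
            "source": "toyoce", "target": "toyatm",
            "coupling_period_s": 86400,            # coupling frequency for this field
            "transformations": ["time_average", "bilinear_interp"],
            "mode": "coupled",                     # or "output_to_file" (I/O mode,
        },                                         #  transparent for the model code)
        {
            "name": "wind_stress",
            "source": "toyatm", "target": "toyoce",
            "coupling_period_s": 86400,
            "transformations": ["bilinear_interp"],
            "mode": "coupled",
        },
    ],
}
```

Because the coupled-versus-file choice lives entirely in this external description, the same prism_put()/prism_get() calls in the model code can feed either a coupler exchange or plain file output.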

Oasis4 – new demands
- Higher resolution, parallel and scalable models
- Higher coupling frequencies desirable
- Higher number of models and (3D) coupling fields
- Massively parallel platforms
=> Need to optimise and parallelise the coupler

OASIS4 is composed of: a Driver, a Transformer and a new PRISM System Model Interface Library (PSMILe).

Interface and data flow

OASIS4 PSMILe
- MPI parallel communication, including repartitioning
- parallel multigrid 3D neighbourhood search and calculation of communication patterns in each source process
- extraction of the useful part of the source field only
- parallel I/O: single file or distributed files (GFDL mpp_io); parallel file (parNetCDF)
- parallel Transformer: loops over PSMILe requests
- flexibility and modularity: same as Oasis3
(Slide diagram: two parallel components, O1 and O2, exchanging data either directly or through the Transformer T.)
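As a rough, hypothetical illustration of what "calculation of communication patterns in each source process" and "extraction of the useful part of the source field only" mean, the sketch below computes, for a simple 1D index space, which slice each source process must send to each target process. The real PSMILe does this with a parallel multigrid neighbourhood search on 3D grids; nothing here is its actual algorithm or API.

```python
# Toy, hypothetical illustration of computing coupling communication patterns:
# each source process sends only the part of its local partition that overlaps
# a target partition, instead of gathering whole fields on one process.

def overlaps(src_parts, tgt_parts):
    """Return {(src_rank, tgt_rank): (start, stop)} for overlapping 1D index ranges."""
    pattern = {}
    for s, (s0, s1) in enumerate(src_parts):
        for t, (t0, t1) in enumerate(tgt_parts):
            lo, hi = max(s0, t0), min(s1, t1)
            if lo < hi:                      # non-empty overlap -> one message
                pattern[(s, t)] = (lo, hi)
    return pattern

# Ocean decomposed over 3 processes, atmosphere over 2, on a shared 1D index space.
ocean_partitions = [(0, 40), (40, 80), (80, 120)]
atmos_partitions = [(0, 60), (60, 120)]
print(overlaps(ocean_partitions, atmos_partitions))
# {(0, 0): (0, 40), (1, 0): (40, 60), (1, 1): (60, 80), (2, 1): (80, 120)}
```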

Driver, Definition, Composition and Deployment (diagram)
- Definition phase: each component (OCE, ATM, LAND) provides a PMIOD (Potential Model Input and Output Description) declaring the fields it can exchange, with their metadata: OCE – V1 out, V2 in; ATM – V1 in, V2 out, V3 out; LAND – V3 in, V4 in.
- Composition phase: the user writes one SMIOC (Specific Model Input and Output Configuration) per component, connecting each field to another component, to a file, or both, possibly with a transformation: OCE sends V1 to ATM (T1) and to file V1 and receives V2 from ATM (T2); ATM exchanges V1/V2 with OCE and sends V3 to LAND; LAND receives V3 from ATM and reads V4 from file V4.
- Deployment phase: the user provides the SCC (Specific Coupling Configuration) describing the whole coupled application (ATM, OCE, LAND, ...), which the Driver, together with the Transformer T, uses to launch and steer the run.
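The three phases can be summarised with a small, purely illustrative sketch. The real PMIOD, SMIOC and SCC are XML documents with their own schema, so the dictionaries, field names and consistency check below are hypothetical.

```python
# Hypothetical sketch of the information carried by the OASIS4 XML files
# (PMIOD, SMIOC, SCC); the real files are XML, so everything below is
# purely illustrative.

# Definition phase: each model ships a PMIOD listing its potential inputs/outputs.
pmiod = {
    "OCE":  {"V1": "out", "V2": "in"},
    "ATM":  {"V1": "in", "V2": "out", "V3": "out"},
    "LAND": {"V3": "in", "V4": "in"},
}

# Composition phase: the user writes one SMIOC per model, connecting each field
# to another model, to a file, or to both, possibly with a transformation.
smioc = {
    "OCE":  {"V1": [("ATM", "T1"), ("file:V1", None)], "V2": [("ATM", "T2")]},
    "ATM":  {"V1": [("OCE", "T1")], "V2": [("OCE", "T2")], "V3": [("LAND", None)]},
    "LAND": {"V3": [("ATM", None)], "V4": [("file:V4", None)]},
}

# Deployment phase: the SCC tells the Driver which executables to launch and how.
scc = {"models": ["OCE", "ATM", "LAND"], "transformer_processes": 1}

# Tiny consistency check: every coupled (non-file) connection must refer to a
# model that actually declares the field in its PMIOD.
for model, fields in smioc.items():
    for field, targets in fields.items():
        for peer, _transform in targets:
            if not peer.startswith("file:"):
                assert field in pmiod[peer], (model, field, peer)
print("composition is consistent with the PMIODs")
```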

Oasis – current status
- OASIS3_prism_2-2 available
- OASIS4 prototype available
- OASIS4 final PRISM version due 12/2004

File formats and grids
- NetCDF for grid and restart auxiliary files
- CF convention, under development, extending the COARDS conventions
- XML for model and script meta-data input (Fortran namelist and shell replacement)
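As a minimal sketch of what the NetCDF/CF choice looks like in practice (using the Python netCDF4 package rather than any PRISM tool, with an invented file and field), the fragment below writes a tiny grid carrying the kind of metadata CF adds on top of COARDS: standard_name, units and a Conventions attribute.

```python
# Minimal sketch of a CF-style NetCDF file, written with the Python netCDF4
# package (not a PRISM tool); the file name, grid and field are invented.
import numpy as np
from netCDF4 import Dataset

with Dataset("toy_grid.nc", "w") as nc:
    nc.Conventions = "CF-1.0"                 # global attribute naming the convention
    nc.createDimension("lat", 3)
    nc.createDimension("lon", 4)

    lat = nc.createVariable("lat", "f4", ("lat",))
    lat.standard_name, lat.units = "latitude", "degrees_north"
    lat[:] = [-45.0, 0.0, 45.0]

    lon = nc.createVariable("lon", "f4", ("lon",))
    lon.standard_name, lon.units = "longitude", "degrees_east"
    lon[:] = [0.0, 90.0, 180.0, 270.0]

    sst = nc.createVariable("sst", "f4", ("lat", "lon"))
    sst.standard_name, sst.units = "sea_surface_temperature", "K"
    sst[:] = 290.0 + np.zeros((3, 4))
```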

System architecture and user interface
The PRISM architecture is to provide an efficient climate modelling infrastructure to users and developers through:
- standardised interfaces
- remote functionality
- centralised administration
- distributed resources

Standard compile environment (SCE) and standard run environment (SRE)
- Finalising the SCE and SRE for the PRISM models.
- The system comprises 15 models (arpege_climat4, echam5, hamocc, lim, lmdz, mozart, mpi-om, oasis3, opa, orchidee, pisces, toy4opa, toyatm, toyche, toyoce), adapted to varying degrees to the PRISM standards, which can run in several combinations.
- Most of the models have been tested on a variety of platforms: NEC (SX), SGI (MIPS or IA64), Fujitsu (VPP), IBM (Power4).
- A tedious task without flashy graphics, but very useful!

PrepIFS

PrepIFS – earth system modeling via the Internet …

OASIS4 – GUI support: XML is designed to be read by machines, not humans!

SMS/WebCdp job scheduling and monitoring
- Complex automated scheduling
- Macro-parallelism
- Flexible inter-dependencies
- Interactive control
- Visual structure of large systems
- Used for all operational and research activities at ECMWF (~10 years)

Further tools …
- Diagnostics: PRISM processing and visualisation software (COCO, CDAT/VCS and VTK); at ECMWF, MARS/Vis5D/Metview
- Web GUI: database and diagnostics web interface using web-access server technology (DODS and LAS); at ECMWF, Web-MARS

A PRISM sustained team
- A document by the PRISM Steering Group proposing the establishment of a PRISM sustained team of 7 people was sent to the European climate modelling community (June 2004).
- First preparation meeting: August 17th, 2004.
- Target: signature of the Consortium Agreement in 01/2005.
- MPI (Germany), CERFACS (France), ECMWF (EU), CNRS (France), MetOffice (UK), NCAS (UK) and CCRLE (Germany) have already expressed strong interest.
- Additional FP6 funding (March 2005)?
- Enthusiasm is still high!

Further information

Oasis4: in the prototype (05/2004)
- Access and use of XML information
- Coupling and I/O of n parallel applications with m components
- Coupling exchange with repartitioning, direct or through the Transformer
- Interpolations: PSMILe (non-exact) parallel neighbourhood search; 2D/3D nearest-neighbour, 3D linear, 2D linear
- I/O: single and parallel mode
- Coupling and I/O exchange from one source to many targets
- Local transformations (scatter, gather, add or multiply by a scalar, statistics)
- Basic time transformations (average, accumulation, min, max)

Oasis4: still to be done
- PSMILe API for model access to SCC and SMIOC information
- Interpolation: more schemes (conservative, 3D, etc.); exact parallel neighbour search
- Transformer parallelisation (almost completed)
- Field reduction, combination
- Full support of vector and bundle fields (I/O OK)
- I/O: distributed mode (parNetCDF)
- Adaptive grids
- Unstructured grids