1
The PRISM infrastructure for Earth system models
Eric Guilyardi, CGAM/IPSL, and the PRISM Team
– Background and drivers
– PRISM project achievements
– The future
2
Why a common software infrastructure?
Earth system modelling expertise is widely distributed:
– geographically
– thematically
3
Why a common software infrastructure?
Earth system modelling expertise is widely distributed:
– scientific motivation: facilitate sharing of scientific expertise and of models
– technical motivation: the technical challenges are large compared with the available effort
Need to keep scientific diversity while increasing efficiency, both scientific and technical.
Need for a concerted effort in view of initiatives elsewhere:
– the Frontier Project, Japan
– the Earth System Modeling Framework (ESMF), US
4
The PRISM concept
"Share Earth System Modelling software infrastructure across the community", in order to:
– share development, maintenance, and support
– aid performance on a variety of platforms
– standardize the model software environment
– ease the use of different climate model components
5
Expected benefits
High-performance ESM software, developed by dedicated IT experts, available to institutes and teams at low cost:
– helps scientists focus on science
– helps keep scientific diversity (survival of smaller groups)
Easier to assemble ESMs based on community models.
Shared infrastructure means increased scientific exchange.
Computer manufacturers are inclined to contribute:
– efficiency (porting, optimisation) on a variety of platforms
– next-generation platforms optimised for ESM needs
– easier procurements and benchmarking
– reduced computing costs
6
Software structure of an Earth System Model
Diagram of the layers: running environment, coupling infrastructure, scientific core, supporting software. "Share (i.e. f90)" marks the parts to be shared across the community.
7
The long-term view: towards standard ESM support library(ies)
Today: hardware, Fortran compiler, and the Earth System model (science + support + environment), maintained by modellers and IT experts together.
Tomorrow: hardware, Fortran compiler, a standard support library (incl. environment), and the Earth System model (science only), leaving the climate science work to the modellers.
8
The PRISM project
Program for Integrated Earth System Modelling
– 22 partners
– 3 years, Dec 2001 to Nov 2004
– 5 M€ funding under FP5 of the EC (~80 person-years)
– Coordinators: G. Brasseur and G. Komen
9
System specifications
The science (the community models): atmosphere, atmospheric chemistry, ocean, ocean biogeochemistry, sea ice, land surface, …; general principles; constraints from physical interfaces, …
The technical developments (the PRISM infrastructure): coupler and I/O; compile/run environment; GUI; visualisation and diagnostics
The modellers/users: requirements, beta testing, feedback
Let's NOT re-invent the wheel!
10
System specifications – the people
Reinhard Budich – MPI, Hamburg
Andrea Carril – INGV, Bologna
Mick Carter – Hadley Centre, Exeter
Patrice Constanza – MPI/M&D, Hamburg
Jérôme Cuny – UCL, Louvain-la-Neuve
Damien Declat – CERFACS, Toulouse
Ralf Döscher – SMHI, Stockholm
Thierry Fichefet – UCL, Louvain-la-Neuve
Marie-Alice Foujols – IPSL, Paris
Veronika Gayler – MPI/M&D, Hamburg
Eric Guilyardi* – CGAM, Reading and LSCE
Rosalyn Hatcher – Hadley Centre, Exeter
Miles Kastowsky – MPI/BGC, Jena
Luis Kornblueh – MPI, Hamburg
Claes Larsson – ECMWF, Reading
Stefanie Legutke – MPI/M&D, Hamburg
Corinne Le Quéré – MPI/BGC, Jena
Angelo Mangili – CSCS, Zurich
Anne de Montety – UCL, Louvain-la-Neuve
Serge Planton – Météo-France, Toulouse
Jan Polcher – LMD/IPSL, Paris
René Redler – NEC CCRLE, Sankt Augustin
Martin Stendel – DMI, Copenhagen
Sophie Valcke – CERFACS, Toulouse
Peter van Velthoven – KNMI, De Bilt
Reiner Vogelsang – SGI, Grasbrunn
Nils Wedi – ECMWF, Reading
* Chair
11
PRISM achievements (so far)
A well co-ordinated network of expertise; community buy-in and trust-building.
Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations.
Software environment (the tool box):
1. a standard coupler and I/O software: OASIS3 (CERFACS) and OASIS4 (see the coupling sketch after this list)
2. a standard compiling environment (SCE) at the scripting level
3. a standard running environment (SRE) at the scripting level
4. a graphical user interface (GUI) to the SCE (PrepIFS, ECMWF)
5. a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF)
6. standard diagnostic and visualisation tools
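To make the coupler concrete, here is a minimal, illustrative sketch of a toy component exporting one field through the OASIS3 PSMILe "proto" interface. It is not taken from the slides: the routine and module names follow the OASIS3 prototype API, but the component name 'toyoce', the field name 'SSTOCEAN', the grid sizes, and the exact argument lists are assumptions to be checked against the OASIS3 user guide.

```fortran
! Illustrative sketch only (not PRISM source): a toy "ocean" component
! exporting one field via the OASIS3 PSMILe "proto" interface.
! Module/routine names follow OASIS3; check exact signatures in the guide.
PROGRAM toy_ocean
  USE mod_prism_proto                ! init/enddef/terminate, PRISM_* constants
  USE mod_prism_def_partition_proto  ! prism_def_partition_proto
  USE mod_prism_put_proto            ! prism_put_proto
  IMPLICIT NONE
  INTEGER, PARAMETER :: nx = 100, ny = 50, nsteps = 24, dt = 3600  ! assumed
  INTEGER :: comp_id, part_id, var_id, local_comm, ierr, istep
  INTEGER :: paral(3), var_nodims(2), var_shape(4)
  REAL    :: sst(nx, ny)             ! precision must match the coupler build

  ! --- Definition phase ---
  CALL prism_init_comp_proto(comp_id, 'toyoce', ierr)  ! register component
  CALL prism_get_localcomm_proto(local_comm, ierr)     ! model-internal MPI comm

  paral = (/ 0, 0, nx*ny /)          ! "serial" (one-process) partition descriptor
  CALL prism_def_partition_proto(part_id, paral, ierr)

  var_nodims = (/ 2, 1 /)            ! rank-2 field, one bundle
  var_shape  = (/ 1, nx, 1, ny /)    ! index bounds per dimension
  ! 'SSTOCEAN' is a made-up symbolic name; it must match the namcouple file
  CALL prism_def_var_proto(var_id, 'SSTOCEAN', part_id, var_nodims, &
                           PRISM_Out, var_shape, PRISM_Real, ierr)
  CALL prism_enddef_proto(ierr)

  ! --- Run phase ---
  sst = 290.0                        ! stand-in for real model state
  DO istep = 0, nsteps - 1
    ! ... scientific core would update sst here ...
    CALL prism_put_proto(var_id, istep*dt, sst, ierr)  ! time in s since start
  END DO

  CALL prism_terminate_proto(ierr)   ! clean shutdown
END PROGRAM toy_ocean
```

The design point worth noting: the model code only names its fields; whether a given prism_put_proto call couples the field to another model, writes it to file, or does nothing at that date is configured outside the code, in the coupler's namcouple file.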
12
The PRISM shells
Diagram: the scientific core is the inner shell, wrapped by the PSMILe (coupling and I/O) alongside the historic I/O; the outer shells are the Standard Compile Environment and the Standard Running Environment.
13
Adapting Earth System components to PRISM
Diagram of the levels of adaptation: user interface, SCE, and SRE, then + PSMILe (PRISM Model Interface Library) and + PMIOD (Potential Model I/O Description). A sketch of the importing side of a PSMILe adaptation follows.
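As an illustration of what "adding PSMILe" to a component means in practice, here is the importing counterpart to the toy ocean above: a historic read of boundary conditions is replaced by prism_get_proto, with grid interpolation and transformations handled inside the coupler. Again a hedged sketch, not PRISM source: the names 'toyatm' and 'SSTATMOS', the sizes, and the argument lists are assumptions.

```fortran
! Illustrative counterpart: a toy "atmosphere" importing the field.
! Adaptation replaces a historic file read with a PSMILe "get".
PROGRAM toy_atmos
  USE mod_prism_proto
  USE mod_prism_def_partition_proto
  USE mod_prism_get_proto            ! prism_get_proto
  IMPLICIT NONE
  INTEGER, PARAMETER :: nx = 64, ny = 32, nsteps = 24, dt = 3600  ! assumed
  INTEGER :: comp_id, part_id, var_id, local_comm, ierr, istep
  INTEGER :: paral(3), var_nodims(2), var_shape(4)
  REAL    :: sst_in(nx, ny)          ! received on the atmosphere's own grid

  CALL prism_init_comp_proto(comp_id, 'toyatm', ierr)
  CALL prism_get_localcomm_proto(local_comm, ierr)
  paral = (/ 0, 0, nx*ny /)
  CALL prism_def_partition_proto(part_id, paral, ierr)

  var_nodims = (/ 2, 1 /)
  var_shape  = (/ 1, nx, 1, ny /)
  ! PRISM_In marks the field as imported; 'SSTATMOS' is again made up
  CALL prism_def_var_proto(var_id, 'SSTATMOS', part_id, var_nodims, &
                           PRISM_In, var_shape, PRISM_Real, ierr)
  CALL prism_enddef_proto(ierr)

  DO istep = 0, nsteps - 1
    ! Returns the field at the dates selected by the coupler configuration
    CALL prism_get_proto(var_id, istep*dt, sst_in, ierr)
    ! ... scientific core uses sst_in as its lower boundary condition ...
  END DO

  CALL prism_terminate_proto(ierr)
END PROGRAM toy_atmos
```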
14
Configuration management and deployment
Diagram: the user interface drives the SCE to build binary executables on disk; the SRE and a transfer driver handle deployment.
15
PRISM GUI remote functionality
Diagram: over the Internet, the user works through PrepIFS/SMS web services to reach the PRISM repositories (CSCS, MPI); configurations are deployed through a transfer driver to instrumented sites of architectures A, B, and C.
16
Standard scripting environments
– Standard Compiling Environment (SCE)
– Standard Running Environment (SRE)
17
Data processing and visualisation
18
Demonstration experiments
Diagram: platforms and assembled coupled models. CGAM contribution (Jeff Cole).
19
Development coordinators
– The coupler and I/O: Sophie Valcke (CERFACS)
– The standard environments: Stefanie Legutke (MPI)
– The user interface and web services: Claes Larsson (ECMWF)
– Analysis and visualisation: Mick Carter (Hadley Centre)
– The assembled models: Stefanie Legutke (MPI)
– The demonstration experiments: Andrea Carril (INGV)
20
Community buy-in
Growing!
– Workshops and seminars
– 15 pioneer models adapted (institute involvement)
– 9 test supercomputers instrumented
– Models distributed under the PRISM environment (ECHAM5, OPA 9.0)
– Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …)
To go further:
– PRISM perspective: maintain and develop the tool box
– Institute perspective: get the timing of, and involvement in, the next steps right
21
Collaborations
Active collaborations:
– ESMF (supporting software, PMIOD, MOM4)
– FLUME (PRISM software)
– PCMDI (visualisation, PMIOD)
– CF group (CF names)
– NERC (BADC & CGAM) (metadata, PMIOD)
– M&D, MPI (data)
– Earth Simulator (install PRISM system V.0)
(Diagram: the ESM layers again: coupling infrastructure, scientific core, supporting software, running environment.)
PRISM has put Europe in the loop for community-wide convergence on basic standards in ES modelling.
22
PRISM Final Project Meeting
De Bilt, October 7–8, 2004
23
The future
PRISM has delivered a tool box, a network of expertise, and demonstrations.
Community buy-in is growing.
Key need for sustainability of:
– tool box maintenance and development (new features)
– the network of expertise
PRISM sustained initiative: set-up meeting held in Paris, Oct 27, 2004.