ENES and PRISM: A European approach to Earth System modelling
Sophie Valcke, CERFACS and the PRISM team across Europe
Computing in Atmospheric Sciences Workshop 2003 Slide 2
Outline
- ENES
- The PRISM project: partners, goals, model components, standard physical interfaces, architecture and GUI
- PRISM first coupler Oasis3 (08/2003): configuration, communication, interpolations/transformations
- PRISM final coupler (12/2004): configuration, communication, interpolations/transformations
Computing in Atmospheric Sciences Workshop 2003 Slide 3
ENES
Climate research in Europe:
- societal/political needs in Europe are high (IPCC, mitigation, …)
- recognised excellence; scientific diversity (models, approaches, …)
How to organise Earth System modelling in Europe?
- The "one-big-centre-does-it-all" approach is not suitable: expertise lies within national centres, and diversity is key to research
- Need for shared infrastructures: software (PRISM) and hardware
Computing in Atmospheric Sciences Workshop 2003 Slide 4
ENES: European Network for Earth System modelling
- A "think tank" to organize, plan and seek funding for efficient distributed Earth System modelling in Europe
- Follows a EuroClivar recommendation
- Open to any institute/industry (MoU)
- Coordinated by Guy Brasseur (MPI, Hamburg)
- 50 members so far (http://enes.org)
Computing in Atmospheric Sciences Workshop 2003 Slide 5
ENES: a long-term strategy
1. Jointly develop shared software infrastructure for Earth System modelling (PRISM)
2. Provide a European integrated service to access and use this infrastructure
3. Provide and manage hyper-computing access by 2010
Computing in Atmospheric Sciences Workshop 2003 Slide 6
The PRISM project
PRISM: PRogram for Integrated Earth System Modelling
- A European project, started December 2001, funded for 3 years by the European Commission (4.8 M€)
- Coordinators: Guy Brasseur (MPI, Hamburg) and Gerbrand Komen (KNMI, Amsterdam)
- PRISM Director: Reinhard Budich (MPI)
Computing in Atmospheric Sciences Workshop 2003 Slide 7
PRISM partners
=> 22 partners: leading climate research institutes and computer vendors
MPG-IMET, Germany; KNMI, Netherlands; MPI-MAD, Germany; Met Office, UK; UREADMY, UK; IPSL, France; Météo-France, France; CERFACS, France; DMI, Denmark; SMHI, Sweden; NERSC, Norway; ETH Zurich, Switzerland; INGV, Italy; MPI-BGC, Germany; PIK, Germany; ECMWF, Europe; UCL-ASTR, Belgium; NEC Europe; FECIT/Fujitsu; SGI Europe; SUN Europe
Computing in Atmospheric Sciences Workshop 2003 Slide 8
PRISM goals
Help climate modellers spend more time on science: provide software infrastructure to
- easily assemble Earth system coupled models based on existing state-of-the-art European component models
- launch/monitor complex/ensemble Earth system simulations
Computing in Atmospheric Sciences Workshop 2003 Slide 9
PRISM goals
Define and promote technical and scientific standards for Earth System Modelling:
- Scientific standards: physical interfaces between model components; global Earth System parameters
- Technical standards: compiling, running and post-processing environment; architecture and Graphical User Interface; coupler and I/O software; data and grid format; coding and quality
- Interaction with other groups (ESMF, ESG/NOMADS, CF, RPN?, ...)
Computing in Atmospheric Sciences Workshop 2003 Slide 10
PRISM model components
- Atmosphere: Météo-France (ARPEGE), MPG-IMET (ECHAM), IPSL (LMDZ), MetOffice (Unified Model), UREADMY, INGV
- Atmospheric Chemistry: MPG-IMET, UREADMY, IPSL, MetOffice, Météo-France, KNMI
- Land Surface: IPSL (Orchidée), MetOffice, MPG-IMET, UREADMY, Météo-France (ISBA)
- Sea Ice: NERSC, UCL-ASTR, MetOffice, IPSL, MPG-IMET
- Ocean Biogeochemistry: MPI-BGC, IPSL, MPG-IMET, MetOffice
- Ocean: UREADMY, MetOffice (FOAM), MPI-M (HOPE), IPSL (OPA/ORCA)
- Regional Climate: SMHI, DMI, MetOffice
- Coupler: CERFACS, NEC, CCRLE, FECIT, SGI, MPI-MAD
Computing in Atmospheric Sciences Workshop 2003 Slide 11
A proposal for PRISM standard Ocean-Atmosphere-Sea Ice physical interfaces (coupling diagram linking the atmosphere model, land surface model, surface layer turbulence scheme, ocean surface module, sea ice model, wave model and ocean model):
- Atmosphere to surface: rainfall + internal energy; snowfall + internal energy; incoming solar radiation; solar zenith angle; fraction of diffuse solar radiation; downward infrared radiation; sensitivity of atmospheric temperature and humidity to surface fluxes
- Surface to atmosphere (per sea-ice or land-surface category): sensible heat flux; surface emissivity; direct albedo; diffuse albedo; surface radiative temperature; evaporation + internal energy [+ latent heat]; wind stress; plus subgrid fractions
- Atmosphere to surface layer turbulence: surface pressure; air temperature, humidity and wind; wind module; height of these variables. Returned: transfer coefficients Cd, Ce, Ch
- Surface module to ocean/sea ice (per sea-ice category, incl. open ocean): non-solar heat flux; solar radiation; fresh water flux; salt flux; wind stress; U^3; mass of snow and ice; plus subgrid fractions. Returned: surface temperature, surface roughness and displacement height (per sea-ice or land-surface category); surface velocity (per sea-ice category)
- Ocean model fields: continental runoff + internal energy; temperature and salinity at the sea-ice base; sea surface temperature; surface radiative temperature; surface ocean current; sea surface salinity; surface height; solar radiation absorbed in the first oceanic layer; iceberg parameters
Computing in Atmospheric Sciences Workshop 2003 Slide 12
PRISM architecture and GUI
- PRISM central server + PRISM local sites
- GUI: adaptation of the ECMWF prepIFS and SMS scheduler
- (Diagram: the user/developer accesses the PRISM central server and the PRISM local sites via the Web)
Computing in Atmospheric Sciences Workshop 2003 Slide 13
PRISM first coupler: Oasis3
Based on Oasis, developed since 1991 at CERFACS to couple existing GCMs developed independently at the time:
- models at relatively low resolution (~10000-20000 points)
- small number of 2D coupling fields (~10)
- low coupling frequency (~once per day)
=> flexibility was very important, efficiency not so much!
Oasis3 performs:
- synchronisation of the component models
- coupling field exchange and interpolation
- I/O actions
Tested on VPP5000, NEC SX5, SGI Octane and O3000, Compaq Alpha cluster, Linux PC cluster (MPI-Globus).
Computing in Atmospheric Sciences Workshop 2003 Slide 14
PRISM project first coupler: Oasis3
Oasis regular users: CERFACS, METEO-FRANCE (France), IPSL - LODYC, LMD, LSCE (France), ECMWF (UK), UCL (Belgium), MPI-M&D (Germany), SMHI (Sweden), BMRC (Australia), IRI (USA), …
… and: AWI (Germany), PIK (Germany), Met Office (UK), UGAMP (UK), KNMI (Netherlands), CSIRO (Australia), FSU/COAPS (USA), LASG (China), JAMSTEC (Japan), …?
Computing in Atmospheric Sciences Workshop 2003 Slide 15
PRISM project first coupler: Oasis3
Oasis3 configuration: in the text file namcouple, read by Oasis3 at the beginning of the run (a schematic fragment is sketched below), e.g.
- total run time
- number and names of component models
- number and names of coupling fields; for each field: source and target symbolic names, coupling and/or I/O status, coupling or I/O period, transformations/interpolations, …
Component model grids (longitudes, latitudes, masks, mesh surfaces, mesh corner locations) must be available in binary or NetCDF files.
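As an illustration of the kind of information namcouple holds, here is a schematic fragment. The model and field names, the numerical values, and the per-field entries are simplified placeholders; this sketch does not reproduce the exact Oasis3 syntax of each line.

  # namcouple - schematic sketch only, not verbatim Oasis3 syntax
  $RUNTIME
    86400                      # total run time of the simulation, in seconds (illustrative)
  $NBMODEL
    2   toyatm   toyoce        # number and names of the component models (illustrative)
  $STRINGS
  # one block per coupling field: source symbolic name, target symbolic name,
  # coupling or I/O period (s), then the transformations to apply (entries simplified)
    FSENDSST   FRECVSST   3600
    CHECKIN  SCRIPR  CHECKOUT
  $END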
Computing in Atmospheric Sciences Workshop 2003 Slide 16
PRISM project first coupler: Oasis3
Oasis3 communication:
- New PRISM System Model Interface library (PSMILe) based on MPI1 or MPI2 message passing
- Parallel communication between parallel models and the Oasis3 interpolation process
- Direct communication between models with the same grid and partitioning
- I/O functionality (automatic switch between coupled and forced mode)
- Modularity: at each model time step, the exchange is performed or not depending on the user's specifications in namcouple
- Automatic time integration depending on the user's specification
Computing in Atmospheric Sciences Workshop 2003 Slide 17
PRISM project first coupler: Oasis3
Oasis3 interpolations/transformations:
=> performed by a separate sequential process
=> on 2D scalar fields only
- Interfacing with the RPN Fast Scalar INTerpolator package: nearest-neighbour, bilinear and bicubic for regular lat-lon grids
- Interfacing with the SCRIP 1.4 library (Los Alamos Software Release LACC 98-45): nearest-neighbour, 1st- and 2nd-order conservative remapping for all grids (the first-order formula is sketched after this list); bilinear and bicubic interpolation for "logically rectangular" grids
- Bilinear and bicubic interpolation for reduced atmospheric grids
- Other spatial transformations: flux correction, merging, etc.
- General algebraic operations
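To make the conservative remapping option concrete: first-order conservative remapping computes each destination cell mean as an area-weighted average of the overlapping source cell means (the notation below is ours, not taken from the slides):

  \bar{F}_d = \frac{1}{A_d} \sum_{s} A_{s \cap d}\, \bar{F}_s , \qquad A_d = \sum_{s} A_{s \cap d}

where \bar{F}_s is the field mean on source cell s and A_{s \cap d} is the overlap area between source cell s and destination cell d (for an unmasked, fully covered destination cell). This choice of weights preserves the area integral of the field through the regridding.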
Computing in Atmospheric Sciences Workshop 2003 Slide 18
PRISM project final coupler
- Higher resolution, parallel and scalable models
- Higher coupling frequencies desirable
- Higher number of models and (3D) coupling fields
=> Need to optimise and parallelise the coupler
The final PRISM coupler will be composed of: a Driver, a Transformer, and a new PRISM System Model Interface Library.
Computing in Atmospheric Sciences Workshop 2003 Slide 19
PRISM project final coupler
Final coupler configuration (XML files):
- The user chooses the models through the GUI. Each component model comes with an Application Description (AD) and a Potential Model Input and Output Description (PMIOD).
- The user configures his particular coupled run through the GUI: total run time, etc.; for each field described in the PMIOD: source or target, coupling or I/O status, coupling or I/O period, transformations/interpolations, etc.
- Based on the user's choices, the GUI produces the XML configuration files.
- At run time, the Driver reads and distributes the configuration information; the PSMILes and the Transformer act according to the user's specifications.
Computing in Atmospheric Sciences Workshop 2003 Slide 20
PRISM project final coupler
Final coupler communication:
- More elaborate PSMILe based on MPI1 or MPI2 (grid definition transferred through the PSMILe API)
- Modularity as for Oasis3: at each model time step, the exchange is performed or not depending on the user's specifications
- As for Oasis3, automatic time integration
- As for Oasis3, I/O functionality (automatic switch between coupled and forced mode)
- Parallel communication: as for Oasis3, plus repartitioning
- Parallel calculation of interpolation weights and addresses in the source PSMILe
- Extraction of the useful part of the source field only
Computing in Atmospheric Sciences Workshop 2003 Slide 21
PRISM project final coupler
Final coupler interpolations/transformations: as for Oasis3, plus
- support of vector fields
- support of 3D fields
- more flexibility for field combination/merging, etc.
Computing in Atmospheric Sciences Workshop 2003 Slide 22
Conclusions
- ENES and PRISM
- PRISM first coupler: Oasis3, now available
- PRISM final coupler prototype due 11/2003
- PRISM final coupler due 12/2004
… and after PRISM?
- Follow-on project to be re-submitted at the next EU call in 2004 (CAPRI rejected)
- International interaction and collaboration essential in all cases!
http://www.enes.org ; http://www.cerfacs.fr/PRISM/prism.html
valcke@cerfacs.fr
Computing in Atmospheric Sciences Workshop 2003 Slide 23
PRISM project first coupler: Oasis3
Oasis3 communication; PSMILe API:
- Initialization: call prism_init_comp(…)
- Retrieval of the component model local communicator: call prism_get_localcomm(…)
- Coupling or I/O field declarations (name, type, shape, local partition, …): call prism_def_var(field_idx, …)
- End of definition: call prism_enddef(…)
- In the model time-stepping loop, coupling or I/O field exchange: call prism_put(field_id1, time, field_array1, ierror), call prism_get(field_id2, time, field_array2, ierror)
  => automatic averaging/accumulation, coupling exchange, and/or I/O depending on the time argument and the user's specifications in namcouple
- Termination: call prism_terminate(…)
These calls are assembled into a skeleton below.
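Put together, the calls above give the Fortran skeleton below. This is only a sketch of how a component model might use the Oasis3 PSMILe: the elided argument lists (…) are left exactly as on the slide, and the field identifiers, loop bounds and time-step value are illustrative assumptions.

      ! Skeleton of a component model using the Oasis3 PSMILe (sketch only)
      call prism_init_comp(…)                 ! initialization
      call prism_get_localcomm(…)             ! get the component model local communicator
      call prism_def_var(field_id_sst, …)     ! declare a coupling/IO field sent by this model
      call prism_def_var(field_id_flx, …)     ! declare a coupling/IO field received by this model
      call prism_enddef(…)                    ! end of the definition phase

      do istep = 1, nsteps                    ! model time-stepping loop
         time = (istep - 1) * dt              ! time in seconds since the start of the run
         call prism_get(field_id_flx, time, field_array2, ierror)  ! exchange and/or I/O,
         ! ... one model time step ...                             ! depending on the time
         call prism_put(field_id_sst, time, field_array1, ierror)  ! argument and namcouple
      end do

      call prism_terminate(…)                 ! termination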
Computing in Atmospheric Sciences Workshop 2003 Slide 24
PRISM project final coupler
Final coupler communication; PSMILe API: as for the Oasis3 PSMILe, plus
- Definition of the grid (1D, 2D, 3D): call prism_def_grid(…), call prism_set_corners(…), call prism_set_mask(…)
- Definition of the grid for vector and bundle fields: call prism_set_vector(…), call prism_set_subgrid(…)
- Coupling or I/O field declarations support vector, bundle, 1D, 2D and 3D fields
- Extraction of SCC and SMIOC information: call prism_get_persist(…)
(see the definition-phase sketch below)
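Relative to the Oasis3 skeleton shown earlier, the definition phase of a component would gain the grid-related calls roughly as sketched below; the call order is an assumption based only on the names listed above, and the elided arguments (…) are kept as on the slide.

      ! Definition phase with the final-coupler PSMILe (sketch only)
      call prism_init_comp(…)
      call prism_get_localcomm(…)
      call prism_def_grid(…)                  ! declare the model grid (1D, 2D or 3D)
      call prism_set_corners(…)               ! provide the mesh corner locations
      call prism_set_mask(…)                  ! provide the grid mask
      call prism_set_vector(…)                ! grid information for vector fields
      call prism_set_subgrid(…)               ! grid information for bundle fields
      call prism_def_var(…)                   ! field declarations (vector, bundle, 1D/2D/3D)
      call prism_get_persist(…)               ! extract SCC and SMIOC information
      call prism_enddef(…)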
PRISM project final coupler: example of the configuration phases for a three-component (ATM, OCE, LAND) coupled model (diagram):
- Definition phase: each component comes with its Application Description (AD) and its PMIOD listing potential inputs/outputs with metadata: OCE PMIOD (V1 out, V2 in), ATM PMIOD (V1 in, V2 out, V3 out), LAND PMIOD (V3 in, V4 in).
- Composition phase: the user produces the SCC covering ATM, OCE and LAND, and one SMIOC per component: ATM SMIOC (V1 from OCE with transformation T1, V2 to OCE with T2, V3 to LAND), OCE SMIOC (V1 to ATM with T1, V2 from ATM with T2), LAND SMIOC (V3 from ATM, V4 from file fileV4).
- Deployment phase: the Driver and the Transformer T run with OCE, ATM and LAND; V1, V2 and V3 are exchanged between the components and V4 is read from fileV4.
PRISM project final coupler: generic example of the configuration phases for models Mi, Mj and Mk (diagram). PMIOD: Potential Model Input and Output Description; SMIOC: Specific Model Input and Output Configuration; SCC: Specific Coupling Configuration; Mi: Model i; T: Transformer.
- Definition phase: Mi PMIOD (V1 out, V2 out, V3 in), Mj PMIOD (V1 in, V4 out), Mk PMIOD (V4 in, V5 in), each with the corresponding metadata.
- Composition phase: the user writes the SCC (V1: Mi -> Mj, Tli, Tnlij; V2: Mi -> Mj, Tij (+ V6); V4: Mj -> Mk) and one SMIOC per model: Mi SMIOC (V1: cf SCC, V2: cf SCC, V3: in from fileV3, Tli), Mj SMIOC (V1: cf SCC, V4: cf SCC), Mk SMIOC (V4: cf SCC, V5: in from fileV5, TnlV5k).
- Deployment phase: the Driver and the Transformer T run with Mi, Mj and Mk; V1, V2 and V4 are exchanged between the models, V3 and V5 are read from fileV3 and fileV5, and fileV6 provides V6.
Computing in Atmospheric Sciences Workshop 2003 Slide 27
Software structure of an Earth System Model (diagram): running environment, coupling infrastructure, supporting software and scientific code, with the layers around the scientific code marked as candidates to share.
Computing in Atmospheric Sciences Workshop 2003 Slide 28
Ongoing PRISM / ESMF collaboration (diagram): the Earth System Model software layers (running environment, coupling infrastructure, supporting software, user code) and the parts of this structure addressed by PRISM and by ESMF.