ECMWF "uncoupled" coupled model: Coupling using subroutine calls


1 ECMWF "uncoupled" coupled model: Coupling using subroutine calls
CW2015, Kristian S. Mogensen, Marine Prediction Section

2 Overview of talk:
- Baseline: the focus of this talk is on the coupling of IFS and WAM to NEMO
- Motivation:
  - Why do we need to couple the ocean to the atmosphere at ECMWF in the first place?
  - A brief history of coupled model setups at ECMWF
  - Technologies used for coupling in the past
  - Coupling requirements for NWP versus climate modelling
- The current coupled model based on IFS-WAM-NEMO:
  - How does it work, and in particular how do we glue IFS+WAM together with the NEMO model?
  - Scalability of the coupled model
- Conclusions and outlook

3 Why do we need an ocean model for NWP?
- The ocean is the lower boundary of the atmosphere over a large part of the Earth
- Accurately modelling this lower boundary should give better feedback to the atmosphere
- For long-range predictions such as seasonal and monthly forecasting this is very important (e.g. ENSO)
- For medium-range forecasting the ocean state (including sea ice) can change on a daily time scale:
  - Sea-ice modelling
  - Hurricane feedback to/from the ocean
- Today we use a coupled model for the following systems:
  - The ensemble prediction system (ENS), from day 0 to 15, with the monthly extension to day 32 twice per week
  - The seasonal forecasting system

4 Hurricane modelling with our coupled model.

5 A short history of coupled modelling at ECMWF
- IFS (atmosphere) coupled to HOPE (ocean) from around 1997:
  - OASIS2, coupling via files/pipes
  - Seasonal forecasting: System 1 to 3
  - Monthly forecasting:
    - Originally a separate system
    - Became part of VarEPS (now ENS) in March 2008, coupled in leg B (day 10 to 15) and leg C (beyond day 15)
- IFS coupled to NEMO (ocean) from November 2011:
  - OASIS3, coupling via MPI using MPMD execution
  - Seasonal System 4 + VarEPS/ENS
  - Still used for seasonal forecasting
- IFS coupled to NEMO in a single executable from November 2013 in ENS:
  - No external coupler
  - Coupling between the WAM component and NEMO
  - ENS coupled in all 3 legs (coupling from day 0)

6 Coupling requirements for NWP versus climate modelling
- Prediction from the short range out to seasonal forecasting is to a large extent an initial-condition problem:
  - You can't make accurate predictions unless you know your initial state
  - We need data assimilation systems for atmosphere, land, ocean, waves, ...
- Data assimilation systems can depend on the model formulation:
  - E.g. you need the tangent-linear and adjoint models for a 4D-Var system
  - A large amount of human resources goes into developing the DA systems
  - Where we use external models (like the NEMO model for the ocean) we stick with one model for many years
- A consequence is that we don't care too much about being very flexible with the choice of models; a fit-for-purpose solution is more important
- Coupling in all forecasting systems is a priority for ECMWF for the future

7 Current coupled system:
- Design principle: the atmosphere and wave models don't know anything about the inner workings of NEMO:
  - There is a coupling layer, which has access to all NEMO F90 modules (e.g. data), but only accepts data from IFS/WAM as arguments
  - Coupling fields to/from NEMO are passed as subroutine arguments to this layer in IFS/WAM
  - Grid information from IFS/WAM needs to be passed as arguments as well
- All regridding is done within the interface layer:
  - Interpolation weights are computed outside the model and read from a file
  - The interpolation is done in parallel, with the minimum of source-field information communicated to the individual MPI tasks: if destination points a and b on task N both need source point c from task M, then c is sent to task N only once (see the sketch after this list)
- Coupling is sequential, with the calling sequence Atmosphere → Waves → Ocean
- Time stepping in the atmosphere model (IFS) drives the coupling
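The "sent only once" idea can be illustrated with a minimal sketch. This is not the ECMWF code, just the deduplication step in isolation; the names (num_links, src_address, nsendtot, send_address) are chosen to mirror the pinfo structure shown on slide 12.

PROGRAM dedup_send_list
  IMPLICIT NONE
  INTEGER, PARAMETER :: num_links = 6
  ! Global source points referenced by the remapping links of one destination
  ! task; point 12 is needed by several destination points on that task.
  INTEGER :: src_address(num_links) = (/ 12, 7, 12, 3, 7, 12 /)
  INTEGER :: send_address(num_links)   ! unique points only, at most num_links
  INTEGER :: nsendtot, i, j
  LOGICAL :: found

  nsendtot = 0
  DO i = 1, num_links
    found = .FALSE.
    DO j = 1, nsendtot
      IF (send_address(j) == src_address(i)) found = .TRUE.
    END DO
    IF (.NOT. found) THEN
      nsendtot = nsendtot + 1
      send_address(nsendtot) = src_address(i)
    END IF
  END DO

  ! Only nsendtot (=3 here) values cross the network instead of num_links (=6).
  PRINT *, 'links:', num_links, ' unique source points to send:', nsendtot
END PROGRAM dedup_send_list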

8 Schematic overview of coupled system:
The LIM2 model needs additional fields from the IFS

9 Coupling from the point of view of the atmosphere:
IF (LNEMOCOUP) CALL ININEMO
DO JSTEP=NSTAR2,ISTOP
  <<BORING ATMOSPHERE STUFF>>
  ! Wave coupling
  IF (LLWVTIME) THEN
    CALL WVCOUPLE(TSTEP,NSTPW,LLSTOP,LLWRRW)
  ENDIF
  ! Ocean coupling every NFRCO atmosphere time steps
  IF (NFRCO /= 0 .AND. LNEMOCOUP .AND. LMCC04) THEN
    IF (MOD(NSTEP,NFRCO) == 0) THEN
      IF (LWCOU) THEN
        CALL UPDNEMOFIELDS(LWSTOKES)
        CALL UPDNEMOSTRESS
      ENDIF
      CALL COUPLNEMO(NSTEP)
    ENDIF
  ENDIF        ! (closing ENDIFs were elided on the original slide)
ENDDO

10 Coupling initialization (below ininemo)
In nemogcmcoup_coupinit.F90:

CALL scripremap_read(cdfile_gauss_to_T,remap_gauss_to_T)
CALL parinter_init( mype, npes, icomm, &
   & npoints, nglopoints, nlocmsk, ngloind, & ! IFS
   & nopoints, noglopoints, omask, ogloind, & ! NEMO
   & remap_gauss_to_T, gausstoT, …)
CALL scripremap_dealloc(remap_gauss_to_T)

where:
- remap_gauss_to_T: structure containing the interpolation weights
- gausstoT: structure containing the information for the parallel interpolation

This is done for all relevant grid combinations (IFS-NEMO, WAM-NEMO). Interpolation weights are computed outside the coupled model.
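As a rough sketch of what the weights file provides (an assumption based on the SCRIP remapping convention and the names used in parinter_fld on slide 12, not the actual ECMWF source), remap_gauss_to_T can be thought of as a sparse interpolation operator stored as a list of (source, destination, weight) links:

MODULE scripremap_types
  IMPLICIT NONE
  TYPE scripremap
    INTEGER :: num_links                               ! number of non-zero weights (links)
    INTEGER,          ALLOCATABLE :: src_address(:)    ! global source grid index per link
    INTEGER,          ALLOCATABLE :: dst_address(:)    ! global destination grid index per link
    DOUBLE PRECISION, ALLOCATABLE :: remap_matrix(:,:) ! interpolation weight(s) per link
  END TYPE scripremap
END MODULE scripremap_types

parinter_init then presumably distributes these global links over the MPI tasks and builds the send/receive lists (counts, displacements and addresses) that parinter_fld uses at every coupling step.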

11 Coupled running of NEMO (couplnemo)
How the data is transferred to NEMO, NEMO is time-stepped, and the fields are received by the IFS (interface routines were shown in blue on the slide):

! Update NEMO forcing fields
CALL NEMOGCMCOUP_UPDATE( MYPROC-1, NPROC, MPL_COMM, NGPTOT, &
   & ZSTRSU, ZSTRSV, ZFRSOS, ZFCHAS, ZFHUMS, &
   & KSTEP, LNEMOFLUXNC )
! NEMO time stepping
DO JSTPNEMO=NEMOCSTEP,NEMOCSTEP+NEMONSTEP-1
  ! Advance the NEMO model 1 time step
  CALL NEMOGCMCOUP_STEP( JSTPNEMO, IDATE, ITIME )
ENDDO
! Update IFS coupling fields
CALL NEMOGCMCOUP_GET( MYPROC-1, NPROC, MPL_COMM, &
   & NGPTOT, ZGSST, ZGICE, ZGUCUR, ZGVCUR )

- In NEMOGCMCOUP_UPDATE the argument fields are regridded and the relevant variables in NEMO are updated
- The routine NEMOGCMCOUP_STEP basically just calls the standard NEMO time step
- In NEMOGCMCOUP_GET the NEMO fields are regridded onto the argument variables on the atmosphere grid

12 The actual parallel interpolation/regridding:
Parallel regridding routine (pinfo = gausstoT, gausstoU, …):

SUBROUTINE parinter_fld( mype, nproc, mpi_comm, &
   & pinfo, nsrclocpoints, zsrc, ndstlocpoints, zdst )
  ! (declarations omitted on the slide)
  ! Gather the local source points that have to be sent to other tasks
  DO i=1,pinfo%nsendtot
    zsend(i)=zsrc(pinfo%send_address(i))
  ENDDO
  ! One collective exchange of the needed source points between all tasks
  CALL mpi_alltoallv(&
    & zsend,pinfo%nsend(0:nproc-1), &
    & pinfo%nsdisp(0:nproc-1),mpi_double_precision, &
    & zrecv,pinfo%nrecv(0:nproc-1), &
    & pinfo%nrdisp(0:nproc-1),mpi_double_precision, &
    & mpi_comm,istatus)
  ! Apply the precomputed weights: a sparse matrix-vector product over the links
  zdst(:)=0.0
  DO i=1,pinfo%num_links
    zdst(pinfo%dst_address(i)) = zdst(pinfo%dst_address(i)) + &
      & pinfo%remap_matrix(1,i)*zrecv(pinfo%src_address(i))
  END DO
END SUBROUTINE parinter_fld

Called by the NEMOGCMCOUP_UPDATE and NEMOGCMCOUP_GET routines.

13 Scalability of the coupled model
- Scalability runs for a 10-day forecast period:
  - ENS resolution (T639_L91, around 31 km): the ocean and the atmosphere are about equal in cost
  - HRES resolution (T1279_L137, around 15 km): the ocean is cheap compared to the atmosphere
- All runs were done on the production Cray XC-30 systems at ECMWF, which means interference from other running jobs (file system load etc.)
- Initialization is ignored
- No output, but normal input
- Coupling overhead was investigated by changing the frequency of coupling:
  - This changes the solution, but frequent coupling is a goal
  - This was done without the ice model: since the ice model is called at every coupling step, more frequent coupling would also increase the cost of the coupled system through the added cost of LIM2
- A production rate of 240 forecast days per day is required for operations (i.e. the 10-day test forecast has to complete in one hour of wall-clock time)

14 Scalability of the coupled model at ENS resolutions

15 Effect of coupling frequency: No measurable slowdown.

16 Scalability of the coupled model at HRES resolutions

17 Effect of coupling frequency: No measurable slowdown

18 Scalability observations
- The coupled model scales reasonably well, but there is room for improvement (as always)
- In principle we could implement a coupled HRES system today (in terms of run times only)
- Very little overhead in the actual coupling on a production system:
  - More frequent coupling should be more expensive
  - A dedicated system might reveal some difference
- More frequent coupling currently means calling the ice model more frequently, since it is called at every coupling step:
  - This makes physical sense, since the forcing fields of the ice model are updated
  - But it comes at an added cost

19 Challenges for our coupled model: grids.
- Top left: Gaussian N128 reduced atmosphere grid
- Top right: ORCA1 ocean grid
- Bottom left: 1.0 degree reduced wave grid
- 16 domains for all grids; each colour represents a domain

20 Challenges for our coupled model: grids (2)
- Grids from the previous slide
- 16-point stencil used in the interpolation
- Little overlap of the domain decompositions means all interpolations need communication:
  - Short messages, of the order of kbytes
  - Packing all fields together could be done to decrease the number of exchanges (a minimal sketch follows below); this is especially important when coupling with the LIM ice model
- A solution could be to reshuffle the domains in NEMO, but that would require significant changes to the NEMO model
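The packing idea is sketched below. This is an illustration under our own assumptions, not ECMWF code, and it leaves MPI out entirely: it only shows how several coupling fields could share one send buffer so that a single mpi_alltoallv call replaces one call per field (the per-task counts and displacements of parinter_fld would simply be multiplied by the number of fields).

PROGRAM pack_fields
  IMPLICIT NONE
  INTEGER, PARAMETER :: nfields  = 3        ! e.g. SST, ice fraction, a current component
  INTEGER, PARAMETER :: nsendtot = 4        ! source points this task has to send
  DOUBLE PRECISION :: zsrc(nsendtot, nfields)   ! fields on the local source points
  DOUBLE PRECISION :: zsend(nfields*nsendtot)   ! single packed send buffer
  INTEGER :: i, jf

  CALL RANDOM_NUMBER(zsrc)

  ! Pack: all fields for a given source point are stored contiguously, so each
  ! point still crosses the network only once but now carries nfields values.
  DO i = 1, nsendtot
    DO jf = 1, nfields
      zsend((i-1)*nfields + jf) = zsrc(i, jf)
    END DO
  END DO

  ! zsend would now go into one collective exchange instead of nfields separate
  ! ones, turning several short kbyte-sized messages into one larger message.
  PRINT *, 'packed', nfields*nsendtot, 'values into one send buffer'
END PROGRAM pack_fields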

21 Conclusions and outlook
- The single-executable coupled model works reasonably well at the current operational resolutions:
  - As always: improvements are possible
- With our setup the coupled model almost doesn't feel like a coupled model:
  - No fundamental technical difference between calling the atmospheric physics and the ocean model (besides the regridding)
  - The actual "coupling" code is quite small (~2700 lines of code), with about 500 lines of changes to NEMO
- It has been used for ensemble medium-range forecasting (ENS) for close to a year now, with a low-resolution (1 degree) ocean configuration
- Work on integrating the coupled model into other systems has been done or is ongoing:
  - A weakly coupled data assimilation prototype has been developed in the context of reanalysis
- More work on high-resolution coupled modelling will be done in the near future

22 Questions?

