Grid refinement in ROMS



Presentation on theme: "Grid refinement in ROMS" — Presentation transcript:

1 Grid refinement in ROMS

2 Two-way grid refinement
SST shown for the USeast grid (5 km) and the Carolinas grid (1 km). Grid refinement of the ocean and wave models is required to allow increased resolution in coastal areas.

3 Parent grid (just look at rho points for now)
[Figure: parent grid rho points, with interior indices running 1 to Lm and 1 to Mm, and full dimensions L and M.]

4 Child grid connectivity (for Nref=5)
[Figure: child grid connectivity for Nref=5. Lower-left and upper-right parent psi points identify the child region (create_nested_grid.m). The inner child region is bounded by the parent-child interface; the child grid carries 4 extra rho points to the left and bottom and 3 extra rho points to the right and top, so its rho-point indices run from -3 to L+2 and from -3 to M+2.]

5 Time stepping
[Diagram: the parent (grid 1) advances with time step dtp at iic = 0, 1, 2, ...; the child (grid 2) takes several smaller steps of dtc within each parent step, with numbered stages marking the order of the calls below.]

main3d.F (dtp and dtc):

# ifdef REFINED_GRID
!  Get data from parent grid at time of parent.  Only need
!  to do this once per child loop.
      IF (ng.gt.1) THEN
        IF (get_refdata(ng).eq.TRUE.) THEN
          CALL get_2dparent_data (ng, TILE)
          CALL get_3dparent_data (ng, TILE)
        END IF
      END IF
!  Put the child data back into the parent grid.
      IF ((ng.lt.Ngrids).and.(iic(ng).gt.1)) THEN
        CALL step3dref_t (ng, TILE)
        CALL set_2dchild_data (ng, TILE)
        CALL set_depth (ng, TILE)
        CALL set_3dchild_data (ng, TILE)
      END IF
!  Interpolate the parent data to child time.
      CALL set_2dparent_data (ng, TILE)
      CALL set_3dparent_data (ng, TILE)
# endif
      ...... main model time step computations .....

initial.F:

!  Read in initial conditions from initial NetCDF file.
!
      CALL get_state (ng, iNLM, 1, INIname(ng), IniRec, Tindex)
      ......
#ifdef REFINED_GRID
!
!  Compute indices of children grid locations in the parent grid.
!  For 2-way, this needs to be done for all grids, not just ng>1.
!
      IF (ng.lt.NestedGrids) THEN
        CALL init_child_hindices (ng, TILE)
      END IF
      IF (ng.gt.1) THEN
        CALL init_parent_hindices (ng, TILE)
!  Obtain initial boundary conditions from the parent data.
        CALL get_2dparent_data (ng, TILE)
# ifdef SOLVE3D
        CALL get_3dparent_data (ng, TILE)
# endif
      END IF
#endif

6 How to prepare a refined grid application for ROMS
1) Create child grid
2) Interp better bathy to child grid
3) Match parent - child bathy + mask
4) 3D init and climatology
5) Surface forcings
6) roms input file
7) coawst.bash
8) run it

7 1) create child grid Tools/mfiles/mtools/create_nested_grid.m
1) Enter the parent (coarse) grid file.
2) Enter the child (fine) grid file.
3) Set Istr, Jstr, Iend, Jend.
4) Set the scale (3 or 5).
5) Set create_child_grid=1; this calls parentchild_grid.m.
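For orientation, here is a minimal sketch of the kind of user settings edited near the top of create_nested_grid.m. The file names, index values, and exact variable names are hypothetical and may differ in your COAWST version.

  % Illustrative settings for create_nested_grid.m (hypothetical names/values)
  ncfile_coarse = 'USeast_grd.nc';     % parent (coarse) grid file
  ncfile_fine   = 'Carolinas_grd.nc';  % child (fine) grid file to be created
  Istr = 50;  Jstr = 40;               % lower-left parent psi point of the child region
  Iend = 120; Jend = 110;              % upper-right parent psi point of the child region
  scale = 5;                           % refinement ratio (3 or 5)
  create_child_grid     = 1;           % calls parentchild_grid.m
  merge_par_child_bathy = 0;           % leave the merge steps off on this first pass
  merge_par_child_mask  = 0;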

8 2) interp better bathy to child grid
The bathy in the grid you just created is interpolated from the parent. You need to get higher-resolution bathy from somewhere (see the discussion earlier today), and you need to do the smoothing also.
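As a rough illustration (not one of the COAWST tools), the interpolation step might look like the sketch below. The bathy source file, variable names, and dimension ordering are all assumptions, and smoothing still has to be applied before the depths are written back.

  % Interpolate a higher-resolution bathy source onto the child grid (sketch)
  lon_c = ncread('Carolinas_grd.nc','lon_rho');
  lat_c = ncread('Carolinas_grd.nc','lat_rho');
  lon_b = ncread('coastal_bathy.nc','lon');     % hypothetical bathy source file
  lat_b = ncread('coastal_bathy.nc','lat');
  z_b   = ncread('coastal_bathy.nc','z');       % assumed positive-down depths
  [LON_B,LAT_B] = ndgrid(lon_b,lat_b);          % assumes z is lon-by-lat
  h_c = griddata(double(LON_B),double(LAT_B),double(z_b),lon_c,lat_c);
  % ... smooth h_c here (control the slope factors) before writing ...
  ncwrite('Carolinas_grd.nc','h',h_c);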

9 3) Match parent - child bathy + mask
Tools/mfiles/mtools/create_nested_grid.m (again). You might want to make copies of the grids before you run this again, as this will modify the masking and bathy in both files. Run it again, but this time set:
merge_par_child_bathy=1 (this calls parentchild_bathy.m)
merge_par_child_mask=1 (this calls parentchild_mask.m)
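In other words, the second pass is the same script with the merge flags switched on (a sketch; the exact variable names may differ in your version):

  % Second pass through create_nested_grid.m (illustrative)
  create_child_grid     = 0;   % the child grid already exists
  merge_par_child_bathy = 1;   % calls parentchild_bathy.m
  merge_par_child_mask  = 1;   % calls parentchild_mask.m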

10 4) 3D init and climatology
COAWST/Tools/mfiles/roms_clm/roms_master_climatology_coawst_mw.m
5) Surface forcings
Can use the same surface forcing files as the parent (or make new ones).
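If you reuse the parent forcing, a quick check like the sketch below (with hypothetical file and variable names) confirms that the forcing grid covers the child domain:

  % Does the parent surface forcing cover the child grid? (sketch)
  lon_c = ncread('Carolinas_grd.nc','lon_rho');
  lat_c = ncread('Carolinas_grd.nc','lat_rho');
  lon_f = ncread('USeast_frc.nc','lon');        % hypothetical forcing file
  lat_f = ncread('USeast_frc.nc','lat');
  assert(min(lon_c(:)) >= min(lon_f(:)) && max(lon_c(:)) <= max(lon_f(:)));
  assert(min(lat_c(:)) >= min(lat_f(:)) && max(lat_c(:)) <= max(lat_f(:)));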

11 6) roms input file this happens to be for a 5 grid application
For the first grid, Lm is 2 points less than the total length. For all other grids, Lm is 7 points less than the total length. Same for Mm. N, Nbed, and the number of tracers are the same for all grids. The grids can be tiled differently, but each grid needs the same total number of tiles. The child time step needs to divide evenly into the parent time step.
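A quick way to sanity-check these rules before running (a sketch with made-up numbers, not part of the ROMS tools):

  % Check the refined-grid ocean.in rules (illustrative values)
  Ltot = [402 677];             % total grid lengths (parent, child), hypothetical
  Lm   = [Ltot(1)-2 Ltot(2)-7]; % first grid: -2; all other grids: -7 (same for Mm)
  dt   = [60 12];               % parent and child time steps (s), hypothetical
  assert(mod(dt(1), dt(2)) == 0)            % child dt must divide evenly into parent dt
  ntile = [5 4; 4 5];                       % NtileI, NtileJ per grid (rows); can differ
  assert(numel(unique(prod(ntile,2))) == 1) % but the totals must match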

12 6) roms input file

13 6) roms input file

14 6) roms input file

15 6) roms input file

16 6) roms input file

17 7) coawst.bash
Set NestedGrids to be the TOTAL number of grids.

18 8) run it Set np = number of procs, same for each grid

19

20 - How does the coupled modeling system work?
- Setting up a coupled application

21 Coupled Modeling System
Model Coupling Toolkit (MCT), Mathematics and Computer Science Division, Argonne National Laboratory. MCT is an open-source package that provides MPI-based communications between all nodes of a distributed-memory modeling component system. Download and compile it as libraries that are linked to.
[Diagram: Model A running on M nodes, Model B running on N nodes, Model C ...; MCT provides communications between all models (it also works here).]
Warner, J.C., Perlin, N., and Skyllingstad, E. (2008). Using the Model Coupling Toolkit to couple earth system models. Environmental Modelling & Software.

22 Libraries: MCT v2.60 or higher (distributed)
1) cd to the MCT dir
2) ./configure (this makes Makefile.conf; you can edit this file)
3) make
4) make install
5) Set environment vars:
setenv MCT_INCDIR COAWST/Lib/MCT/include
setenv MCT_LIBDIR COAWST/Lib/MCT/lib
(or wherever you installed them; see the last slide)

23 Compilers dir (side note)

24 Model organization
[Diagram: master.F calls mpi_init, reads the init_file (# procs per model), and then each model component (ROMS, SWAN) runs its own init, run, and finalize phases.]

25 init, run, and finalize
ROMS:
  init (roms_init): MPI_INIT, init_param, init_parallel, init_scalars, init_coupling, grid decomposition
  run (roms_run, sync. point): main3d ..... ocean_coupling ...
  finalize (roms_finalize): close_io, mpi_finalize
SWAN:
  init: SWINITMPI, SWINIT, SWREAD (grid), init_coupling
  run (SWMAIN, sync. point): swanmain ..... waves_coupling ...
  finalize: SWEXITMPI, close_io, mpi_finalize

26 Grid decomposition (during initialization)
[Diagram: SWAN and ROMS grid decompositions into tiles.] Each tile is on a separate processor. Each tile registers with MCT.

27 init_coupling
[Diagram: ROMS init_coupling and SWAN init_coupling, processed by each ROMS tile and each SWAN tile; numbered exchange steps 1, 2, 3 on each side.]

28 Synchronization (run phase)
[Diagram: ROMS ocean_coupling and SWAN waves_coupling exchange fields through MCT; processed by each ROMS tile and each SWAN tile.]

29 Let's look at the fields exchanged between models.

30 ATM - OCN interactions
There are two ways to hand the atmosphere fields to the ocean:
#define ATM2OCN_FLUXES: use the momentum + heat fluxes computed in WRF for both ROMS and WRF. Fields passed: Ustress, Vstress, Swrad, Lwrad, LH, HFX; stflx_temp = Swrad + Lwrad + LH + HFX.
#define BULK_FLUXES: use the WRF variables in the COARE algorithm. Fields passed: Uwind, Vwind, Swrad, Lwrad, RH, Tair, cloud, rain, evap; LH + HFX are computed in bulk_fluxes.
Salt flux: #define EMINUSP gives stflx_salt = evap - rain. #define ATM_PRESS also passes Patm.
[Diagram: ATM sends Uwind, Vwind, Patm, RH, Tair, cloud, rain, evap, Swrad, Lwrad, LH, HFX, Ustress, Vstress to OCN. Symbols: Integration and Application Network (ian.umces.edu/symbols), University of Maryland Center for Environmental Science.]

31 ATM interactions
[Diagram: the OCN sends SST and the WAV sends Hwave, Lpwave, and Tpsurf to the ATM; the surface momentum, heat, and moisture fluxes = f(Hwave, Lpwave, Tpsurf).]

32 How to create coupled application
1) Create all input, BC, init, forcing, etc. files for each model as if running separately. I recommend that you run each model separately first.
2) Modify the cppdefs in your header file.
3) SCRIP (if different grids)
4) coupling.in
5) coawst.bash
6) Run it as coawstM

33 1) Use each model separately
WRF: 27 vertical levels, dt = 36 s, 6 km grid. Physics: Lin microphysics, RRTM longwave, Dudhia shortwave, Mellor-Yamada-Janjic (MYJ) PBL, Kain-Fritsch (KF) cumulus scheme.
ROMS: 16 vertical levels, dt = 240 and 48 s, 5 km and 1 km grid(s). Physics: GLS turbulence closure, COARE bulk fluxes, BCs from HYCOM.
These models are on different grids.

34 2) south_car.h

35 3) SCRIP - grid interpolation
[Diagram: ocean grid (5 km) nested within the atm grid (6 km); GFS data, HFLX, and SST exchanges.]
The ocean model provides higher resolution and a coupled response of SST to the atmosphere. But the ocean grid is limited in spatial coverage, so the atmosphere model must combine data from different sources, which can create a discontinuity in the forcing. The atmosphere model provides heat flux to cover the entire ocean grid. SCRIP interpolation weights are needed to remap the data fields, using a flux-conservative remapping scheme.
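Conceptually, the weights file lets either model remap a field as a sparse weighted sum over source cells. The sketch below applies SCRIP-style weights in MATLAB; the weights file name is hypothetical, and the variable names (remap_matrix, src_address, dst_address, src_grid_dims) are assumed to follow the usual SCRIP output convention.

  % Apply SCRIP remapping weights to a source field (conceptual sketch)
  f   = 'ocn_to_atm_weights.nc';                  % hypothetical weights file
  w   = ncread(f,'remap_matrix');
  src = double(ncread(f,'src_address'));          % 1-based source cell per link
  dst = double(ncread(f,'dst_address'));          % 1-based destination cell per link
  if size(w,1) ~= numel(src), w = w.'; end        % align rows with links
  dims_src  = double(ncread(f,'src_grid_dims'));
  field_src = rand(prod(dims_src),1);             % stand-in for the flattened ocean field
  field_dst = accumarray(dst(:), w(:,1).*field_src(src(:)));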

36 Libraries SCRIP - v 1.6 (distributed)
Used when 2 or more models are not on the same grid.
1) cd to the COAWST/Lib/SCRIP/source dir
2) edit the makefile
3) make

37 COAWST\Tools\mfiles\mtools\scrip_wrf.m
Need to prepare SCRIP input files by first converting ROMS and WRF grids to a standard netcdf file type that SCRIP likes. COAWST\Tools\mfiles\mtools\scrip_wrf.m

38 COAWST\Tools\mfiles\mtools\scrip_roms.m
Need to prepare SCRIP input files by first converting ROMS and WRF grids to a standard netcdf file type that SCRIP likes. COAWST\Tools\mfiles\mtools\scrip_roms.m

39 run the program as ./scrip
SCRIP input file: scrip_in
- grid1_file and grid2_file were created with the matlab m-files on the last 2 slides.
- interp_file1 and interp_file2 will be the new SCRIP interpolation weights files.
- Need to use conservative and fracarea!!
Run the program as ./scrip

40 3) SCRIP Need to run SCRIP for each grid pair.
So if you have 1 WRF grid driving 2 ROMS grids, then you need 2 sets of weights.
[Diagram: WRF grid 1 covering ROMS grid 1 (5000 m) and the nested ROMS grid 2 (1000 m).]

41 4) coupling.in (this is a ROMS+WRF app)
- Set the # procs for each model (total = 56).
- Set the coupling interval; for now leave it the same for all models.
- Input file names: only 1 for WRF, 1 for ROMS, multiple for SWAN.
- The SCRIP weights files are listed here.
- Set which WRF grid to couple to.

42 4a) ocean.in
Set the # procs for the ocean model: coupling.in listed 20, so the tiling is 5 x 4 = 20. The ROMS dt of 240 s needs to divide evenly into the coupling interval of 1200 sec.

43 4b) namelist.input
The WRF dt of 30 s needs to divide evenly into the coupling interval of 1200 sec. Set the # procs for the atm model: coupling.in listed 36, so 6 x 6 = 36.
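These numbers have to stay consistent across coupling.in, ocean.in, and namelist.input; a quick sketch of the arithmetic from these slides (MATLAB, purely illustrative):

  % Consistency checks for the coupled run (numbers from these slides)
  TI = 1200;                      % coupling interval (s) set in coupling.in
  assert(mod(TI, 240) == 0)       % ROMS dt divides the coupling interval
  assert(mod(TI, 30) == 0)        % WRF dt divides the coupling interval
  assert(5*4 == 20)               % ocean.in tiling matches the 20 procs in coupling.in
  assert(6*6 == 36)               % namelist.input decomposition matches the 36 procs
  assert(20 + 36 == 56)           % total procs requested when running coawstM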

44 5) coawst.bash
Set the # of nested roms / swan grids, the app name, etc.

45 6) run it as coawstM use total number of procs from coupling.in
only 1 executable

46 Processor allocation stdout reports processor allocation
This looks like it is from a different run, but you get the idea.

47 Processor allocation
In stdout, lines starting with "Timing for ...." are from WRF, and lines containing " :59:00 " are from ROMS. Here is where the model coupling synchronization occurs, so we could probably re-allocate more nodes to WRF.

48 JOE_TC - test case examples
JOE_TC test cases are distributed applications for testing ROMS+WRF coupling:
JOE_TCw = WRF only
JOE_TCs = same grid, ROMS + WRF coupled
JOE_TCd = different grids for ROMS and WRF, needs SCRIP weights

