Grid refinement in ROMS

Two-way grid refinement. Example: SST on the USeast grid (5 km) and the nested Carolinas grid (1 km). Grid refinement of the ocean and wave models is required to allow increased resolution in coastal areas.

Parent grid (just look at rho points for now): interior indices run 1..Lm and 1..Mm; the full grid dimensions are L and M.

Child grid connectivity (for Nref=5). The child region is identified by the lower-left and upper-right parent psi points (create_nested_grid.m). Around the inner child region, the child rho grid extends 4 rho points to the left and bottom and 3 rho points to the right and top, so child rho indices run -3..L+2 and -3..M+2, with the parent-child interface surrounding the inner child region.
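A minimal sketch of the index arithmetic implied above. This is a hypothetical Python helper for illustration only (it is not part of the COAWST tools): given the parent psi-point corners of the child region and the refinement ratio Nref, the child has Nref child cells per parent cell in each direction.

```python
# Hypothetical sketch (not COAWST code): size of the child grid carved out
# of a parent grid for refinement ratio Nref.
def child_rho_size(Istr, Jstr, Iend, Jend, Nref):
    """Given the lower-left (Istr, Jstr) and upper-right (Iend, Jend)
    parent psi points bounding the child region, return the number of
    interior child cells in each direction for refinement ratio Nref."""
    return ((Iend - Istr) * Nref, (Jend - Jstr) * Nref)

# Example: a 10 x 8 cell parent region refined 5:1 yields 50 x 40 child cells.
nx, ny = child_rho_size(20, 30, 30, 38, 5)
```

The extra contact points (4 on the left/bottom, 3 on the right/top) are then added on top of these interior counts.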

Time stepping: the parent (grid 1) advances with step dtp; the child (grid 2) sub-steps with dtc.

main3d.F (dtp and dtc):

# ifdef REFINED_GRID
!  Get data from parent grid at time of parent.  Only need
!  to do this once per child loop.
      IF (ng.gt.1) THEN
        IF (get_refdata(ng).eq.TRUE.) THEN
          CALL get_2dparent_data (ng, TILE)
          CALL get_3dparent_data (ng, TILE)
        END IF
      END IF
!  Put the child data back into the parent grid.
      IF ((ng.lt.Ngrids).and.(iic(ng).gt.1)) THEN
        CALL step3dref_t (ng, TILE)
        CALL set_2dchild_data (ng, TILE)
        CALL set_depth (ng, TILE)
        CALL set_3dchild_data (ng, TILE)
      END IF
# endif
      ...
!  Interpolate the parent data to child time.
      CALL set_2dparent_data (ng, TILE)
      CALL set_3dparent_data (ng, TILE)
      ...
!  Main model time step computations.

initial.F:

!  Read in initial conditions from initial NetCDF file.
      CALL get_state (ng, iNLM, 1, INIname(ng), IniRec, Tindex)
      ...
#ifdef REFINED_GRID
!  Compute indices of child grid locations in the parent grid.
!  For 2-way, this needs to be done for all grids, not just ng>1.
      IF (ng.lt.NestedGrids) THEN
        CALL init_child_hindices (ng, TILE)
      END IF
      IF (ng.gt.1) THEN
        CALL init_parent_hindices (ng, TILE)
!  Obtain initial boundary conditions from the parent data.
        CALL get_2dparent_data (ng, TILE)
# ifdef SOLVE3D
        CALL get_3dparent_data (ng, TILE)
# endif
      END IF
#endif

(The slide's diagram shows the sequence over parent steps iic = 0, 1, 2: each parent step of dtp is followed by the child's substeps of dtc, after which the child solution is fed back to the parent.)
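The ordering above can be sketched schematically. This is a hypothetical Python illustration of the sequencing only, not the actual main3d.F logic: the parent advances one step of dtp, the child takes Nref substeps of dtc = dtp/Nref using parent data interpolated in time, and the child solution is then fed back into the parent (the "two-way" part).

```python
# Conceptual schedule of two-way refinement time stepping (illustration
# only): one parent step, Nref child substeps, then feedback to the parent.
def step_coupled(n_parent_steps, Nref):
    log = []
    for iic in range(1, n_parent_steps + 1):
        log.append(("parent_step", iic))          # advance parent by dtp
        for sub in range(1, Nref + 1):
            log.append(("child_step", iic, sub))  # advance child by dtc
        log.append(("feedback", iic))             # set_*child_data into parent
    return log

schedule = step_coupled(2, 5)
```

For Nref=5 the child does five substeps per parent step, which is why dtc must divide dtp evenly.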

How to prepare a refined grid application for ROMS:
1) Create the child grid
2) Interpolate better bathymetry to the child grid
3) Match parent-child bathymetry and mask
4) 3D initial conditions and climatology
5) Surface forcings
6) ROMS input file
7) coawst.bash
8) Run it

1) Create the child grid: Tools/mfiles/mtools/create_nested_grid.m
   1) enter the parent (coarse) grid file
   2) enter the child (fine) grid file
   3) set Istr, Jstr, Iend, Jend
   4) set the scale (3 or 5)
   5) set create_child_grid=1
This calls parentchild_grid.m.

2) Interpolate better bathymetry to the child grid. The bathymetry in the grid you just created is interpolated from the parent, so you need to get higher-resolution bathymetry from another source (see the discussion earlier today). You need to do the smoothing as well.

3) Match parent-child bathymetry and mask: Tools/mfiles/mtools/create_nested_grid.m (again). You might want to make copies of the grids before you run this again, as it will modify the masking and bathymetry in both files. Run it again, but this time set:
merge_par_child_bathy=1 (this calls parentchild_bathy.m)
merge_par_child_mask=1 (this calls parentchild_mask.m)

4) 3D initial conditions and climatology: COAWST/Tools/mfiles/roms_clm/roms_master_climatology_coawst_mw.m
5) Surface forcings: you can use the same surface forcing files as the parent (or make new ones).

6) ROMS input file (this example happens to be for a 5-grid application):
- For the first grid, Lm is 2 points less than the total length; for all other grids, Lm is 7 points less than the total length. The same holds for Mm.
- N, Nbed, and the number of tracers must be the same for all grids.
- Grids can be tiled differently, but each grid needs the same total number of processors.
- Each grid's time step must divide evenly into its parent's.
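A hedged sketch of the bookkeeping rules above (the helper names are illustrative, not from the ROMS source): Lm/Mm are the interior sizes written in ocean.in, and each child step must divide its parent's step evenly.

```python
# Illustrative checks for the roms input file rules (assumed helpers).
def interior_size(total, is_first_grid):
    # First grid: 2 fewer points than the total length;
    # refined grids: 7 fewer (extra contact points on each side).
    return total - 2 if is_first_grid else total - 7

def dt_ok(dt_parent, dt_child):
    # Child step must divide evenly into the parent step.
    return dt_parent % dt_child == 0

Lm_parent = interior_size(402, True)    # e.g. a 402-point parent grid
Lm_child  = interior_size(200, False)   # e.g. a 200-point child grid
```

For example, with the dt = 240 s and 48 s pair used later in this tutorial, dt_ok(240, 48) holds, while a 50 s child step would not.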


7) coawst.bash: set NestedGrids to the TOTAL number of grids.

8) Run it. Set np = number of procs; it is the same for each grid.

How does the coupled modeling system work? and Setting up a coupled application

Coupled Modeling System: the Model Coupling Toolkit (MCT), Mathematics and Computer Science Division, Argonne National Laboratory, http://www-unix.mcs.anl.gov/mct/. MCT is an open-source package that provides MPI-based communications between all nodes of a distributed-memory modeling component system. Download and compile it as libraries that are linked to. With model A running on M nodes, model B on N nodes, model C on others, MCT provides the communications between all models. Warner, J.C., Perlin, N., and Skyllingstad, E. (2008). Using the Model Coupling Toolkit to couple earth system models. Environmental Modelling & Software.
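The core idea can be illustrated with a toy sketch. This is Python for illustration only and does not use MCT's actual API (whose GlobalSegMap and Router types differ): each model declares which global grid points live on which of its ranks, and from two such maps a router works out the point-to-point exchanges between a model on M ranks and one on N ranks.

```python
# Toy illustration of the MCT idea (not the MCT API): segment maps and a
# router between two different decompositions of the same global index set.
def seg_map(npoints, nranks):
    """Assign global point indices 0..npoints-1 to ranks in contiguous blocks."""
    base, extra = divmod(npoints, nranks)
    out, start = {}, 0
    for r in range(nranks):
        n = base + (1 if r < extra else 0)
        out[r] = range(start, start + n)
        start += n
    return out

def router(map_a, map_b):
    """For each (rank_a, rank_b) pair, the shared global indices to exchange."""
    return {(ra, rb): sorted(set(pa) & set(pb))
            for ra, pa in map_a.items()
            for rb, pb in map_b.items()
            if set(pa) & set(pb)}

# 12 grid points: model A on 3 ranks, model B on 4 ranks.
plan = router(seg_map(12, 3), seg_map(12, 4))
```

Every nonempty entry in the plan corresponds to one MPI message in the real system.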

Libraries: MCT v2.60 or higher (distributed).
1) cd to the MCT dir
2) ./configure (this makes Makefile.conf; you can edit this file)
3) make
4) make install
5) set environment variables:
setenv MCT_INCDIR COAWST/Lib/MCT/include
setenv MCT_LIBDIR COAWST/Lib/MCT/lib
(or wherever you installed them; see the last slide)

Compilers dir (side note)

Model organization: master.F calls mpi_init and reads init_file (which sets the number of procs per model), then invokes the init, run, and finalize phases of ROMS and of SWAN.

init, run, and finalize:

ROMS:
  roms_init: init_param, init_parallel, init_scalars, init_coupling (MPI_INIT)
  roms_run:  main3d ..... waves_coupling ...
  roms_finalize: close_io, mpi_finalize

SWAN:
  init (grid decomp): SWINIT, SWREAD (grid), init_coupling (SWINITMPI)
  run (sync. point): SWMAIN, swanmain ..... ocean_coupling ...
  finalize: SWEXITMPI, close_io, mpi_finalize
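The three-phase structure above can be mimicked with a toy driver. This is a hypothetical Python sketch, not the COAWST driver: each model exposes init / run / finalize, and during run the models meet at synchronization points where fields are exchanged (ocean_coupling / waves_coupling in the real code).

```python
# Toy driver mirroring the init / run / finalize call sequence (sketch only).
class ToyModel:
    def __init__(self, name):
        self.name, self.log = name, []
    def init(self):
        self.log.append("init")          # read grid, register tiles
    def run(self, n_intervals):
        # One coupling exchange per interval, after stepping.
        self.log.extend(["step", "couple"] * n_intervals)
    def finalize(self):
        self.log.append("finalize")      # close files, shut down MPI

ocean, waves = ToyModel("ROMS"), ToyModel("SWAN")
for m in (ocean, waves): m.init()
for m in (ocean, waves): m.run(3)
for m in (ocean, waves): m.finalize()
```

In the real system both models block at the couple step until the partner has posted its fields, which is what keeps them synchronized.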

Grid decomposition (during initialization), SWAN and ROMS: each tile is on a separate processor, and each tile registers with MCT.

init_coupling: the ROMS init_coupling is processed by each ROMS tile; the SWAN init_coupling is processed by each SWAN tile.

Synchronization (run phase): ROMS ocean_coupling (processed by each ROMS tile) and SWAN waves_coupling (processed by each SWAN tile) exchange fields through MCT.

Let's look at the fields exchanged between models.

ATM - OCN interactions. Two options:

#define ATM2OCN_FLUXES: use the momentum + heat fluxes computed in WRF for both ROMS and WRF. ATM sends Ustress, Vstress, Swrad, Lwrad, LH, and HFX to OCN, and stflx_temp = Swrad + Lwrad + LH + HFX.

or #define BULK_FLUXES: use WRF variables in the COARE algorithm. ATM sends Uwind, Vwind, Swrad, Lwrad, RH, Tair, cloud, rain, and evap, and LH + HFX are computed in bulk_fluxes.

Salt flux (#define EMINUSP): stflx_salt = evap - rain. #define ATM_PRESS adds Patm.

OCN sends SST back to ATM. (Symbols: Integration and Application Network, ian.umces.edu/symbols, University of Maryland Center for Environmental Science.)
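The flux bookkeeping above is simple arithmetic; here is a sketch (signs and magnitudes below are illustrative assumptions, not values from the slide):

```python
# Sketch of the surface flux combinations named above.
def surface_fluxes(Swrad, Lwrad, LH, HFX, evap, rain):
    stflx_temp = Swrad + Lwrad + LH + HFX   # net surface heat flux
    stflx_salt = evap - rain                # E - P freshwater (salt) flux
    return stflx_temp, stflx_salt

# Hypothetical values: net shortwave in, longwave/latent/sensible out.
heat, salt = surface_fluxes(200.0, -60.0, -120.0, -15.0, 1.2e-7, 3.0e-7)
```

With ATM2OCN_FLUXES these terms come straight from WRF; with BULK_FLUXES, LH and HFX are recomputed on the ROMS side by the COARE algorithm.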

ATM interactions: the WAV model sends Hwave, Lpwave, and Tpsurf, and the OCN model sends SST to the ATM. The surface fluxes of momentum, heat, and moisture are f(Hwave, Lpwave, Tpsurf).

How to create a coupled application:
1) Create all input, BC, init, forcing, etc. files for each model as if running separately. I recommend that you run each model separately first.
2) Modify the cppdefs in your header file.
3) SCRIP (if the grids differ)
4) coupling.in
5) coawst.bash
6) Run it as coawstM

1) Use each model separately. These models are on different grids.
WRF: 6 km grid, 27 vertical levels, dt = 36 s. Physics: Lin microphysics; RRTM longwave and Dudhia shortwave radiation; Mellor-Yamada-Janjic (MYJ) PBL; Kain-Fritsch (KF) cumulus scheme.
ROMS: 5 km and 1 km grid(s), 16 vertical levels, dt = 240 s and 48 s. Physics: GLS turbulence closure; COARE bulk fluxes; BCs from HYCOM.

2) south_car.h

3) SCRIP grid interpolation: http://climate.lanl.gov/Software/SCRIP/
Example: ocean grid at 5 km, atmosphere grid at 6 km, plus GFS data. The ocean model provides higher resolution and the coupled response of SST to the atmosphere (HFLX, SST exchange), but the ocean grid is limited in spatial coverage, so the atmosphere model must combine data from different sources, which can create a discontinuity in the forcing. The atmosphere model provides heat flux covering the entire ocean grid. SCRIP interpolation weights are needed to remap data fields, using a flux-conservative remapping scheme.
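What the SCRIP weight files provide is, in effect, a sparse matrix: the remapped field is dst[i] = sum of w[k] * src[col[k]] over all weights k with row[k] = i. This is an illustrative Python sketch of applying such weights, not the SCRIP file format or API:

```python
# Illustrative remapping as a sparse matrix-vector product (sketch only).
def apply_weights(src, rows, cols, w, ndst):
    """Apply remap weights: dst[rows[k]] += w[k] * src[cols[k]]."""
    dst = [0.0] * ndst
    for r, c, wk in zip(rows, cols, w):
        dst[r] += wk * src[c]
    return dst

# Toy example: two destination cells, each averaging two source cells.
dst = apply_weights([10.0, 20.0, 30.0, 40.0],
                    rows=[0, 0, 1, 1], cols=[0, 1, 2, 3],
                    w=[0.5, 0.5, 0.5, 0.5], ndst=2)
```

Conservative weights are built so that the area-integrated flux is preserved across the remap, which is why the conservative/fracarea options matter below.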

Libraries: SCRIP v1.6 (distributed). Used when 2 or more models are not on the same grid.
1) cd to the COAWST/Lib/SCRIP/source dir
2) edit the makefile
3) make

Need to prepare the SCRIP input files by first converting the WRF grid to a standard netCDF file type that SCRIP likes: COAWST\Tools\mfiles\mtools\scrip_wrf.m

Likewise for the ROMS grid: COAWST\Tools\mfiles\mtools\scrip_roms.m

SCRIP input file: scrip_in. grid1_file and grid2_file were created with the MATLAB m-files on the last 2 slides; interp_file1 and interp_file2 will be the new SCRIP interpolation weights files. You need to use conservative remapping and fracarea!! Run the program as ./scrip

3) SCRIP: you need to run SCRIP for each grid pair. So if you have 1 WRF grid driving 2 ROMS grids (e.g., a 5000 m and a 1000 m grid), you need 2 sets of weights.

4) coupling.in (this is a ROMS+WRF application):
- Set the # of procs for each model (total = 56).
- Set the coupling interval; for now leave it the same for all models.
- Input file names: only 1 for WRF, 1 for ROMS, multiple for SWAN.
- The SCRIP weights files are listed here.
- Set which WRF grid to couple to.

4a) ocean.in: set the # of procs for the ocean model; we listed 20 in coupling.in, and 5 x 4 = 20. The dt of 240 s is needed so it divides evenly into the coupling interval of 1200 s.

4b) namelist.input: set the # of procs for the atm model; we listed 36 in coupling.in, and 6 x 6 = 36. The dt of 30 s is needed so it divides evenly into the coupling interval of 1200 s.
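The consistency conditions in the last two slides can be checked mechanically. A hedged sketch (the helper name and argument layout are illustrative): every model's time step must divide the coupling interval evenly, and each input file's tile decomposition must multiply out to the processor count declared in coupling.in.

```python
# Illustrative consistency check for coupling.in vs. ocean.in/namelist.input.
def check_coupling(coupling_interval, dts, tiles, procs):
    """dts: model time steps (s); tiles: (NtileI, NtileJ) per model;
    procs: processor counts declared in coupling.in."""
    ok_dt   = all(coupling_interval % dt == 0 for dt in dts)
    ok_proc = all(ni * nj == n for (ni, nj), n in zip(tiles, procs))
    return ok_dt and ok_proc

# ROMS: dt=240 s on 5x4=20 procs; WRF: dt=30 s on 6x6=36 procs; 1200 s interval.
ok = check_coupling(1200, [240, 30], [(5, 4), (6, 6)], [20, 36])
```

A dt that does not divide the interval (say 70 s) would make the models drift out of step at the synchronization point.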

5) coawst.bash: set the # of nested ROMS/SWAN grids, the application name, etc.

6) Run it as coawstM, using the total number of procs from coupling.in. There is only 1 executable.

Processor allocation: stdout reports the processor allocation. (This output looks like it is from a different run, but you get the idea.)

Processor allocation: lines starting "Timing for ..." come from WRF; lines like "1 179 52974 02:59:00" come from ROMS. Here is where the model coupling synchronization occurs, so we could probably re-allocate more nodes to WRF.

JOE_TC test case examples. The JOE_TC test cases are distributed applications for testing ROMS+WRF coupling:
JOE_TCw = WRF only
JOE_TCs = same grid, ROMS + WRF coupled
JOE_TCd = different grids for ROMS and WRF (needs SCRIP weights)