A Community Terrain-Following Ocean Modeling System

Presentation transcript:

A Community Terrain-Following Ocean Modeling System
Hernan G. Arango, Rutgers University
Tal Ezer, Princeton University
Alexander F. Shchepetkin, UCLA

COLLABORATORS
Bennett et al. (FNMOC; OSU)
Chassignet / Iskandarani et al. (RSMAS)
Cornuelle / Miller (SIO)
Geyer (WHOI)
Hetland (TAMU)
Lermusiaux (Harvard)
Mellor (Princeton)
Moore (U. Colorado)
Signell (SACLANT; USGS)

OTHER COLLABORATORS
Chao / Song (JPL)
Preller / Martin (NRL)
Naval Operational Community
POM Ocean Modeling Community
ROMS / SCRUM Ocean Modeling Community

OBJECTIVES
To design, develop, and test an expert ocean modeling system for scientific and operational applications
To support advanced data assimilation strategies
To provide a platform for coupling with operational atmospheric models (such as COAMPS)
To support massively parallel computations
To provide a common set of options for all coastal developers, with the goal of defining an optimum coastal/relocatable model for the Navy

APPROACH
Use state-of-the-art advances in numerical techniques, subgrid-scale parameterizations, data assimilation, nesting, computational performance, and parallelization
Adopt a modular design, with ROMS as the prototype
Test and evaluate the computational kernel and the various algorithms and parameterizations
Build a suite of test cases and application databases
Provide web-based support to the user community and a link to the primary developers

TOMS KERNEL ATTRIBUTES
Free-surface, hydrostatic, primitive-equation model
Generalized terrain-following vertical coordinates
Boundary-fitted, orthogonal, curvilinear horizontal coordinates on an Arakawa C-grid
Non-homogeneous time-stepping algorithm
Accurate discretization of the baroclinic pressure-gradient term
High-order advection schemes
Continuous, monotonic reconstruction of vertical gradients to maintain high-order accuracy
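The terrain-following vertical coordinate can be illustrated with a small sketch. This is a minimal Python example assuming the Song and Haidvogel (1994) s-coordinate stretching commonly associated with ROMS; the function and parameter names (hc, theta_s) are illustrative, not TOMS code, and the exact transform in TOMS may differ.

```python
import numpy as np

def sigma_depths(h, N=10, hc=5.0, theta_s=3.0):
    """Depths (m, negative down) of N terrain-following mid-levels over a
    water column of depth h, using a Song-and-Haidvogel-style stretching
    (theta_b = 0 case).  All parameter values here are illustrative."""
    s = (np.arange(1, N + 1) - 0.5) / N - 1.0    # s in (-1, 0), mid-levels
    C = np.sinh(theta_s * s) / np.sinh(theta_s)  # stretching curve
    return hc * s + (h - hc) * C                 # z -> 0 at surface, -h at bottom

# Levels concentrate near the surface for theta_s > 0.
z = sigma_depths(h=100.0)
```

Because the coordinate follows the bathymetry, the same N levels resolve both a 10 m shelf and a 1000 m slope, which is the main attraction of this coordinate class for coastal work.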

[Figure: dispersive properties of advection: effective wavenumber K(k)*dx versus k*dx for parabolic splines and finite centered differences.]
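The dispersion comparison can be reproduced numerically. A minimal Python sketch, assuming the standard modified-wavenumber expressions: sin(xi) for second-order centered differences, and 3*sin(xi)/(2 + cos(xi)) for a parabolic-spline (compact-scheme) reconstruction; exact differentiation gives K*dx = xi.

```python
import numpy as np

# xi = k*dx over the resolvable range (0, pi).
xi = np.linspace(0.0, np.pi, 200)
K_centered = np.sin(xi)                        # 2nd-order centered differences
K_spline = 3.0 * np.sin(xi) / (2.0 + np.cos(xi))  # parabolic-spline result

# Over well-resolved scales (xi < pi/2) the spline curve hugs the exact
# line K*dx = xi far more closely than centered differences do.
sel = xi < np.pi / 2
err_centered = np.max(np.abs(K_centered[sel] - xi[sel]))
err_spline = np.max(np.abs(K_spline[sel] - xi[sel]))
```

Both schemes still collapse to K = 0 at the grid-scale wave xi = pi, so two-grid-point noise is never advected correctly by either.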

TOMS SUBGRID-SCALE PARAMETERIZATION
Horizontal mixing of tracers along level, geopotential, or isopycnic surfaces
Transverse, isotropic stress tensor for momentum
Local, Mellor-Yamada level 2.5 closure scheme
Non-local, K-profile surface and bottom closure scheme

TOMS BOUNDARY LAYERS
Air-sea interaction boundary layer from COARE (Fairall et al., 1996)
Oceanic surface boundary layer (KPP; Large et al., 1994)
Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)

[Figure: boundary-layer schematic: 1. ABL, 2. SBL, 3. BBL, 4. WCBL, with shortwave and longwave surface forcing.]

TOMS BOUNDARY LAYERS
Air-sea interaction boundary layer from COARE (Fairall et al., 1996)
Oceanic surface boundary layer (KPP; Large et al., 1994)
Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)
Wave / current / sediment bed boundary layer (Styles and Glenn, 2000)
Sediment transport
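The air-sea fluxes driving the surface boundary layer come from bulk formulas. The actual COARE algorithm (Fairall et al., 1996) iterates stability-dependent transfer coefficients; the constant-coefficient Python sketch below is only an illustration of the bulk-formula structure, and every numerical constant is an assumption.

```python
def bulk_fluxes(U10, Ts, Ta, rho_air=1.22, cp=1004.0, Cd=1.3e-3, Ch=1.1e-3):
    """Constant-coefficient bulk estimates of wind stress and sensible heat
    flux.  U10: 10-m wind speed (m/s); Ts, Ta: sea / air temperature (deg C).
    Cd, Ch are illustrative neutral transfer coefficients."""
    tau = rho_air * Cd * U10**2                 # wind stress (N/m^2)
    Qh = rho_air * cp * Ch * U10 * (Ts - Ta)    # sensible heat flux (W/m^2)
    return tau, Qh

# A 10 m/s wind over water 2 degrees warmer than the air:
tau, Qh = bulk_fluxes(U10=10.0, Ts=20.0, Ta=18.0)
```

The real algorithm replaces Cd and Ch with functions of Monin-Obukhov stability and sea state, which matters most at low winds and large air-sea temperature differences.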

TOMS MODULES Lagrangian Drifters (Klinck, Hadfield)

Surface and Bottom Floats
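The float trajectories shown here come from integrating particle positions through the model velocity field. A minimal Python sketch with forward Euler in a steady analytic flow; the model's drifter module uses the time-evolving gridded velocities and a higher-order scheme, so this is structure only.

```python
import numpy as np

def advect_drifters(x, y, u, v, dt, nsteps):
    """Advect Lagrangian drifters with forward Euler through a steady
    velocity field given by the callables u(x, y) and v(x, y)."""
    x = np.asarray(x, dtype=float).copy()
    y = np.asarray(y, dtype=float).copy()
    for _ in range(nsteps):
        x += dt * u(x, y)
        y += dt * v(x, y)
    return x, y

# Solid-body rotation: a drifter started at (1, 0) should circle the origin.
u = lambda x, y: -y
v = lambda x, y: x
xf, yf = advect_drifters([1.0], [0.0], u, v, dt=1e-3, nsteps=1000)
```

With a small enough step the drifter stays essentially on the unit circle; in practice a Runge-Kutta or Milne-type integrator is used so the step can be the model time step.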

TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)

Gulf of Maine M2 Tides Surface Elevation (m)
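Tidal forcing of this kind is imposed at the open boundaries as a sum of harmonic constituents, zeta(t) = sum_i A_i*cos(omega_i*t - phi_i), with amplitudes A_i and phases phi_i taken from a tidal database. A one-constituent Python sketch; the 1-m amplitude is illustrative, while the M2 period of about 12.42 hours is the real constituent period.

```python
import math

M2_PERIOD_S = 12.4206012 * 3600.0   # M2 tidal period in seconds

def tidal_elevation(t, amp=1.0, phase=0.0, period=M2_PERIOD_S):
    """Harmonic tidal surface elevation (m) for a single constituent."""
    omega = 2.0 * math.pi / period
    return amp * math.cos(omega * t - phase)

zeta0 = tidal_elevation(0.0)                    # high water at t = 0
zeta_half = tidal_elevation(M2_PERIOD_S / 2.0)  # low water half a period later
```

A realistic boundary condition sums several constituents (M2, S2, N2, K1, O1, ...) per boundary point and usually forces the barotropic normal velocity as well as the elevation.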

TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)
River Runoff (Hetland, Signell, Geyer)

Hudson River Estuary [Figure: salinity (PSS) section versus depth (m, 0 to -25) and along-estuary distance (km, 5 to 25).]

TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)
River Runoff (Hetland, Signell, Geyer)
Biology:
  Fasham-type Model (Moisan, Shchepetkin)
  EcoSim Bio-Optical Model (Bissett)
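Fasham-type biology couples a handful of tracer equations with nonlinear source/sink terms. The Python sketch below is a zero-dimensional NPZD (nutrient, phytoplankton, zooplankton, detritus) box in the spirit of such formulations; all rate constants are illustrative assumptions, not the model's calibrated values.

```python
def npzd_step(N, P, Z, D, dt=0.1, Vm=1.0, kN=0.5, g=0.4, m=0.1, r=0.05):
    """One forward-Euler step of a toy NPZD box model.  The source/sink
    terms cancel in the sum, so total N + P + Z + D is conserved."""
    uptake = Vm * N / (kN + N) * P    # Michaelis-Menten nutrient uptake
    grazing = g * P * Z               # zooplankton grazing on phytoplankton
    mortality = m * (P + Z)           # linear mortality to detritus
    remin = r * D                     # remineralization back to nutrient
    N += dt * (remin - uptake)
    P += dt * (uptake - grazing - m * P)
    Z += dt * (grazing - m * Z)
    D += dt * (mortality - remin)
    return N, P, Z, D

state = (2.0, 0.5, 0.2, 0.1)          # initial N, P, Z, D concentrations
for _ in range(100):
    state = npzd_step(*state)
```

In the 3-D model each of these variables is a full advected tracer, and conservation of the summed pool is a standard sanity check on the biological source terms.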

TOMS TESTING
Systematic evaluation of numerical algorithms via robust test problems
Data/model comparisons
Study of the optimal combination of algorithmic options for various coastal applications
Documentation of testing procedures

TOMS CODE DESIGN
Modular, efficient, and portable Fortran code (F77+, F90)
Option management via the C preprocessor
Multiple levels of nesting
Lateral boundary condition options: closed, periodic, and radiation
Arbitrary number of tracers (active and passive)
NetCDF input and output data structures
Support for parallel execution on both shared- and distributed-memory architectures

mod_ocean.F:

#include "cppdefs.h"
      module mod_ocean
        use mod_kinds
        implicit none
CSMS$DISTRIBUTE(dh, 1, 2) begin
        type T_OCEAN
          real(r8), pointer :: rubar(:,:,:)
          real(r8), pointer :: rvbar(:,:,:)
          real(r8), pointer :: rzeta(:,:,:)
          real(r8), pointer :: ubar(:,:,:)
          real(r8), pointer :: vbar(:,:,:)
          real(r8), pointer :: zeta(:,:,:)
#ifdef SOLVE3D
          real(r8), pointer :: pden(:,:,:)
          real(r8), pointer :: rho(:,:,:)
          real(r8), pointer :: ru(:,:,:,:)
          real(r8), pointer :: rv(:,:,:,:)
          real(r8), pointer :: t(:,:,:,:,:)
          real(r8), pointer :: u(:,:,:,:)
          real(r8), pointer :: v(:,:,:,:)
          real(r8), pointer :: W(:,:,:)
#endif
        end type T_OCEAN
CSMS$DISTRIBUTE end
        type (T_OCEAN), allocatable :: ALL_OCEAN(:)
      end module mod_ocean

mod_ocean.F (continued):

      subroutine allocate_mod_ocean
!
!=======================================================================
!  Copyright (c) 2001 TOMS Group                       Hernan G. Arango
!=======================================================================
!
!  Allocate and initialize all variables in module mod_ocean for all
!  nested grids.
!
      use mod_kinds
      use mod_param
      use mod_ocean
      implicit none
      integer :: ng
      real(r8), parameter :: IniVal=0.0_r8

      allocate (ALL_OCEAN(NestLevels))
      do ng=1,NestLevels
CSMS$SET_NEST_LEVEL (dh,ng)
        allocate (ALL_OCEAN(ng) % rubar(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rubar=IniVal
        allocate (ALL_OCEAN(ng) % rvbar(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rvbar=IniVal
        allocate (ALL_OCEAN(ng) % rzeta(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rzeta=IniVal
        allocate (ALL_OCEAN(ng) % ubar(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % ubar=IniVal
        allocate (ALL_OCEAN(ng) % vbar(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % vbar=IniVal
        allocate (ALL_OCEAN(ng) % zeta(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % zeta=IniVal

mod_ocean.F (continued):

#ifdef SOLVE3D
        allocate (ALL_OCEAN(ng) % pden(GLOBAL_2D_ARRAY,N))
        ALL_OCEAN(ng) % pden=IniVal
        allocate (ALL_OCEAN(ng) % rho(GLOBAL_2D_ARRAY,N))
        ALL_OCEAN(ng) % rho=IniVal
        allocate (ALL_OCEAN(ng) % ru(GLOBAL_2D_ARRAY,0:N,2))
        ALL_OCEAN(ng) % ru=IniVal
        allocate (ALL_OCEAN(ng) % rv(GLOBAL_2D_ARRAY,0:N,2))
        ALL_OCEAN(ng) % rv=IniVal
        allocate (ALL_OCEAN(ng) % t(GLOBAL_2D_ARRAY,N,3,NT))
        ALL_OCEAN(ng) % t=IniVal
        allocate (ALL_OCEAN(ng) % u(GLOBAL_2D_ARRAY,N,2))
        ALL_OCEAN(ng) % u=IniVal
        allocate (ALL_OCEAN(ng) % v(GLOBAL_2D_ARRAY,N,2))
        ALL_OCEAN(ng) % v=IniVal
        allocate (ALL_OCEAN(ng) % W(GLOBAL_2D_ARRAY,0:N))
        ALL_OCEAN(ng) % W=IniVal
#endif
      enddo
      return
      end subroutine allocate_mod_ocean

CPP Definitions

#ifdef EW_PERIODIC
# ifdef NS_PERIODIC
#  define GLOBAL_2D_ARRAY -2:Lm(ng)+2+padd_X,-2:Mm(ng)+2+padd_E
#  define START_2D_ARRAY -2,-2
# else
#  define GLOBAL_2D_ARRAY -2:Lm(ng)+2+padd_X,0:Mm(ng)+1+padd_E
#  define START_2D_ARRAY -2,0
# endif
#else
# ifdef NS_PERIODIC
#  define GLOBAL_2D_ARRAY 0:Lm(ng)+1+padd_X,-2:Mm(ng)+2+padd_E
#  define START_2D_ARRAY 0,-2
# else
#  define GLOBAL_2D_ARRAY 0:Lm(ng)+1+padd_X,0:Mm(ng)+1+padd_E
#  define START_2D_ARRAY 0,0
# endif
#endif

#define PRIVATE_1D_SCRATCH_ARRAY Istr-3:Iend+3
#define PRIVATE_2D_SCRATCH_ARRAY Istr-3:Iend+3,Jstr-3:Jend+3

omega.F:

#include "cppdefs.h"
#ifdef SOLVE3D
      subroutine omega (ng,tile)
!
!=======================================================================
!  Copyright (c) 2001 TOMS Group            Alexander F. Shchepetkin
!                                           Hernan G. Arango
!=======================================================================
!
!  This routine computes S-coordinate vertical velocity (m^3/s),
!
!      W = [Hz/(m*n)]*omega,
!
!  diagnostically at horizontal RHO-points and vertical W-points.
!
!=======================================================================
!
      use mod_kinds
      use mod_param
      use mod_grid
      use mod_ocean
      implicit none
      integer, intent(in) :: ng, tile
      integer :: Iend, Istr, Jend, Jstr, trd
      integer :: my_threadnum

      trd=my_threadnum()
      call get_tile (ng,tile,Istr,Iend,Jstr,Jend)
      call omega_tile (ng,Istr,Iend,Jstr,Jend,                          &
                       ALL_OCEAN(ng) % W,                               &
                       ALL_GRID (ng) % Huon,                            &
                       ALL_GRID (ng) % Hvom,                            &
                       ALL_GRID (ng) % z_w)
      return
      end

omega.F (continued):

      subroutine omega_tile (ng,Istr,Iend,Jstr,Jend,W,Huon,Hvom,z_w)
      use mod_kinds
      use mod_param
      use mod_scalars
      use mod_sources
      implicit none
      integer, intent(in) :: Iend, Istr, Jend, Jstr, ng
CSMS$DISTRIBUTE(dh, 1, 2) begin
      real(r8), intent(inout) :: W(GLOBAL_2D_ARRAY,0:N)
CSMS$DISTRIBUTE end
csms$distribute(dh, 1, 2) begin
      real(r8), intent(in) :: Huon(GLOBAL_2D_ARRAY,N),                  &
                              Hvom(GLOBAL_2D_ARRAY,N),                  &
                              z_w (GLOBAL_2D_ARRAY,0:N)
csms$distribute end
      integer :: i, j, k
      real(r8) :: wrk(PRIVATE_1D_SCRATCH_ARRAY)

#include "set_bounds.h"

omega.F (continued):

!
!  Vertically integrate horizontal mass flux divergence.
!
csms$check_halo(Huon<0,1><0,0>, Hvom<0,0><0,1>, 'omega')
      do j=Jstr,Jend
        do i=Istr,Iend
          W(i,j,0)=0.0_r8
        enddo
        do k=1,N
          do i=Istr,Iend
            W(i,j,k)=W(i,j,k-1)-                                        &
                     (Huon(i+1,j,k)-Huon(i,j,k)+                        &
                      Hvom(i,j+1,k)-Hvom(i,j,k))
          enddo
        enddo
        do i=Istr,Iend
          wrk(i)=W(i,j,N)/(z_w(i,j,N)-z_w(i,j,0))
        enddo
        do k=Nm,1,-1
          do i=Istr,Iend
            W(i,j,k)=W(i,j,k)-wrk(i)*(z_w(i,j,k)-z_w(i,j,0))
          enddo
        enddo
        do i=Istr,Iend
          W(i,j,N)=0.0_r8
        enddo
      enddo
!
!  Set lateral boundary conditions.
!
      call w3dbc_tile (ng,Istr,Iend,Jstr,Jend,W)
CSMS$EXCHANGE(W<2,1><2,1>)
csms$compare_var(W, 'omega')
      return
      end
#else
      subroutine omega
      return
      end
#endif /* SOLVE3D */

TOMS PARALLEL DESIGN Coarse-grained parallelization

[Figure: parallel tile partitions, an Nx by Ny decomposition of the horizontal grid; the example shows 8 x 8 tiles.]

TOMS PARALLEL DESIGN
Coarse-grained parallelization
Shared-memory, compiler-dependent directives (OpenMP standard)
Distributed-memory (MPI; SMS)
Optimized for cache-bound computers
ZIG-ZAG cycling sequence of tile partitions
Few synchronization points (around 6)
Serial and parallel I/O (via NetCDF)
Efficient from 4 to 64 threads
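The coarse-grained decomposition splits the horizontal grid into Nx by Ny tiles, each processed by a thread or MPI task. A minimal Python sketch of the tile bookkeeping; the function name and the 1-based inclusive bounds mimic a routine like get_tile, but the details are illustrative.

```python
def tile_bounds(L, M, NtileI, NtileJ):
    """Partition an L x M horizontal grid into NtileI x NtileJ tiles.
    Returns a list of (Istr, Iend, Jstr, Jend) per tile, 1-based and
    inclusive; interior halo points are exchanged between tiles."""
    ChunkI = -(-L // NtileI)   # ceiling division: points per tile in I
    ChunkJ = -(-M // NtileJ)   # ceiling division: points per tile in J
    bounds = []
    for tj in range(NtileJ):
        for ti in range(NtileI):
            Istr = 1 + ti * ChunkI
            Iend = min(L, (ti + 1) * ChunkI)
            Jstr = 1 + tj * ChunkJ
            Jend = min(M, (tj + 1) * ChunkJ)
            bounds.append((Istr, Iend, Jstr, Jend))
    return bounds

# The 8 x 8 example from the figure on a 128 x 128 grid:
tiles = tile_bounds(L=128, M=128, NtileI=8, NtileJ=8)
```

Because each tile's working set is small, the same decomposition that feeds OpenMP threads or MPI tasks also keeps the inner loops cache-resident, which is why the design targets cache-bound computers.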

TOMS DATA ASSIMILATION
Nudging
Optimal Interpolation (OI)
Tangent linear and adjoint algorithms
4D VARiational data assimilation (4DVAR) and Physical-space Statistical Analysis System (PSAS) algorithms
Inverse Ocean Modeling System (IOMS)
Ensemble prediction platform based on singular value decomposition
Error Subspace Statistical Estimation (ESSE)
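Nudging, the simplest scheme on the list, relaxes the model state toward observations with a prescribed e-folding timescale. A minimal Python sketch; the function and variable names are illustrative.

```python
import numpy as np

def nudge(x, obs, mask, dt, tau):
    """One step of Newtonian relaxation: pull the state x toward obs with
    timescale tau wherever mask is 1 (an observation exists)."""
    return x + dt * mask * (obs - x) / tau

x = np.array([10.0, 10.0])           # model state at two points
obs = np.array([12.0, 0.0])          # observed value / placeholder
mask = np.array([1.0, 0.0])          # only the first point is observed
for _ in range(1000):
    x = nudge(x, obs, mask, dt=1.0, tau=100.0)
```

After many timescales the observed point converges to the observation while the unobserved point is untouched; the more advanced schemes (OI, 4DVAR, PSAS) replace this pointwise pull with statistically weighted corrections spread by error covariances.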

[Figure: ESSE flow diagram. Field and error-subspace initialization; scalable parallel ensemble forecast; measurement model, data residuals, and measurement error covariance; minimum-error-variance update within the error subspace (sequential processing of observations); adaptive error-subspace learning with a convergence criterion; and ESSE smoothing via statistical approximation.]
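The core idea in ESSE is to represent forecast error in a low-dimensional subspace spanned by the dominant singular vectors of the ensemble perturbations. The Python sketch below shows only that kernel step, with a random ensemble and illustrative names (E for the basis, P for the reduced covariance); the full scheme in the diagram adds the measurement update and adaptive learning.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, rank = 50, 20, 5

Y = rng.standard_normal((n_state, n_ens))    # ensemble of forecast states
A = Y - Y.mean(axis=1, keepdims=True)        # perturbations about the mean

# Dominant error subspace via SVD of the perturbation matrix.
U, s, _ = np.linalg.svd(A, full_matrices=False)
E = U[:, :rank]                              # orthonormal error-subspace basis
P = np.diag(s[:rank]**2 / (n_ens - 1))       # reduced error covariance
```

The payoff is that the minimum-error-variance analysis is then computed in the rank-dimensional subspace rather than the full state space, which is what makes the scalable parallel ensemble affordable.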

RESULTS (YEAR 1)
Built TOMS from the ROMS prototype
Mellor-Yamada level 2.5 closure
Passive and active open boundary conditions
Tidal forcing
River runoff
Lagrangian drifters
Data assimilation
Inter-comparison between POM and ROMS
Evaluation of time-stepping, advection, and pressure-gradient algorithms
Initial development of the TOMS web site