Presentation on theme: "A Community Terrain-Following Ocean Modeling System"— Presentation transcript:

1 A Community Terrain-Following Ocean Modeling System
Hernan G. Arango, Rutgers University
Tal Ezer, Princeton University
Alexander F. Shchepetkin, UCLA

2 COLLABORATORS
Bennett et al. (FNMOC; OSU)
Chassignet / Iskandarani et al. (RSMAS)
Cornuelle / Miller (SIO)
Geyer (WHOI)
Hetland (TAMU)
Lermusiaux (Harvard)
Mellor (Princeton)
Moore (U. Colorado)
Shchepetkin (UCLA)
Signell (SACLANT; USGS)

3 OTHER COLLABORATORS
Chao / Song (JPL)
Preller / Martin (NRL)
Naval Operational Community
POM Ocean Modeling Community
ROMS / SCRUM Ocean Modeling Community

4 OBJECTIVES
To design, develop, and test an expert ocean modeling system for scientific and operational applications
To support advanced data assimilation strategies
To provide a platform for coupling with operational atmospheric models (like COAMPS)
To support massively parallel computations
To provide a common set of options for all coastal developers, with the goal of defining an optimum coastal/relocatable model for the Navy

5 APPROACH
Use state-of-the-art advances in numerical techniques, subgrid-scale parameterizations, data assimilation, nesting, computational performance, and parallelization
Modular design with ROMS as a prototype
Test and evaluate the computational kernel and various algorithms and parameterizations
Build a suite of test cases and application databases
Provide web-based support to the user community and a linkage to primary developers

6 CHALLENGE
“The complexity of physics, numerics, data assimilation, and hardware technology should be transparent to the expert and non-expert USER.”

7 TOMS KERNEL ATTRIBUTES
Free-surface, hydrostatic, primitive equation model
Generalized, terrain-following vertical coordinates (a representative transform is sketched after this list)
Boundary-fitted, orthogonal curvilinear, horizontal coordinates on an Arakawa C-grid
Non-homogeneous time-stepping algorithm
Accurate discretization of the baroclinic pressure gradient term
High-order advection schemes
Continuous, monotonic reconstruction of vertical gradients to maintain high-order accuracy
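For orientation, one common terrain-following transform of this kind, in the spirit of the SCRUM/ROMS lineage (Song and Haidvogel, 1994), maps a stretched coordinate s to depth; the slides do not spell out the exact form used by TOMS, so this is a representative sketch:

\[ z(x,y,s,t) \;=\; \zeta(x,y,t)\,(1+s) \;+\; h_c\,s \;+\; \bigl[h(x,y)-h_c\bigr]\,C(s), \qquad -1 \le s \le 0, \]

where \(\zeta\) is the free surface, \(h\) the local water depth, \(h_c\) a critical depth controlling where the stretching takes effect, and \(C(s)\) a monotonic stretching function with \(C(0)=0\) and \(C(-1)=-1\), so that \(z=\zeta\) at \(s=0\) and \(z=-h\) at \(s=-1\).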

8 Dispersive Properties of Advection
[Figure: effective wavenumber K(k)·Δx versus kΔx, comparing parabolic-spline reconstruction against centered finite differences of increasing order.]
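The finite-difference curves in plots of this kind follow standard modified-wavenumber expressions (the parabolic-spline curve from the original figure is not reproduced here):

\[ K(k)\,\Delta x = \sin(k\Delta x) \quad \text{(2nd-order centered)}, \qquad K(k)\,\Delta x = \tfrac{1}{6}\bigl[\,8\sin(k\Delta x) - \sin(2k\Delta x)\,\bigr] \quad \text{(4th-order centered)}, \]

while the exact operator has \(K(k)=k\). The point of the figure is that the continuous parabolic-spline reconstruction stays close to the exact line over a wider range of \(k\Delta x\) than centered differences of comparable cost.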

9 TOMS SUBGRID-SCALE PARAMETERIZATION
Horizontal mixing of tracers along level, geopotential, or isopycnic surfaces
Transverse, isotropic stress tensor for momentum
Local, Mellor-Yamada, level 2.5 closure scheme
Non-local, K-profile, surface and bottom closure scheme

10 TOMS BOUNDARY LAYERS
Air-Sea interaction boundary layer from COARE (Fairall et al., 1996)
Oceanic surface boundary layer (KPP; Large et al., 1994)
Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)

11 Boundary Layer Schematic
[Figure: 1. air-sea (atmospheric) boundary layer (ABL), 2. oceanic surface boundary layer (SBL), 3. bottom boundary layer (BBL), 4. wave/current/sediment bed boundary layer (WCBL), with shortwave and longwave radiative forcing at the surface.]

12 TOMS BOUNDARY LAYERS
Air-Sea interaction boundary layer from COARE (Fairall et al., 1996)
Oceanic surface boundary layer (KPP; Large et al., 1994)
Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)
Wave / Current / Sediment bed boundary layer (Styles and Glenn, 2000)
Sediment transport

13 TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)

14 Surface and Bottom Floats

15 TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)

16 Gulf of Maine M2 Tides: Surface Elevation (m)

17 TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)
River Runoff (Hetland, Signell, Geyer)

18 Hudson River Estuary
[Figure: along-estuary salinity section, Salinity (PSS, 5 to 30) versus Depth (m, 0 to -25) and Distance (km, 5 to 25).]

19 TOMS MODULES
Lagrangian Drifters (Klinck, Hadfield)
Tidal Forcing (Hetland, Signell)
River Runoff (Hetland, Signell, Geyer)
Biology:
  Fasham-type Model (Moisan, Shchepetkin)
  EcoSim Bio-Optical Model (Bissett)

20 TOMS TESTING
Systematic evaluation of numerical algorithms via robust test problems
Data/model comparisons
Study optimal combinations of algorithmic options for various coastal applications
Documentation of testing procedures

21 TOMS CODE DESIGN
Modular, efficient, and portable Fortran code (F77+, F90)
C-preprocessing managing multiple levels of nesting
Lateral boundary condition options: closed, periodic, and radiation
Arbitrary number of tracers (active and passive)
Input and output in NetCDF data structures
Support for parallel execution on both shared- and distributed-memory architectures

22
#include "cppdefs.h"

#ifdef EW_PERIODIC
# ifdef NS_PERIODIC
#  define GLOBAL_2D_ARRAY -2:Lm(ng)+2+padd_X,-2:Mm(ng)+2+padd_E
#  define START_2D_ARRAY -2,-2
# else
#  define GLOBAL_2D_ARRAY -2:Lm(ng)+2+padd_X,0:Mm(ng)+1+padd_E
#  define START_2D_ARRAY -2,0
# endif
#else
# ifdef NS_PERIODIC
#  define GLOBAL_2D_ARRAY 0:Lm(ng)+1+padd_X,-2:Mm(ng)+2+padd_E
#  define START_2D_ARRAY 0,-2
# else
#  define GLOBAL_2D_ARRAY 0:Lm(ng)+1+padd_X,0:Mm(ng)+1+padd_E
#  define START_2D_ARRAY 0,0
# endif
#endif

#define PRIVATE_1D_SCRATCH_ARRAY Istr-3:Iend+3
#define PRIVATE_2D_SCRATCH_ARRAY Istr-3:Iend+3,Jstr-3:Jend+3
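To make the macro concrete: in the non-periodic case, a declaration like the zeta field allocated on slide 25 expands mechanically under the C preprocessor as sketched here (the expansion itself is not shown on the slides):

      allocate (ALL_OCEAN(ng) % zeta(GLOBAL_2D_ARRAY,3))
!  with neither EW_PERIODIC nor NS_PERIODIC defined, this becomes
      allocate (ALL_OCEAN(ng) % zeta(0:Lm(ng)+1+padd_X,0:Mm(ng)+1+padd_E,3))

so every global array carries the physical boundary rows and columns plus the padd_X/padd_E padding.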

23
      module mod_ocean
/*
**  Copyright (c) 2001 TOMS Group                                    **
**===================================== Hernan G. Arango ===========**
**                                                                   **
**  2D Primitive Variables:                                          **
**                                                                   **
**  rubar   Right-hand-side of 2D U-momentum equation (m4/s2)        **
**  rvbar   Right-hand-side of 2D V-momentum equation (m4/s2)        **
**  rzeta   Right-hand-side of free surface equation (m3/s)          **
**  ubar    Vertically integrated U-momentum component (m/s)         **
**  vbar    Vertically integrated V-momentum component (m/s)         **
**  zeta    Free surface (m)                                         **
**                                                                   **
**  3D Primitive Variables:                                          **
**                                                                   **
**  pden    Potential density anomaly (kg/m3)                        **
**  rho     Density anomaly (kg/m3)                                  **
**  ru      Right-hand-side of 3D U-momentum equation (m4/s2)        **
**  rv      Right-hand-side of 3D V-momentum equation (m4/s2)        **
**  t       Tracer type variables (usually, potential temperature    **
**            and salinity)                                          **
**  u       3D U-momentum component (m/s)                            **
**  v       3D V-momentum component (m/s)                            **
**  W       S-coordinate (omega*Hz/mn) vertical velocity (m3/s)      **
*/

24
      use mod_kinds

      implicit none

CSMS$DISTRIBUTE(dh, 1, 2) begin
      type T_OCEAN
        real(r8), pointer :: rubar(:,:,:)
        real(r8), pointer :: rvbar(:,:,:)
        real(r8), pointer :: rzeta(:,:,:)
        real(r8), pointer :: ubar(:,:,:)
        real(r8), pointer :: vbar(:,:,:)
        real(r8), pointer :: zeta(:,:,:)
#ifdef SOLVE3D
        real(r8), pointer :: pden(:,:,:)
        real(r8), pointer :: rho(:,:,:)
        real(r8), pointer :: ru(:,:,:,:)
        real(r8), pointer :: rv(:,:,:,:)
        real(r8), pointer :: t(:,:,:,:,:)
        real(r8), pointer :: u(:,:,:,:)
        real(r8), pointer :: v(:,:,:,:)
        real(r8), pointer :: W(:,:,:)
#endif
      end type T_OCEAN
CSMS$DISTRIBUTE end

      type (T_OCEAN), allocatable :: ALL_OCEAN(:)

      end module mod_ocean

25
      subroutine allocate_mod_ocean
!
!=======================================================================
!  Copyright (c) 2001 TOMS Group                                       !
!===================================== Hernan G. Arango ===============!
!                                                                      !
!  This routine allocates and initializes all variables in module      !
!  "mod_ocean" for all nested grids.                                   !
!                                                                      !
!=======================================================================
!
      use mod_kinds
      use mod_param
      use mod_ocean
!
      implicit none
!
      integer :: ng
      real(r8), parameter :: IniVal=0.0_r8
!
      allocate (ALL_OCEAN(NestLevels))
!
      do ng=1,NestLevels
CSMS$SET_NEST_LEVEL (dh,ng)
        allocate (ALL_OCEAN(ng) % rubar(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rubar=IniVal
        allocate (ALL_OCEAN(ng) % rvbar(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rvbar=IniVal
        allocate (ALL_OCEAN(ng) % rzeta(GLOBAL_2D_ARRAY,2))
        ALL_OCEAN(ng) % rzeta=IniVal
        allocate (ALL_OCEAN(ng) % ubar(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % ubar=IniVal
        allocate (ALL_OCEAN(ng) % vbar(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % vbar=IniVal
        allocate (ALL_OCEAN(ng) % zeta(GLOBAL_2D_ARRAY,3))
        ALL_OCEAN(ng) % zeta=IniVal
#ifdef SOLVE3D
        allocate (ALL_OCEAN(ng) % pden(GLOBAL_2D_ARRAY,N))
        ALL_OCEAN(ng) % pden=IniVal
        allocate (ALL_OCEAN(ng) % rho(GLOBAL_2D_ARRAY,N))
        ALL_OCEAN(ng) % rho=IniVal
        allocate (ALL_OCEAN(ng) % ru(GLOBAL_2D_ARRAY,0:N,2))
        ALL_OCEAN(ng) % ru=IniVal
        allocate (ALL_OCEAN(ng) % rv(GLOBAL_2D_ARRAY,0:N,2))
        ALL_OCEAN(ng) % rv=IniVal
        allocate (ALL_OCEAN(ng) % t(GLOBAL_2D_ARRAY,N,3,NT))
        ALL_OCEAN(ng) % t=IniVal
        allocate (ALL_OCEAN(ng) % u(GLOBAL_2D_ARRAY,N,2))
        ALL_OCEAN(ng) % u=IniVal
        allocate (ALL_OCEAN(ng) % v(GLOBAL_2D_ARRAY,N,2))
        ALL_OCEAN(ng) % v=IniVal
        allocate (ALL_OCEAN(ng) % W(GLOBAL_2D_ARRAY,0:N))
        ALL_OCEAN(ng) % W=IniVal
#endif
      enddo

      return
      end subroutine allocate_mod_ocean

26

27
#include "cppdefs.h"
#ifdef SOLVE3D
      subroutine omega (ng,tile)
!
!====================================== Alexander F. Shchepetkin ======!
!  Copyright (c) 2001 TOMS Group                                       !
!===================================== Hernan G. Arango ===============!
!                                                                      !
!  This routine computes the S-coordinate vertical velocity (m^3/s),   !
!                                                                      !
!                 W = [Hz/(m*n)]*omega,                                !
!                                                                      !
!  diagnostically at horizontal RHO-points and vertical W-points.      !
!                                                                      !
!=======================================================================
!
      use mod_kinds
      use mod_param
      use mod_grid
      use mod_ocean
!
      implicit none
!
      integer, intent(in) :: ng, tile
      integer :: Iend, Istr, Jend, Jstr, trd
      integer :: my_threadnum
!
      trd=my_threadnum()
      call get_tile (ng,tile,Istr,Iend,Jstr,Jend)
      call omega_tile (ng,Istr,Iend,Jstr,Jend,                          &
     &                 ALL_OCEAN(ng) % W,                               &
     &                 ALL_GRID (ng) % Huon,                            &
     &                 ALL_GRID (ng) % Hvom,                            &
     &                 ALL_GRID (ng) % z_w)
      return
      end
!
!***********************************************************************
!
      subroutine omega_tile (ng,Istr,Iend,Jstr,Jend,W,Huon,Hvom,z_w)
!
      use mod_kinds
      use mod_param
      use mod_scalars
      use mod_sources
!
      integer, intent(in) :: Iend, Istr, Jend, Jstr, ng
CSMS$DISTRIBUTE(dh, 1, 2) begin
      real(r8), intent(inout) :: W(GLOBAL_2D_ARRAY,0:N)
CSMS$DISTRIBUTE end
csms$distribute(dh, 1, 2) begin
      real(r8), intent(in) :: Huon(GLOBAL_2D_ARRAY,N),                  &
     &                        Hvom(GLOBAL_2D_ARRAY,N),                  &
     &                        z_w (GLOBAL_2D_ARRAY,0:N)
csms$distribute end
!
      integer :: i, j, k
      real(r8) :: wrk(PRIVATE_1D_SCRATCH_ARRAY)

#include "set_bounds.h"
!
!  Vertically integrate the horizontal mass flux divergence.  Starting
!  with zero vertical velocity at the bottom, integrate from the
!  bottom (k=0) to the free surface (k=N).  W(:,:,N) then contains the
!  vertical velocity at the free surface, d(zeta)/d(t).  Notice that
!  the barotropic mass flux divergence is not used directly.
!
csms$check_halo(Huon<0,1><0,0>, Hvom<0,0><0,1>, 'omega')
      do j=Jstr,Jend
        do i=Istr,Iend
          W(i,j,0)=0.0_r8
        enddo
        do k=1,N
          do i=Istr,Iend
            W(i,j,k)=W(i,j,k-1)-                                        &
     &               (Huon(i+1,j,k)-Huon(i,j,k)+                        &
     &                Hvom(i,j+1,k)-Hvom(i,j,k))
          enddo
        enddo
        do i=Istr,Iend
          wrk(i)=W(i,j,N)/(z_w(i,j,N)-z_w(i,j,0))
        enddo
!
!  In order to ensure zero vertical velocity at the free surface,
!  subtract the vertical velocities of the moving S-coordinate
!  iso-surfaces.  These are proportional to d(zeta)/d(t), with a
!  proportionality coefficient that is a linear function of the
!  S-coordinate: zero at the bottom (k=0) and unity at the free
!  surface (k=N).
!
        do k=Nm,1,-1
          do i=Istr,Iend
            W(i,j,k)=W(i,j,k)-wrk(i)*(z_w(i,j,k)-z_w(i,j,0))
          enddo
        enddo
        do i=Istr,Iend
          W(i,j,N)=0.0_r8
        enddo
      enddo
!
!  Set lateral boundary conditions.
!
      call w3dbc_tile (ng,Istr,Iend,Jstr,Jend,W)
CSMS$EXCHANGE(W<2,1><2,1>)
csms$compare_var(W, 'omega')
      return
      end
#else
      subroutine omega
      return
      end
#endif /* SOLVE3D */

28 TOMS PARALLEL DESIGN
Coarse-grained parallelization

29 PARALLEL TILE PARTITIONS
[Figure: an 8 x 8 tile partition of the Nx by Ny horizontal grid.]

30 TOMS PARALLEL DESIGN
Coarse-grained parallelization (see the toy tiling sketch after this list)
Shared-memory, compiler-dependent directives in MAIN (OpenMP standard)
Distributed-memory (MPI; SMS)
Optimized for cache-bound computers
ZIG-ZAG cycling sequence of tile partitions
Few synchronization points (around 6)
Serial and parallel I/O (via NetCDF)
Efficient for 4-64 threads
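A minimal sketch of the coarse-grained, tile-based shared-memory idea follows.  This is hypothetical toy code (names such as work_tile are made up), not the TOMS source, whose tiling machinery is more elaborate: each OpenMP thread processes whole tiles of the 2-D grid.

      program tile_demo
      implicit none
      integer, parameter :: r8=selected_real_kind(12,300)
      integer, parameter :: Lm=128, Mm=128          ! interior grid size
      integer, parameter :: NtileI=8, NtileJ=8      ! 8 x 8 tile partition
      real(r8) :: a(Lm,Mm)
      integer :: tile
!
!  Each iteration of the tile loop is independent, so whole tiles are
!  handed out to threads (coarse-grained parallelism).
!
!$OMP PARALLEL DO SCHEDULE(STATIC)
      do tile=0,NtileI*NtileJ-1
        call work_tile (tile)
      enddo
!$OMP END PARALLEL DO
      print *, 'checksum = ', sum(a)
!
      contains
!
      subroutine work_tile (tile)
!
!  Compute this tile's index bounds and do stand-in work on it.  Tiles
!  cover disjoint index ranges, so no synchronization is needed here.
!
      integer, intent(in) :: tile
      integer :: i, j, Istr, Iend, Jstr, Jend
      Istr=mod(tile,NtileI)*(Lm/NtileI)+1
      Iend=Istr+Lm/NtileI-1
      Jstr=(tile/NtileI)*(Mm/NtileJ)+1
      Jend=Jstr+Mm/NtileJ-1
      do j=Jstr,Jend
        do i=Istr,Iend
          a(i,j)=real(i+j,r8)                       ! stand-in for real work
        enddo
      enddo
      end subroutine work_tile
      end program tile_demo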

31 TOMS DATA ASSIMILATION
Nudging (sketched schematically after this list)
Optimal Interpolation (OI)
Tangent linear and adjoint algorithms
4D VARiational data assimilation (4DVAR) and Physical-space Statistical Analysis System (PSAS) algorithms
Inverse Ocean Modeling System (IOMS)
Ensemble prediction platform based on singular value decomposition
Error Subspace Statistical Estimation (ESSE)
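Schematically, nudging relaxes a model field toward observed or analyzed values on a prescribed time scale (the notation here is generic, not taken from the slides):

\[ \frac{\partial T}{\partial t} = \cdots + \frac{T_{\mathrm{obs}} - T}{\tau}, \]

where \(T\) is any prognostic field, \(T_{\mathrm{obs}}\) its observed or analyzed counterpart, and \(\tau\) the relaxation time scale; the same term can be confined to boundary regions or applied throughout the domain.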

32 ESSE Flow Diagram
[Figure: ESSE flow chart, including field and error-subspace initialization, a scalable parallel ensemble forecast, a minimum-error-variance update within the error subspace (sequential processing of observations), adaptive error-subspace learning with a convergence criterion, and ESSE smoothing via statistical approximation; inputs include historical, synoptic, and future in situ/remote field and error observations together with measurement models, data residuals, and measurement error covariances.]

33 PRESSURE GRADIENT FORCE (the terrain-following splitting these schemes address is sketched below)
Density Jacobian class (Blumberg and Mellor, 1987; Song, 1998; Song and Wright, 1998)
  More accurate
  Error vanishes for linear density profiles
Pressure Jacobian class (Lin, 1998; Shchepetkin and McWilliams, 2001)
  JEBAR consistent
  Conserves energy
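The difficulty both classes address is the standard splitting of the horizontal pressure gradient in terrain-following coordinates; schematically, in one horizontal dimension (a textbook result, not taken from the slide):

\[ -\frac{1}{\rho_0}\left.\frac{\partial p}{\partial x}\right|_{z} = -\frac{1}{\rho_0}\left.\frac{\partial p}{\partial x}\right|_{s} - \frac{g\,\rho}{\rho_0}\left.\frac{\partial z}{\partial x}\right|_{s}, \]

where the two terms on the right are individually large over steep topography and nearly cancel, so small truncation errors in either one appear as a spurious along-slope force; the Jacobian formulations are designed so that this cancellation is exact in the discrete equations for simple (e.g. linear-in-z) stratifications.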

34 RESULTS (YEAR 1)
Build TOMS from the ROMS prototype
Mellor-Yamada, level 2.5
Passive and active open boundary conditions
Tidal forcing
River runoff
Lagrangian drifters
Data assimilation
Inter-comparison between POM and ROMS
Evaluation of time-stepping, advection, and pressure gradient algorithms
Initial development of the TOMS web site

35 TRANSITION PATHS
To Be Determined!
Potential users: NAVO, FNMOC, NOAA, USCG

