1
Evaluation of Operational Forecast Systems for Coastal Ocean and Estuarine Environments
Frank Aikman, Lyon Lanerolle & Richard Patchen
NOAA's National Ocean Service, Coast Survey Development Laboratory
ePOPf, Portland, OR, September 20-23, 2010
2
Evaluation of Operational Forecast Systems: Outline
- OFS requirements; objective; system
- The Coastal Ocean Modeling Framework
- Model skill assessment
- Community approach and transition
- Challenges and on-going issues
3
NOS Oceanographic Forecast Systems Requirements (user-driven products)
Primary mission: support of safe and efficient navigation
- Water levels for under-keel clearance
- Currents for right-of-way and maneuverability
Emergency response (currents, T, S)
- HAZMAT
- Search and Rescue
- Homeland Security
For environmentally sound management of the coastal zone
- Ecosystem applications
- Marine geospatial applications
[Slide images: salinity and SST]
4
National Operational Coastal Modeling Program
OBJECTIVE: Develop a national network of operational hydrodynamic models providing nowcasts and short-term (0-48 hr) forecasts of water levels, currents, salinity, and temperature.
5
Water level forecasts at CBBT (Chesapeake Bay Bridge-Tunnel) and Kiptopeke from CBOFS (Chesapeake Bay Operational Forecast System)
6
NOS Coastal Ocean Modeling Framework (COMF) and individual model systems [flow diagram]:
Real-time data ingest → QA/QC (COMF) → operational models (on NOAA's HPC) → forecast model guidance (water level, water temperature, currents, and salinity) → products (web pages and digital point and gridded data) for users at tidesandcurrents.noaa.gov, with 24 x 7 QA/QC by CORMS.
Data tanks at CCS:
- Atmospheric forcing
- Coastal boundary conditions
- Riverine freshwater inputs
Products and archives reside on a Linux server in CO-OPS.
7
Coastal Ocean Modeling Framework (COMF)
Consists of middleware to manage workflow.
OBJECTIVE: more efficient R&D and O&M.
PURPOSE:
- Simplify data handling and maintenance
- Provide a standard system for all locations (NetCDF)
- Share skill assessment and evaluation tools
- Enable an efficient technology-transfer process
Various models allowed for experimentation: ADCIRC, ECOM, EFDC, ELCIRC, FVCOM, POM, SELFE, QUODDY, ROMS.
NOS "corporate" models to move forward with:
- For OFS: structured grid - ROMS; unstructured grid - FVCOM
- For VDatum tidal modeling and storm surge - ADCIRC
Consistent with IOOS (DMAC), the Earth System Modeling Framework (ESMF), and COARDS/CF conventions (a minimal NetCDF sketch follows below).
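To illustrate the NetCDF/CF standardization idea, here is a minimal sketch of writing a CF-style station file with the netCDF4-python library. The file name, the variable name zeta, and the attribute choices are assumptions for illustration and do not reflect the actual COMF file specification.

```python
import numpy as np
from netCDF4 import Dataset

# Hypothetical single-station water level file; names and attributes are
# illustrative CF-style choices, not the actual COMF file specification.
t = np.arange(0, 6 * 3600, 360, dtype="f8")            # 6-minute steps over 6 hours
wl = 0.1 * np.sin(2 * np.pi * t / (12.42 * 3600.0))    # toy tidal signal, metres

with Dataset("ofs_station_wl.nc", "w") as ds:
    ds.Conventions = "CF-1.6"
    ds.title = "OFS nowcast water level at one station (illustrative)"
    ds.createDimension("time", None)                    # unlimited record dimension

    time = ds.createVariable("time", "f8", ("time",))
    time.units = "seconds since 2010-01-01 00:00:00"
    time.standard_name = "time"

    zeta = ds.createVariable("zeta", "f4", ("time",), fill_value=-9999.0)
    zeta.units = "m"
    zeta.long_name = "water surface elevation above datum"

    time[0:t.size] = t
    zeta[0:t.size] = wl
```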
8
Coastal Ocean Modeling Framework Standards and Procedures
Models, products, assessment, and documentation will be as uniform as possible:
- NOS Procedures for Developing and Implementing Operational Nowcast and Forecast Systems for PORTS, NOAA Tech. Rep. NOS CO-OPS 20, January 1999
- NOS Procedures for Developing and Implementing Operational Nowcast and Forecast Hydrodynamic Model Systems, NOAA Tech. Rep. NOS CO-OPS 39, May 2003
- NOS Standards for Evaluating Operational Nowcast and Forecast Hydrodynamic Model Systems, NOAA Tech. Rep. NOS CS 17, October 2003
- Strategic and Implementation Plan for the National Operational Coastal Modeling Program FY2004-2010, NOAA/NOS OCS and CO-OPS, 2004
- Implementation of Model Skill Assessment Software for Water Level and Current in Tidal Regions, NOAA Tech. Rep. NOS CS 24, March 2006 (updated September 2010)
- Zhang, A., K.W. Hess and F. Aikman III, 2010: User-based Skill Assessment Techniques for Operational Hydrodynamic Forecast Systems. Journal of Operational Oceanography, 3(2), pp. 11-24.
9
NOS Operational Forecast System Evaluation (Model Skill Assessment)
Objectives:
- Measure the performance of model simulations (tidal simulations, a hindcast, nowcasts, and forecasts) by comparing with observations
- All models are assessed against the NOS skill assessment standards before transition to operations
Functional steps:
- Data acquisition and processing (observations and model outputs)
- Time-interval conversion and gap filling
- Concatenation of model outputs
- Filtering
- Tidal harmonic analysis and prediction (a least-squares sketch follows below)
- Extraction of extrema/events and slack water
- Computation of statistical variables
- Generation of skill assessment score tables
- Comparison of harmonic constants
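The tidal harmonic analysis and prediction step can be sketched as an ordinary least-squares fit of a few constituents to a water level series. The four constituents and the function name fit_tide are illustrative assumptions; the operational software uses the full NOS constituent set with nodal corrections.

```python
import numpy as np

# Illustrative constituent frequencies (cycles per hour); a real analysis
# uses the full NOS constituent set with nodal corrections.
FREQS_CPH = {"M2": 1.0 / 12.4206012, "S2": 1.0 / 12.0,
             "K1": 1.0 / 23.9344696, "O1": 1.0 / 25.8193417}

def fit_tide(t_hours, wl):
    """Least-squares fit: wl ~ z0 + sum_k [a_k cos(w_k t) + b_k sin(w_k t)]."""
    cols = [np.ones_like(t_hours)]
    for f in FREQS_CPH.values():
        w = 2.0 * np.pi * f
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, wl, rcond=None)
    predicted = A @ coef           # harmonic (tidal) prediction at t_hours
    residual = wl - predicted      # non-tidal residual used in skill assessment
    return coef, predicted, residual
```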
10
Skill Assessment Variables and Statistics Used
Error: the error is defined as the predicted value minus the reference (observed or astronomical tide) value; a computational sketch of these statistics follows below.
- SM - series mean
- RMSE - root mean square error
- SD - standard deviation
- CF(X) - central frequency: percentage of errors that lie within the limits ±X
- POF(X)/NOF(X) - positive/negative outlier frequency: percentage of errors greater than X / less than -X
- MDPO(X)/MDNO(X) - maximum duration of positive/negative outliers: two or more consecutive occurrences of an error greater than X / less than -X
- WOF(X) - worst-case outlier frequency: percentage of errors that exceed X

NOS skill assessment criteria (metrics):

Variable | Water Level  | Currents
CF       | 15 cm (>90%) | 26 cm/s (>90%)
POF/NOF  | 15 cm (<1%)  | 26 cm/s (<1%)
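A minimal sketch of the tabulated statistics, assuming a 1-D error series (predicted minus reference) in the same units as the limit X. The function name and dictionary keys are hypothetical, and the outlier thresholds follow the definitions above (beyond ±X).

```python
import numpy as np

def skill_stats(error, x):
    """SM, RMSE, SD, CF(X), POF(X), NOF(X) for an error series (predicted minus reference)."""
    e = np.asarray(error, dtype=float)
    n = e.size
    return {
        "SM":   e.mean(),                                      # series mean (bias)
        "RMSE": np.sqrt(np.mean(e ** 2)),                      # root mean square error
        "SD":   e.std(ddof=1),                                 # standard deviation
        "CF":   100.0 * np.count_nonzero(np.abs(e) <= x) / n,  # % within +/- X (target > 90%)
        "POF":  100.0 * np.count_nonzero(e > x) / n,           # % above +X (target < 1%)
        "NOF":  100.0 * np.count_nonzero(e < -x) / n,          # % below -X (target < 1%)
    }

# e.g. skill_stats(wl_forecast - wl_observed, x=0.15)  # water level limit of 15 cm
```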
11
Acceptable error limits (X) and maximum duration limits (L) for variables in tidal regions (a duration-check sketch follows below):

Variable Name             | X        | L (hr)
Water level (WL)          | 15 cm    | 24
Extrema WL amplitude      | 15 cm    | 24
Extrema WL time           | 0.5 hr   | 24
Current speed             | 0.26 m/s | 24
Extrema current speed     | 0.26 m/s | 24
Extrema current time      | 0.5 hr   | 24
Slack water time          | 0.25 hr  | 24
Current direction         | 22.5 deg | 24
Extrema current direction | 22.5 deg | 24
Water temperature         | 2.0 °C   | 24
Salinity                  | 3.0 PSU  | 24
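The maximum duration limits L can be checked with a sketch like the one below, assuming a uniformly sampled error series; MDPO and MDNO are returned in hours so they can be compared against L = 24 hr. The function name and arguments are illustrative.

```python
import numpy as np

def max_outlier_duration(error, x, dt_hours):
    """Longest run of consecutive errors beyond the limit, in hours.

    Returns (MDPO, MDNO): maximum duration of positive/negative outliers for a
    uniformly sampled error series with spacing dt_hours.
    """
    def longest_run(mask):
        run = best = 0
        for flag in mask:
            run = run + 1 if flag else 0
            best = max(best, run)
        return best * dt_hours

    e = np.asarray(error, dtype=float)
    return longest_run(e > x), longest_run(e < -x)

# Example: 6-minute water level errors checked against X = 0.15 m, L = 24 hr
# mdpo, mdno = max_outlier_duration(errors, x=0.15, dt_hours=0.1)
# passes = (mdpo < 24.0) and (mdno < 24.0)
```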
12
Examples of NOS skill assessment scores for 2008.
Left to right: Bayonne Bridge, NJ; Baltimore, MD; Buckman Bridge, FL; and Eagle Point, TX.
Top: central frequency (CF) of water level amplitudes; bottom: root mean square error (RMSE) of water level amplitudes.
13
Synoptic CBOFS2 Hindcast: T and S Validation
- Stations cover the Bay axis and also the tributaries
- Observations are from the Chesapeake Bay Program
- Comparison period with observations: January 1, 2004 - September 1, 2005
- The first 7 months of the simulation are discarded to account for adjustment/spin-up
14
Local Spatio-Temporal Error Structure
- CBOFS2 model solutions interpolated in space and time to the Bay Program observation locations (a sketch follows below)
- Account for depth differences at the surface and bottom
- Look at differences in T and S between CBOFS2 and observations in both space and time, i.e. the local "error structure"
- Very useful, but too much information; need to condense/summarize
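A simplified sketch of sampling gridded model output at an observation location and time, using nearest-neighbour lookup in space and linear interpolation in time. The function name and arguments are assumptions; the actual CBOFS2 evaluation may use a different interpolation scheme (e.g. bilinear on the curvilinear grid).

```python
import numpy as np
from scipy.spatial import cKDTree

def model_at_station(lon2d, lat2d, model_t, model_field, obs_lon, obs_lat, obs_t):
    """Sample a gridded model field at an observation point and time.

    lon2d, lat2d : 2-D model grid coordinates
    model_field  : array shaped (time, ny, nx) of model values
    Nearest neighbour in space (lon/lat treated as planar for simplicity),
    linear interpolation in time.
    """
    tree = cKDTree(np.column_stack([lon2d.ravel(), lat2d.ravel()]))
    _, flat_idx = tree.query([obs_lon, obs_lat])
    j, i = np.unravel_index(flat_idx, lon2d.shape)
    series = model_field[:, j, i]
    return np.interp(obs_t, model_t, series)
```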
15
Global Spatio-Temporal Error Structure I
- Look at global trends by converting the spatio-temporal errors into PDFs, without discriminating in space or time or for seasonal cycles, etc. (a histogram sketch follows below)
- CBOFS2 T is generally cooler than the observations, with an overall probability of ~0.6 and the largest contribution in the [-0.5 °C, 0 °C] interval (~0.3)
- CBOFS2 S is generally saltier than the observations, with an overall probability of ~0.9 and the largest contribution in the [2.5, 3.0] PSU interval (~0.17)
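The conversion of pooled errors into a PDF can be sketched with a simple histogram. The 0.5 °C bin width mirrors the temperature interval quoted above; the range limits and function name are assumptions.

```python
import numpy as np

def error_pdf(errors, bin_width=0.5, lo=-5.0, hi=5.0):
    """Empirical PDF of model-minus-observation differences.

    Bin probabilities sum to 1 over [lo, hi]; bin choices are illustrative.
    """
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(np.asarray(errors, dtype=float), bins=edges)
    prob = counts / counts.sum()
    return edges, prob

# e.g. P(model cooler than obs) ~ prob[edges[:-1] < 0].sum()
```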
16
T-S Error Variation with Depth
17
T Error Structure in the Horizontal
- RMSE shows the average CBOFS2 error and ME shows the error bias (warmer/cooler); a per-level computation sketch follows below
- Look at the surface, 15 feet (4.6 m), and the bottom
- Surface and 15-feet errors are similar; the largest errors are at the bottom
- Errors lie primarily along the axis of the Bay
- Mean errors ~ [-1 °C, +1 °C]; RMSE ~ [0 °C, 2 °C]
- CBOFS2 is excessively cool at the surface and excessively warm at the bottom
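A sketch of the RMSE and ME (mean error) computation per station and depth level, assuming the paired model and observed values sit in a pandas DataFrame with hypothetical columns station, level, model, and obs.

```python
import numpy as np
import pandas as pd

def rmse_me_by_level(df):
    """RMSE and ME of (model - obs) grouped by station and depth level.

    df is assumed to have columns: station, level ('surface'|'15ft'|'bottom'), model, obs.
    """
    err = df.assign(error=df["model"] - df["obs"])
    g = err.groupby(["station", "level"])["error"]
    return pd.DataFrame({
        "ME":   g.mean(),                                    # bias: positive = too warm
        "RMSE": g.apply(lambda e: np.sqrt(np.mean(e ** 2))),
    })
```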
18
NOS Operational Forecast System Evaluation (Model Skill Assessment)
Skill assessment metrics shown:
- RMSE/CF/POF/NOF: global, integrated error measures; no error structure/bias information
- Spatio-temporal error/difference plots: good for examining local error structure
- Histograms of the error/difference: good for examining systematic errors/biases
Skill assessment metrics being considered:
- 2-D fields: spatial (X, Y) difference plots, as a function of depth and/or time
- Taylor diagrams (Jolliff and Kindle, 2007)
- Target/Jolliff diagrams (Jolliff et al., 2009), e.g. CBOFS2 model-data surface S misfit (28 CBP stations)
- Stratification indices; upwelling indices
- Average discrete Fréchet distance methods: to look at how similar two curves, e.g. vertical salinity profiles, are in shape and value (a sketch follows below)
- Edge-detection methods: to compare two 2-D patches for similarity of shape and extent (area), e.g. for HABs
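Of the candidate metrics, the discrete Fréchet distance is straightforward to sketch: below is the standard Eiter-Mannila dynamic program for two curves given as point sequences (e.g. modelled and observed salinity profiles expressed as depth-salinity pairs). The averaged variant named on the slide replaces the max-coupling with a mean and is not reproduced here; the function name is an assumption.

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Frechet distance between two curves given as (n, d) point arrays."""
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)  # pairwise point distances
    ca = np.full((n, m), np.inf)                                # coupling-search table
    ca[0, 0] = d[0, 0]
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]), d[i, j])
    return ca[-1, -1]
```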
19
The Community Approach
Given limited resources, a community approach:
- Allows for open discussion of the strengths and weaknesses of different models
- Elucidates the requirements of a common shared infrastructure
- Allows model improvements to be shared effectively
- Advances the science (research and operations)
- Leverages resources and amplifies the voice of the community
20
Prototype Test Bed: The Delaware Bay Model Evaluation Environment
Community models: ROMS, POM, ADCIRC, FVCOM, ELCIRC, SELFE, ...
Model hindcast inputs: grids, bathymetry, environmental conditions, historical data, and metrics
[Map locations: Cape Henlopen, Cape May, Maurice River, Bridgeton]
21
NOAA/NOS Coastal Modeling Challenges
- Continued collaboration with the ocean modeling community
- Transition from individual port models to a regional modeling approach
- Transitioning NOS OFS to the NOAA HPC facility maintained by NCEP
- Coupled model systems: riverine-estuarine-coastal-basin; hydrodynamic-wave; hydrodynamic-sediment transport; physical-biogeochemical coupling (ecological, water quality, habitat)
- Forecast uncertainty estimation: probabilistic approach; ensemble averaging
- Data assimilation techniques: HF radar, coastal altimetry, IOOS data, etc.
- Higher spatial resolution in key areas (e.g. in navigation channels; for storm surge and inundation modeling): nesting vs. unstructured grids; finite difference, finite element, and finite volume approaches