
1 IUGG/IAHS 26th General Assembly, Prague, Czech Republic, 28 June 2015
Creating a real-time, automated demonstration and evaluation of short-to-seasonal-range streamflow forecasting in US watersheds
Andy Wood, NCAR Research Applications Laboratory, Hydrometeorological Applications Program
Coauthors: P. Mendoza (NCAR), M. Clark (NCAR), B. Nijssen (U. Washington), A. Newman (NCAR), L. Brekke (Reclamation), J. Arnold (US Army Corps of Engineers)

2 Are US operational forecasts improving? Since 1997, there have been notable advances in capabilities supporting hydrologic prediction. Are we harnessing those advances? [Figure: operational forecast RMSE, 1997-2012; source: http://www.srh.noaa.gov/abrfc/fcstver/]

3 ‘End Member’ Forecast System Approaches
Operational (1980s =>): tailored but limited (mostly deterministic) forecasts that are inputs to water management; simple conceptual models that can be adjusted manually in real time; heavy use of model calibration; reliance on human expertise at the model/data system level; a non-reproducible, non-scalable forecast process; run on a small number of workstations.
Research/experimental (2000s =>): non-tailored, centrally produced forecasts, typically in the form of percentile or frequency analyses; ‘physically based’, high-dimensional (grid-to-grid) models; little model calibration; an automated, reproducible forecast process; ensemble outputs that require supercomputing.

4 End Member Forecast System Philosophies
Operational (1980s =>): “The models, data and systems will always be inadequate, thus human expertise is needed to fix performance on the fly.” “If the decisions using your model outputs require a certain answer, the models must be simple so that they can be adjusted to provide that answer.”
Research/experimental (2000s =>): “The superior physics in new models and datasets will yield good-quality results.” “Any problems can be fixed with higher resolution and more detailed physical process representation.” “Decision makers can use risk/hazard levels rather than flows, even if the flows are poor.”

5 USACE/Reclamation Focus: Sufficiency of Objective Methods
Phase 1, Making an Automated Forecasting System Work (models, data, systems): a workflow/data-management platform for real-time operations, linking spinup forcings, forecast forcings, hydrologic and other models, hydrologic and other observations, streamflow and other outputs, historical forcings, and products/website.
Phase 2, Making an Automated Forecasting System Work Well (methods and tradeoffs): the platform is extended to hindcasting, ensembles (uncertainty), and benchmarking, adding automated QC, calibrated downscaling, (regional) parameter estimation, objective data assimilation, adjustments, verification, and post-processing/forecast calibration, with feedback into component improvements.

6 Motivation: Assess Alternatives in the River Forecasting Process. [Diagram: a river forecasting system in which observed data pass through analysis and quality control; weather and climate forecasts supply forecast precipitation and temperature; calibration supplies model parameters and state updating adjusts model states; hydrologic model outputs feed analysis products, graphics, and river forecasts that support water-management decisions.]

7 Streamflow Prediction System Elements: candidate opportunities for advancement
1) alternative hydrologic model(s)
2) new forcing data/methods to drive hydrologic modeling
3) objective calibration methods to support hydrologic model implementation
4) automated data assimilation to specify initial watershed conditions for hydrologic forecasts
5) new weather and climate forecast data and methods
6-7) methods to post-process streamflow forecasts and reduce systematic errors
8) benchmarking / hindcasting / verification system / ensembles (not shown)

8 R2O/O2R (research-to-operations, and back) has been notoriously difficult in the US. Two water agencies (USACE and Reclamation) are supporting NCAR to assess the adequacy of current science and datasets for crossing the R2O gap.

9 Real-time demonstration and evaluation project. In water-management case study basins, demonstrate and evaluate experimental, automated days-to-seasons flow forecasts using:
- various real-time forcing generation approaches
- ensemble meteorological forecasts and downscaling techniques
- variations in model physics and architecture
- automated, objective model calibration
- data assimilation
- flow forecast post-processing
- hindcasting and verification
Partner with water agency field office personnel to support evaluation, guide product development, and develop decision calendars.

10 Watershed Modeling Dataset. Goal: a framework for assessing watershed models CONUS-wide, including for short-range and seasonal ensemble forecasting.
- Basin selection: used GAGES-II and the Hydro-Climatic Data Network (HCDN-2009)
- Initial data, models, and calibration approach: forcing via Daymet (http://daymet.ornl.gov/); the NWS operational Snow-17 and Sacramento soil moisture accounting models (Snow-17/SAC); the shuffled complex evolution (SCE) global optimization routine
See Newman et al. (HESS, 2015). [Figure: calibration NSE across basins.]
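The calibration skill score in the figure, the Nash-Sutcliffe efficiency (NSE), is a standard hydrologic objective function. A minimal sketch of how it is computed (the flow values here are invented for illustration; this is not the project's calibration code):

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect simulation;
    0 means the simulation is no better than the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily flows (cms)
obs = np.array([10.0, 12.0, 30.0, 55.0, 40.0, 22.0])
print(nse(obs, obs))                      # perfect simulation -> 1.0
print(nse(np.full(6, obs.mean()), obs))   # mean-only simulation -> 0.0
```

An optimizer such as SCE would search the model's parameter space to maximize this score (or minimize 1 - NSE).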

11 Initial case study basins for real-time prediction, chosen for their varied hydroclimates, relatively unimpaired flows, and contribution to reservoir inflows. http://www.ral.ucar.edu/staff/wood/case_studies/

12 A unified approach to hydrologic modeling: the Structure for Unifying Multiple Modeling Alternatives (SUMMA). See Clark et al. (WRR, 2015a,b).

13 Pragmatic Model Architectures & Physics: what are appropriate and tractable scales and complexity to capture variability? [Figure: Snow17-Sacramento and SUMMA-Sacramento configurations of the Crystal River basin as one lump, as HRUs, and as elevation bands.]

14 Structure dependence of model states, e.g., SWE on June 20, 1983, near the start of the rise to peak:
- Snow17-HRU: too much SWE at lower elevations
- Snow17-lump: too little at higher elevations
- SUMMA-band: too little at higher elevations
- Snow17-band: probably about right, given flow performance

15 Impact on streamflows. [Figure: simulated hydrographs for Snow17-lump, Snow17-HRU, Snow17-band, and SUMMA-band around June 20.]

16 An operational example of automated particle filter (PF) data assimilation, as an alternative to manual spinup: an ensemble initialization system by Amy Sansone and Matt Wiley, 3TIER (slide from a DOH meeting talk, 2012).

17 Ensemble Model Forcings (100 members). [Figure: example monthly precipitation totals from two ensemble members (a-b), along with the monthly ensemble mean (c) and standard deviation (d); April 2008.] Estimating this uncertainty is valuable for:
- more robust model calibration
- input to ensemble data assimilation techniques

18 Seasonal forecasting methods for intercomparison: simpler statistical approaches provide a benchmark for intensive dynamical approaches. Forecasting centers around the world have offered seasonal streamflow predictions for decades, and methods span a wide range:
a. regression of flow on in situ observations (rainfall, SWE, flow), where ‘regression’ means any regressive technique, e.g., PCR or MLR
b. the same, but with teleconnection indices included as predictors
c. the same, but with custom climate-state predictors (e.g., EOFs of SST) or climate forecasts
d. land-model-based ensemble simulation (e.g., ESP or HEPS) without a climate forecast, with short-to-medium-range prediction embedded, or with climate-index (or custom-index) weighting
e. climate-forecast-weighted ESP (e.g., using CFSv2 or NMME in the US)
f. climate forecasts downscaled with weather generation for land-model ESP input, from one land/climate model to multi-model, and from simple land models to hyper-resolution
g. d-f with statistical post-processing to correct model bias
h. d-f with post-processing to correct bias and merge with other predictions (cf. the BOM approach)
i. d-f with DA to correct land-model errors (particularly snow variables)
j. d-f with both post-processing AND DA
Operational examples include the USDA and NWS regression outlooks, NWS ESP, and NWS HEFS.
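Climate-index weighting of ESP (methods d-e above) can be sketched as follows: historical ensemble traces are weighted by how closely each trace's climate-index value matches the current index. The Gaussian kernel, bandwidth, and all numbers are illustrative assumptions, not the deck's actual method:

```python
import numpy as np

def index_weighted_esp(traces, hist_index, current_index, bandwidth=1.0):
    """Weight historical ESP traces by closeness of each trace's climate
    index to the current index (Gaussian kernel), then return the
    weighted-mean seasonal volume and the weights themselves."""
    traces = np.asarray(traces, float)
    w = np.exp(-0.5 * ((np.asarray(hist_index, float) - current_index) / bandwidth) ** 2)
    w /= w.sum()  # normalize to a probability weighting over traces
    return float(np.sum(w * traces)), w

# Hypothetical seasonal volumes (kaf), one per historical ESP trace,
# with a hypothetical climate index value for each historical year.
volumes = np.array([800.0, 950.0, 700.0, 1200.0, 1050.0])
index = np.array([-1.2, 0.3, -0.8, 1.5, 0.9])

forecast, weights = index_weighted_esp(volumes, index, current_index=1.0)
```

Equal weights recover plain ESP; sharper kernels (smaller bandwidth) lean harder on analogue years, which is the basic tradeoff such weighting schemes tune.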

19 Climate diagnostics: predictor fields. Some examples for the Crystal River above Avalanche Creek. [Figure: predictor fields for January 1, February 1, March 1, and April 1 forecast dates.]

20 Climate diagnostics: indices. Some examples for the Crystal River above Avalanche Creek. Predictor index = Mean[positively correlated region] - Mean[negatively correlated region]. Variable: geopotential height at 700 mb. [Figure: correlation maps for January 1, February 1, March 1, and April 1 forecast dates.]
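The index definition above can be sketched directly: average the field over cells that correlate positively with the flow target, subtract the average over negatively correlated cells. The correlation threshold and all arrays here are illustrative assumptions:

```python
import numpy as np

def correlation_index(field, corr, thresh=0.3):
    """Predictor index = mean of the field over positively correlated
    cells minus its mean over negatively correlated cells, keeping only
    cells whose |correlation| exceeds a threshold."""
    pos = corr > thresh
    neg = corr < -thresh
    return float(field[pos].mean() - field[neg].mean())

rng = np.random.default_rng(1)
z700 = rng.normal(loc=3000.0, scale=50.0, size=(10, 10))  # hypothetical 700-mb heights (m)
corr = rng.uniform(-1.0, 1.0, size=(10, 10))              # hypothetical correlation map
idx = correlation_index(z700, corr)
```

The resulting scalar index can then serve as a regression predictor (method b/c on slide 18) or as the weighting index in index-weighted ESP.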

21 Seasonal runoff volume hindcasts: Hungry Horse Reservoir inflow, February 1 lead time for the April-July period. [Figure: hindcast skill for raw ESP; bias-corrected (BC) ESP; BC-ESP plus best statistical (land) predictor with equal weights; BC-ESP plus best statistical (land) predictor with RMSE weighting; and trace weighting with (land) RMSE weighting.]
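One way the RMSE-weighted combinations in the figure can be formed is to blend candidate forecasts with weights proportional to inverse squared hindcast RMSE, so more skillful methods dominate. The inverse-variance weighting rule and all numbers here are illustrative assumptions, not the deck's exact scheme:

```python
import numpy as np

def rmse_weighted_combination(forecasts, hindcast_rmse):
    """Blend candidate forecasts with weights proportional to
    1 / RMSE^2 from each method's hindcast record."""
    rmse = np.asarray(hindcast_rmse, float)
    w = 1.0 / rmse ** 2
    w /= w.sum()
    return float(np.sum(w * np.asarray(forecasts, float))), w

# Hypothetical April-July volume forecasts (kaf) from three methods,
# with each method's hindcast RMSE over a common verification period.
forecasts = np.array([1500.0, 1650.0, 1420.0])
rmses = np.array([210.0, 180.0, 260.0])

blend, weights = rmse_weighted_combination(forecasts, rmses)
```

Equal weighting is the natural benchmark; RMSE weighting only helps when the hindcast skill differences are real rather than sampling noise, which is one reason the figure compares both.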

22 Short-to-medium-range forecasts.
Goal: downscaled ensemble meteorological forecasts that enable estimation of prediction uncertainty.
Method: use locally weighted multivariate regression to downscale GEFS (reforecast) atmospheric predictors to watershed precipitation and temperature.
Status: have run water years 2008-2012 with daily updates for Hungry Horse inflow, at a daily timestep out to 15-day lead times; running further back in time and setting up a website to browse images and data files.
[Figure: case study hindcast of a 15-day ensemble forecast using 7 days of downscaled GEFS with the Snow17/SAC model combination.]
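The locally weighted regression step can be sketched as below: fit a linear regression in which training days are kernel-weighted by their distance from the current predictor state, then predict at that state. The Gaussian kernel, bandwidth, and synthetic predictors are assumptions for illustration; this is not the actual GEFS downscaling code:

```python
import numpy as np

def local_weighted_regression(X_train, y_train, x0, bandwidth=1.0):
    """Locally weighted linear regression: weight each training sample
    by a Gaussian kernel on its distance to the query point x0 in
    predictor space, solve the weighted least-squares problem, and
    return the prediction at x0."""
    X_train = np.asarray(X_train, float)
    X = np.column_stack([np.ones(len(X_train)), X_train])  # add intercept
    d = np.linalg.norm(X_train - x0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    A = X.T @ (w[:, None] * X)            # weighted normal equations
    b = X.T @ (w * np.asarray(y_train, float))
    beta = np.linalg.solve(A, b)
    return float(np.concatenate([[1.0], x0]) @ beta)

# Hypothetical training set: two GEFS-like predictors per day and an
# exactly linear watershed temperature response (for a checkable example).
X_hist = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y_hist = 1.0 + 2.0 * X_hist[:, 0] + 3.0 * X_hist[:, 1]

pred = local_weighted_regression(X_hist, y_hist, np.array([0.5, 0.5]))
```

Run once per watershed, lead time, and ensemble member, this turns coarse reforecast predictors into an ensemble of local precipitation and temperature traces for the hydrologic model.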

23 Concluding Points
This is an exciting time, but also a transitional one:
- a new wealth of resources for hydrologic forecasting
- a long-standing tradition that has difficulty leveraging new science
- the evolution of the forecaster-centered operational paradigm is unclear: in-the-loop to over-the-loop?
Operational proof of concept is critical for R2O:
- testbeds that prove literature methods work in practice
- broad dissemination of results to create awareness and familiarity in the operational community
- end-to-end interaction with decision makers (e.g., reservoir operators)
We welcome community collaborations and method sharing:
- short-range and seasonal-range predictions
- a joint international testbed effort?

24 Acknowledgements & Website
NCAR: Martyn Clark, Pablo Mendoza, Andy Newman
University of Washington: Bart Nijssen
Reclamation: Levi Brekke
US Army Corps of Engineers: Jeff Arnold
Website (in progress): http://www.ral.ucar.edu/projects/hap/flowpredict/

25 Questions? andywood@ucar.edu

