Evaluation of Potential Performance Measures for the Advanced Hydrologic Prediction Service
Gary A. Wick, NOAA Environmental Technology Laboratory
On Rotational Assignment with the NWS Office of Hydrologic Development
November 2003

Overview
- Performance measures for the Advanced Hydrologic Prediction Service (AHPS)
- Review of probabilistic forecast verification measures
- Trial application with operational forecast data
- Recommendations

AHPS Program Performance
[Diagram relating AHPS science activities and deployment to program performance measures]
- Existing measures: AHPS deployment (number of forecast points, coverage area) and science projects (number of science tools deployed per year)
- Future measures: development activity effectiveness (contribution to forecast maturity, contribution to information content) and forecast maturity, e.g. metrics addressing usage of forecast information and probabilistic forecast effectiveness

Probabilistic Forecast Verification
- Categorical forecasts
- Brier Score
- Rank Probability Score (RPS)

Categorical Forecasts
- Transforms a probabilistic forecast into a categorical forecast through selection of a probability threshold (illustrated in the sketch below)
- Simple, but does not make full use of the forecast probabilities
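
As a concrete illustration, the sketch below applies a probability threshold to forecast flood probabilities and tallies the resulting yes/no forecasts against observations; the 50% threshold and all values are hypothetical, not taken from the presentation.

    # Minimal sketch: converting probabilistic forecasts to categorical forecasts.
    # The probability threshold and example values are illustrative only.

    def to_categorical(prob_exceed, prob_threshold=0.5):
        """Forecast the event if its probability meets the chosen cutoff."""
        return prob_exceed >= prob_threshold

    probs    = [0.10, 0.65, 0.40, 0.90]    # forecast probabilities of flooding
    observed = [False, True, False, True]  # whether flooding occurred

    hits = misses = false_alarms = correct_negatives = 0
    for p, occurred in zip(probs, observed):
        forecast_yes = to_categorical(p)
        if forecast_yes and occurred:
            hits += 1
        elif not forecast_yes and occurred:
            misses += 1
        elif forecast_yes and not occurred:
            false_alarms += 1
        else:
            correct_negatives += 1

    print(hits, misses, false_alarms, correct_negatives)  # 2 0 0 2 here

Once the forecasts are categorical, standard contingency-table statistics apply, but the information carried by the original probabilities is discarded.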

Brier Score
- Simple extension that fully characterizes probabilistic forecasts
- Limited to occurrence of a specific event
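
The Brier score is the mean squared difference between the forecast probability of the event and its observed occurrence (1 if it occurred, 0 if not), so 0 is perfect and larger values are worse. A minimal sketch with made-up numbers:

    # Brier score: BS = (1/N) * sum((p_i - o_i)**2), where o_i is 1 if the
    # event occurred and 0 otherwise. Values below are illustrative only.

    def brier_score(probs, outcomes):
        return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

    probs    = [0.10, 0.65, 0.40, 0.90]  # forecast probabilities of flood-stage exceedance
    outcomes = [0, 1, 0, 1]              # observed exceedance (1 = occurred)

    print(round(brier_score(probs, outcomes), 3))  # about 0.076 for these numbers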

Rank Probability Score
- Extension characterizing the full distribution of forecasts
- Ideal as a science measure, but the added complexity is a concern at the program level
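
The RPS extends the Brier score to several ordered categories (for example, flow terciles): it accumulates the squared differences between the cumulative forecast probabilities and the cumulative observation across categories. A minimal sketch of one common normalized form, with illustrative numbers:

    # Ranked probability score for a single forecast with K ordered categories:
    # RPS = (1/(K-1)) * sum_k (cumulative forecast prob. - cumulative obs.)**2
    # Category definitions and probabilities below are illustrative only.

    def rps(category_probs, observed_category):
        k = len(category_probs)
        cum_forecast = cum_observed = total = 0.0
        for i, p in enumerate(category_probs):
            cum_forecast += p
            cum_observed += 1.0 if i == observed_category else 0.0
            total += (cum_forecast - cum_observed) ** 2
        return total / (k - 1)

    # Three flow categories (below/near/above normal); observation in the last one.
    print(round(rps([0.2, 0.5, 0.3], observed_category=2), 3))  # 0.265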

Application of Accuracy Measures
- Deterministic Measures
- Probabilistic Brier Score

Deterministic Application
- Used National Weather Service verification database
  – Monthly data for 177 sites starting April 2001
  – Results computed “on-the-fly”
- Evaluated accuracy difference between AHPS and non-AHPS points
  – Considered subset of points for the North Central, Ohio, and Missouri Basin River Forecast Centers
- Expressed in terms of mean absolute error and root mean square error
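
Both error measures are simple functions of paired forecast and observed values; a minimal sketch (illustrative stage values, not data from the verification database):

    import math

    # Mean absolute error and root mean square error for paired
    # forecast/observed values. Numbers below are illustrative only.

    def mae(forecast, observed):
        return sum(abs(f - o) for f, o in zip(forecast, observed)) / len(forecast)

    def rmse(forecast, observed):
        return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(forecast))

    fcst = [12.1, 14.3, 9.8, 11.0]   # forecast stage (ft)
    obs  = [11.8, 15.0, 10.2, 10.5]  # observed stage (ft)

    print(round(mae(fcst, obs), 2), round(rmse(fcst, obs), 2))  # 0.48 0.5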

Deterministic Results

Implications: Deterministic
- Possible to implement something rapidly
- Characterization must be defined
- Existing verification database and interface inadequate

Brier Score Evaluation
- Sample ensemble forecasts and verification provided by Kristie Franz
  – 43 sites from the Ohio River Forecast Center
  – 11 weekly mean and monthly maximum exceedance forecasts
  – Forecast traces, verification, and historical data
- Evaluated accuracy of forecasts for exceedance of flood stage
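
The exceedance probabilities being verified come from counting ensemble members; the sketch below shows that step for a single week, using a hypothetical flood stage and trace maxima rather than the Ohio River Forecast Center data:

    # Forecast probability of exceeding flood stage: the fraction of ensemble
    # members whose weekly maximum stage reaches the threshold.
    # Trace maxima and the 20 ft flood stage are hypothetical.

    def exceedance_probability(member_maxima, flood_stage):
        exceeding = sum(1 for m in member_maxima if m >= flood_stage)
        return exceeding / len(member_maxima)

    weekly_max_by_member = [18.2, 21.5, 19.9, 22.3, 17.6, 20.1]
    print(exceedance_probability(weekly_max_by_member, flood_stage=20.0))  # 0.5

These probabilities, paired with the observed outcome for each site and week, feed directly into the Brier score above.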

Brier Score Evaluation
- Use of all forecasts suggests very high accuracy
- Only a 17% improvement over forecasts of no flooding
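
The 17% figure is most naturally read as a Brier skill score relative to a reference forecast that always assigns zero probability of flooding; the sketch below works through that calculation with hypothetical Brier scores, since the actual values are not given on the slide:

    # Brier skill score: BSS = 1 - BS_forecast / BS_reference.
    # With a constant "no flooding" (probability 0) reference, BS_reference
    # equals the observed flood frequency. Numbers are hypothetical and were
    # chosen only so that the result comes out to a 17% improvement.

    bs_reference = 0.030   # e.g., flooding observed in 3% of cases
    bs_forecast  = 0.0249  # hypothetical Brier score of the ensemble forecasts

    bss = 1.0 - bs_forecast / bs_reference
    print(f"{bss:.0%}")    # 17%

Because flooding is rare, even the trivial "no flooding" reference scores well, which is presumably why verifying over all forecasts makes the ensemble look deceptively accurate.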

Brier Score Evaluation
- Most revealing results obtained for forecasts where flooding occurred

Rank Probability Score
- Could be interpreted as an accuracy of 76 to 91%
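
One plausible reading of this range (an assumption, since the slide does not define the conversion) is that percent accuracy is simply 100 × (1 − mean RPS), so mean RPS values of roughly 0.24 and 0.09 would map to the quoted 76% and 91%:

    # Hypothetical mapping from mean RPS to a percent-accuracy style figure:
    # accuracy = 100 * (1 - mean RPS). This interpretation is an assumption,
    # not something defined in the presentation.

    def rps_to_percent_accuracy(mean_rps):
        return 100.0 * (1.0 - mean_rps)

    for mean_rps in (0.24, 0.09):
        print(f"mean RPS {mean_rps:.2f} -> {rps_to_percent_accuracy(mean_rps):.0f}% accuracy")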

Implications: Probabilistic
- Brier score can be presented simply and meaningfully
- Application limited by constraint to instances of flooding
- Rank probability score addresses all forecasts but meaning is harder to express
- Necessary to regularly archive ensemble forecasts and verification

Recommendations
- Performance measures
- Data collection
- Additional analyses

Recommended AHPS Accuracy Measures
- Deterministic River Forecast Accuracy
- Probabilistic River Forecast Accuracy
- Flood Forecast Accuracy

AHPS Deterministic River Forecast Accuracy
- Percent accuracy of mean daily streamflow for days 1-3
- Evaluates short-term hydrograph forecasts

AHPS Probabilistic River Forecast Accuracy
- RPS-derived accuracy of weekly mean streamflow exceedance
- Evaluates AHPS weekly chance-of-exceedance forecasts
- Express as percent accuracy
- Apply to week 2 and week 4 forecasts

AHPS Flood Forecast Accuracy
- Derived from Brier score and weekly maximum stage forecasts
- Simple evaluation of weekly exceedance forecasts
- Express as percent accuracy for cases where flooding occurred
- Apply to week 2 and week 4 forecasts

Recommended Data Archival
- Forecast Data
  – Ensemble forecast traces at selected points
- Verification
  – Corresponding stage/streamflow observations
- Historical Data
  – Ensure consistency with forecast quantities

Further Analysis
- Apply proposed measures to an enhanced set of archived data
- Evaluations will help illustrate where forecast skill exists and where improvements are possible
- Explore possible alternatives for collection of climatological data
- Final metric selection is best made after a more comprehensive evaluation

Closing Notes
- Accuracy measures provide a bridge between programmatic and science activities and metrics
- Important to recognize limitations of accuracy measures and continue consideration of other metrics