
Verification Methods for High Resolution Model Forecasts
Barbara Brown, NCAR, Boulder, Colorado
Collaborators: Randy Bullock, John Halley Gotway, Chris Davis, David Ahijevych, Eric Gilleland, Beth Ebert
Hurricane Diagnostics and Verification Workshop, May 2009

Goals
- Describe new approaches for evaluation of high-resolution spatial (gridded) forecasts
  - Potentially useful for evaluating hurricane rainfall (and wind) patterns
- The aim of these methods is to:
  - Provide more meaningful information about performance than traditional approaches
  - Overcome some of the insensitivity of traditional approaches

Relevance for tropical cyclones
- Meaningful evaluation of precipitation evolution for land-falling hurricanes
  - Short-term predictions (e.g., extreme events; precipitation distribution; flooding)
  - Total storm precipitation predictions
- Meaningful comparisons of high- and low-resolution models
  - Low resolution tends to "win" using traditional measures even if the pattern is not as good
- Evaluation of oceanic precipitation using satellite
- Evaluation of wind pattern forecasts
- Other??

Hurricane Rita precipitation

Hurricane precipitation verification
Swath-based precipitation verification approach (Marchok et al. 2007):
- QPF pattern matching
- Mean, volume, and distribution of rain values
- Production of extreme amounts

Hurricane Rita precipitation (daily panels, 9/20 through 9/26)

Traditional approach
Consider gridded forecasts and observations of precipitation. Which is better? (Schematic: five example forecasts compared with OBS.)

Traditional approach

Score                       Examples 1-4   Example 5
Correlation Coefficient                    0.20
Probability of Detection    0.00           0.88
False Alarm Ratio           1.00           0.89
Hanssen-Kuipers                            0.69
Gilbert Skill Score (ETS)                  0.08

Forecast 5 is "best".
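The traditional scores listed above (POD, FAR, Hanssen-Kuipers, GSS/ETS) all derive from a 2x2 contingency table of yes/no events at each grid point. A minimal pure-Python sketch; the function name and toy fields are illustrative, not from the talk:

```python
def contingency_scores(fcst, obs, thresh):
    """Traditional categorical scores from paired forecast/observation
    grid values, with an event defined as value >= thresh."""
    hits = misses = false_alarms = correct_negs = 0
    for f, o in zip(fcst, obs):
        fy, oy = f >= thresh, o >= thresh
        if fy and oy:
            hits += 1
        elif oy:
            misses += 1
        elif fy:
            false_alarms += 1
        else:
            correct_negs += 1

    n = hits + misses + false_alarms + correct_negs
    pod = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    # Hanssen-Kuipers discriminant: POD minus probability of false detection
    pofd = (false_alarms / (false_alarms + correct_negs)
            if false_alarms + correct_negs else 0.0)
    hk = pod - pofd
    # Gilbert Skill Score (ETS): hits adjusted for those expected by chance
    hits_random = (hits + misses) * (hits + false_alarms) / n
    denom = hits + misses + false_alarms - hits_random
    gss = (hits - hits_random) / denom if denom else 0.0
    return {"POD": pod, "FAR": far, "HK": hk, "GSS": gss}

# Displaced forecast with no overlap: POD = 0, FAR = 1
obs  = [0, 5, 5, 0, 0, 0]
fcst = [0, 0, 0, 5, 5, 0]
print(contingency_scores(fcst, obs, thresh=1.0))
```

This makes the "double penalty" concrete: a slightly displaced feature is punished once as a miss and again as a false alarm.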

Traditional approach
Some problems with the traditional approach:
1. Non-diagnostic: it doesn't tell us what was wrong with the forecast, or what was right
2. Ultra-sensitive to small errors in the simulation of localized phenomena

Spatial forecasts
Weather variables (e.g., precipitation, wind fields) defined over spatial domains have coherent structure and features.
Spatial verification techniques aim to:
- Account for:
  - Uncertainties in timing and location
  - Spatial structure
- Provide information that is:
  - Able to characterize error in physical terms
  - Diagnostic
  - Meaningful to forecast users

Spatial verification approaches
Filtering:
1. Neighborhood
2. Scale separation
Displacement:
3. Feature-based
4. Field deformation

New spatial verification approaches
- Field deformation approaches
  - Measure distortion and displacement (phase error) for the whole field
  - How should the forecast be adjusted to make the best match with the observed field?
- Scale decomposition methods
  - Measure scale-dependent error
- Neighborhood verification methods
  - Give credit to "close" forecasts
- Object- and feature-based methods
  - Evaluate attributes of identifiable features
From Keil and Craig, 2008

Intensity-scale method (Casati et al. 2004)
Evaluate forecast skill as a function of the precipitation intensity and the spatial scale of the error.

Scale → wavelet decomposition of the binary error field into components from scale l=1 (5 km) up to l=8 (640 km), plus the mean (1280 km). (From Ebert 2008)
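The scale decomposition above can be sketched in pure Python. This is a simplified illustration of the Casati et al. (2004) idea, using Haar block averages on a power-of-two grid; the mean component is left out, and all names are illustrative:

```python
def block_average(field, b):
    """Average a 2^k x 2^k field over b x b blocks, then expand back to
    full resolution (the Haar smooth at that scale)."""
    n = len(field)
    out = [[0.0] * n for _ in range(n)]
    for i0 in range(0, n, b):
        for j0 in range(0, n, b):
            m = sum(field[i][j] for i in range(i0, i0 + b)
                                for j in range(j0, j0 + b)) / (b * b)
            for i in range(i0, i0 + b):
                for j in range(j0, j0 + b):
                    out[i][j] = m
    return out

def intensity_scale_mse(fcst, obs, thresh):
    """MSE of the binary error field, partitioned by spatial scale via
    Haar wavelet components (mean component omitted)."""
    n = len(fcst)
    z = [[(1 if fcst[i][j] >= thresh else 0) -
          (1 if obs[i][j] >= thresh else 0) for j in range(n)]
         for i in range(n)]
    mses = []
    prev = z
    b = 2
    while b <= n:
        smooth = block_average(z, b)  # nested partitions, so means compose
        comp_mse = sum((prev[i][j] - smooth[i][j]) ** 2
                       for i in range(n) for j in range(n)) / n ** 2
        mses.append(comp_mse)  # contribution of the scale between b/2 and b
        prev = smooth
        b *= 2
    return mses

obs = [[0] * 4 for _ in range(4)]; obs[0][0] = 5
fcst = [[0] * 4 for _ in range(4)]; fcst[0][2] = 5
print(intensity_scale_mse(fcst, obs, 1))  # error split across two scales
```

Because Haar components are orthogonal, the per-scale MSEs (plus the squared field mean) sum to the total MSE of the binary error field, so a displacement error shows up at the scales it actually projects onto.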

MSE skill score as a function of intensity and spatial scale, with the sample climatology (base rate) indicated. (From Ebert 2008)

Neighborhood verification
- Also called "fuzzy" verification
- Upscaling:
  - Put observations and/or forecast on a coarser grid
  - Calculate traditional metrics
- Provides information about the scales where the forecasts have skill
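A minimal sketch of the upscaling idea, assuming square power-of-two grids and one simple convention (thresholding the block-mean amounts; in practice one may instead threshold first and compare event fractions). Names are illustrative:

```python
def upscale(field, b):
    """Coarsen a grid by averaging over b x b blocks (upscaling)."""
    n = len(field)
    return [[sum(field[i][j] for i in range(i0, i0 + b)
                             for j in range(j0, j0 + b)) / (b * b)
             for j0 in range(0, n, b)] for i0 in range(0, n, b)]

def pod_at_scale(fcst, obs, thresh, b):
    """Upscale both fields, then score events (block mean >= thresh) on
    the coarse grid: a hit now requires only rough spatial agreement."""
    cf, co = upscale(fcst, b), upscale(obs, b)
    hits = sum(1 for frow, orow in zip(cf, co)
               for f, o in zip(frow, orow) if f >= thresh and o >= thresh)
    events = sum(1 for orow in co for o in orow if o >= thresh)
    return hits / events if events else float("nan")

# Obs event at (0,0), forecast event displaced to (0,1): a miss at the
# native grid scale, but a hit once both fields are averaged to 2x2 blocks.
obs = [[0.0] * 4 for _ in range(4)]; obs[0][0] = 4.0
fcst = [[0.0] * 4 for _ in range(4)]; fcst[0][1] = 4.0
print(pod_at_scale(fcst, obs, 4.0, 1))  # native grid: POD = 0.0
print(pod_at_scale(fcst, obs, 1.0, 2))  # 2x2 blocks: POD = 1.0
```

Repeating this over a range of block sizes shows the smallest scale at which the forecast has useful skill.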

Neighborhood methods: the Fractions Skill Score (FSS). (From Mittermaier 2008)
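The FSS compares fractional event coverage within neighborhoods around each point, rather than point-by-point matches. A small pure-Python sketch, with edge neighborhoods simply clipped at the domain boundary; names are illustrative:

```python
def fss(fcst, obs, thresh, n_size):
    """Fractions Skill Score: compare event fractions within square
    neighborhoods of width n_size centered on each grid point."""
    n = len(fcst)
    half = n_size // 2

    def fraction(field, i, j):
        # Event fraction in the (clipped) neighborhood around (i, j)
        cells = [(ii, jj)
                 for ii in range(max(0, i - half), min(n, i + half + 1))
                 for jj in range(max(0, j - half), min(n, j + half + 1))]
        return sum(1 for ii, jj in cells if field[ii][jj] >= thresh) / len(cells)

    num = den = 0.0
    for i in range(n):
        for j in range(n):
            pf, po = fraction(fcst, i, j), fraction(obs, i, j)
            num += (pf - po) ** 2       # squared fraction difference
            den += pf ** 2 + po ** 2    # reference (no-overlap) error
    return 1.0 - num / den if den else float("nan")

obs = [[0] * 4 for _ in range(4)]; obs[0][0] = 5
fcst = [[0] * 4 for _ in range(4)]; fcst[0][1] = 5
print(fss(fcst, obs, 1, 1))  # point-by-point: 0.0 for the displaced event
print(fss(fcst, obs, 1, 3))  # neighborhood of 3: partial credit
```

FSS is 1 for a perfect forecast and 0 for a forecast with no neighborhood overlap; it typically increases with neighborhood size, which is how it reveals the scale at which a forecast becomes skillful.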

Feature-based verification
- Composite approach (Nachamkin)
- Contiguous rain area approach (CRA; Ebert and McBride, 2000; Gallus and others)
  - Error components: displacement, volume, pattern

Spatial verification method: MODE
MODE: Method for Object-based Diagnostic Evaluation
- Goals:
  - Mimic how a human would identify storms and evaluate forecasts
  - Measure forecast "attributes" that are of interest to users
- Steps:
  - Identify objects
  - Measure attributes
  - Match forecast attributes
  - Measure differences in attributes
- User inputs: object identification; attribute selection and weighting
Example: Precipitation, 1 Jun 2005 (Forecast vs. Observed)
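The object-identification and attribute-measurement steps can be illustrated with simple connected-component labeling. MODE itself identifies objects by convolving (smoothing) and then thresholding the field; the flood fill below is a simplified stand-in for that step, and all names are illustrative:

```python
from collections import deque

def identify_objects(field, thresh):
    """Label 4-connected regions of grid points with value >= thresh.
    Returns (labels grid, number of objects found)."""
    n, m = len(field), len(field[0])
    labels = [[0] * m for _ in range(n)]
    count = 0
    for i in range(n):
        for j in range(m):
            if field[i][j] >= thresh and labels[i][j] == 0:
                count += 1
                labels[i][j] = count
                q = deque([(i, j)])  # flood fill from this seed point
                while q:
                    ci, cj = q.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = ci + di, cj + dj
                        if (0 <= ni < n and 0 <= nj < m
                                and field[ni][nj] >= thresh
                                and labels[ni][nj] == 0):
                            labels[ni][nj] = count
                            q.append((ni, nj))
    return labels, count

def object_attributes(field, labels, k):
    """Simple attributes of object k: area and centroid location."""
    cells = [(i, j) for i in range(len(field))
             for j in range(len(field[0])) if labels[i][j] == k]
    area = len(cells)
    ci = sum(i for i, _ in cells) / area
    cj = sum(j for _, j in cells) / area
    return {"area": area, "centroid": (ci, cj)}

field = [[5, 5, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 3, 3],
         [0, 0, 3, 0]]
labels, n_obj = identify_objects(field, 1)
print(n_obj)                                # 2 objects
print(object_attributes(field, labels, 1))  # area 2, centroid (0.0, 0.5)
```

Matching then compares attributes such as centroid displacement, area ratio, and orientation between forecast and observed objects, combined into an overall interest value via user-chosen weights.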

Object-based example: 1 June 2005
MODE quantitative results indicate:
- Most forecast areas too large
- Forecast areas slightly displaced
- Median and extreme intensities too large
- BUT, overall, the forecast is pretty good
In contrast, the traditional scores: POD = 0.40, FAR = 0.56, CSI = 0.27
(Forecast precipitation objects with diagnosed objects overlaid)

Back to the original example...
- The MODE "interest" value measures the overall ability of forecasts to match obs
- Interest values provide more intuitive estimates of performance than the traditional measure (ETS)
- But, even for spatial methods, single measures don't tell the whole story!

Spatial Method Intercomparison Project (ICP)
- Goal: compare the various approaches using the same datasets (real, geometric, known errors)
- Includes all of the methods described here; international participants
- Collection of papers in preparation (Weather and Forecasting)

What do the new methods measure?

Applicability to ensemble forecasts (From C. Davis)

Statistical inference
Confidence intervals are required to provide:
- Meaningful evaluations of individual model performance
- Meaningful comparisons of model performance
(Figure: scores by threshold; 24-h QPF, 24-h lead)
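One common way to obtain such confidence intervals is a percentile bootstrap over resampled forecast-observation pairs. A sketch with made-up data (the pairs and function names are illustrative, not from the talk):

```python
import random

def bootstrap_ci(pairs, score, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a verification score
    computed on a list of (forecast, observation) pairs."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    stats = []
    for _ in range(n_boot):
        # Resample the pairs with replacement and re-score
        resample = [pairs[rng.randrange(len(pairs))] for _ in pairs]
        stats.append(score(resample))
    stats.sort()
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Example: CI for the mean error of made-up paired values
pairs = [(2.0, 1.5), (0.5, 1.0), (3.0, 2.0), (1.0, 1.2), (2.5, 2.4), (0.0, 0.3)]
mean_error = lambda p: sum(f - o for f, o in p) / len(p)
lo, hi = bootstrap_ci(pairs, mean_error)
print(f"mean error = {mean_error(pairs):.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

If the intervals for two models overlap heavily, an apparent difference in a single score value may not be meaningful. (For gridded fields, spatial correlation means pairs are not independent, so block or case-level resampling is often used instead of point-level resampling.)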

Method availability
- Many methods are available as part of the Model Evaluation Tools (MET):
  - MODE
  - Neighborhood
  - Intensity-scale
- MET is freely available, with strong user support
- Software for some of the other methods is available on the intercomparison website or from the original developers

Conclusion
- New spatial methods provide great opportunities for more meaningful evaluation of precipitation forecasts, and other forecast fields
  - Feed back into forecast development
  - Provide information to users
- Each method is useful for particular types of situations and for answering particular types of questions

Topics for discussion
- Consider how the new methods may be beneficial (and adaptable) for high-resolution NWP for hurricanes
  - Can these methods help?
  - How do they need to be adapted/altered?
  - Would they be useful for other fields (e.g., winds)?
  - Are other kinds of new methods needed?
- Use of aircraft observations: incomplete grids (reflectivity)
- Methods for evaluation of genesis? A global problem... Need to consider false alarms as well as misses
- Use of confidence intervals
(From Bender et al. 2007)
