An Extended Procedure for Implementing the Relative Operating Characteristic Graphical Method
Robert J. Mera
March 6, 2008
Climate Modeling Laboratory, Marine, Earth and Atmospheric Sciences (MEAS), North Carolina State University

Outline
Objective and Motivation
Principles of Ensemble Forecasting
Traditional Relative Operating Characteristic (ROC) and Economic Value (EV) analysis
Extended Relative Operating Characteristic (EROC)
Conclusions
Applications and Future Work

Motivation
Climate prediction is becoming increasingly important for different sectors of the economy worldwide.
[Photograph: a case of false alarm/miss. Courtesy: El Universo]

Ensemble forecasting
Ensemble Prediction System (EPS) forecasting is a method used to account for uncertainties and errors in the forecasting system (recall chaos theory).
Through the ensemble approach, one can generate probabilistic forecasts for assessing future events such as excessive rains, droughts, etc.

Contingency Matrix
A decision maker becomes a user of weather forecasts if he/she alters his/her actions based on forecast information.
A cost-loss analysis can be built on a 2x2 contingency matrix in which we evaluate the skill of a probabilistic forecast:

                        Observed: No      Observed: Yes
   EPS forecast: No     No cost           Miss
   EPS forecast: Yes    False alarm       Hit

Hit rate: H = hits / (hits + misses)
False alarm rate: F = false alarms / (false alarms + no-cost cases)
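As a concrete illustration, here is a minimal R sketch of these two rates computed from paired forecast/observation vectors (R is chosen because the CML statistical library mentioned under Applications is in R); the function name and data layout are ours, for illustration only, not the CML routine.

```r
# Minimal sketch (not the CML implementation): 2x2 contingency counts and the
# hit rate / false alarm rate defined on this slide.
contingency <- function(fcst, obs) {
  # fcst, obs: logical vectors (TRUE = event forecast / event observed)
  hits        <- sum(fcst & obs)
  false_alarm <- sum(fcst & !obs)
  misses      <- sum(!fcst & obs)
  no_cost     <- sum(!fcst & !obs)
  c(H = hits / (hits + misses),                 # hit rate
    F = false_alarm / (false_alarm + no_cost))  # false alarm rate
}
```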

Ensemble numbers
From the contingency matrix we compute the hit rate and false alarm rate for an array of ensemble member groups.
Example: with 15 ensemble members, the hit rate and false alarm rate are calculated at each probability threshold 1/15, 2/15, ..., n/N.
We can use this information to analyze the skill of the model, as sketched below.
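The threshold sweep described above can be sketched as follows, reusing the hypothetical contingency helper; the observations and ensemble counts are synthetic, made up purely for illustration.

```r
# Sketch: sweep the ensemble probability thresholds 1/N, 2/N, ..., N/N and
# collect (H, F) for each ensemble group, using the helper above.
roc_points <- function(n_yes, obs, N) {
  # n_yes: number of members forecasting the event, per case
  # obs:   logical, TRUE if the event was observed
  sapply(1:N, function(n) contingency(n_yes >= n, obs))
}

# Hypothetical verification data for a 15-member ensemble
set.seed(1)
obs   <- rbinom(200, 1, 0.3) == 1                  # made-up observations
n_yes <- pmin(15, rpois(200, ifelse(obs, 9, 4)))   # members forecasting the event
hf    <- roc_points(n_yes, obs, N = 15)            # 2 x 15 matrix of H and F
```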

The Relative Operating Characteristic (ROC)
The ROC method is widely used for estimating the skill of ensemble prediction systems (Marzban, 2004).
Using the information in the contingency matrix we can construct the ROC curve; each point on the curve represents a group of ensembles (1/15, 2/15, etc.).
The closer a curve lies to the upper-left-hand corner, the more skillful the forecast system.
A perfect forecast system would have a ROC area (ROCA) of 1.
A system with no capability of distinguishing in advance between different climate events has a score of 0.5, i.e. its curve lies on the diagonal from (0,0) to (1,1).
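One common way to obtain the ROC area from the (H, F) points is trapezoidal integration, sketched below; this is a generic illustration rather than the specific procedure of the paper.

```r
# Sketch: ROC area (ROCA) by trapezoidal integration over the (F, H) points,
# anchored at (0,0) and (1,1); 0.5 indicates no skill, 1 a perfect system.
roc_area <- function(H, F) {
  o  <- order(F, H)
  Fs <- c(0, F[o], 1)
  Hs <- c(0, H[o], 1)
  sum(diff(Fs) * (head(Hs, -1) + tail(Hs, -1)) / 2)
}

roc_area(hf["H", ], hf["F", ])
```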

Utility of Climate Predictions
The ultimate utility of climate forecasts lies in the economic and other benefits associated with their actual use in the daily decision-making of individuals and organizations.
Simplistically, users of climate forecasts either DO or DO NOT take action, but the relative value of the forecasts varies with model performance (i.e. hit rate and false alarm rate).

Economic value (EV) graph based on the ROC graph
The economic value (EV) graphical method measures EPS performance in relative economic terms for a hypothetical range of end users, characterized by their cost/loss ratio (C/L) varying from 0 to 1 (Richardson 2000a,b).
EV computes the relative economic value from the hit rate and false alarm rate; a value of 1 corresponds to a perfect forecast.
For the sake of brevity, we will not discuss the mathematics involved here; a brief sketch is given below.
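For completeness, here is a sketch of the standard cost-loss relative value of Richardson (2000) on which the EV graph is built; mu denotes the user's cost/loss ratio and s the climatological frequency of the event, and the function name is ours, not from the paper.

```r
# Sketch of the standard cost-loss relative economic value (Richardson 2000):
# expense saved by using the forecast, scaled so that 0 = climatology, 1 = perfect.
econ_value <- function(H, F, mu, s) {
  e_climate  <- pmin(mu, s)                                  # cheaper of always/never protecting
  e_forecast <- F * (1 - s) * mu + H * s * mu + (1 - H) * s  # mean expense with the forecast
  e_perfect  <- s * mu                                       # mean expense with a perfect forecast
  (e_climate - e_forecast) / (e_climate - e_perfect)
}

# Value curve across the full range of hypothetical users (mu from 0 to 1)
mu <- seq(0.01, 0.99, by = 0.01)
# plot(mu, econ_value(hf["H", 8], hf["F", 8], mu, s = mean(obs)), type = "l")
```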

EROC Procedure
From the ROC skill alone it is not possible to determine whether a useful level of skill has been achieved for a specific end user.
The EV method is more cumbersome to use, and it can oversimplify the actual situation for many users, who in reality may have an effectively infinite range of available mitigation options.
EV also lacks certain features of ROC that help diagnose specific characteristics of an EPS.
Our goal was to develop an alternative procedure similar to the traditional implementation of the ROC graphical method, but one that also provides an evaluation for a specific end user.

ROC and EV relationship
ROC measures skill only for the cost/loss ratio C/L (μ) at which the value is maximized (V_opt).
This effectively ignores every other user.
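Using the hypothetical helpers above, one can locate the single μ at which a given ROC point attains its maximum value; the choice of ensemble group (column 8) is arbitrary and for illustration only.

```r
# Sketch: for one ensemble group (one ROC point), find the cost/loss ratio mu
# at which the value peaks (V_opt); ROC skill speaks only for that user.
v_curve <- econ_value(hf["H", 8], hf["F", 8], mu, s = mean(obs))
mu[which.max(v_curve)]   # the mu implicitly favoured by a ROC-only assessment
max(v_curve)             # V_opt for this ensemble group
```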

EROC
Specific users are more interested in the economic value related to their own mitigation options.
EROC allows us to build different baselines for different users, as sketched below.
[Figure: baselines for two users, μ = 0.25 and μ = 0.40, alongside the μ of V_opt.]
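A sketch of building such user-specific baselines from the earlier hypothetical pieces: for each user's μ, the value of every ensemble group is evaluated, rather than only the value at the optimal cost/loss ratio.

```r
# Sketch: user-specific baselines. For each user's cost/loss ratio, evaluate the
# relative value of every ensemble group (every ROC point), not just V_opt.
s <- mean(obs)
for (mu_user in c(0.25, 0.40)) {
  v <- econ_value(hf["H", ], hf["F", ], mu_user, s)
  cat(sprintf("mu = %.2f: best ensemble group = %d, value = %.3f\n",
              mu_user, which.max(v), max(v)))
}
```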

What if the end user decides to use a forecast only if the minimum value V_min reaches a certain level? EROC allows for this.
[Figure: baselines for V_min = 0 and V_min = 0.1; notice the shift.]
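Continuing the sketch, one plausible reading of this slide is that a user-chosen V_min filters out the ensemble groups whose value for that user falls below the threshold, which is what shifts the baseline; the code below illustrates that interpretation only.

```r
# Sketch: apply a user-chosen minimum value V_min; only ensemble groups whose
# value for this user meets V_min remain on the user's EROC baseline.
v_user <- econ_value(hf["H", ], hf["F", ], mu = 0.25, s = mean(obs))
keep   <- which(v_user >= 0.1)   # raising V_min from 0 to 0.1 shifts the usable set
keep
```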

Additional advantage
Each curve in the EV plot represents a particular group of ensembles on the ROC plot (i.e. a point on the ROC curve).
EROC preserves ROC's ability to diagnose each ensemble group's skill and its relative value for a specific user.
This opens the possibility of using a smaller number of ensemble members, say 3 or 4 model runs instead of 15 (i.e. a less expensive forecast).

Conclusions
An extended ROC (EROC) procedure has been developed from the traditional ROC and EV graphical methods used for evaluating the performance of ensemble climate/weather prediction systems.
In the proposed EROC approach we recommend constructing user-specific baselines that provide an analysis of both the skill and the value of an EPS forecast, tailored to a specific user.
EROC gives a clearer picture of the minimum value and of each ensemble group's skill for a particular user.

Applications and Future Work
Implementation of a routine to calculate EROC and EV plots in the CML R statistical library (currently in its final testing stage).
You can view the progress on html.
An ongoing project in partnership with NCAR uses WRF-DART simulation output for EROC implementation.

Acknowledgements
Professor Fred Semazzi, Neil Davis, Matt Norman, Richard Anyah, NCSU CLIMLAB
Reference: Semazzi, F. H. M., and R. J. Mera: An Extended Procedure for Implementing the Relative Operating Characteristic Graphical Method. Journal of Applied Meteorology and Climatology, September 2006.

Questions?


Criteria for issuing a forecast
The decision to issue a forecast that an event E will occur is made probabilistically, based on the criterion that the forecast probability reaches a chosen threshold fraction, where:
N: size of the ensemble
n: number of runs in the ensemble for which E actually occurs
p: probability given by the ratio n/N; the event E is predicted to occur when p exceeds the threshold fraction, based on the model forecast
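A one-line sketch of this issuance rule; the function name, argument names, and example numbers are illustrative only.

```r
# Sketch: forecast event E when the ensemble fraction n/N reaches the chosen
# probability threshold.
issue_forecast <- function(n, N, threshold) (n / N) >= threshold

issue_forecast(n = 9, N = 15, threshold = 0.5)   # TRUE: 9/15 = 0.6 >= 0.5
```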