Addressing the environmental impact of salt use on roads



8th Technical Supervisory Board – Trento, 27/04/2015

Introduction
- 2014-2015 winter
- General considerations (RSTs, frost/ice conditions)
- Reuter's model
- Verification through contingency tables (some theory first... then results!)

RST Summary – Winter 2014-2015
Number of days with: RST > 0°C, RST ≤ 0°C, NA
- Coldest → Rocchetta / Casa Cantoniera
- Warmest → Cadino / Acquaviva
- Coldest month

Ice/Frost Formation – Winter 2014-2015
Data: from 15th November 2014 to 15th April 2015*
Applied conditions:
- FROST ↔ TRST ≤ Tdew point < 0
- ICE ↔ TRST ≤ Tdew point, TRST ≤ 0 and Tdew point ≥ 0
Tdew point used = Tdew point as provided + 1°C**
* RST sensor properly installed on the A22 flyover on December 23rd
** To compensate for measurement errors that might be introduced at high RH values
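The two conditions above can be expressed as a small classifier. A minimal sketch, assuming the +1°C dew-point correction is folded into the function; the function name and the `dew_correction` parameter are illustrative, not taken from the project code:

```python
def classify_surface(t_rst, t_dew, dew_correction=1.0):
    """Classify the road surface from road surface temperature (TRST)
    and dew-point temperature, following the slide's criteria.

    dew_correction: the +1 degC offset applied to the provided dew point
    to compensate for measurement errors at high RH (per the slide).
    """
    t_dew = t_dew + dew_correction
    if t_rst <= t_dew < 0:                     # FROST condition
        return "FROST"
    if t_rst <= t_dew and t_rst <= 0 <= t_dew: # ICE condition
        return "ICE"
    return "NONE"
```

For example, a surface at -2°C under a (corrected) dew point of +0.5°C is classified as ICE, while both below zero gives FROST.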

Forecast verification
Verification is the process of comparing forecasts to relevant observations. It is one aspect of measuring forecast goodness (the quality of the forecast, plus the user and his/her application of the forecast information).
Contingency table: a display format used to analyse and record the relationship between two or more categorical variables, in this case the forecast and the observation of a weather forecast element. It is the categorical equivalent of the scatterplot used to analyse the relationship between two continuous variables.

Contingency tables
2x2 contingency table (tornado example)
Categorical scores:
- Hit Rate = a/(a+c)
- False Alarm Ratio = b/(a+b)
- ... and many more!
BUT: what if the data are not binary (event occurred / event not occurred)?
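The scores above come straight from the 2x2 cell counts. A minimal sketch using the conventional a/b/c/d naming (a = hits, b = false alarms, c = misses, d = correct negatives); `categorical_scores` is a hypothetical helper, and POFD is included here because it is needed later for the ROC:

```python
def categorical_scores(a, b, c, d):
    """Common scores from a 2x2 contingency table.

    a = hits, b = false alarms, c = misses, d = correct negatives.
    """
    return {
        "hit_rate": a / (a + c),           # POD: fraction of events forecast
        "false_alarm_ratio": b / (a + b),  # FAR: fraction of "yes" forecasts that failed
        "pofd": b / (b + d),               # probability of false detection
    }
```

For a table with 30 hits, 10 false alarms, 20 misses and 40 correct negatives, the hit rate is 0.6 and the false alarm ratio 0.25.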

Contingency tables
When your data are not binary you have to (1) define a threshold and (2) pick a threshold that is meaningful to your end users.
Contingency table for freezing temperatures (i.e. RST ≤ 0°C)

Contingency tables
That is especially true for the "Observed" categories: "Ice" when RST ≤ 0°C, "No ice" when RST > 0°C.
For the "Forecast" categories we had better not be so deterministic:
- overnight changes in the RST decay trend
- Reuter's model works best in certain conditions only
Our forecast is uncertain, and this uncertainty must be taken into account → PROBABILITY FORECASTING

Contingency tables
How to construct:
- Bin your data
- Calculate PODY and POFD by moving through the bins
- Plot using a scatter plot

Contingency tables
How to construct:
- Bin your data
- Calculate scores by moving through the bins (a different threshold each time!)

Forecast Verification
How to construct:
- Bin your data
- Calculate scores by moving through the bins (a different threshold each time!)
- Plot, plot, plot!
Use a set of increasing probability thresholds (for example 0.05, 0.15, 0.25, etc.) to make yes/no decisions.
Relative Operating Characteristic (ROC) graph: what is the ability of the forecast to discriminate between events and non-events? (conditioned on the observations)
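The threshold sweep described above can be sketched as follows: each probability threshold turns the probabilistic forecast into a yes/no forecast, yielding one (POFD, PODY) point of the ROC curve. `roc_points` is an illustrative name, not part of any verification package:

```python
import numpy as np

def roc_points(prob_forecasts, observed, thresholds):
    """(POFD, PODY) pairs for a set of probability thresholds.

    observed: 1 where the event occurred, 0 otherwise.
    """
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(observed).astype(bool)
    points = []
    for thr in thresholds:
        yes = p >= thr                    # yes/no decision at this threshold
        hits = np.sum(yes & o)
        misses = np.sum(~yes & o)
        false_alarms = np.sum(yes & ~o)
        correct_neg = np.sum(~yes & ~o)
        pody = hits / (hits + misses)
        pofd = false_alarms / (false_alarms + correct_neg)
        points.append((pofd, pody))
    return points
```

Plotting the points for increasing thresholds traces the ROC curve from the upper right towards the lower left; a curve bowed towards the top-left corner indicates good discrimination.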

Forecast Verification
Attributes diagram: how well do the predicted probabilities of an event correspond to their observed frequencies? (conditioned on the forecasts)
- Points below the line = overforecasting (probabilities too high)
- Points above the line = underforecasting (probabilities too low)
Brier Score: what is the magnitude of the probability forecast errors?
Brier Skill Score: what is the relative skill of the probabilistic forecast over that of climatology, in terms of predicting whether or not an event occurred?
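A minimal sketch of the Brier score and Brier skill score, taking the sample base rate as the climatological reference forecast (an assumption; an external climatology could be used instead):

```python
import numpy as np

def brier_score(p, o):
    """Mean squared error of probability forecasts p against
    binary outcomes o (0 or 1). Lower is better; 0 is perfect."""
    p, o = np.asarray(p, dtype=float), np.asarray(o, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p, o):
    """Skill relative to a constant climatology forecast (here the
    sample base rate): 1 = perfect, 0 = no better than climatology,
    negative = worse than climatology."""
    o = np.asarray(o, dtype=float)
    bs_clim = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - brier_score(p, o) / bs_clim
```

A constant 0.5 forecast of a 50%-frequency event scores BS = 0.25, the worst value a calibrated climatological forecast can take.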

Forecast Verification
Relative value diagram: for a cost/loss ratio C/L for taking action based on a forecast, what is the relative improvement in economic value between climatological and perfect information?
Relative value (also known as the value score) = a skill score of expected expense, with climatology as the reference forecast. Because the cost/loss ratio differs between users of forecasts, the value is generally plotted as a function of C/L.

Forecast Verification
Like the ROC, the relative value diagram gives information that can be used in decision making:
- compute relative value curves for the entire range of probabilities
- select the optimal values (the upper envelope of the relative value curves) to represent the value of the probabilistic forecast system
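These two steps can be sketched with the standard cost/loss formulation of relative value, with climatology as the reference; this is a generic textbook version, not necessarily the exact computation behind the slides, and both function names are illustrative:

```python
import numpy as np

def relative_value(a, b, c, d, cost_loss):
    """Relative (economic) value of a yes/no forecast for a given
    cost/loss ratio r = C/L, with climatology as the reference.

    a, b, c, d: hits, false alarms, misses, correct negatives,
    expressed as relative frequencies (they sum to 1).
    """
    s = a + c                          # base rate of the event
    r = cost_loss
    e_forecast = (a + b) * r + c       # expense when following the forecast
    e_clim = min(r, s)                 # best fixed strategy: always/never protect
    e_perfect = s * r                  # expense with perfect knowledge
    return (e_clim - e_forecast) / (e_clim - e_perfect)

def value_envelope(tables, cost_loss_ratios):
    """Upper envelope over the relative-value curves of several
    probability thresholds; `tables` is a list of (a, b, c, d)."""
    curves = np.array([[relative_value(*t, r) for r in cost_loss_ratios]
                       for t in tables])
    return curves.max(axis=0)
```

Each probability threshold yields one contingency table and thus one curve over C/L; taking the pointwise maximum across thresholds gives the envelope that represents the probabilistic system as a whole.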

Reuter's Forecast Verification
Aim: verify Reuter's "frost" forecast on nights when the minimum RST is 5°C or below.
Procedure: the minimum RST for each winter night is noted at each forecast site and compared with the minimum RSTs forecast overnight by Reuter's model. Results are entered into a contingency table where the forecast categories are probabilities!
Results: ...

Reuter's Forecast Verification
Number of days with: 0 < RST ≤ 5°C; RST ≤ 0°C

Reuter's Forecast Verification
Results table for overnight Reuter's forecasts: date, measured RSTmin (at a given time t), forecast RST at time t (2nd time point in Reuter's interpolation)

Reuter's Forecast Verification
Contingency table for every overnight Reuter's forecast: date, measured RSTmin (at a given time t), forecast RST at time t (2nd time point in Reuter's interpolation)
Overall contingency table, with "Ice" defined as RST ≤ 0°C. This is the starting point from which you can build the verification plots described before.

Reuter's Forecast Verification – CADINO
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.31
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly underforecasting)
- has small probability-associated errors
- has good predictability skill with respect to climatology
- gives a 66% economic benefit if you use 20% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – SAN MICHELE
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.41
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly overforecasting)
- has small probability-associated errors
- has good predictability skill with respect to climatology
- gives a 31% economic benefit if you use 20% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – ACQUAVIVA
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.31
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly overforecasting)
- has small probability-associated errors
- has good predictability skill with respect to climatology
- gives a 32% economic benefit if you use 20% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – CASA CANTONIERA
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.48
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly underforecasting)
- has small probability-associated errors
- has good predictability skill with respect to climatology
- gives a 30% economic benefit if you use 1% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – ROCCHETTA
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.50
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly underforecasting)
- has decreasing probability-associated errors
- has good predictability skill with respect to climatology
- gives a 40% economic benefit if you use 20% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – VIADOTTO
[Verification plots with climatology / no-skill / no-resolution reference lines; probability thresholds 1%, 20%, 80% highlighted]
Station climatological frequency = 0.48
The forecast:
- distinguishes well between events and non-events
- has good reliability (slightly underforecasting)
- has decreasing probability-associated errors
- has an increasing predictability skill with respect to climatology and with respect to interpolation times
- gives a 10% economic benefit if you use 20% as your threshold and C/L = 1/8 = 0.125 (Thornes 2001)

Reuter's Forecast Verification – Conclusions
- Cadino / San Michele / Acquaviva: same results (similar RST and probability thresholds)
- Casa Cantoniera: results similar to the C-SM-A group, but higher economic benefits for lower probability thresholds (thus a higher RST threshold)
- Rocchetta / Viadotto: results different from the C-SM-A group (different RST thresholds), reflecting a different climatology (e.g. the RST threshold at 20% is 2°C and 3°C respectively, i.e. RST decreases more rapidly overnight)

Reuter's Forecast Verification – Observations
[Observation plots for Rocchetta* and Cadino; * similarly for Viadotto]
Föhn on Jan 12th, 2015

Deliverables
As far as the Weather Service is concerned:
- D.B1.1 v2.4: standing from FAMAS (a few integrations suggested by Roberto C. to be inserted)
- D.B2.4 v2.3: done
- D.B2.7 v2.2: insert results from the 2014-2015 winter (Reuter + bulletins)

What's next?
- Conclude the analysis of 2014-2015 winter data (probabilistic bulletins)
- Write up the deliverables!
Resources on forecast verification:
- Fowler T.L. et al., Introduction to Forecast Verification, University Corporation for Atmospheric Research, 2012
- Kryjov V., Probabilistic downscaling: verification methods and metrics. Methods for probabilistic forecasts, NEACC training portal, neacc.meteoinfo.ru/training/103-lecture-4-on-forecast-downscaling
- Wilson L. et al., Forecast Verification, Eumetcal learning module, www.eumetcal.org/resources/ukmeteocal/verification/www/english/courses/msgcrs/index.htm

Thank you for your attention
Provincia Autonoma di Trento, Servizio Prevenzione Rischi
Ufficio Previsioni e Pianificazione – Meteotrentino
Claudia Di Napoli claudia.dinapoli@provincia.tn.it
Andrea Piazza andrea.piazza@provincia.tn.it
www.clean-roads.eu