
Page 1 © Crown copyright 2004
Presentation to ECMWF Forecast Product User Meeting, 16th June 2005

Page 2 © Crown copyright 2004
Met Office use and verification of the monthly forecasting system
Bernd Becker, Richard Graham
- Met Office monthly forecast suite
- Products from the Monthly Outlook
- Standard Verification System

Page 3 © Crown copyright 2004
Monthly Forecasting System
- Coupled ocean-atmosphere integrations: a 51-member ensemble is integrated for 32 days every week.
- Atmospheric component: IFS with the latest operational cycle 29r1, at TL159L40 resolution (320 x 161).
- Oceanic component: HOPE (from the Max Planck Institute), with a zonal resolution of 1.4 degrees and 29 vertical levels.
- Coupling: OASIS (CERFACS), coupling every ocean time step (1 hour).
- Perturbations: atmosphere: singular vectors + stochastic physics; ocean: SST perturbations in the initial conditions + wind stress perturbations during data assimilation.
- Hindcast statistics: a 5-member ensemble integrated over 32 days for each of the past 12 years, representing a 60-member ensemble; run every week.

Page 4 © Crown copyright 2004
FORMOST global PDF data: properties of the quintile PDF
[Diagram: hindcast min/max and quintile boundaries qb1-qb4 define the five categories (well below, below, normal, above, well above); forecast min/mean/max and quintile means qm1-qm5 are shown against them.]

Page 5 © Crown copyright 2004
FORMOST Monthly Outlook

Page 6 © Crown copyright 2004
Example UK day temperature forecast for 10 climate districts
- Deterministic forecast (based on most probable category or ensemble mean)
- Probability forecast
- Verification
"…a sudden change to below or well-below average temperatures is expected…" (forecast text issued 11th Feb for week of Feb)

Page 7 © Crown copyright 2004
Example global capability: tercile probability forecast, Europe, days 12-18
[Maps: P(abv), P(avg), P(blw) for Tmean and Precip, with verification from ECMWF operations.]

Page 8 © Crown copyright 2004
[Maps: observed T anomaly (Jan) alongside forecasts for week 1, week 2 and weeks 3&4.]

Page 9 © Crown copyright 2004
Verification
- May-April forecast/observation pairs of temperature and precipitation.
- Verifying observations: ECMWF short-range (12-36 hrs) forecasts over the period.
- Global forecasts: Relative Operating Characteristics for quintile forecasts, reliability diagrams, Brier skill score decomposition.

Page 10 © Crown copyright 2004
Relative Operating Characteristics
- The ROC measures the ability of the forecast to discriminate between two alternative outcomes, i.e. it measures resolution.
- It is not sensitive to bias in the forecast, so it says nothing about reliability. A biased forecast may still have good resolution and produce a good ROC curve, which means it may be possible to improve the forecast through calibration. The ROC can thus be considered a measure of potential usefulness.
- The ROC is conditioned on the observations (i.e., given that bin X occurred, what was the corresponding forecast?).
- The area under the ROC curve indicates skill above climatology when > 0.5.
POD = H/(H+M); POFD = FA/(FA+CR)
[ROC curves shown for the cold and warm categories.]
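The POD/POFD definitions above map directly onto a small verification routine. The following is a minimal sketch in Python (not the operational SVS code; the function name and toy data are assumptions for illustration): sweep probability thresholds, form the contingency counts for one quintile event, and integrate POD against POFD to get the ROC area.

```python
import numpy as np

def roc_points(prob, event, thresholds=np.linspace(0.0, 1.0, 11)):
    """POD and POFD at each probability threshold for one event category."""
    pod, pofd = [], []
    for t in thresholds:
        warned = prob >= t
        h  = np.sum(warned & (event == 1))   # hits
        m  = np.sum(~warned & (event == 1))  # misses
        fa = np.sum(warned & (event == 0))   # false alarms
        cr = np.sum(~warned & (event == 0))  # correct rejections
        pod.append(h / max(h + m, 1))        # POD  = H / (H + M)
        pofd.append(fa / max(fa + cr, 1))    # POFD = FA / (FA + CR)
    return np.array(pod), np.array(pofd)

# Toy data: a perfectly reliable probability forecast of a binary event.
rng = np.random.default_rng(0)
prob = rng.uniform(0.0, 1.0, 2000)
event = (rng.uniform(0.0, 1.0, 2000) < prob).astype(int)

pod, pofd = roc_points(prob, event)
order = np.argsort(pofd)
pod, pofd = pod[order], pofd[order]
# Trapezoidal area under the curve; > 0.5 indicates skill above climatology.
area = np.sum(0.5 * (pod[1:] + pod[:-1]) * (pofd[1:] - pofd[:-1]))
print(f"ROC area = {area:.2f}")
```

Because the curve is built from thresholded contingency tables, a forecast bias shifts points along the curve without changing the area much, which is why ROC measures potential rather than actual usefulness.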

Page 11 © Crown copyright 2004
ROC map: ROC score for temperature in the "well below normal" category, 19 to 32 days ahead.

Page 12 © Crown copyright 2004
Brier Score
BS = (1/N) Sum_k (p_k - o_k)^2, the mean squared error of the probability forecast.
Decomposition: BS = Reliability - Resolution + Uncertainty
- Reliability: conditional bias; the weighted vertical distance of the reliability curve from the diagonal.
- Resolution: the difference between the conditional observed frequencies and the climatological expectation. The larger (better), the more often the forecast PDF is steeper and narrower than the climatological PDF; the flatter the reliability curve, the less resolution (flat sharpness plot).
- Uncertainty: the variance of the observations, obar(1 - obar); for quintile events 0.2 x (1 - 0.2) = 0.16, and always >= Resolution.
Brier skill score: BSS = 1 - BS/BS_clim, where BS_clim = REL(=0) - RES(=0) + UNC = UNC.
HR = H/(H+FA)
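The decomposition above is usually computed by binning the forecast probabilities, as on a reliability diagram. A hedged sketch (the bin width and toy data are assumptions, not taken from the slides):

```python
import numpy as np

def brier_decomposition(p, o, n_bins=10):
    """Brier score and its reliability/resolution/uncertainty terms.
    p: forecast probabilities in [0, 1]; o: binary outcomes (0/1)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    n = len(p)
    bs = np.mean((p - o) ** 2)          # BS = (1/N) Sum_k (p_k - o_k)^2
    obar = o.mean()                     # climatological event frequency
    unc = obar * (1.0 - obar)           # e.g. 0.2 * (1 - 0.2) = 0.16 for quintile events
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for b in range(n_bins):
        mask = bins == b
        nk = mask.sum()
        if nk == 0:
            continue
        pk, ok = p[mask].mean(), o[mask].mean()
        rel += nk * (pk - ok) ** 2      # distance of the reliability curve from the diagonal
        res += nk * (ok - obar) ** 2    # spread of conditional frequencies about climatology
    return bs, rel / n, res / n, unc

rng = np.random.default_rng(1)
p = rng.uniform(0.0, 1.0, 5000)
o = (rng.uniform(0.0, 1.0, 5000) < p).astype(int)
bs, rel, res, unc = brier_decomposition(p, o)
print(f"BS={bs:.3f}  REL={rel:.3f}  RES={res:.3f}  UNC={unc:.3f}")
print(f"BSS = 1 - BS/UNC = {1.0 - bs / unc:.3f}")   # BS_clim = UNC, since REL = RES = 0
```

For the climatological forecast both reliability and resolution vanish, so BS_clim reduces to the uncertainty term, which is how the BSS formula on the slide collapses to 1 - BS/UNC.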

Page 13 © Crown copyright 2004
Monthly Verification (Europe): Tmean ROC & reliability, all seasons, days 19-32
- Based on 93 operational forecasts (all seasons)
- Verification data: ECMWF short-range T forecasts (see Page 9)
[ROC and reliability diagrams; axes: POD = H/(H+M), HR = H/(H+FA).]

Page 14 © Crown copyright 2004
Monthly Verification (Europe): Precip ROC & reliability, all seasons, days 19-32
[ROC and reliability diagrams; axes: POD = H/(H+M), POFD = FA/(FA+CR), HR = H/(H+FA).]

Page 15 © Crown copyright 2004
Conclusion
- The monthly forecast model runs are produced at ECMWF; products are derived operationally at the Met Office.
- The Standardised Verification System (SVS) for Long-range Forecasts (LRF) is beginning to take shape.
- Forecasts for days 19-32 are as useful as climatology.
- Predictions of quintiles 1 and 5 are more skilful than those of quintiles 2 to 4.
- Europe is a difficult region to predict at long range.
- Weather uncertainty imposes a fundamental limit on the sharpness of the probabilistic forecast.
- The Monthly Outlook is a powerful tool for providing forecast guidance up to a month ahead in many areas.

Page 16 © Crown copyright 2004
THANK YOU!
Richard Graham, Margaret Gordon, Andrew Colman

Page 17 © Crown copyright 2004
The Met Office
We are one of the world's leading providers of environmental and weather-related services. Our solutions and services meet the needs of many communities of interest…from the general public, government and schools, to civil aviation and almost every industry sector around the world.

Page 18 © Crown copyright 2004
Desirable Attributes of Scores
1. Equitable; equitable without dependence on the forecast distribution.
2. Rewards for correct forecasts inversely proportional to their event frequencies.
3. Penalties for incorrect forecasts directly proportional to their event frequencies.
4. Random forecasts and constant forecasts have the same (0) skill.
Note: 2 & 3 imply that all information in the contingency table is taken into account.
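Attribute 4 is easy to check numerically. As one example of an equitable score (the choice of the Peirce skill score, POD - POFD, is an assumption of this sketch, not named on the slide), random yes/no forecasts of a 20% event should score close to zero:

```python
import numpy as np

# Random yes/no forecasts of an event with climatological frequency 0.2.
rng = np.random.default_rng(3)
obs  = rng.uniform(size=100_000) < 0.2
fcst = rng.uniform(size=100_000) < 0.5

h  = np.sum(fcst & obs)    # hits
m  = np.sum(~fcst & obs)   # misses
fa = np.sum(fcst & ~obs)   # false alarms
cr = np.sum(~fcst & ~obs)  # correct rejections

pss = h / (h + m) - fa / (fa + cr)   # Peirce skill score = POD - POFD
print(f"PSS for random forecasts: {pss:+.4f} (close to 0, as equitability requires)")
```

A constant "always yes" forecast gives POD = POFD = 1, and "always no" gives POD = POFD = 0, so both also score zero, matching attribute 4.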

Page 19 © Crown copyright 2004
Recap: Post-processing and Products
Data volume reduction before transfer to the Met Office. Calculate:
1. Tercile/quintile boundaries from the hindcast ensemble
2. Tercile/quintile populations from the forecast ensemble (see the sketch after this list)
3. Maximum, mean and minimum from the forecast and from the hindcast
4. Forecast tercile/quintile averages
5. Averages in time for week 1, week 2 and weeks 3&4
UK forecast:
1. Interpolation to points representing UK climate regions
2. Calibration with historical UK climate region observations
3. Interpretation of the histogram (ensemble mean, or mode in cases with large spread) to derive the deterministic forecast tercile/quintile
4. Mapping the tercile/quintile average onto the calibration PDF to derive a deterministic forecast value
Global forecast:
1. Tercile/quintile probabilities
2. Calibration by overlaying tercile/quintile boundaries derived from 1989-1998 ERA40 data
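A minimal sketch of steps 1-2 of the data reduction, under assumed array shapes (the synthetic ensembles stand in for real model output; variable names are illustrative): quintile boundaries come from the 60-member hindcast, and the forecast probability of each category is simply the fraction of the 51 forecast members falling between those boundaries.

```python
import numpy as np

rng = np.random.default_rng(4)
hindcast = rng.normal(0.0, 1.0, size=60)   # 60-member hindcast (5 members x 12 years)
forecast = rng.normal(0.4, 1.2, size=51)   # 51-member forecast ensemble

# Step 1: quintile boundaries qb1..qb4 from the hindcast ensemble.
qb = np.quantile(hindcast, [0.2, 0.4, 0.6, 0.8])

# Step 2: quintile populations from the forecast ensemble
# (fraction of members falling in each category).
edges = np.concatenate(([-np.inf], qb, [np.inf]))
counts, _ = np.histogram(forecast, bins=edges)
probs = counts / forecast.size
for name, pr in zip(["well below", "below", "normal", "above", "well above"], probs):
    print(f"P({name:>10s}) = {pr:.2f}")
```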

Page 20 © Crown copyright 2004
Key Points
- The Monthly Outlook is a process split between ECMWF and the Met Office.
- The Monthly Outlook is operational.
- The Standardised Verification System is taking shape.

Page 21 © Crown copyright 2004
Most probable quintile
- Low spread: the ensemble mean is a good best estimate; high confidence.
- High spread: the ensemble mean is wrong in 80% of all cases.
- High spread with delta probability > 5%: the most probable quintile is the best estimate; hedge by use of the quintiles above/below.
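One hypothetical reading of this guidance as code (the spread normalisation, its threshold of 1.0 and the function shape are assumptions; only the 5% margin and the 80% figure come from the slide):

```python
import numpy as np

def best_estimate(probs, rel_spread, spread_thresh=1.0, delta=0.05):
    """probs: the 5 quintile probabilities; rel_spread: ensemble spread
    relative to climatological spread (hypothetical normalisation)."""
    top, second = np.argsort(probs)[::-1][:2]
    if rel_spread < spread_thresh:
        # Low spread: ensemble mean is a good best estimate, high confidence.
        return top, "high confidence"
    if probs[top] - probs[second] > delta:
        # High spread but a clear mode ("delta probability > 5%"): take the
        # most probable quintile and hedge with the quintiles above/below.
        return top, "hedge with adjacent quintiles"
    # High spread, no clear mode: the ensemble mean is wrong in ~80% of cases.
    return top, "low confidence"
```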

Page 22 © Crown copyright 2004
FORMOST Verification, Week 2
[Time series plot: observed and deterministic forecast values against the quintile boundaries.]

Page 23 © Crown copyright 2004
FORMOST Verification, Weeks 3&4, UK average
[Time series plot: observed and deterministic forecast values against the quintile boundaries.]

Page 24 © Crown copyright 2004
FORMOST Verification, Weeks 3&4, UK average
[Time series plot, continued.]

Page 25 © Crown copyright 2004
Monthly forecast: 3-category probabilities (prob. of above, prob. of average, prob. of below)
Issued: 9th June 2005; valid: 20th to 26th June
[Maps: temperature and precipitation.]

Page 26 © Crown copyright 2004
Grid point diagnostics
Properties of a contingency table per grid point, for quintiles 1 (wbn) to 5 (wan) against forecast probability (0-100%):
- Hit: Q = Qobs and P(Q) >= Pthresh
- Miss: Q = Qobs and P(Q) < Pthresh
- False alarm: Q != Qobs and P(Q) >= Pthresh
- Correct rejection: Q != Qobs and P(Q) < Pthresh
Stratify by the magnitude of the probability at each grid point:
- POD = H/(H+M), conditioned on observations
- POFD = FA/(FA+CR)
- Hit rate HR = H/(H+FA), conditioned on forecasts
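The classification above can be written compactly over whole fields. A sketch under assumed inputs (names and shapes are illustrative; the definitions are those on this slide):

```python
import numpy as np

def gridpoint_scores(prob_q, obs_is_q, p_thresh):
    """prob_q: forecast P(Q) at each grid point; obs_is_q: True where the
    observed category equals Q; p_thresh: probability threshold."""
    warned = prob_q >= p_thresh
    h  = np.sum(warned & obs_is_q)        # hit: Q observed, P(Q) >= Pthresh
    m  = np.sum(~warned & obs_is_q)       # miss: Q observed, P(Q) < Pthresh
    fa = np.sum(warned & ~obs_is_q)       # false alarm
    cr = np.sum(~warned & ~obs_is_q)      # correct rejection
    pod  = h / max(h + m, 1)              # conditioned on observations
    pofd = fa / max(fa + cr, 1)
    hr   = h / max(h + fa, 1)             # conditioned on forecasts
    return pod, pofd, hr
```

Calling this once per probability threshold gives the stratification the slide describes, and the resulting (POFD, POD) pairs are exactly the ROC points of Page 10.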

Page 27 © Crown copyright 2004
Europe, temperature
[Diagram: POD = H/(H+M) against HR = H/(H+FA); 19-day lead time, 2-week period.]

Page 28 © Crown copyright 2004
Europe, precipitation
[Diagram: POD = H/(H+M) against HR = H/(H+FA); 19-day lead time, 2-week period.]