© Crown copyright Met Office. ISES Forecast Verification Workshop, Saturday 11th April 2015, Millennium Harvest House, 1345 28th Street, Boulder, Colorado.

© Crown copyright Met Office. Slide 1: ISES Forecast Verification Workshop. Saturday 11th April 2015, Millennium Harvest House, 1345 28th Street, Boulder, Colorado 80302, USA.
Agenda:
09:00 – 09:15 Summary of the European Space Weather Week verification session. Verification at the UK Met Office. Suzy Bingham, Met Office, UK.
09:15 – 09:30 Introduction to NICT's evaluation of the space weather forecasts provided by Japan, US, Belgium, Australia and China. Mamoru Ishii, NICT, Japan.
09:30 – 09:45 Verification at RWC Belgium. Jesse Andries & Andy Devos.
09:45 – 10:00 The verification of solar proton event probability forecasts at SEPC. China SEPC.
10:00 – 10:30 State of the art in weather forecast verification, and available tools and resources in the weather community. Barbara Brown, NCAR, USA.
10:30 – 10:45 Coffee break.
10:45 – 12:15 Discussion focus 1. Global product: flare forecasts. Discussion led by Jesse Andries & Andy Devos.
12:15 – 13:15 Lunch.
13:15 – 13:30 Gathering GNSS data for making TEC maps, and the effort of spreading the GTEX format for easy exchange of GNSS data. Mamoru Ishii, NICT, Japan.
13:30 – 13:45 TEC maps over Canada. Robyn Fiori, Canada.
13:45 – 15:00 Discussion focus 2. Regional product: TEC nowcasts. Discussion led by Mihail Codrescu.
15:00 – 15:15 Coffee break.
15:15 – 16:30 Discussion on other verification topics.

© Crown copyright Met Office. Slide 2: Discussion topics.
Discussion on other verification topics (15:15 – 16:30):
Product information & current verification by different centres.
Relevant quantities/values for verification. Exact interpretation of communicated values.
Standardisation. User-driven metrics.
'Truth': true measurements to verify against.
Verification techniques to allow consistency & best practice, in order to compare verification of local products.
Coordination & collaboration. Alignment with the WIS Pilot Project.
Common products & exact descriptions of each, so that verification results are comparable.
Future milestones/targets of verification.
[Figure: operational Enlil output composite plot.]

© Crown copyright Met Office. Slide 3: European Space Weather Week verification session summary. ISES Verification Workshop, 11th April 2015. Suzy Bingham, Met Office, UK.

© Crown copyright Met Office. Slide 4: 'Space weather metrics, verification & validation' splinter session. Minutes available:

© Crown copyright Met Office. Slide 5: ESWW presentations (Liège).
Andy Devos, ROB, Solar Terrestrial Centre of Excellence (STCE). Evaluates flare probability, Kp index, 10.7 cm radio flux & proton events.
Peter Wintoft, Swedish Institute of Space Physics. Perspectives on extreme events: science, statistics & user perspectives. The use of particular skill scores for rare/extreme events.
Alexi Glover, ESA. Space Weather European Network (SWENET): space weather services & data portal, with simple metrics already being used.

© Crown copyright Met Office. Slide 6: ESWW presentations (continued).
Manolis Georgoulis, Academy of Athens, Greece. Solar flare prediction methods. Use of different skill scores & testing in different parts of the solar cycle.
Sean Elvidge, University of Birmingham, UK. Taylor diagrams: visualising statistics for different parameters & models on one diagram (see the identity below). "On the use of modified Taylor diagrams to compare ionospheric assimilation models," Radio Science.
Suzy Bingham, Met Office, UK. Translating terrestrial meteorological verification methods to space weather. Real-time verification/validation is important for forecasters.
Maria Kuznetsova, CCMC, USA. Model validation activities include TEC, Dst index & regional K. Skill scores can be calculated on the CCMC system.
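For context, the relation that lets a Taylor diagram show correlation, standard deviation and centred RMS error on a single plot is the standard law-of-cosines identity (quoted here for convenience; not taken from the slides):

```latex
% E' = centred RMS error, \sigma_f / \sigma_o = forecast / observation
% standard deviations, R = correlation coefficient:
E'^2 = \sigma_f^2 + \sigma_o^2 - 2\,\sigma_f\,\sigma_o\,R
```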

© Crown copyright Met Office. Slide 7: Challenges for verification/validation.
(1) Confusion: many forecasts from many centres, so it is difficult to understand which data to use.
(2) Variety of techniques required: due to the varied nature of forecasts (e.g. probabilistic forecasts, binary forecasts, time-series predictions, 2-D/3-D matrices of predictions, single-value predictions).
(3) What to compare: local/global forecasts, max/average values. How forecast windows & lead times will impact forecast accuracy.
(4) Terminology: how to compare with varying terminology between centres, e.g. 'unsettled' Kp & 'eruptive flaring' can mean different things at different centres.
(5) Forecasting of extremes/rare events: raises problems for many skill scores.

© Crown copyright Met Office. Slide 8: Challenges for verification/validation (continued).
(6) Solar cycle: differences in activity depending on where you are in the solar cycle.
(7) Thresholds: for probabilistic forecasts, e.g. set the threshold to maximise the skill score. But how would this work for inter-model comparisons?
(8) Errors in the validation process: can arise from models, validation data & post-processing, so other forms of inaccuracy must not be passed on to the model/forecast assessment.
(9) Fear of loss of funding: if researchers fear a loss of funding when they do not do well in a validation activity, this discourages participation. The activity has to be representative & unbiased.
(10) Flow of information between centres/modellers: information must flow between them so that centres can present appropriate metrics to users, & models are tested against metrics which can be translated into user requirements.

© Crown copyright Met Office. Slide 9: Requirements for verification/validation.
(1) Fix timescales / lead times.
(2) Common parameters.
(3) Common format.
(4) Common terminology.
(5) Provide data access.
(6) Use metrics which are adequate, easy to interpret & easily reproducible.
(7) If there are multiple lead times, use appropriate weightings.
(8) Agree a list of events for validation exercises.

© Crown copyright Met Office. Slide 10: Suggested analysis.
(1) Error analysis.
(2) ROC curve (Relative Operating Characteristic): false alarm rate vs probability of detection.
(3) Reliability plots (e.g. for flare probability forecasts).
(4) Brier skill scores (e.g. for flare probability forecasts); see the definitions below.
(5) Extremal Dependence Index (EDI): independent of the rarity of events. Has been applied to ground dB/dt forecasts at the Swedish SpWx Centre.
(6) Equitable skill scores with weighting for rare events.
(7) Taylor diagrams to combine different synoptic metrics such as correlation, bias, etc., in a single plot. Potentially useful for inter-model comparisons, but unlikely to be used by a non-specialist service end-user.
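For reference, the Brier score and skill score behind item (4) are the standard definitions from the weather-verification literature (quoted for convenience, not specific to these slides):

```latex
% Brier score for N probability forecasts p_i of binary outcomes o_i \in \{0,1\}:
\mathrm{BS} = \frac{1}{N}\sum_{i=1}^{N}(p_i - o_i)^2,
\qquad
\mathrm{BSS} = 1 - \frac{\mathrm{BS}}{\mathrm{BS}_{\mathrm{clim}}}
% BS_clim is the Brier score of a climatological reference forecast;
% BSS > 0 means the forecast beats climatology.
```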

© Crown copyright Met Office. Slide 11: To aid verification/validation.
(1) More transparency between forecasts.
(2) Communication/coordination between centres & users.
(3) Different approaches need different domains.
(4) Agencies can provide an unbiased platform.
(5) Agencies can encourage participation in community-wide initiatives.
(6) Agencies could provide computing resources/manpower to do independent tests.
(7) Agencies could provide dedicated workshops & campaigns.
(8) There are established lists of products, which vary depending on the community, e.g. WMO (wmo-sat.info/oscar/applicationareas/view/25) and ESA (SSA-SWE-RS-SSD-0001_i1r3.pdf).

© Crown copyright Met Office. Slide 12: Overview of Met Office verification plans. ISES Verification Workshop, 11th April 2015. Suzy Bingham, Met Office, UK.

© Crown copyright Met Office. Slide 13: Met Office models.
Enlil: predicts solar wind speed & density between Sun & Earth for the next few days.
REFM: 3-day forecast of high-energy electrons at geostationary orbit.
D-RAP: global map of real-time D-region absorption predictions.
MIDAS/Bernese: nowcasting of Total Electron Content (TEC) in the ionosphere.

© Crown copyright Met Office. Slide 14: Verification requirements.
Probabilistic forecasts, 1-4 days ahead, produced twice daily:
o High-energy proton flux (2 levels).
o High-energy electron fluence (2 levels).
o Geomagnetic storms (4 levels).
o X-ray flares (2 levels).
Spatial:
o MIDAS/Bernese (TEC).
o D-RAP.
Deterministic:
o CME arrival time (time at Earth).
o REFM (daily fluence).
[Figure: examples of probability forecasts.]

© Crown copyright Met Office. Slide 15: Enlil verification.
Would like to verify:
1. CME arrival times at Earth.
2. Enlil output from the four centres running Enlil operationally, against observed data from ACE (composite plot).
3. Real-time solar wind speed on the forecaster bench, against ACE (later DSCOVR) & STEREO A & B.
[Figure: Enlil output on the forecaster visualisation suite.]

© Crown copyright Met Office. Slide 16: CME forecast verification.
Overview of results for 2014 and 2015 (2013 still to be done); see the sketch below for how these scores are computed.
Accuracy: proportion correct ~0.75; Threat Score ~0.67; bias ~0.9.
Reliability: False Alarm Ratio ~0.15.
Discrimination: hit rate ~0.76; False Alarm Rate ~0.36.
Skill: Heidke ~0.36; Peirce ~0.4; Equitable Threat Score ~0.2.
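All of the scores above derive from a single 2x2 contingency table. A minimal Python sketch using the standard textbook formulas (the example counts are invented for illustration, not the Met Office 2014/2015 CME sample):

```python
def categorical_scores(a, b, c, d):
    """a = hits, b = false alarms, c = misses, d = correct rejections."""
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n  # hits expected by chance (for ETS)
    return {
        "proportion correct": (a + d) / n,
        "threat score": a / (a + b + c),
        "bias": (a + b) / (a + c),
        "false alarm ratio": b / (a + b),
        "hit rate": a / (a + c),                 # probability of detection
        "false alarm rate": b / (b + d),         # probability of false detection
        "Peirce": a / (a + c) - b / (b + d),     # hit rate minus false alarm rate
        "Heidke": 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d)),
        "equitable threat score": (a - a_random) / (a + b + c - a_random),
    }

# Invented example counts, NOT the Met Office sample:
for name, value in categorical_scores(a=19, b=3, c=6, d=72).items():
    print(f"{name:24s} {value:5.2f}")
```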

© Crown copyright Met Office. Slide 17: Relativistic Electron Forecast Model (REFM).
REFM uses 30 days of solar wind speed data (currently from the ACE spacecraft) to predict the 24-hour fluence, 1-4 days ahead. Forecasters issue the probability of energetic electron fluence exceeding thresholds (10^8 and 10^9 pfu), based on model output and experience. Need to assess model performance and adviser guidance. How do we verify these forecasts and identify a probability threshold for what defines a "hit"? Could use ROC curves to find the detection threshold which maximises hit rate against false alarm rate (see the sketch below), then assess whether the skill is statistically significant.
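A minimal sketch of that ROC-style threshold search, assuming forecaster probabilities and observed exceedances are available as arrays (illustrative only, not Met Office code; the data here are synthetic):

```python
import numpy as np

def best_threshold(probs, observed, thresholds=np.linspace(0.05, 0.95, 19)):
    """Pick the warning threshold maximising Peirce skill (H - F)."""
    probs = np.asarray(probs, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    best = (None, -np.inf)
    for t in thresholds:
        warned = probs >= t
        hits = int(np.sum(warned & observed))
        misses = int(np.sum(~warned & observed))
        fas = int(np.sum(warned & ~observed))
        cns = int(np.sum(~warned & ~observed))
        h = hits / (hits + misses) if hits + misses else 0.0
        f = fas / (fas + cns) if fas + cns else 0.0
        if h - f > best[1]:
            best = (t, h - f)
    return best  # (threshold, Peirce skill at that threshold)

# Synthetic example: 100 forecasts, ~5 observed exceedances.
rng = np.random.default_rng(42)
obs = rng.random(100) < 0.05
probs = np.clip(0.6 * obs + 0.3 * rng.random(100), 0.0, 1.0)
print(best_threshold(probs, obs))
```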

© Crown copyright Met Office. Slide 18: REFM verification.
Use of ROC curves for evaluating forecaster probabilities has limitations:
Very few observed threshold exceedances (~5 out of ~100 events).
Difficult to robustly identify a probability threshold.
By eye, the threshold appears to be higher for shorter lead times, but this trend is unlikely to be statistically significant.
Use the Extremal Dependence Index instead of ROC (defined below):
"A probability model for verifying deterministic forecasts of extreme events", Ferro, 2007, AMS.
"Extremal Dependence Indices: improved verification measures for deterministic forecasts of rare binary events", Ferro & Stephenson, 2011, AMS.
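The EDI of Ferro & Stephenson (2011), written in terms of the hit rate H and false alarm rate F (standard definition, quoted here for convenience):

```latex
% H = a/(a+c) (hit rate) and F = b/(b+d) (false alarm rate)
% from the 2x2 contingency table:
\mathrm{EDI} = \frac{\ln F - \ln H}{\ln F + \ln H}
% EDI \in [-1, 1]; 0 indicates no skill, and unlike many scores its
% limiting value does not degenerate as the event base rate tends to zero.
```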

© Crown copyright Met Office. Slide 19: TEC: Multi-Instrument Data Analysis System (MIDAS).
Purpose: images ionospheric activity in real time; produces a European-region Total Electron Content (TEC) nowcast every 15 minutes.
Input: measurements, potentially from ground- & space-based GPS receivers, & point estimates of local electron density.
Output: TEC map over Europe every 15 minutes.

© Crown copyright Met Office. Slide 20: TEC: Bernese model.
European plots every 15 minutes, global plots every hour. TEC anomaly from the previous 10-day mean. TEC change from the previous hour.
Output: IONEX format, 2.5° latitude x 5° longitude grid.

© Crown copyright Met Office. Slide 21: Flare forecasting.
EU H2020 FLARECAST project:
Ensemble of the best methods, with improved verification.
Compare our human-based methods to improved automatic methods.
Consistent verification methods used on consistent data sets for inter-comparison.
Bringing in better methods from the more mature field of numerical weather prediction.
Most popular flare-forecasting verification methods to use: Heidke Skill Score & True Skill Statistic.
CCMC flare forecast planning page:

© Crown copyright Met Office. Slide 22: Summary of Met Office verification requirements/plans.
Probabilistic forecasts, 1-4 days ahead, produced twice daily:
o High-energy proton flux (2 levels).
o High-energy electron fluence (2 levels).
o Geomagnetic storms (4 levels).
o X-ray flares (2 levels).
Compare against climatological/persistence models (Markov chain using transition probabilities). Use the Ranked Probability Score to understand accuracy (see the sketch below). Horizon 2020 FLARECAST: may use the True Skill Statistic or Heidke Skill Score.
Spatial forecasts:
o MIDAS/Bernese (TEC).
o D-RAP.
Deterministic forecasts:
o CME arrival time at Earth (RMS error, FAR, etc.; composite plot).
o REFM (use the Extremal Dependence Index (EDI)).
Met Office would like to provide simple metrics so that stakeholders can understand forecaster added value.
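A minimal sketch of the Ranked Probability Score for ordered-category forecasts such as the 4-level geomagnetic-storm product (standard definition; not Met Office code):

```python
import numpy as np

def rps(forecast_probs, observed_category):
    """Ranked Probability Score for one forecast over K ordered categories."""
    p = np.asarray(forecast_probs, dtype=float)
    obs = np.zeros_like(p)
    obs[observed_category] = 1.0
    # Sum of squared differences between the cumulative distributions.
    return float(np.sum((np.cumsum(p) - np.cumsum(obs)) ** 2))

def rpss(rps_forecast, rps_reference):
    """Skill relative to a reference such as climatology or persistence."""
    return 1.0 - rps_forecast / rps_reference

# Example: 4-level storm forecast; level 2 (index 1) occurred -> 0.2925.
print(rps([0.5, 0.3, 0.15, 0.05], observed_category=1))
```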

© Crown copyright Met Office. Slide 23: Notes/guidance.


Slide 31: Flexible Warning Verification System.
Plan to translate terrestrial verification techniques to space weather, firstly using the flexible Warnings Verification System (WVS) (Mike Sharpe, Met Office). The WVS is based on a contingency table but gives more flexibility in space & time, i.e. it gives some score to near hits. For example, if a gale (or X-ray flare) occurs 5 minutes before a warning issue time, it is given some score rather than being counted as a miss. The WVS provides a means for the forecaster to review events. After analysis in the WVS, suitable skill scores can be applied, e.g. performance (ROC) plots & reliability plots. Useful for understanding forecaster added value. A sketch of the near-hit idea follows.
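The near-hit idea might look something like this sketch. It is purely illustrative, not the Met Office WVS: the category names mirror the diagram on the next slide, but the 0.5 partial credit and the one-hour tolerance window are invented assumptions.

```python
# Illustrative only -- NOT the Met Office WVS implementation.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class WarningPeriod:
    start: datetime  # start of the warned period
    end: datetime    # end of the warned period

def score_event(event, w, tolerance=timedelta(hours=1)):
    """Return (category, credit) for one observed event vs one warning.

    Events inside the warned period get full credit; events just outside
    it get invented partial credit (0.5) instead of scoring as a miss.
    The real WVS also distinguishes issue time and low-threshold hits.
    """
    if w.start <= event <= w.end:
        return "HIT", 1.0
    if w.start - tolerance <= event < w.start:
        return "EARLY HIT", 0.5   # e.g. gale 5 minutes before the warned period
    if w.end < event <= w.end + tolerance:
        return "LATE HIT", 0.5    # event during the 'late hit period'
    return "MISS", 0.0

w = WarningPeriod(start=datetime(2013, 10, 28, 0), end=datetime(2013, 10, 28, 12))
print(score_event(datetime(2013, 10, 27, 23, 55), w))  # ('EARLY HIT', 0.5)
```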

© Crown copyright Met Office. Slide 32: Introducing the warnings verification system.
[Diagram: gale-warning verification timeline. Events are classified relative to the warning issue time, the warning period and an 'end time + late hit period', against the event threshold and a lower threshold, giving categories HIT, EARLY HIT, LATE HIT, LOW HIT, EARLY LOW HIT, LATE LOW HIT, FALSE ALARM, MISS and NON-EVENT.]

© Crown copyright Met Office. Slide 33: Gale warning verification.
[Example: St Jude's Storm, October 2013 – a hit!]