Verification of Met Office Lightning Forecasts for Aviation

Introduction

The forecasting of hazardous weather conditions such as lightning is of huge importance to the Civil Aviation Authority: the accuracy and skill of lightning forecasts is key to the safety of all flights in the UK.

Verification has been undertaken for the three-month summer period (30/05/12 – 06/09/12) using forecasts and observations at all 205 UK civil and defence aerodromes. A range of verification measures has been calculated to compare the models and explore their strengths and weaknesses: contingency tables and the relative operating characteristic (ROC) curve, the reliability diagram, and cost-loss analysis.

Forecasts

MOGREPS-Global
- Probability of the lightning index exceeding 10, from a 12-member ensemble: 0 → lightning is unlikely; 10 → deep convectively unstable environment with lightning a possibility.
- Model resolution: 60 km. Forecast lead times: T+24 and T+36.
- Model diagnostics: precipitation, CAPE, convective cloud features (temperature, base height, depth), wet-bulb temperature.

UKV (post-processed)
- Risk of lightning within a 50 km radius of a location: LR1 → lightning expected; LR5 → lightning not expected.
- Model resolution: 1.5 km. Forecast lead times: T+21 and T+33.
- Model diagnostics: lifted index, CAPE, lightning index, total precipitation rate.

Forecasts are all valid at 06Z and 18Z (±3 hours).

© Crown copyright. Met Office and the Met Office logo are registered trademarks. Met Office, FitzRoy Road, Exeter, Devon, EX1 3PB, United Kingdom.

Observations

The forecast products are verified using observations from the ATDnet (Arrival Time Difference) system, an automatic lightning location network that senses lightning flashes over a wide geographical area. By timing the arrival of the unique very low frequency radio waves ('sferics') generated by individual lightning strokes, the location of every stroke can be calculated. A stroke count was processed to give the number of strokes within a 50 km radius of each airport over a 6-hour period.
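The stroke-count processing described above can be sketched in a few lines. This is a minimal illustration only, not the operational ATDnet processing; the stroke data and the Heathrow coordinates below are hypothetical values chosen for the example.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def stroke_count(strokes, site, radius_km=50.0, t0=0.0, window_h=6.0):
    """Count strokes within radius_km of site during [t0, t0 + window_h) hours."""
    lat0, lon0 = site
    return sum(
        1
        for t, lat, lon in strokes
        if t0 <= t < t0 + window_h
        and haversine_km(lat0, lon0, lat, lon) <= radius_km
    )

# Hypothetical strokes: (hours since 00Z, latitude, longitude)
strokes = [(3.2, 51.6, -0.3), (4.1, 51.3, -0.2), (5.0, 53.9, -1.5)]
heathrow = (51.47, -0.46)  # approximate coordinates, for illustration
print(stroke_count(strokes, heathrow, t0=3.0))  # two strokes fall within 50 km
```

The third stroke is inside the time window but roughly 280 km away, so only the two strokes near London are counted.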
This project looks into the verification of two different models' lightning products, to assess how well they forecast this rare event during the summer of 2012.

Case Study: 28th June 2012

There were almost 60,000 recorded strikes across the UK within a 13-hour period (05Z to 18Z). Both models indicated that lightning was expected, with a focus on the east coast of England; the ATDnet observations for 1800Z show two strong bands of lightning in that region.

Figure 7: Forecasts for 2012/06/28 18Z: a) MOGREPS-Global T+36, b) UKV T+33, c) lightning stroke map for 15Z to 21Z on 28/06/12.

CONTINGENCY TABLES AND RELATIVE OPERATING CHARACTERISTIC CURVE

A ROC curve displays a forecast's ability to distinguish between an event and a non-event. The contingency-table counts:
- UKV (hit = LR1 and LR2): hits 1485, misses 48.
- MOGREPS-Global (hit = non-zero probability): hits 175, misses 1359, false alarms 931.

UKV shows a considerably higher hit rate (0.97); however, the global model has a much smaller false alarm rate (0.025). Both ROC curves have a very similar initial gradient, showing similar skill where data are present. The limited number of points on the MOGREPS-Global curve is due to the small number of members in the ensemble (lowest threshold = 1/12), and the rare nature of lightning concentrates the points on the ROC curve close to the origin.

Figure 3: MOGREPS-Global ROC curve (assumption: zero probability of precipitable water, minimal cloud cover, minimal risk of lightning).
Figure 4: UKV ROC curve.

RELIABILITY DIAGRAM

A reliability diagram displays how well the predicted probabilities correspond to their observed frequencies.

Figure 5: Reliability diagram of MOGREPS-Global and UKV.

COST-LOSS ANALYSIS

A value plot shows the relative improvement in economic value between climatology and a perfect forecast. The UKV curve fully encompasses the MOGREPS-Global curve, indicating that UKV has greater value.

Figure 6: Value plot of MOGREPS-Global and UKV. Perfect score = 1.0.
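Both the ROC points and the value plot derive from the same contingency-table counts. The sketch below assumes the standard cost-loss decision model (a user pays cost C to protect, loses L if caught unprotected, with ratio r = C/L, and the value is the saving over climatology relative to a perfect forecast); the counts are made up for a rare event and are not the poster's figures.

```python
def roc_point(hits, misses, false_alarms, correct_rejections):
    """Hit rate and false alarm rate: one point on the ROC curve."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate, false_alarm_rate

def relative_value(hits, misses, false_alarms, correct_rejections, r):
    """Relative economic value for cost/loss ratio r (0 < r < 1).

    0 = no better than climatology, 1 = perfect forecast.
    Expected expenses are expressed in units of the loss L.
    """
    n = hits + misses + false_alarms + correct_rejections
    s = (hits + misses) / n                      # base rate of the event
    e_forecast = ((hits + false_alarms) * r + misses) / n
    e_climate = min(r, s)                        # cheaper of always / never protecting
    e_perfect = s * r                            # protect exactly when the event occurs
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# Hypothetical counts for a rare event (not the poster's numbers)
h, m, f, c = 175, 25, 900, 36000
hr, far = roc_point(h, m, f, c)
print(round(hr, 3), round(far, 3))
for r in (0.01, 0.05, 0.2):
    print(r, round(relative_value(h, m, f, c, r), 3))
```

Note that for cost/loss ratios far above the event's base rate the value goes negative: protecting on every forecast costs more than never protecting, which is why value curves for rare events peak near the base rate.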
MOGREPS-Global shows good reliability for the available probabilities. Due to the categorical nature of the UKV forecast, a classic reliability diagram could not be produced; however, LR3, LR4 and LR5 indicate minimal risk of lightning, which matches the near-zero observed frequency, while LR1 and LR2 indicate some risk of lightning, again in line with the observations.

Figure 1: Hyperbola plot. Each pair of stations defines a hyperbola of potential locations of the lightning source; the intersection of all the hyperbolae gives the lightning location.

Figure 2: Stroke counts within a 50 km radius of Heathrow and Gatwick airports.

Conclusions and Future Work

UKV shows high skill and value, greater than MOGREPS-Global, although the specific risk levels and observed frequencies of LR1 and LR2 could be explored further. MOGREPS-Global also shows some skill, which suggests it has potential given the resolution of the model and the rare nature of lightning. In the future, MOGREPS-UK (1.5 km resolution) will be verified to allow a direct comparison of the two models over the same domain and at the same resolution, and to determine whether an ensemble forecast would be better. The ensemble could also be expanded to 24 members by combining two forecast runs, to investigate the ROC curves of the two models further.

Rebecca Stretton, Piers Buchanan, Stephen Moseley, Clare Bysouth