Performance of the WWRP project FROST-2014 forecasting systems: Preliminary assessments (FROST = Forecast and Research in the Olympic Sochi Testbed). D. Kiktev, E. Astakhova, A. Muravyev, M. Tsyrulnikov. WWOSC-2014, August 2014. Presented by S. Belair (with additions).

Items in this presentation:
- Brief reminder
- Online monitoring, evaluation (deterministic and ensemble)
- Interesting cases
- Integrated forecasting
- What’s next

Goals of WMO WWRP RDP/FDP FROST-2014:
- To improve and exploit:
  - high-resolution deterministic mesoscale forecasts of meteorological conditions in a winter complex-terrain environment;
  - regional mesoscale ensemble forecast products in a winter complex-terrain environment;
  - nowcasting systems for high-impact weather phenomena (wind, precipitation type and intensity, visibility, etc.) in complex terrain.
- To improve the understanding of the physics of high-impact weather phenomena in the region.
- To deliver deterministic and probabilistic forecasts in real time to Olympic weather forecasters and decision makers.
- To assess the benefits of forecast improvement (verification and societal impacts).
- To develop a comprehensive information resource of alpine winter weather observations.

3rd meeting of the project participants (10-12 April 2013). International participants of the FROST-2014 project: COSMO, EC, FMI, HIRLAM, KMA, NOAA, ZAMG, under supervision of the WWRP Working Groups on Nowcasting, Mesoscale Forecasting, and Verification Research.

Observational network in the region of Sochi:
- about 50 AMS;
- C-band Doppler radar WRM200;
- temperature/humidity profiler HATPRO;
- Scintec-3000 radar wind profiler (wind);
- two vertically pointing Micro Rain Radars (MRR-2);
- upper-air sounding in Sochi 4 times/day.

Forecasting systems participating in RDP/FDP FROST-2014:
- Nowcasting: ABOM, CARDS, INCA, INTW, MeteoExpert, Joint (multi-system forecast integration).
- Deterministic NWP: COSMO-RU with grid spacing of 1 km, 2.2 km and 7 km; GEM with grid spacing of 2.5 km, 1 km and 0.25 km; NMMB (1 km); HARMONIE (1 km); INCA (1 km).
- Ensemble NWP: COSMO-S14-EPS (7 km), ALADIN LAEF (11 km), GLAMEPS (11 km), NMMB-EPS (7 km), COSMO-RU2-EPS (2.2 km), HARMON-EPS (2.5 km).

FROST-2014 online monitoring of forecast quality. Role of resolution: GEM-2.5 km vs GEM-1 km vs GEM-250 m. Forecast mean absolute errors as a function of forecast lead time. Location: mountain skiing finish (Roza-Khutor-7 station). Period: 15 January – 15 March 2014. The effect of resolution is not straightforward: it depends on the meteorological variable, location, lead time, etc.
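
The mean absolute error curves behind this online monitoring can be recomputed from paired forecast-observation records. A minimal sketch in Python (the file and column names are hypothetical, not part of the project setup):

    import pandas as pd

    # Hypothetical file: one row per (initialisation, lead time) pair for one station,
    # with the matched 2 m temperature forecast and observation.
    df = pd.read_csv("roza_khutor_7_t2m_pairs.csv")

    # Mean absolute error of T2m for each forecast lead time (hours).
    mae_by_lead = (
        (df["t2m_forecast"] - df["t2m_observed"])
        .abs()
        .groupby(df["lead_time_h"])
        .mean()
    )
    print(mae_by_lead)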

Some examples of diagnostic verification. Role of spatial resolution: COSMO-S14-EPS (7 km grid spacing) vs COSMO-RU2-EPS (2 km grid spacing). Parameter: T2m. Location: Biathlon Stadium (1075 m). Verification period: … Verification approach: nearest point. [Q-Q plots for COSMO-S14-EPS and COSMO-RU2-EPS.] Better shape of the Q-Q plots and higher variability for the downscaled ensemble forecasts.
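
A Q-Q plot of this kind compares quantiles of the forecast and observed distributions; points close to the 1:1 line indicate that the forecasts reproduce the observed distribution. A minimal sketch (hypothetical input files; all ensemble members pooled):

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical inputs: pooled ensemble T2m forecasts and matched observations.
    forecasts = np.loadtxt("biathlon_t2m_ensemble_members.txt")
    observations = np.loadtxt("biathlon_t2m_observations.txt")

    # Compare the same set of quantiles of the two distributions.
    q = np.linspace(0.01, 0.99, 99)
    fq = np.quantile(forecasts, q)
    oq = np.quantile(observations, q)

    plt.plot(oq, fq, "o", markersize=3)
    plt.plot(oq, oq, "k--", label="1:1 line")
    plt.xlabel("observed T2m quantiles (°C)")
    plt.ylabel("forecast T2m quantiles (°C)")
    plt.legend()
    plt.show()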

Role of spatial resolution for ensemble forecasts (continued): COSMO-S14-EPS (7 km grid spacing) vs COSMO-RU2-EPS (2 km grid spacing). Verification for the ensemble mean. Verification period: … [Table: bias and mean absolute error of T2m at 6/12/18 h lead time for both systems at the stations Sledge (~700 m), Freestyle (~1000 m), Biathlon Stadium (~1500 m) and Mountain Skiing start (~2000 m).] T2m: some positive effect of downscaling from 7 to 2 km resolution. Wind speed: no positive effect of dynamical downscaling was found.

Some ensemble verifications: ROC area (ROCA), Brier skill score (BSS) and Brier score (BS) for precipitation > 0.01 mm/3h, as functions of lead time. Verification approach: 13 stations in the area of Krasnaya Polyana were clustered for matching to forecasts. Curve colours: COSMO-S14-EPS red, COSMO-RU2-EPS orange, LAEF-EPS brown, NMMB-EPS black, HARMON-EPS blue, GLAMEPS green. COSMO-S14-EPS, NMMB-EPS and COSMO-RU2-EPS look most informative.
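
For reference, the Brier score and Brier skill score for an exceedance event can be computed from ensemble-derived probabilities as sketched below (hypothetical input files; the climatology of the verification sample is used as the reference forecast):

    import numpy as np

    # Hypothetical inputs for one lead time: ensemble precipitation forecasts
    # (n_cases x n_members, mm/3h) and matched observations (n_cases, mm/3h).
    members = np.loadtxt("ens_precip_members.txt")
    observed = np.loadtxt("obs_precip.txt")

    threshold = 0.01  # mm/3h
    prob = (members > threshold).mean(axis=1)        # forecast exceedance probability
    event = (observed > threshold).astype(float)     # observed occurrence (0/1)

    bs = np.mean((prob - event) ** 2)                # Brier score
    bs_ref = np.mean((event.mean() - event) ** 2)    # reference: sample climatology
    bss = 1.0 - bs / bs_ref                          # Brier skill score
    print(f"BS = {bs:.3f}, BSS = {bss:.3f}")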

ROCA, BSS and BS scores for precipitation > 5 mm/3h (same colour coding as above). For the higher precipitation threshold (with respect to the low threshold):
- COSMO-S14-EPS, NMMB and HARMON-EPS become worse;
- in contrast, LAEF and GLAMEPS become better.

Camera shots from Gornaya Carousel-1500. FROST-2014 experience demonstrates that direct forecasting of visibility is a serious challenge; however, some results were encouraging. Example: 17 February 2014, 11:00-12:00 UTC (Biathlon venue) – forecast of a time slot for competitions during a 3-day period with low visibility. Forecast of wind direction and relative humidity (as a proxy for visibility) by COSMO-Ru1 (1 km grid spacing). [Figures: wind and RH at 850 hPa from the 12 UTC forecast; camera shots at 11:00, 11:30 and 12:00 UTC; RH at 2 m, forecast and observations at the Biathlon Stadium, 11:00-13:00 UTC.]

How was the window of good visibility on 17 February predicted by various systems?

List of other interesting cases (meteorological process/phenomenon, models’ behaviour, impact on competitions):
- Foehn: poor T forecast by most models at the Biathlon Stadium (negative forecast errors of 1.4…3.7 °С); poor maximum wind speed (Vmax) forecast by most models at Krasnaya Polyana (negative forecast errors of 3.5…7 m/s); low visibility. Impact: postponed competitions at Laura and Extreme Park.
- Cold front: good precipitation forecast by most models.
- Foehn: poor T forecast by most models (negative forecast errors, most markedly at a height of 1500 m).
- Cold front, low visibility: Tmax poorly forecast by most models (maximum T forecast at noon, whereas in reality maximum T occurred in the morning). Impact: postponed skiing competitions at Roza Khutor.
- “Weak process”, precipitation: poor precipitation forecast by most models at heights above 1500 m.
- Cold front: poor Vmax forecast (underestimation) by most models at heights above 1500 m.

It was not simple for forecasters to deal with such an amount of information under the operational time constraints => Integrated Forecast. Notation: F(t) – integrated forecast (t – forecast time); O – last available observation; f_i(t) – forecast of the i-th participating forecasting system; α(t), β_i(t) – weights; b_i(t) – bias of the i-th forecasting system. References: F. Woodcock and C. Engel: Operational Consensus Forecasts, Weather and Forecasting, 2005; L. X. Huang and G. A. Isaac: Integrating NWP Forecasts and Observation Data to Improve Nowcasting Accuracy, Weather and Forecasting, 2012.
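
The combination formula itself did not survive in the slide text. A plausible form of the blend implied by these definitions and by the referenced Huang and Isaac (2012) approach – an assumption, not the slide’s exact expression – is, in LaTeX notation, assuming b_i(t) is a forecast-minus-observation bias and the weights are normalised:

    F(t) = \alpha(t)\, O + \sum_i \beta_i(t)\,\bigl[ f_i(t) - b_i(t) \bigr],
    \qquad \alpha(t) + \sum_i \beta_i(t) = 1,

with α(t) largest at short lead times, where the latest observation dominates, and decreasing with t.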

FROST-2014 weather data feed for the Olympic information system. Integrated objective multi-model forecasts served as a first guess for the preparation of the “official forecasts” for the Olympic information system. A web editor was developed so that forecasters could correct the objective forecasts. ATOS requirements:
- 1-hour update frequency;
- temporal resolution: 1 hour for the current day, 3 hours for subsequent days;
- forecast outlooks for the current day and the next 5 days;
- alert warnings.

Forecasters’ subjective evaluation. Criteria: overall usefulness; forecast accuracy (T, precipitation, wind, gusts, visibility); visualization (appearance); timeliness and reliability. Comments by model:
- COSMO-Ru7 (7 km): the basic model for the forecasters; reasonable precipitation occurrence; overestimated precipitation intensity; Tmin, Tmax poor; wind poor; dT/dt OK.
- COSMO-Ru2 (2.2 km): the basic model for the forecasters; in general better than COSMO-Ru7.
- COSMO-Ru1 (1.1 km): comments are contradictory; the majority of forecasters considered COSMO-Ru2 more useful than COSMO-Ru1, while some preferred COSMO-Ru1 (helpful wind and humidity); overestimates precipitation intensity.
- COSMO-S14-EPS (7 km): precipitation reasonable; good tendencies; wind poor; was available well before the Olympics, which was helpful for getting used to this information.
- NMMB (1 km): good T and precipitation; informative visibility.
- NMMB-EPS (7 km): nice; informative visibility; precipitation reasonable; Tmin, Tmax poor.
- GEM-2.5 (2.5 km): good precipitation and humidity.
- GEM-1 (1 km): good precipitation and humidity.
- GEM-250 (250 m): good precipitation and humidity; very detailed maps.

Forecasters’ subjective evaluation (continued):
- GLAMEPS (11 km): informative tendencies; issues with absolute values.
- GLAMEPS, calibrated, frequent update (11 km): interesting and helpful.
- HarmonEPS (2.5 km): in general good in T and precipitation, but there were problems with T in anticyclones and foehn conditions.
- Harmonie (1 km): good T and precipitation.
- ALADIN LAEF (11 km): good wind, including Vmax; nice plots.
- WRF (600 m): useful but late.
- COSMO-Ru2-EPS (2.2 km): experimental.

Project social and economic impacts. Socially significant project application areas:
- education;
- understanding;
- transfer of technologies;
- practical forecasting – first guess for operational official forecasts: integrated project forecasts were used as a first guess for the data feed to the Olympic information system.

Further steps:
- Enhanced quality control of the project observations archive.
- Additional diagnostic tools and export facilities on the project web site; open access for the international research community.
- Validation and intercomparison of the participating forecasting systems, case studies and numerical experiments, assessments of the predictability of various weather elements.
- Update of the developed technologies and transfer of positive experience into operational practice.

What’s next…
- Workshop this fall in Moscow (difficult to attend for some of the participants).
- Interesting case studies for the international community.
- Joint paper?
- Lessons learned (after Vancouver and in preparation for upcoming events).

Thank you! Gratitude to all the participants!

Along with traditional verification measures, some new scores were implemented, in particular the Extremal Dependence Index (EDI): EDI = (log F – log H) / (log F + log H), where H is the hit rate and F the false alarm rate. EDI is especially recommended for low-base-rate (rare-event) thresholds, but it gives a good comparative estimate of accuracy for all thresholds (“Suggested methods for the verification of precipitation forecasts against high resolution limited area observations” by the JWGFVR; Laurie Wilson, Beth Ebert et al.). Notes:
- The pictures refer to the thresholds 0.01 and 3 mm/3h, and to the last threshold at which any of the three EDI curves remains uninterrupted in the 0-36 h interval.
- The base rate has the following approximate values: P(0.01 mm/3h) = 0.3; P(1 mm/3h) = 0.2; P(2 mm/3h) = 0.15; P(3 mm/3h) = 0.1; P(4 mm/3h) = 0.055; P(5 mm/3h) = 0.05.
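
For illustration, a minimal sketch of computing EDI from a 2x2 contingency table; the counts below are hypothetical, not project results:

    import numpy as np

    def edi(hits, misses, false_alarms, correct_negatives):
        """Extremal Dependence Index from a 2x2 contingency table."""
        hit_rate = hits / (hits + misses)                                      # H
        false_alarm_rate = false_alarms / (false_alarms + correct_negatives)  # F
        return (np.log(false_alarm_rate) - np.log(hit_rate)) / (
            np.log(false_alarm_rate) + np.log(hit_rate)
        )

    # Hypothetical counts for the event "precipitation > 5 mm/3h".
    print(edi(hits=30, misses=20, false_alarms=15, correct_negatives=435))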

[Figure: EDI for COSMO-S14-EPS, highest threshold 8 mm/3h, lower decision-making level. Blue: EDI for the 50% probability threshold; green: 66%; red: 90%.]

Conclusions on EDI:
- The Extremal Dependence Index (EDI) can be used for decision making, especially for rare events, when other scores, such as PSS, approach zero.
- Constructing EDI for different probability decision levels (50, 66 and 90%) showed that the participating EPSs demonstrate skill at all these levels up to the following precipitation thresholds: COSMO-S14-EPS and NMMB-EPS – informative up to 8 mm/3h; COSMO-RU2-EPS, HARMON-EPS and ALADIN LAEF – informative up to 6 mm/3h; GLAMEPS – informative up to 4 mm/3h.
- Sampling effects are evident for all the models, especially for higher thresholds.
- It is not possible to single out “the best ensemble-producing system”, but some conclusions can still be drawn.