On the spatial verification of FROST-2014 precipitation forecast fields
Anatoly Muraviev (1), Anastasia Bundel (1), Dmitry Kiktev (1), Nikolay Bocharnikov (2), and Tatiana Bazlova (2)
(1) Hydrometcentre of Russia/Roshydromet, Moscow; (2) Institute of Radar Meteorology, Saint-Petersburg, Russia
EMS/ECAM, 07–11 September 2015, Sofia, Bulgaria

Outline
1. Radar and model data used
2. Neighborhood method: R SpatialVx hoods2d function
3. Contiguous Rain Area: R SpatialVx craer function
4. Conclusions on the application of spatial methods to precipitation during the Sochi-2014 Games

The one-hour radar precipitation analysis was prepared by IRAM (from Reid et al., 4th FROST meeting).

Area of the study: COSMO-Ru2 domain and COSMO-Ru1 domain. 349 lon points × 481 lat points with regular lat-lon increments: 1 grid increment by longitude = 111 km/degree × increment ≈ 930 m; 1 grid increment by latitude = cos(43°35′) × 930 m ≈ 0.72 × 930 m ≈ 670 m. COMPLEX TERRAIN!

All the model fields were interpolated onto the radar grid using GrADS (function lterp): COSMO-Ru1 (1 km), COSMO-Ru2 (2 km), NMMB (1 km), HARMONIE (1 km), GEM-1 (1 km), GEM-2.5 (2.5 km). GEM-0.25: domain too small! (An R sketch of the regridding step is given below.)
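As an aside, the regridding step can be sketched in R (this is not the GrADS lterp call used in the study; model_lon, model_lat, model_precip, radar_lon and radar_lat are hypothetical placeholders for the model grid, the model field and the target radar grid):
library(fields)
# Bilinear interpolation of one model precipitation field onto the radar lat-lon grid
fcst_on_radar <- interp.surface.grid(
  list(x = model_lon, y = model_lat, z = model_precip),   # source model grid and field
  list(x = radar_lon, y = radar_lat)                      # target radar grid
)$z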

18 Feb 2014, 09 UTC, cold front: all models underestimated the maximum precipitation and gave no precipitation over the sea. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE, RADAR]

18 Feb 2014, 17 UTC: all models predicted the expanding precipitation area, but not the maximum value. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE, RADAR]

hoods2d: various scores were calculated, but the FSS (Roberts and Lean, 2008) is presented here as one of the most useful neighborhood statistics (see, e.g., the COSMO priority project INSPECT).
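The FSS compares forecast and observed event fractions in neighborhoods of increasing size: FSS = 1 − FBS / FBS_ref, where FBS is the fractions Brier score and FBS_ref its largest possible value for the given fields. A minimal sketch of the computation with SpatialVx is given below; obs and fcst are assumed to be precipitation matrices on the common ~1 km radar grid, and the thresholds and neighborhood sizes are illustrative:
library(SpatialVx)
# Verification object: radar analysis vs. one model field on the same grid
hold <- make.SpatialVx(obs, fcst, thresholds = c(0.1, 0.5, 1, 3),
                       field.type = "Precipitation", units = "mm/h",
                       data.name = c("Radar", "Model"))
# FSS for square neighborhoods of 1, 3, 9, 17 and 33 grid lengths
look <- hoods2d(hold, which.methods = "fss", levels = c(1, 3, 9, 17, 33))
plot(look)   # FSS as a function of threshold and neighborhood size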

FSS, 18 Feb 2014, 09 UTC. [Panels: COSMO-Ru1, GEM-1, HARMONIE, NMMB, COSMO-Ru2, GEM-2.5] Note: the 2 km and 2.5 km models are interpolated onto the ~1 km grid! COSMO-Ru2 is the best here: its FSS is useful at all scales except at the highest threshold (precip ≥ 3 mm/h). GEM-1 is good for the middle thresholds (0.5 and 1 mm/h).

FSS, 18 Feb 2014, 17 UTC: NMMB and HARMONIE have comparably high skill; COSMO-Ru2 loses its skill at the higher thresholds. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE]

22 Jan 2014, 23 UTC, intense precipitation: good forecast by all models; COSMO-Ru2 and GEM-1 are the leaders. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE; one panel marked "Not avail. until 29 Jan"]

29 Jan 2014: GEM-1, HARMONIE and COSMO-Ru2 are good, but COSMO-Ru2 gives a very bad forecast of precip ≥ 3 mm/h; NMMB is the worst here. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE]

11 March 2014, 09 UTC: not enough cases to run hoods2d! All models: bad forecast of precip ≥ 3 mm/h. [Panels: COSMO-Ru2, COSMO-Ru1, GEM-1, GEM-2.5, NMMB, HARMONIE]

Neighborhood method: conclusions
All the models underestimated the maximum precipitation.
According to the FSS, COSMO-Ru2 tends to be better than COSMO-Ru1, and GEM-1 is better than GEM-2.5.
The higher thresholds are forecast poorly.
We need to:
- aggregate the neighborhood scores over all cases to estimate the systematic behavior of the models (see the sketch after this list);
- include the cases where precipitation was predicted but not observed;
- analyze timing errors.
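For the first point, one simple option is a straight case average of the FSS matrices (aggregating the underlying fractions Brier score components over cases would be the more rigorous route). A sketch, assuming fss_list is a hypothetical list of per-case FSS matrices with an identical threshold × scale layout:
# Case-averaged FSS, same thresholds x neighborhood-sizes layout as each element
fss_mean <- Reduce(`+`, fss_list) / length(fss_list)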

CRA – Contiguous Rain Area (Ebert and McBride, 2000)
MSE_total = MSE_displacement + MSE_volume + MSE_pattern
MSE_displacement = MSE_total − MSE_shifted
MSE_volume = (F − X)^2, where F and X are the CRA mean forecast and observed values after the shift
MSE_pattern = MSE_shifted − MSE_volume
The CRA concept is easy to understand, but there are many important issues and nuances in applying it.
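A minimal sketch of this decomposition for a single CRA in R, assuming fcst, obs and fcst_shifted are hypothetical matrices restricted to the CRA, with fcst_shifted the forecast after the best-fit displacement:
mse_total        <- mean((fcst - obs)^2)                  # total error
mse_shifted      <- mean((fcst_shifted - obs)^2)          # error remaining after the shift
mse_displacement <- mse_total - mse_shifted               # error due to displacement
mse_volume       <- (mean(fcst_shifted) - mean(obs))^2    # error due to volume (mean bias within the CRA)
mse_pattern      <- mse_shifted - mse_volume              # residual fine-scale pattern error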

R SpatialVx craer function
Convolution-threshold technique: the field is first smoothed with a convolution smoother and then turned into a binary image in which everything above a given threshold is set to one (Davis et al., 2006).
minboundmatch function: each object is paired to only one object according to the smallest minimum boundary separation.
library(SpatialVx)
hold <- make.SpatialVx(xx, yy, map = TRUE, loc = zz, field.type = "Precipitation",
                       units = "mm/h", data.name = c("Sochi_frcsts", "R-Akhun", "GEM25"))
look <- convthresh(hold, smoothpar = 3, thresh = 1)   # identify features by convolution-threshold
look2 <- minboundmatch(look)                          # pair features by minimum boundary separation
craer(look2, type = "fast", verbose = TRUE)           # CRA error decomposition for the matched pairs

Pairs of matched objects from craer, 18 Feb 2014, 09 UTC. Colors indicate the 1st pair, the 2nd pair, etc.; threshold: 1 mm/h. [Panels: COSMO-Ru1, COSMO-Ru2, HARMONIE, NMMB, GEM-1, GEM-2.5] A human would separate this object.

COSMO-Ru1: according to these scores, most of the total MSE comes from the small-scale pattern errors for most object pairs.

CRA threshold: 2 mm/h (3 mm/h gives too many small objects!). Why are these features paired for this model? Why is the blue object not paired with the red one?

CRA: questions
- There are many small objects. Can we set a limit on the maximum number of objects?
- Two apparently similar GEM fields: different model objects are paired with the same radar object. Should there be a condition on object area when pairing (the largest paired to the largest)?
- Try other pairing methods (e.g., deltamm) with merging of objects? (a sketch follows after this list)
- This study shows that we are not yet able to give general CRA statistics on location, volume, and fine-scale structure, nor can we yet rank the models according to these statistics.
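On the deltamm question, a possible alternative matching step with SpatialVx is sketched below, assuming look is the feature object returned by convthresh above; deltamm merges and matches features using Baddeley's delta metric, and whether its output can be passed directly to craer would still need to be checked:
look_dm <- deltamm(look)   # merge and match features via Baddeley's delta metric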

Thank you for your attention!