Evaluation software tools (Cristian Lussana and Michael Borsche), UERRA user workshop, Toulouse, 3-4 Feb 2016

Presentation transcript:

Slide 1: Evaluation software tools
Cristian Lussana (2) and Michael Borsche (1)
(1) Deutscher Wetterdienst, Offenbach, Germany
(2) Norwegian Meteorological Institute, Oslo, Norway
UERRA user workshop, Toulouse, 3-4 Feb 2016

Slide 2: Motivation
Even regional reanalyses are not perfect.
Uncertainty estimation is essential.
How far can we push resolution before we drown in noise?

Slide 3: Parameters evaluated and verification scores
Precipitation
- Time aggregation: daily
- Scores: EDA, PDF-scores, scale-decomposition approach (intensity-scale skill score)
Wind speed (10 m to 116 m)
- Time aggregation: hourly to monthly
- Scores: correlation, bias vs height, variance, seasonal cycle, daily cycle
Temperature (2 m)
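As an illustration of the point-verification measures listed for wind speed, the sketch below computes bias, RMSE, correlation and the daily and seasonal cycles in base R. It is only a sketch: the file names, column names and data layout are hypothetical and not part of the UERRA-EVA packages.

## Minimal sketch (not the EVA_stationobs code): compare a reanalysis wind-speed
## series with tower observations. File and column names are hypothetical.
obs <- read.csv("tower_windspeed.csv")       # assumed columns: time, ws_obs
rra <- read.csv("reanalysis_windspeed.csv")  # assumed columns: time, ws_rra

d <- merge(obs, rra, by = "time")
d$time <- as.POSIXct(d$time, tz = "UTC")     # assuming ISO-style time strings

bias <- mean(d$ws_rra - d$ws_obs, na.rm = TRUE)
rmse <- sqrt(mean((d$ws_rra - d$ws_obs)^2, na.rm = TRUE))
corr <- cor(d$ws_obs, d$ws_rra, use = "complete.obs")

## daily and seasonal cycles: average by hour of day and by month
d$hour  <- as.integer(format(d$time, "%H"))
d$month <- as.integer(format(d$time, "%m"))
daily_cycle    <- aggregate(cbind(ws_obs, ws_rra) ~ hour,  data = d, FUN = mean)
seasonal_cycle <- aggregate(cbind(ws_obs, ws_rra) ~ month, data = d, FUN = mean)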

Slide 4: Sharing the code
Available on GitHub.
The "Git" in GitHub: Git is an open-source version control system started by Linus Torvalds, the same person who created Linux.
The "Hub" in GitHub: Git is a command-line tool, but the center around which all things involving Git revolve, effectively the hub, is GitHub.com, where developers can store their projects and network with like-minded people.
UERRA-EVA:
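A hedged sketch of how such a repository could be installed directly from GitHub within R follows; the repository owner is a placeholder, because the slide does not give the URL.

## Hypothetical installation sketch; <github-owner> is a placeholder, since the
## repository URL is not given on the slide.
install.packages("devtools")
devtools::install_github("<github-owner>/UERRA-EVA")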

Slide 5: Evaluation against gridded observational datasets
A. Choose a gridded observational dataset (temperature, precipitation; daily time resolution) to be used as a reference for the RRA verification.
B. Development of EVA_gridobs, an R library for RRA spatial verification against gridded datasets:
i. Exploratory Data Analysis and PDF-based scores.
ii. First experiments applying a scale-decomposition technique, the intensity-scale skill score, to assess the added value of enhanced resolution in RRAs (SMHI EURO4M MESAN compared against NGCD).
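The slide does not say which PDF-based score EVA_gridobs uses; as one illustration, the sketch below computes the overlap between the empirical precipitation distributions of the reanalysis and the gridded observations (the sum over bins of the minimum relative frequency), using simulated values in place of real fields.

## Illustrative PDF-based score: distribution overlap between two samples.
## The choice of score and the simulated data are assumptions for illustration.
set.seed(1)
prec_obs <- rgamma(1e4, shape = 0.5, scale = 4)    # stand-in for gridded observations
prec_rra <- rgamma(1e4, shape = 0.6, scale = 3.5)  # stand-in for the regional reanalysis

breaks <- seq(0, max(prec_obs, prec_rra) + 1, by = 1)  # 1 mm/day bins
f_obs <- hist(prec_obs, breaks = breaks, plot = FALSE)$counts / length(prec_obs)
f_rra <- hist(prec_rra, breaks = breaks, plot = FALSE)$counts / length(prec_rra)

pdf_score <- sum(pmin(f_obs, f_rra))   # 1 = identical distributions, 0 = no overlap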

Slide 6: Conclusions
The wavelet-based scale-separation MSE skill score and scale-separation statistics are:
- informative on RRA bias, error and skill at different scales
- suitable for comparing models with different resolutions
- able to characterise RRA performance for specific intensity events (thresholded binary fields)
Scale-separation verification technique example: aggregation over JJA 2008.
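To make the scale-separation idea concrete, here is a minimal base-R sketch in the spirit of the intensity-scale approach: both fields are thresholded into binary wet/dry maps, the binary error field is decomposed with a Haar-type (2x2 block averaging) scale separation, and the MSE is split into per-scale contributions plus a domain-mean term, with a random binary forecast as the skill reference. The field size, threshold and synthetic data are assumptions; this is not the UERRA-EVA implementation.

## Sketch of a Haar-type scale separation of the MSE of thresholded binary fields
## (in the spirit of the intensity-scale skill score); not the UERRA-EVA code.

block_mean <- function(x) {
  ## average a 2^k x 2^k matrix over non-overlapping 2x2 blocks
  i1 <- seq(1, nrow(x), by = 2); i2 <- seq(2, nrow(x), by = 2)
  xr <- (x[i1, , drop = FALSE] + x[i2, , drop = FALSE]) / 2
  (xr[, i1, drop = FALSE] + xr[, i2, drop = FALSE]) / 2
}

upsample <- function(x) {
  ## repeat every element into a 2x2 block (back to the finer grid)
  x[rep(seq_len(nrow(x)), each = 2), rep(seq_len(ncol(x)), each = 2)]
}

scale_separated_mse <- function(obs, rra, threshold) {
  o <- (obs >= threshold) * 1              # binary observed field
  f <- (rra >= threshold) * 1              # binary reanalysis field
  z <- f - o                               # binary error field
  n_scales  <- as.integer(log2(nrow(z)))
  mse_scale <- numeric(n_scales)
  a <- z
  for (j in seq_len(n_scales)) {
    a_coarse     <- block_mean(a)          # error field smoothed to the next coarser scale
    detail       <- a - upsample(a_coarse) # variability removed at this smoothing step
    mse_scale[j] <- mean(detail^2)         # MSE contribution of this scale
    a <- a_coarse
  }
  ## skill reference: expected MSE of a random binary forecast with the same base rates
  mse_random <- mean(f) * (1 - mean(o)) + mean(o) * (1 - mean(f))
  list(mse_scale = mse_scale,              # per-scale error contributions (fine to coarse)
       mse_bias  = mean(z)^2,              # domain-mean (large-scale) error
       skill     = 1 - (sum(mse_scale) + mean(z)^2) / mse_random)
}

## usage with synthetic 64 x 64 precipitation fields and a 1 mm/day threshold
set.seed(2)
obs <- matrix(rgamma(64 * 64, shape = 0.4, scale = 5), 64, 64)
rra <- pmax(obs + matrix(rnorm(64 * 64, sd = 2), 64, 64), 0)
scale_separated_mse(obs, rra, threshold = 1)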

Slide 7: UERRA-EVA: Evaluation software tools
Assess the added value of enhanced resolution in RRAs: wavelet-based scale-separation MSE skill score.

Slide 8: GitHub repository: EVA_stationobs R package
- reading reanalysis and observation data
- calculating and plotting statistical measures
- published on GitHub
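For orientation only, the following sketch shows one way to read a reanalysis field from a NetCDF file with the ncdf4 package and pair it with a station series; the file names, variable names, regular-grid assumption and station coordinates are assumptions, and the actual EVA_stationobs functions may differ.

## Illustrative only: not the EVA_stationobs API. File names, variable names and
## the assumption of a regular lon/lat grid are hypothetical.
library(ncdf4)

nc  <- nc_open("regional_reanalysis_ws10m.nc")
lon <- ncvar_get(nc, "longitude")
lat <- ncvar_get(nc, "latitude")
ws  <- ncvar_get(nc, "wind_speed")        # assumed dimensions: lon x lat x time
nc_close(nc)

stn_lon <- 9.7; stn_lat <- 52.4           # illustrative station position
i <- which.min(abs(lon - stn_lon))        # nearest grid point (regular grid assumed)
j <- which.min(abs(lat - stn_lat))
ws_rra <- ws[i, j, ]                      # reanalysis series at that grid point

obs <- read.csv("station_ws10m.csv")      # assumed columns: time, ws_obs,
                                          # on the same time steps as the reanalysis
plot(obs$ws_obs, ws_rra,
     xlab = "observed wind speed [m/s]", ylab = "reanalysis wind speed [m/s]")
abline(0, 1, lty = 2)                     # 1:1 line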

Slide 9: (figure only, no text on this slide)

Slide 10: Scores
Correlation, bias, RMSE, anomalies, PDF-score, frequency distribution.
Contingency-table-based scores for extreme event analysis:
- hit rate, false alarm rate, false alarm ratio, HKS, TS, ETS, frequency bias index, HSS, accuracy, odds ratio, EDI, SEDI
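As a concrete reading of the list above, here is a hedged base-R sketch that builds the 2x2 contingency table for one threshold and returns the listed scores; the formulas are the standard textbook definitions and the function is mine, not necessarily how EVA_stationobs implements them.

## Sketch of the contingency-table scores for one threshold (standard definitions);
## not necessarily the EVA_stationobs implementation.
contingency_scores <- function(obs, rra, threshold) {
  oe <- obs >= threshold                  # observed event
  fe <- rra >= threshold                  # reanalysis event
  a <- sum(fe & oe)                       # hits
  b <- sum(fe & !oe)                      # false alarms
  c <- sum(!fe & oe)                      # misses
  d <- sum(!fe & !oe)                     # correct negatives
  n <- a + b + c + d
  pod   <- a / (a + c)                    # hit rate
  pofd  <- b / (b + d)                    # false alarm rate
  a_ref <- (a + b) * (a + c) / n          # hits expected by chance
  c(hit_rate          = pod,
    false_alarm_rate  = pofd,
    false_alarm_ratio = b / (a + b),
    HKS               = pod - pofd,       # Hanssen-Kuipers (true) skill statistic
    TS                = a / (a + b + c),  # threat score
    ETS               = (a - a_ref) / (a + b + c - a_ref),
    freq_bias         = (a + b) / (a + c),
    HSS               = 2 * (a * d - b * c) /
                        ((a + c) * (c + d) + (a + b) * (b + d)),
    accuracy          = (a + d) / n,
    odds_ratio        = (a * d) / (b * c),
    EDI               = (log(pofd) - log(pod)) / (log(pofd) + log(pod)),
    SEDI              = (log(pofd) - log(pod) - log(1 - pofd) + log(1 - pod)) /
                        (log(pofd) + log(pod) + log(1 - pofd) + log(1 - pod)))
}

## usage with synthetic hourly wind speeds and a 15 m/s threshold
set.seed(3)
obs <- rweibull(5000, shape = 2, scale = 7)
rra <- pmax(0.9 * obs + rnorm(5000, sd = 1.5), 0)
contingency_scores(obs, rra, threshold = 15)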

Slide 11: Vertical variability

Slide 12: Storm and calm period

Slide 13: Skill scores
12 skill scores based on the contingency table.
Events defined as exceeding or falling below a threshold.
Hit rate, false alarm rate, false alarm ratio, true skill statistic, threat score, equitable threat score, frequency bias, Heidke skill score.

Slide 14: Skill scores
Hit rate vs false alarm ratio of hourly means at Hannover.
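The slide's figure cannot be reproduced here, but the sketch below shows how such a hit rate vs false alarm ratio diagram can be built by sweeping a range of thresholds; the synthetic data stand in for the Hannover series.

## Sketch of a hit rate vs false alarm ratio diagram over wind-speed thresholds;
## synthetic data stand in for the hourly means used on the slide.
set.seed(4)
obs <- rweibull(8760, shape = 2, scale = 7)        # one year of hourly means
rra <- pmax(0.9 * obs + rnorm(8760, sd = 1.5), 0)

thresholds <- seq(2, 16, by = 2)                   # m/s
hr  <- numeric(length(thresholds))
far <- numeric(length(thresholds))
for (k in seq_along(thresholds)) {
  oe <- obs >= thresholds[k]
  fe <- rra >= thresholds[k]
  hr[k]  <- sum(fe & oe) / sum(oe)                 # hit rate
  far[k] <- sum(fe & !oe) / sum(fe)                # false alarm ratio
}
plot(far, hr, type = "b", xlim = c(0, 1), ylim = c(0, 1),
     xlab = "false alarm ratio", ylab = "hit rate")
text(far, hr, labels = thresholds, pos = 4, cex = 0.7)   # threshold labels [m/s]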

Slide 15: Summary
Publicly available R package:
- read station and mast tower observations
- read different regional reanalyses
- calculate statistical measures to assess uncertainty
- plot analysis results

Slide 16: Any questions?

Slide 17: Questions to the users
- R scripts OK?
- Git/GitHub OK?
- Which scores (…) have priority?
- Verification on local, regional, national, or European scale?