Observation uncertainty in verification

Observation uncertainty in verification
ET-OWFPS, Beijing, China, 12-16 March 2018
Tom Robinson, CMC, Montreal

Contents
- Sources of uncertainty
- Experiment with surface data (B. Casati)
  - Different networks (SYNOP vs METAR)
  - Effects of thinning
  - Effects of quality control
- Representativeness and sampling (T. Haiden)
  - Analysis vs observations
- Examples of dealing with uncertainty
  - ECMWF (Zied Ben Bouallegue)
  - CMC (V. Fortin et al.)
- Reducing radiosonde error

Sources of observation error and uncertainty

Observations can be affected by different types of uncertainty:
- Measurement errors, e.g. instrument failure (abrupt or slowly degrading).
- Round-off and reporting procedures (precipitation trace from gauges reporting in inches vs mm; no report when there is no precipitation).
- Quality control (elimination of large values; rejection of precipitation measurements in strong wind, i.e. undercatch).
- Representativeness and sampling error, both in space and time: is the point observation representative of the (nearest) model grid-point value? Is the observation network homogeneous and representative of the region verified?
- Assumptions of remote-sensing retrieval algorithms.
- Uncertainties introduced by interpolation / gridding procedures.

Driving questions:
- What are the effects of observation uncertainties on verification results?
- Which observation uncertainties have the largest impacts?
- How can we account for observation uncertainties in verification practices?

B. Casati: Experimental Design (1/2)

Aim: identify the observation uncertainties that have the largest impact on verification results.

Variables and scores:
- T2m, TD2m, MSLP, wind speed: bias, RMSE, error standard deviation, correlation
- Wind direction: multi-categorical HSS
- 6-hour accumulated precipitation: FBI, HSS, ETS

NWP systems, period and domain:
- RDPS (10 km resolution) versus HRDPS (2.5 km resolution)
- Two seasons: July-August 2015 and January-February 2015
- Domain: Canada; subdomains for thinning, QC and climatology
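For reference, the continuous scores listed above (bias, RMSE, error standard deviation, correlation) follow directly from matched forecast/observation pairs. A minimal numpy sketch, with purely illustrative data, not the study's own code:

```python
import numpy as np

def continuous_scores(fcst, obs):
    """Bias, RMSE, error standard deviation and correlation
    for matched forecast/observation pairs."""
    err = fcst - obs
    bias = err.mean()                     # mean error
    rmse = np.sqrt((err ** 2).mean())     # root-mean-square error
    stdev = err.std(ddof=1)               # error std dev (bias removed)
    corr = np.corrcoef(fcst, obs)[0, 1]   # Pearson correlation
    return bias, rmse, stdev, corr

# Synthetic 2 m temperature pairs: warm-biased, noisy forecast
rng = np.random.default_rng(0)
obs = rng.normal(15.0, 5.0, 1000)
fcst = obs + rng.normal(0.5, 1.5, 1000)
print(continuous_scores(fcst, obs))
```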

Experiment Design (2/2)

Traditional verification scores, by experiment:
- Different networks: TT, TD, PN, UV, WD verified against SYNOP vs METAR
- Spatial sampling: TT, TD, PN, UV, WD, PR6h with thinning vs no thinning
- Quality control: TT, TD, PN, UV, WD, PR6h with QC vs no QC
- Representativeness and spatial sampling: TT, TD, PN, UV, WD, PR6h verified against analysis values at observation locations; filling: from the station network to the whole domain

Network and spatial sampling

[Maps of station coverage: METAR and SYNOP networks, unthinned and thinned to 1°x1° and 2°x2° grids; sample sizes of roughly 40,000, 20,000, 10,000 and 17,000, 15,000, 12,000 reports.]
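One simple way to implement the 1°x1° or 2°x2° thinning shown above is to keep a single station per grid cell. A minimal sketch under that assumption; the operational thinning algorithm is not described in the slides:

```python
import numpy as np

def thin_stations(lats, lons, cell_deg=2.0):
    """Keep the first station found in each cell_deg x cell_deg box."""
    seen = set()
    keep = []
    for i, (lat, lon) in enumerate(zip(lats, lons)):
        key = (int(np.floor(lat / cell_deg)), int(np.floor(lon / cell_deg)))
        if key not in seen:
            seen.add(key)
            keep.append(i)
    return keep

# Example: thin a random network of 1000 stations to a 2-degree grid
rng = np.random.default_rng(1)
lats = rng.uniform(40, 70, 1000)
lons = rng.uniform(-140, -50, 1000)
print(len(thin_stations(lats, lons, 2.0)), "stations kept of 1000")
```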

SYNOP vs METAR

[Plots: RDPS bias and error standard deviation for TT, TD and PN, verified against SYNOP and METAR.]

Thinning: 0° (no thinning) vs 1° vs 2°

[Plots: RDPS bias and error standard deviation for TT, TD and PN at each thinning level.]

SYNOP vs METAR with 2° thinning

[Plots: bias and error standard deviation for TT, TD and PN with 2° thinning applied to both networks.]

Season and Quality Control (QC)

[Maps: CaPA 6-h precipitation observations, summer and winter, with and without QC.]

Sample sizes:
- Summer: 1 mm with QC = 3,000; 1 mm without QC = 6,000; 10 mm with QC = 400; 10 mm without QC = 1,000
- Winter: 1 mm with QC = 800; 1 mm without QC = 4,200; 10 mm with QC = 100; 10 mm without QC = 400

Notes: in the Quebec winter, tipping-bucket gauges freeze; QC rejects precipitation observations in strong wind (undercatch).

Quality Control vs no Quality Control

[Plots: RDPS bias and error standard deviation for 6-h precipitation at the 1 mm and 10 mm thresholds, summer and winter.]
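For reference, the categorical precipitation scores used in this comparison (FBI, HSS, and the ETS listed in the experimental design) are computed from a 2x2 contingency table at each threshold. A minimal sketch of the standard definitions, with synthetic precipitation-like data for illustration:

```python
import numpy as np

def categorical_scores(fcst, obs, thresh):
    """FBI, HSS and ETS from the 2x2 contingency table at one threshold."""
    f, o = fcst >= thresh, obs >= thresh
    a = np.sum(f & o)      # hits
    b = np.sum(f & ~o)     # false alarms
    c = np.sum(~f & o)     # misses
    d = np.sum(~f & ~o)    # correct negatives
    n = a + b + c + d
    fbi = (a + b) / (a + c)                    # frequency bias index
    a_ref = (a + b) * (a + c) / n              # hits expected by chance
    ets = (a - a_ref) / (a + b + c - a_ref)    # equitable threat score
    hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return fbi, hss, ets

# Synthetic 6-h amounts, scored at the 1 mm and 10 mm thresholds
rng = np.random.default_rng(2)
obs = rng.gamma(0.5, 4.0, 5000)
fcst = obs * rng.lognormal(0.0, 0.5, 5000)
for thr in (1.0, 10.0):
    print(thr, categorical_scores(fcst, obs, thr))
```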

Results

- Verification against different networks (SYNOP vs METAR) exhibits larger differences than thinning.
- Thinning at 2° leads to more homogeneous and comparable spatial sampling and sample sizes, and reduces the SYNOP / METAR differences.
- Bias curves against SYNOP are systematically higher than those against METAR (a larger apparent over-forecast against SYNOP). Note: SYNOP stations are equipped with Stevenson screens while METAR stations are not, so SYNOP observations are colder than METAR observations.
- Quality control versus no quality control, PR6h, categorical scores:
  - Summer 2015: no significant differences in verification results.
  - Winter 2015: for the FBI, QC changes the behaviour of the curves (diurnal cycle vs constant); the 1 mm HSS is significantly better for quality-controlled observations.

Conclusions: verification against different networks, with or without thinning, and with or without quality control exhibits significant differences, which affect the interpretation of verification results (e.g. over/under-estimation for the bias, ranking for the error standard deviation).

Impact of observation uncertainty on verification results: ECMWF (Zied Ben Bouallegue)

Method: "perturbed-member" approach following Saetra et al. (2004):
- random noise is added to each ensemble member;
- the standard deviation of the observation uncertainty is estimated by data assimilation experts.

Results:
- large impact on the ensemble spread at short lead times;
- major impact on the ensemble reliability;
- decisive impact on the interpretation of the verification results.

Saetra O, Hersbach H, Bidlot J-R, Richardson DS, 2004: Effects of observation errors on the statistics for ensemble spread and reliability. Mon. Weather Rev. 132: 1487-1501.
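A minimal sketch of the perturbed-member approach: Gaussian noise with the observation-error standard deviation is added to each member before computing the spread. The ensemble dimensions and the sigma_o value are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n_members, n_cases = 20, 500
sigma_o = 1.2   # obs-error std dev; in practice estimated by DA experts

# Toy ensemble anomalies with unit spread
ens = rng.normal(0.0, 1.0, (n_members, n_cases))

# Perturbed-member approach: add random obs-error noise to each member
ens_pert = ens + rng.normal(0.0, sigma_o, ens.shape)

print("spread without obs. uncertainty:", ens.std(axis=0, ddof=1).mean())
print("spread with obs. uncertainty:   ", ens_pert.std(axis=0, ddof=1).mean())
# The perturbed spread approaches sqrt(1.0**2 + sigma_o**2) ~ 1.56
```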

RMSE [m/s] and ensemble spread [m/s] of wind speed at 500 hPa

[Plot: two experiments compared, reference (dashed) and new (solid lines), as a function of lead time [d]; curves show the RMSE, the spread without obs. uncertainty, and the spread with obs. uncertainty.]

Observation uncertainty has no impact on the ensemble mean and therefore no impact on the RMSE.

CRPS difference [m/s] of wind speed at 500 hPa

[Plot: reference exp. minus new exp. (positive means new is better) as a function of lead time [d], computed without and with observation uncertainty.]

CMC (Fortin et al.): apparent under-dispersion

[Plot: RMSE compared to the spread calculated as the average of the per-case standard deviations.]

Corrected spread using variance instead of standard deviation

- De-biased spread compared to a biased estimate of the RMSE.
- The spread is multiplied by sqrt((R+1)/R), where R = 20 is the number of EPS members.
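A minimal sketch of this correction, assuming the intent is to average the ensemble variance across cases (rather than averaging standard deviations) and then apply the sqrt((R+1)/R) inflation; the data are illustrative:

```python
import numpy as np

def debiased_spread(ens):
    """Spread as the sqrt of the mean ensemble variance,
    inflated by (R+1)/R for the finite ensemble size."""
    R = ens.shape[0]                          # number of EPS members
    var = ens.var(axis=0, ddof=1)             # per-case ensemble variance
    return np.sqrt((R + 1) / R * var.mean())  # average variance first, then sqrt

# Toy ensemble: 20 members, 1000 cases, true std dev of 2.0
rng = np.random.default_rng(7)
ens = rng.normal(0.0, 2.0, (20, 1000))
print(debiased_spread(ens))   # ~2.05, comparable to the RMSE of the ensemble mean
```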

RMSE corrected for observation error

MSE = RMSE_F^2 + RMSE_O^2

- De-biased spread compared to the de-biased RMSE.
- Radiosonde error evaluated at 6.9 m for GZ500.
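Under this decomposition, the forecast contribution can be recovered as RMSE_F = sqrt(MSE - RMSE_O^2). A sketch using the 6.9 m GZ500 radiosonde error quoted above, with synthetic data for illustration:

```python
import numpy as np

def rmse_corrected(fcst, obs, sigma_o):
    """De-biased forecast RMSE: subtract the obs-error variance from the MSE."""
    mse = np.mean((fcst - obs) ** 2)
    return np.sqrt(max(mse - sigma_o ** 2, 0.0))  # guard against negative values

# Toy GZ500 example: 10 m true forecast error, 6.9 m radiosonde error
rng = np.random.default_rng(3)
truth = rng.normal(5500.0, 100.0, 2000)
fcst = truth + rng.normal(0.0, 10.0, 2000)
obs = truth + rng.normal(0.0, 6.9, 2000)
print(np.sqrt(np.mean((fcst - obs) ** 2)))   # apparent RMSE, approx 12.1 m
print(rmse_corrected(fcst, obs, 6.9))        # corrected RMSE, approx 10 m
```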

Estimated radiosonde observation error

 hPa   TT (°C)   T-Td (°C)   UV (m/s)   GZ (m)
1000   1.5       1.8         2          4.9
 925   1.3       2.3         2.1        4.5
 850   1         3.1         2.1        4.6
 700   0.9       3.6         2.2        5.3
 500   0.8       3.9         2.3        6.9
 400   0.7       3.6         2.4        8.4
 300   0.7       3.4         2.5        10.5
 250   0.9       3.4         2.6        11.4
 200   1.1       4.6         2.8        12
 150   1.3       4.6         3          12.8
 100   1.4       4.6         3          14.8
  70   1.7       4.6         3          16.6
  50   1.7       4.6         2.9        18.5
  30   1.7       4.6         2.8        21.7
  20   1.7       4.6         3          26.8
  10   2.1       4.6         3.5        36.1
   7   2.5       4.6         4.5        45
   5   3.6       4.6         5.5        55
   3   5         4.6         4.5        70

Reducing radiosonde observation error: GZ and RH

- Both variables are currently reported as derived quantities.
- Yet relative humidity is directly measured, and GPS sondes can provide significantly more accurate position (height) data.
- The ET-OWFPS should propose to the relevant bodies that relative humidity and GPS-based geopotential height be reported from radiosondes.

Conclusions

To account for observation uncertainties in verification practices:
- identify the major sources of observation uncertainty and quantify their effects on verification;
- correct observation uncertainties → quality control;
- incorporate observation uncertainty in verification results → probabilistic approach + confidence intervals.