Comparing and Contrasting Post-processing Approaches to Calibrating Ensemble Wind and Temperature Forecasts
Tom Hopson, Luca Delle Monache, Yubao Liu, Gregory Roux, Wanli Wu, Will Cheng, Jason Knievel, Sue Haupt

Army Test and Evaluation Command: Dugway Proving Ground

Dugway Proving Ground, Utah: example temperature (T) thresholds. The raw ensemble spread includes both random and systematic differences between members, so the fraction of members exceeding a threshold is not an actual chance of exceedance unless the ensemble is calibrated.

Xcel Energy service areas: Northern States Power (NSP), Public Service of Colorado (PSCO), Southwestern Public Service (SPS). 50+ wind farms, ~3200 MW; 3.4 million electric customers; annual revenue $11B. Copyright 2010 University Corporation for Atmospheric Research.

WRF RTFDDA model domains: D1 = 30 km, D2 = 10 km; 41 vertical levels; 0-48 hr forecasts. The ensemble system (30 members) is built on Real-Time Four-Dimensional Data Assimilation (RTFDDA) and varies multi-models, lateral B.C.s, model physics, and external forcing. Contact Yubao Liu for further details.

Goals of an EPS:
- Predict the observed distribution of events and atmospheric states
- Predict uncertainty in the day's prediction
- Predict the extreme events that are possible on a particular day
- Provide a range of possible scenarios for a particular forecast

Outline
I. Brief overview of: 1) quantile regression (QR), 2) logistic regression (LR), 3) umbrella post-processing procedure, 4) "analog Kalman filter" (ANKF)
II. 2nd-moment calibration via rank histograms
III. Skill score comparisons and improvements with increased hindcast data
IV. Example of blending approaches
V. Conclusions

Example of Quantile Regression (QR). Our application: fitting T quantiles using QR conditioned on: 1) ranked forecast ensemble, 2) ensemble mean, 3) ensemble median, 4) ensemble stdev, 5) persistence. (Hopson and Hacker 2012)
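The slides show QR only graphically; as a minimal numpy-only sketch of the idea, the code below evaluates the pinball (quantile) loss that quantile regression minimizes, on synthetic temperatures. All variable names and values here are hypothetical, not from the talk.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss: the asymmetric penalty that is minimized
    when q_pred is the tau-quantile of y."""
    err = y - q_pred
    return np.mean(np.where(err >= 0, tau * err, (tau - 1) * err))

rng = np.random.default_rng(0)
y = rng.normal(loc=285.0, scale=2.0, size=5000)   # synthetic temperatures [K]

# For a constant predictor, the loss is minimized at the empirical quantile
tau = 0.9
candidates = np.linspace(280.0, 292.0, 1201)
losses = [pinball_loss(y, c, tau) for c in candidates]
best = candidates[np.argmin(losses)]
print(best, np.quantile(y, tau))                  # these should nearly coincide
```

In the full QR fit, `q_pred` is a linear function of the regressors (ensemble mean, spread, persistence, ...) rather than a constant, but the cost function is the same.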

Logistic Regression for probability of exceedance (climatological thresholds)
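As a rough sketch of this step, the code below fits a logistic regression for the probability that the observation exceeds a climatological threshold, conditioned on the ensemble mean. The data, names, and the plain Newton-Raphson fit are illustrative stand-ins, not the operational implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
ens_mean = rng.normal(8.0, 3.0, n)            # hypothetical forecast ensemble mean [m/s]
obs = ens_mean + rng.normal(0.0, 2.0, n)      # synthetic verifying observations
threshold = 10.0                              # a climatological threshold
y = (obs > threshold).astype(float)           # binary exceedance events

# Fit P(obs > threshold | ens_mean) = 1 / (1 + exp(-(b0 + b1 * ens_mean)))
X = np.column_stack([np.ones(n), ens_mean])
beta = np.zeros(2)
for _ in range(25):                           # Newton-Raphson on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)                         # logistic variance weights
    beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))

def prob_exceed(m):
    """Calibrated exceedance probability for an ensemble-mean value m."""
    return 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * m)))
```

Repeating the fit over a set of climatological thresholds yields the conditional exceedance (or non-exceedance) probabilities used in the next slides.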

Step 1: determine climatological quantiles (the climatological PDF). Step 2: calculate conditional probabilities for each climatological quantile. Step 3: use the conditioned CDF to interpolate the desired quantiles. Final result: a "sharper" posterior PDF, represented by the interpolated quantiles, relative to the climatological prior. (Hopson and Hacker 2012)
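Step 3 can be sketched numerically. The quantile and probability values below are invented for illustration, assuming Step 2 has already produced a conditional non-exceedance probability at each climatological quantile:

```python
import numpy as np

# Hypothetical climatological T quantiles and the conditional (forecast-given)
# probability of not exceeding each of them, from Steps 1 and 2
clim_quantiles = np.array([281.0, 283.5, 285.0, 286.5, 289.0])   # [K]
cond_cdf = np.array([0.02, 0.10, 0.45, 0.85, 0.99])              # P(T <= q | forecast)

# Step 3: interpolate the conditioned CDF to read off desired posterior quantiles
desired = np.array([0.25, 0.50, 0.75])
posterior_quans = np.interp(desired, cond_cdf, clim_quantiles)
print(posterior_quans)
```

The interpolated quantiles span a narrower temperature range than the climatological ones at the same probability levels, which is the "sharpening" shown on the slide.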

Step 1: determine climatological quantiles (the climatological PDF). Step 2: for each quantile, use forward step-wise cross-validation to select the best regressor set from: 1) reforecast ensemble, 2) ensemble mean, 3) ensemble stdev, 4) persistence, 5) LR quantile. Selection requires: a) minimizing the QR cost function, and b) passing a binomial distribution test at 95% confidence; if the requirements are not met, retain the climatological "prior". Step 3: segregate forecasts based on ensemble dispersion and refit the models (Step 2) for each dispersion range. Final result: a "sharper" posterior PDF represented by the interpolated quantiles. (Hopson and Hacker 2012)
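The forward step-wise selection in Step 2 can be sketched as below. For brevity this toy uses a cross-validated squared-error cost where the slides use the QR cost function plus a binomial significance test, and all regressors and data are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
ens_mean = rng.normal(285.0, 2.0, n)                # hypothetical regressors
ens_std = np.abs(rng.normal(1.5, 0.3, n))
persistence = ens_mean + rng.normal(0.0, 3.0, n)    # adds nothing beyond ens_mean
noise = rng.normal(0.0, 1.0, n)                     # uninformative regressor
obs = ens_mean + 0.5 * ens_std + rng.normal(0.0, 1.0, n)

candidates = {"ens_mean": ens_mean, "ens_std": ens_std,
              "persistence": persistence, "noise": noise}

def cv_mse(cols, y, k=5):
    """k-fold cross-validated MSE of a linear fit on the chosen regressors."""
    X = np.column_stack([np.ones(len(y))] + cols)
    folds = np.array_split(np.arange(len(y)), k)
    sse = 0.0
    for f in folds:
        tr = np.setdiff1d(np.arange(len(y)), f)
        beta, *_ = np.linalg.lstsq(X[tr], y[tr], rcond=None)
        sse += np.sum((y[f] - X[f] @ beta) ** 2)
    return sse / len(y)

selected = []
best = cv_mse([], obs)            # intercept-only "climatological" baseline
pool = dict(candidates)
while pool:
    scores = {nm: cv_mse([candidates[s] for s in selected] + [x], obs)
              for nm, x in pool.items()}
    nm = min(scores, key=scores.get)
    if scores[nm] >= best:        # stop once no candidate improves the CV score
        break
    selected.append(nm)
    best = scores[nm]
    del pool[nm]

print(selected, round(best, 3))
```

Because the baseline is the climatological fit and candidates are only accepted when they improve the cross-validated score, the selected model can do no worse than climatology, which is the property the summary slide emphasizes.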

National Security Applications Program, Research Applications Laboratory. Significant calibration regressors at 3-hr and 42-hr lead times, station DPG S01.

RMSE of ensemble members at 3-hr and 42-hr lead times, station DPG S01.

Analog Kalman Filter (ANKF): a deterministic method applied to each individual ensemble member, with the Kalman-filter weighting run in analog space (Delle Monache et al. 2010). The KF weights are trained on observation-prediction pairs from the preceding days (day -7 through day -1) to correct the forecast at t = 0.

In "analog space", the past forecasts are reordered from farthest to closest analog of the current prediction, and the KF correction is applied along that ordering (ANKF). Note: this procedure is applied independently at each observation location and for a given forecast lead time. (Delle Monache et al. 2010)
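A stripped-down sketch of the analog idea only (the KF weighting in analog space is omitted): today's forecast is replaced by the mean observation that verified its closest past forecasts. All data and names below are synthetic and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n_train = 400
# Past forecasts and verifying obs at one station and one lead time,
# with a speed-dependent bias the analogs can learn
f_train = rng.uniform(0.0, 15.0, n_train)               # forecast wind speed [m/s]
o_train = 0.85 * f_train + rng.normal(0.0, 0.5, n_train)

def analog_correct(f_new, f_train, o_train, k=10):
    """Replace today's forecast with the mean observation of its
    k closest past forecasts (its 'analogs')."""
    nearest = np.argsort(np.abs(f_train - f_new))[:k]
    return float(o_train[nearest].mean())

corrected = analog_correct(12.0, f_train, o_train)   # pulled toward the locally debiased value
```

ANKF goes one step further: instead of simply averaging the analogs' observations, it runs the Kalman-filter error-weighting over the analog-ordered sequence.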

Outline
I. Brief overview of: 1) quantile regression (QR), 2) logistic regression (LR), 3) umbrella post-processing procedure, 4) "analog Kalman filter" (ANKF)
II. 2nd-moment calibration via rank histograms
III. Skill score comparisons and improvements with increased hindcast data
IV. Example of blending approaches
V. Conclusions

42-hr dewpoint time series, station DPG S01: before calibration vs. after calibration (QR).

Rank histograms for 15-hr lead wind forecasts: original ensemble, QR, LR, and ANKF.
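A rank histogram tallies where each observation falls among the sorted ensemble members; a flat histogram indicates a statistically consistent ensemble, while a U shape indicates under-dispersion. A minimal numpy sketch, on synthetic data:

```python
import numpy as np

def rank_histogram(ens, obs):
    """ens: (n_times, n_members); obs: (n_times,).
    Counts of the observation's rank within each sorted ensemble."""
    n_members = ens.shape[1]
    ranks = (ens < obs[:, None]).sum(axis=1)      # rank 0 .. n_members
    return np.bincount(ranks, minlength=n_members + 1)

rng = np.random.default_rng(4)
n_t, n_m = 5000, 30
obs = rng.normal(0.0, 1.0, n_t)
ens_good = rng.normal(0.0, 1.0, (n_t, n_m))       # statistically consistent
ens_under = rng.normal(0.0, 0.5, (n_t, n_m))      # under-dispersive

flat_counts = rank_histogram(ens_good, obs)       # roughly flat
u_counts = rank_histogram(ens_under, obs)         # U-shaped: tails overpopulated
```

The skill-score slides that follow collapse such histograms to a scalar measure of deviation from flatness.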

Skill Scores. Skill measures used: 1) rank histogram (converted to a scalar measure), 2) root mean square error (RMSE), 3) ranked probability score (RPS), 4) relative operating characteristic (ROC) curve. All are compared against the original ensemble forecast with its bias removed, which serves as the "reference forecast".
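For concreteness, here is a small sketch of one of these measures, the ranked probability score, on an invented three-category example; the skill-score form compared in the following slides is RPSS = 1 - RPS/RPS_reference.

```python
import numpy as np

def rps(forecast_probs, obs_category):
    """Ranked probability score: squared distance between the cumulative
    forecast distribution and the cumulative (step-function) observation."""
    cdf_f = np.cumsum(forecast_probs)
    cdf_o = (np.arange(len(forecast_probs)) >= obs_category).astype(float)
    return float(np.sum((cdf_f - cdf_o) ** 2))

# Hypothetical 3-category forecasts (below / near / above normal), event in category 2:
sharp_right = rps(np.array([0.1, 0.2, 0.7]), 2)   # confident and correct: 0.10
flat = rps(np.array([1.0, 1.0, 1.0]) / 3.0, 2)    # climatology-like: ~0.556
sharp_wrong = rps(np.array([0.7, 0.2, 0.1]), 2)   # confident and wrong: 1.30
```

Lower is better, and because RPS uses cumulative distributions it rewards probability placed in categories close to the observed one, unlike the plain Brier score.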

Skill score comparison for wind farm CEDC, 3-hr lead forecasts: QR (blue), ANKF (red), LR (green). Reference forecast: original wind speed ensemble with bias removed. Data size: 900 points.

Skill scores (rank-histogram scalar and RMSE, for QR and LR) vs. training data size: upper dashed = 900 points, solid = 600 points, lower dashed = 300 points. Reference forecast: original wind speed ensemble with bias removed.

Skill scores vs. training data size (cont.): ROC and RPSS, for QR and LR. Upper dashed = 900 points, solid = 600 points, lower dashed = 300 points. Reference forecast: original wind speed ensemble with bias removed.

Skill comparison for wind farm TWBT: RPS, ROC, RMSE, and Brier score for ANKF, QR, and the blended QR + ANKF.

Wind farm TWBT, 6-hr lead: original ensemble, ANKF, QR, and QR + ANKF.

Summary
- The "step-wise cross-validation"-based post-processing framework ensures forecast skill no worse than climatology or persistence.
- It also provides an umbrella for blending multiple post-processing approaches and multiple regressors, and for diagnosing their utility under a variety of cost functions.
- Quantile regression and logistic regression are useful tools for improving the 2nd moment of ensemble distributions.
- Significant skill gains are seen with increasing amounts of hindcast data, across a variety of skill measures.
- Blending post-processing approaches (e.g. ANKF and QR) can further enhance final forecast skill by capturing the "best of both worlds".