The University of Washington Mesoscale Short-Range Ensemble System Eric P. Grimit, F. Anthony Eckel, Richard Steed, Clifford F. Mass University of Washington.

The UW Mesoscale Ensemble System: The Big Picture The UW Mesoscale Ensemble System was born out of our experience with high-resolution prediction: MM5 run at 36, 12 and 4 km twice a day for many years. High-resolution forecasts can produce highly realistic mesoscale structures, but there is considerable uncertainty in initial conditions and physics. High resolution can amplify such uncertainty and thus it is dangerous to provide users with high resolution output for direct and literal use. Mesoscale ensembles are probably the best way to provide the probabilistic information required by users…information they are currently denied…but there are significant roadblocks that need to be overcome.

Mesoscale Ensembles: In Their Infancy At a national level, mesoscale ensembles are at a very primitive stage: NCEP’s system at 48-km grid spacing is not really on the mesoscale and uses a method (breeding) that is probably not ideal for short-range ensembles. Operational SREFs have not had bias removal or proper post-processing. There have been a few short-term ensemble experiments (e.g., SAMEX), generally for convection. The value of mesoscale SREF has not been proven, useful intuitive products are lacking, and there is little experience in the user community. But most of us are convinced that this is the way to go.

The UW Mesoscale Ensemble System: Essential Features A true mesoscale system: 36- and 12-km grid spacing, out to 48 h. Testing the value of mesoscale ensembles over a different environment: eastern Pacific, coastal zone, an area of significant terrain, moist to desert locations. The diversity generation is based on using the varying initial conditions and boundary conditions from a broad range of operational synoptic models, all with differing data assimilation, model structure and numerics, and physics. This finesses the boundary-condition problem. The approach is politically unacceptable to many operational centers (who don’t like to be dependent on others), but probably represents a high bar for others to attempt to better. Additional diversity comes from varying model physics and surface boundary conditions.

UW Mesoscale Ensemble System
- Single limited-area mesoscale modeling system (MM5)
- 2-day (48-h) forecasts initialized at 0000 UTC in real time since January; now run twice a day
- 36- and 12-km domains
Configurations of the MM5 short-range ensemble grid domains: (a) outer 151 × 127 domain with 36-km horizontal grid spacing; (b) inner 103 × 100 domain with 12-km horizontal grid spacing.

“Native” Models/Analyses Available (resolution given at 45°N)

avn: Global Forecast System (GFS), National Centers for Environmental Prediction. Spectral; computational T254/L64 (~55 km); distributed 1.0°/L14 (~80 km); analysis: SSI 3D-Var
cmcg: Global Environmental Multi-scale (GEM), Canadian Meteorological Centre. Finite difference; computational 0.9° × 0.9° (~70 km); distributed L11 (~100 km); analysis: 3D-Var
eta: limited-area mesoscale model, National Centers for Environmental Prediction. Finite difference; computational 32 km/L45; distributed 90 km/L37; analysis: SSI 3D-Var
gasp: Global AnalysiS and Prediction model, Australian Bureau of Meteorology. Spectral; computational T239/L29 (~60 km); distributed 1.0°/L11 (~80 km); analysis: 3D-Var
jma: Global Spectral Model (GSM), Japan Meteorological Agency. Spectral; computational T106 (~135 km); distributed L13 (~100 km); analysis: OI
ngps: Navy Operational Global Atmos. Pred. System, Fleet Numerical Meteorological & Oceanographic Cntr. Spectral; computational T239/L30 (~60 km); distributed 1.0°/L14 (~80 km); analysis: OI
tcwb: Global Forecast System, Taiwan Central Weather Bureau. Spectral; computational T79/L18 (~180 km); distributed 1.0°/L11 (~80 km); analysis: OI
ukmo: Unified Model, United Kingdom Meteorological Office. Finite difference; computational 5/6° × 5/9°/L30 (~60 km); distributed same/L12; analysis: 3D-Var

UW Ensemble System
- Made use of the infrastructure already in place (grids, data feeds, systems and application programmers), and took advantage of the natural parallelization on large clusters, which are ideal for ensemble work.
- The system was built and maintained by two exceptional graduate students (Eric Grimit and Tony Eckel), plus key staff members (Rick Steed, David Ovens).
- Was designed as a real-time system from the beginning, with verification as a core component.
- Had two operational groups as prime subjects: the Seattle NWS office and the Navy Whidbey Island forecasting detachment.
- Had strong partners in UW Statistics and APL (under MURI support).

Computer Infrastructure: Linux Dual-Processor Clusters
“Ensemblers” Eric Grimit (left) and Tony Eckel (right) are beside themselves over the acquisition of the new 20-processor Athlon cluster.

Key Goals: End-to-End Evaluation
- To build a viable, operational mesoscale SREF
- To verify it using both deterministic (ensemble mean) and probabilistic approaches
- To determine whether a system with members of varying skill can be combined to produce reliable and usefully sharp ensemble pdfs
- To determine the best approaches for post-processing (e.g., bias removal, calibration, optimal pdf generation)
- To determine whether the ensemble system can be used to predict deterministic and probabilistic skill
- To create ensemble-based products that are valuable to users
- To learn how to optimally combine high-resolution deterministic forecasts and lower-resolution ensembles

UW Ensemble Web Page

48-h Probabilistic Forecast of 1 inch of Precipitation in 12 h

UW’s Ensemble of Ensembles

Name  | # of EF Members | Type | Initial Conditions                     | Forecast Model(s)        | Cycle    | Domain
ACME  | 17              | SMMA | 8 ind. analyses, 1 centroid, 8 mirrors | “Standard” MM5           | 00Z      | 36 km, 12 km
UWME  | 8               | SMMA | 8 independent analyses                 | “Standard” MM5           | 00Z      | 36 km, 12 km
UWME+ | 8               | PMMA | 8 independent analyses                 | 8 MM5 variations         | 00Z      | 36 km, 12 km
PME   | 8               | MMMA | 8 independent analyses                 | operational, large-scale | 00Z, 12Z | 36 km

ACME, UWME, and UWME+ are homegrown; the PME is imported.

ACME: Analysis-Centroid Mirroring Ensemble
PME: Poor Man’s Ensemble
MM5: 5th-Generation PSU/NCAR Mesoscale Modeling System
SMMA: Single-Model Multi-Analysis
PMMA: Perturbed-Model Multi-Analysis
MMMA: Multi-Model Multi-Analysis

Multi-Analysis, Mixed Physics: UWME+ (see Eckel 2003 for further details)

Research Dataset
- Total of 129 48-h forecasts (31 Oct 2002 – 28 Mar 2003), all initialized at 00Z; missing forecast case days are excluded.
- Domains: 36-km (151 × 127) and 12-km (101 × 103).
- Analyzed parameters:
  - 36-km domain: mean sea level pressure (MSLP), 500-mb geopotential height (Z500)
  - 12-km domain: 10-m wind speed (WS10), 2-m temperature (T2)
- Verification:
  - 36-km domain: centroid analysis (mean of the 8 independent analyses, available at 12-h increments)
  - 12-km domain: RUC20 analysis (NCEP 20-km mesoscale analysis, available at 3-h increments)

Subjective Evaluation
- Often large differences in initializations and forecasts
- A very useful forecasting tool

Thanksgiving Day 2001 Wind Forecast Bust
Eta-MM5 initialized 00Z, 21 Nov 01 (Tuesday evening); 42-h forecast, valid 10 AM Thursday.
The Eta-MM5 12-km runs on Tuesday and Wednesday forecast a severe wind storm for the Puget Sound on Thursday morning. Expected widespread damage and power outages were all over the news.
Verification, 10 AM Thursday: the storm came ashore weaker and farther south, giving light and variable winds over the Puget Sound.

SLP and winds: 42-h forecasts (valid Thursday 10 AM) from the verification, the centroid (cent), the ensemble members (eta, ukmo, tcwb, cmcg, ngps, avn), and their mirrored counterparts (eta*, ukmo*, tcwb*, cmcg*, ngps*, avn*).
- Reveals high uncertainty in storm track and intensity
- Indicates low probability of a Puget Sound wind event

The Importance of Grid-Based Bias Removal Particularly important for mesoscale SREF in which model biases are often large Significantly improves SREF utility by correctly adjusting the forecast PDF

Gridded Bias Removal

For the current forecast cycle:
1) Calculate the bias at every grid point and lead time using the previous two weeks’ forecasts:

   bias_ijt = (1/N) * sum over the N cases of ( f_ijt - o_ij )

2) Post-process the current forecast to correct for the bias:

   f*_ijt = f_ijt - bias_ijt

where:
N: number of forecast cases (14)
f_ijt: forecast at grid point (i, j) and lead time t
o_ij: observation (centroid-analysis or RUC20 verification)
f*_ijt: bias-corrected forecast at grid point (i, j) and lead time t

The two-week training period slides forward with each bias-corrected forecast period.
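As a minimal sketch of this kind of running-mean bias correction (function and variable names are my own, not the UW code), the update for a single grid point and lead time could look like:

```python
def bias_correct(train_fcsts, train_obs, current_fcst):
    """Correct one grid point / lead time using the mean error over a
    training window.

    train_fcsts, train_obs: past forecasts and verifying analyses for the
    training window (e.g., the previous 14 cases); current_fcst: today's
    raw forecast value at the same grid point and lead time.
    """
    n = len(train_fcsts)
    # Mean (forecast - observation) over the training window.
    bias = sum(f - o for f, o in zip(train_fcsts, train_obs)) / n
    # Subtract the systematic error from the current forecast.
    return current_fcst - bias

# Toy example: a forecast that runs 1.5 degrees too warm.
obs = [10.0, 12.0, 9.0, 11.0]
fcsts = [o + 1.5 for o in obs]
corrected = bias_correct(fcsts, obs, 13.5)
# corrected == 12.0 (the +1.5 systematic error is removed)
```

In the full scheme this correction is computed independently at every grid point (i, j) and lead time t, so spatially varying biases are handled naturally.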

Average RMSE (°C) and average bias (shaded) for uncorrected ACME core+ T2 at 12-, 24-, 36-, and 48-h lead times.

Average RMSE (°C) and average bias (shaded) for bias-corrected ACME core+ T2 at 12-, 24-, 36-, and 48-h lead times.

Physics and Surface Diversity Substantially Enhance a Mesoscale SREF, Particularly for Surface Quantities

Comparison of 36-h VRHs

VOP for *UWME vs. *UWME+:
(a) Z500: 5.0% vs. 4.2% (synoptic variable; errors depend on analysis uncertainty)
(b) MSLP: 9.0% vs. 6.7% (synoptic variable; errors depend on analysis uncertainty)
(c) WS10: 25.6% vs. 13.3% (surface/mesoscale variable; errors depend on model uncertainty)
(d) T2: 43.7% vs. 21.0% (surface/mesoscale variable; errors depend on model uncertainty)
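Verification rank histograms like these can be computed in a few lines; here is a simplified sketch (scalar forecasts, ties between members and the observation not treated specially):

```python
def verification_rank(members, obs):
    """Rank of the verifying observation among M ensemble members.

    Returns a value in 1..M+1: rank 1 if the observation falls below all
    members, rank M+1 if it falls above all of them.
    """
    return 1 + sum(1 for m in members if m < obs)

def rank_histogram(cases):
    """cases: list of (members, obs) pairs -> counts for ranks 1..M+1.

    A flat histogram indicates a statistically consistent ensemble; heavy
    end bins (outliers) indicate under-dispersion or bias.
    """
    m = len(cases[0][0])
    counts = [0] * (m + 1)
    for members, obs in cases:
        counts[verification_rank(members, obs) - 1] += 1
    return counts

# Example: obs falls between the 2nd and 3rd of 4 members -> rank 3
assert verification_rank([1.0, 2.0, 4.0, 5.0], 3.0) == 3
```

The outlier percentage quoted on the slide would then be the fraction of cases landing in the first or last bin.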

Uncertainty Skill for P(T2 < 0°C)
Comparing UWME core, *UWME core, UWME core+, and *UWME core+ (* indicates bias removal): demonstrates the importance of bias removal and physics diversity.

Smaller Scales Generate Ensemble Dispersion
Ensemble variance (m²/s²) of UWME core: WS10 at 36-km vs. 12-km grid spacing, and 3-h cumulative precipitation at 36-km vs. 12-km grid spacing.

Removing Very Unskillful Members Can Help
Using members with varying skill is OK, but there is a limit to how bad a member can be and still add value to the ensemble.

Relating Forecast Skill and Model Spread Mean Absolute Error of Wind Direction is Far Less When Spread is Low

UW MM5 SREF 10-m Wind Direction (cf. Grimit and Mass 2002)

Spatial Distribution of Local Spread-Error Correlation
12-km T2, UWME (no bias correction)
Domain-averaged STD-AEM correlation ~ 0.62; maximum local STD-AEM correlation ~ 0.54

A Simple Stochastic Model of Spread-Skill
An extension of the Houtekamer (1993) model of spread-skill.
PURPOSES:
1) To establish practical limits of forecast-error predictability that could be expected given perfect ensemble forecasts of finite size.
2) To address the user-dependent nature of forecast-error estimation by employing a variety of predictors and error metrics.
3) To extend spread-skill analysis to a probabilistic framework of forecast-error prediction.

A Simple Stochastic Model of Spread-Skill
Statistical ensemble forecasts at a single, arbitrary location; 10^4 realizations (cases).
Assumed:
- Gaussian statistics
- statistically consistent (perfectly reliable) ensemble forecasts
Varied:
- temporal spread variability (β)
- finite ensemble size (M)
- spread and skill metrics (continuous and categorical)
1) Draw today’s “forecast uncertainty” from a log-normal distribution (Houtekamer 1993 model): ln(σ) ~ N(ln(σ_f), β²)
2) Create a synthetic ensemble forecast by drawing M values from the “true” distribution: F_i ~ N(Z, σ²); i = 1, 2, …, M
3) Draw the verifying observation from the same “true” distribution (statistical consistency): V ~ N(Z, σ²)
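A rough Monte Carlo version of the three steps above can be written directly from the slide; this is my own sketch (defaults such as n_cases=4000 and the seed are arbitrary, and Z is taken as 0 without loss of generality):

```python
import math
import random
import statistics

def simulate_spread_skill(n_cases=4000, m=50, beta=0.5, sigma_f=1.0, seed=1):
    """Simulate the idealized spread-skill model and return the correlation
    between ensemble spread (STD) and absolute error of the mean (AEM).

    Per case: draw sigma from a log-normal (Houtekamer 1993), build an
    M-member ensemble and a verifying observation from N(0, sigma^2).
    """
    rng = random.Random(seed)
    spreads, errors = [], []
    for _ in range(n_cases):
        # Step 1: today's "forecast uncertainty"
        sigma = math.exp(rng.gauss(math.log(sigma_f), beta))
        # Step 2: synthetic M-member ensemble from the "true" distribution
        ens = [rng.gauss(0.0, sigma) for _ in range(m)]
        # Step 3: verifying observation from the same distribution
        obs = rng.gauss(0.0, sigma)
        spreads.append(statistics.stdev(ens))
        errors.append(abs(sum(ens) / m - obs))
    # Pearson correlation, computed by hand to stay dependency-free
    n = len(spreads)
    ms, me = sum(spreads) / n, sum(errors) / n
    cov = sum((s - ms) * (e - me) for s, e in zip(spreads, errors))
    var_s = sum((s - ms) ** 2 for s in spreads)
    var_e = sum((e - me) ** 2 for e in errors)
    return cov / math.sqrt(var_s * var_e)
```

Even for this perfectly reliable ensemble the STD-AEM correlation comes out well below 1, which is the point of the idealized study: spread-skill correlations have practical limits.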

Idealized Spread-Error Correlations (β = 0.5)
Spread metric:
- STD = Standard Deviation of the ensemble
Error metrics:
- AEM = Absolute Error of the ensemble Mean
- AES = Absolute Error of a Single ensemble member
- AAE = ensemble-Average Absolute Error
- RASE = square Root of ensemble-Average Squared Error
- CRPS = Continuous Ranked Probability Score
STD-error correlations are compared across these error metrics.
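For reference, the error metrics listed above can be computed per case as follows. This is a generic sketch, not the study's code; the CRPS line uses the standard empirical formula for an ensemble (mean absolute error of the members minus half the mean absolute pairwise member difference):

```python
def error_metrics(ens, obs):
    """Per-case error metrics for one ensemble/observation pair.

    AEM: absolute error of the ensemble mean
    AAE: ensemble-average absolute error
    RASE: square root of ensemble-average squared error
    CRPS: empirical continuous ranked probability score
    (AES would be the absolute error of one arbitrary member.)
    """
    m = len(ens)
    mean = sum(ens) / m
    aem = abs(mean - obs)
    aae = sum(abs(x - obs) for x in ens) / m
    rase = (sum((x - obs) ** 2 for x in ens) / m) ** 0.5
    crps = aae - 0.5 * sum(abs(a - b) for a in ens for b in ens) / m ** 2
    return {"AEM": aem, "AAE": aae, "RASE": rase, "CRPS": crps}
```

For ens = [1.0, 3.0] and obs = 2.0, for example, AEM is 0 while AAE and RASE are 1, illustrating how the choice of error metric changes what "skill" means.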

The Conditional Error Climatology (CEC) Method
Use historical errors, conditioned by spread category, as probabilistic forecast-error predictions.
Idealized, statistical ensemble forecasts: N = 2000; M = 50; β = 0.5
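One simple way to implement the CEC idea is to pool past errors into equally populated spread bins; the binning details here are my assumption, not necessarily the method's exact choices:

```python
def cec_bins(spreads, errors, n_bins=3):
    """Conditional error climatology: pool historical errors by spread.

    Sorts past cases by ensemble spread and splits them into n_bins
    equally populated categories. The error sample stored in each bin is
    then the probabilistic error forecast for any new case whose spread
    falls in that category.
    """
    order = sorted(range(len(spreads)), key=lambda i: spreads[i])
    size = len(order) // n_bins
    bins = []
    for b in range(n_bins):
        # Last bin absorbs any remainder cases.
        idx = order[b * size:(b + 1) * size] if b < n_bins - 1 else order[b * size:]
        bins.append({
            "spread_max": spreads[idx[-1]],          # upper edge of this category
            "errors": [errors[i] for i in idx],      # conditional error climatology
        })
    return bins
```

A new forecast's spread selects a bin, and the empirical distribution of that bin's errors serves as the probabilistic error prediction.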

Probabilistic Forecast Error Predictability
One might instead use the ensemble variance directly to get a probabilistic error forecast (ENS-PDF), the most skillful approach if the PDF is well forecast.
Idealized, statistical ensemble forecasts: M = 50; β = 0.5

Effect of Post-Processing
Bias correction reduces spread-error correlations and the effectiveness of the VAR-CEC approach. ENS-PDF closes the gap in performance, but is still below the baseline.
UWME+ (14-day grid-point bias correction), 12-km T2

Future UW Ensemble Work
- Evaluation of the value of temporal ensembles for adding to the diversity of on-time ensembles and for prediction of ensemble skill
- Perfecting grid-based bias removal of component members
- Replacing the physics ensemble with one based on key uncertainties in parameterizations (stochastic physics is not ready for prime time yet)
- Creation of a new generation of products to help break the ice with forecasters
- Work with statisticians on EMOS and BMA front-ends to our ensemble system

Future Work
- Add model diversity using available WRF dynamical cores.
- Creation of a web interface that combines ensemble and high-resolution products.
- Evaluation of the value of nudging the outer domain toward the parent forecasts to improve diversity.

The END

Ensemble Post-Processing
In several of the talks today you will be viewing results of gridded bias correction of the individual ensemble forecasts. More terminology: * indicates bias removal.
All mesoscale modeling systems have significant systematic biases, and the biases vary by ensemble member, season, time of day, etc. Removal of such biases has a very beneficial effect on the value of ensembles.

Data Info
- Average of 65 forecasts (25 Nov 02 – 01 Feb 03)
- 36-km domain from the Rockies to the central Pacific
- 2-week bias training for each forecast
- Verification: centroid analysis

Results

Idealized Probabilistic Error Forecast Skill (continuous case)
May use the ensemble variance directly to get a probabilistic error forecast (ENS-PDF), the most skillful approach if the PDF is well forecast.
Predictability is highest for extreme spread cases, reinforcing earlier results.
Idealized, statistical ensemble forecasts: M = 50; β = 0.5

Idealized Probabilistic Error Forecast Skill (categorical case)
Idealized, statistical ensemble forecasts: M = 50; β = 0.5

The Future is Probabilistic
We will never know exactly what will happen, due to initialization uncertainty, inadequate model physics, and other reasons. Thus, probabilistic forecasting is the only rational way to forecast.
We will also gain some ability to forecast forecast skill (probabilistically).
We have to retrain ourselves AND our users.
The UW system is an attempt to develop and evaluate this approach using ensembles. We acutely need feedback from forecasters.

Probabilistic Products
Currently using the Uniform Ranks (UR) method. The democratic voting (DV) method was as good as MOS; UR is even better. Calibration would provide further improvements.
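As a minimal sketch of the baseline democratic-voting probability (names are illustrative; the operational products use the more refined Uniform Ranks method):

```python
def dv_probability(members, threshold):
    """Democratic voting: P(event) = fraction of members forecasting it.

    With M members this can only take the discrete values 0/M, 1/M, ...,
    M/M, which is one motivation for smoother rank-based methods such as
    Uniform Ranks.
    """
    return sum(1 for m in members if m >= threshold) / len(members)

# 3 of 8 members reach 0.01" of precipitation -> P = 0.375
p = dv_probability([0.00, 0.00, 0.02, 0.05, 0.00, 0.01, 0.00, 0.00], 0.01)
```

Uniform Ranks instead spreads probability across the rank intervals between sorted members, so probabilities are not pinned to multiples of 1/M.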

Comparison of Brier skill scores for NGM MOS and 12-km ACME core forecasts of the 12-h probability of precipitation accumulation greater than 0.01 in (CAT1). The skill scores are relative to the sample climatology during the period 1 Nov 2002 – 20 Jan 2003.
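A sketch of how such Brier skill scores can be computed against sample climatology (function names are illustrative, not from the verification system):

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts (outcomes are 0/1)."""
    n = len(probs)
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / n

def brier_skill_score(probs, outcomes):
    """BSS relative to sample climatology.

    BSS = 1 - BS / BS_climo, where the reference forecast issues the
    observed event frequency for every case. Positive values indicate
    skill above climatology.
    """
    climo = sum(outcomes) / len(outcomes)
    bs_ref = brier_score([climo] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / bs_ref
```

For sharp, reliable forecasts like probs = [0.8, 0.2, 0.9, 0.1] against outcomes [1, 0, 1, 0], the BSS comes out strongly positive.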