Using Ensemble Model Output Statistics to Improve 12-Hour Probability of Precipitation Forecasts

John P. Gagan, NWS Springfield, MO
Chad Entremont, NWS Jackson, MS

[Chart slide: "Wet PoP" cases labeled TROUBLE, "Dry PoP" cases labeled GOOD]

The Problem

Dry bias
– Improvement noted with no precip
– Forecasters not as "wet" as GFS MOS when there is precip
National problem
– Almost all areas exhibit the same tendencies
– Issues in both cold and warm seasons

The Problem (cont.)

PoP definition
– The probability of occurrence of measurable precipitation (0.01 inch) at a given point for each 12-hour period through Day 7
The interpretation
– In spite of a straightforward definition, it is as unique as the individual asked:
  "PoPs 'look' too high today"
  "It's not going to rain that much, so I'm lowering PoPs"
  "I never go likely beyond 48 hours"

A Generic Forecast "In Words"

"Models increasing PWs to 200% of normal"
"High θe air being pumped into region by 50 kt LLJ"
"Area in right entrance region of ULJ"
"Large area of rain and embedded thunderstorms will move over the area today"
And so on…

A Generic Forecast "By Numbers"

MOS PoP forecasts
– MAV: 90%
– MET: 85%
– Ensemble MOS: 80%
Forecaster's PoP
– 70% area-wide

What Happened?

Numerous reasons why it WILL rain
Yet, the forecast is drier than MOS
Why?
– Mistrust/misunderstanding of MOS?
– Lack of understanding of the 12-hr PoP?
– Reasons vary by individual
The main issue: CONFIDENCE

A Solution

Ensemble MOS
– Started April 2001
– Currently a 16-member suite: operational MEX, control run, 14 perturbations
– Run once per day (00Z issuance)
A bulletin is created showing the Max/Min/Avg of the MOS output
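As an illustration of what that Max/Min/Avg summary conveys, here is a minimal Python sketch; the member names and PoP values are hypothetical, not actual MDL bulletin output.

```python
# Hypothetical 12-h PoP values (%) from a 16-member ensemble MOS suite:
# the operational MEX, a control run, and 14 perturbation members.
pops = {
    "MEX_OPER": 80,
    "CONTROL": 75,
    **{f"PERT_{i:02d}": p for i, p in enumerate(
        [70, 85, 60, 90, 75, 80, 65, 85, 70, 95, 80, 75, 85, 70], start=1)},
}

values = list(pops.values())
summary = {
    "max": max(values),
    "min": min(values),
    "avg": sum(values) / len(values),
}
print(summary)  # {'max': 95, 'min': 60, 'avg': 77.5}
```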

A Solution (cont.)

Use the ensemble average PoP as a means to improve PoP forecasts
– DO NOT use the ensemble average value as an EXPLICIT forecast
– Use the ensemble average value as a CONFIDENCE factor
– The higher the ensemble average, the more confidence in precip occurrence
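To make the "confidence factor, not an explicit forecast" distinction concrete, a hypothetical mapping is sketched below; the thresholds are invented for illustration and are not from the study.

```python
def confidence_from_ensemble_avg(avg_pop: float) -> str:
    """Read the ensemble average PoP as a confidence signal for precip
    occurrence; it is not copied into the forecast as the PoP itself.
    Thresholds here are purely illustrative."""
    if avg_pop >= 80:
        return "high confidence that measurable precip occurs"
    if avg_pop >= 50:
        return "moderate confidence"
    if avg_pop >= 20:
        return "low confidence"
    return "measurable precip unlikely"

print(confidence_from_ensemble_avg(77.5))  # moderate confidence
```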

Data Manipulation

This investigation is for the COLD SEASON ONLY
– October to April
Data collected from Oct 2003 to Apr 2006
Investigated 6 sites
– SGF CWFA: KSGF, KVIH
– JAN CWFA: KGWO, KTVR, KJAN, KMEI

Data Manipulation (cont.)

~4000 data points collected
– Stratified by rain/no rain
Periods 1-10 studied (Days 1-5)
Graphs produced to highlight rainfall frequency for a given value of the ensemble average PoP
The ensemble average PoP is NOT used as a PoP, but as a confidence factor
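A minimal sketch of that stratification, assuming a simple list of (ensemble average PoP, rain observed) pairs in place of the study's ~4000-case dataset; all values below are hypothetical.

```python
from collections import defaultdict

# Hypothetical (ensemble average PoP %, measurable rain observed?) cases.
cases = [(5, False), (15, False), (35, True), (35, False),
         (65, True), (65, False), (85, True), (95, True)]

bins = defaultdict(lambda: [0, 0])   # bin -> [rain cases, all cases]
for avg_pop, rained in cases:
    b = (avg_pop // 10) * 10         # 10% bins: 0, 10, ..., 90
    bins[b][1] += 1
    if rained:
        bins[b][0] += 1

for b in sorted(bins):
    rain, total = bins[b]
    print(f"ens avg PoP {b:>2}-{b + 9}%: rained in {rain}/{total} cases "
          f"({100 * rain / total:.0f}%)")
```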

[Five chart slides: observed rainfall frequency for each value of the ensemble average PoP, each labeled with counts of rain cases / all cases]

Using Ensemble PoP in Real Time

Using the ensemble average alone does well
However, using it in tandem with the full suite of models/guidance is best
– SREF (probabilities from the SPC web page)
– Other global models
– Mesoscale models
The more datasets that say "YES", the greater the confidence and the better the quality of the PoP
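The "more datasets that say YES" idea can be sketched as a simple agreement count; the guidance names and the 50% cutoff below are assumptions for illustration, not part of the study.

```python
# Hypothetical precip signals (%) from several guidance sources.
guidance = {
    "ensemble_avg_mos": 78,
    "sref_precip_prob": 70,
    "gfs_mos_pop": 85,
    "nam_mos_pop": 60,
    "mesoscale_model_signal": 55,
}

YES_THRESHOLD = 50  # illustrative cutoff for a source "saying YES" to precip
yes_count = sum(value >= YES_THRESHOLD for value in guidance.values())
print(f"{yes_count} of {len(guidance)} guidance sources favor measurable precip "
      "-> higher confidence in carrying a wetter PoP")
```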

Observations and Further Study

Watch the day-to-day trend of the ensemble average PoP
– If the value increases for a particular period, confidence increases
– Should be able to home in on 12-hour periods ("windows of opportunity")
Watch for MEX PoP values LESS THAN the ensemble average
– Observation has shown that it does not rain as often in these cases
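A sketch of both checks, assuming ensemble average PoP values for the same valid period from three successive 00Z runs plus the latest MEX PoP; all numbers are hypothetical.

```python
# Ensemble average PoP (%) for one 12-h valid period, oldest run -> newest.
ens_avg_by_run = [45, 55, 70]
mex_pop = 60   # latest operational MEX PoP (%) for that period

if ens_avg_by_run[-1] > ens_avg_by_run[0]:
    print("Ensemble average PoP rising run-to-run -> confidence increasing")

if mex_pop < ens_avg_by_run[-1]:
    print("MEX PoP below the ensemble average -> rain has verified less often "
          "in such cases")
```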

Does It Work? A quick look at verification from KJAN
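A minimal sketch of the kind of reliability comparison behind the charts that follow, using hypothetical (forecast PoP, rain observed) pairs for a single site.

```python
# Hypothetical (forecast PoP %, measurable rain observed?) verification pairs.
verif = [(20, False), (20, True), (40, True), (40, False),
         (60, True), (60, True), (80, True), (80, True)]

by_pop = {}
for pop, rained in verif:
    rain, total = by_pop.get(pop, (0, 0))
    by_pop[pop] = (rain + rained, total + 1)

for pop in sorted(by_pop):
    rain, total = by_pop[pop]
    obs_freq = 100 * rain / total
    bias = pop - obs_freq   # negative values indicate a dry bias
    print(f"forecast PoP {pop}%: observed frequency {obs_freq:.0f}%, "
          f"bias {bias:+.0f} points")
```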

Dry Bias Prominent [verification chart]

Dry Bias Eliminated [verification chart]

Questions, Comments? If you are interested in this study, we’d like to hear your opinions