Analysis of Model Forecasts of Significant Cold Fronts Using MOS Output
Steve Amburn, SOO, WFO Tulsa, Oklahoma

The Question
Does a model bias exist when significant cold fronts move through eastern Oklahoma? Forecasters say yes:
– Too cool ahead of the front
– Too warm behind the front

Method
Define significant front (~15°F delta)
Select the frontal cases (over 30)
Use proxies for model data:
– MAV and MET MOS for periods 1-5
– MEX MOS and ECMWF output for periods 6-13
Compute applicable statistics

Data
Time period: Oct 2007 through Feb 2009
Periods 1-5 = 38 significant fronts
Periods 6-13 = 35 significant fronts
Significant = high temperature change ≥ 15°F
– Day before to day after frontal passage
MOS data examined:
– GFS MOS (MAV and MEX)
– NAM MOS (MET)
– ECMWF (3-hourly max/min output)
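As a rough sketch of the case-selection step, the snippet below flags frontal dates whose observed high temperature falls by at least 15°F from the day before the front to the day after, matching the criterion on this slide. The input structure, the find_significant_fronts name, and the sample values are assumptions for illustration, not the study's actual screening code.

# Minimal sketch of the significant-front screen, assuming a simple
# list of (date, observed high in °F) pairs; the actual case selection
# in the study may have been done differently.
from datetime import date, timedelta

def find_significant_fronts(daily_highs, threshold_f=15.0):
    """Return frontal dates where the high drops >= threshold_f
    from the day before the front to the day after."""
    highs = dict(daily_highs)  # date -> observed high (°F)
    fronts = []
    for day, _ in daily_highs:
        before = highs.get(day - timedelta(days=1))
        after = highs.get(day + timedelta(days=1))
        if before is None or after is None:
            continue  # need both bracketing days to evaluate the change
        if before - after >= threshold_f:
            fronts.append(day)
    return fronts

# Hypothetical example: a strong front passing on Oct 22
obs = [(date(2007, 10, 21), 82.0),
       (date(2007, 10, 22), 70.0),
       (date(2007, 10, 23), 58.0)]
print(find_significant_fronts(obs))  # [datetime.date(2007, 10, 22)]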

Mean Absolute Errors (Periods 1-5) [chart]

Count of Positive/Negative Errors (Periods 1-5) [chart]

Mean Absolute Errors (Periods 6-13) [chart]

Count of Positive/Negative Errors (Periods 6-13) [chart]

Statistics for Periods 1-5
MAV/GFS
– Pre-frontal: Avg bias = -5.31°F, MAE = 6.12°F, # errors too warm = 20, # errors too cool = 164
– Post-frontal: Avg bias = 4.62°F, MAE = 5.07°F, # errors too warm = 160, # errors too cool = 19
MET/NAM
– Pre-frontal: Avg bias = -4.42°F, MAE = 5.15°F, # errors too warm = 35, # errors too cool = 150
– Post-frontal: Avg bias = 2.97°F, MAE = 3.95°F, # errors too warm = 137, # errors too cool = 34
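For reference, the statistics quoted on this slide and the next (average bias, mean absolute error, and counts of too-warm vs. too-cool forecasts) can be computed from paired forecast and observed temperatures roughly as sketched below; the function name and sample numbers are illustrative and are not taken from the study.

# Illustrative computation of the verification statistics shown above.
# Errors are forecast minus observed (°F), so a positive error is a
# too-warm forecast and a negative error is a too-cool forecast.
def verification_stats(forecasts, observations):
    errors = [f - o for f, o in zip(forecasts, observations)]
    n = len(errors)
    avg_bias = sum(errors) / n                  # mean error (bias)
    mae = sum(abs(e) for e in errors) / n       # mean absolute error
    too_warm = sum(1 for e in errors if e > 0)  # count of positive errors
    too_cool = sum(1 for e in errors if e < 0)  # count of negative errors
    return avg_bias, mae, too_warm, too_cool

# Hypothetical pre-frontal sample: MOS highs vs. observed highs (°F)
fcst = [68, 71, 64, 70]
obs = [74, 75, 66, 77]
print(verification_stats(fcst, obs))  # (-4.75, 4.75, 0, 4)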

Statistics for Periods 6-13
MEX/GFS
– Pre-frontal: Avg bias = -9.32°F, MAE = 10.05°F, # errors too warm = 21, # errors too cool = 256
– Post-frontal: Avg bias = 7.29°F, MAE = 8.03°F, # errors too warm = 240, # errors too cool = 27
ECMWF
– Pre-frontal: Avg bias = -8.66°F, MAE = 9.70°F, # errors too warm = 24, # errors too cool = 248
– Post-frontal: Avg bias = 0.89°F, MAE = 6.04°F, # errors too warm = 128, # errors too cool = 131

Summary for Significant Fronts
GFS MOS (MAV) and NAM MOS (MET) for Periods 1-5:
– Both have a significant cool bias ahead of fronts
– Both have a significant warm bias behind fronts
GFS MOS (MEX) and ECMWF for Periods 6-13:
– Both have a significant cool bias ahead of fronts
– GFS has a significant warm bias behind fronts
– ECMWF showed almost no bias behind fronts
