NFUSE Conference Call 4/11/07
How Good Does a Forecast Really Need To Be?
David Myrick, Western Region Headquarters, Scientific Services Division



Motivating Question
Can we use uncertainty information as a threshold for gauging when a forecast is good enough?
– This is an informal talk!
– Lots of examples
– Approach the question from the viewpoint of observational uncertainty

Points → Area (Grid Boxes)
No longer forecasting for 8-10 CCF points
Each CWA: 1000s of 2.5 or 5 km grid boxes
Twofold need for grid-based verification:
– Forecaster feedback across the entire grid
– Identifying ways to evolve our services to focus more attention on high impact events

WR Service Improvement Project
Initially began as a grid-based verification project using BOIVerify
Morphed into learning how we can evolve our services to focus more effort on high impact events
The project got us thinking: "What is a good forecast for a small area?"

Observations
Grid-based verification requires an objective analysis based on ASOS & non-ASOS observations
Lots of known problems with surface & analysis data
Ob = Value ± Uncertainty

Observational Errors
– Instrument errors
– Gross errors
– Siting errors
– Errors of "representativeness"
(Photo: J. Horel)

Errors of "representativeness"
The observation is accurate
– It reflects synoptic & microscale conditions
But the microscale phenomenon it captures is not resolvable by the analysis or model
Example: cold pool in a narrow valley
– The observation on the valley floor may be correct
– But it is not captured by the analysis system

Representativeness Error Example
[Figure: temperature (°C) observations in the Tooele Valley and Rush Valley]

Variability in Observations
Examples from the WR/SSD RTMA Evaluation: comparing analysis solutions along a terrain profile near SLC, UT
~70 mesonet obs in a 60 x 60 km area
[Figure: ~60 km cross-section from the Great Salt Lake to the Wasatch Mountains]

Large Spread in Observations
>11 °C spread between observations at … m
How do we analyze this?

Objective Analysis 101
Analysis Value = Background Value + Observation Corrections
Analysis errors come from:
– Errors in the background field
– Observational errors
A "good" analysis takes into account the uncertainty in the obs & background:
– A "best fit" to the obs
– It won't always match the obs
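The schematic above can be written down for a single gridpoint. This is a minimal, hypothetical sketch of variance-weighted blending (not code from the talk), showing why the analysis is a "best fit" that need not match the observation:

```python
def oi_analysis(background, ob, bg_var, ob_var):
    """One-point optimal interpolation: blend a background value with an
    observation, weighting each by its error variance.  The analysis lands
    between the two, closer to whichever input is more trusted."""
    # Gain: fraction of the ob-minus-background correction to accept
    weight = bg_var / (bg_var + ob_var)
    analysis = background + weight * (ob - background)
    # Analysis error variance is smaller than either input variance
    analysis_var = (bg_var * ob_var) / (bg_var + ob_var)
    return analysis, analysis_var

# Background 10.0 °C (variance 4), observation 14.0 °C (variance 1):
ana, var = oi_analysis(10.0, 14.0, bg_var=4.0, ob_var=1.0)
# The analysis (13.2 °C) does not match the ob, by design.
```

With an accurate observation (small `ob_var`) the analysis hugs the ob; with a representativeness-suspect ob (large `ob_var`) it stays near the background.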

Forecast Verification
Forecasters are comfortable with:
– Verification against ASOS obs
– Assessing forecast skill vs. MOS
But is judging a forecast against a few points, without any regard for observational and representativeness errors, really the scientific way to verify forecasts?
Is there a better way? Can we define a "good enough" forecast?

Proposal
– Evaluate grid-based forecasts vs. RTMA
– Use RTMA to scientifically assign uncertainty
– Reward forecasts that are within the bounds of analysis uncertainty

RTMA Uncertainty Estimates
RTMA/AOR provides a golden opportunity to revamp the verification program
Analysis uncertainty varies by location
Techniques under development at EMC to assign analysis uncertainty to the RTMA:
– Backing out an estimate of the analysis error by taking the inverse of the Hessian of the analysis cost function
– Cross validation (expensive)
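The inverse-Hessian idea can be illustrated on a toy problem. For a quadratic (linear-Gaussian) cost function J(x) = (x-xb)ᵀB⁻¹(x-xb) + (y-Hx)ᵀR⁻¹(y-Hx), the Hessian is B⁻¹ + HᵀR⁻¹H, and its inverse is the analysis error covariance. A NumPy sketch with made-up covariance values (nothing here reflects the actual RTMA/EMC implementation):

```python
import numpy as np

# Toy 2-gridpoint analysis with one observation at gridpoint 1.
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])    # background error covariance (assumed)
R = np.array([[0.25]])        # ob error variance (assumed)
H = np.array([[1.0, 0.0]])    # ob operator: ob sees gridpoint 1 only

# Hessian of the quadratic cost function, then its inverse
hessian = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
A = np.linalg.inv(hessian)            # analysis error covariance
uncertainty = np.sqrt(np.diag(A))     # per-gridpoint uncertainty (std dev)
# The observed gridpoint ends up with smaller uncertainty than the
# unobserved one -- uncertainty varies by location, as the slide notes.
```

In a real analysis the Hessian is far too large to invert directly, which is why "backing out" an estimate (or cross validation) is needed; this toy only shows what quantity is being estimated.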

Example
Verify forecasts based on the amount of uncertainty that exists in an analysis:
– Forecast = 64 °F
– Analysis Value = 66 °F
– Analysis Uncertainty = ±3 °F
– No penalty for forecasts between 63 and 69 °F (the forecast fell in the "good enough" range)
– This is a "distributions-oriented" approach…
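The no-penalty rule above amounts to a simple band check. A hypothetical helper (the function name and signature are illustrative, not from any verification package):

```python
def good_enough(forecast, analysis, uncertainty):
    """No penalty when the forecast falls within the analysis value
    plus or minus the analysis uncertainty."""
    return abs(forecast - analysis) <= uncertainty

# Slide example: forecast 64 °F vs. analysis 66 °F ± 3 °F
inside = good_enough(64, 66, 3)    # inside the 63-69 °F band
outside = good_enough(70, 66, 3)   # outside the band
```

Because `uncertainty` comes from the analysis, the acceptance band widens in data-sparse or complex terrain and tightens where the analysis is well constrained.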

"Distributions-oriented" Forecast Verification
– Murphy and Winkler (1987): the original paper
– Brooks and Doswell (1996): reduced the dimensionality problem by using wider bins

Problem with the "distributions" approach
The Brooks and Doswell (1996) example used 5 °F bins
Bins set up as -5 to 0 °F, 0 to 5 °F, 5 to 10 °F, etc.
Forecast = 4.5 °F
– Verification = 0.5 °F → good forecast (same bin, though 4 °F off)
– Verification = 5.5 °F → bad forecast (adjacent bin, though only 1 °F off)
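The fixed-bin pathology takes only a few lines to demonstrate. The `bin_index` helper is hypothetical; floor division is one way to realize the 5 °F bins described above:

```python
import math

def bin_index(value, width=5.0):
    """Assign a temperature to a fixed bin: [-5, 0) -> -1, [0, 5) -> 0,
    [5, 10) -> 1, etc., matching the bin setup on the slide."""
    return math.floor(value / width)

forecast = 4.5
same_bin_far = bin_index(forecast) == bin_index(0.5)   # 4 °F off, "good"
same_bin_near = bin_index(forecast) == bin_index(5.5)  # 1 °F off, "bad"
```

A verification 4 °F away scores as good while one only 1 °F away scores as bad, purely because of where the bin edge falls; floating bins centered on the forecast (next slide) avoid this edge artifact.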

Myrick and Horel (2006)
Verified NDFD grid-based forecasts using floating bins whose width was based on the observational uncertainty (~2.5 °C)

Forecast Example
[Figure: forecast and RTMA temperature (°F) traces with RTMA uncertainty, along a profile from a populated valley to the mountains]
– Green = forecasts are "good enough"
– Red = abs(RTMA – Forecast) > Uncertainty

Summary
Challenge: How do we define a "good enough" forecast?
Proposal:
– Verify against RTMA ± Uncertainty
– Uncertainty based on observational, representativeness, & analysis errors
– Give the forecaster credit for forecast areas that are within the uncertainty
Goal: Provide better feedback as to which forecast areas are "good enough" and which areas need more attention

Special Thanks!
Tim Barker (BOI WFO), Brad Colman (SEW WFO), Kirby Cook (SEW WFO), Andy Edman (WR/SSD), John Horel (Univ. Utah), Chad Kahler (WR/SSD), Mark Mollner (WR/SSD), Aaron Sutula (WR/SSD), Ken Pomeroy (WR/SSD), Manuel Pondeca (NCEP/EMC), Kevin Werner (WR/SSD)

References
Brooks, H. E., and C. A. Doswell, 1996: A comparison of measures-oriented and distributions-oriented approaches to forecast verification. Wea. Forecasting, 11, 288–303.
Murphy, A. H., and R. L. Winkler, 1987: A general framework for forecast verification. Mon. Wea. Rev., 115, 1330–1338.
Myrick, D. T., and J. D. Horel, 2006: Verification of surface temperature forecasts from the National Digital Forecast Database over the western United States. Wea. Forecasting, 21,
Representativeness Errors – Western Region Training Module:
Western Region Service Evolution Project Internal Page: