1 How Are We Doing? A Verification Briefing for the SAWS III Workshop
April 23, 2010
Chuck Kluepfel, National Weather Service Headquarters, Silver Spring, Maryland
Prepared October 2008

2 Part 1 Traditional Statistics

3 The Basics: POD and FAR
You can drive up your POD (also called the hit rate) by over-forecasting IFR-and-below conditions, but this practice simultaneously drives up the FAR. The CSI provides a mathematical way of correcting an inflated POD by using the FAR. The 2-category Heidke Skill Score has a similar effect, and it passes tests for equitability (statistical balance). Heidke also considers the "not forecast / not observed" situations (in an appropriately balanced manner), which the CSI ignores.
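A minimal sketch of these scores, assuming the usual yes/no contingency-table cell names (hits, misses, false alarms, correct nulls); this illustrates the textbook formulas and is not the operational NWS verification code:

# Illustrative sketch: POD, FAR, CSI, and the 2-category Heidke Skill Score
# from the four cells of a yes/no contingency table. Standard textbook
# formulas; cell names are assumptions, not the NWS system's own code.

def pod(hits, misses):
    # Probability of Detection (hit rate): observed events that were forecast.
    return hits / (hits + misses)

def far(hits, false_alarms):
    # False Alarm Ratio: "yes" forecasts that did not verify.
    return false_alarms / (hits + false_alarms)

def csi(hits, misses, false_alarms):
    # Critical Success Index: hits over everything forecast and/or observed,
    # so an inflated POD is penalized for its false alarms.
    return hits / (hits + misses + false_alarms)

def heidke(hits, misses, false_alarms, correct_nulls):
    # 2-category Heidke Skill Score: fraction correct relative to random chance.
    # Unlike CSI, it also credits the "not forecast / not observed" cell.
    n = hits + misses + false_alarms + correct_nulls
    chance = ((hits + misses) * (hits + false_alarms) +
              (correct_nulls + misses) * (correct_nulls + false_alarms)) / n
    return (hits + correct_nulls - chance) / (n - chance)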

4 Prevailing vs. Operational Impact Forecast (OIF): Which should we use?
OIF considers TEMPO groups, and the GPRA system uses OIF. MOS and LAMP do not produce TEMPOs, so when comparing to guidance, I used prevailing.

5 Modified SW US, March 2009 to Feb 2010 Flight Category: IFR and Below

6 Traditional Stats for the Modified Southwest United States: Colorado, New Mexico, Utah, Arizona, Nevada, and California, minus these WFOs: San Diego, Los Angeles, San Francisco

7

8 Modified SW US, March 2009 to Feb 2010 Flight Category: IFR and Below

9

10 Modified SW United States TAF Performance vs. Projection March 2009 to February 2010

11 Modified SW US - March 2009 to Feb 2010, Scheduled 3-6 hr, IFR and Below

GFS LAMP: POD 0.55, FAR 0.50, CSI 0.35
                 Forecast Yes   Forecast No
  Observed Yes       77 K           64 K
  Observed No        77 K       3.4 Million

Prevailing: POD 0.45, FAR 0.36, CSI 0.36
                 Forecast Yes   Forecast No
  Observed Yes       63 K           77 K
  Observed No        36 K       3.4 Million

Extra false alarms per extra hit: (77K - 36K) ÷ (77K - 63K) ~ 2.9
The 3-6 hr GFS LAMP false alarmed almost 3 times for every additional hit it got over the forecasters!

12 Modified SW US - March 2009 to Feb 2010, Scheduled 3-6 hr, IFR and Below

NAM MOS: POD 0.54, FAR 0.60, CSI 0.30
                 Forecast Yes   Forecast No
  Observed Yes       76 K           64 K
  Observed No       117 K       3.3 Million

Prevailing: POD 0.45, FAR 0.36, CSI 0.36
                 Forecast Yes   Forecast No
  Observed Yes       63 K           77 K
  Observed No        36 K       3.4 Million

Extra false alarms per extra hit: (117K - 36K) ÷ (76K - 63K) ~ 6.2
The 3-6 hr NAM MOS false alarmed over 6 times for every additional hit it got over the forecasters!
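As a rough cross-check of slides 11 and 12, the same counts (in thousands, read off the tables above) can be pushed through those formulas; the variable names here are purely illustrative:

# Counts in thousands, read from the slide 11 and 12 tables above.
# Counts are rounded to the nearest thousand, so recomputed scores
# can differ from the slides by about 0.01 in the last digit.
lamp = {"hits": 77, "misses": 64, "false_alarms": 77}    # GFS LAMP, 3-6 hr (slide 11)
nam  = {"hits": 76, "misses": 64, "false_alarms": 117}   # NAM MOS, 3-6 hr (slide 12)
taf  = {"hits": 63, "misses": 77, "false_alarms": 36}    # prevailing TAF forecasts

for name, t in (("GFS LAMP", lamp), ("NAM MOS", nam), ("Prevailing TAF", taf)):
    pod = t["hits"] / (t["hits"] + t["misses"])
    far = t["false_alarms"] / (t["hits"] + t["false_alarms"])
    csi = t["hits"] / (t["hits"] + t["misses"] + t["false_alarms"])
    print(f"{name:15s} POD {pod:.2f}  FAR {far:.2f}  CSI {csi:.2f}")

# Extra false alarms incurred per extra hit, relative to the forecasters:
for name, t in (("GFS LAMP", lamp), ("NAM MOS", nam)):
    ratio = (t["false_alarms"] - taf["false_alarms"]) / (t["hits"] - taf["hits"])
    print(f"{name}: about {ratio:.1f} extra false alarms per extra hit")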

13 WFOs San Diego, Los Angeles, San Francisco TAF Performance vs. Projection IFR and Below March 2009 to February 2010

14 WFOs El Paso, Tucson, Phoenix (Low Desert Southwest) IFR and Below March 2009 to February 2010

15 Part 2 Lead-Time Software

16

17 Part 3 The Future

18 The Future
- Output to CSV files (just posted)
  - Starting with Flight Category and Sig Wx Data
  - Ceiling / Visibility (next)
  - Winds (last)
- Plots of POD / FAR / CSI (see the sketch below)
- Sort Elements by Sig Wx Type
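A rough sketch of how the planned CSV output and POD / FAR / CSI plots might be used; the file name and the column names (projection_hr, pod, far, csi) are hypothetical placeholders, not the actual export format:

# Hypothetical example only: the real CSV layout may differ.
import csv
import matplotlib.pyplot as plt

with open("flight_category_scores.csv", newline="") as f:
    rows = list(csv.DictReader(f))

hours = [float(r["projection_hr"]) for r in rows]
for score in ("pod", "far", "csi"):
    plt.plot(hours, [float(r[score]) for r in rows], marker="o", label=score.upper())

plt.xlabel("Forecast projection (hours)")
plt.ylabel("Score")
plt.title("TAF flight category verification, IFR and below")
plt.legend()
plt.show()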

19 Improving the Current System: Geometric Interpretation

[Diagram: 2x2 contingency table with cells (X), (Y), (W), (Z), the conditional scores POD, FOM, POFD, PON, FOH, FAR, DFR, FOCN, and the point (Ma, Mf)]
A(F): Regression of the observations upon the forecasts
F(A): Regression of the forecasts upon the observations
Ma: Average of the observations, (x + y) / N
Mf: Average of the forecasts, (x + z) / N
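Written out under an assumed cell convention inferred from Ma and Mf (x = hits, y = misses, z = false alarms, w = correct nulls, N = x + y + z + w; this mapping is not stated on the slide), the quantities above are the standard conditional scores:

\begin{align*}
\mathrm{POD} &= \frac{x}{x+y}, & \mathrm{FOM} &= \frac{y}{x+y}, & \mathrm{POFD} &= \frac{z}{z+w}, & \mathrm{PON} &= \frac{w}{z+w},\\
\mathrm{FOH} &= \frac{x}{x+z}, & \mathrm{FAR} &= \frac{z}{x+z}, & \mathrm{DFR} &= \frac{y}{y+w}, & \mathrm{FOCN} &= \frac{w}{y+w},\\
M_a &= \frac{x+y}{N}, & M_f &= \frac{x+z}{N}.
\end{align*}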

20 Basic Interpretation: Extreme Cases

PERFECT FORECAST
                 Observed Yes   Observed No   Total
  Forecast Yes        6              0           6
  Forecast No         0              6           6
  Total               6              6          12

RESIGN FROM THE NWS
                 Observed Yes   Observed No   Total
  Forecast Yes        0              6           6
  Forecast No         6              0           6
  Total               6              6          12

21 Basic Interpretation: Random Chance

RANDOM CHANCE
                 Observed Yes   Observed No   Total
  Forecast Yes        3              3           6
  Forecast No         3              3           6
  Total               6              6          12
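These extreme tables double as a sanity check for scoring code; for example, the 2-category Heidke Skill Score (sketched in Part 1) should come out at +1, -1, and 0 for the perfect, always-wrong, and random tables:

# Heidke Skill Score sanity check against the tables above
# (arguments: hits, misses, false_alarms, correct_nulls).

def heidke(hits, misses, false_alarms, correct_nulls):
    n = hits + misses + false_alarms + correct_nulls
    chance = ((hits + misses) * (hits + false_alarms) +
              (correct_nulls + misses) * (correct_nulls + false_alarms)) / n
    return (hits + correct_nulls - chance) / (n - chance)

print(heidke(6, 0, 0, 6))   #  1.0  perfect forecast
print(heidke(0, 6, 6, 0))   # -1.0  always wrong ("resign from the NWS")
print(heidke(3, 3, 3, 3))   #  0.0  random chance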

22 Basic Interpretation: False Alarms vs. Misses

Under-forecast: misses dominate (events observed but not forecast).
Over-forecast: false alarms dominate (events forecast but not observed).
The contingency table assesses bias in one glance!

23 Finis

24 Traditional Stats Entire National Weather Service

25 Nation, March 2009 to Feb 2010 Flight Category: IFR and Below

26 Nation, March 2009 to Feb 2010 Flight Category: IFR and Below

27 Nation, March 2009 to Feb 2010 Flight Category: IFR and Below

28 Nation Performance vs. Projection March 2009 to February 2010