Paul Fajman, NOAA/NWS/MDL, September 7, 2011. Topics: NDFD ugly string; NDFD forecasts and encoding; observations; assumptions; output, scores, and display.


Paul Fajman NOAA/NWS/MDL September 7, 2011

• NDFD ugly string
• NDFD forecasts and encoding
• Observations
• Assumptions
• Output, scores, and display
• Results
• Future work

• A weather element has five parts:
  ◦ Coverage/Probability
  ◦ Weather Type
  ◦ Intensity
  ◦ Visibility
  ◦ Attributes
• Those five parts combine to form the ugly string:

  Sample Weather String                                Meaning
  <NoCov>:<NoWx>:<NoInten>:<NoVis>:                    No weather
  Def:R:+:4SM:                                         Definite heavy rain, visibility at 4 statute miles
  Lkly:S:m: :^Chc:ZR:-: : ^Chc:IP:-: : ^Areas:BS: : :  Likely moderate snow, chance light freezing rain, chance light ice pellets, areas of blowing snow
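A minimal sketch of how such a string could be split into its five parts (Python; the function name and field labels are illustrative, not part of the NDFD software):

def parse_ugly_string(ugly):
    """Split an NDFD weather ('ugly') string into per-group parts.

    Groups are separated by '^'; within each group the five parts are
    separated by ':' in the order coverage/probability, weather type,
    intensity, visibility, attributes.
    """
    groups = []
    for group in ugly.split("^"):
        parts = [p.strip() for p in group.split(":")]
        parts += [""] * (5 - len(parts))      # pad dropped trailing fields
        groups.append({
            "coverage":   parts[0],   # e.g. Def, Lkly, Chc, Areas
            "type":       parts[1],   # e.g. R rain, S snow, ZR freezing rain
            "intensity":  parts[2],   # e.g. '+' heavy, 'm' moderate, '-' light
            "visibility": parts[3],   # e.g. 4SM = 4 statute miles
            "attributes": parts[4],
        })
    return groups

print(parse_ugly_string("Def:R:+:4SM:"))
# [{'coverage': 'Def', 'type': 'R', 'intensity': '+', 'visibility': '4SM', 'attributes': ''}]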

• Forecasts are produced on a 5 km grid.
  ◦ Data are extracted (using degrib) at the points where there are METAR stations.
  ◦ These form a very specific list of points that have been approved by the WFOs.
  ◦ At this time, only these points, not the full grid, are being verified.
• NDFD forecasts can be updated every hour.
• Forecasts are valid from the top of the hour until 59 minutes past the hour.
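The slides extract the values with degrib; purely for illustration, a nearest-grid-point lookup over a 5 km grid might look like the following (NumPy; all array names here are assumptions):

import numpy as np

def extract_at_stations(grid_vals, grid_lats, grid_lons, stn_lats, stn_lons):
    """Pick the forecast value at the grid cell nearest each station.

    grid_vals, grid_lats, grid_lons: 2-D arrays on the 5 km grid.
    stn_lats, stn_lons: 1-D arrays for the approved METAR points.
    """
    out = np.empty(len(stn_lats), dtype=grid_vals.dtype)
    for i, (slat, slon) in enumerate(zip(stn_lats, stn_lons)):
        # A flat-earth distance is adequate for choosing among 5 km cells.
        coslat = np.cos(np.radians(slat))
        d2 = (grid_lats - slat) ** 2 + ((grid_lons - slon) * coslat) ** 2
        j, k = np.unravel_index(np.argmin(d2), d2.shape)
        out[i] = grid_vals[j, k]
    return out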

• The forecasts are verified with METAR observations taken at the top of the hour.
  ◦ Up to three independent weather types are reported.
  ◦ The reported weather types are verified.
• Thunderstorms are verified with METAR observations and the 20 km convective predictand dataset (Charba and Samplatsky), which combines radar data and NLDN lightning data.
  ◦ These observations are reported over a one-hour window.

• Ignored

• Forecasts:
  ◦ 1. Forecasts that fall within a chosen probability range, and their corresponding observations, are used in computing the threat score.
  ◦ 2. An observation whose corresponding valid forecast missed it counts as both a false alarm and a miss. For example, if snow was forecast and rain was observed, the event counts as a false alarm for the snow forecast and a miss for the rain.
  ◦ 3. Frost, freezing spray, waterspout, and snow grain forecasts are treated as no-weather forecasts.

• Verification is constrained by what is reported in the METARs and how those data are processed.
• Observations:
  ◦ 1. A single observed weather type can verify several forecast precipitation types: observed rain verifies rain, rain shower, and drizzle forecasts, and so on (see the lookup sketch after rule 8 below).
  ◦ 2. Unknown precipitation verifies rain, rain shower, drizzle, snow, snow shower, ice pellet, freezing rain, and freezing drizzle forecasts.
  ◦ 3. All fog forecasts (normal, freezing, and ice) are verified by any fog observation.
  ◦ 4. Blowing dust or sand forecasts are verified by any observation of blowing dust or sand.

• Observations (continued):
  ◦ 5. Weather reported in the vicinity of the observing location does not verify a forecast as a hit (e.g., present-weather code 40 = VCFG, fog between 5 and 10 miles from the station).
  ◦ 6. Dust, mist, spray, tornado, and blowing spray are treated as no-weather observations.
  ◦ 7. When a forecast is counted as a false alarm, the observation is not always counted as a miss: no-weather and unknown-precipitation observations are not counted as misses.
  ◦ 8. When the coded observation is ambiguous, only the most likely precipitation type is taken as the missed observation; in most cases this applies to coded observations 68 (light rain/snow/drizzle mix) and 69 (moderate or heavy mix).
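As referenced above, rules 1 through 4 amount to a lookup from observed weather type to the set of forecast types it verifies. A hedged sketch (the type codes follow common NDFD/METAR shorthand; the operational table may differ):

# Observed type -> forecast types it verifies as a hit.
# R rain, RW rain shower, L drizzle, S snow, SW snow shower, IP ice
# pellets, ZR freezing rain, ZL freezing drizzle, F/ZF/IF fog kinds,
# UP unknown precipitation, BD/BN blowing dust/sand.
VERIFIES = {
    "R":  {"R", "RW", "L"},
    "S":  {"S", "SW"},
    "UP": {"R", "RW", "L", "S", "SW", "IP", "ZR", "ZL"},
    "BD": {"BD", "BN"},
    "BN": {"BD", "BN"},
}
for fog in ("F", "ZF", "IF"):          # any fog obs verifies any fog forecast
    VERIFIES[fog] = {"F", "ZF", "IF"}

def is_hit(obs_type, fcst_type):
    """True if the observed weather type verifies the forecast type."""
    return fcst_type in VERIFIES.get(obs_type, {obs_type})

print(is_hit("R", "L"))    # True: observed rain verifies a drizzle forecast
print(is_hit("UP", "F"))   # False: unknown precip does not verify fog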

• Default setting: analyze an entire month of data for both the 00Z and 12Z cycles, for all locations and all forecast projections, using all weather strings (except no-weather forecasts), and output the results for each cycle and forecast projection.
• In manual mode, a user can control these forecast parameters (a hypothetical sketch of such a control structure follows this list):
  ◦ weather (ugly) string
  ◦ cycle
  ◦ date range
  ◦ coverage/probability groups
  ◦ forecast projection hours
  ◦ locations (region, WFO, or multiple stations)
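Purely as a sketch, the manual-mode controls above could be gathered into a structure like this; the field names and defaults are assumptions, not the program's actual interface:

from dataclasses import dataclass
from typing import Tuple

@dataclass
class VerificationRun:
    ugly_string: str = "all"                    # all strings except NoWx by default
    cycles: Tuple[str, ...] = ("00Z", "12Z")
    date_range: Tuple[str, str] = ("20110801", "20110831")  # one month by default
    prob_groups: Tuple[str, ...] = ()           # empty = all coverage/probability groups
    projections: Tuple[int, ...] = ()           # empty = all forecast projection hours
    locations: Tuple[str, ...] = ()             # Region, WFO, or station IDs; empty = all

run = VerificationRun()   # the no-argument constructor mirrors the default setting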

• CSI for the CONUS and the regions heads the output (columns: Location, CSI, Cases), followed by data for individual stations and WFOs.
• At the bottom of the text file, statistics for individual weather elements are printed (columns: WxElement, Hits, False Alarms, Misses), with rows for Total, Rain, Snow, Fog, and so on.

• Knowing the hits (A), false alarms (B), and misses (C), four quality measures can be calculated:
  ◦ Probability of Detection: POD = A/(A+C)
  ◦ False Alarm Ratio: FAR = B/(A+B)
  ◦ Bias = (A+B)/(A+C)
  ◦ Critical Success Index: CSI = A/(A+B+C)
• These commonly used measures are mathematically related and can be represented geometrically on the same diagram (see below); a computational sketch also follows.
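For concreteness, the four measures computed directly from the counts, following the slide's formulas with A = hits, B = false alarms, C = misses:

def scores(hits, false_alarms, misses):
    """POD, FAR, Bias, and CSI from a 2x2 contingency table
    (A = hits, B = false alarms, C = misses)."""
    a, b, c = hits, false_alarms, misses
    return {
        "POD":  a / (a + c) if a + c else float("nan"),
        "FAR":  b / (a + b) if a + b else float("nan"),
        "Bias": (a + b) / (a + c) if a + c else float("nan"),
        "CSI":  a / (a + b + c) if a + b + c else float("nan"),
    }

print(scores(hits=40, false_alarms=10, misses=20))
# e.g. POD ≈ 0.67, FAR = 0.20, Bias ≈ 0.83, CSI ≈ 0.57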

[Performance diagram: Bias and CSI shown together, with regions labeled Overforecast/Underforecast and Skillful/Not Skillful, spanning Many False Alarms to No False Alarms and Always Miss to Never Miss.]

• Thunderstorm forecast scores improved considerably when verified with the convective observations.
• The cool season had a higher CSI for all probability groups.
• The warm season had more cases in every probability group, except for one percentage group and the non-QPF probabilities.
• Rarer events (freezing rain, freezing drizzle, ice pellets) do not verify well at any probability group.

• Verify gridded MOS (GMOS) at points.
• Compare GMOS vs. NDFD.
• Add the ability to handle a matched sample of cases for any number of forecast sources.
• Add POD and FAR to the text output.
• Automate the entire process, from data ingest to production of plots.
• Verify more seasons.

[Plots: Cool Season 00Z and Cool Season 12Z results.]

[Plots: Warm Season 00Z and Warm Season 12Z results.]