SARS: Sounding Analog Retrieval System
Ryan Jewell, Storm Prediction Center, Norman, OK



What is SARS?
- SARS is a sounding-matching algorithm. It enables the forecaster to quickly compare a sounding to historical proximity soundings associated with a particular type of severe weather.
- Uses key sounding parameters to find matching soundings within the historical database.
- The matching database includes ~1900 proximity soundings for tornadic and non-tornadic supercells as well as severe hail (CONUS, all seasons).
- SARS returns the date, location, and associated severe weather for each match. Example: (OUN SIGTOR) or (FWD 3.00" HAIL).
- SARS can also produce "probabilistic" forecasts for significant hail and tornadoes, based upon the distribution of matches. More later.

Let's pretend it's a forecast sounding
1) Load the sounding, run SARS.
2) SARS finds sounding matches based on various parameters. Hail: MUCAPE, MU mixing ratio, 0-6 km shear, mb lapse rate, 500 mb temperature.
3) Display SARS results: SARS supercell/tornado matches, SARS hail matches, and a probabilistic SARS forecast of tornadoes and significant hail.
(Actual significant tornado proximity sounding from the SARS database.)
If this were a forecast sounding, SARS would indicate a high likelihood of tornadic supercells, possibly F2+, and hail > 2.00" in diameter.

Probabilistic SARS Forecast
What can be said about the distribution of matches?
Example: a forecast sounding yields 20 hail matches from the sounding database: 10 are 0.75", 5 are 1.00", and 5 are 2.75". What size of hail is most likely?
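The idea of turning a match distribution into probabilities can be sketched as follows. This is an illustrative reconstruction, not the operational SARS code; the hail sizes and counts come from the example above.

```python
from collections import Counter

def match_probabilities(matched_sizes):
    """Return the fraction of SARS matches at each reported hail size."""
    counts = Counter(matched_sizes)
    total = len(matched_sizes)
    return {size: n / total for size, n in counts.items()}

# The slide's example: 20 matches -> 10 at 0.75", 5 at 1.00", 5 at 2.75"
matches = [0.75] * 10 + [1.00] * 5 + [2.75] * 5
probs = match_probabilities(matches)
print(probs)  # {0.75: 0.5, 1.0: 0.25, 2.75: 0.25}
```

So 0.75" hail is the single most likely size, but a quarter of the matches reached 2.75", which is why the distribution itself carries forecast value.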

Need to CALIBRATE SARS
SARS has been calibrated to forecast the following:
- SIG vs. non-SIG hail (2.00" is the threshold): SARS forecasts significant hail if the majority of matches are ≥ 2.00".
- Tornadic vs. non-tornadic supercells: SARS forecasts tornadic supercells if the majority of supercell matches are tornadic.
Need to answer the following questions:
- What parameters should be used for matching?
- How large should the search ranges be for each parameter?

SARS Calibration Process
- Choose matching parameters: relevant parameters associated with severe storms (various measures of instability and shear associated with hail, supercells, and tornadoes). No "kitchen sink" method (yet).
- Define initial ranges for each parameter to be used in the search (example: +/- 500 CAPE, +/- … km SRH, +/- 500 m LCL height).
- Test: run soundings through the database and analyze the matches.
- Adjust parameters and ranges until the desired result is obtained.
Desired result = the majority of matches agree on a particular type and magnitude of severe weather, and it verifies:
- If a sounding is associated with 3.00" hail, MOST of the SARS matches should be very large hail (rather than very small).
- If a sounding is associated with a tornadic supercell, MOST of the SARS matches should be tornadic supercells (rather than non-tornadic supercells).
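The range-based matching step described above can be sketched in a few lines. The field names, example ranges, and the tiny two-sounding database are illustrative stand-ins, not the real SARS parameter set or values.

```python
def within_ranges(candidate, target, ranges):
    """A historical sounding matches if every parameter falls inside the
    target value +/- the allowed search range for that parameter."""
    return all(abs(candidate[p] - target[p]) <= r for p, r in ranges.items())

# Illustrative search ranges and forecast-sounding values
ranges = {"mlcape": 500.0, "srh01": 100.0, "lcl": 500.0}
target = {"mlcape": 2500.0, "srh01": 250.0, "lcl": 900.0}

# Hypothetical two-entry historical database
database = [
    {"mlcape": 2700.0, "srh01": 300.0, "lcl": 700.0, "event": "SIGTOR"},
    {"mlcape": 1200.0, "srh01": 80.0, "lcl": 1600.0, "event": "NONTOR"},
]

matches = [s for s in database if within_ranges(s, target, ranges)]
print([m["event"] for m in matches])  # ['SIGTOR']
```

Widening the ranges returns more (but looser) analogs; tightening them returns fewer, closer ones. Calibration is about finding the balance that maximizes forecast skill.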

Example – Calibration for hail forecast
- Start with a list of 900+ historical proximity severe hail soundings.
- Test each sounding independently against the others.
- Determine whether SARS produced a good forecast or not.
- Test various combinations of parameters and parameter ranges; see which one is best.

SARS Matching Parameters (for THIS sounding database only!)
Significant hail matching:
- MUCAPE
- Mixing ratio of MU parcel
- mb lapse rate
- 500 mb temperature
- 0-6 km bulk shear
Supercell / tornado matching:
- 100 mb MLCAPE
- MLLCL height (AGL)
- mb lapse rate
- 0-1 km SRH
- 0-6 km bulk shear

SARS Parameter Ranges (for THIS database only!)
Significant hail ranges:
- MUCAPE: +/- … %
- Mixing ratio of MU parcel: +/- 2.0 g/kg
- mb lapse rate: +/- 1.5 C/km
- 500 mb temperature: +/- 4.5 C
- 0-6 km bulk shear: +/- 6 m/s
Supercell / tornado ranges:
- 100 mb MLCAPE: +/- … J/kg
- MLLCL height (AGL): +/- 400 m
- mb lapse rate: +/- 1.0 C/km
- 0-1 km SRH: +/- 30%
- 0-6 km bulk shear: +/- 7 m/s
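Note that the ranges above mix two kinds of tolerance: relative (e.g. 0-1 km SRH +/- 30% of the target value) and absolute (e.g. MLLCL height +/- 400 m). A minimal sketch of handling both, with illustrative values:

```python
def in_range(candidate_val, target_val, tolerance, relative=False):
    """Check a single parameter: absolute tolerance, or a tolerance
    expressed as a fraction of the target value."""
    allowed = abs(target_val) * tolerance if relative else tolerance
    return abs(candidate_val - target_val) <= allowed

# 0-1 km SRH: +/- 30% of the target (250 -> allowed band is 175-325)
print(in_range(200.0, 250.0, 0.30, relative=True))  # True

# MLLCL height: +/- 400 m absolute (900 -> allowed band is 500-1300)
print(in_range(1400.0, 900.0, 400.0))  # False
```

A relative tolerance keeps the SRH search band proportionally tight in weak-shear regimes and wider in strong ones, whereas a fixed band would behave very differently at the two ends of the distribution.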

SARS Skill Scores – Significant Hail Forecast
Outcomes: Hit | Miss | False Alarm | Correct Null | No Matches Found
Scores: CSI | TSS | POD | FAR
NOTE: Highest skill score AND highest % with matches.

SARS Skill Scores – Significant Hail Forecast, Filtered (≤ 1.5" vs. ≥ 2.5")
Removes the "grey" area near the 2" threshold.
Outcomes: Hit | Miss | False Alarm | Correct Null | No Matches Found
Scores: CSI | TSS | POD | FAR
NOTE: Highest skill score AND highest % with matches.

SARS Skill Scores – Tornadic vs. Non-Tornadic Supercell Forecast
Outcomes: Hit | Miss | False Alarm | Correct Null | No Matches Found
Scores: CSI | TSS | POD | FAR
NOTE: Highest skill score AND highest % with matches.

SARS MATCHING EXAMPLES

F5 Tornado

F4 Tornado

F5 Tornado

GRIDDED SARS EXAMPLE

RUC forecast. Lines – number of SARS matches for each grid-point sounding. Color fill – % of matches that had > 2.00" hail.

SARS Strengths & Weaknesses
- SARS cannot predict whether storms will form (capping, forcing issues).
- The usefulness of SARS depends upon the accuracy of model forecasts.
- SARS can heighten awareness in severe weather situations that are not clearly evident.
- Can be adjusted to match different types of severe weather.

SPC NSHARP Sounding Display on Web Available only for observed soundings.

1.75” Hail

Date, Location, Event… Anyone?
5/19/60, Topeka, KS – Long-track F4

Example – Calibration for hail forecast
- Start with the list of 900+ historical proximity severe hail soundings.
- Remove 1 sounding for testing. This essentially leaves the dataset unchanged (removing 1 out of 900+ soundings).
- Run the sounding through the entire list; choose initial parameters (CAPE, shear, etc.) and initial matching ranges for each parameter.
- Note the number and types of matches (hail < 2" and hail ≥ 2"). Determine whether the result is a Hit, Miss, False Alarm, or Correct Null:
  - Hit = sounding associated with hail ≥ 2.00" and ≥ 50% of matches ≥ 2.00"
  - Miss = sounding associated with hail ≥ 2.00" but < 50% of matches ≥ 2.00"
  - False Alarm = sounding associated with hail < 2.00" but ≥ 50% of matches ≥ 2.00"
  - Correct Null = sounding associated with hail < 2.00" and ≥ 50% of matches < 2.00"
- Repeat for each sounding in the list. Calculate final skill scores for this calibration run (call it run 1).

Example – Calibration for hail forecast (continued)
- Repeat the entire process using different combinations of parameters AND parameter ranges. Test thousands of combinations.
- Find the optimal parameters, and the search ranges for those parameters, which result in the highest skill scores (CSI, TSS).
- Use this parameter set and calibrated ranges for SARS guidance UNTIL more soundings are added, in which case the calibration process can be run again. SARS should get "smarter" as more cases are added to the database.
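The majority-vote classification and the skill scores named in the slides (CSI, TSS, POD, FAR) can be sketched as below. The contingency-table counts in the example are made up for illustration; the slide's actual verification numbers did not survive extraction.

```python
def classify(sig_matches, nonsig_matches):
    """Forecast 'significant hail' if >= 50% of matches are >= 2.00"."""
    total = sig_matches + nonsig_matches
    if total == 0:
        return None  # no matches found
    return sig_matches / total >= 0.5

def skill_scores(hits, misses, false_alarms, correct_nulls):
    """Standard 2x2 contingency-table scores used in the slides."""
    csi = hits / (hits + misses + false_alarms)          # Critical Success Index
    pod = hits / (hits + misses)                         # Probability of Detection
    far = false_alarms / (hits + false_alarms)           # False Alarm Ratio
    tss = pod - false_alarms / (false_alarms + correct_nulls)  # True Skill Statistic
    return {"CSI": round(csi, 3), "TSS": round(tss, 3),
            "POD": round(pod, 3), "FAR": round(far, 3)}

# 12 of 20 matches are significant -> forecast significant hail
print(classify(12, 8))  # True

# Hypothetical counts from one calibration run
print(skill_scores(hits=60, misses=20, false_alarms=10, correct_nulls=110))
```

Each candidate combination of parameters and ranges gets scored this way over the full leave-one-out pass, and the combination with the best CSI/TSS becomes the operational setting.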