
2 SARS: Sounding Analog Retrieval System. Ryan Jewell, Storm Prediction Center, Norman, OK

3 What is SARS? SARS is a forecast system based on sounding analogs. The algorithm matches forecast soundings to a large database of proximity soundings associated with severe weather. SARS finds matches using a small number of parameters and parameter ranges determined by a calibration process.

4 What is SARS?
Name inspired by MARS (Map Analog Retrieval System), by Greg Carbin: http://www.spc.noaa.gov/exper/mref_mars/
Safe to use! Used experimentally at the SPC.
Integrated into NSHARP (sounding displays) and into the RUC and NAM plan-view displays (model grids).

5 Types of SARS
Two types: Hail and Supercell/Tornado.
Hail: 1148 severe hail proximity soundings (observed).
Supercell/Tornado: 938 supercell proximity soundings (RUC; under development).
Hail SARS can forecast: 1) probability of SIG (≥ 2.00”) hail; 2) maximum expected hail size (≥ 0.75”).
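A hypothetical sketch (not SPC code) of how the two hail forecasts named above can be derived from a set of SARS analog matches: the SIG probability as the fraction of matched reports at or above 2.00”, and the expected size as the mean matched report size. The `sizes` values are invented example data.

```python
# Invented example: hail sizes (inches) from the database soundings that
# matched a forecast sounding.
sizes = [1.75, 2.00, 2.75, 1.00, 2.50]

# Probability of SIG hail = fraction of matches that were >= 2.00".
sig_prob = sum(s >= 2.00 for s in sizes) / len(sizes)

# Expected hail size = mean size of the matched reports.
mean_size = sum(sizes) / len(sizes)
```

This mirrors the gridded displays shown later in the talk (percent of matches that are SIG, and mean SARS hail size).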

6 Matching Sounding Database: Severe Hail Proximity Soundings
Includes 1148 observed hail soundings, 1989-2006.
Within 100 nm and +/- 2.5 hr of 2330Z (21-02Z).
Had to be in the same air mass as the storm.
Modified for surface conditions (if needed).
Thrown out if contaminated by outflow, etc.
Expansion of the dataset used in Jewell and Brimelow (WAF 2009).

7 Matching Sounding Database
Assume the dataset is “representative”:
Spans all seasons.
18 years of data.
All regions of the CONUS.

8 1989-2006. A function of climatology and quality of soundings.

9 SARS Calibration Method: determine matching parameters and ranges
1) Matching parameters: relevant parameters associated with severe storms (various measures of instability and shear associated with hail).
2) Define initial ranges for each parameter to be used in the search (example: +/- 500 J/kg CAPE).
3) Test each sounding independently against the database; analyze the matches.
4) Adjust parameters and ranges until the desired result is achieved.
Desired result = the majority of matches agree on a particular type and magnitude of severe weather, and it verifies. If a sounding is associated with 3.00” hail, most of the SARS matches should be very large hail.

10 Example: calibration for hail SARS
Remove 1 sounding; test it against the remaining soundings (1147).
Calculate skill scores for parameter set #1 and range combination #1, and so on.
Test various combinations of parameters and parameter ranges: 8 different parameters with 5 ranges each = 5^8 = 390,625 combinations.
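The leave-one-out grid search described above can be sketched as follows. This is a schematic, not the SPC implementation: `skill_of` is a placeholder for scoring one held-out sounding under one parameter/range combination.

```python
# Sketch of the SARS calibration search: 8 parameters with 5 candidate
# ranges each gives 5**8 = 390,625 range combinations. For each
# combination, every sounding is held out and matched against the rest,
# and an overall skill score is computed.
from itertools import product

N_PARAMS, N_RANGES = 8, 5
combos = list(product(range(N_RANGES), repeat=N_PARAMS))  # 390,625 tuples

def calibrate(soundings, skill_of):
    """Return the range combination with the best leave-one-out skill.

    skill_of(sounding, database, ranges) -> score for one held-out test.
    """
    best_score, best_ranges = float("-inf"), None
    for ranges in combos:
        # Leave each sounding out in turn and test it against the others.
        score = sum(skill_of(s, [t for t in soundings if t is not s], ranges)
                    for s in soundings)
        if score > best_score:
            best_score, best_ranges = score, ranges
    return best_ranges
```

The exhaustive search is tractable only because each match test is cheap (five threshold comparisons per database sounding).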

11 SARS Matching Parameters
Final list of matching parameters (out of about 20 tested):
Most Unstable (MU) CAPE
Mixing ratio of the MU parcel
700-500 mb lapse rate
500 mb temperature
0-6 km bulk shear
Notably showed little or no skill: freezing level, wet-bulb zero height, 0-3 km storm-relative helicity (SRH).

12 SARS Parameter Ranges
Significant hail parameter ranges (resulted in the best skill scores):
MUCAPE: +/- 40%
Mixing ratio of MU parcel: +/- 2.0 g/kg
700-500 mb lapse rate: +/- 1.5 C/km
500 mb temperature: +/- 7 C
0-6 km bulk shear: +/- 9 m/s
Large ranges, but all 5 must overlap.
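A minimal sketch of the matching rule these ranges imply: a database sounding is an analog only if all five parameters fall within range of the forecast sounding (MUCAPE relative, the rest absolute). The dictionary field names and the example values are illustrative; the parameters and ranges are the hail-SARS values above.

```python
def is_match(fcst, db):
    """True if db sounding lies within every hail-SARS range of fcst."""
    return (
        abs(db["mucape"] - fcst["mucape"]) <= 0.40 * fcst["mucape"]  # +/- 40%
        and abs(db["mu_mixrat"] - fcst["mu_mixrat"]) <= 2.0          # g/kg
        and abs(db["lr_700_500"] - fcst["lr_700_500"]) <= 1.5        # C/km
        and abs(db["t500"] - fcst["t500"]) <= 7.0                    # C
        and abs(db["shear_0_6km"] - fcst["shear_0_6km"]) <= 9.0      # m/s
    )

# Invented forecast sounding and two invented database soundings.
fcst = {"mucape": 2500, "mu_mixrat": 13.0, "lr_700_500": 7.5,
        "t500": -12.0, "shear_0_6km": 20.0}
database = [
    {"mucape": 2800, "mu_mixrat": 12.0, "lr_700_500": 8.0,
     "t500": -14.0, "shear_0_6km": 25.0, "size": 2.75},
    {"mucape": 900, "mu_mixrat": 9.5, "lr_700_500": 6.2,
     "t500": -9.0, "shear_0_6km": 8.0, "size": 1.00},
]
matches = [s for s in database if is_match(fcst, s)]
```

Because every one of the five ranges must overlap, a single out-of-range parameter (here, the second sounding's MUCAPE) disqualifies an otherwise similar sounding.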

13 Performance SARS SIG Hail Algorithm

14 SARS Skill Scores: Significant Hail (≥ 2.0”) Forecast
Total soundings = 1148
Hit: 486 | Miss: 84 | False Alarm: 97 | Correct Null: 475* | No Matches Found: 5 (* 1 tie)
CSI: 0.729 | TSS: 0.683 | POD: 0.853 | FAR: 0.166
NOTE: Highest skill score AND highest % with matches.
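These scores follow the standard contingency-table definitions from forecast verification, and can be recomputed from the slide's own counts as a check:

```python
# Counts from the SIG-hail contingency table: hit, miss, false alarm,
# correct null.
hit, miss, fa, cn = 486, 84, 97, 475

pod = hit / (hit + miss)       # probability of detection
far = fa / (hit + fa)          # false alarm ratio
csi = hit / (hit + miss + fa)  # critical success index (threat score)
tss = pod - fa / (fa + cn)     # true skill statistic (Hanssen-Kuipers)
```

Rounded to three decimals these reproduce the table: CSI 0.729, TSS 0.683, POD 0.853, FAR 0.166.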

15 SARS Skill Scores: Significant Hail Forecast, Filtered
Remove golf-ball (1.75”) and 2.00” reports (near the 2” threshold). Total soundings = 889
Hit: 477 | Miss: 28 | False Alarm: 61 | Correct Null: 318* | No Matches Found: 4 (* 1 tie)
CSI: 0.843 | TSS: 0.784 | POD: 0.945 | FAR: 0.113
NOTE: Highest skill score AND highest % with matches.

16 Performance SARS SIZE Algorithm

17 Mean value of SARS binned by report size: observed vs. forecast
Mean STDEV: 0.43”
Correlation (all): 0.68 (r^2 = 0.47)
Correlation (filtered): 0.75 (r^2 = 0.56)
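The observed-vs-forecast comparison above is a Pearson correlation between binned observed report sizes and the mean SARS size in each bin. A minimal sketch of that calculation, with invented illustrative pairs (not the slide's data):

```python
# Invented example pairs: observed report size vs. mean SARS forecast size
# for each bin (inches).
obs  = [0.75, 1.00, 1.75, 2.00, 2.75, 4.00]
fcst = [1.00, 1.20, 1.60, 2.10, 2.50, 3.60]

n = len(obs)
mo, mf = sum(obs) / n, sum(fcst) / n
cov = sum((o - mo) * (f - mf) for o, f in zip(obs, fcst))
so = sum((o - mo) ** 2 for o in obs) ** 0.5
sf = sum((f - mf) ** 2 for f in fcst) ** 0.5

r = cov / (so * sf)   # Pearson correlation coefficient
r2 = r * r            # fraction of variance explained
```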

18 SARS MATCHING EXAMPLES (one HAIL of a year!)

19 National Record - July 23, 2010 – Vivian, SD

20 Contours = # Matches Color Fill = % that are SIG (≥ 2.00”) RUC Model

21 GRIDDED SARS EXAMPLE Mean SARS Hail Size (inches) RUC Model

22 KS record (diameter) - Sep 15, 2010 – Wichita, KS

23 7.75” Hail

24 2.75” Hail

25 4.25” Hail

26 Contours = # Matches Color Fill = % that are SIG (≥ 2.00”) RUC Model CIN

27 RUC Model Mean SARS Hail Size (inches) CIN

28 RARE EVENTS

29 Put AZ Hail Case HERE Contours = # Matches Color Fill = % that are SIG (≥ 2.00”) RUC Model

30 Put AZ Hail Case HERE Mean SARS Hail Size (inches) RUC Model

31 Contours = # Matches Color Fill = % that are SIG (≥ 2.00”) RUC Model

32 SUPERCELL SARS Forecast Soundings

33 F5 Tornado

34 F4 Tornado

35

36 F5 Tornado

37 SARS Summary
The SARS method can be applied to various types of severe weather (hail, tornado, wind).
SARS forecasts storm REPORTS! Local biases in reporting WILL be reflected in SARS!
SARS may miss rare events that are not represented in the database, but it may also find rare events and heighten awareness.
SARS is conditional: it cannot predict whether storms will form (capping, forcing issues).
And, oh, by the way: the accuracy of SARS depends heavily on the forecast models.

38 This slide intentionally left blank.

39 ND Record - Jul 14, 2010 – Sioux County, ND

40

41

42 1.75” Hail

43

