
1 Development and Calibration of Ensemble Based Hazardous Weather Products at the Storm Prediction Center. David Bright; Gregg Grosshans, Jack Kain, Jason Levit, Russ Schneider, Dave Stensrud, Matt Wandishin, Steve Weiss. October 11, 2005, NCEP Predictability Discussion Group. "Where America's Climate and Weather Services Begin"

2 STORM PREDICTION CENTER MISSION STATEMENT: The Storm Prediction Center (SPC) exists solely to protect life and property of the American people through the issuance of timely, accurate watch and forecast products dealing with tornadoes, wildfires, and other hazardous mesoscale weather phenomena.

3 STORM PREDICTION CENTER HAZARDOUS PHENOMENA: hail, wind, tornadoes; excessive rainfall; fire weather; winter weather.

4 SPC Forecast Products
TORNADO & SEVERE THUNDERSTORM WATCHES
WATCH STATUS MESSAGE
CONVECTIVE OUTLOOK: Day 1; Day 2; Day 3; Days 4-8
MESOSCALE DISCUSSION: Severe Thunderstorm Potential/Outlook Upgrade; Thunderstorms not expected to become severe; Hazardous Winter Weather; Heavy Rainfall
FIRE WEATHER OUTLOOK: Day 1; Day 2; Days 3-8
OPERATIONAL FORECASTS ARE BOTH DETERMINISTIC AND PROBABILISTIC
75% of all SPC products are valid for a period of less than 24 h

5 EXPERIMENTAL WATCH PROBABILITIES (Severe Thunderstorm Watch 688 Probability Table)
Tornadoes:
- Probability of 2 or more tornadoes: Low (10%)
- Probability of 1 or more strong (F2-F4) tornadoes: Low (<5%)
Wind:
- Probability of 10 or more severe wind events: Mod (60%)
- Probability of 1 or more wind events > 65 knots: Low (10%)
Hail:
- Probability of 10 or more severe hail events: Low (10%)
- Probability of 1 or more hailstones > 2 inches: Low (<5%)
Combined Severe Hail/Wind:
- Probability of 6 or more combined severe wind/hail events: Mod (60%)

6 CONVECTIVE OUTLOOKS: Operational through Day 3

7 Thunderstorm Outlooks: 24h General Thunderstorm (24h period; > 10%); 12h Enhanced Thunderstorm for Today and for Tonight (12h periods; > 10%; 40%; 70%)

8 Product Guidance at the SPC
Operational emphasis on:
- Observational data
- Short-term, high-resolution NWP guidance
- Specific information predicting hazardous mesoscale phenomena
NWP needs range from the very short range to the medium range:
- Very short-range: hourly RUC; 4.5 km WRF-NMM
- Short-range: NAM, GFS, SREF
- Medium-range: GFS, ECMWF, MREF
Today's focus: SREF
- Overview of the ensemble product suite
- Specific ensemble calibrated guidance

9 Overview of Ensemble Guidance. Objective: Provide a wide range of ensemble guidance covering all of the SPC program areas.

10 Sample of Ensemble Products Available (http://www.spc.noaa.gov/exper/sref/): MEAN & SD: 500 mb HGHT; MEAN: PMSL, DZ, 10M WIND; MEAN: MUCAPE, 0-6 SHR, 0-3 HLCY; SPAGHETTI: SFC LOW

11 Sample of Ensemble Products Available (http://www.spc.noaa.gov/exper/sref/): PROB: DENDRITIC GROWTH (omega < -3; -17 < T < -11; RH > 80%); PROB: SIG TOR PARAM > 3; MEDIAN, UNION, INTERSECTION: SIG TOR PARAM; MAX OR MIN: MAX FOSBERG INDEX. STP = F(mlCAPE, mlLCL, SRH, Shear), Thompson et al. (2003)

12 F63 SREF POSTAGE STAMP VIEW: PMSL, HURRICANE FRANCES. SREF members: Red = EtaBMJ; Yellow = EtaKF; Blue = RSM; White = OpEta

13 Combined Probability. Probability of surface CAPE >= 1000 J/kg is relatively low; the ensemble mean is < 1000 J/kg (no gold dashed line). CAPE (J/kg): green solid = percent of members >= 1000 J/kg, shading >= 50%; gold dashed = ensemble mean (1000 J/kg). F036: Valid 21 UTC 28 May 2003

14 Combined Probability. Probability of deep-layer shear >= 30 kts; strong midlevel jet through Iowa. 10 m - 6 km shear (kts): green solid = percent of members >= 30 kts, shading >= 50%; gold dashed = ensemble mean (30 kts). F036: Valid 21 UTC 28 May 2003

15 Combined Probability. Convection is likely in WI/IL/IN; will the convection become severe? 3-hour convective precipitation >= 0.01 in: green solid = percent of members >= 0.01 in, shading >= 50%; gold dashed = ensemble mean (0.01 in). F036: Valid 21 UTC 28 May 2003

16 Combined Probability: Prob CAPE >= 1000 x Prob Shear >= 30 kts x Prob Conv Pcpn >= .01". F036: Valid 21 UTC 28 May 2003. A quick way to determine the juxtaposition of key parameters; fosters an ingredients-based approach. Not a "true" probability: the ingredients are not independent, and different members may contribute at each point.

17 Combined Probability with verification: severe reports (red = tornado; blue = wind; green = hail) overlaid on Prob CAPE >= 1000 x Prob Shear >= 30 kts x Prob Conv Pcpn >= .01". F036: Valid 21 UTC 28 May 2003. A quick way to determine the juxtaposition of key parameters; fosters an ingredients-based approach; not a "true" probability.
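The combined-probability field shown on these slides is just the gridpoint product of per-ingredient ensemble exceedance fractions. A minimal NumPy sketch of that computation (toy values, not SREF data; the function name is mine):

```python
import numpy as np

def exceedance_prob(members, threshold):
    """Fraction of ensemble members at or above a threshold, per grid point.

    members: array of shape (n_members, ny, nx)
    """
    return (members >= threshold).mean(axis=0)

# Toy 2x2 grid, 4-member "ensemble" (illustrative values only)
cape  = np.array([[[1200,  800], [1500,  400]],
                  [[1100,  900], [1600,  300]],
                  [[ 900, 1100], [1400,  500]],
                  [[1300,  700], [1700,  200]]])
shear = np.full((4, 2, 2), 35.0)   # all members >= 30 kts everywhere
pcpn  = np.full((4, 2, 2), 0.05)   # all members >= 0.01 in everywhere

# "Combined probability": product of the three exceedance fields.
# As the slide notes, this is not a true joint probability -- the
# ingredients are not independent, and different members can satisfy
# different thresholds at the same point.
combined = (exceedance_prob(cape, 1000.0)
            * exceedance_prob(shear, 30.0)
            * exceedance_prob(pcpn, 0.01))
print(combined)  # [[0.75 0.25] [1.  0. ]]
```

Because each factor is a member fraction rather than a calibrated probability, the product is best read as an ingredients-overlap index, which is exactly how the slides use it.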

18 Combined Probability: F15 SREF probability, TPCP x RH x WIND x TMPF ( 30 mph x > 60 F). Ingredients for extreme fire weather conditions over the Great Basin.

19 Calibrated Thunderstorm Guidance. Objective: Develop calibrated probabilistic guidance for CG lightning.

20 Combine Lightning Ingredients into a Single Parameter
Three first-order ingredients (readily available from NWP models):
- Lifting condensation level temperature > -10 °C
- Sufficient CAPE in the 0 °C to -20 °C layer
- Equilibrium level temperature < -20 °C
Cloud Physics Thunder Parameter (CPTP):
CPTP = (-19 °C - T_EL)(CAPE_-20 - K) / K
where K = 100 J kg^-1 and CAPE_-20 is MUCAPE in the 0 °C to -20 °C layer.
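Reading the slide's formula as the product normalized by K, the CPTP is positive only when the equilibrium level is colder than -19 °C and the mixed-phase CAPE exceeds 100 J/kg. A small sketch under that assumption (toy inputs, not model soundings):

```python
def cptp(t_el_c, cape_0_to_m20, k=100.0):
    """Cloud Physics Thunder Parameter, as reconstructed from the slide.

    t_el_c:        equilibrium-level temperature (deg C)
    cape_0_to_m20: MUCAPE in the 0 C to -20 C layer (J/kg)
    k:             scaling constant, 100 J/kg
    """
    return (-19.0 - t_el_c) * (cape_0_to_m20 - k) / k

# Cold EL and ample mixed-phase CAPE: CPTP well above 1
print(cptp(t_el_c=-55.0, cape_0_to_m20=400.0))  # (36)(300)/100 = 108.0
# EL warmer than -19 C: CPTP negative, no lightning support
print(cptp(t_el_c=-10.0, cape_0_to_m20=400.0))
```

The CPTP > 1 threshold used on the following slides then acts as a yes/no flag for whether a grid-point sounding supports charge separation, given that an updraft actually develops.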

21 Example CPTP: one member, 18h Eta forecast valid 03 UTC 4 June 2003. Plan-view chart showing where grid-point soundings support lightning (given a convective updraft).

22 SREF Probability CPTP > 1. 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Uncalibrated probability: solid/filled; mean CPTP = 1 (thick dashed).

23 SREF Probability Precip > .01". 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Uncalibrated probability: solid/filled; mean precip = 0.01" (thick dashed).

24 Joint Probability (assumed independent): P(CPTP > 1) x P(Precip > .01"). 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Uncalibrated probability: solid/filled.

25 Uncalibrated reliability of P(CPTP > 1) x P(P03I > .01"), 5 Aug to 5 Nov 2004. Reliability diagram: forecast probability bins [0%, 5%, ..., 100%] against observed frequency, with perfect-forecast, no-skill, and climatology reference lines.
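A reliability diagram like this one compares each forecast-probability bin with the observed event frequency in that bin. A minimal sketch of that bookkeeping, using the 11 bins defined later on slide 27 (toy forecasts, not the verification data):

```python
import numpy as np

def reliability_curve(fcst, obs):
    """Observed event frequency in each forecast-probability bin.

    fcst: forecast probabilities in [0, 1]
    obs:  1 if the event occurred, else 0
    Bins: 0-5%, 5-15%, ..., 85-95%, 95-100% (11 bins, as on slide 27).
    """
    edges = np.array([0.0, 0.05, 0.15, 0.25, 0.35, 0.45, 0.55,
                      0.65, 0.75, 0.85, 0.95, 1.0])
    idx = np.clip(np.digitize(fcst, edges) - 1, 0, 10)
    freq = np.full(11, np.nan)          # NaN where a bin is unused
    for b in range(11):
        mask = idx == b
        if mask.any():
            freq[b] = obs[mask].mean()
    return freq

# Perfectly reliable toy sample: 20% forecasts verify 20% of the time,
# 80% forecasts verify 80% of the time.
fcst = np.array([0.2] * 5 + [0.8] * 5)
obs  = np.array([1, 0, 0, 0, 0,  1, 1, 1, 1, 0])
curve = reliability_curve(fcst, obs)
print(curve[2], curve[8])  # bins containing 0.2 and 0.8
```

Points falling on the diagonal (forecast probability equal to observed frequency) indicate a reliable forecast; the uncalibrated joint probability on this slide sits well off that diagonal, which motivates the calibration that follows.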

26 Adjusting Probabilities: Calibrate ensemble thunderstorm guidance based on the observed frequency of occurrence.

27 Ensemble Thunder Calibration
1) Bin separately P(CPTP > 1) and P(P03M > 0.01") into 11 bins (0-5%; 5-15%; ...; 85-95%; 95-100%)
2) Combine the two binned probabilistic forecasts into one of 121 possible combinations: (0%, 0%); (0%, 10%); ...; (100%, 100%)
3) Use NLDN CG data over the previous 366 days to calculate the frequency of occurrence of CG strikes for each of the 121 binned combinations; construct for each grid point using 1/r weighting
4) Bin ensemble forecasts as described in steps 1 and 2, and assign the observed CG frequency (step 3) as the calibrated probability of a CG strike
5) Calibration is performed for each forecast cycle (09 and 21 UTC) and each forecast hour; the domain is the entire U.S. on a 40 km grid (CG strike within ~12 miles)
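The steps above can be sketched as a lookup-table calibration. This toy version keeps one table for all grid points and omits the 1/r spatial weighting of step 3 (function names are mine; inputs are illustrative, not NLDN/SREF data):

```python
import numpy as np

# Bin edges from step 1: 0-5%, 5-15%, ..., 85-95%, 95-100%
EDGES = np.array([0.0, 0.05, 0.15, 0.25, 0.35, 0.45, 0.55,
                  0.65, 0.75, 0.85, 0.95, 1.0])

def to_bin(p):
    """Map a probability to one of the 11 bins (step 1)."""
    return int(np.clip(np.digitize(p, EDGES) - 1, 0, 10))

def build_table(p_cptp, p_precip, cg_obs):
    """Steps 2-3: observed CG frequency for each of the 121 bin pairs."""
    hits = np.zeros((11, 11))
    count = np.zeros((11, 11))
    for pc, pp, obs in zip(p_cptp, p_precip, cg_obs):
        i, j = to_bin(pc), to_bin(pp)
        hits[i, j] += obs
        count[i, j] += 1
    return np.where(count > 0, hits / np.maximum(count, 1), np.nan)

def calibrated_prob(table, p_cptp, p_precip):
    """Step 4: the observed frequency becomes the calibrated probability."""
    return table[to_bin(p_cptp), to_bin(p_precip)]

# Toy training sample: four forecasts in the (45-55%, 45-55%) bin pair,
# with CG strikes verified on three of the four occasions.
table = build_table([0.5, 0.5, 0.5, 0.5], [0.5, 0.5, 0.5, 0.5], [1, 0, 1, 1])
print(calibrated_prob(table, 0.52, 0.48))  # -> 0.75
```

In the operational scheme a separate table is maintained per grid point, forecast cycle, and forecast hour, built from a rolling 366-day NLDN archive; this sketch only shows the binning and lookup mechanics.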

28 Before Calibration

29 Joint Probability (assumed independence): P(CPTP > 1) x P(Precip > .01"). 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Uncorrected probability: solid/filled.

30 After Calibration

31 Calibrated Ensemble Thunder Probability. 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Calibrated probability: solid/filled.

32 Calibrated Ensemble Thunder Probability. 15h forecast ending 00 UTC 01 Sept 2004; 3h valid period 21 UTC 31 Aug to 00 UTC 01 Sept 2004. Calibrated probability: solid/filled; NLDN CG strikes (yellow +).

33 Calibrated reliability of the calibrated thunder probability, 5 Aug to 5 Nov 2004. Reliability diagram: forecast probability bins [0%, 5%, ..., 100%] against observed frequency, with perfect-forecast, no-skill, and climatology reference lines.

34 3h probability of >= 1 CG lightning strike within ~12 mi. 09Z and 21Z SREF valid at F003 through F063, May 15 - Sept 15, 2005. Panels: reliability; economic potential value.

35 12h probability of >= 1 CG lightning strike within ~12 mi. 09Z SREF valid at F012 through F063, May 15 - Sept 15, 2005. Panels: reliability; economic potential value.

36 Calibrated Severe Thunderstorm Guidance. Objective: Develop calibrated probabilistic guidance of the occurrence of severe convective weather (available for 3h, 12h, and 24h periods; calibration not described today).

37 24h probability of >= 1 severe thunderstorm within ~25 mi. SREF: 2005051109, valid 12 UTC May 11, 2005 to 12 UTC May 12, 2005. Severe weather activity 12Z 11 May to 12Z 12 May 2005: a = hail; w = wind; t = tornado.

38 24h probability of >= 1 severe thunderstorm within ~25 mi: hail > .75"; wind > 50 kts; tornado. 21Z SREF valid at F039 (i.e., Day 1 Outlook), May 15 - Sept 15, 2005. Panels: reliability; economic potential value.

39 Experimental Calibrated Snow Accumulation Guidance. Objective: Develop calibrated probabilistic guidance of snow accumulation on road surfaces.

40 Ensemble Snow Calibration
- Use the frequency-of-occurrence technique, similar to the calibrated probability of CG lightning
- Produce 8 calibrated joint probability tables
- Take the power mean (RMS average) of all 8 tables for the 3h probability of snow accumulating on roads in the grid cell
- Calibration period is Oct. 1, 2004 through Apr. 30, 2005
- MADIS "road-state" sensor information is truth (SREF is interpolated to MADIS road sensors)
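The RMS average in the second-to-last step combines the 8 calibrated layers into one probability field. A minimal sketch of that power mean (toy values at a single grid point, not the calibrated tables themselves):

```python
import numpy as np

def rms_probability(layers):
    """Power mean (RMS average) of calibrated probability layers.

    layers: array of shape (8, ny, nx); each layer would be the
    calibrated 3h road-snow probability from one RSAE/RSAP x
    Baldwin/Czys predictor pairing (see the tables on slide 42).
    """
    return np.sqrt(np.mean(np.square(layers), axis=0))

# Toy example: 8 layer values at a single grid point
layers = np.array([0.1, 0.2, 0.2, 0.3, 0.3, 0.4, 0.4, 0.5]).reshape(8, 1, 1)
p = rms_probability(layers)
print(float(p[0, 0]))
```

Because the RMS is never smaller than the arithmetic mean, this choice weights the higher-probability layers more heavily: one confident predictor pairing can pull the combined probability up even when the others are lukewarm.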

41 SREF probability predictors
(1) Two precipitation-type algorithms:
- Baldwin algorithm in the NCEP post (Pr[Sn, ZR, IP])
- Czys algorithm applied in SPC SREF post-processing (Pr[Sn, ZR, IP])
(2) Two parameters sensitive to lower-tropospheric and ground temperature:
- Snowmelt parameterization (RSAE): evaluates fluxes to determine if 3" of snow melts over a 3h period; if yes, the parameter is assigned 273.15 - T_G (Pr[>1; >2; >4])
- Simple algorithm (RSAP): F(T_pbl, T_G, Q_sfc net rad. flux, ), where values > 1 indicate a surface cold enough for snow to accumulate (Pr[>1])
Goal: Examine the parameter space around the lower PBL T, ground T, and precip type, and calibrate using road sensor data.

42 Frequency Calibration Tables
Layer | SREF Ingredient 1 | SREF Ingredient 2
1 | Prob(RSAE > 1) | Prob(Baldwin Snow, ZR, or IP)
2 | Prob(RSAE > 2) | Prob(Baldwin Snow, ZR, or IP)
3 | Prob(RSAE > 4) | Prob(Baldwin Snow, ZR, or IP)
4 | Prob(RSAE > 1) | Prob(Czys Snow, ZR, or IP)
5 | Prob(RSAE > 2) | Prob(Czys Snow, ZR, or IP)
6 | Prob(RSAE > 4) | Prob(Czys Snow, ZR, or IP)
7 | Prob(RSAP > 1) | Prob(Baldwin Snow, ZR, or IP)
8 | Prob(RSAP > 1) | Prob(Czys Snow, ZR, or IP)

43 Example: New England Blizzard (F42: 23 January 2005 03Z)
SREF 32F isotherm (2 m air temp): mean (dash); union (at least one SREF member at or below 32 F; dots); intersection (all members at or below 32 F; solid), with the 3h probability of freezing or frozen pcpn (Baldwin algorithm; uncalibrated)
SREF 32F isotherm (ground temp): mean (dash); union (dots); intersection (solid), with the 3h calibrated probability of snow accumulating on roads

44 Example: Washington, DC Area (F21: 28 February 2005 18Z)
SREF 32F isotherm (2 m air temp): mean (dash); union (dots); intersection (solid), with the 3h probability of freezing or frozen pcpn (Baldwin algorithm; uncalibrated)
SREF 32F isotherm (ground temp): mean (dash); union (dots); intersection (solid), with the 3h calibrated probability of snow accumulating on roads

45 6h and 3h probability of snow accumulating on roads, Oct 15, 2005 (F006, valid 15 UTC)

46 Blind Test
- Calibration period: Oct 1, 2004 through April 30, 2005
- 5 days randomly selected from each month in the sample => 35 days in the test
- Test days withheld from the monthly calibration tables (i.e., cross validation used)
- The SREF forecasts were reprocessed for the 35 days and verified against the MADIS surface-state observations (F03 - F63)

47 Verification. Reliability diagram: all 3h forecasts (F00 - F63); 35 days (Oct 1 - Apr 30). Panels: reliability; economic potential value.

48 Test Results: 3h forecast results (F00 - F63)
- Forecasts are reliable
- Brier score is a 21% improvement over sample climatology
- ROC area = .919
- Average probability where new snow was detected: 23%
- Average probability where new snow was not detected: 4%
- Economic value for a wide range of users, peaking over 0.7
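A 21% improvement over sample climatology corresponds to a Brier skill score of 0.21 with the climatological frequency as the reference forecast. A minimal sketch of how that skill score is computed (toy numbers, not the road-snow verification data):

```python
import numpy as np

def brier_score(fcst, obs):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    fcst, obs = np.asarray(fcst, float), np.asarray(obs, float)
    return float(np.mean((fcst - obs) ** 2))

def brier_skill_score(fcst, obs):
    """Fractional Brier-score improvement over the sample-climatology
    forecast (a constant forecast equal to the observed event frequency)."""
    obs = np.asarray(obs, float)
    bs_ref = brier_score(np.full(obs.shape, obs.mean()), obs)
    return 1.0 - brier_score(fcst, obs) / bs_ref

# Toy forecast/outcome pairs (illustrative only)
fcst = [0.9, 0.8, 0.2, 0.1, 0.7, 0.1]
obs  = [1, 1, 0, 0, 1, 0]
bss = brier_skill_score(fcst, obs)
print(round(bss, 3))
```

A skill score of 0 means no improvement over always forecasting climatology, and 1 means a perfect forecast, so the reported 21% sits comfortably on the skillful side for a rare-event road-surface problem.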

49 Road-Snow: Summary
- Method appears reliable, although 3h probabilities rarely exceed 50%
- Highlights the importance of ground-temperature predictions from the SREF and deterministic models
- Possible improvements: bias correction to 2 m and ground temps from the SREF*; statistical post-processing of 2 m and ground temps* prior to road-state calibration; addition of an asphalt tile to the LSM of SREF members
* See the next slide for temp correction information

50 F15 SREF 2 m Temp (verification period: ~August 2005)
- Raw 2 m temp: underdispersive SREF 2 m temp forecast (F15) with a cold bias
- Bias-adjusted 2 m temp: F15 cold bias in 2 m temp removed, but the forecast remains underdispersive
- Recalibrated 2 m temp: uniform VOR after statistical adjustment to the SREF
Bias adjustment and recalibration, with the addition of an asphalt-type ground-temp tile in the LSM, might be very useful for snow accumulation from the SREF

