Storm Prediction Center Highlights
NCEP Production Suite Review, December 3, 2013
Steven Weiss, Israel Jirak, Chris Melick, Andy Dean, Patrick Marsh, and Russ Schneider
Storm Prediction Center, Norman, OK, National Weather Center

Outline
SPC mission and responsibility
Convection-allowing model (CAM) guidance for severe weather
– Deterministic models, 2 cases: NAM CONUS Nest, CONUS WRF-NMM, NSSL-ARW
– Ensemble systems: SSEO, SSEF, AFWA
CAM Simulated Reflectivity Verification
– New SPC objective verification system
SPC “request list”

Good News From SPC Perspective
NCO/EMC leadership and guidance in planning/executing the transition from CCS to WCOSS
Continued reliable, timely operational model production
New opportunities for enhanced collaboration:
– Weekly EMC Model Evaluation Group (MEG) webinars
– Bi-weekly EMC-SPC telecons/webinars focusing on convection-allowing model improvements (NAM Nest, HiResWindow)
– Bi-weekly GSD-SPC telecons/webinars focusing on RAP and HRRR development
– Interactions with the HWRF group to generate output fields for tropical cyclone tornado guidance
Closer integration of model development efforts with severe weather forecasting needs
Better alignment with the NOAA/NWS strategic goals of Weather-Ready Nation, Warn-on-Forecast, and the emerging Forecasting A Continuum of Environmental Threats (FACETs)

STORM PREDICTION CENTER HAZARDOUS PHENOMENA: Thunderstorms (Hail, Wind, Tornadoes), Fire Weather, Mesoscale Winter Weather

SPC Mission and Responsibility: Storm Prediction Center Primary Products

Severe Weather Case of 10 February 2013: Localized Tornado Events – EF-4 Strikes Hattiesburg, MS
– 21-mile path length, up to 3/4 mile wide
– Excellent NWS forecasts/warnings => no fatalities; 82 injuries
– Nearly 200 businesses/homes destroyed and 370 damaged
– Atypical convective mode transition from QLCS to a series of discrete supercells
– Cool-season high-shear/low-CAPE environment

Base Reflectivity Mosaic 23z 10 Feb 2013 Hattiesburg, MS

Simulated Reflectivity (1 km AGL), 12-hr Forecasts Valid 12z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 13-hr Forecasts Valid 13z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 14-hr Forecasts Valid 14z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 15-hr Forecasts Valid 15z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 16-hr Forecasts Valid 16z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Models are slow with the convective line from western MS into southeast TX.

Simulated Reflectivity (1 km AGL), 17-hr Forecasts Valid 17z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 18-hr Forecasts Valid 18z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Radar indicates new cells forming ahead of the line over southeast LA and MS.

Simulated Reflectivity (1 km AGL), 19-hr Forecasts Valid 19z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Linear structure is diminishing as additional discrete storms form over southeast LA and MS.

Simulated Reflectivity (1 km AGL), 20-hr Forecasts Valid 20z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Models fail to predict the incipient supercell storms in southern MS.

Simulated Reflectivity (1 km AGL), 21-hr Forecasts Valid 21z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Tornadic supercells are imminent over southeast MS where the models have few storms.

Simulated Reflectivity (1 km AGL), 22-hr Forecasts Valid 22z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 23-hr Forecasts Valid 23z 10 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic.

Simulated Reflectivity (1 km AGL), 24-hr Forecasts Valid 00z 11 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Tornadic supercells are continuing over southeast MS/southwest AL where the models have few storms.

Did 12z Model Runs Provide Improved Forecasts?

Simulated Reflectivity (1 km AGL), 24-hr Forecasts Valid 00z 11 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Tornadic supercells are continuing over southeast MS/southwest AL where the 00z models have few storms.

Simulated Reflectivity (1 km AGL), 12-hr Forecasts Valid 00z 11 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. The 12z models show general improvement over the earlier 00z runs, with a more intense QLCS.

Hourly Max Updraft Helicity (2-5 km AGL), 24-hr Forecasts Valid 00z 11 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 23-00z. Panel annotations: <25 m2/s2; max 50 m2/s2. Models have limited indications of storms with strong rotating updrafts.
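(For reference, the diagnostic behind these UH panels: updraft helicity is conventionally defined, e.g. in Kain et al. 2008, as the vertical integral of the product of updraft speed and vertical vorticity over the 2-5 km AGL layer:

    UH = \int_{2\,\mathrm{km}}^{5\,\mathrm{km}} w\,\zeta\,dz \qquad [\mathrm{m^2\,s^{-2}}]

where w is the vertical velocity and \zeta is the vertical vorticity. The "hourly max" fields retain the largest UH value reached at each grid point during the preceding forecast hour.)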

Hourly Max Updraft Helicity (2-5 km AGL), 12-hr Forecasts Valid 00z 11 Feb 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 23-00z. Panel annotations: <25 m2/s2; max 125 m2/s2; max 50 m2/s2. The 12z WRF-NMM and NSSL-ARW have stronger rotating updrafts compared to the earlier 00z model runs.

12z and 18z HRRR Forecasts: Was the hourly HRRR able to predict the discrete cells initiating ahead of the convective line?

Simulated Composite Reflectivity Forecasts Valid 21z 10 Feb 2013. Panels: 12z HRRR 09-hr Forecast, 18z HRRR 03-hr Forecast, Observed Radar Mosaic.

Simulated Composite Reflectivity Forecasts Valid 22z 10 Feb 2013. Panels: 12z HRRR 10-hr Forecast, 18z HRRR 04-hr Forecast, Observed Radar Mosaic.

Simulated Composite Reflectivity Forecasts Valid 23z 10 Feb 2013. Panels: 12z HRRR 11-hr Forecast, 18z HRRR 05-hr Forecast, Observed Radar Mosaic.

Simulated Composite Reflectivity Forecasts Valid 00z 11 Feb 2013. Panels: 12z HRRR 12-hr Forecast, 18z HRRR 06-hr Forecast, Observed Radar Mosaic. The 18z HRRR has a smaller phase error than the 12z run, although the convective evolution did not match observations.

Hourly Max Updraft Helicity (2-5 km AGL) Forecasts Valid 00z 11 Feb 2013. Panels: 12z HRRR 12-hr Forecast, 18z HRRR 06-hr Forecast, NSSL Radar Low-Level Rotation Tracks 23-00z. Panel annotations: max 125 m2/s2; max 25 m2/s2. The 18z HRRR rotating updraft forecast is stronger, with better placement, compared to the 12z run. Hi-res models can provide useful guidance for forecasters even when CAM predictions are not “perfect”.

Severe Weather Case of 20 May 2013: Significant Tornado Events – EF-5 Strikes Moore, OK (again)
– 17-mile path length; up to 1.3 miles wide
– 24 fatalities and hundreds injured (the most tornado deaths from a single tornado in 2013)
– Two schools, a hospital, and hundreds of businesses/houses destroyed
– Complex stormscale interactions with the dryline/cold front resulted in rapid storm intensification in central Oklahoma
Unlike the previous 10 February case, more widespread instability and severe storms occurred on 20 May 2013

Base Reflectivity Mosaic 20z 20 May 2013 Moore, OK

Simulated Reflectivity (1 km AGL), 18-hr Forecasts Valid 18z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Storm initiation is beginning in southwest OK and southeast KS; the WRF-NMM4 has spurious storms in MO/AR.

Simulated Reflectivity (1 km AGL), 19-hr Forecasts Valid 19z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Models are generally slow with the initial severe storms in southwest OK, southeast KS, and southwest MO.

Simulated Reflectivity (1 km AGL), 20-hr Forecasts Valid 20z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. NAM4 is weaker than the NSSL-WRF and WRF-NMM with the OK severe storms; the EF-5 tornado is now in Moore, OK.

Simulated Reflectivity (1 km AGL), 21-hr Forecasts Valid 21z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. All models are poor with the severe storms in southeast KS; the NSSL-WRF is better with new storms in TX.

Simulated Reflectivity (1 km AGL), 22-hr Forecasts Valid 22z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. The WRF-NMM and NSSL-WRF indicate bowing storms into eastern OK; the linear transition is too fast.

Simulated Reflectivity (1 km AGL), 23-hr Forecasts Valid 23z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. The NSSL-WRF has the best overall location/intensity; NAM4 is slowest with the eastward movement in OK.

Simulated Reflectivity (1 km AGL), 24-hr Forecasts Valid 00z 21 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, Observed Radar Mosaic. Overall, all models provided useful guidance indicating strong/severe storms developing during the afternoon.

Hourly Max Updraft Helicity (2-5 km AGL), 20-hr Forecasts Valid 20z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 19-20z. Panel annotations: <25 m2/s2; max 200 m2/s2; max 75 m2/s2. The WRF-NMM shows an intense supercell near the Red River; the NSSL-WRF corresponds well to the observed supercells.

Hourly Max Updraft Helicity (2-5 km AGL), 21-hr Forecasts Valid 21z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 20-21z. Panel annotations: max 75 m2/s2; max 200 m2/s2; max 125 m2/s2. By 21z the models indicate strongly rotating updrafts, although locations vary.

Hourly Max Updraft Helicity (2-5 km AGL), 22-hr Forecasts Valid 22z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 21-22z. Panel annotations: max 125 m2/s2; max 175 m2/s2; max 125 m2/s2. All models develop storms with strong rotating updrafts; the NSSL-WRF UH tracks better match observations.

Hourly Max Updraft Helicity (2-5 km AGL), 23-hr Forecasts Valid 23z 20 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 22-23z. Panel annotations: max 125 m2/s2; max 175 m2/s2; max 200 m2/s2.

Hourly Max Updraft Helicity (2-5 km AGL), 24-hr Forecasts Valid 00z 21 May 2013. Panels: NAM 4 km CONUS Nest, 4 km WRF-NMM, NSSL 4 km WRF-ARW, NSSL Radar Low-Level Rotation Tracks 23-00z. Panel annotations: max 100 m2/s2; max 200 m2/s2.

NAM Nest Testing – 20 May Case
EMC MMB has been conducting a series of sensitivity tests with the NAM 4 km Nest to improve severe weather guidance (Brad Ferrier, Eric Aligo)
– Focus has been on improving details of convective storm structure and intensity
– Addresses concerns that NAM Nest convective storms tend to be too weak and diffuse (“do not look like storms on radar”)
– Retrospective tests have been run on several severe weather cases, primarily exploring sensitivity to microphysics and radiation parameterizations, with no convective parameterization (BMJ “light” turned off)

Simulated Composite Reflectivity, 22-hr Forecasts Valid 22z 20 May 2013. Vertical Cross-Section (top); Plan View (bottom). Courtesy: Eric Aligo and Brad Ferrier.

Objective Verification of CAM Reflectivity Forecasts: Spatial Neighborhood Method Applied to Real-Time Model Forecasts

Background
The predictability of convective storms is often low at the grid point in CAM forecasts, though we know these forecasts can be quite useful (e.g., Kain et al. 2010).
Traditional grid-point verification methods penalize high-resolution forecasts with small phase errors. How do we quantify the perceived utility of high-resolution forecasts when the number of exact grid-point matches between forecasts and observations is relatively low?
At SPC, we have examined and applied the spatial neighborhood method owing to its straightforward interpretation and relative ease of implementation; it was tested and validated in HWT Spring Experiments.

Background
SPC began looking at near-real-time objective verification of simulated reflectivity forecasts from convection-allowing models (CAMs) during the 2012 HWT Spring Forecasting Experiment (SFE2012), comparing subjective impressions of forecast skill to objective grid-point and neighborhood metrics.
Neighborhood metrics were overwhelmingly favored by SFE2012 participants as best matching the perceived quality of the reflectivity forecasts.
We wanted to develop a year-round framework for examining objective statistics in near-real time, both for forecaster look-back at model performance and for evaluating parallel CAMs.

Neighborhood Approach. Panels: NSSL-WRF, NMQ OBS. Find the neighborhood maximum reflectivity within a 40 km radius, so the forecast gets credit for nearby 40 dBZ echoes (hits rise from 32 at the grid point to 11,555 with the neighborhood).
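To make the neighborhood step concrete, here is a minimal Python sketch. The helper names are ours, the grid spacing is assumed to be 4 km, and the convention of dilating both the forecast and observed fields before matching is an assumption; the slide does not spell out the exact matching rule.

    import numpy as np
    from scipy.ndimage import maximum_filter

    def disk(r):
        # Boolean circular footprint spanning r grid points in each direction
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        return x * x + y * y <= r * r

    def neighborhood_max(refl_dbz, radius_km=40.0, dx_km=4.0):
        # Replace each grid point with the maximum reflectivity found
        # anywhere within radius_km of that point
        return maximum_filter(refl_dbz, footprint=disk(int(radius_km / dx_km)))

    def neighborhood_hits(fcst_dbz, obs_dbz, thresh=40.0):
        # A point counts as a "hit" when both dilated fields reach the threshold
        f = neighborhood_max(fcst_dbz) >= thresh
        o = neighborhood_max(obs_dbz) >= thresh
        return int(np.sum(f & o))

With a 40-km radius on a 4-km grid the footprint spans 10 grid points in each direction, which is how a handful of exact grid-point matches can grow into thousands of neighborhood hits.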

Neighborhood Metrics. Panels: NSSL-WRF, NMQ OBS. We still calculate traditional 2x2 contingency table statistics (i.e., POD, FAR, Bias, CSI, etc.) on the neighborhood fields. By applying a 2-D Gaussian smoother (with σ = 40 km) to both the model and the obs (e.g., Sobash et al. 2011, Marsh et al. 2012), we get a probabilistic field (purple lines: 40 dBZ) from which the fractions skill score (FSS; Roberts and Lean 2008; Schwartz et al. 2010) can be calculated. FSS = 0.806.
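A sketch of that smoothing-based FSS calculation, assuming the Roberts and Lean (2008) definition FSS = 1 - MSE(Pf, Po) / (mean(Pf^2) + mean(Po^2)) and a 4-km grid; function and argument names are ours:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fss(fcst_dbz, obs_dbz, thresh=40.0, sigma_km=40.0, dx_km=4.0):
        # Binary exceedance grids smoothed into probability-like fields
        pf = gaussian_filter((fcst_dbz >= thresh).astype(float), sigma_km / dx_km)
        po = gaussian_filter((obs_dbz >= thresh).astype(float), sigma_km / dx_km)
        mse = np.mean((pf - po) ** 2)
        mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)  # no-skill reference
        return 1.0 - mse / mse_ref if mse_ref > 0.0 else np.nan

FSS runs from 0 (no overlap) to 1 (perfect), so the 0.806 quoted on the slide indicates strong agreement between the smoothed 40 dBZ fields.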

SPC Internal Webpage (Shared with MMB)
Utilizes a database of gridded reflectivity forecasts and observations
Flexible menu options at top: date, region, cycle, variable, threshold, model, and metric
Multi-panel model display with obs; hourly objective metrics below the images; link to a summary statistics page

Example: 17 Nov 2013 Tornado Outbreak
Outbreak of tornadoes and damaging wind events
– 20 strong/violent tornadoes; wind gusts to 86 mph
– 9 fatalities in Illinois and lower Michigan
– Storms transitioned from discrete supercells to bowing line segments

Example: 17 November in Midwest. 18z: NSSL-WRF and HiResW WRF-ARW are better at depicting the early storms across the region.

Example: 17 November in Midwest. 00z: HiResW ARW has the highest scores; NAMp is better than the NAM at depicting the later linear storms.

Reflectivity Summary Statistics, 13z Nov 17 – 12z Nov 18, 2013: Midwest Region Stats for 40-km Neighborhood
Relatively high CSI/POD scores on this day with a strongly forced scenario.
NSSL-WRF (blue) has the highest POD and CSI for all thresholds; HiResW ARW has a similar CSI at 40 dBZ.
Although the NSSL-WRF also has the largest bias, its POD is much higher without an increase in FAR.
Para NAM Nest (red) has higher POD/CSI at dBZ and a lower FAR at 30 dBZ.
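For reference, the POD, FAR, bias, and CSI quoted on these summary pages follow directly from the 2x2 contingency counts computed on the neighborhood fields; a minimal sketch (names ours):

    def contingency_scores(hits, misses, false_alarms):
        # Standard 2x2 contingency-table metrics (correct negatives unused)
        pod = hits / (hits + misses)                    # probability of detection
        far = false_alarms / (hits + false_alarms)      # false alarm ratio
        bias = (hits + false_alarms) / (hits + misses)  # frequency bias
        csi = hits / (hits + misses + false_alarms)     # critical success index
        return pod, far, bias, csi

On a performance diagram (Roebber 2009), each model then plots as a point with success ratio (1 - FAR) on the x-axis and POD on the y-axis, so the CSI and bias comparisons can be read off a single chart.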

Reflectivity Summary Statistics, August 13 – November 22, 2013: National Stats at Grid Point
Performance diagrams can be dynamically generated with varying inputs (cycle, model, sector, date range).
Accuracy at the grid point is very low for CAM reflectivity forecasts, but we know they can still be useful…

Reflectivity Summary Statistics, August 13 – November 22, 2013: National Stats for 40-km Neighborhood
NSSL-WRF (blue) has the highest CSI for all dBZ thresholds. Although it also has the largest bias, its POD is much higher without an increase in FAR.
Opnl NAM Nest (green) has the lowest CSI at 30 & 40 dBZ, with a low bias/POD.
Para NAM Nest (red) is improved over the Opnl NAM Nest (green) over the past ~3 months.

CAM Verification Summary
Through testing and evaluation during the 2012-2013 HWT Spring Forecasting Experiments, SPC developed a framework for near-real-time objective verification of reflectivity forecasts from CAMs.
This system can help to objectively compare the performance of multiple CAMs and assess the degree of improvement in parallel systems before implementation.
The neighborhood metrics tend to agree well with subjective impressions of forecast skill and provide meaningful separation among the CAMs.
Over the last 3 months (13 August – 22 November) over the CONUS:
– NSSL-WRF had the highest CSI/POD at all dBZ thresholds (i.e., 30, 40, & 50)
– The parallel NAM Nest was statistically better than the operational NAM Nest

Stormscale Ensemble Systems: Multiple CAM ensembles have been tested in the HWT and are used by SPC forecasters

NOAA Hazardous Weather Testbed: Where practitioners and researchers work together…
Experimental Forecast Program (Forecasting Research): Prediction of hazardous weather events from a few hours to a week in advance. Storm Prediction Center: nationwide responsibility.
Experimental Warning Program (Warning Research): Detection and prediction of hazardous weather events up to several hours in advance. NWS OUN Forecast Office: local CWA responsibility.
(Satellite-based Research spans both programs.)

NOAA Hazardous Weather Testbed Spring Forecasting Experiment Overview: What was new for 2013?
Forecasting emphasis on probabilistic forecasts of total severe weather valid for shorter time periods than current operational products, with more frequent updates.
New model guidance available for the generation of these forecasts:
– 12Z convection-allowing ensembles (OU CAPS SSEF, SPC SSEO, and AFWA) in addition to 00Z runs
– Hourly NSSL Mesoscale Ensemble (NME)
– NSSL WRF-ARW parallel initialized from the NME
– UKMET Unified Model convection-allowing runs

2013 HWT Spring Forecasting Experiment: 3 Convection-Allowing Ensembles
OU CAPS Storm-Scale Ensemble Forecast (SSEF)
– 25 members at 00Z, with 15 core members for ensemble products and 10 members with physics-only perturbations; 8 members at 12Z; 4-km grid spacing
– Single model (WRF-ARW), multi-physics, multi-initial conditions; applies SREF perturbations to NAM initial conditions
– Advanced physics, 3DVAR radar data assimilation
SPC Storm-Scale Ensemble of Opportunity (SSEO)
– 7 members at 00Z and 12Z, including 2 time-lagged members
– Multi-model (WRF-ARW, WRF-NMM, and NMM-B), multi-physics; takes available deterministic models and processes them as an ensemble
– All members use NAM ICs/LBCs
Air Force Weather Agency (AFWA) Ensemble
– 10 members at 00Z & 12Z with 4-km grid spacing
– Single model (WRF-ARW), multi-physics, multi-initial conditions; cold start from downscaled global model forecasts
– No data assimilation

2013 HWT Spring Forecasting Experiment: Display Tools to Extract and Summarize Information. 24-hr UH Forecasts from CAM Ensembles valid 21-00z 20 May 2013. Panels: 24-hr Max Any Member; 3-hr NProb >25 m2/s2; 3-hr NProb >100 m2/s2. Rows: CAPS SSEF, SPC SSEO, AFWA.

2013 HWT Spring Forecasting Experiment: Display Tools to Extract and Summarize Information. 12-hr UH Forecasts from CAM Ensembles valid 21-00z 20 May 2013. Panels: 12-hr Max Any Member; 3-hr NProb >25 m2/s2; 3-hr NProb >100 m2/s2. Rows: CAPS SSEF, SPC SSEO, AFWA.
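A sketch of how an "NProb >25 m2/s2" style panel can be built from member UH grids, assuming NProb denotes the fraction of members whose 3-hr max UH exceeds the threshold within a neighborhood of each point (the radius used on these slides is an assumption, and the names are ours):

    import numpy as np
    from scipy.ndimage import maximum_filter

    def neighborhood_prob(member_uh, thresh=25.0, radius_gridpts=10):
        # member_uh: (n_members, ny, nx) stack of 3-hr max updraft helicity
        y, x = np.ogrid[-radius_gridpts:radius_gridpts + 1,
                        -radius_gridpts:radius_gridpts + 1]
        footprint = x * x + y * y <= radius_gridpts * radius_gridpts
        exceed = np.stack([maximum_filter(m, footprint=footprint) >= thresh
                           for m in member_uh])
        # Fraction of members exceeding the threshold nearby -> probability field
        return exceed.mean(axis=0)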

2013 HWT Spring Forecasting Experiment: 00z and 12z CAM Ensembles Comparison – SSEF and SSEO Hourly Reflectivity Forecast Verification (Fractions Skill Score), 24-day sample, M-F, May 6 - Jun 7
The impact of radar data assimilation in the CAPS SSEF was evident in the first 4 hours of the 12z-initialized forecast.
The 12z SSEO had a slightly higher FSS than the 00z SSEO after 15z.
However, the 12z SSEF scores were slightly lower than the 00z SSEF during 19-23z before showing slightly improved skill thereafter.
Performance of the SSEO was often as good as or better than the more formal SSEF.

SPC Request List (1 of 2)
– All involve large challenges with substantial computing resource dependencies
– SPC future NWP model requirements are linked directly to the NOAA/NWS Strategic Goals of Weather-Ready Nation and Warn-on-Forecast, following the Forecasting A Continuum of Environmental Threats (FACETs) concept to enhance communication of risk/uncertainty progressing from days to hours before events
Mesoscale and Convection-Allowing Models (CAMs)
– Continue focused efforts to improve the NAM Nests and HiResWindows to simulate and predict convective storms in a more realistic manner (data assimilation, physics, dynamic core testing)
– Expand deterministic approaches toward a CAM ensemble system (e.g., HRRRE)
– Enhance collaborations between EMC and external scientists having storm-scale DA and modeling expertise, especially within (but not limited to) NOAA: OAR (ESRL, NSSL), NCAR, universities (NCEP visiting scientist program?)
Broaden the objective verification metrics used in assessing model performance, including parallel runs, to include:
– CAPE and 0-6 km bulk shear parameter analyses and forecasts to assess skill in predicting the mesoscale convective environment (GFS, NAM, RAP, CAMs)
– Incorporation of emerging scale-appropriate verification metrics for CAM high-res forecasts of discontinuous fields such as reflectivity, precipitation, and storm intensity attributes (e.g., the new SPC CAM verification system)

SPC Request List (2 of 2)
Continue Efforts to Consolidate and Simplify the Model Run Suite
– Goal: Move toward single global, regional, and stormscale deterministic and ensemble systems that serve specific purposes
– Global: 2-4 times daily out to two weeks
– Regional: every 6 hrs out to Day 4
– Stormscale: every hour out to hrs (predictability dependent)
Model Evaluation Process Prior to Operational Implementation
– The current 30-day pre-implementation evaluation period is necessary for final technical testing of code stability, run timeliness, product correctness, etc.
– But this short period does not typically allow for in-depth assessment by the user community across a range of seasons and weather regimes; the parallel model configuration is largely finalized before most users first see output
– Develop an IT framework to give interested users earlier access to parallel runs under development. One solution would be an internal NWS NOMADS-like data distribution system for Centers/WFOs to access and display gridded datasets from operational and parallel model runs in AWIPS2 (IDP challenge)