An Object-Based Approach for Identifying and Evaluating Convective Initiation
Forecast Impact and Quality Assessment Section, NOAA/ESRL/GSD

Presentation transcript:

Convective Initiation Problem
- Modelers are continually trying to improve the initiation of convective activity in their solutions
- How do you quantify that a model's handling of convective initiation is improving?
  - Qualitative assessments are often misleading
  - Improvements in performance measures at typical initiation times can also be misleading
  - Isolating CI in both the observations and the forecasts to validate improvement is not widely applied
- A solution: use an object-oriented verification approach to assist in the automated identification of CI in the observations and the forecasts

Initiation Confusion
[Figure (Layne and Lack, 2010): a measure of impact to an ARTCC over time, with onset and cessation marked]
- It is important to note that initiation is not the same as onset; onset could have been caused by advection into a particular domain.
- Likewise, cessation of an event is not the same as decay or dissipation.

Procrustes Verification Approach
- The Procrustes verification approach (Lack et al., WAF) identifies objects using threshold/minimum object size criteria, or an FFT transform for signal-strength object detection
- Object matches are based on minimizing a user-defined weighted cost function (see the sketch below)
- Observed objects are matched to all forecast objects and vice versa so they can be tagged as hits, misses, and false alarms
- Error statistics are broken down into components: displacement magnitude and direction, intensity, rotation, and shape
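To make the matching idea concrete, the following is a minimal Python sketch, not the operational Procrustes code, of matching observed and forecast objects by minimizing a weighted cost function and tagging hits, misses, and false alarms. The helper names (match_cost, match_objects), the object attributes ('centroid', 'mean_intensity', 'area'), the cost weights, the greedy matching, and the max_cost cutoff are all illustrative assumptions.

import numpy as np

def match_cost(obs, fcst, w_dist=1.0, w_int=0.5, w_size=0.5):
    """Weighted cost between one observed and one forecast object.
    Each object is a dict with hypothetical keys 'centroid' (x, y),
    'mean_intensity', and 'area'."""
    displacement = np.hypot(obs["centroid"][0] - fcst["centroid"][0],
                            obs["centroid"][1] - fcst["centroid"][1])
    intensity_diff = abs(obs["mean_intensity"] - fcst["mean_intensity"])
    size_diff = abs(obs["area"] - fcst["area"]) / max(obs["area"], fcst["area"])
    return w_dist * displacement + w_int * intensity_diff + w_size * size_diff

def match_objects(obs_objects, fcst_objects, max_cost=50.0):
    """Greedy matching: each observed object takes the cheapest remaining
    forecast object; matched pairs are hits, unmatched observations are
    misses, unmatched forecasts are false alarms."""
    hits, used = [], set()
    for i, obs in enumerate(obs_objects):
        costs = [(match_cost(obs, f), j) for j, f in enumerate(fcst_objects)
                 if j not in used]
        if costs:
            cost, j = min(costs)
            if cost <= max_cost:
                hits.append((i, j, cost))
                used.add(j)
    matched_obs = {i for i, _, _ in hits}
    misses = [i for i in range(len(obs_objects)) if i not in matched_obs]
    false_alarms = [j for j in range(len(fcst_objects)) if j not in used]
    return hits, misses, false_alarms

A real implementation would likely use an optimal assignment rather than the greedy loop above and would include rotation and shape terms from the Procrustes decomposition; the sketch is only meant to show where the user-defined weighted cost enters.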

Procrustes Scheme for Initiation
- A weakness of object-oriented approaches is the counter-intuitive matching that may occur when matching forecast and observation objects (especially at large time steps)
- This is minimized by using small time steps and examining consecutive observation fields or forecast fields (convection has high spatio-temporal correlation at short time scales)
- Hit, miss, and false-alarm detection in the Procrustes scheme helps separate new initiation from growth, decay, and advection
- Misses from T=0 back to T-5 are generally initiation cells (a minimal sketch of this step follows below)
- Handling the decay of larger storms into smaller ones is also implemented, to avoid tagging initiation on cluster breakdown
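A minimal sketch of the initiation-tagging step described above, reusing the hypothetical match_objects helper from the previous sketch: consecutive observed radar fields (here T-5 min and T0) are matched, and T0 objects with no T-5 counterpart (the "misses") become candidate initiation. The 5-minute step and the toy cell attributes are assumptions.

def tag_initiation(objects_t0, objects_tm5):
    """Return indices of T0 objects tagged as candidate new initiation."""
    # Matching T0 back to T-5: T0 objects with no counterpart in the earlier
    # field come back as "misses", i.e. cells that did not exist 5 min earlier.
    _, misses, _ = match_objects(objects_t0, objects_tm5)
    return misses

# Toy example: one cell persists (and grows), one cell is new at T0.
t_m5 = [{"centroid": (10.0, 10.0), "mean_intensity": 45.0, "area": 30.0}]
t_0 = [{"centroid": (11.0, 10.5), "mean_intensity": 48.0, "area": 35.0},  # same cell, grown
       {"centroid": (60.0, 40.0), "mean_intensity": 40.0, "area": 8.0}]   # new cell
print(tag_initiation(t_0, t_m5))  # -> [1]: the new cell is flagged as initiation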

No Initiation Example (threshold method)
[Radar object panels at T-5 = 2105Z and T0 = 2110Z]
Procrustes matching is from T=0 back to T-5, so 1 matches to 1, 2 matches to 2, and 3 is matched to 4. Object 3 in the T-5 field is tagged as a false alarm, and there is no initiation, only growth.

Initiation Example (threshold method)
[Radar object panels at T-5 = 2105Z and T0 = 2110Z]
In this case there are two new areas of initiation: cells 2 and 3 at T=0 (right panel) are tagged as misses in the scheme. Cells 1, 4, and 5 at T=0 are growing cells from cells 1, 2, and 3 at T-5, respectively.

Initiation at 1-h intervals (FFT method)
[Figure: objects identified at 1-h intervals, with areas of new initiation highlighted]

Initiation Detection Challenges
- Sensitivity in what defines an initiation cell (object): there is no standard definition for evaluation
- Radar outages or coverage gaps: large cells may appear during an outage and be tagged as initiation
- Inconsistent radar time steps (e.g., Δt = 5, 6, or 10 min), especially in merged radar products: growth may appear inconsistent
- Radar QC (different scan modes, clutter): when the radar switches modes, initiation objects may spike at those time steps
- Some mitigation is applied to the initiation detection, e.g., minimum size restrictions for radar outages and gaps (a hedged sketch of one possible screen follows below)
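As one hedged illustration of the mitigation idea, and not necessarily the screen used operationally, candidate initiation objects whose first appearance immediately follows a radar data gap can be discarded, since cells that "appear" after an outage are more likely pre-existing storms than true initiation. The screen_after_gaps name, the gap test, and the 2x-nominal-time-step threshold are assumptions.

def screen_after_gaps(candidates, scan_times_min, nominal_dt_min=5.0):
    """Drop initiation candidates (each with a 'time_min' key) that first
    appear at a scan immediately following a data gap. scan_times_min is the
    sorted list of available radar scan times in minutes."""
    # Scan times whose preceding gap is much longer than the nominal step.
    gap_times = {t1 for t0, t1 in zip(scan_times_min, scan_times_min[1:])
                 if (t1 - t0) > 2 * nominal_dt_min}
    return [c for c in candidates if c["time_min"] not in gap_times]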

Verification Methodology (ANC-like example)
- Initiation area 1 (top blue object) has a nested initiation area (2) and 3 cells nearby: it has 1 intersection (distance 0) and 2 cells at distances x1 and x2 within +/- z minutes of the forecast valid time.
- Initiation area 2 (bottom blue object) has 1 cell at distance y within +/- z minutes of the forecast valid time.
- If no forecast initiation zone is present and initiation cells are present, the forecast is tagged as a total miss.
- The Procrustes error decomposition allows for distributions of distances, etc. (an illustrative sketch follows below).
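An illustrative Python sketch of this scoring step, under simplifying assumptions: forecast initiation zones are represented as circles (center plus radius), observed initiation cells as points with a time stamp, and the distance recorded for each cell is the distance to the nearest zone edge (0 if the cell falls inside a zone). The function name (score_initiation_forecast), the zone/cell representations, and the 30-minute default window are assumptions, not the operational system.

import numpy as np

def score_initiation_forecast(forecast_zones, obs_cells, valid_time_min,
                              window_min=30.0):
    """Return nearest-zone distances for observed initiation cells within
    +/- window_min of the forecast valid time, or 'total miss' when
    initiation is observed but no forecast initiation zone was issued."""
    cells = [c for c in obs_cells
             if abs(c["time_min"] - valid_time_min) <= window_min]
    if not cells:
        return []            # nothing to verify in this time window
    if not forecast_zones:
        return "total miss"  # observed initiation, no forecast initiation zone
    distances = []
    for c in cells:
        d = min(np.hypot(c["x"] - z["x"], c["y"] - z["y"]) - z["radius"]
                for z in forecast_zones)
        distances.append(max(d, 0.0))  # inside a zone counts as distance 0
    return distances

The resulting distances can be pooled over many forecasts to build the distance distributions mentioned above.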

Example Results: ANC HITL Pilot Evaluation for ZFW
- HITL input (regime selection and boundary input) altered the initiation potential regions (intensity and, in some cases, location)
- The evaluation may adjust the time window for initiation and define alternate initiation thresholds

Future Directions
- The aforementioned study focused on comparing convective forecasts using radar data as the observation; using radar has its challenges, but it may have value as a verification methodology for CI
- Expand the approach to include other parameters or fields:
  - Satellite data, especially for oceanic applications
  - Total lightning (e.g., GOES-R) for a "truer" convective initiation measure, although model comparisons become more challenging
- Test the concept on sample HRRR explicit CI fields (3 different CI fields); LMA data can be brought in along with reflectivity to test CI detection and evaluation