Do the NAM and GFS have displacement biases in their MCS forecasts?
Charles Yost and Russ Schumacher
Department of Atmospheric Sciences, Texas A&M University
Research supported by COMET grant #Z

Outline
– Brief Background
– Data and Methodology
– Results
– Case Studies
– Future Work

Background
– Mesoscale Convective Systems (MCSs) are responsible for a large percentage of warm-season rainfall
– Researchers and forecasters noticed that the NAM and GFS consistently predicted these events too far north
– HPC and Texas A&M University collaborated to investigate

Cases
– Searched April through August of 2009 and 2010 using:
  – Radar to identify MCSs
  – Stage IV analyses for precipitation amounts
– 29 unique 6-hour intervals
  – Ranging from April 13 to August 18
  – Several cases fell outside the initial time frame

Data
– Stage IV: 6-hourly multi-sensor precipitation analyses
– North American Mesoscale (NAM) model: 00Z and 12Z runs, 6-hourly precipitation forecasts
– Global Forecast System (GFS) model: 00Z and 12Z runs, 6-hourly precipitation forecasts

Methodology
– "Eyeball" test
– Method for Object-Based Diagnostic Evaluation (MODE)

Note on terminology
– 1st Forecast: most recent model forecast
– 2nd Forecast: second most recent forecast
– 3rd Forecast: third most recent forecast
– Example, for the valid period 06Z to 12Z:
  – 1st Forecast: 00Z run, 6–12 h
  – 2nd Forecast: 12Z run (previous day), 18–24 h
  – 3rd Forecast: 00Z run (previous day), 30–36 h
– Note: 1st forecasts for periods ending at 00Z and 12Z are 12-hour forecasts; those ending at 06Z and 18Z are 6-hour forecasts
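
A minimal sketch of this bookkeeping (illustrative only, not code from the study; the helper name and run-selection logic are assumptions), mapping a 6-h valid window onto the three most recent 00Z/12Z runs:

    from datetime import datetime, timedelta

    def recent_forecasts(valid_end, n=3):
        """Return the n most recent 00Z/12Z runs whose 6-h precipitation
        accumulation covers the window ending at valid_end, as
        (run_time, fhr_start, fhr_end) tuples."""
        window_start = valid_end - timedelta(hours=6)
        # Latest 00Z or 12Z cycle at or before the window start
        run = window_start.replace(hour=window_start.hour - window_start.hour % 12)
        runs = []
        for _ in range(n):
            fhr_end = int((valid_end - run).total_seconds() // 3600)
            runs.append((run, fhr_end - 6, fhr_end))
            run -= timedelta(hours=12)  # step back one 00Z/12Z cycle
        return runs

    # The slide's example: valid period 06Z-12Z on 18 August 2009
    for run, f0, f1 in recent_forecasts(datetime(2009, 8, 18, 12)):
        print(f"{run:%Y-%m-%d %HZ} run, {f0}-{f1} h forecast")
    # 2009-08-18 00Z run, 6-12 h forecast
    # 2009-08-17 12Z run, 18-24 h forecast
    # 2009-08-17 00Z run, 30-36 h forecast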

August 18, 2009 – 12Z

Method for Object-Based Diagnostic Evaluation (MODE)
– Resolves objects in the observed and forecast fields
– Provides detailed information about the objects:
  – Centroid location, object area, length, width
  – Axis angle, aspect ratio, curvature, intensity
– Can pair observed and forecast objects

Process for Resolving Objects
– Davis, C., B. Brown, and R. Bullock, 2006a: Object-based verification of precipitation forecasts. Part I: Methods and application to mesoscale rain areas. Mon. Wea. Rev., 134, 1772–1784.
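
The object-resolution step in Davis et al. (2006a) is a convolution-threshold procedure: smooth the precipitation field with a circular filter, threshold the smoothed field to define object boundaries, and restore the raw values inside each object. A minimal sketch (my own simplification using NumPy/SciPy, not MODE's actual source):

    import numpy as np
    from scipy import ndimage

    def resolve_objects(precip, radius, threshold):
        """Convolution-threshold object identification, after Davis et al.
        (2006a): smooth with a disk filter, threshold, label connected
        regions, and keep the raw precipitation inside the objects."""
        # Circular averaging filter with the given radius in grid points
        y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
        disk = (x**2 + y**2 <= radius**2).astype(float)
        disk /= disk.sum()
        smoothed = ndimage.convolve(precip, disk, mode="constant")
        # Objects are connected regions where the smoothed field meets
        # the threshold; raw values are restored inside them
        mask = smoothed >= threshold
        labels, nobj = ndimage.label(mask)
        return labels, np.where(mask, precip, 0.0), nobj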

MODE Tool Settings

    Model   Radius   Threshold
    GFS     4        ≥ 7.5
    NAM     6        ≥ 10

– The GFS was regridded to grid 212; the NAM remained on grid 218
– Stage IV was regridded to the corresponding forecast's grid
– Radii and thresholds were selected to match the objects a human analyst would draw
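
Using the sketch above with the settings in this table (variable names hypothetical), e.g. for the NAM:

    # NAM settings from the table: radius 6, threshold >= 10
    labels, masked_precip, nobj = resolve_objects(nam_6h_precip, radius=6, threshold=10.0)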

MODE Tool Output
– Fields used:
  – Centroid (center of mass)
  – Area
  – Length
  – Width
– Forecast error defined as "Forecast – Observed"
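
Given a matched forecast/observed object pair, the displacement is the forecast-minus-observed centroid offset. A hedged sketch (my own; MODE reports centroid separations itself, and the flat-earth km conversion here is an approximation) that also assigns the compass quadrant used in the tables below:

    import math

    def centroid_error(fcst_lat, fcst_lon, obs_lat, obs_lon):
        """Forecast-minus-observed centroid displacement (km) and its
        compass quadrant. Flat-earth approximation, illustrative only."""
        dy = (fcst_lat - obs_lat) * 111.0  # km per degree latitude
        dx = (fcst_lon - obs_lon) * 111.0 * math.cos(math.radians(obs_lat))
        if dy >= 0:
            quadrant = "1 (NE)" if dx >= 0 else "2 (NW)"
        else:
            quadrant = "4 (SE)" if dx >= 0 else "3 (SW)"
        return math.hypot(dx, dy), quadrant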

Results

“Eyeball” Test

GFS Forecast Errors

    Quadrant   Number of Points   Percentage
    1 (NE)     42                 48%
    2 (NW)     21                 24%
    3 (SW)      9                 11%
    4 (SE)     15                 17%
    TOTAL      87                100%

    Distance (km): mean / median / standard deviation (values shown on the original slide)

“Eyeball” Test

NAM Forecast Errors

    Quadrant   Number of Points   Percentage
    1 (NE)     29                 46%
    2 (NW)     24                 38%
    3 (SW)      6                 10%
    4 (SE)      4                  6%
    TOTAL      63                100%

    Distance (km): mean / median / standard deviation (values shown on the original slide)
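
The quadrant tables above can be reproduced by tallying the per-case displacements; a small sketch (building on the hypothetical centroid_error() helper from the earlier slide):

    from collections import Counter
    from statistics import mean, median, stdev

    def summarize_errors(errors):
        """Print quadrant counts/percentages and distance statistics
        from a list of (distance_km, quadrant) pairs."""
        counts = Counter(quad for _, quad in errors)
        total = len(errors)
        for quad in ("1 (NE)", "2 (NW)", "3 (SW)", "4 (SE)"):
            n = counts.get(quad, 0)
            print(f"{quad:8s} {n:4d} {100 * n / total:5.0f}%")
        dists = [d for d, _ in errors]
        print(f"Distance (km): mean {mean(dists):.1f}, "
              f"median {median(dists):.1f}, stand. dev. {stdev(dists):.1f}")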

Forecasting Questions
– Is there a correlation between forecast error (distance) and forecast area?
– Is there a correlation between forecast error (distance) and forecast width?
– Is there a correlation between forecast error (distance) and forecast length?

Conclusions
– The "eyeball" test and MODE give consistent results
– Clear northward bias in the NAM
  – 84% of cases
  – No temporal bias
– Northward bias also present in the GFS, though not as strong
  – 72% of cases
  – Tends to move systems through too early (65%)
– No clear relationship between displacement and object area, width, or length

Future Work
– Expand the time period to include more years and cases
– Does this bias exist at higher resolutions (e.g., the NSSL WRF)?
– What are the causes of this bias?