TAIWIN model verification task
Jamie Wolff. Team members: Tressa Fowler, John Halley Gotway, Michelle Harrold, Tracy Hertneky, Kyoko Ikeda, Scott Landolt. Collaborations with Greg Thompson and Mei Xu.
FAA Icing Weather Tools Review, 13 July 2016
Model verification task
Task: Examine microphysics forecasts and NWP model forecast performance for TAIWIN-relevant fields
- Conduct comprehensive verification in order to establish a baseline for how well current operational models perform
- Monitor model performance in future years to quantify the impact of developments
Deliverable: Internal FAA report
Status:
- Model datasets
- Observation datasets
- Verification approaches
- Preliminary results
Model datasets
- Operational HRRR (3 km)
  - Hybrid level data (50 levels)
  - Mixing ratio (at each level): cloud water, cloud ice, rain, snow, graupel
- Operational NAMnest (4 km)
  - Isobaric level data (Grid 227, 5 km Lambert conformal)
  - 42 isobaric levels (hPa)
  - Mixing ratio (at each level): cloud water, cloud ice, rain, snow
- Pulled for 1 Jan – 31 March 2016
- Regridded each model to a common 3-km domain for verification purposes
Model output examined
- Mixing ratios: rain and snow
- Categorical (sfc) precipitation type: rain, snow, ice pellets, freezing rain
Mixing ratios (lowest level)
[Figure: example lowest-level mixing ratio fields from HRRR and NAM; NAM output carries no graupel category]
Categorical ptype
[Figure: categorical precipitation type from HRRR and NAM. Categories are not mutually exclusive except RA vs. FZRA; an ensemble technique declares the dominant category as the ptype.]
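The dominant-category idea mentioned above can be sketched as follows. This is a minimal illustration with a hypothetical function name and made-up member outputs, not the operational implementation:

```python
from collections import Counter

def dominant_ptype(member_types):
    """Declare as the single categorical ptype the category reported
    by the most ensemble members (algorithm variants) at a point.
    Ties break in favor of the member listed first."""
    counts = Counter(member_types)
    category, _ = counts.most_common(1)[0]
    return category

# Hypothetical example: five algorithm variants at one grid point
print(dominant_ptype(["SNOW", "SNOW", "FZRA", "SNOW", "ICEP"]))  # SNOW
```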
Verification datasets
Point observations:
- METARs: any report of weather types RA, DZ, FZRA, FZDZ, SN, GR, GS, PE, PL, SG (purposely ignoring VC and UP reports). ASOS A/B stations are used to identify reliable observations of non-occurrence of precipitation (no weather type report – set to NONE).
- mPING: any report of rain (3), freezing rain (4), drizzle (5), freezing drizzle (6), ice pellets/sleet (7), snow and/or graupel (8), mixed rain and snow (9), mixed ice pellets and snow (10), mixed rain and ice pellets (11), graupel (12), mixed freezing rain and ice pellets (48), or none (2).
- Precipitation type reports categorized and used in MET as M*_RAIN, M*_SNOW, M*_FRZR, M*_ICEP, M*_NONE.
Gridded observations:
- Multi-Radar/Multi-Sensor (MRMS) – CONUS, ~1 km resolution
- Hourly QPE (gauge-corrected radar estimates)
- Automated surface precipitation classification (ptype): seven classes categorized into rain, snow, or none
- Regridded to the same 3-km domain as the model output
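Reducing the mPING report codes to the five MET categories can be sketched as a lookup table. The single-type codes follow the list above directly; the grouping of drizzle, graupel, and the mixed codes is an illustrative assumption, not the project's documented rule, and the `M_*` labels stand in for the wildcard names on the slide:

```python
# Assumed mapping from mPING integer report codes to MET ptype categories.
# Mixed-type and drizzle/graupel assignments are illustrative guesses.
MPING_TO_MET = {
    2:  "M_NONE",   # none
    3:  "M_RAIN",   # rain
    4:  "M_FRZR",   # freezing rain
    5:  "M_RAIN",   # drizzle (assumed grouped with rain)
    6:  "M_FRZR",   # freezing drizzle (assumed grouped with freezing rain)
    7:  "M_ICEP",   # ice pellets/sleet
    8:  "M_SNOW",   # snow and/or graupel
    9:  "M_SNOW",   # mixed rain and snow (assumed)
    10: "M_ICEP",   # mixed ice pellets and snow (assumed)
    11: "M_ICEP",   # mixed rain and ice pellets (assumed)
    12: "M_SNOW",   # graupel (assumed grouped with snow)
    48: "M_FRZR",   # mixed freezing rain and ice pellets (assumed)
}

def met_category(code):
    """Return the MET ptype category for an mPING report code."""
    return MPING_TO_MET.get(code, "UNKNOWN")
```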
Verification approaches
Grid-to-grid comparisons:
- Model accumulated precipitation vs. MRMS QPE
- Model mixing ratios (RWMR [T>0] / SNMR) vs. MRMS ptype (rain/snow/none)
- Model categorical precipitation type vs. MRMS ptype (rain/snow/none)
Grid-to-point comparisons:
- Model mixing ratios vs. METAR/mPING ptype (rain [T>0] / frzr [T<0] / snow / none)
- Model categorical precipitation type vs. METAR/mPING ptype (rain [T>0] / frzr [T<0] / snow / icep / none)
Neighborhood widths of 1 (nearest), 2, 3, 4, 5, and 6 grid points were examined:
- Categorical: fraction of points in the n x n box that are matches
- Mixing ratios: maximum forecast value of points in the n x n box
Statistics computed:
- Probability of Detection (POD) of yes and no
- False Alarm Ratio (FAR)
- Gilbert Skill Score (GSS)
- Frequency Bias (Fbias)
- Performance diagrams (PODy, Success Ratio, Bias, Critical Success Index)
Processing done, plots created – analysis underway!
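All of the categorical statistics listed above derive from the same 2x2 contingency table (hits, misses, false alarms, correct negatives). A self-contained sketch using the standard textbook formulas (this mirrors, but is not, the MET implementation):

```python
def contingency_stats(hits, misses, false_alarms, correct_negs):
    """Standard 2x2 contingency-table verification statistics:
    POD of yes/no, FAR, frequency bias, CSI, GSS, success ratio."""
    total = hits + misses + false_alarms + correct_negs
    pod_yes = hits / (hits + misses)
    pod_no = correct_negs / (correct_negs + false_alarms)
    far = false_alarms / (hits + false_alarms)
    fbias = (hits + false_alarms) / (hits + misses)
    csi = hits / (hits + misses + false_alarms)   # Critical Success Index
    # GSS discounts hits expected by chance given the marginal totals
    hits_random = (hits + misses) * (hits + false_alarms) / total
    gss = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return {"PODy": pod_yes, "PODn": pod_no, "FAR": far, "Fbias": fbias,
            "CSI": csi, "GSS": gss, "SuccessRatio": 1.0 - far}

# Made-up counts for illustration only
stats = contingency_stats(hits=40, misses=10, false_alarms=20, correct_negs=30)
print(stats["PODy"], stats["GSS"])  # 0.8 0.25
```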
Sample ptype results
[Figure: Probability of Detection (PODy) for CRAIN, CSNOW, CFRZR, and CICEP — opHRRR vs. opNAMnest vs. Benjamin et al. (WAF); model ptype vs. METAR, 4x4 window (~12 km), 95% CIs, CONUS, JFM 2016]
Sample ptype results
[Figure: False Alarm Ratio (FAR) for CRAIN, CSNOW, CFRZR, and CICEP — opHRRR vs. opNAMnest vs. Benjamin et al. (WAF); model ptype vs. METAR, 4x4 window (~12 km), 95% CIs, CONUS, JFM 2016]
Critical Success Index
[Figure: performance diagram — Success Ratio vs. PODy with frequency bias and Critical Success Index isolines; perfect score at upper right, overforecast above the bias = 1 diagonal, underforecast below]
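The performance diagram works because all four plotted quantities come from the same 2x2 table (h = hits, m = misses, f = false alarms), so bias and CSI appear as fixed isolines in the Success Ratio–PODy plane:

```latex
\mathrm{PODy} = \frac{h}{h+m}, \qquad
\mathrm{SR} = 1 - \mathrm{FAR} = \frac{h}{h+f},
\qquad
\mathrm{Bias} = \frac{h+f}{h+m} = \frac{\mathrm{PODy}}{\mathrm{SR}},
\qquad
\mathrm{CSI} = \frac{h}{h+m+f}
             = \left(\frac{1}{\mathrm{SR}} + \frac{1}{\mathrm{PODy}} - 1\right)^{-1}
```

A point at the upper right (SR = PODy = 1) therefore scores perfectly on all four measures at once.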
Future work
- Monitor operational model performance to quantify the impact of developments
- Utilize output from the Freezing Drizzle algorithm to further assess surface weather type in the models using these enhanced observations
- Explore methods and observations to verify aloft conditions
- Assist with evaluating newly developed techniques within the TAIWIN modeling task area (e.g., HRRR-TLE, aerosol-aware scheme, cloud underproduction)
- Apply spatial and object-based verification techniques to acquire advanced diagnostic information to help identify strengths and weaknesses of models