Real-data WRF: Verification with MET package


ATM 419/563, Spring 2017, Fovell

Terms
- AGL = above (local) ground level
- MSL = above mean sea level
- ASOS = Automated Surface Observing System
- MET = Model Evaluation Tools package
- MADIS = Meteorological Assimilation Data Ingest System

References
- MET Users Guide (version 5.0) (PDF)
- ASOS home page
- MADIS data web portal

Outline
Compare available ASOS observations to the WRF simulation, including:
- temperature, dew point, and humidity at 2 m AGL
- wind speed, nominally at 10 m AGL (some ASOS wind towers are not at 10 m / 33 ft)
ASOS observations are acquired from MADIS.

Part 1: Domain-averaged analysis
See script for required actions.

Steps for Part 1
1. Unipost: unpacks the WRF output files into GRIB format, one domain at a time (if doing nesting).
2. MET PointStat tool: interpolates model fields to the locations of observations derived from the MADIS database for comparison.

Scripts for Part 1 (a sketch of how they chain together follows this list)
- run_unipost: a bash shell script that calls unipost.exe. Output: WRFPRD* and wrfprd* files in postprd/
- MET_run_ASCII2_ASOS.sh: a bash shell script that invokes MET's PointStat tool. Output: point_stat* files in postprd/
- plot_met.sh: a bash shell script that extracts average forecast and observation values for several variables. Outputs data to the screen; paste into a spreadsheet.
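A minimal sketch of one way to chain these for a single case, assuming all three scripts sit in the current directory and have already been edited as described on the next slides:

    #!/bin/bash
    # Hypothetical Part 1 driver; script names are from this tutorial,
    # but invocation details may differ in your setup.
    set -e                        # stop if any step fails

    sh run_unipost                # WRF output -> GRIB files in postprd/
    sh MET_run_ASCII2_ASOS.sh     # PointStat: model vs. ASOS obs -> point_stat* files
    sh plot_met.sh 2              # print domain-averaged 2-m temperature statistics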

Unipost
See script for actions required. Edit run_unipost; these lines may/will need attention in the future:

    export startdate=2016031300   # start time/date of the simulation
    export fhr=00                 # start at time 0
    export lastfhr=48             # 48-h simulation
    export incrementhr=01        # increment by 1 hour (we have 20-min output, but verify hourly)
    for domain in d01             # process domain 1 (each domain, if it exists, is done separately)

The script is set up to use my build of UPPV2.0. This is obsolete, but I've had no reason to upgrade.
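As a purely illustrative example (the dates here are hypothetical), a future 24-h case started at 00 UTC 1 February 2017 would need edits along these lines:

    export startdate=2017020100   # hypothetical new start time/date
    export fhr=00
    export lastfhr=24             # now a 24-h simulation
    export incrementhr=01
    for domain in d01             # unchanged if still verifying domain 1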

Execute MET PointStat
See script for actions required. Edit MET_run_ASCII2_ASOS.sh; these lines may/will need attention in the future:

    Date_base=20160313                           # start date of the simulation (expressed differently)
    Date_hour=00                                 # start hour
    domain=1                                     # process domain 1
    OBS_base=$MYLAB/MADIS/MADIS_20160313_ASOS    # where the MADIS data reside

DO NOT alter $MYHOME and $MYLAB in the script.

plot_met.sh
See script for actions required. A bash shell script that reads information from the postprd/*.cnt.txt files and writes it to the screen; you can copy/paste these data into a spreadsheet for plotting.

Usage: sh plot_met.sh N, where N is
- 2 for 2-m temperature (K)
- 3 for 10-m wind speed (m/s)
- 6 for 2-m dew point (K)
- 7 for 2-m relative humidity (%)
- 11 for sea-level pressure (Pa)
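For example, to save each variable's statistics to a file for spreadsheet import (the output filenames are arbitrary):

    sh plot_met.sh 2  > t2m_stats.txt     # 2-m temperature (K)
    sh plot_met.sh 3  > wspd10_stats.txt  # 10-m wind speed (m/s)
    sh plot_met.sh 11 > slp_stats.txt     # sea-level pressure (Pa)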

Bias, MAE, MSE, and BCMSE
A large positive or negative bias means a bad forecast, but a near-zero bias does not necessarily mean a good forecast, since errors of opposite sign can cancel in the average. MSE exaggerates the impact of the largest errors.
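For reference, the standard definitions of these statistics over N forecast-observation pairs (f_i, o_i):

$$
\begin{aligned}
\mathrm{BIAS}  &= \frac{1}{N}\sum_{i=1}^{N} (f_i - o_i) \\
\mathrm{MAE}   &= \frac{1}{N}\sum_{i=1}^{N} \lvert f_i - o_i \rvert \\
\mathrm{MSE}   &= \frac{1}{N}\sum_{i=1}^{N} (f_i - o_i)^2 \\
\mathrm{BCMSE} &= \mathrm{MSE} - \mathrm{BIAS}^2
\end{aligned}
$$

Subtracting the squared bias from the MSE leaves the error variance, so BCMSE measures the scatter about the mean error rather than the mean error itself.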

Output for N = 2

    Date_HHMMSS FCST OBS N MAE BCMSE MSE BIAS fcst std obs std level
    20160313_000000 TMPf 288.03292 TMPo 288.99799 N 250 MAE 1.44846 BCRMSE 3.35828 MSE 4.28965 BIAS -0.96508 STDf 4.10845 STDo 3.99308 LVL Z2f Z2o
    20160313_010000 TMPf 285.67687 TMPo 287.48599 N 250 MAE 2.09267 BCRMSE 3.99666 MSE 7.26957 BIAS -1.80912 STDf 3.85012 STDo 3.65356 LVL Z2f Z2o
    20160313_020000 TMPf 284.86614 TMPo 286.24960 N 251 MAE 1.90243 BCRMSE 4.28162 MSE 6.19556 BIAS -1.38345 STDf 3.99740 STDo 3.78339 LVL Z2f Z2o
    20160313_030000 TMPf 284.27217 TMPo 285.51254 N 251 MAE 1.76028 BCRMSE 3.97226 MSE 5.51077 BIAS -1.24037 STDf 4.02758 STDo 3.87092 LVL Z2f Z2o
    20160313_040000 TMPf 283.82116 TMPo 285.03047 N 251 MAE 1.71808 BCRMSE 4.03162 MSE 5.49406 BIAS -1.20931 STDf 4.09182 STDo 3.87010 LVL Z2f Z2o

NOTES:
- f = forecast, o = obs
- The forecast and observed field is TMP (2-m temperature), averaged over N stations
- The forecast and observation level here is 2 m ("Z2"). Others: 10 m (Z10), surface (Z0), model level 1 (L1)
- BIAS = bias or mean error
- MAE = mean absolute error
- MSE = mean squared error
- BCMSE = bias-corrected mean squared error (labeled BCRMSE in the raw output)
- std = standard deviation of the forecasts (STDf) and observations (STDo)
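A minimal sketch of pulling one column out of this output for plotting, assuming the whitespace-separated layout above, in which the date is field 1 and the BIAS value is field 15 (verify the field positions against your own output):

    # time series of 2-m temperature bias
    sh plot_met.sh 2 | awk '{print $1, $15}'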

Results for KANSAS01: output from plot_met.sh copied into Excel.

ASOS 2-m temperature (sh plot_met.sh 2). Red: observed; black: forecast.

ASOS 2-m dew point (sh plot_met.sh 6). Red: observed; black: forecast.

ASOS 2-m relative humidity (sh plot_met.sh 7). Red: observed; black: forecast.

ASOS 10-m wind speed (sh plot_met.sh 3). Red: observed; black: forecast. Note the diurnal cycle of the error.

ASOS sea-level pressure (sh plot_met.sh 11). Red: observed; black: forecast.

Part 2: Station-based analysis of wind speed
See script for required actions.

Scripts for Part 2 (a sketch of how they chain together follows this list)
- run_graph_aircraft_mpr_F10M.sh: a bash shell script that calls a Perl script that combs through the point_stat* files in postprd/ for individual stations. Output: station-by-station files like member_WINDZ10F10M_KBMQ.dat in directory FILTERED_MET_STATS/tmp
- sum_and_average.sh: a bash shell script that calls a Perl script that reads through the files in directory FILTERED_MET_STATS/tmp. Output: the input_for_grads file
- grads_plot.sh: a bash shell script that calls a Perl script to create a GrADS script for plotting from the contents of input_for_grads. Output: stations.gs
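As in Part 1, a minimal sketch of the Part 2 sequence, assuming the scripts are in the current directory (invocation details may vary in your setup):

    #!/bin/bash
    set -e                               # stop if any step fails

    sh run_graph_aircraft_mpr_F10M.sh    # per-station stats -> FILTERED_MET_STATS/tmp/member_*.dat
    sh sum_and_average.sh                # event averages    -> input_for_grads
    sh grads_plot.sh                     # GrADS plot script -> stations.gs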

Execute run_graph_aircraft_mpr_F10M.sh
See script for actions required. Contents of FILTERED_MET_STATS/tmp:

    member_WINDZ10F10M_K9V9.dat  member_WINDZ10F10M_KFSD.dat  member_WINDZ10F10M_KMSY.dat
    member_WINDZ10F10M_KAAO.dat  member_WINDZ10F10M_KFSM.dat  member_WINDZ10F10M_KMTJ.dat
    member_WINDZ10F10M_KABI.dat  member_WINDZ10F10M_KFST.dat  member_WINDZ10F10M_KMWL.dat
    member_WINDZ10F10M_KABQ.dat  member_WINDZ10F10M_KFTW.dat  member_WINDZ10F10M_KMWT.dat
    member_WINDZ10F10M_KABR.dat  member_WINDZ10F10M_KFYV.dat  member_WINDZ10F10M_KNFW.dat
    member_WINDZ10F10M_KACT.dat  member_WINDZ10F10M_KGAG.dat  member_WINDZ10F10M_KODO.dat
    member_WINDZ10F10M_KAEX.dat  member_WINDZ10F10M_KGCC.dat  member_WINDZ10F10M_KODX.dat
    [and more]

Execute sum_and_average.sh
See script for actions required. The input_for_grads file holds the event-averaged statistics, one line per station file in FILTERED_MET_STATS/tmp (more input_for_grads):

    member_WINDZ10F10M_K9V9.dat, Bias:, 0.904, MAE, Average:, 1.418, FBAR:, 4.55, OBAR:, 3.64, STDEV_OBS:, 43.800, -99.320, 2.33, NUM_OBS, 49, MAXO:, 10.803,
    member_WINDZ10F10M_KAAO.dat, Bias:, -0.482, MAE, Average:, 1.250, FBAR:, 2.66, OBAR:, 3.14, STDEV_OBS:, 37.750, -97.220, 1.62, NUM_OBS, 49, MAXO:, 8.231,
    member_WINDZ10F10M_KABI.dat, Bias:, -0.250, MAE, Average:, 1.174, FBAR:, 5.51, OBAR:, 5.76, STDEV_OBS:, 32.420, -99.680, 2.32, NUM_OBS, 48, MAXO:, 10.289,
    [etc.]

The two values following STDEV_OBS: (e.g., 43.800, -99.320) are the station latitude and longitude.
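A minimal sketch of extracting each station's event-average forecast (FBAR) and observed (OBAR) wind speeds for a scatter plot like the one shown later, assuming the comma-separated layout above, in which the FBAR value is field 8 and the OBAR value is field 10 (check against your own file):

    # station file, FBAR, OBAR on each line
    awk -F',' '{print $1, $8, $10}' input_for_grads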

Execute grads_plot.sh
See script for actions required (more stations.gs):

    * convert the station longitude/latitude to page x,y coordinates
    'q w2xy -99.32 43.8'
    xpos=subwrd(result,3)
    ypos=subwrd(result,6)
    * draw a filled circle (mark 3), then an outlined open circle (mark 2), at the station
    'set line 0'
    'draw mark 3 'xpos' 'ypos' 0.15'
    'set line 1 1 5'
    'draw mark 2 'xpos' 'ypos' 0.15'
    [etc.]

Plots a circle for each station, with the circle fill indicating the event-averaged bias.
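The generated script can then be run inside GrADS; one common invocation (assuming a standard GrADS installation; adjust to your environment) is:

    grads -blc 'run stations.gs'    # -b batch, -l landscape, -c execute this command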

Average event wind bias (m/s), 254 ASOS stations. Labeled stations: KLXV (Leadville, CO) and KGDP (Guadalupe Pass, TX); Roswell is the red dot in SE NM.

KANSAS01 event-averaged wind bias, ranked. N = 254 stations; mean bias = +0.32 m/s; standard deviation = 1.08 m/s. The shape of this ranked distribution is essentially what is expected if the bias is normally distributed (null hypothesis: the distribution is normal; probability > 99%).

Q-Q plot (quantile-quantile). In the box: symbol = mean; horizontal line = median; box width = interquartile range (25th to 75th percentile); whiskers at 1.5 times the interquartile range.

Contents of input_for_grads copied into Excel (MET_verif_NEW_plot_met.xlsx, KANSAS01 stations tab), plotting event-average forecast wind speed (FBAR) vs. observed (OBAR); each point is a station. Labeled outliers include Guadalupe Pass and Leadville. Some of these less well-represented stations:
- have anemometers mounted below 10 m
- are close to the domain boundary
- have terrain misrepresented by the coarse resolution
- have inappropriate landuse or z0 assignments
- are bad observations
