Model Evaluation Tools (MET)

Presentation transcript:

Model Evaluation Tools (MET)

What is MET
Model Evaluation Tools (MET) is a powerful and highly configurable verification package developed by the DTC, offering:
- Standard verification scores comparing gridded model data to point-based observations
- Standard verification scores comparing gridded model data to gridded observations
- Spatial verification methods comparing gridded model data to gridded observations using neighborhood, object-based, and intensity-scale decomposition approaches
- Ensemble and probabilistic verification methods comparing gridded model data to point-based or gridded observations
- Aggregation of the output of these verification methods through time and space

What is MET

MET Steps: Configuration
A configuration file* (specific to each MET tool) detailing the verification settings must be filled in and supplied before a MET verification run.
- ASCII text format
- All settings are explained in comments within the file
- Can be reused for multiple runs
*More information on all of the configuration file settings can be found in the MET Tutorial PDF (see the links under "More Information" below).
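Because the configuration file is plain ASCII, one way to reuse it across runs is to keep a template and fill in run-specific values with a small script. The sketch below is purely illustrative: the template file name and the MODEL_NAME placeholder token are inventions of this example, not MET syntax.

#!/bin/bash
# Hypothetical example: generate per-model config files from one ASCII template.
# "MODEL_NAME" is a placeholder token in our own template file, not a MET keyword.
for MODEL in WRF-ARW WRF-NMM; do
    sed "s/MODEL_NAME/${MODEL}/g" config/GridStatConfig.template \
        > "config/GridStatConfig_${MODEL}"
done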

MET Steps: Running the selected MET tool
A run is specified by:
- MET tool name
- Forecast data file
- Observation data file
- Configuration file
- Output directory
- Additional arguments

Example for GRID-STAT:
METv3.0.1/bin/grid_stat "$FCST_FILE" "$OBS_FILE" "$CONFIG_FILE" -outdir "$OUTPUT_DIRECTORY" -v 2

Example for POINT-STAT:
METv3.0.1/bin/point_stat "$FCST_FILE" "$OBS_FILE" "$CONFIG_FILE" -outdir "$OUTPUT_DIRECTORY" -v 2
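As a minimal sketch of how those placeholder variables might be filled in before a single Grid-Stat run (all file paths below are hypothetical):

#!/bin/bash
# Hypothetical paths -- substitute your own forecast, observation, and config files.
MET_BIN=METv3.0.1/bin
FCST_FILE=/data/wrf/postprd/wrfprs_d01_24.grb     # gridded forecast
OBS_FILE=/data/obs/ST4.2011021912.06h.grb         # gridded observations (e.g., Stage IV)
CONFIG_FILE=config/GridStatConfig_WRF-ARW         # ASCII configuration file
OUTPUT_DIRECTORY=out/grid_stat/2011021912

mkdir -p "$OUTPUT_DIRECTORY"
"$MET_BIN"/grid_stat "$FCST_FILE" "$OBS_FILE" "$CONFIG_FILE" -outdir "$OUTPUT_DIRECTORY" -v 2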

MET Automation
Without automation, a MET tool verifies one forecast file against one observation file, which is impractical for real data sets and is usually only done for testing. Shell scripts provide:
- Speed (MET tools are run thousands of times over entire data sets)
- Control (MET is run with specified settings, or only when certain conditions are met)
- Data collection, formatting, and conversion
- Bookkeeping (based on time, settings, etc.)
- Custom, correct, and consistent output filenames
- A tree-like folder structure well suited to METViewer
- Easy adjustment to run over different models and/or times

MET Automation
Automation scripts vary greatly depending on usage, but most follow a general structure (sketched in the script below):
1. Set settings (period, domain, initialization times, etc.)
2. Gather forecast data
3. Gather observation data
4. Reformat forecast and observation data so that MET can read them
5. Iterate over time, computing forecast and observation filenames
6. If a matching pair is found in step 5, run the MET tool with the parameters from step 1
7. Aggregate and reformat the output data
8. Repeat for different models
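A minimal shell sketch of that structure follows. The directory layout and file-naming conventions are hypothetical; only the grid_stat command line follows the form shown earlier.

#!/bin/bash
# Sketch of the general automation structure (steps 1-8 above).
# Directory layout and file names are hypothetical -- adapt to your own archives.
MET_BIN=METv3.0.1/bin
MODEL=WRF-ARW                                   # 1. settings
INIT=2011021900                                 #    initialization time (YYYYMMDDHH)
CONFIG=config/GridStatConfig_${MODEL}

for LEAD in 06 12 18 24; do                     # 5. iterate over lead times
    # valid time = initialization time + lead time (requires GNU date)
    VALID=$(date -u -d "${INIT:0:4}-${INIT:4:2}-${INIT:6:2} ${INIT:8:2}:00 ${LEAD} hours" +%Y%m%d%H)
    FCST=/data/${MODEL}/${INIT}/wrfprs_f${LEAD}.grb     # 2. forecast data
    OBS=/data/stage4/ST4.${VALID}.06h.grb               # 3. observation data (already MET-readable)

    if [[ -f $FCST && -f $OBS ]]; then          # 6. run only when a matching pair exists
        OUTDIR=out/${MODEL}/${INIT}/f${LEAD}    #    consistent, tree-like output structure
        mkdir -p "$OUTDIR"
        "$MET_BIN"/grid_stat "$FCST" "$OBS" "$CONFIG" -outdir "$OUTDIR" -v 2
    fi
done
# 7./8. Aggregating the output and repeating for other models follow the same pattern.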

DTC/HMT Collaboration: Ensemble QPF Comparisons
Area under the ROC curve for SREF and WRF ensembles, for three IOPs in February 2011 over the California domain, for rainfall greater than 0.5 inch.

MODE Attributes and Model Diagnostics: Ensemble Members and Systematic Microphysics Impacts
- Thompson microphysics scheme members: colors 5-7; the others use the Ferrier or Schultz schemes
- Thompson members are less intense but larger (see the averages especially)
- Note the tails of the extrema for both size and intensity (medians vs. averages)

Graphics
MET does not have a built-in way to display the computed statistics. MET writes all statistics to ASCII text files, which can be plotted with many different software packages if desired, or METViewer can be used. METViewer is a separate, highly configurable software package that visualizes MET output.
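For a quick look at the numbers without METViewer, the ASCII output can also be sliced with standard command-line tools. The sketch below assumes only that the file is whitespace-delimited with a header row of column names (as MET's ASCII output is); the file name and column name are placeholders.

#!/bin/bash
# Print one column, selected by header name, from a whitespace-delimited MET ASCII file.
# STAT_FILE and COLUMN are placeholders -- point them at your own output.
STAT_FILE=out/grid_stat/2011021912/grid_stat_240000L_20110219_120000V_cnt.txt
COLUMN=RMSE

awk -v col="$COLUMN" '
    NR == 1 { for (i = 1; i <= NF; i++) if ($i == col) idx = i; next }
    idx     { print $idx }
' "$STAT_FILE"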

More Information
MET & METViewer on the FAB website:
DTC Online MET Tutorial: tutorial/METv3.0/index.php

METViewer

What is METViewer
The METViewer tool:
- reads MET verification statistics output from a database
- creates plots using the R statistical package
- includes a web application, accessed from a web browser, for creating a single plot
- builds the specification for each plot from a series of controls and then serializes it into XML
- for each plot, generates a SQL query, an R script to create the plot, a flat file containing the data to be plotted, and the plot itself
Available to anyone online:
Note: the link is a public test version and is therefore limited to a single database.

METViewer
Fully customizable and very powerful, but has a learning curve. Can produce fully customizable:
- Series plots with confidence intervals
- Box plots
- X-Y scatter plots
- Histograms

METViewer Usage
METViewer is designed to be configured from the top of the interface downward: the user sets the desired options in order, working down the page. The top three controls let the user select a database and the plot type.

METViewer Usage The Y2 axis can be used to plot a Base Rate.

Result

What METViewer Offers
Useful options available in METViewer:
- Standard scores: RMSE, ME, MAE, GSS, CSI, FAR, ...
- Ensemble utilities: ensemble mean, spread, probabilities, ROC, Brier score, ...
- Formats: time series, bar plots, boxplots, rank histograms, ...
- Aggregation on the fly, over thresholds, forecast times, valid times, lead times, etc.
- Uncertainty options: boxplot notches, error bars, sub-sampling, ...
- Spatial techniques displays: MODE, wavelet methods, neighborhood methods

Result
Aggregated by diurnal-cycle valid times; comparison of one gridded dataset with another gridded dataset (Stage IV vs. 'improved' Stage IV).

Result
30-day aggregations, with multiple models and ensemble members, plotted over lead times.