General framework for features-based verification


General framework for features-based verification
Mike Baldwin, Purdue University

General framework
- Any verification method should be built upon the general framework for verification outlined by Murphy and Winkler (1987)
- Object-oriented or features-based methods can be considered an extension or generalization of the original framework
- Joint distribution of forecasts and observations: p(f,o)

General joint distribution
- p(f,o): f and o are vectors containing all variables, 3-D, at the same time
- o could come from data assimilation: o = first guess + weight * forward_model(obs)
- p(fm,om): fm and om are values of a single variable across the entire domain (the traditional joint distribution)
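As an illustration (not from the slides), the traditional joint distribution p(fm,om) can be estimated empirically by binning matched forecast/observation pairs and counting relative frequencies. All values and bin edges below are hypothetical.

```python
from collections import Counter

# Hypothetical single-variable forecast/observation pairs across the domain (mm)
fcst = [0.0, 1.2, 3.5, 0.4, 2.1, 5.0]
obs  = [0.1, 0.9, 4.0, 0.0, 2.5, 4.2]

def to_bin(x, edges=(1.0, 2.0, 4.0)):
    """Map a value to a category index using fixed bin edges."""
    return sum(x >= e for e in edges)

# Empirical joint distribution p(fm, om) as relative frequencies of bin pairs
pairs = Counter((to_bin(f), to_bin(o)) for f, o in zip(fcst, obs))
p_fo = {k: n / len(fcst) for k, n in pairs.items()}
```

The resulting table of bin-pair frequencies is the discrete analogue of p(fm,om); marginal and conditional distributions follow by summing over one index.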

General joint distribution
- p(G(f),G(o)): G is some mapping/transformation/operator that is applied to the variable values
- Examples of G: morphing, filter, convolution, deformation, fuzzy
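One concrete choice of G, sketched below on hypothetical 1-D fields, is a neighborhood-mean ("fuzzy") smoother: verifying G(f) against G(o) is more tolerant of small displacement errors than verifying the raw fields point by point.

```python
def neighborhood_mean(field, width=3):
    """A simple choice of the operator G: a moving-average ('fuzzy') smoother."""
    half = width // 2
    out = []
    for i in range(len(field)):
        window = field[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

# Hypothetical 1-D fields: the same feature, displaced by one grid point
fcst = [0.0, 0.0, 10.0, 0.0, 0.0]
obs  = [0.0, 10.0, 0.0, 0.0, 0.0]
gf = neighborhood_mean(fcst)
go = neighborhood_mean(obs)
# The squared difference between gf and go is smaller than between fcst and obs
```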

General joint distribution
- p(Gm(f),Gm(o)): Gm is a specific aspect/attribute/characteristic that results from the mapping operator
- Distributions-oriented approach
- Measures-oriented approach: compute some error measure or score that is a function of Gm(f) and Gm(o), e.g., MSE
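In the measures-oriented approach, a score such as MSE is computed on the attribute values Gm(f) and Gm(o) rather than on the raw gridded fields. A minimal sketch, using feature areas from the lake-attributes table later in the talk as example inputs:

```python
def mse(gf, go):
    """Mean squared error between attribute sequences Gm(f) and Gm(o)."""
    return sum((f - o) ** 2 for f, o in zip(gf, go)) / len(gf)

# Attribute values (areas, km2) for two matched feature pairs
gm_f = [82400.0, 58000.0]
gm_o = [59600.0, 25700.0]
score = mse(gm_f, gm_o)
```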

Terminology that we might standardize
- “feature” – a distinct or important physical object that can be identified within meteorological data
- “attribute” – a characteristic or quality of a feature; an aspect that can be measured
- “similarity” – the degree of resemblance between features
- “distance” – the degree of difference between features
- (others?)

Framework
- Follow the terminology of Murphy (1993) and Murphy and Winkler (1987)
- Joint distribution of forecast and observed features
- Goodness: consistency, quality, value

Aspects of quality
- Accuracy: correspondence between forecast and observed feature attributes (single and/or multiple?)
- Bias: correspondence between mean forecast and mean observed attributes
- Resolution
- Reliability
- Discrimination
- Stratification

Features-based process: identify features [figure: FCST and OBS fields]

Features-based process: characterize features [figure: FCST and OBS fields]

Features-based process: compare features [figure: FCST and OBS fields]
- How to measure differences between objects?
- How to determine false alarms/missed events?

Features-based process: classify features [figure: FCST and OBS fields]

Feature identification
- Procedures for locating a feature within the meteorological data will depend on the problem/phenomenon/user of interest
- A set of instructions that can (easily) be followed/programmed so that features are objectively identified in an automated fashion
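One common objective identification procedure (an illustrative assumption, not the method prescribed by the slides) is to threshold the field and label 4-connected regions of above-threshold grid points:

```python
def identify_features(grid, thresh):
    """Label 4-connected regions of grid cells >= thresh.
    Returns a label array (0 = background) and the number of features."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    n_features = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= thresh and labels[r][c] == 0:
                n_features += 1
                stack = [(r, c)]           # flood fill from this seed cell
                while stack:
                    i, j = stack.pop()
                    if (0 <= i < rows and 0 <= j < cols
                            and grid[i][j] >= thresh and labels[i][j] == 0):
                        labels[i][j] = n_features
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return labels, n_features

# Hypothetical field with two separate above-threshold regions
grid = [[0, 5, 5, 0],
        [0, 5, 0, 0],
        [0, 0, 0, 7]]
labels, n = identify_features(grid, thresh=1)
```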

Feature characterization
- A set of attributes that describe important aspects of each feature
- Numerical values will be the most useful
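A sketch of characterization for one labeled feature, computing a few illustrative numerical attributes (area, centroid, peak intensity); the field and labels are hypothetical:

```python
# Hypothetical field and its feature labels (0 = background)
grid = [[0, 5, 8, 0],
        [0, 5, 0, 0],
        [0, 0, 0, 7]]
labels = [[0, 1, 1, 0],
          [0, 1, 0, 0],
          [0, 0, 0, 2]]

def characterize(grid, labels, label):
    """Attribute vector for one feature: area, centroid, peak intensity."""
    cells = [(r, c) for r in range(len(grid))
             for c in range(len(grid[0])) if labels[r][c] == label]
    area = len(cells)
    centroid = (sum(r for r, _ in cells) / area,
                sum(c for _, c in cells) / area)
    peak = max(grid[r][c] for r, c in cells)
    return {"area": area, "centroid": centroid, "peak": peak}

attrs = characterize(grid, labels, 1)
```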

Feature comparison
- Similarity or distance measures
- Systematic method of matching or pairing observed and forecast features
- Determination of false alarms?
- Determination of missed events?
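One simple distance measure between two features is a scaled Euclidean distance over their attribute vectors. The attribute values below come from the lake table later in the talk (Fcst #1 vs. Obs #1); the scale factors are illustrative assumptions, not from the slides:

```python
def attribute_distance(a, b, scales):
    """Scaled Euclidean 'distance' between two feature attribute dicts.
    scales gives a characteristic magnitude per attribute (assumed here)."""
    return sum(((a[k] - b[k]) / s) ** 2 for k, s in scales.items()) ** 0.5

f1 = {"area": 82400, "max_depth": 406}   # forecast feature attributes
o1 = {"area": 59600, "max_depth": 230}   # observed feature attributes
d = attribute_distance(f1, o1, {"area": 10000, "max_depth": 100})
```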

Classification
- A procedure to place similar features into groups or classes
- Reduces the dimensionality of the verification problem (similar to going from a scatter plot to a contingency table)
- Not necessary/may not always be used
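A minimal sketch of such a dimensionality reduction: a hypothetical rule that maps continuous attributes to a few discrete classes, after which features can be counted in a contingency table rather than compared point by point in attribute space.

```python
def classify(attrs):
    """Hypothetical classification rule: coarse size/depth classes.
    The thresholds are illustrative assumptions, not from the slides."""
    size = "large" if attrs["area"] >= 50000 else "small"
    depth = "deep" if attrs["max_depth"] >= 250 else "shallow"
    return size, depth

# Attribute values taken from the lake table (Fcst #1 and Obs #2)
cls_f = classify({"area": 82400, "max_depth": 406})
cls_o = classify({"area": 25700, "max_depth": 64})
```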

SSEC MODIS archive 10 Apr 2003

Feature matching

Attributes (lakes)

Lake attribute   Fcst #1  Fcst #2  Obs #1  Obs #2  Obs #3
Lat                47.7     44.0    44.8    42.2    43.7
Lon                87.5     87.0    82.4    81.2    77.9
Area (km2)        82400    58000   59600   25700   19500
Volume (km3)      12000     4900    3540     480    1640
Max depth (m)       406      281     230      64     246

How to match observed and forecast objects?
- dij = ‘distance’ between forecast object Fi and observed object Oj
- For each forecast object, choose the closest observed object; if di* > dT, then Fi is a false alarm
- For each observed object, choose the closest forecast object; if d*j > dT, then Oj is a missed event
- Objects might “match” more than once…
[schematic: forecast objects F1, F2 and observed objects O1, O2, O3, marking one missed event and one false alarm]
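The matching rule on this slide can be sketched directly from a distance matrix; the 2-forecast by 3-observed matrix below is hypothetical, with dT = 4:

```python
def match_objects(dist, d_T):
    """dist[i][j] = distance between forecast object i and observed object j.
    A forecast with no observed object within d_T is a false alarm;
    an observed object with no forecast within d_T is a missed event."""
    nf, no = len(dist), len(dist[0])
    false_alarms = [i for i in range(nf) if min(dist[i]) > d_T]
    missed = [j for j in range(no)
              if min(dist[i][j] for i in range(nf)) > d_T]
    # Each remaining forecast matches its closest observed object
    # (so an observed object might "match" more than once)
    matches = {i: min(range(no), key=lambda j: dist[i][j])
               for i in range(nf) if min(dist[i]) <= d_T}
    return matches, false_alarms, missed

dist = [[5.0, 2.0, 9.0],
        [6.0, 7.0, 8.0]]
matches, fa, miss = match_objects(dist, 4.0)
```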

Example of object verification: ARW 2-km (CAPS) vs. radar mosaic
- Object identification procedure identifies 4 forecast objects and 5 observed objects
[figure panels: Fcst_1, Fcst_2, Obs_2]

Distances between objects: ARW 2-km (CAPS) vs. radar mosaic
- Use dT = 4 as the threshold
- Match objects; find false alarms and missed events

        O_34   O_37   O_50   O_77   O_79
F_25    5.84   4.16   8.94   9.03  11.53
F_27    6.35   2.54   7.18   6.32   9.25
F_52    7.43   9.11   4.15   9.19   5.45
F_81    9.39   6.36   2.77   5.24

Median position errors (matching the observed object given a forecast object)
ARW2: Df = .07, Dl = .08
ARW4: Df = .04, Dl = -.07
NMM4: Df = .04, Dl = .22