General framework for features-based verification
Mike Baldwin Purdue University
General framework

Any verification method should be built upon the general framework for verification outlined by Murphy and Winkler (1987). Object-oriented or features-based methods can be considered an extension or generalization of the original framework. The starting point is the joint distribution of forecasts and observations: p(f,o).
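The joint distribution p(f,o) can be estimated empirically from paired forecast/observation samples. A minimal sketch (not from the slides; the binary yes/no event data are an assumed illustration):

```python
# Sketch: empirical estimate of the joint distribution p(f, o)
# from matched forecast/observation pairs.
from collections import Counter

def empirical_joint(forecasts, observations):
    """Relative frequency of each (f, o) pair."""
    pairs = list(zip(forecasts, observations))
    counts = Counter(pairs)
    n = len(pairs)
    return {pair: c / n for pair, c in counts.items()}

# Binary (yes/no) example: four matched forecast/obs pairs.
p = empirical_joint([1, 1, 0, 0], [1, 0, 0, 0])
print(p[(1, 1)])  # 0.25
```

All of the familiar verification measures (bias, hit rate, etc.) are functions of this joint distribution, which is what makes it the natural foundation for the framework.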
general joint distribution

p(f,o): f and o are vectors containing all variables, in 3-D, at the same time. o could come from data assimilation: o = first guess + weight * forward_model(obs).
p(fm,om): fm and om are values of a single variable across the entire domain — the traditional joint distribution.
general joint distribution

p(G(f),G(o)): G is some mapping/transformation/operator applied to the variable values, e.g. morphing, filtering, convolution, deformation, or fuzzy (neighborhood) methods.
general joint distribution

p(Gm(f),Gm(o)): Gm is a specific aspect/attribute/characteristic that results from the mapping operator. Verification can then be distributions-oriented or measures-oriented: compute some error measure or score that is a function of Gm(f) and Gm(o), such as MSE.
Terminology that we might standardize

"feature" – a distinct or important physical object that can be identified within meteorological data
"attribute" – a characteristic or quality of a feature; an aspect that can be measured
"similarity" – the degree of resemblance between features
"distance" – the degree of difference between features
(others?)
framework

Follow the terminology of Murphy (1993) and Murphy and Winkler (1987): the joint distribution of forecast and observed features, and goodness in terms of consistency, quality, and value.
aspects of quality

accuracy: correspondence between forecast and observed feature attributes (single and/or multiple?)
bias: correspondence between mean forecast and mean observed attributes
resolution
reliability
discrimination
stratification
Features-based process

Identify features [figure: FCST and OBS fields]
Features-based process

Characterize features [figure: FCST and OBS fields]
Features-based process

Compare features [figure: FCST and OBS fields]
How to measure differences between objects? How to determine false alarms/missed events?
Features-based process

Classify features [figure: FCST and OBS fields]
feature identification

Procedures for locating a feature within the meteorological data will depend on the problem, phenomena, and user of interest: a set of instructions that can (easily) be followed and programmed, so that features are objectively identified in an automated fashion.
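One common objective identification rule — an assumption here, not necessarily the procedure intended on the slide — is to threshold the field and label connected regions:

```python
# Sketch: objective feature identification by thresholding the field
# and labeling connected regions (assumed rule, for illustration).
import numpy as np
from scipy import ndimage

def identify_features(field, threshold):
    """Return a label array and feature count for values >= threshold."""
    mask = field >= threshold
    labels, n_features = ndimage.label(mask)  # default 4-connectivity
    return labels, n_features

field = np.array([[0, 5, 5, 0],
                  [0, 5, 0, 0],
                  [0, 0, 0, 7],
                  [0, 0, 7, 7]])
labels, n = identify_features(field, threshold=5)
print(n)  # 2 distinct features
```

The threshold and connectivity rule are exactly the kind of choices that should be written down explicitly so that identification is reproducible.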
feature characterization

A set of attributes that describe important aspects of each feature; numerical values will be the most useful.
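A sketch of attribute computation for one labeled feature; area, centroid, and maximum value are assumed example attributes, not a prescribed set:

```python
# Sketch: numerical attributes for one labeled feature
# (assumed attributes: area, centroid, peak value).
import numpy as np

def characterize(field, labels, label_id):
    """Return a dict of numerical attributes for feature label_id."""
    mask = labels == label_id
    ij = np.argwhere(mask)
    return {
        "area": int(mask.sum()),               # grid points covered
        "centroid": tuple(ij.mean(axis=0)),    # mean (row, col)
        "max": float(field[mask].max()),       # peak intensity
    }

field = np.array([[0., 5., 5.],
                  [0., 5., 0.]])
labels = (field > 0).astype(int)   # a single feature, label 1
attrs = characterize(field, labels, 1)
print(attrs["area"], attrs["max"])  # 3 5.0
```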
feature comparison

similarity or distance measures: a systematic method of matching or pairing observed and forecast features. Determination of false alarms? Determination of missed events?
classification

a procedure to place similar features into groups or classes
reduces the dimensionality of the verification problem (similar to going from a scatter plot to a contingency table)
not necessary / may not always be used
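The scatter-plot-to-contingency-table idea can be sketched as follows, with an assumed size-based classification and illustrative numbers:

```python
# Sketch: classify matched forecast/observed features into discrete
# classes (assumed size classes) and tally a contingency table.
from collections import Counter

def size_class(area):
    """Assumed two-class rule on feature area (arbitrary cutoff)."""
    return "large" if area >= 100 else "small"

# Matched (forecast_area, observed_area) pairs -- illustrative numbers.
pairs = [(120, 150), (80, 90), (200, 60), (30, 40)]
table = Counter((size_class(f), size_class(o)) for f, o in pairs)
print(table[("large", "large")])  # 1
```

A continuous attribute scatter plot collapses into a small table of class counts, which is the dimensionality reduction the slide refers to.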
SSEC MODIS archive 10 Apr 2003
feature matching
attributes

Lake           Fcst #1   Fcst #2   Obs #1   Obs #2   Obs #3
Lat              47.7      44.0      44.8     42.2     43.7
Lon              87.5      87        82.4     81.2     77.9
Area (km2)      82400     58000     59600    25700    19500
Volume (km3)    12000      4900      3540      480     1640
Max depth (m)     406       281       230       64      246
How to match observed and forecast objects?

d_ij = 'distance' between forecast object F_i and observed object O_j.
For each forecast object, choose the closest observed object; if d_i* > d_T, the forecast object is a false alarm.
For each observed object, choose the closest forecast object; if d_*j > d_T, the observed object is a missed event.
Objects might "match" more than once.
[Figure: objects F1, F2 and O1, O2, O3; F2 is a false alarm, O1 is a missed event]
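The matching rule on this slide can be sketched directly; the distance matrix below is illustrative, not from the talk:

```python
# Sketch of the slide's matching rule: forecast objects farther than
# d_T from every observed object are false alarms; observed objects
# farther than d_T from every forecast object are missed events.
import numpy as np

def match_objects(d, d_T):
    """d[i, j] = distance between forecast i and observed j."""
    false_alarms = [i for i in range(d.shape[0]) if d[i].min() > d_T]
    missed_events = [j for j in range(d.shape[1]) if d[:, j].min() > d_T]
    matches = [(i, int(d[i].argmin())) for i in range(d.shape[0])
               if d[i].min() <= d_T]
    return matches, false_alarms, missed_events

d = np.array([[1.0, 6.0],    # illustrative distances
              [5.0, 7.0],
              [8.0, 2.5]])
matches, fa, miss = match_objects(d, d_T=4.0)
print(matches, fa, miss)  # [(0, 0), (2, 1)] [1] []
```

Because each forecast object independently picks its nearest observed object, one observed object can be matched more than once — the caveat noted on the slide.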
Example of object verification

ARW 2 km (CAPS) forecast vs. radar mosaic [figure: fields with Fcst_1, Fcst_2, Obs_2 labeled].
The object identification procedure identifies 4 forecast objects and 5 observed objects.
Distances between objects

ARW 2 km (CAPS) vs. radar mosaic. Use d_T = 4 as the threshold to match objects and find false alarms and missed events.

        O_34   O_37   O_50   O_77   O_79
F_25    5.84   4.16   8.94   9.03  11.53
F_27    6.35   2.54   7.18   6.32   9.25
F_52    7.43   9.11   4.15   9.19   5.45
F_81    9.39   6.36   2.77   5.24
Median position errors, matching obs object given a forecast object:
ARW2: Df = .07, Dl = .08
ARW4: Df = .04, Dl = -.07
NMM4: Df = .04, Dl = .22