Slide 1: Similarity and Difference
Pete Barnum, January 25, 2006, Advanced Perception
Slide 2: Visual Similarity
Color, Texture
Slide 3: Uses for Visual Similarity Measures
Classification: Is it a horse?
Image retrieval: Show me pictures of horses.
Unsupervised segmentation: Which parts of the image are grass?
Slide 4: Histogram Example
Slides from Dave Kauchak
Slide 5: Cumulative Histogram
[Figure: a normal histogram and its cumulative histogram, side by side]
Slides from Dave Kauchak
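A minimal NumPy sketch of the two panels above, assuming an 8-bit grayscale image; the cumulative histogram is just the running sum of the normal one:

```python
import numpy as np

image = np.random.randint(0, 256, size=(64, 64))  # stand-in for a real image

# Normal histogram: fraction of pixels in each of 256 gray-value bins.
hist, _ = np.histogram(image, bins=256, range=(0, 256))
hist = hist / hist.sum()

# Cumulative histogram: running sum of the normal histogram (ends at 1.0).
cum_hist = np.cumsum(hist)
```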
Slide 6: Joint vs. Marginal Histograms
Images from Dave Kauchak
Slide 7: Joint vs. Marginal Histograms (continued)
Images from Dave Kauchak
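A small sketch of the distinction, assuming two 8-bit color channels: the joint histogram keeps the correlation between channels, while the marginals (sums of the joint over one axis) discard it.

```python
import numpy as np

r = np.random.randint(0, 256, size=10000)  # stand-in red-channel samples
g = np.random.randint(0, 256, size=10000)  # stand-in green-channel samples

# Joint histogram over (r, g) pairs: keeps cross-channel structure.
joint, _, _ = np.histogram2d(r, g, bins=16, range=[[0, 256], [0, 256]])

# Marginal histograms: project the joint histogram onto each axis.
marginal_r = joint.sum(axis=1)
marginal_g = joint.sum(axis=0)
```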
Slide 8: Adaptive Binning
Slide 9: Clusters (Signatures)
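One way to build such a signature (a sketch, assuming per-pixel Lab features and scikit-learn's KMeans; the slide does not prescribe a clustering method): cluster the feature vectors and keep (center, weight) pairs instead of fixed bins.

```python
import numpy as np
from sklearn.cluster import KMeans

pixels = np.random.rand(5000, 3)       # stand-in for per-pixel Lab values
k = 8                                  # signature size, chosen by hand here

km = KMeans(n_clusters=k, n_init=10).fit(pixels)
centers = km.cluster_centers_          # representative feature vectors
weights = np.bincount(km.labels_, minlength=k) / len(pixels)  # cluster mass
signature = list(zip(centers, weights))
```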
Slide 10: Higher Dimensional Histograms
Histograms generalize to any number of features: colors, textures, gradients, depth. Signatures in turn generalize histograms.
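A sketch of a joint histogram over several per-pixel features at once (the three feature channels here are placeholders):

```python
import numpy as np

features = np.random.rand(10000, 3)  # e.g. color, texture, gradient per pixel
hist, edges = np.histogramdd(features, bins=8)  # an 8 x 8 x 8 joint histogram
hist = hist / hist.sum()
```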
Slide 11: Distance Metrics
Between two points x and y, the Euclidean distance is well defined (e.g. 5 units). Between two gray values, the difference is well defined (e.g. 50 gray levels). But what is the distance between two histograms?
Slide 12: Bin-by-bin Distances
A bin-by-bin distance compares only corresponding bins, so two histograms whose mass sits in neighboring bins can score as very different ("Bad!"), while they score as similar only when the bins line up exactly ("Good!").
Slide 13: Cross-bin Distances
A cross-bin distance also relates non-corresponding bins through a ground distance, so histograms whose mass is merely shifted are correctly judged similar ("Good!") instead of far apart ("Bad!").
Slide 14: Distance Measures
Heuristic: Minkowski-form, Weighted-Mean-Variance (WMV)
Nonparametric test statistics: χ² (chi-square), Kolmogorov-Smirnov (KS), Cramér/von Mises (CvM)
Information-theory divergences: Kullback-Leibler (KL), Jeffrey divergence (JD)
Ground distance measures: Histogram intersection, Quadratic form (QF), Earth Mover's Distance (EMD)
Slide 15: Heuristic Histogram Distances
Minkowski-form distance Lp. Special cases: L1 (absolute, cityblock, or Manhattan distance), L2 (Euclidean distance), L∞ (maximum value distance).
Slides from Dave Kauchak
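A sketch of the Lp family for two normalized histograms h and k:

```python
import numpy as np

def minkowski(h, k, p):
    """L1 (p=1), Euclidean (p=2), or maximum-value (p=np.inf) distance."""
    d = np.abs(np.asarray(h, float) - np.asarray(k, float))
    return d.max() if np.isinf(p) else (d ** p).sum() ** (1.0 / p)
```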
Slide 16: More Heuristic Distances
Weighted-Mean-Variance (WMV): compares only means and variances, so it includes just minimal information about the distributions.
Slides from Dave Kauchak
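A sketch of one common form of WMV for a single 1-D feature; the normalizing terms (the spread of each statistic over a reference database, held in `db_means` and `db_stds`) are this sketch's assumption, not spelled out on the slide:

```python
import numpy as np

def wmv(a, b, db_means, db_stds):
    """Compare only mean and std dev, each scaled by its spread in the database."""
    return (abs(np.mean(a) - np.mean(b)) / np.std(db_means)
            + abs(np.std(a) - np.std(b)) / np.std(db_stds))
```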
Slide 17: Nonparametric Test Statistics
χ² statistic: measures the underlying similarity of two samples.
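A sketch of the symmetric chi-square form commonly used for histograms, comparing each bin against the mean of the two histograms:

```python
import numpy as np

def chi_square(h, k, eps=1e-12):
    h, k = np.asarray(h, float), np.asarray(k, float)
    m = (h + k) / 2.0              # expected bin value if the samples match
    return ((h - m) ** 2 / (m + eps)).sum()
```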
Slide 18: Nonparametric Test Statistics
Kolmogorov-Smirnov (KS) distance: measures the underlying similarity of two samples as the largest gap between their cumulative histograms. Defined only for 1-D data.
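A sketch, taking the maximum gap between the two cumulative histograms:

```python
import numpy as np

def ks_distance(h, k):
    """Largest vertical gap between the cumulative histograms (1-D only)."""
    return np.abs(np.cumsum(h) - np.cumsum(k)).max()
```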
Slide 19: Nonparametric Test Statistics
Cramér/von Mises (CvM): Euclidean distance between the cumulative histograms. Defined only for 1-D data.
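A sketch, summing squared gaps between the cumulative histograms instead of taking the maximum:

```python
import numpy as np

def cvm_distance(h, k):
    """Squared Euclidean distance between cumulative histograms (1-D only)."""
    return ((np.cumsum(h) - np.cumsum(k)) ** 2).sum()
```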
Slide 20: Information Theory
Kullback-Leibler (KL) divergence: the cost of encoding one distribution as another.
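A sketch of D(h ‖ k); the epsilon guard against empty bins is this sketch's addition:

```python
import numpy as np

def kl_divergence(h, k, eps=1e-12):
    """Extra coding cost of describing samples of h with a code built for k."""
    h = np.asarray(h, float) + eps
    k = np.asarray(k, float) + eps
    return (h * np.log(h / k)).sum()
```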
Slide 21: Information Theory
Jeffrey divergence (JD): like KL, but symmetric and more numerically stable.
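A sketch: each histogram is compared against their mid-point m, which keeps the divergence symmetric and defined even where one histogram has empty bins:

```python
import numpy as np

def jeffrey_divergence(h, k, eps=1e-12):
    h, k = np.asarray(h, float), np.asarray(k, float)
    m = (h + k) / 2.0 + eps        # mid-point distribution
    return (h * np.log((h + eps) / m) + k * np.log((k + eps) / m)).sum()
```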
Slide 22: Ground Distance
Histogram intersection: good for partial matches.
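A sketch, normalizing the shared mass by one argument as in Swain and Ballard's formulation; with unnormalized histograms this tolerates partial matches, since a small histogram can be wholly contained in a larger one:

```python
import numpy as np

def intersection(h, k):
    """Shared mass of the two histograms, normalized by the model histogram k."""
    h, k = np.asarray(h, float), np.asarray(k, float)
    return np.minimum(h, k).sum() / k.sum()
```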
Slide 23: Ground Distance
Quadratic form (QF): heuristic; compares across bins through a bin-similarity matrix.
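A sketch, with the bin-similarity matrix A left as an input (A must be supplied by hand; the identity matrix reduces this to plain Euclidean distance):

```python
import numpy as np

def quadratic_form(h, k, A):
    """d(H, K) = sqrt((h - k)^T A (h - k)); A[i, j] is similarity of bins i, j."""
    d = np.asarray(h, float) - np.asarray(k, float)
    return np.sqrt(d @ A @ d)
```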
Slide 24: Ground Distance
Earth Mover's Distance (EMD).
Slide 25: Summary
Slides 26-28: Moving Earth
One distribution is pictured as piles of earth, the other as holes in the ground; the two configurations remain unequal (≠) until enough earth has been shoveled that every hole is filled (=).
Slides 29-30: The Difference?
The difference is not just the total amount of earth moved; it is the sum of (amount moved) × (distance moved) over all moves.
Slides 31-34: Linear Programming
P has m clusters and Q has n clusters. Over all movements of earth from P to Q, minimize the total work:
WORK(P, Q, F) = Σ_i Σ_j f_ij d_ij,
where f_ij is the amount moved from cluster i of P to cluster j of Q, and d_ij is the distance moved.
Slides 35-38: Constraints
1. Move "earth" only from P to Q: f_ij ≥ 0.
2. Cannot send more "earth" than there is: Σ_j f_ij ≤ p_i for every cluster i of P.
3. Q cannot receive more "earth" than it can hold: Σ_i f_ij ≤ q_j for every cluster j of Q.
4. As much "earth" as possible must be moved: Σ_i Σ_j f_ij = min(Σ_i p_i, Σ_j q_j).
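A sketch of the whole linear program above; SciPy's linprog as the solver is this sketch's choice, with `p` and `q` holding the cluster weights and `D[i, j]` the ground distance between cluster i of P and cluster j of Q:

```python
import numpy as np
from scipy.optimize import linprog

def emd(p, q, D):
    p, q, D = np.asarray(p, float), np.asarray(q, float), np.asarray(D, float)
    m, n = len(p), len(q)
    c = D.reshape(m * n)               # cost per unit flow: the distance moved

    # Constraint 2: each P cluster sends no more earth than it has.
    A_send = np.zeros((m, m * n))
    for i in range(m):
        A_send[i, i * n:(i + 1) * n] = 1
    # Constraint 3: each Q cluster receives no more than it can hold.
    A_recv = np.zeros((n, m * n))
    for j in range(n):
        A_recv[j, j::n] = 1
    # Constraint 4: as much earth as possible must be moved.
    total = min(p.sum(), q.sum())

    # Constraint 1 (f_ij >= 0) is linprog's default variable bound.
    res = linprog(c, A_ub=np.vstack([A_send, A_recv]),
                  b_ub=np.concatenate([p, q]),
                  A_eq=np.ones((1, m * n)), b_eq=[total])
    return res.fun / total             # total work, normalized by total flow
```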
Slide 39: Advantages
Uses signatures. Gives a nearness measure without quantization. Allows partial matching. A true metric (when the ground distance is a metric and the total weights are equal).
Slide 40: Disadvantages
High computational cost. Not effective for unsupervised segmentation, etc.
Slide 41: Examples
Using color (CIE Lab), color + XY position, and texture (Gabor filter bank).
Slide 42: Image Lookup
Slide 43: Image Lookup
Distances compared: L1 distance, Jeffrey divergence, χ² statistic, quadratic form distance, Earth Mover's Distance.
Slide 44: Image Lookup (continued)
Slide 45: Concluding Thought
Which distance measure is best? It depends on the application.