Learning Image Similarities via Probabilistic Feature Matching
Ziming Zhang*, Ze-Nian Li, Mark Drew
School of Computing Science, Simon Fraser University, Vancouver, B.C., Canada
{zza27, li, mark}@cs.sfu.ca
*This work was done while the author was at SFU.
Outline
- Introduction
- Probabilistic-matching based similarity learning
  - Probabilistic feature matching function
  - Probabilistic feature matching learning
- Experiments
- Conclusion
Introduction
- Object-based image similarity: ideally, two images should have a higher similarity if they contain similar objects.
- Feature matching: a natural way to measure image similarity, with many different matching criteria.
- In this talk, we match features based only on their appearance information.
Introduction
Several relevant feature matching approaches:
- Summation kernel
- Max-selection kernel
- Optimal assignment kernel
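The three kernels above can be sketched in a few lines of code. This is an illustrative reconstruction, not the authors' implementation; normalization conventions (sum vs. mean, how the two directions are symmetrized) vary between papers, and averages are used here. `K` is the matrix of pairwise feature similarities, `K[i, j] = k(x_i, y_j)`.

```python
import numpy as np
from itertools import permutations

def k_sum(K):
    """Summation kernel: average over all pairwise feature similarities."""
    return K.mean()

def k_max(K):
    """Max-selection kernel: each feature matches its single most similar
    counterpart in the other image (averaged over both directions)."""
    return 0.5 * (K.max(axis=1).mean() + K.max(axis=0).mean())

def k_oa(K):
    """Optimal assignment kernel: best one-to-one matching. Brute force over
    permutations for clarity; the Hungarian algorithm is used in practice."""
    n, m = K.shape
    if n > m:
        return k_oa(K.T)
    best = max(sum(K[i, p[i]] for i in range(n)) for p in permutations(range(m), n))
    return best / n
```

Note how the three kernels only differ in *which* feature pairs contribute: all of them, the row/column maxima, or a one-to-one assignment.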
Our approach is a generalization of a family of similarity learning approaches, including the three kernels above.
- Similarity matrix ≠ kernel matrix: a kernel matrix can be considered a special similarity matrix (i.e. symmetric positive semi-definite).
- Classification is done with Support Vector Machines (SVMs).
Probabilistic-matching based similarity learning
How do we learn these feature matching probabilities?
Feature Matching Function
Given two images X = {x_1, ..., x_|X|} and Y = {y_1, ..., y_|Y|}, a feature matching function α can be defined over all feature pairs (x_i, y_j).
The matching processes in the summation, max-selection, and optimal assignment kernels can each be described by a feature matching function:
- K_sum: every pair (x_i, y_j) is matched (uniform α)
- K_max: each feature matches only its most similar counterpart
- K_OA: features are matched one-to-one so as to maximize the total similarity
Probabilistic Feature Matching Function
Our probabilistic feature matching function α is defined in the vector space covered by a convex set determined by three kinds of constraints:
- each matching probability is non-negative,
- the total matching probability of each single feature,
- the total matching probability over all features.
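The three constraint labels on this slide (the original equations were images and are lost) suggest a constraint set along the following lines. This is my reconstruction in my own notation; the paper's exact normalization may differ:

```latex
\alpha \in \mathcal{A} = \Big\{ \alpha \;:\; \alpha_{ij} \ge 0 \;\; \forall i,j, \quad \sum_{i,j} \alpha_{ij} = 1 \Big\},
\qquad p_i = \sum_{j} \alpha_{ij}
```

Here α_{ij} is the probability that feature x_i matches feature y_j, p_i is the total matching probability of the single feature x_i, and the equality constraint normalizes the total matching probability over all features to one, so that α is a joint distribution over feature pairs.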
Probabilistic Feature Matching Learning
Data-dependent optimization: a trade-off between
- image similarity, and
- distribution sparseness (the regularizer).
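One concrete way to realize such a similarity-vs-sparseness trade-off (an illustrative stand-in, not necessarily the paper's exact objective) is entropy-regularized matching: maximize ⟨α, K⟩ + C·H(α) over row-wise distributions, whose solution is a per-row softmax. The parameter C then interpolates between max-selection (C = 0, sparsest α) and uniform summation (C → ∞):

```python
import numpy as np

def soft_match_similarity(K, C):
    """Entropy-regularized soft matching: alpha = softmax(K / C) per row.
    C = 0 picks each row's best match; C -> infinity gives uniform alpha."""
    if C == 0:  # sparsest matching: each feature matches its best counterpart
        alpha = np.zeros_like(K)
        alpha[np.arange(K.shape[0]), K.argmax(axis=1)] = 1.0
    else:
        z = K / C
        z -= z.max(axis=1, keepdims=True)        # numerical stability
        alpha = np.exp(z)
        alpha /= alpha.sum(axis=1, keepdims=True)
    return (alpha * K).sum() / K.shape[0], alpha
```

With this sketch, the resulting similarity decreases monotonically as C grows, matching the behavior stated in the proposition on the next slide: larger C spreads the matching probability over more pairs, pulling the similarity from the max-selection value down toward the summation (mean) value.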
Probabilistic Feature Matching Learning
Theorems:
- Consider max f(x) over x ∈ X, where f(x) is convex and X is a closed convex set. If the optimum exists, a boundary point of X is an optimum.
- If a convex function f(x) attains its maximum on a convex polyhedron X with some extreme points, then this maximum is attained at an extreme point of X.
Relation to K_sum, K_max, and K_OA:
- K_sum: C = +∞ and H = {i, j}
- K_max: C = 0 and H = {i}, or C = 0 and H = {j}
- K_OA: C = 0 and H = {i, j}
Probabilistic Feature Matching Learning
Proposition: for two images X and Y, both the sparseness of α and their similarity decrease monotonically as the parameter C increases.
Experiments
- Datasets: Graz-01 and Graz-02
- Descriptor: SIFT with dense sampling
- Image representation: 3×3 spatial bag-of-words histograms with 200 codewords
- Feature similarity: RBF kernel with χ² distance
- 50 runs
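The feature similarity named above can be sketched as follows. This is a minimal illustration; the bandwidth `gamma` is a parameter of my choosing (the slides do not give its value), and some papers define the χ² distance without the 1/2 factor used here:

```python
import numpy as np

def chi2_rbf(h1, h2, gamma=1.0):
    """RBF kernel on the chi-squared distance between two bag-of-words
    histograms: k = exp(-gamma * chi2(h1, h2))."""
    h1 = h1 / h1.sum()                    # L1-normalize the histograms
    h2 = h2 / h2.sum()
    denom = h1 + h2
    mask = denom > 0                      # skip empty bins to avoid 0/0
    chi2 = 0.5 * np.sum((h1[mask] - h2[mask]) ** 2 / denom[mask])
    return np.exp(-gamma * chi2)
```

Identical histograms give k = 1, and completely disjoint histograms give the smallest value for a given γ, so this kernel behaves as a bounded similarity in [0, 1].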
Experiments
Graz-01 results:
(a) PFM_1 with H = {i, j}
(b) PFM_2 with H = {i} or H = {j}
(c) PFM_3 with H = ∅
Experiments
Table 1. Comparison results between different approaches on Graz-01 (%)

                 Bike        Person      Average
SPK [1]          86.3±2.5    82.3±3.1    84.3
PDK [2]          90.2±2.6    87.2±3.8    88.7
PFM_1 (C=0)      90.6±5.3    88.2±4.6    89.4
PFM_2 (C=5)      89.6±4.9    88.5±4.6    89.0
PFM_3 (C=+∞)     89.6±4.8    87.9±5.1    88.8

[1] Lazebnik et al., "Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories," in CVPR'06.
[2] Ling and Soatto, "Proximity distribution kernels for geometric context in category recognition," in ICCV'07.
Experiments
Graz-02 results:
(a) PFM_1 with H = {i, j}
(b) PFM_2 with H = {i} or H = {j}
(c) PFM_3 with H = ∅
Experiments
Table 2. Comparison results between different approaches on Graz-02 (%)

                        Bike    Person  Car     Average
Boost.+SIFT [3]         76.0    70.0    68.9    71.6
Boost.+comb. [3]        77.8    81.2    70.5    76.5
PDK+SIFT [2]            86.7    86.7    74.7    82.7
PDK+hybrid [2]          86.0    87.3    74.7    82.7
PFM_1+SIFT (C=5)        88.9    88.1    85.2    87.4
PFM_1+SIFT (C=10)       88.0    87.9    83.6    86.5
PFM_1+SIFT (C=+∞)       87.7    87.8    82.6    86.0

[3] Opelt et al., "Generic object recognition with boosting," PAMI, 2006.
Conclusion
- A probabilistic feature matching scheme that generalizes a rich family of feature matching approaches.
- The sparseness of the matching probability distributions, and the corresponding image similarities, are easy to control via a single parameter.
Thank you!