Slide 1: Unsupervised Learning of Models for Recognition
M. Weber, M. Welling, P. Perona, ECCV 2000, plus material from M. Weber's PhD thesis. Presented by Greg Shakhnarovich for the 6.899 Learning and Vision seminar, May 1, 2002.
Slide 2: The problem: "recognizing members of an object class", i.e. detection
An object class is defined by common parts that:
- are visually similar across instances of the class
- occur in similar but varying configurations (intra-class variation)
- are less pose-dependent than the whole object
Slide 3: Meet the xyz
Slide 4: Spot the xyz
Slide 5: Spot the xyz
Slide 6: Spot the xyz
Slide 7: Spot the xyz
Slide 8: Spot the xyz
Slide 9: Spot the xyz
Slide 10: Meet the abc
Slide 11: Example: a house in a rural scene
- Random background (Poisson)
- House: roof and 2 windows; rooftop and window positions normally distributed; fixed scale
- Background objects: trees, flowers, fences; random position and scale; occasional occlusion
Slide 12: Main ideas
- Unsupervised learning of relevant parts: fully automatic, from cluttered images, using only positive examples (exactly one object present per image)
- Learning the distribution of the (affine) shape of the part constellation using EM
- Decisions made in a probabilistic framework
Slide 13: Background clutter
Slide 14: Related work
- Amit & Geman '99: assumes registration (alignment)
- Burl, Leung, Perona, Weber '95-'98: requires manual labeling
- Cootes, Edwards, Taylor '96-'98: AAMs model deformations
Slide 15: Overview
- Part selection
- Probabilistic model
- Learning the model parameters
- Results
Slide 16: Part selection
- Detection uses normalized correlation with part templates: efficient, with good performance (a sketch follows below)
- Templates are chosen in two steps:
  1. Identify points of interest: Förstner's interest operator, ~150 candidates per training image
  2. Vector quantization, to reduce the number of candidates
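To make the normalized-correlation step concrete, here is a minimal sketch of matching one part template against a grayscale image. The brute-force scan, the detection threshold of 0.8, and the function names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of part detection by normalized correlation.
import numpy as np

def normalized_correlation(image, template):
    """Slide a template over a grayscale image and return the
    normalized cross-correlation at every valid position."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    H, W = image.shape
    out = np.zeros((H - th + 1, W - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            out[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

def detect_part(image, template, threshold=0.8):
    """Return (y, x) locations where the correlation exceeds a threshold."""
    ncc = normalized_correlation(image, template)
    ys, xs = np.where(ncc > threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```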
Slide 17: Part selection: interest points
- Interesting points are points/regions where the image changes significantly in two dimensions
- Edges do not qualify; corners and circular features (contours or blobs) do
- Detected with Förstner's operator
Slide 18: Part selection: interest operator (example figure)
Slide 19: Vector quantization
- Goal: learn a small subset of best representatives; think of it as minimal-error codebook construction (sketched below)
- Possible solution: k-means clustering
- Number of clusters set to k = 100
- Discard small clusters (fewer than 10 patterns)
- Remove duplicates (patterns identical up to a small shift in any direction)
- Merge/split clusters
- Select the correct number of clusters
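A minimal sketch of the codebook construction, assuming the candidate patches are already vectorized and letting scikit-learn's KMeans stand in for the clustering. Only k = 100 and the minimum cluster size of 10 come from the slide; duplicate removal and cluster merging/splitting are omitted here.

```python
# Codebook construction by k-means, pruning small clusters.
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(patches, k=100, min_cluster_size=10, seed=0):
    """patches: (N, D) array of vectorized image patches.
    Returns the centers of the sufficiently large clusters."""
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(patches)
    labels, counts = np.unique(km.labels_, return_counts=True)
    keep = labels[counts >= min_cluster_size]
    return km.cluster_centers_[keep]
```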
Slide 20: Unsupervised detector training (2): "pattern space" (100+ dimensions). Figure ©WWP.
Slide 21: Example: part selection
34, 9 parts from 200 images; Harris corner detector; k-means clustering with k = 100
Slide 22: Generative object model
- A part is characterized by its type (one of T) and its position in the image
- Observations X^o: the candidate locations returned by the part detectors, where each entry is a 2D location
- Hypothesis h: h_i = j means that the j-th detection of type i is the location of the object part of type i
- Occluded parts: h_i = 0
- Locations of missing parts: x^m, treated as hidden variables (a code-level sketch of these quantities follows)
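A toy sketch of how these quantities might be represented in code; the coordinates, the 3-part setting, and the helper name foreground_positions are all invented for illustration and follow the notation reconstructed above.

```python
# detections[i] lists the 2D locations returned by the detector for
# part type i; a hypothesis h assigns to each type either the index
# of one of its detections or 0 for "occluded".

# X^o: one list of (x, y) detections per part type
detections = [
    [(12, 40), (87, 33)],   # detections of type 1
    [(50, 52)],             # detections of type 2
    [],                     # type 3 was not detected anywhere
]

# Hypothesis h: h[i] = j (1-based) picks the j-th detection of type i
# as the object part; h[i] = 0 marks that part as occluded.
h = (1, 1, 0)

def foreground_positions(detections, h):
    """Positions assigned to the object under hypothesis h
    (None for occluded parts, whose true location x^m is hidden)."""
    return [detections[i][j - 1] if j > 0 else None for i, j in enumerate(h)]

print(foreground_positions(detections, h))   # [(12, 40), (50, 52), None]
```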
Slide 23: Example: observed detections (3-part model)
Slide 24: Example: observed detections
Figure: the parts, the observations X^o, and the correct hypothesis for the example image.
Slide 25: Probabilistic model
- Joint pdf: p(X^o, x^m, h) = p(X^o, x^m | h, n) p(h | n, b) p(n) p(b)
- Notation: b_i = 1 iff h_i > 0 (part of type i was detected on the object)
- n_i is the number of BG detections in the i-th row of X^o (detections of type i not assigned to the object); n and b are determined by h and X^o
Slide 26: Example: model components
Figure: the parts, the observations, and the correct hypothesis, labeled with the corresponding model components.
Slide 27: Number of BG part detections
Assumptions about part detections in the BG:
- independence between part types
- independence between locations
- counts are binomial over the many candidate positions, approximated as Poisson: p(n_i) = e^{-M_i} M_i^{n_i} / n_i!, where M_i is the average number of BG detections of part type i (evaluated in the sketch below)
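A tiny sketch of evaluating that Poisson term; the mean M = 2.5 is an arbitrary illustrative value, not one from the paper.

```python
# Poisson model for the number of BG detections of one part type,
# with mean M (the average number of BG detections of that type).
import math

def p_num_bg_detections(n, M):
    """P(n background detections) under a Poisson with mean M."""
    return math.exp(-M) * M ** n / math.factorial(n)

print(p_num_bg_detections(0, 2.5), p_num_bg_detections(3, 2.5))
```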
Slide 28: Probability of FG part detection
- We shouldn't assume independence between parts; e.g., certain parts are often occluded simultaneously
- Model the detection pattern b as a joint probability mass function p(b) with 2^T entries (one per pattern; see the table sketch below)
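A sketch of what such a table looks like for a hypothetical 3-part model; the counts used to fill it are invented.

```python
# The detection/occlusion pattern b is a binary vector of length T,
# so p(b) is a table with 2**T entries.
import itertools
import numpy as np

T = 3
patterns = list(itertools.product([0, 1], repeat=T))           # all 2**T patterns
counts = np.array([1, 2, 2, 5, 2, 4, 6, 20], dtype=float)      # assumed counts
p_b = dict(zip(patterns, counts / counts.sum()))

print(p_b[(1, 1, 1)])   # probability that all three parts are detected
```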
Slide 29: Probability of the hypothesis
- Let H(b, n) be the set of hypotheses consistent with b and n, given X^o
- Assumption: all hypotheses in H(b, n) are equally likely, i.e. p(h | n, b) = 1 / |H(b, n)|
Slide 30: Likelihood of the observations
- Notation: z = all FG part locations (detected or missing), x^{bg} = all the BG detections
- Assuming independence between FG and BG: p(X^o, x^m | h, n) = p(z) p(x^{bg})
- The FG locations z are modeled with a joint Gaussian over the stacked part coordinates (see the sketch below)
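A sketch of evaluating the joint-Gaussian shape term with SciPy; the mean vector and the diagonal covariance are placeholders, not learned values.

```python
# Joint Gaussian over the stacked 2D coordinates of the T FG parts.
import numpy as np
from scipy.stats import multivariate_normal

T = 3
mean = np.array([0.0, 0.0, 30.0, 5.0, 15.0, 25.0])   # assumed (x, y) per part
cov = np.eye(2 * T) * 4.0                            # assumed covariance

def shape_density(positions):
    """positions: (T, 2) array of FG part locations."""
    return multivariate_normal(mean=mean, cov=cov).pdf(positions.reshape(-1))

print(shape_density(np.array([[0.5, -0.2], [29.0, 5.5], [15.5, 24.0]])))
```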
Slide 31: Example: constellation model (figure)
Slide 32: Positions of BG part detections
- x^{bg} collects all the BG detections in X^o
- Given their number, BG detections are uniformly distributed over the image: p(x^{bg}) = prod_i A^{-n_i}, where A is the image area
Slide 33: Affine invariance
- Must ensure TRS (translation, rotation, scale) invariance: model shape rather than absolute positions, by making positions relative
- A single reference point eliminates translation; two points eliminate rotation and scaling, and the dimension decreases by two
- We want to keep a simple form for the densities
- The part detectors must be TRS-invariant, too (translation-only case sketched below)
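A sketch of the translation-only normalization, which is the case the authors implement: express every part relative to one reference part, dropping two dimensions. The choice of the first part as reference and the example coordinates are assumptions.

```python
# Translation-invariant shape: offsets relative to a reference part.
import numpy as np

def relative_shape(positions, ref_index=0):
    """positions: (T, 2) array of part locations; returns a (T-1, 2)
    array of offsets relative to the reference part."""
    ref = positions[ref_index]
    rest = np.delete(positions, ref_index, axis=0)
    return rest - ref

print(relative_shape(np.array([[10.0, 20.0], [40.0, 25.0], [25.0, 45.0]])))
```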
Slide 34: Shape representation
- Figure from: M. C. Burl and P. Perona, "Recognition of Planar Object Classes", CVPR 1996
- Dryden & Mardia: distributions in shape space (for a Gaussian in figure space)
- Scale/rotation invariance is difficult; the authors only implement translation invariance
Slide 35: Classification formulation
- Two classes: object absent vs. object present
- Null hypothesis h_0: all detections are in the BG
- Decision: MAP, given the detections (compare the two posteriors; schematically below)
- Is this a true hypothesis-testing setup?
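Schematically, in the notation reconstructed above, the decision compares the likelihood of the detections summed over all non-null hypotheses against the null hypothesis h_0 and thresholds the ratio; the threshold is assumed here, e.g. picked from an ROC curve.

```latex
% Schematic decision rule: accept "object present" when the ratio of
% evidence for any non-null hypothesis to the null hypothesis exceeds tau.
\[
\frac{\sum_{h \neq h_0} p(X^o, h)}{p(X^o, h_0)} \;\gtrless\; \tau
\]
```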
Slide 36: Model details
- Start with a pool of candidate parts
- Greedily choose an optimal part configuration (sketched below):
  - start with a random selection
  - randomly replace one of the parts and keep the change if performance improves
  - stop when there is no more improvement
- Set the number of parts and start over; this may be optimized a bit
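A sketch of that greedy search. Here train_and_score is a caller-supplied placeholder for "fit the constellation model with these parts and score it", and the stopping rule (a fixed number of unsuccessful swaps) is an assumption.

```python
# Greedy part-configuration search: random restarts of a single-swap
# hill climb over the candidate pool.
import random

def greedy_part_selection(candidates, num_parts, train_and_score, max_stale=20):
    """candidates: list of candidate parts; train_and_score: callable
    mapping a list of parts to a scalar score (higher is better)."""
    current = random.sample(candidates, num_parts)
    best = train_and_score(current)
    stale = 0
    while stale < max_stale:
        proposal = list(current)
        proposal[random.randrange(num_parts)] = random.choice(candidates)
        score = train_and_score(proposal)
        if score > best:
            current, best, stale = proposal, score, 0
        else:
            stale += 1
    return current, best
```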
Slide 37: ML estimation using EM (figure ©WWP)
1. Start from the current parameter estimate
2. Assign probabilities to the candidate constellations in every image (image 1: large P, image 2: small P, ..., image i: value of the pdf)
3. Use these probabilities as weights to re-estimate the parameters; e.g., large P × (statistics of image 1) + small P × (statistics of image 2) + … = new estimate of the parameter (loop sketched below)
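The overall iteration, as a code sketch: e_step and m_step are caller-supplied placeholders standing in for the posterior computation and the weighted re-estimation described on these slides, and the fixed iteration count is an assumption (a convergence test on the likelihood could be used instead).

```python
# Schematic EM loop for the constellation-model parameters theta
# (shape Gaussian, occlusion table p(b), BG means M).
def fit_constellation_model(detections_per_image, theta_init,
                            e_step, m_step, num_iters=50):
    """e_step(detections, theta) -> posterior over hypotheses;
    m_step(detections_per_image, posteriors) -> new theta."""
    theta = theta_init
    for _ in range(num_iters):
        # E-step: posterior over hypotheses (and missing parts) per image
        posteriors = [e_step(d, theta) for d in detections_per_image]
        # M-step: re-estimate parameters from posterior-weighted statistics
        theta = m_step(detections_per_image, posteriors)
    return theta
```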
Slide 38: Model parameter estimation
The model must account for:
- the probability of a hypothesized constellation on the object
- the probability of missing a part of the true object
- the probability of the observed number of detections in the BG
The parameters: the shape mean and covariance, the occlusion table p(b), and the BG means M. The hidden variables must be inferred from the observed data: EM, maximizing the likelihood.
Slide 39: Experiment 1: faces
- 200 images with faces, 30 people
- 200 BG images from the same environment
- Grayscale, 240 x 160 pixels
- Random split into train and test sets
- Parts: 11x11 pixels
- Tried 2-, 3-, 4-, and 5-part models
Slide 40: Learned model: faces (figure)
Slide 41: Sample results: faces
Slide 42: Sample results: faces
Slide 43: ROC curves (93.5% correct)
Slide 44: Experiment 2: cars (images high-pass filtered)
Slide 45: Results: cars (86.5% correct)
Slide 46: Results: cars (figure)
Slide 47: Other data sets…
Slide 48: Experiment 3: occlusion
- More overfitting
- Larger parts behave worse?
Slide 49: Experiment 4: multi-scale (figure)
Slide 50: Advantages
- Unsupervised: not misled by our intuition, and doesn't require expensive labeling
- Handles occlusion in a well-defined probabilistic way
- Some pose invariance; promising results in view-based training
- Potentially fast (once trained)
Slide 51: Apparent limitations
- Unsupervised (not guided by our intuition)
- There must be at most a single object per image
- The current model doesn't handle repeated parts (e.g. identical windows)
- Rotation and scale invariance are only theoretically possible so far
- Very expensive training
- Illumination?
Slide 52: That's all… Discussion (hopefully)
Slide 53: EM
- In each iteration we want to find the parameters maximizing the likelihood of the observed data; instead, we maximize Q(θ | θ_old)
- θ: the value being optimized for (current iteration); θ_old: the current value (from the previous iteration); see the expression below
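For reference, the quantity maximized in each iteration is the standard EM auxiliary function, written here with the hidden variables h and x^m from the reconstruction above.

```latex
% Standard EM auxiliary function: expectation of the complete-data
% log-likelihood over the hidden variables, taken under the previous
% parameter estimate theta_old.
\[
Q(\theta \mid \theta_{\mathrm{old}}) =
\mathbb{E}\!\left[\log p(X^o, x^m, h \mid \theta)\;\middle|\;X^o, \theta_{\mathrm{old}}\right]
\]
```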
Slide 54: EM: update rules
The updates are expectations with respect to the posterior density over the hidden variables (schematic form below).
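Schematically, and as a reconstruction rather than a copy of the slide, the M-step sets each parameter to an average of posterior expectations over the I training images; z_i denotes the stacked FG part coordinates of image i, b_i its detection pattern, and n_i its vector of BG counts.

```latex
% Schematic M-step: every update is a posterior-expected sufficient
% statistic averaged over the training images.
\begin{align*}
\mu &\leftarrow \frac{1}{I}\sum_{i=1}^{I} \mathbb{E}[z_i] &
\Sigma &\leftarrow \frac{1}{I}\sum_{i=1}^{I} \mathbb{E}[z_i z_i^{\top}] - \mu\mu^{\top} \\
p(\bar b) &\leftarrow \frac{1}{I}\sum_{i=1}^{I} \mathbb{E}[\delta_{b_i,\bar b}] &
M &\leftarrow \frac{1}{I}\sum_{i=1}^{I} \mathbb{E}[n_i]
\end{align*}
```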