Generic Object Detection using Feature Maps
Oscar Danielsson (osda02@kth.se), Stefan Carlsson (stefanc@kth.se)
Outline
- Intro: Problem Statement
- Related Work: Famous Object Detectors
- Our Work: Motivation, Training, Hierarchical Detection, Experiments
- Conclusion
Detect all Instances of an Object Class
The classifier needs to be fast (on average). This is typically accomplished by:
1. Using image features that can be computed quickly
2. Using a cascade of increasingly complex classifiers (Viola and Jones, IJCV 2004)
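A minimal sketch of the cascade idea (illustrative names, not code from any of the cited papers): cheap stages reject most windows early, so the expensive stages only run on the few windows that survive.

```python
# Illustrative cascade evaluation: `stages` is a list of (classifier,
# threshold) pairs ordered from cheapest to most expensive.
def cascade_accept(window, stages):
    for classifier, threshold in stages:
        if classifier(window) < threshold:
            return False   # rejected early; costlier stages never run
    return True            # survived every stage: report a detection
```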
Famous Object Detectors (1)
Dalal and Triggs (CVPR 05) use a dense Histogram of Oriented Gradients (HOG) representation: the window is tiled into (overlapping) sub-regions and the gradient orientation histograms from all sub-regions are concatenated. A linear SVM is used for classification.
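As a hedged illustration of this pipeline (not the authors' code), HOG plus a linear SVM can be sketched with scikit-image and scikit-learn; the HOG parameters below are the customary choices, not necessarily the paper's exact settings.

```python
# Sketch of the Dalal-Triggs pipeline: one dense HOG descriptor per
# window, classified with a linear SVM.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def describe(window):
    # Tile the window into 8x8-pixel cells, histogram gradient orientations
    # per cell, normalize over overlapping 2x2-cell blocks, concatenate.
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys',
               feature_vector=True)

def train(windows, labels):
    # windows: grayscale arrays of equal size; labels: 1 = object, 0 = not
    X = np.stack([describe(w) for w in windows])
    return LinearSVC().fit(X, labels)

# svm.decision_function([describe(test_window)]) then gives the margin
# used to accept or reject a window.
```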
Famous Object Detectors (2)
Felzenszwalb et al. (PAMI 10, CVPR 10) extend the Dalal and Triggs model to include high-resolution parts with flexible locations.
Famous Object Detectors (3)
Viola and Jones (IJCV 2004) construct a weak classifier by thresholding the response of a Haar filter (computed using integral images). Weak classifiers are combined into a strong classifier using AdaBoost.
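A small sketch of the mechanism (an assumed re-implementation, not Viola and Jones' code): the integral image turns any rectangle sum into four lookups, which makes Haar filter responses cheap enough for exhaustive window scanning.

```python
import numpy as np

def integral_image(img):
    # ii[y, x] = sum of img[:y, :x]; the zero row/column padding means the
    # rectangle-sum formula needs no boundary checks.
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def rect_sum(ii, y, x, h, w):
    # Sum over img[y:y+h, x:x+w] in O(1) via four lookups.
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect_vertical(ii, y, x, h, w):
    # Two-rectangle Haar feature: top half minus bottom half.
    top = rect_sum(ii, y, x, h // 2, w)
    bottom = rect_sum(ii, y + h // 2, x, h - h // 2, w)
    return top - bottom

# A weak classifier then just thresholds this response:
# h(window) = 1 if polarity * response < polarity * theta else 0
```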
Motivation
[Figure: example feature maps: corners, corners + blobs, regions, edges]
Different object classes are characterized by different features, so we want to leave the choice of features up to the user. We therefore construct an object detector based on feature maps. Any feature detectors, in any combination, can be used to generate feature maps.
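For concreteness, feature maps might be produced like this with scikit-image; the detectors and parameters below are illustrative stand-ins for whatever the user chooses.

```python
# Each map is a boolean image marking where one feature type occurs.
import numpy as np
from skimage.feature import canny, corner_harris, corner_peaks, blob_log

def feature_maps(gray):
    maps = {'edges': canny(gray, sigma=2.0)}

    corners = np.zeros(gray.shape, dtype=bool)
    for y, x in corner_peaks(corner_harris(gray), min_distance=5):
        corners[y, x] = True
    maps['corners'] = corners

    blobs = np.zeros(gray.shape, dtype=bool)
    for y, x, _sigma in blob_log(gray, max_sigma=20):
        blobs[int(y), int(x)] = True
    maps['blobs'] = blobs
    return maps
```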
Our Object Detector
We use AdaBoost to build a strong classifier. We construct a weak classifier by thresholding the distance from a measurement point to the closest occurrence of a given feature.
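A minimal sketch of such a weak classifier, assuming the distance transforms of the feature maps have been precomputed (names are illustrative):

```python
def weak_classify(dist_maps, feature, point, threshold, polarity=1):
    # dist_maps[feature][y, x] = distance from (y, x) to the closest
    # occurrence of `feature`; point is a (y, x) tuple.
    d = dist_maps[feature][point]
    return 1 if polarity * d <= polarity * threshold else -1
```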
Extraction of Training Data
1. Feature maps are extracted by some external feature detectors
2. Distance transforms are computed for each feature map
3. For each training window, the distances from each measurement point to the closest occurrence of the corresponding feature are concatenated into a vector
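A hedged sketch of steps 2 and 3 using SciPy's Euclidean distance transform; the (feature, dy, dx) layout of the measurement points, relative to the window's top-left corner, is an assumption made for illustration.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distance_maps(feature_maps):
    # distance_transform_edt measures distance to the nearest zero, so
    # inverting a boolean feature map gives distance to the nearest feature.
    return {name: distance_transform_edt(~fmap)     # step 2, once per map
            for name, fmap in feature_maps.items()}

def training_vector(dist_maps, measurement_points, wy, wx):
    # Step 3: one distance per measurement point, concatenated.
    return np.array([dist_maps[f][wy + dy, wx + dx]
                     for f, dy, dx in measurement_points])
```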
Training Cascade: Viola-Jones Cascade Construction
Learner hierarchy: Cascade > Strong Learner > Weak Learner > Decision Stump Learner. Inputs at this level: training examples { f_i } and background images { I_j }.
1. Require positive training examples and background images
2. Randomly sample background images to extract negative training examples
3. Loop:
   1. Train strong classifier
   2. Append strong classifier to the current cascade
   3. Run the cascade on the background images to harvest false positives
   4. If the number of false positives is sufficiently small, stop
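A sketch of this loop; train_strong, sample_windows and scan_for_false_positives are assumed helpers, not functions from the paper.

```python
def train_cascade(positives, background_images, max_false_positives=100):
    cascade = []
    negatives = sample_windows(background_images)        # step 2
    while True:                                          # step 3
        strong = train_strong(positives, negatives)      # 3.1
        cascade.append(strong)                           # 3.2
        # 3.3: new negatives = windows the whole cascade still accepts
        negatives = scan_for_false_positives(cascade, background_images)
        if len(negatives) <= max_false_positives:        # 3.4
            return cascade
```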
Training Cascade: AdaBoost (the Strong Learner)
Inputs: labeled training examples { f_i }, { c_i } and the number of rounds T.
1. Require labeled training examples and the number of rounds
2. Initialize the weights of the training examples
3. For each round:
   1. Train a weak classifier
   2. Compute the weight of the weak classifier
   3. Update the weights of the training examples
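A minimal discrete-AdaBoost sketch matching these steps (the paper also evaluates Real and Gentle AdaBoost); labels c_i are in {-1, +1} and train_weak is an assumed helper.

```python
import numpy as np

def adaboost(features, labels, T, train_weak):
    # features: list of training examples; labels: np.ndarray in {-1, +1}
    n = len(labels)
    d = np.full(n, 1.0 / n)                      # step 2: init example weights
    strong = []                                  # list of (alpha, weak) pairs
    for _ in range(T):                           # step 3
        h = train_weak(features, labels, d)      # 3.1: train weak classifier
        pred = np.array([h(f) for f in features])
        err = np.sum(d[pred != labels])
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # 3.2: weak weight
        d *= np.exp(-alpha * labels * pred)      # 3.3: reweight examples
        d /= d.sum()
        strong.append((alpha, h))
    return strong    # H(f) = sign(sum of alpha * h(f) over all rounds)
```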
Training Cascade: Decision Tree Learner (the Weak Learner)
Inputs: labeled and weighted training examples { f_i }, { c_i }, { d_i }.
1. Require labeled and weighted training examples
2. Compute the node output
3. Train a decision stump
4. Split the training examples using the decision stump
5. Evaluate the stopping conditions
6. Train a decision tree on the left subset of the training examples
7. Train a decision tree on the right subset of the training examples
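A sketch of the recursion, assuming a stump object with a goes_left() split method; the depth and size limits are illustrative stopping conditions (the experiments below suggest keeping trees shallow).

```python
import numpy as np

def weighted_majority(labels, weights):
    # Node output: sign of the weighted label sum (labels in {-1, +1}).
    return 1 if np.dot(weights, labels) >= 0 else -1

def train_tree(features, labels, weights, train_stump, depth=0, max_depth=2):
    # features, labels, weights are NumPy arrays of equal length.
    node = {'output': weighted_majority(labels, weights)}       # step 2
    if depth >= max_depth or len(labels) < 10:                  # step 5
        return node
    stump = train_stump(features, labels, weights)              # step 3
    mask = stump.goes_left(features)                            # step 4
    node['stump'] = stump
    node['left'] = train_tree(features[mask], labels[mask],     # step 6
                              weights[mask], train_stump, depth + 1, max_depth)
    node['right'] = train_tree(features[~mask], labels[~mask],  # step 7
                               weights[~mask], train_stump, depth + 1, max_depth)
    return node
```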
Training Cascade: Feature and Threshold Selection (the Decision Stump Learner)
Inputs: labeled and weighted training examples { f_i }, { c_i }, { d_i }.
1. Require labeled and weighted training examples
2. For each measurement point:
   1. Compute a threshold by assuming exponentially distributed distances
   2. Compute the classification error after the split
   3. If the error is lower than previous errors, store the threshold and the measurement point
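One way to realize step 2.1, though the paper may do it differently: fit an exponential to each class's weighted distances by maximum likelihood and put the threshold where the weighted class densities cross. The closed form below is our reading of the slide, not code from the paper.

```python
import numpy as np

def exponential_threshold(dist, labels, weights):
    # dist: distances at one measurement point; labels in {-1, +1}.
    pos, neg = labels > 0, labels <= 0
    w_p, w_n = weights[pos].sum(), weights[neg].sum()
    lam_p = w_p / np.dot(weights[pos], dist[pos])   # weighted MLE rate, positives
    lam_n = w_n / np.dot(weights[neg], dist[neg])   # weighted MLE rate, negatives
    if np.isclose(lam_p, lam_n):
        return np.median(dist)                      # degenerate: no crossing
    # Solve w_p * lam_p * exp(-lam_p*t) = w_n * lam_n * exp(-lam_n*t) for t.
    return np.log((w_p * lam_p) / (w_n * lam_n)) / (lam_p - lam_n)
```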
Hierarchical Detection
Evaluate an "optimistic" classifier on regions of the search space and split positive regions recursively (a sketch of the recursion follows).
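A sketch of this recursion; optimistic_accept and the region's split() method are assumed helpers.

```python
def hierarchical_detect(region, optimistic_accept, min_size, detections):
    # region covers a block of (x, y, scale) search space.
    if not optimistic_accept(region):
        return                         # the whole region is pruned at once
    if region.size <= min_size:
        detections.append(region)      # fine enough: report a detection
        return
    for sub in region.split():         # e.g. halve along x, y or scale
        hierarchical_detect(sub, optimistic_accept, min_size, detections)
```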
Hierarchical Detection
[Figure: search space with axes (x, y, s) and the corresponding window in image space]
Each point in the search space corresponds to a window in the image, and each measurement point then has a definite location in the image.
Hierarchical Detection
[Figure: a region in (x, y, s) search space and the corresponding set of windows in image space]
A region in the search space corresponds to a set of windows in the image. This translates into a set of locations for the measurement point.
Hierarchical Detection
[Figure: the set of measurement-point locations in image space induced by a search-space region]
We can then compute upper and lower bounds on the distance to the closest occurrence of the corresponding feature. Based on these bounds we construct an optimistic classifier.
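One standard way to obtain such bounds, assumed here for illustration, uses the fact that a Euclidean distance transform D is 1-Lipschitz: if every candidate location of the measurement point lies within radius r of a center (cy, cx), then max(0, D[cy, cx] - r) <= D <= D[cy, cx] + r.

```python
def distance_bounds(dist_map, cy, cx, r):
    # Lipschitz bounds on the distance over all locations within radius r.
    d = dist_map[cy, cx]
    return max(0.0, d - r), d + r

def optimistic_stump(dist_map, cy, cx, r, threshold, polarity=1):
    lo, hi = distance_bounds(dist_map, cy, cx, r)
    # polarity=+1 fires when the distance is below the threshold, so the
    # optimistic answer uses the lower bound; polarity=-1 uses the upper.
    # A region is rejected only if every window inside it would be.
    d_opt = lo if polarity > 0 else hi
    return 1 if polarity * d_opt <= polarity * threshold else -1
```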
Experiments
Detection results were obtained on the ETHZ Shape Classes dataset, which was used for testing only.
Training data was downloaded from Google Images: 106 apple logos, 128 bottles, 270 giraffes, 233 mugs and 165 swans.
A detection is counted as correct if A_intersection / A_union ≥ 0.2 (the intersection-over-union overlap of the detected and ground-truth boxes).
Features used: edges, corners, blobs, Kadir-Brady + SIFT + quantization.
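The overlap criterion, spelled out (boxes as (x0, y0, x1, y1)):

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes.
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

# correct = iou(detected_box, truth_box) >= 0.2
```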
Results: Real AdaBoost performs slightly better than Discrete and Gentle AdaBoost.
Results: Decision tree weak classifiers should be shallow.
Results: Using all features is better than using only edges.
Results: Using the asymmetric weighting scheme of Viola and Jones yields a slight improvement.
Results
[Figure: example detections for apple logos, bottles, mugs and swans]
Results: Hierarchical search yields a significant speed-up.
Conclusion
- Proposed an object detection scheme based on feature maps
- Used distances from measurement points to the nearest feature occurrences in the image to construct weak classifiers for boosting
- Showed promising detection performance on the ETHZ Shape Classes dataset
- Showed that a hierarchical detection scheme can yield significant speed-ups
Thanks for listening!
Famous Object Detectors (4)
Laptev (IVC 09) constructs a weak classifier using a linear discriminant on a histogram of oriented gradients (HOG, computed using integral histograms) from a sub-region of the window. Again, weak classifiers are combined into a strong classifier using AdaBoost.