
1 EE462 MLCV Lecture 5-6 Object Detection – Boosting Tae-Kyun Kim

2 EE462 MLCV Face Detection Demo 2

3 EE462 MLCV 3 Multiclass object detection [Torralba et al PAMI 07]

4 EE462 MLCV Object Detection 4 From TUD dataset

5 EE462 MLCV Object Detection 5 From TUD dataset

6 EE462 MLCV Object Detection 6 From TUD dataset

7 EE462 MLCV Number of windows: (# of pixels) × (# of scales) = 747,666 windows to scan per image.

8 EE462 MLCV Time per window: each window is represented as a feature vector (e.g. its raw pixels) of dimension D; number of feature vectors to classify: 747,666.

9 EE462 MLCV Time per window: with 747,666 feature vectors (each of dimension D, e.g. raw pixels), each window must be classified in about 0.00000134 sec in order to finish the task in 1 sec. Could a neural network or a nonlinear SVM evaluate a window that fast?
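The per-window budget follows directly from the window count; as a quick check, assuming the 1-second budget and the 747,666 windows stated above:

```latex
\frac{1\ \text{sec}}{747{,}666\ \text{windows}} \approx 1.34\times 10^{-6}\ \text{sec/window} \approx 1.3\ \mu\text{s per window}.
```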

10 EE462 MLCV Examples of face detection 10 From Viola, Jones, 2001

11 EE462 MLCV More traditionally… narrow down the search space by integrating visual cues [Darrell et al IJCV 00]: face pattern detection output (left), connected components recovered from stereo range data (middle), and flesh hue regions from skin hue classification (right).

12 EE462 MLCV Since about 2001… Boosting Simple Features [Viola & Jones 01]: AdaBoost classification, with weak classifiers given by Haar-like basis functions (45,396 in total) and the strong classifier formed as their weighted combination (see the sketch below).
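A minimal sketch of the form this takes, assuming the standard AdaBoost/Viola-Jones notation (the symbols $f_t$, $\theta_t$, $p_t$, $\alpha_t$ are the usual ones, not necessarily the lecture's): each weak classifier thresholds a single Haar-like feature value, and the strong classifier is the sign of a weighted vote.

```latex
h_t(\mathbf{x}) =
\begin{cases}
+1 & \text{if } p_t f_t(\mathbf{x}) < p_t \theta_t \\
-1 & \text{otherwise}
\end{cases}
\qquad
H(\mathbf{x}) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t\, h_t(\mathbf{x})\right)
```

Here $f_t(\mathbf{x})$ is one Haar-like feature evaluated on window $\mathbf{x}$, $\theta_t$ a threshold, $p_t \in \{\pm 1\}$ a polarity, and $\alpha_t$ the weight assigned to the $t$-th weak classifier by AdaBoost.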

13 EE462 MLCV Introduction to Boosting Classifiers - AdaBoost

14 EE462 MLCV Sorry for the inconsistent notation… 14

15 EE462 MLCV Boosting 15

16 EE462 MLCV Boosting 16

17 EE462 MLCV Boosting  Iteratively reweight the training samples, giving higher weights to previously misclassified samples (see the code sketch below).  (Figure: the classifier after 1, 2, 3, 4, 5 and 50 rounds of boosting.)
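A minimal sketch of this loop in Python, illustrative only (not the lecture's or Viola-Jones' code): decision stumps on single feature dimensions stand in for the Haar-feature weak learners.

```python
import numpy as np

def train_adaboost(X, y, T=50):
    """Discrete AdaBoost with decision stumps.
    X: (n, d) feature matrix, y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(T):
        # exhaustively pick the stump (feature, threshold, polarity)
        # with the lowest weighted error under the current weights
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for p in (+1, -1):
                    pred = np.where(p * X[:, j] < p * thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, p)
        err, j, thr, p = best
        err = min(max(err, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)        # weight of this weak learner
        pred = np.where(p * X[:, j] < p * thr, 1, -1)
        w *= np.exp(-alpha * y * pred)               # up-weight misclassified samples
        w /= w.sum()
        stumps.append((j, thr, p))
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    score = np.zeros(X.shape[0])
    for (j, thr, p), a in zip(stumps, alphas):
        score += a * np.where(p * X[:, j] < p * thr, 1, -1)
    return np.sign(score)
```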

18 EE462 MLCV 18

19 EE462 MLCV AdaBoost 19

20 EE462 MLCV 20

21 EE462 MLCV Boosting 21

22 EE462 MLCV Boosting as an optimisation framework 22

23 EE462 MLCV Minimising Exponential Error 23
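The slide title refers to the standard result that AdaBoost greedily minimises an exponential error; a sketch under the usual notation (these symbols are not taken from the slides):

```latex
E = \sum_{i=1}^{N} \exp\!\bigl(-y_i\, F_T(\mathbf{x}_i)\bigr),
\qquad
F_T(\mathbf{x}) = \sum_{t=1}^{T} \alpha_t\, h_t(\mathbf{x}).
```

Minimising $E$ one round at a time over $(\alpha_t, h_t)$, with earlier terms fixed, yields the AdaBoost updates: $h_t$ minimises the weighted error $\varepsilon_t$, and

```latex
\alpha_t = \tfrac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t},
\qquad
w_i^{(t+1)} \propto w_i^{(t)} \exp\!\bigl(-\alpha_t\, y_i\, h_t(\mathbf{x}_i)\bigr).
```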

24 EE462 MLCV 24

25 EE462 MLCV 25

26 EE462 MLCV 26

27 EE462 MLCV 27

28 EE462 MLCV Existence of weak learners  Definition of a baseline learner: given the current data weights, the baseline classifier predicts the same class label (the one with the larger total weight) for all x, so its weighted error is at most ½.  Each weak learner in boosting is demanded to do strictly better than this baseline, i.e. to have weighted error strictly below ½; the error of the composite hypothesis then goes to zero as the number of boosting rounds increases [Duffy et al 00] (see the bound below).
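A sketch of why a strict edge over the baseline suffices, using the standard AdaBoost training-error bound (the edge notation $\gamma_t$ is ours, not the slide's): if each weak learner has weighted error $\varepsilon_t \le \tfrac{1}{2} - \gamma_t$ with $\gamma_t > 0$, then

```latex
\frac{1}{N}\sum_{i=1}^{N}\mathbf{1}\bigl[H(\mathbf{x}_i)\neq y_i\bigr]
\;\le\; \prod_{t=1}^{T} 2\sqrt{\varepsilon_t(1-\varepsilon_t)}
\;\le\; \exp\!\Bigl(-2\sum_{t=1}^{T}\gamma_t^{2}\Bigr),
```

which tends to zero as $T$ grows provided the edges $\gamma_t$ stay bounded away from zero.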

29 EE462 MLCV Robust real-time object detector

30 EE462 MLCV Boosting Simple Features [Viola and Jones CVPR 01]: AdaBoost classification, with weak classifiers given by Haar-like basis functions (45,396 in total) and the strong classifier formed as their weighted combination.

31 EE462 MLCV 31

32 EE462 MLCV Evaluation (testing) 32 From Viola, Jones, 2001

33 EE462 MLCV Boosting Simple Features [Viola and Jones CVPR 01]  Integral image  A value at (x,y) is the sum of the pixel values above and to the left of (x,y).  The integral image can be computed in one pass over the original image. 33
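The one-pass computation uses the pair of recurrences from the original Viola-Jones formulation (standard notation: $i$ the image, $ii$ the integral image, $s$ the cumulative row sum, with $s(x,-1)=0$ and $ii(-1,y)=0$):

```latex
ii(x,y) = \sum_{x' \le x,\ y' \le y} i(x', y'),
\qquad
\begin{aligned}
s(x,y)  &= s(x, y-1) + i(x,y),\\
ii(x,y) &= ii(x-1, y) + s(x,y).
\end{aligned}
```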

34 EE462 MLCV Boosting Simple Features [Viola and Jones CVPR 01]  Integral image  The sum of the original image values within a rectangle can be computed from only four integral-image values at its corners: Sum = A − B − C + D, where A and D are the values at two diagonally opposite corners (bottom-right and top-left) and B and C are the values at the other two corners.  This provides fast evaluation of the Haar-like basis features.
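A minimal sketch in Python of the integral image and the constant-time rectangle sum (illustrative only; the function names here are ours, not from the lecture):

```python
import numpy as np

def integral_image(img):
    """Integral image with a zero row/column prepended, so that
    ii[y, x] = sum of img[:y, :x] and border rectangles need no special case."""
    return np.pad(np.cumsum(np.cumsum(img, axis=0), axis=1), ((1, 0), (1, 0)))

def rect_sum(ii, top, left, height, width):
    """Sum of the image over rows [top, top+height) and cols [left, left+width),
    using four integral-image lookups (the A - B - C + D pattern)."""
    bottom, right = top + height, left + width
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

def haar_two_rect(ii, top, left, height, width):
    """Response of a two-rectangle (left half minus right half) Haar-like feature."""
    half = width // 2
    return (rect_sum(ii, top, left, height, half)
            - rect_sum(ii, top, left + half, height, width - half))
```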

35 EE462 MLCV Evaluation (testing) 35 From Viola, Jones, 2001

36 EE462 MLCV Boosting as a Tree-structured Classifier

37 EE462 MLCV Boosting (a very shallow network)  The strong classifier H built from boosted decision stumps has a flat structure.  Cf. decision “ferns” have been shown to outperform “trees” [Zisserman et al, 07] [Fua et al, 07].

38 EE462 MLCV Boosting – continued  A strong boosting classifier: good generalisation from the flat structure, fast evaluation, sequential optimisation.  Boosting cascade [Viola & Jones 04] and boosting chain [Xiao et al]: a very imbalanced tree; speeds up heavily unbalanced binary problems (almost all windows are negatives); hard to design.

39 EE462 MLCV A cascade of classifiers  The detection system requires a good detection rate and an extremely low false positive rate.  For a cascade, the overall false positive rate and detection rate are the products F = f_1 · f_2 · … · f_K and D = d_1 · d_2 · … · d_K, where f_i and d_i are the false positive and detection rates of the i-th classifier on the examples that get through to it.  The expected number of features evaluated per window is N = n_0 + Σ_i ( n_i · Π_{j<i} p_j ), where n_i is the number of features in the i-th classifier and p_j is the proportion of windows passed on by the j-th classifier, so Π_{j<i} p_j is the proportion of windows that reach the i-th classifier. (A worked example follows below.)
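A quick worked example with the per-stage numbers discussed in Viola and Jones' paper (assumed here; the slide itself gives no specific values): a cascade of K = 10 stages, each with detection rate d_i = 0.99 and false positive rate f_i = 0.30, yields

```latex
D = 0.99^{10} \approx 0.90,
\qquad
F = 0.30^{10} \approx 6 \times 10^{-6},
```

i.e. roughly 90% detection with only a handful of false positives among the ~750,000 windows of a single image.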

40 EE462 MLCV Demo video: Fast evaluation 40

41 EE462 MLCV Object Detection by a Cascade of Classifiers 41 Pictures from Romdhani et al. ICCV01

42 EE462 MLCV Matlab Demo for Boosting 42

