1
Computer Vision, Spring 2012, 15-385/685. Instructor: S. Narasimhan, WH 5409, T-R 10:30am – 11:50am. Lecture #23
2
Classification and SVM. Credits: Guru Krishnan, Shree Nayar, and Eric P. Xing
3
Classification (Supervised Learning). Data: X = {x_1, x_2, …, x_n}. Labels: Y = {y_1, y_2, …, y_n}, with y_i ∈ {-1, 1} (e.g., face images labeled 1, non-face images labeled -1). Test: given a new sample x_t, predict -1 or 1 — face or not? Classifier: a function f: X → Y.
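The setup on this slide can be sketched with toy data. This is a minimal illustration, not the lecture's method: the 1-D "classifier" below simply thresholds halfway between the class means, and all data values are made up.

```python
# Toy supervised-classification setup: data X, labels Y in {-1, +1},
# and a learned classifier f: X -> Y. Values are hypothetical.

def train_threshold_classifier(X, Y):
    """Learn a trivial 1-D threshold classifier f: X -> {-1, +1}.
    The threshold sits halfway between the class means (an assumption,
    not the lecture's method -- just to make f concrete)."""
    pos = [x for x, y in zip(X, Y) if y == 1]
    neg = [x for x, y in zip(X, Y) if y == -1]
    t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x_t: 1 if x_t > t else -1

X = [0.9, 1.1, 1.0, -1.2, -0.8, -1.0]   # data   {x_1, ..., x_n}
Y = [1, 1, 1, -1, -1, -1]               # labels {y_1, ..., y_n}
f = train_threshold_classifier(X, Y)
print(f(0.7), f(-0.5))                   # classify two test points
```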
4
Classification and Computer Vision: face detection, object recognition, scene classification, handwritten characters.
5
Classification and Computer Vision: pedestrian detection (Yes / No).
6
Procedure of Classification: (1) Gather positive/negative training data (e.g., faces labeled 1, non-faces labeled -1). (2) Feature extraction: map each sample to a vector in R^n. Very challenging in reality…
7
Feature Extraction into R^n: pixel values, color histograms, texture filters, SIFT, Haar filters. Binary codes are also available!
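As a hedged sketch of one feature on this list, the function below computes a color histogram over a list of (r, g, b) pixels. The bin count and per-channel normalization are arbitrary choices, not the lecture's; the point is that images of any size become fixed-length vectors in R^n.

```python
# Color-histogram feature: map an image (list of (r, g, b) pixels, 0-255)
# to a fixed-length vector, so any image becomes a point in R^n.

def color_histogram(pixels, bins_per_channel=4):
    """Concatenated per-channel histograms, each normalized to sum to 1."""
    width = 256 // bins_per_channel
    hist = [0] * (3 * bins_per_channel)
    for r, g, b in pixels:
        for ch, v in enumerate((r, g, b)):
            hist[ch * bins_per_channel + min(v // width, bins_per_channel - 1)] += 1
    n = len(pixels)
    return [c / n for c in hist]

# A made-up 3-pixel "image"; real inputs would be full images.
feat = color_histogram([(255, 0, 0), (250, 10, 5), (0, 0, 255)])
print(len(feat))   # 12-dimensional feature vector (3 channels x 4 bins)
```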
8
Feature Space. [Figure: face and non-face training data plotted as points on axes f_1, f_2, …, f_N.] (3) Learn a classifier! Let the n-dimensional feature vector F be a point in an n-D space R^n (the feature space).
9
Nearest Neighbor Classifier. [Figure: a test image plotted among the face and non-face training data in feature space.] The nearest samples decide the result of the classifier.
10
Nearest Neighbor Classifier. [Figure: the test point falls among the face training samples.] The nearest samples decide the result of the classifier: Face.
11
Nearest Neighbor Classifier. [Figure: the test point falls among the non-face training samples.] The nearest samples decide the result of the classifier: Not Face.
12
Nearest Neighbor Classifier. [Figure: a false positive — a non-face test point lands nearest a face training sample.] The larger the training set, the more robust the NN classifier.
13
Nearest Neighbor Classifier. [Figure: a dense training set in feature space.] The larger the training set, the slower the NN classifier.
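The nearest-neighbor rule in these slides can be sketched in a few lines. The face/non-face feature values below are made up for illustration; note that classification scans the whole training set, which is the speed cost the slide points out.

```python
# 1-nearest-neighbor classifier: the closest training sample decides
# the label. Cost per query grows linearly with the training set.

def nn_classify(train_feats, train_labels, test_feat):
    def dist2(a, b):                      # squared Euclidean distance
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, label = min(zip(train_feats, train_labels),
                   key=lambda fl: dist2(fl[0], test_feat))
    return label

faces = [(1.0, 1.0), (1.2, 0.9)]          # hypothetical face features
non_faces = [(-1.0, -1.0), (-0.8, -1.1)]  # hypothetical non-face features
feats = faces + non_faces
labels = [1, 1, -1, -1]
print(nn_classify(feats, labels, (0.9, 1.1)))    # nearest a face sample
```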
14
Decision Boundary. [Figure: face and non-face training data in feature space.] A simple decision boundary separating the face and non-face classes will suffice.
15
Decision Boundary. Find a decision boundary in feature space: the hyperplane W^T F + b = 0, with W^T F + b > 0 on the face side and W^T F + b < 0 on the non-face side.
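The linear decision rule W^T F + b ≷ 0 can be written directly. The weights w and offset b below are hypothetical values, not learned ones; learning them is the subject of the following slides.

```python
# Linear decision rule: the sign of w^T F + b picks the class.

def linear_classify(w, b, F):
    score = sum(wi * fi for wi, fi in zip(w, F)) + b   # w^T F + b
    return 1 if score > 0 else -1                      # >0: face, <0: non-face

w, b = [1.0, 1.0], 0.0          # assumed weights for illustration
print(linear_classify(w, b, [0.5, 0.8]))    # point on the positive side
print(linear_classify(w, b, [-0.6, -0.3]))  # point on the negative side
```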
16
Decision Boundary. Many boundaries separate the face and non-face classes — how do we find the optimal one?
17
Evaluating a Decision Boundary. Margin (or safe zone): the width by which the boundary could be increased before hitting a data point.
18
Evaluating a Decision Boundary. Choose the decision boundary with the maximum margin! [Figure: two candidate boundaries with margins I and II give different results for the same test point. Decision I: Face; Decision II: Non-Face.]
19
Support Vector Machine (SVM). A classifier optimized to maximize the margin. Support vectors: the data samples closest to the boundary. The decision boundary and the margin depend only on the support vectors.
20
Finding Decision Boundary. The distance from a point x to the plane W^T F + b = 0 is d = (w^T x + b) / ||w|| (Hessian normal form). Place the margin planes at W^T F + b = c and W^T F + b = -c, a distance ρ on either side of the boundary. For a support vector x_s with w^T x_s + b = c, the margin is ρ = (w^T x_s + b) / ||w|| = c / ||w||. The optimization problem:
max_w 2c / ||w||
s.t. (w^T x_i + b) / ||w|| ≥ c / ||w|| for all x_i with y_i = 1
(w^T x_i + b) / ||w|| ≤ -c / ||w|| for all x_i with y_i = -1
21
Finding Decision Boundary. The two constraints combine into one: max_w 2c / ||w|| s.t. y_i (w^T x_i + b) ≥ c for all x_i. Since c only rescales w and b, set c = 1: min_w ||w|| s.t. y_i (w^T x_i + b) ≥ 1 for all x_i. Learning the classifier: compute w by solving this quadratic program (QP).
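As a hedged alternative to a full QP solver, the sketch below minimizes the closely related soft-margin objective (1/2)||w||^2 + C · Σ hinge(y_i (w^T x_i + b)) by subgradient descent. The toy data, C, step size, and epoch count are arbitrary choices; a real implementation would use a QP/SMO solver such as LIBSVM. It is only meant to make min ||w|| s.t. y_i (w^T x_i + b) ≥ 1 concrete.

```python
# Linear SVM by subgradient descent on the hinge-loss objective
# (a sketch standing in for the QP solver; hyperparameters are arbitrary).

def train_linear_svm(X, Y, C=10.0, lr=0.01, epochs=2000):
    n_dim = len(X[0])
    w, b = [0.0] * n_dim, 0.0
    for _ in range(epochs):
        gw, gb = list(w), 0.0           # gradient of (1/2)||w||^2 is w
        for x, y in zip(X, Y):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:              # hinge active: subgradient -y*x
                gw = [gwi - C * y * xi for gwi, xi in zip(gw, x)]
                gb -= C * y
        w = [wi - lr * gwi for wi, gwi in zip(w, gw)]
        b -= lr * gb
    return w, b

X = [(2.0, 2.0), (2.5, 1.5), (-2.0, -2.0), (-1.5, -2.5)]  # toy, separable
Y = [1, 1, -1, -1]
w, b = train_linear_svm(X, Y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1 for x in X]
print(preds == Y)   # the learned boundary separates the two classes
```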
22
Kernel Trick. [Figure: 2-D points that are linearly separable.] How about these points in 1-D, where one class sits around 0 and the other flanks it on both sides? Not linearly separable — no single threshold works (No!). But think about a quadratic mapping Φ(x) = x²: after mapping, one threshold separates the classes (Yes!).
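The slide's 1-D example can be checked numerically. The point values below are made up, but the conclusion matches the slide: no threshold on x separates the classes, while a threshold on Φ(x) = x² does.

```python
# The quadratic map phi(x) = x^2 turns a 1-D problem that no single
# threshold can solve into one that a single threshold solves.

def phi(x):
    return x * x                     # the quadratic feature map

inner = [-0.5, 0.0, 0.5]             # class +1 (toy values, assumed)
outer = [-2.0, -1.5, 1.5, 2.0]       # class -1 flanks it on both sides

# No single threshold on x separates the classes...
separable_1d = max(inner) < min(outer) or max(outer) < min(inner)
# ...but in phi-space any threshold in (0.25, 2.25) does.
separable_phi = max(phi(x) for x in inner) < min(phi(x) for x in outer)
print(separable_1d, separable_phi)   # -> False True
```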
23
Kernel Trick. [Figure: data separable only by a curved boundary in the original space becomes linearly separable after applying Φ(x).] Such a curved boundary is mathematically hard to describe directly!