Slide 1: Support vector machines
Rutgers CS440, Fall 2003
Reading: Ch. 20, Sec. 6, AIMA 2nd Ed.
Slide 2: Linearly separable sets
Which separating line (plane, hyperplane) should we choose? A hyperplane is given by w'x - w0 = 0. Choose the one with the largest distance to the two sets; more precisely, the largest margin. The signed distance from a point x_p to the hyperplane is
  b = (w'x_p - w0) / ||w||
[Figure: two classes in the (x1, x2) plane, the separating hyperplane w'x - w0 = 0, its normal vector w, and a point x_p at distance (w'x_p - w0) / ||w||.]
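The distance formula on this slide is easy to check numerically. A minimal sketch in Python with NumPy; the hyperplane and query point here are arbitrary values chosen for illustration, not from the slides:

```python
import numpy as np

# Example hyperplane w'x - w0 = 0 (values chosen for illustration)
w = np.array([3.0, 4.0])    # normal vector, so ||w|| = 5
w0 = 5.0
x_p = np.array([3.0, 4.0])  # a query point

# Signed distance from x_p to the hyperplane: b = (w'x_p - w0) / ||w||
b = (w @ x_p - w0) / np.linalg.norm(w)
print(b)  # (9 + 16 - 5) / 5 = 4.0
```

The sign of b tells you which side of the hyperplane x_p lies on, which is exactly what the classifier uses.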
Slide 3: Maximizing the margin
Support vectors: the points x_sv closest to the separating hyperplane w'x - w0 = 0. They lie at distance b on either side, on the margin boundaries for the classes y = +1 and y = -1.
[Figure: the hyperplane in the (x1, x2) plane with margin b on both sides; the support vectors sit on the margin boundaries.]
Slide 4: Maximization problem
Maximize the margin b such that y_i (w'x_i - w0) >= b ||w|| for all i. Fixing the scale so that b ||w|| = 1, this becomes: minimize (1/2) ||w||^2 such that y_i (w'x_i - w0) >= 1. This is a quadratic programming problem. Each constraint carries a Lagrange multiplier alpha_i >= 0, and at the optimum alpha_i > 0 only for the support vectors (in the figure: alpha_1 = 0, alpha_2 = 0, alpha_3 > 0, alpha_4 > 0, alpha_5 = 0).
[Figure: the hyperplane w'x - w0 = 0 with margin b on both sides; each point is labeled with its multiplier alpha_i.]
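The quadratic program on this slide can be handed directly to an off-the-shelf constrained optimizer. A minimal sketch using SciPy's SLSQP on a tiny hand-made dataset (the data, and the packing of variables as [w1, w2, w0], are my own choices for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data (chosen for illustration)
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Variables packed as v = [w1, w2, w0]; objective is (1/2)||w||^2
def objective(v):
    w = v[:2]
    return 0.5 * (w @ w)

# One margin constraint per point: y_i (w'x_i - w0) - 1 >= 0
constraints = [
    {"type": "ineq", "fun": lambda v, xi=xi, yi=yi: yi * (v[:2] @ xi - v[2]) - 1.0}
    for xi, yi in zip(X, y)
]

res = minimize(objective, x0=np.zeros(3), method="SLSQP", constraints=constraints)
w, w0 = res.x[:2], res.x[2]
print(w, w0)  # expect w near (0.5, 0) and w0 near 0: margin boundaries at x1 = +/-2
```

The points (2, 0) and (-2, 0) end up exactly on the margin boundaries w'x - w0 = +/-1, so they are the support vectors; the other two constraints are slack, mirroring the alpha_i = 0 / alpha_i > 0 pattern on the slide.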
Slide 5: Linearly non-separable problems
We know how to find a "best" hyperplane (e.g., a line) when the sets of points are linearly separable. What if they are not? Options:
1. Find a hyperplane that makes the fewest errors.
2. Reformulate the problem: map the original space into one where the points are linearly separable. New space (z1, z2, z3):
[Figure: the mapped points in the new (z1, z2, z3) space.]
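The slide's actual mapping to (z1, z2, z3) is lost in the transcript. A commonly used quadratic map with the same shape is z = (x1^2, sqrt(2) x1 x2, x2^2); the sketch below uses it (an assumption, not necessarily the slide's exact map) to show two radius-separated classes, which no line in the plane can split, becoming linearly separable in z:

```python
import numpy as np

def phi(x):
    # Quadratic feature map R^2 -> R^3 (a standard choice; the slide's exact map is not shown)
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

# Inner class on the unit circle, outer class on a circle of radius 2:
# not linearly separable in the (x1, x2) plane.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
inner = np.stack([np.cos(angles), np.sin(angles)], axis=1)
outer = 2.0 * inner

# In z-space, z1 + z3 = x1^2 + x2^2 = r^2, so the plane z1 + z3 = 2.5 separates the classes.
z_inner = np.array([phi(x) for x in inner])
z_outer = np.array([phi(x) for x in outer])
print(z_inner[:, 0] + z_inner[:, 2])  # all 1.0
print(z_outer[:, 0] + z_outer[:, 2])  # all 4.0
```

Because z1 + z3 is just the squared radius, a single linear threshold in z-space implements a circular decision boundary in the original plane.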
Slide 6: Kernel machines
Construct a mapping F from the input space X to some other space Z, and build a linear SVM in Z. Alternatively, introduce kernels: a nonlinear kernel SVM in X is equivalent to a linear SVM in Z = F(X), while the linear kernel recovers the ordinary linear SVM in X. Common kernels: linear, polynomial, Gaussian.
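The three kernels named on the slide can be written down directly; a sketch (the parameter names degree, c, and gamma are my own, not from the slides), ending with a check of the kernel-trick identity: the degree-2 polynomial kernel (x'y)^2 equals the dot product of the quadratic feature map, so the SVM never has to compute F(x) explicitly:

```python
import numpy as np

def linear_kernel(x, y):
    return x @ y

def polynomial_kernel(x, y, degree=2, c=0.0):
    return (x @ y + c) ** degree

def gaussian_kernel(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

# Kernel trick check: (x'y)^2 == phi(x)'phi(y) for phi(x) = (x1^2, sqrt(2) x1 x2, x2^2)
def phi(x):
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
print(polynomial_kernel(x, y))  # (3 - 2)^2 = 1.0
print(phi(x) @ phi(y))          # same value, computed in the feature space
```

This is the point of the slide: evaluating the kernel in X costs one dot product, while the equivalent computation in Z grows with the dimension of the feature space.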
Slide 7: Example: SVMs for face detection (Osuna et al., '97)