1
SUPPORT VECTOR MACHINES
2
Interesting facts: Vladimir Vapnik invented Support Vector Machines in 1979. SVMs have been developed in the framework of Statistical Learning Theory.
3
Two-Class Linear Discrimination: Discriminating between two classes of points involves determining a linear function, that is, a linear combination of the attributes.
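As a concrete statement of that function (standard notation, not spelled out on the slide), the discriminant and the resulting decision rule can be written as

f(x) = w \cdot x + b, \qquad \text{class}(x) = \operatorname{sign}\bigl(f(x)\bigr) = \begin{cases} +1 & \text{if } w \cdot x + b \ge 0, \\ -1 & \text{otherwise,} \end{cases}

where $w$ is a vector of weights over the attributes and $b$ is a scalar offset.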
4
The Classification Problem. Separating surface: find the surface that best separates the two classes of points, A+ and A-.
5
Definition: 2-SVM. In its simplest, linear form, an SVM is a hyperplane that separates a set of positive examples from a set of negative examples with maximum margin.
6
Linear case: The two hyperplanes bounding the margin are $w \cdot x + b = 1$ and $w \cdot x + b = -1$, where $w$ is the normal to the hyperplane. The perpendicular distance between them is $2 / \|w\|$.
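That distance follows from the standard formula for the gap between two parallel planes sharing the normal $w$ (a one-line derivation added here for completeness):

\text{distance} = \frac{\bigl| (1 - b) - (-1 - b) \bigr|}{\|w\|} = \frac{2}{\|w\|}.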
7
Now finding the hyperplanes with the largest margin reduces to finding values for $w$ and $b$ that minimize $\|w\|^2$ subject to the constraints $y_i (w \cdot x_i + b) - 1 \ge 0$.
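A minimal numerical sketch of this optimization, assuming scikit-learn and NumPy are available; SVC with a linear kernel and a very large C approximates the hard-margin problem stated above (the toy data points are invented for illustration):

import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data (made up for illustration).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],   # class +1
              [0.0, 0.0], [1.0, 0.5], [0.5, 1.5]])  # class -1
y = np.array([1, 1, 1, -1, -1, -1])

# A very large C makes the soft-margin solver behave like the hard-margin
# problem: minimize ||w||^2 subject to y_i (w . x_i + b) >= 1.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_[0]        # the normal vector w
b = clf.intercept_[0]   # the offset b
print("w =", w, "b =", b)
print("margin 2/||w|| =", 2.0 / np.linalg.norm(w))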
8
A standard way of handling the optimization is through minimization of the Lagrangian $L_P = \tfrac{1}{2} \|w\|^2 - \sum_i \alpha_i \bigl[ y_i (w \cdot x_i + b) - 1 \bigr]$, where the $\alpha_i \ge 0$ are Lagrange multipliers.
9
By differentiating with respect to $w$ and $b$ we get $w = \sum_i \alpha_i y_i x_i$ and $\sum_i \alpha_i y_i = 0$, and substituting these back gives the dual objective $L_D = \sum_i \alpha_i - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j$.
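Collecting these pieces, the dual problem is usually stated in full as follows (standard formulation, with the multiplier constraints made explicit):

\max_{\alpha} \; L_D = \sum_i \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j \quad \text{subject to} \quad \alpha_i \ge 0, \;\; \sum_i \alpha_i y_i = 0.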
10
So now the classification relation for a new point $v$ is $f(v) = \operatorname{sign}\bigl( \sum_i \alpha_i y_i \, x_i \cdot v + b \bigr)$.
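A small sketch of that relation, again assuming scikit-learn; after fitting, dual_coef_ holds the products $\alpha_i y_i$ for the support vectors, so the manual sum below should match the built-in decision function (toy data invented for illustration):

import numpy as np
from sklearn.svm import SVC

# Toy data (made up for illustration).
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [1.0, 0.5]])
y = np.array([1, 1, -1, -1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

v = np.array([1.5, 2.0])          # a new point to classify
alpha_y = clf.dual_coef_[0]       # alpha_i * y_i for the support vectors
sv = clf.support_vectors_         # the support vectors x_i
f_v = np.sum(alpha_y * (sv @ v)) + clf.intercept_[0]

print("manual  f(v) =", f_v)
print("sklearn f(v) =", clf.decision_function([v])[0])
print("predicted class =", np.sign(f_v))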
11
Nonlinear case: Often one cannot separate the two classes with a straight line. The structure of the SVM equations allows a simple solution: map the data, through a nonlinear transformation $\Phi$, to a different space where the data can be separated with a hyperplane.
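A classic illustration of such a mapping (not from the slides): XOR-style data cannot be split by a line in the plane, but adding the product feature $x_1 x_2$ as a third coordinate makes it linearly separable.

import numpy as np
from sklearn.svm import SVC

# XOR-style data: no straight line in the (x1, x2) plane separates it.
X = np.array([[1, 1], [-1, -1], [1, -1], [-1, 1]], dtype=float)
y = np.array([1, 1, -1, -1])

# Nonlinear map Phi(x1, x2) = (x1, x2, x1*x2): in this 3-D space the
# classes fall on opposite sides of the plane x1*x2 = 0.
Phi = np.column_stack([X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])

clf = SVC(kernel="linear", C=1e6).fit(Phi, y)
print("training accuracy in mapped space:", clf.score(Phi, y))  # expect 1.0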
12
So now $L_D = \sum_i \alpha_i - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, \Phi(x_i) \cdot \Phi(x_j)$, and the classification relation is $f(v) = \operatorname{sign}\bigl( \sum_i \alpha_i y_i \, \Phi(x_i) \cdot \Phi(v) + b \bigr)$.
13
Suppose $K(x, y) = \Phi(x) \cdot \Phi(y)$. Then the classification relation becomes $f(v) = \operatorname{sign}\bigl( \sum_i \alpha_i y_i \, K(x_i, v) + b \bigr)$.
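A quick numeric check of the kernel identity for the degree-2 polynomial kernel $K(x, y) = (x \cdot y)^2$ in two dimensions, whose explicit feature map is $\Phi(x) = (x_1^2, \sqrt{2}\, x_1 x_2, x_2^2)$ (a standard textbook example, not taken from the slides):

import numpy as np

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel in 2-D."""
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

def K(x, y):
    """Kernel K(x, y) = (x . y)^2, computed without ever forming phi."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

print(np.dot(phi(x), phi(y)))  # 1.0, since (1*3 + 2*(-1))^2 = 1
print(K(x, y))                 # 1.0, the same value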
14
K-SVM: In multi-class discrimination, one has to construct a discriminant function to separate each class from the remaining k-1 classes.
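A brief sketch of this one-versus-rest scheme, assuming scikit-learn; OneVsRestClassifier fits one binary SVM per class against the remaining k-1 classes (the three-class toy data is made up for illustration):

import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Toy 3-class data (made up): one small cluster per class.
X = np.array([[0, 0], [0, 1], [5, 5], [5, 6], [10, 0], [10, 1]], dtype=float)
y = np.array([0, 0, 1, 1, 2, 2])

# One binary SVM per class, each separating that class from the other k-1.
clf = OneVsRestClassifier(SVC(kernel="linear", C=10.0)).fit(X, y)

print("number of binary classifiers:", len(clf.estimators_))  # 3
print("predictions:", clf.predict(X))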
15
Questions?