SUPPORT VECTOR MACHINES
Interesting Statistics: Vladimir Vapnik invented Support Vector Machines. SVMs have been developed in the framework of Statistical Learning Theory.
Two-Class Linear Discrimination
The method of discrimination for two classes of points involves determining a linear function: a linear combination of the attributes.
The Classification Problem
Separating surface: find the surface that best separates the two classes, A+ and A-.
Definition: 2-SVM
In its simplest, linear form, an SVM is a hyperplane that separates a set of positive examples from a set of negative examples with maximum margin.
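As a minimal illustrative sketch (the toy data and the scikit-learn library are assumptions, not part of these slides), a maximum-margin linear separator can be fit directly:

```python
# Sketch: fitting a maximum-margin linear SVM on toy 2-D data.
# The data points and the use of scikit-learn are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes: A+ (label +1) and A- (label -1).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],   # A+
              [0.0, 0.0], [1.0, 0.5], [0.5, 1.0]])  # A-
y = np.array([1, 1, 1, -1, -1, -1])

# A large C approximates the hard-margin SVM described here.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
print("w =", clf.coef_[0], " b =", clf.intercept_[0])
print("predictions:", clf.predict(X))
```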
Linear case
The two hyperplanes are $w \cdot x + b = 1$ and $w \cdot x + b = -1$, where $w$ is the normal to the hyperplane. The perpendicular distance between them is $2/\|w\|$.
Now finding the hyperplanes with the largest margin reduces to finding values for $w$ and $b$ that minimize $\frac{1}{2}\|w\|^2$, subject to the constraints $y_i (w \cdot x_i + b) - 1 \ge 0$.
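As a rough sketch of this step (the toy data and the choice of a generic solver are assumptions; dedicated QP solvers are normally used), the primal problem can be handed to scipy's constrained minimizer:

```python
# Sketch: solving the primal problem  min (1/2)||w||^2
# s.t. y_i (w . x_i + b) - 1 >= 0, with a generic solver (SLSQP).
# Toy data and solver choice are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [1.0, 0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def objective(p):            # p = (w1, w2, b)
    w = p[:2]
    return 0.5 * w @ w

# One inequality constraint per training point.
constraints = [{"type": "ineq",
                "fun": (lambda p, i=i: y[i] * (p[:2] @ X[i] + p[2]) - 1.0)}
               for i in range(len(y))]

res = minimize(objective, x0=np.zeros(3), constraints=constraints)
w, b = res.x[:2], res.x[2]
print("w =", w, " b =", b, " margin =", 2 / np.linalg.norm(w))
```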
A standard way of handling the optimization is through minimization of the Lagrangian
$$L_P = \frac{1}{2}\|w\|^2 - \sum_i \alpha_i\, y_i (w \cdot x_i + b) + \sum_i \alpha_i.$$
By differentiating with respect to $w$ and $b$, we get $w = \sum_i \alpha_i y_i x_i$ and $\sum_i \alpha_i y_i = 0$, giving the dual form
$$L_D = \sum_i \alpha_i - \frac{1}{2}\sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, (x_i \cdot x_j).$$
So now the classification relation for a new point $v$ is $f(v) = \operatorname{sign}\!\left(b + \sum_i \alpha_i y_i\, (x_i \cdot v)\right)$.
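A sketch of the dual route (again with invented toy data and a generic solver, both assumptions): maximize $L_D$ over the $\alpha_i$, then recover $w$ and $b$ and classify:

```python
# Sketch: solving the dual  max L_D = sum_i a_i - (1/2) sum_ij a_i a_j y_i y_j (x_i . x_j)
# s.t. a_i >= 0 and sum_i a_i y_i = 0, then recovering w and b.
# Toy data and the generic SLSQP solver are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [1.0, 0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
G = (y[:, None] * X) @ (y[:, None] * X).T   # G_ij = y_i y_j (x_i . x_j)

def neg_dual(a):                            # minimize -L_D
    return 0.5 * a @ G @ a - a.sum()

res = minimize(neg_dual, x0=np.zeros(len(y)),
               bounds=[(0, None)] * len(y),
               constraints=[{"type": "eq", "fun": lambda a: a @ y}])
a = res.x
w = (a * y) @ X                             # w = sum_i a_i y_i x_i
sv = a > 1e-6                               # support vectors have a_i > 0
b = np.mean(y[sv] - X[sv] @ w)              # from y_i (w . x_i + b) = 1
print("decision:", np.sign(X @ w + b))
```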
Nonlinear case
One cannot always separate two classes with a straight line. The structure of the SVM equations allows a simple solution: map the data, through a nonlinear transformation $\Phi$, to a different space where the data can be separated with a hyperplane.
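As a concrete example of such a mapping (the particular map is an illustrative choice, not from the slides): the quadratic map below sends 2-D points to 3-D so that the inner product in the new space equals the squared inner product in the original space.

```python
# Sketch: an explicit nonlinear map Phi (illustrative choice).
# Phi(x) = (x1^2, sqrt(2) x1 x2, x2^2) satisfies Phi(x) . Phi(y) = (x . y)^2.
import numpy as np

def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x, z = np.array([1.0, 2.0]), np.array([3.0, 1.0])
print(phi(x) @ phi(z))      # 25.0
print((x @ z) ** 2)         # 25.0 -- same value, computed without mapping
```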
So now
$$L_D = \sum_i \alpha_i - \frac{1}{2}\sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, \Phi(x_i) \cdot \Phi(x_j),$$
and the classification relation is $f(v) = \operatorname{sign}\!\left(b + \sum_i \alpha_i y_i\, \Phi(x_i) \cdot \Phi(v)\right)$.
Suppose $K(x, y) = \Phi(x) \cdot \Phi(y)$. Then the classification relation becomes $f(v) = \operatorname{sign}\!\left(b + \sum_i \alpha_i y_i\, K(x_i, v)\right)$.
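A minimal sketch of the kernel form of the decision rule (the RBF kernel choice and the "trained" multipliers below are hypothetical assumptions for illustration):

```python
# Sketch: kernel classification relation
# f(v) = sign(b + sum_i a_i y_i K(x_i, v)), shown with an RBF kernel.
# Kernel choice, support vectors, and multipliers are illustrative assumptions.
import numpy as np

def rbf(x, v, gamma=0.5):
    return np.exp(-gamma * np.sum((x - v) ** 2))

def classify(v, X_sv, y_sv, alpha, b):
    # The sum runs only over support vectors (points with alpha_i > 0).
    return np.sign(b + sum(a * yi * rbf(xi, v)
                           for a, yi, xi in zip(alpha, y_sv, X_sv)))

X_sv = np.array([[0.0, 0.0], [1.0, 1.0]])
y_sv = np.array([-1.0, 1.0])
alpha = np.array([0.8, 0.8])     # hypothetical trained multipliers
print(classify(np.array([0.9, 1.1]), X_sv, y_sv, alpha, b=0.0))
```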
K-SVM
In multi-class discrimination, one has to construct a discriminant function to separate each class from the remaining $k-1$ classes, as in the sketch below.
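A minimal one-vs-rest sketch (the toy 3-class data and the scikit-learn library are assumptions): train $k$ binary classifiers and pick the class whose decision value is largest.

```python
# Sketch: one-vs-rest multi-class discrimination with k binary linear SVMs.
# Toy 3-class data is invented for illustration.
import numpy as np
from sklearn.svm import LinearSVC

X = np.array([[0.0, 0.0], [0.5, 0.2], [5.0, 5.0],
              [5.2, 4.8], [0.0, 5.0], [0.3, 5.2]])
y = np.array([0, 0, 1, 1, 2, 2])

# One classifier per class: class c vs. the remaining k-1 classes.
clfs = [LinearSVC(C=1.0).fit(X, (y == c).astype(int)) for c in np.unique(y)]

def predict(v):
    # Choose the class whose "c vs. rest" decision value is largest.
    scores = [clf.decision_function([v])[0] for clf in clfs]
    return int(np.argmax(scores))

print(predict(np.array([4.9, 5.1])))   # expected: class 1
```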
Questions?