The following slides are taken from:
Support Vector Machines
Trends & Controversies, May 2002. Intelligent Multimedia Lab.
CONTENTS
- Theory of SV Learning
- How to Implement SVM
- SVM Applications
- Conclusion
Theory of SV Learning
- Introduction
- Learning Pattern Recognition from Examples
- Hyperplane Classifiers
- Feature Spaces & Kernels
- Architecture of Support Vector Machines
Introduction: What are the benefits of SV learning?
- Based on a simple idea
- High performance in practical applications
Characteristics of the SV method:
- Can deal with complex nonlinear problems (pattern recognition, regression, feature extraction)
- But works with a simple linear algorithm (through the use of kernels)
Learning Pattern Recognition from Examples(1)
Training data: $(x_1, y_1), \ldots, (x_\ell, y_\ell) \in \mathbb{R}^N \times \{-1, +1\}$
We want to estimate a function $f: \mathbb{R}^N \to \{-1, +1\}$ using the training data.
Empirical risk: $R_{\mathrm{emp}}[f] = \frac{1}{\ell} \sum_{i=1}^{\ell} \frac{1}{2} |f(x_i) - y_i|$
Risk: $R[f] = \int \frac{1}{2} |f(x) - y| \, dP(x, y)$
Learning Pattern Recognition from Examples(2)
Structural Risk Minimization
- VC dimension $h$: a property of a set of functions; the maximum number of training points that can be shattered by the set.
  Ex) In $\mathbb{R}^2$, the VC dimension of the set of oriented lines is 3.
- VC theory provides bounds on the test error that depend on both the empirical risk and the capacity of the function class (stated explicitly below).
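A standard form of this bound (due to Vapnik; $h$ is the VC dimension, $\ell$ the number of training examples, $\eta$ a confidence parameter) states that with probability at least $1 - \eta$,
$$R[f] \;\le\; R_{\mathrm{emp}}[f] \;+\; \sqrt{\frac{h\left(\ln\frac{2\ell}{h} + 1\right) - \ln\frac{\eta}{4}}{\ell}}$$
Structural risk minimization chooses the function class so as to trade off the two terms on the right-hand side.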
Hyperplane Classifiers(1)
Class of hyperplanes: $\langle w, x \rangle + b = 0$, $w \in \mathbb{R}^N$, $b \in \mathbb{R}$
Decision functions: $f(x) = \mathrm{sgn}(\langle w, x \rangle + b)$
Among all separating hyperplanes, choose the one with the maximum margin of separation.
[Figure: two classes labeled +1 and -1 in the (x1, x2) plane, separated by the hyperplane $\langle w, x \rangle + b = 0$]
Hyperplane Classifiers(2)
In canonical form the training data satisfy $y_i(\langle w, x_i \rangle + b) \ge 1$, $i = 1, \ldots, \ell$, and the margin of separation equals $2/\|w\|$; maximizing the margin therefore means minimizing $\|w\|$.
Hyperplane Classifiers(3)
To construct the optimal hyperplane:
Minimize $\tau(w) = \frac{1}{2} \|w\|^2$
Subject to $y_i(\langle w, x_i \rangle + b) \ge 1$, $i = 1, \ldots, \ell$
This constrained optimization problem is handled via the Lagrangian
$$L(w, b, \alpha) = \frac{1}{2} \|w\|^2 - \sum_{i=1}^{\ell} \alpha_i \big( y_i(\langle w, x_i \rangle + b) - 1 \big)$$
Hyperplane Classifiers(4)
The derivatives of $L$ with respect to the primal variables must vanish:
$$\frac{\partial L}{\partial b} = 0 \Rightarrow \sum_i \alpha_i y_i = 0, \qquad \frac{\partial L}{\partial w} = 0 \Rightarrow w = \sum_i \alpha_i y_i x_i$$
KKT condition: $\alpha_i \big( y_i(\langle w, x_i \rangle + b) - 1 \big) = 0$, $i = 1, \ldots, \ell$
Support vectors: the training points $x_i$ whose $\alpha_i$ is nonzero.
Dual optimization problem:
Maximize $W(\alpha) = \sum_i \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \langle x_i, x_j \rangle$
Subject to $\alpha_i \ge 0$, $\sum_i \alpha_i y_i = 0$
Decision function: $f(x) = \mathrm{sgn}\big( \sum_i y_i \alpha_i \langle x, x_i \rangle + b \big)$
Feature Spaces and Kernels(1)
Map the input space to some other dot product space $F$ (the feature space) via a nonlinear mapping $\Phi: \mathbb{R}^N \to F$.
Kernels: Evaluating the decision function requires only dot products $\langle \Phi(x), \Phi(x_i) \rangle$, never the mapped patterns in explicit form. These dot products can be evaluated by a simple kernel $k(x, y) = \langle \Phi(x), \Phi(y) \rangle$.
Feature Spaces and Kernels(2)
Example kernels:
Polynomial kernel: $k(x, y) = (\langle x, y \rangle)^d$
If $d = 2$ and $x, y \in \mathbb{R}^2$:
$$(\langle x, y \rangle)^2 = (x_1 y_1 + x_2 y_2)^2 = \big\langle (x_1^2, \sqrt{2}\, x_1 x_2, x_2^2), (y_1^2, \sqrt{2}\, y_1 y_2, y_2^2) \big\rangle = \langle \Phi(x), \Phi(y) \rangle$$
A numerical check of this identity is sketched below.
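A minimal check of the identity above (a sketch assuming NumPy; the function names are illustrative):

```python
import numpy as np

def poly_kernel(x, y, d=2):
    """Polynomial kernel k(x, y) = <x, y>^d."""
    return np.dot(x, y) ** d

def phi(x):
    """Explicit degree-2 feature map for x in R^2."""
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

# The kernel gives the feature-space dot product without
# ever forming phi(x) explicitly.
print(poly_kernel(x, y))     # (1*3 + 2*4)^2 = 121.0
print(phi(x) @ phi(y))       # same value: 121.0
```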
Architecture of SVMs
Nonlinear classifier (using a kernel):
$$f(x) = \mathrm{sgn}\Big( \sum_i y_i \alpha_i \, k(x, x_i) + b \Big)$$
The coefficients $\alpha_i$ of the decision function are computed as the solution of a quadratic program.
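As a sketch of how such a classifier is evaluated once trained (assuming NumPy; `alphas`, `b`, and the support vectors stand for whatever the quadratic program returned):

```python
import numpy as np

def svm_decision(x, support_vectors, sv_labels, alphas, b, kernel):
    """f(x) = sgn( sum_i y_i * alpha_i * k(x, x_i) + b )."""
    s = sum(a * y * kernel(x, sv)
            for a, y, sv in zip(alphas, sv_labels, support_vectors))
    return np.sign(s + b)
```

Only the support vectors (nonzero $\alpha_i$) contribute, which is what keeps the expansion sparse.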
How to Implement SVM
- Optimization Problem
- Solving the Quadratic Program
Optimization Problem(1)
Simple example (XOR problem), with kernel $k(x, x_i) = (1 + \langle x, x_i \rangle)^2$:

Input vector x    Desired output d
(-1, -1)          -1
(-1, +1)          +1
(+1, -1)          +1
(+1, +1)          -1

Kernel matrix:
$$K = \begin{pmatrix} 9 & 1 & 1 & 1 \\ 1 & 9 & 1 & 1 \\ 1 & 1 & 9 & 1 \\ 1 & 1 & 1 & 9 \end{pmatrix}$$
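This kernel matrix is easy to reproduce (a sketch assuming NumPy):

```python
import numpy as np

X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
d = np.array([-1, 1, 1, -1], dtype=float)   # XOR labels

# Gram matrix of the kernel k(x, y) = (1 + <x, y>)^2
K = (1.0 + X @ X.T) ** 2
print(K)
# [[9. 1. 1. 1.]
#  [1. 9. 1. 1.]
#  [1. 1. 9. 1.]
#  [1. 1. 1. 9.]]
```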
Optimization Problem(2)
Simple example (cont.)
Solving the dual gives $\alpha_i = 1/8$ for all four points: all four input vectors are support vectors.
In feature space, $w = [0 \; 0 \; -1/\sqrt{2} \; 0 \; 0 \; 0]^T$, which corresponds to the decision function $f(x) = \mathrm{sgn}(-x_1 x_2)$.
Optimization Problem(3)
We want to find the $\alpha$ that maximizes $W(\alpha)$ subject to $\alpha_i \ge 0$, $\sum_i \alpha_i y_i = 0$, by iteratively increasing the value of $W(\alpha)$.
Stopping conditions:
1. Monitor the growth of the objective function: stop when the fractional rate of increase of $W(\alpha)$ falls below a threshold.
2. Monitor the KKT conditions: stop when they are satisfied to within a tolerance.
3. Monitor the gap between the primal and dual objective values: stop when it is small relative to the solution.
Optimization Problem(4)
The Naive Solution: Gradient Ascent
The $i$-th component of the gradient of $W(\alpha)$ is $1 - y_i \sum_j \alpha_j y_j k(x_i, x_j)$. A runnable version follows the pseudo-code.

Given training set S and learning rate $\eta$:
Repeat
    for each i in the training set:
        $\alpha_i \leftarrow \alpha_i + \eta \big( 1 - y_i \sum_j \alpha_j y_j k(x_i, x_j) \big)$
        $\alpha_i \leftarrow \max(\alpha_i, 0)$   (keep the constraint $\alpha_i \ge 0$)
    end for
Until stopping criterion satisfied
return $\alpha$
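Applied to the XOR example above (a minimal sketch assuming NumPy; the $\alpha_i \ge 0$ constraint is enforced by clipping, and the equality constraint $\sum_i \alpha_i y_i = 0$ happens to hold automatically at the XOR solution, so it is not enforced here):

```python
import numpy as np

X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1], dtype=float)

K = (1.0 + X @ X.T) ** 2     # degree-2 polynomial kernel
Q = np.outer(y, y) * K       # Q_ij = y_i y_j k(x_i, x_j)

alpha = np.zeros(4)
eta = 0.05                   # learning rate
for _ in range(1000):
    grad = 1.0 - Q @ alpha   # i-th component: 1 - y_i sum_j alpha_j y_j k(x_i, x_j)
    alpha = np.maximum(alpha + eta * grad, 0.0)   # ascent step, clipped to alpha >= 0

print(alpha)                 # -> [0.125 0.125 0.125 0.125]: all four points are support vectors
```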
Solving Quadratic Programming(1)
Maximizing $W(\alpha)$ is equivalent to minimizing
$$\frac{1}{2} \alpha^T Q \alpha - \sum_i \alpha_i \quad \text{subject to} \quad \sum_i \alpha_i y_i = 0, \;\; \alpha_i \ge 0,$$
where $Q$ is the $N \times N$ matrix ($N$ = number of training examples) with $Q_{ij} = y_i y_j k(x_i, x_j)$; it depends on the training inputs, the labels, and the SVM functional form. A problem of this form is called a quadratic program and can be handed to a QP package (MINOS, LOQO, the MATLAB optimization toolbox, etc.).
Solving Quadratic Programming(2)
Solving QP problems: the Q matrix can be very large, so memory capacity becomes the limitation. Two ways around this:
1. Use a sophisticated algorithm that calculates only the active rows or columns of Q.
   [ref] "Solving the Quadratic Programming Problem Arising in Support Vector Classification", Linda Kaufman, Bell Labs, in Advances in Kernel Methods: Support Vector Learning.
2. Decomposition methods: decompose the large-scale QP problem into a series of smaller QP problems.
   - Chunking
   - Osuna's algorithm
   - SMO
Chunking
Idea: the value of the objective function is unchanged if we remove all rows and columns of the matrix Q that correspond to zero $\alpha_i$, so the large QP problem breaks down into a series of smaller QP problems. A toy sketch of the loop follows the pseudo-code.

Pseudo-code:
Given training set S
Select an arbitrary working set $\hat{S} \subseteq S$
Repeat
    solve the optimization problem on $\hat{S}$
    select a new working set: the current support vectors plus the data points that most violate the KKT conditions
Until stopping criterion satisfied
return $\alpha$
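A toy, self-contained sketch of this loop (assuming NumPy; the sub-problem solver is the naive projected gradient ascent from above, the bias/equality constraint is ignored for simplicity, and all names and tolerances are illustrative):

```python
import numpy as np

def solve_subproblem(Q, alpha, work, eta=0.05, iters=500):
    """Projected gradient ascent on the working-set variables only;
    all other alphas stay fixed."""
    for _ in range(iters):
        grad = 1.0 - Q[work] @ alpha          # gradient restricted to the working set
        alpha[work] = np.maximum(alpha[work] + eta * grad, 0.0)
    return alpha

def chunking(K, y, chunk_size=2, sweeps=20, tol=1e-3):
    Q = np.outer(y, y) * K
    alpha = np.zeros(len(y))
    work = np.arange(min(chunk_size, len(y)))  # arbitrary initial working set
    for _ in range(sweeps):
        alpha = solve_subproblem(Q, alpha, work)
        # KKT check (hard margin, no bias): y_i f(x_i) >= 1 where alpha_i = 0,
        # and y_i f(x_i) = 1 where alpha_i > 0.
        margins = Q @ alpha
        violation = np.where(alpha > 0, np.abs(margins - 1.0),
                             np.maximum(1.0 - margins, 0.0))
        if violation.max() < tol:
            break
        # new working set: current support vectors plus the worst KKT violators
        work = np.union1d(np.flatnonzero(alpha > 0),
                          np.argsort(-violation)[:chunk_size])
    return alpha
```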
Osuna's Method
Keeps a constant-size matrix for every QP sub-problem, so it can handle very large training sets. Requires a numerical QP package.

Pseudo-code:
Given training set S
Select an arbitrary working set B of free variables; the remaining variables form the fixed set N
While the KKT conditions are violated (there exists some $j \in N$ whose KKT condition fails)
    select a new set B: replace some $\alpha_i \in B$ with $\alpha_j$
    solve the optimization problem on B
return $\alpha$
SMO (Sequential Minimal Optimization)
- No extra matrix storage
- No numerical QP optimization step
- Each step modifies only two components of $\alpha$
- Needs more iterations to converge, but each iteration requires only a few operations, so overall it is faster
- The QP decomposition is similar to Osuna's method: at each iteration, SMO chooses just two multipliers $\alpha_i, \alpha_j$, finds their optimal values analytically, and updates the SVM to reflect the new values
Three components of SMO (see the sketch below):
1. An analytic method to solve for the two Lagrange multipliers
2. A heuristic for choosing which multipliers to optimize
3. A method for computing the bias b
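A sketch in the spirit of SMO (a simplified variant: the second multiplier is chosen at random rather than by Platt's full heuristic, and C, the tolerance, and max_passes are illustrative choices; assuming NumPy):

```python
import numpy as np

def smo_train(K, y, C=10.0, tol=1e-3, max_passes=20):
    """Simplified SMO: analytically optimize two multipliers per step."""
    n = len(y)
    alpha, b = np.zeros(n), 0.0
    f = lambda i: np.sum(alpha * y * K[i]) + b        # current f(x_i)
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = f(i) - y[i]
            # work on i only if it violates the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                Ej = f(j) - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box constraints for alpha_j along the constraint line
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # curvature along the line
                if eta >= 0:
                    continue
                # analytic update of alpha_j, clipped to [L, H]
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] += y[i] * y[j] * (aj_old - alpha[j])  # keep sum alpha_i y_i fixed
                # recompute the bias b from the two updated points
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                           - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                           - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```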
SVM Applications(1)
- Applying to Face Detection
- Applying to Face Recognition
- Applying to Text Region Detection
- Other Applications
SVM Applications(2): Applying to Face Detection
References:
- "Training Support Vector Machines: an Application to Face Detection", Edgar Osuna, MIT
- "Support Vector Machines: Training and Applications", Edgar Osuna, MIT
Pipeline (sketched below):
1. Rescale the image several times
2. Cut out 19x19 window patterns
3. Preprocess: light correction, histogram equalization
4. Classify each window using an SVM with a polynomial kernel of degree 2
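A schematic of this pipeline (a sketch only: `svm` stands for any trained classifier with a `predict` method, `preprocess` for the light-correction and equalization step, and the crude nearest-neighbor rescaling is purely illustrative):

```python
import numpy as np

def rescale(img, factor):
    """Crude nearest-neighbor rescaling (illustrative only)."""
    h, w = img.shape
    ys = (np.arange(int(h * factor)) / factor).astype(int)
    xs = (np.arange(int(w * factor)) / factor).astype(int)
    return img[np.ix_(ys, xs)]

def detect_faces(img, svm, preprocess, win=19, step=2):
    """Scan 19x19 windows over an image pyramid; classify each with the SVM."""
    detections = []
    for factor in (1.0, 0.8, 0.64, 0.51):            # rescale the image several times
        scaled = rescale(img, factor)
        h, w = scaled.shape
        for y0 in range(0, h - win + 1, step):
            for x0 in range(0, w - win + 1, step):
                window = preprocess(scaled[y0:y0 + win, x0:x0 + win])
                if svm.predict(window.reshape(1, -1))[0] == 1:
                    # map the hit back to original-image coordinates
                    detections.append((x0 / factor, y0 / factor, win / factor))
    return detections
```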
SVM Applications(3): Applying to Face Recognition
Ref: "Face Recognition under Various Lighting Conditions and Expression using Support Vector Machines", Kim Jae-jin and Lee Seong-whan, Center for Artificial Vision Research, Korea University, 1999
The SVM is basically a 2-class classifier, while face recognition is usually a multi-class problem; two reductions are used (one is sketched below):
- One-vs-the-others method
- Pair-wise method
Results (correct recognition rate) on the Yale Face Database (15 persons, 11 images per person, with varying lighting, expression, and glasses):
- Lighting conditions: 94.7%~98.0%
- Expression: 99.4%~100%
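A sketch of the one-vs-the-others reduction (assuming scikit-learn's SVC as the binary SVM; any binary SVM implementation would do, and all parameters are illustrative):

```python
import numpy as np
from sklearn.svm import SVC

class OneVsRestSVM:
    """Train one binary SVM per class; predict the class whose
    classifier returns the largest decision value."""
    def __init__(self, **svc_params):
        self.svc_params = svc_params
        self.models, self.classes = [], None

    def fit(self, X, y):
        self.classes = np.unique(y)
        for c in self.classes:
            m = SVC(**self.svc_params)
            m.fit(X, np.where(y == c, 1, -1))   # class c vs. all the others
            self.models.append(m)
        return self

    def predict(self, X):
        scores = np.stack([m.decision_function(X) for m in self.models], axis=1)
        return self.classes[np.argmax(scores, axis=1)]
```

The pair-wise method instead trains one classifier per pair of classes, K(K-1)/2 in total, and predicts by voting.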
SVM Applications(4): Applying to Caption Detection
Ref: "Support vector machine-based text detection in digital video", Pattern Recognition, 2001, Kim Jae-jin, Kyungpook National University
Experimental setup: 2,000 frames from Korean news shots; 500 used for training and 1,500 for testing.
Results: 94.3% of text regions detected, with 86 false alarms.
SVM Applications(5): Other Applications
- Handwritten digit recognition
- Text categorization ("Using SVMs for Text Categorization", Susan Dumais, Microsoft Research)
- 3D object recognition ("SVMs for 3D Object Recognition", Pontil, MIT)
- Face pose recognition
- Color-based classification
- Bioinformatics (protein homology detection): a sequence $x$ is represented by the gradient of the HMM log-likelihood with respect to the model parameters $\theta$, i.e. the Fisher score $U_x = \nabla_\theta \log P(x \mid \theta)$, and an SVM is trained on these features.
Conclusion
SVMs deliver good performance in a variety of applications such as pattern recognition, regression estimation, and time series prediction. But some issues remain open:
1. Speeding up the quadratic programming training method (both the time complexity and the storage requirements grow as the training data grow).
2. The choice of kernel function: there are no general guidelines.
Other kernel methods: Kernel PCA performs nonlinear PCA by carrying out linear PCA in feature space; its architecture is nearly the same as that of an SVM (a sketch follows).
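A minimal kernel PCA sketch (assuming NumPy; the RBF kernel and its width are illustrative choices):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Nonlinear PCA: linear PCA on the centered kernel matrix."""
    # RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # center the kernel matrix in feature space
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigendecompose and keep the leading components
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # project the training points onto the principal directions in feature space
    return Kc @ (vecs / np.sqrt(np.maximum(vals, 1e-12)))
```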