A TUTORIAL ON SUPPORT VECTOR MACHINES FOR PATTERN RECOGNITION ASLI TAŞÇI Christopher J.C. Burges, Data Mining and Knowledge Discovery 2, 1998
OUTLINE Introduction Linear Support Vector Machines Nonlinear Support Vector Machines Limitations Conclusion
INTRODUCTION Classification and regression tool Supervised learning Performs both linear and non-linear classification
APPLICATION AREAS Handwritten Digit Recognition Object Recognition Speaker Identification Text Categorization Face Detection in Images
LINEAR SUPPORT VECTOR MACHINES Simplest Case: Separable Data SVM Equation Lagrangian
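The slide's equation images did not survive extraction; the separable-case formulation from the tutorial, for training pairs $\{\mathbf{x}_i, y_i\}$, $i = 1, \dots, l$, $y_i \in \{-1, +1\}$, is:

```latex
% Separating hyperplane w . x + b = 0; maximize the margin by solving
\min_{\mathbf{w}, b} \; \frac{1}{2}\|\mathbf{w}\|^{2}
\quad \text{subject to} \quad
y_i(\mathbf{x}_i \cdot \mathbf{w} + b) - 1 \ge 0, \quad i = 1, \dots, l.

% Primal Lagrangian with multipliers \alpha_i \ge 0:
L_P = \frac{1}{2}\|\mathbf{w}\|^{2}
      - \sum_{i=1}^{l} \alpha_i \, y_i(\mathbf{x}_i \cdot \mathbf{w} + b)
      + \sum_{i=1}^{l} \alpha_i .
```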
KARUSH-KUHN-TUCKER CONDITIONS Constrained optimization
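For the separable primal problem, the KKT conditions (stationarity, feasibility, non-negativity, and complementary slackness) take the standard form:

```latex
\frac{\partial L_P}{\partial \mathbf{w}} = \mathbf{w} - \sum_{i} \alpha_i y_i \mathbf{x}_i = 0,
\qquad
\frac{\partial L_P}{\partial b} = -\sum_{i} \alpha_i y_i = 0,

y_i(\mathbf{x}_i \cdot \mathbf{w} + b) - 1 \ge 0,
\qquad
\alpha_i \ge 0,
\qquad
\alpha_i \bigl[ y_i(\mathbf{x}_i \cdot \mathbf{w} + b) - 1 \bigr] = 0 .
```

The complementary slackness condition implies that only points with $\alpha_i > 0$, which lie exactly on the margin, contribute to $\mathbf{w}$; these are the support vectors.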
NON-SEPARABLE CASE Introducing slack variables for a feasible solution with a linear SVM Lagrangian for non-separable data
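With slack variables $\xi_i \ge 0$ relaxing the margin constraints and a penalty parameter $C$, the soft-margin objective becomes:

```latex
\min_{\mathbf{w}, b, \boldsymbol{\xi}} \;
\frac{1}{2}\|\mathbf{w}\|^{2} + C \sum_{i=1}^{l} \xi_i
\quad \text{subject to} \quad
y_i(\mathbf{x}_i \cdot \mathbf{w} + b) \ge 1 - \xi_i,
\quad \xi_i \ge 0 .
```

In the resulting dual, the only change from the separable case is that the multipliers acquire an upper bound: $0 \le \alpha_i \le C$.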
NONLINEAR SUPPORT VECTOR MACHINES Mapping data to a higher-dimensional feature space Example Kernel Function
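A minimal sketch of two kernels discussed in the tutorial, the homogeneous-plus-one polynomial kernel and the Gaussian RBF kernel; the function names and default parameters here are illustrative choices, not from the source:

```python
import math

def dot(x, y):
    """Standard inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(x, y))

def polynomial_kernel(x, y, p=2):
    """K(x, y) = (x . y + 1)^p -- implicit map to all monomials up to degree p."""
    return (dot(x, y) + 1) ** p

def rbf_kernel(x, y, sigma=1.0):
    """K(x, y) = exp(-||x - y||^2 / (2 sigma^2)) -- infinite-dimensional feature space."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2 * sigma ** 2))

x, y = [1.0, 0.0], [0.0, 1.0]
print(polynomial_kernel(x, y))  # (0 + 1)^2 = 1.0
print(rbf_kernel(x, x))         # zero distance -> 1.0
```

The point of the "kernel trick" is that both functions evaluate an inner product in the feature space while only ever touching the input vectors, so the mapping itself is never computed explicitly.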
MERCER’S CONDITION Positive Semi-definite
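Mercer's condition states when a function $K$ is a valid kernel, i.e. an inner product in some feature space. There exist maps $\phi_i$ with $K(\mathbf{x}, \mathbf{y}) = \sum_i \phi_i(\mathbf{x})\,\phi_i(\mathbf{y})$ if and only if:

```latex
\int\!\!\int K(\mathbf{x}, \mathbf{y})\, g(\mathbf{x})\, g(\mathbf{y})\,
d\mathbf{x}\, d\mathbf{y} \;\ge\; 0
\quad \text{for every } g \text{ with } \int g(\mathbf{x})^{2}\, d\mathbf{x} < \infty .
```

Equivalently, every Gram matrix built from $K$ must be positive semi-definite.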
OPTIMIZATION PROBLEM Quadratic programming optimization
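The quadratic program actually solved in training is the dual, written here with a kernel $K$ so it covers both the linear and nonlinear machines:

```latex
\max_{\boldsymbol{\alpha}} \; L_D =
\sum_{i=1}^{l} \alpha_i
- \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l}
  \alpha_i \alpha_j \, y_i y_j \, K(\mathbf{x}_i, \mathbf{x}_j)
\quad \text{subject to} \quad
0 \le \alpha_i \le C, \qquad \sum_{i=1}^{l} \alpha_i y_i = 0 .
```

Because the objective is concave and the constraints are linear, any solution found is a global one.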
TRAINING Decomposition algorithms for larger problems Chunking method Osuna’s decomposition algorithm
LIMITATIONS Choice of the Kernel Speed Size Discrete Data Multi-class classification
PERFORMANCE OF SVM The virtual support vector method: train the system, then create new data by distorting the resulting support vectors and retrain The reduced set method: increases the speed of the SVM
CONCLUSION A new approach to the problem of pattern recognition SVM training always finds a global minimum Performance is largely characterized by the choice of kernel
THANK YOU FOR LISTENING