Machine Learning Using Support Vector Machines (Paper Review) Presented to: Prof. Dr. Mohamed Batouche Prepared By: Asma B. Al-Saleh Amani A. Al-Ajlan.

Machine Learning Using Support Vector Machines (Paper Review) Presented to: Prof. Dr. Mohamed Batouche Prepared By: Asma B. Al-Saleh Amani A. Al-Ajlan King Saud University The College of Computer & Information Science Computer Science Department (Master) Neural Networks and Machine Learning Applications (CSC 563 ) Spring 2008

Paper Information  Title: Machine Learning Using Support Vector Machines.  Authors: Abdul Rahim Ahmad. Marzuki Khalid. Rubiyah Yusof.  Publisher: MSTC 2002, Johor Bahru.  Date: September 2002.

Review Outline  Introduction  Artificial Neural Network (ANN)  Support Vector Machine (SVM): Support Vectors, Theory of SVM, Quadratic Programming, Non-linear SVM, SVM Implementations, SVM for Multi-class Classification  Handwriting Recognition: Experimental Results  ANN vs. SVM  Conclusion

Introduction

 The aim of this paper is to: present SVM in comparison with ANN, and explain the concept of SVM by providing some of its details.

Machine Learning

Machine Learning (ML)  Constructing computer programs that automatically improve their performance with experience.

Machine Learning (ML) Applications 1. Data mining programs. 2. Information filtering systems. 3. Autonomous vehicles. 4. Pattern recognition systems: speech recognition, handwriting recognition, face recognition, text categorization.

Artificial Neural Network

Artificial Neural Network (ANN)  Massively parallel computing systems consisting of an extremely large number of simple processors with many interconnections.

Artificial Neural Network (ANN)  The main characteristics of ANN are: 1.Ability to learn complex nonlinear input- output relationships 2.They use sequential training procedures to updating (adapt) network architecture and connection weights so that a network can work efficiently.

Artificial Neural Network (ANN)  In the area of pattern classification, the feed-forward network is most popularly used. Data Clustering Pattern Classification  Kohonen Self- Organizing Map (SOM)  Multilayer perceptron (MLP)  Radial-BasisFunction (RBF) networks

ANN and Pattern Recognition  ANNs have low dependence on domain-specific knowledge compared to rule-based approaches.  Efficient learning algorithms are available.

Support Vector Machine

Support Vector Machine – (SVM)  SVM was introduced in 1992 by Vapnik and his coworkers.  SVM original form: Binary classifier; separates between two classes. Design for linear and separable data set.  SVM used for classification and regression.

Support Vector Machine – (SVM)

Theory of SVM  Constraints: 1. No data points lie between the hyperplanes $H_1$ ($w \cdot x + b = +1$) and $H_2$ ($w \cdot x + b = -1$). 2. The margin between $H_1$ and $H_2$ is maximized.

Support Vectors  The solution is expressed as a linear combination of the support vectors: $w = \sum_i \alpha_i y_i x_i$ (where the coefficients $\alpha_i$ are the Lagrange multipliers introduced below). The support vectors are the subset of training patterns that lie closest to the decision boundary.

Theory of SVM  training data: {,, ……, } Where: SVM Class or label Input features

Theory of SVM  Class 1 ($y_i = +1$): $w \cdot x_i + b \ge +1$.  Class 2 ($y_i = -1$): $w \cdot x_i + b \le -1$.

Theory of SVM  Learn a linear separating hyper plane classifier:

Quadratic Programming  To maximize the margin, we need to minimize $\frac{1}{2}\|w\|^2$ subject to $y_i (w \cdot x_i + b) \ge 1$ for all $i$.  This quadratic programming problem is solved by introducing Lagrange multipliers $\alpha_i \ge 0$.

Lagrange Multipliers  Maximize $L(\alpha) = \sum_i \alpha_i - \frac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j (x_i \cdot x_j)$, where $w$ and $b$ have been eliminated, subject to $\alpha_i \ge 0$ and $\sum_i \alpha_i y_i = 0$.

Theory of SVM  Discriminate function:

Non-linear SVM  1. SVM maps the data from the input space into a higher-dimensional feature space. 2. The linear, large-margin learning algorithm is then applied in that feature space.

Non-linear SVM

 If the mapping function is $\phi$, we just solve the same problem with $\phi(x_i) \cdot \phi(x_j)$ in place of $x_i \cdot x_j$.  However, the mapping can be done implicitly by a kernel function: $K(x_i, x_j) = \phi(x_i) \cdot \phi(x_j)$.
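A small illustrative check of this idea (not from the paper): for the degree-2 polynomial kernel $K(x, z) = (x \cdot z)^2$ in two dimensions, the explicit map $\phi(x) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$ gives the same value as the kernel, which never forms $\phi$ explicitly.

```python
# A minimal sketch (not from the paper): the kernel trick for a degree-2
# polynomial kernel K(x, z) = (x . z)^2 in two dimensions. The explicit map is
# phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2); the kernel computes phi(x) . phi(z)
# without ever forming phi.
import numpy as np

def phi(x):
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

def poly2_kernel(x, z):
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
print(np.dot(phi(x), phi(z)))   # 1.0  -> (1*3 + 2*(-1))^2 = 1
print(poly2_kernel(x, z))       # 1.0
```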

Non-linear SVM  Discriminate function:

Kernel  There are many kernels that can be used that way.  Any kernel that satisfies Mercer’s condition can be used.

Kernel - Examples  Polynomial kernels: $K(x, z) = (x \cdot z + 1)^p$  Hyperbolic tangent: $K(x, z) = \tanh(\kappa\, x \cdot z - \delta)$  Radial basis function (Gaussian kernel): $K(x, z) = \exp(-\|x - z\|^2 / 2\sigma^2)$
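These three kernels can be written directly as small functions; the sketch below is illustrative only, and the parameter values (p, kappa, delta, sigma) are arbitrary defaults, not values from the paper.

```python
# A minimal sketch (not from the paper): the three example kernels written as
# plain Python functions, with illustrative default parameters.
import numpy as np

def polynomial_kernel(x, z, p=3):
    return (np.dot(x, z) + 1) ** p

def tanh_kernel(x, z, kappa=1.0, delta=1.0):
    return np.tanh(kappa * np.dot(x, z) - delta)

def rbf_kernel(x, z, sigma=1.0):
    return np.exp(-np.linalg.norm(x - z) ** 2 / (2 * sigma ** 2))

x, z = np.array([1.0, 0.0]), np.array([0.5, 0.5])
print(polynomial_kernel(x, z), tanh_kernel(x, z), rbf_kernel(x, z))
```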

Non-separable Input Space  In real world problem, there is always noise.  Noise  Non-separable data.  Slack variables are introduced to each input:  Penalty Parameter C: control overfitting.

Non-separable Input Space

SVM for Multi-class Classification  The basic SVM is a binary classifier: it separates two classes. In the real world, more than two classes are usually needed, e.g. handwriting recognition.

SVM for Multi-class Classification  Methods (see the sketch below): 1. Modify the binary formulation to incorporate multi-class learning directly. 2. Combine binary classifiers: One vs. One, which requires $K(K-1)/2$ classifiers, or One vs. All, which requires $K$ classifiers.
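The two classifier-combination schemes can be sketched with scikit-learn's multiclass wrappers (an assumption of this example, not a tool used in the paper) on a three-class problem.

```python
# A minimal sketch (not from the paper): the two classifier-combination
# schemes above, built from binary SVMs with scikit-learn's multiclass
# wrappers on a 3-class problem (K = 3).
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)          # K = 3 classes

ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X, y)   # K(K-1)/2 = 3 binary SVMs
ova = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y)  # K = 3 binary SVMs

print("one-vs-one estimators:", len(ovo.estimators_))   # 3
print("one-vs-all estimators:", len(ova.estimators_))   # 3
```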

SVM for Multi-class Classification  One vs. One and DAGSVM (Directed Acyclic Graph) are the best choices for practical use.  they are: Less complex Easy to construct Faster to train. Tapia et al, 2005

SVM Implementation  Training an SVM requires solving a quadratic programming (QP) problem, which is computationally intensive.  However, many decomposition methods have been proposed that break the full QP into smaller sub-problems and make SVM learning practical for many current problems.  Example: Sequential Minimal Optimization (SMO).
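For a rough, illustrative sense of practicality (not a result from the paper): scikit-learn's SVC wraps LIBSVM, which solves the QP with an SMO-type decomposition method, so a few thousand training samples are typically handled in seconds on ordinary hardware. The data and sizes below are arbitrary.

```python
# A minimal sketch (not from the paper): timing SVM training with scikit-learn's
# SVC, which wraps LIBSVM; LIBSVM solves the QP with an SMO-type decomposition
# method rather than a general-purpose QP solver.
import time
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(3)
X = np.vstack([rng.randn(1000, 10) - 1, rng.randn(1000, 10) + 1])
y = np.array([-1] * 1000 + [+1] * 1000)

start = time.time()
SVC(kernel="rbf", C=1.0).fit(X, y)
print(f"trained on {X.shape[0]} samples in {time.time() - start:.2f} s")
```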

Results of Experimental Studies

Data  Handwritten digit database: MNIST dataset. USPS dataset.  more difficult; human recognition error rate is as high as 2.5%.

Error rate comparison of ANN, SVM, and other algorithms for the MNIST and USPS databases.

Results of Experimental Studies  1. The SVM error rate is significantly lower than that of most other algorithms, except the LeNet-5 neural network. 2. Training time for SVM was significantly longer, but the higher recognition rate (lower error rate) justifies its use. 3. SVM usage should increase and replace ANN in the area of handwriting recognition, now that faster methods of implementing SVM have been introduced.

SVM vs. ANN
SVM: a multi-class implementation needs to be built on top of the binary classifier; does not overfit the data (Structural Risk Minimization); finds a global minimum.
ANN: naturally handles multi-class classification; known to overfit the data unless cross-validation is applied; may converge to a local minimum.

Conclusion  SVM is Powerful and is a useful alternative to neural networks.  SVM find Global, unique solution.  Two key concepts of SVM: maximize the margin and choice of kernel.  Performance depends on choice of kernel and parameters  Still a research topic.  Training is memory-intensive due to QP.

Conclusion  Many active research is taking place on areas related to SVM.  Many SVM implementations are available on the Web:  SVMLight  LIBSVM

Thank you ….. Questions?