Part II Support Vector Machine Algorithms
Outline
Some variants of SVM
Relevant algorithms
Usage of the algorithms
SVM Algorithms
Famous SVM implementations:
LibSVM: http://www.csie.ntu.edu.tw/~cjlin/libsvm/
SVM-Light: http://svmlight.joachims.org/
General data mining tools that contain SVM:
Spider toolbox (Matlab): http://www.kyb.tuebingen.mpg.de/bs/people/spider
WEKA toolbox (Java): http://www.cs.waikato.ac.nz/~ml/weka/index.html
Variants in LibSVM
We use LibSVM as an example:
C-SVC
nu-SVC
One-class SVM
Epsilon-SVR
Nu-SVR
C-SVC
The dual is:
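The slide's equation was an image that did not survive extraction. The standard C-SVC dual, as given in the LibSVM documentation, is:

```latex
\min_{\alpha}\;\; \frac{1}{2}\,\alpha^{T} Q \alpha \;-\; e^{T}\alpha
\qquad \text{subject to} \qquad
y^{T}\alpha = 0,\quad 0 \le \alpha_i \le C,\;\; i = 1,\dots,l,
```

where $Q_{ij} = y_i\, y_j\, K(x_i, x_j)$, $e$ is the all-ones vector, and $l$ is the number of training examples.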
nu-SVC
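The slide's formula is missing; Schölkopf et al.'s standard ν-SVC dual, in which the parameter ν replaces C, can be stated as:

```latex
\min_{\alpha}\;\; \frac{1}{2}\,\alpha^{T} Q \alpha
\qquad \text{subject to} \qquad
y^{T}\alpha = 0,\quad e^{T}\alpha \ge \nu,\quad 0 \le \alpha_i \le \tfrac{1}{l},
```

with $Q_{ij} = y_i\, y_j\, K(x_i, x_j)$ as in C-SVC. Here ν is both an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors.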
One-class SVM
The dual is:
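The equation image is missing here as well; the standard one-class SVM dual (Schölkopf et al., as implemented in LibSVM) is:

```latex
\min_{\alpha}\;\; \frac{1}{2}\,\alpha^{T} Q \alpha
\qquad \text{subject to} \qquad
0 \le \alpha_i \le \tfrac{1}{\nu l},\quad e^{T}\alpha = 1,
```

where $Q_{ij} = K(x_i, x_j)$; there are no labels $y_i$, since the task is to estimate the support of the data distribution.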
Epsilon-SVR
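The missing dual for epsilon-SVR, in the standard form used by LibSVM, is:

```latex
\min_{\alpha,\,\alpha^{*}}\;\;
\frac{1}{2}\,(\alpha - \alpha^{*})^{T} Q\, (\alpha - \alpha^{*})
\;+\; \varepsilon\, e^{T}(\alpha + \alpha^{*})
\;-\; z^{T}(\alpha - \alpha^{*})
```
```latex
\text{subject to} \qquad
e^{T}(\alpha - \alpha^{*}) = 0,\quad
0 \le \alpha_i,\, \alpha_i^{*} \le C,
```

where $Q_{ij} = K(x_i, x_j)$ and $z$ is the vector of regression targets. ε is the width of the insensitive tube set by the `-p` option of svm-train.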
nu-SVR
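For completeness, the missing ν-SVR dual in Schölkopf et al.'s standard form (the tube width ε becomes a variable controlled by ν) can be stated as:

```latex
\min_{\alpha,\,\alpha^{*}}\;\;
\frac{1}{2}\,(\alpha - \alpha^{*})^{T} Q\, (\alpha - \alpha^{*})
\;-\; z^{T}(\alpha - \alpha^{*})
```
```latex
\text{subject to} \qquad
e^{T}(\alpha - \alpha^{*}) = 0,\quad
e^{T}(\alpha + \alpha^{*}) \le C\nu,\quad
0 \le \alpha_i,\, \alpha_i^{*} \le \tfrac{C}{l},
```

with $Q_{ij} = K(x_i, x_j)$ and targets $z$ as in epsilon-SVR; at the optimum the second constraint is active, which is what ties ν to the fraction of points outside the tube.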
Commands of LibSVM
svm-toy: an illustration tool
svm-train: the training algorithm
svm-predict: the test algorithm
svm-scale: a tool for scaling data
Usage of svm-train
Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
    0 -- C-SVC
    1 -- nu-SVC
    2 -- one-class SVM
    3 -- epsilon-SVR
    4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
    0 -- linear: u'*v
    1 -- polynomial: (gamma*u'*v + coef0)^degree
    2 -- radial basis function: exp(-gamma*|u-v|^2)
    3 -- sigmoid: tanh(gamma*u'*v + coef0)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/k)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
-v n : n-fold cross validation mode
Example of svm-train (cont.)
svm-train -s 0 -c 1000 -t 2 -g 0.5 -e 0.00001 data_file
    Train a classifier with RBF kernel exp(-0.5|u-v|^2) and stopping tolerance 0.00001.
svm-train -s 3 -p 0.1 -t 0 -c 10 data_file
    Solve SVM regression with linear kernel u'v, C = 10, and epsilon = 0.1 in the loss function.
svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file
    Train a classifier with penalty 10 for class 1 and penalty 50 for class -1.
svm-train -s 0 -c 500 -g 0.1 -v 5 data_file
    Do five-fold cross validation for the classifier using the parameters C = 500 and gamma = 0.1.
svm-train -s 0 -b 1 data_file
    Train a classifier with probability estimates.
Usage of svm-predict
Usage: svm-predict [options] test_file model_file output_file
options:
-b probability_estimates : whether to predict probability estimates, 0 or 1 (default 0); one-class SVM not supported yet
Examples:
svm-train -c 100 -t 0 .\data\breast_cancer.train
svm-predict .\data\breast_cancer.test .\data\breast_cancer.train.model breast_cancer.res
Usage of svm-scale
Usage: svm-scale [-l lower] [-u upper] [-y y_lower y_upper] [-s save_filename] [-r restore_filename] filename
(default: lower = -1, upper = 1, no y scaling)
svm-scale -l 0 -u 1 -s range datasetx\svmguide3 > svmguide3.scale
svm-scale -r range datasetx\svmguide3.t > svmguide3t.scale
Usage of SVM-Light
Commands:
svm_learn
svm_classify
Usage of SVM-Light
svm_learn [options] example_file model_file
Available options are:
Learning options:
-z {c,r,p} - select between classification (c), regression (r), and preference ranking (p) (default classification)
-c float - C: trade-off between training error and margin (default [avg. x*x]^-1)
-w [0..] - epsilon width of tube for regression (default 0.1)
-j float - Cost: cost-factor by which training errors on positive examples outweigh errors on negative examples (default 1)
-b [0,1] - use biased hyperplane (i.e. x*w+b>0) instead of unbiased hyperplane (i.e. x*w>0) (default 1)
-i [0,1] - remove inconsistent training examples and retrain (default 0)
Usage of SVM-Light
Transduction options:
-p [0..1] - fraction of unlabeled examples to be classified into the positive class (default is the ratio of positive and negative examples in the training data)
Kernel options:
-t int - type of kernel function:
    0: linear (default)
    1: polynomial (s a*b+c)^d
    2: radial basis function exp(-gamma ||a-b||^2)
    3: sigmoid tanh(s a*b + c)
    4: user defined kernel from kernel.h
-d int - parameter d in polynomial kernel
-g float - parameter gamma in rbf kernel
-s float - parameter s in sigmoid/poly kernel
-r float - parameter c in sigmoid/poly kernel
-u string - parameter of user defined kernel
Example of SVM-Light
SVM-Light exploits the sparsity of data, which makes it especially useful for text classification.
svm_learn -c 100 .\data\train.dat .\data\model.dat
svm_classify .\data\test.dat .\data\model.dat .\data\res.dat
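The sparsity mentioned above shows up in SVM-Light's input format: each line holds one example as a target followed by only the nonzero features, written as feature:value pairs in increasing feature order, with an optional trailing comment. A small hypothetical train.dat (feature indices and values invented for illustration) might look like:

```
+1 1:0.43 7:0.12 308:1.0 # positive example
-1 3:0.25 12:0.94 4601:0.3 # negative example
```

For text classification with bag-of-words features, where most of the vocabulary is absent from any given document, this representation keeps files small and training fast.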
Illustration of SVM in Spider
% Clear all variables/definitions
>> clear classes
% Generate 200 points from a toy data object and have a look at it
>> d=gen(toy2d('uneven_gauss','l=200'));
>> plot(d);
Illustration of SVM in Spider
% Create a C-SVM object with RBF kernel (sigma = 1) and C = Inf
>> s=svm({kernel('rbf',1)});
% Train the SVM, obtaining the training result r and the trained object a
>> [r,a]=train(s,d);
% Evaluate the training error and visualize the trained object
>> loss(r)
>> plot(a)