SVM Lab material borrowed from tutorial by David Meyer FH Technikum Wien, Austria see:

Packages
# Start by loading the relevant libraries:
# e1071 (provides svm) and mlbench (provides the Glass data)
# If mlbench isn't available, install it first with install.packages("mlbench")
library(e1071)
library(mlbench)

Glass Dataset
# Retrieve/access the "Glass" data from the mlbench package
data(Glass, package = "mlbench")
# The description of the Glass data set is on the following slide
# Number of Attributes: 10 (including an Id#) plus the class
# attribute -- all attributes are continuously valued

Attribute Information:
1. Id number: 1 to 214
2. RI: refractive index
3. Na: Sodium (unit measurement: weight percent in corresponding oxide, as are attributes 4-10)
4. Mg: Magnesium
5. Al: Aluminum
6. Si: Silicon
7. K: Potassium
8. Ca: Calcium
9. Ba: Barium
10. Fe: Iron
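Before modeling, it can be worth confirming this structure in the loaded data; a minimal sketch using base R (note: in the mlbench version of Glass, the Id column is dropped, leaving the 9 chemical predictors plus the class factor Type as column 10):

```r
# Load the data and inspect its structure
data(Glass, package = "mlbench")
str(Glass)           # 9 numeric predictors plus the factor Type
summary(Glass$Type)  # observation counts per glass type
ncol(Glass)          # 10 columns: Type is column 10
```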

Class Information: Type of glass (class attribute):
1 building_windows_float_processed
2 building_windows_non_float_processed
3 vehicle_windows_float_processed
4 vehicle_windows_non_float_processed (none in this database)
5 containers
6 tableware
7 headlamps

Create Training and Test Sets
# Create a row index
index <- 1:nrow(Glass)
# Create an index of test samples by randomly selecting 1/3 of the samples
testindex <- sample(index, trunc(length(index)/3))
# Create test set
testset <- Glass[testindex, ]
# Create training set
trainset <- Glass[-testindex, ]
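Because sample() draws at random, each run produces a different split. For a reproducible split, seed the random number generator first; a minimal sketch (the seed value 123 is arbitrary, chosen for illustration):

```r
data(Glass, package = "mlbench")
# Seeding the RNG makes the random train/test split reproducible
set.seed(123)
index <- 1:nrow(Glass)
testindex <- sample(index, trunc(length(index)/3))
testset <- Glass[testindex, ]
trainset <- Glass[-testindex, ]
# The two sets partition the data: together they cover every row once
nrow(testset) + nrow(trainset) == nrow(Glass)
</imports>
```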

Train the SVM Model
# Train the svm model using:
# "Type" (column 10) as the dependent variable,
# cost = 100 as the penalty cost for C-classification
# (this is the 'C' constant of the regularization term in
# the Lagrange formulation), and
# gamma = 1 as the radial basis kernel function-specific parameter
svm.model <- svm(Type ~ ., data = trainset, cost = 100, gamma = 1)

Apply the SVM Model
# Use the SVM to predict the classification for the testset
svm.pred <- predict(svm.model, testset[, -10])
# Compute the SVM confusion matrix
table(pred = svm.pred, true = testset[, 10])
# Determine accuracy: correct predictions (diagonal) over all predictions
t <- table(pred = svm.pred, true = testset[, 10])
sum(diag(t)) / sum(t)

Optimize Parameters
# Approach: grid search with 10-fold cross-validation
# Note: a random mixing precedes the partitioning of the data
# Optimize parameters to the svm with RBF kernel
# The grid search iterates with gamma = 2^-4 through 2^1
# and cost = 2^1 through 2^7
# The returned object reports the best gamma & cost
# and the corresponding classification error
obj <- tune.svm(Type ~ ., data = Glass, gamma = 2^(-4:1), cost = 2^(1:7))

Optimize Parameters
# Inspect the results
# Note: the results will vary unless you set the seed for the
# random number generator, which is used to mix the data
# before the partitioning
> obj
Parameter tuning of 'svm':
- sampling method: 10-fold cross validation
- best parameters:
  gamma cost
- best performance:
# Note: the performance is reported as the error;
# the accuracy is 1 - error (here, 1 minus the reported best performance)

Another online resource is: