LECTURE 21: SUPPORT VECTOR MACHINES PT. 2 April 13, 2016 SDS 293 Machine Learning

Announcements 1/2
- A8 due tonight by 11:59pm
- FP2: Dataset & Initial Model due Friday by 4:59pm
- Remaining written homework assignments cancelled; focus on your final project deliverables

Announcements 2/2
- FP poster template is now available on Moodle
- You're welcome to make changes / reformat; just please keep it 3'x4' portrait
- Printing:
  - Option 1: give the PDF to Jordan on or before April 21st
  - Option 2: arrange printing on your own (Paradise Copies, etc.)

Outline
- Maximal margin classifier
- Support vector classification
  - 2 classes, linear boundaries
  - 2 classes, nonlinear boundaries
- Multiple classes
- Comparison to other methods
- Lab quick recap

Recap: maximal margin classifier
Big idea: find the dividing hyperplane with the widest margin.
Problem: sometimes we can't be perfect…

Recap: support vector classifier
Big idea: we might be willing to sacrifice a few in order to give the rest a better margin.
(Figure annotation: bigger value = further from margin = more confident.)
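A minimal sketch of this "sacrifice a few" trade-off, assuming scikit-learn's `SVC` (the slides themselves don't include code): the cost parameter `C` controls how heavily margin violations are penalized, so a small `C` tolerates more points inside or across the margin in exchange for a wider margin overall.

```python
# Sketch only: varying C trades margin width against margin violations.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])  # overlapping classes
y = np.array([0] * 50 + [1] * 50)

for C in [0.01, 1, 100]:
    clf = SVC(kernel="linear", C=C).fit(X, y)
    # Smaller C -> more tolerance for violations -> more support vectors
    print(f"C={C}: {clf.n_support_.sum()} support vectors")
```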

Recap: support vectors
The decision rule is based only on the support vectors => the SVC is robust to strange behavior far from the hyperplane!

Recap: kernels (linear, polynomial, radial)
Big idea: using different ways of measuring "similarity" allows you to partition the feature space in different ways.
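A quick sketch of that idea, again assuming scikit-learn's `SVC` rather than the lab's own code: swapping the `kernel` argument changes how similarity is measured, and therefore the shape of the decision boundary.

```python
# Sketch: the same classifier with three different kernels partitions
# the feature space in three different ways.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ["linear", "poly", "rbf"]:        # linear / polynomial / radial
    clf = SVC(kernel=kernel, C=1.0).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))    # nonlinear kernels fit the curved classes better
```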

Discussion
Question: what's the problem?
Answer: regardless of kernel shape, an SVM can only divide the data into two parts… but real-world data often has more than two classes.
So what can we do?

Goal: assign observation to 1 of 4 groups

One-versus-one classification
Big idea: build an SVM for each pair of classes.
To predict: classify each observation using all of the (k choose 2) classifiers, keep track of how many times the observation is assigned to each class: majority wins.
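A minimal Python sketch of the one-versus-one idea (scikit-learn and the toy data are my assumptions, not part of the lecture): fit one binary SVM per pair of classes, then let the pairwise winners vote.

```python
# Sketch: one-versus-one voting with k classes -> k*(k-1)/2 binary SVMs.
from itertools import combinations
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=4, random_state=0)   # 4 classes, labels 0..3

pairwise = {}
for a, b in combinations(np.unique(y), 2):                     # (k choose 2) pairs
    mask = np.isin(y, [a, b])
    pairwise[(a, b)] = SVC(kernel="linear").fit(X[mask], y[mask])

def predict_ovo(x):
    votes = np.zeros(len(np.unique(y)), dtype=int)
    for clf in pairwise.values():
        votes[int(clf.predict(x.reshape(1, -1))[0])] += 1      # each pair casts one vote
    return votes.argmax()                                      # majority wins

print(predict_ovo(X[0]), y[0])
```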

One-versus-all classification
Big idea: build an SVM for each class (that class vs. everything else).
To predict: classify each observation using each of the k classifiers, keep track of how confident we are of each prediction: the most confident class wins.
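And a corresponding sketch of one-versus-all (same caveats as above): fit one SVM per class against everything else, then pick the class whose classifier is most confident, i.e. has the largest decision-function value.

```python
# Sketch: one-versus-all with k classes -> k binary SVMs.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=4, random_state=0)
classes = np.unique(y)

# One classifier per class: "this class" (1) vs. "everything else" (0).
per_class = {c: SVC(kernel="linear").fit(X, (y == c).astype(int)) for c in classes}

def predict_ova(x):
    # decision_function gives the signed distance from the hyperplane = confidence
    scores = [per_class[c].decision_function(x.reshape(1, -1))[0] for c in classes]
    return classes[int(np.argmax(scores))]       # most confident class wins

print(predict_ova(X[0]), y[0])
```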

Quick history lesson
Mid-90s: SVMs come on the scene. Reaction: OoOoOoooOooooohh…
- Good performance
- "Mysterious": felt entirely different from classical approaches

Flashback: loss functions
Many of the methods we've seen so far take the form: "loss" (consequence for a poor prediction) + "penalty" (consequence for being complicated).
With a little manipulation, we can rewrite our SVM in the same form, using the hinge loss.
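Written out in the notation of ISLR Section 9.5 (the course textbook cited for the lab), the generic "loss + penalty" form and the SVM's hinge-loss form are:

```latex
% Generic "loss + penalty" form:
\min_{\beta}\; \Big\{ L(\mathbf{X}, \mathbf{y}, \beta) \;+\; \lambda\, P(\beta) \Big\}

% The support vector classifier rewritten with the hinge loss (ISLR Eq. 9.25):
\min_{\beta_0, \ldots, \beta_p}\;
\Big\{ \sum_{i=1}^{n} \max\!\big[\, 0,\; 1 - y_i\,(\beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip}) \,\big]
\;+\; \lambda \sum_{j=1}^{p} \beta_j^2 \Big\}
```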

Alas…
Despite appearances, SVMs are quite closely related to logistic regression and other classical statistical methods.
And what's worse, most other classification methods can use non-linear kernels, too! (recall Ch. 7…)

SVMs vs. logistic regression
Their loss functions are very similar: logistic regression is generally preferred when the classes overlap, while SVMs are generally preferred when the classes are well-separated.

Lab: Multiclass SVMs
- To do today's lab in R:
- To do today's lab in python:
- Instructions and code:
- Full version can be found beginning on p. 366 of ISLR
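The lab links aren't reproduced here; as a rough Python analogue of the ISLR multiclass example (which generates a third class and fits a radial-kernel SVM), something like the following sketch works. Note that scikit-learn's `SVC` handles the multiclass case internally using the one-versus-one approach.

```python
# Rough Python analogue of the ISLR multiclass SVM lab (p. 366):
# add a third class to two-class data and fit a radial-kernel SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.concatenate([np.full(100, -1), np.full(50, 1), np.full(50, 0)])
X[y == 1] += 2                       # shift the +1 class away from the rest
X[y == 0, 1] += 2                    # shift the third class in the second coordinate

clf = SVC(kernel="rbf", C=10, gamma=1).fit(X, y)   # multiclass handled via one-versus-one
print(clf.predict(X[:5]), y[:5])
```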

Coming up
- Next week: unsupervised techniques
- Wednesday: guest speaker Virginia Goodwin, MITLL