CSSE463: Image Recognition Day 20


CSSE463: Image Recognition Day 20

Term project next steps (from the Lit Review specification)
- Goal: review what others have done. Don't re-invent the wheel.
- Read papers!
- Summarize papers.
- Due next Friday (extensions available on request).

CSSE463: Image Recognition Day 20
Today: lab for sunset detector
Next week:
- Monday: midterm exam
- Tuesday: sunset detector lab (due Weds. 11:00 pm)
- Thursday: k-means clustering
- Friday: lab 6 (k-means)

Exam prep
- Bright blue roadmap sheet
- Exam review slides (courtesy reminder)

Last words on neural nets/SVM

How does svmfwd compute y1?
y1 is just the weighted sum of the contributions of the individual support vectors, plus a bias term. With a Gaussian (RBF) kernel of width s:
y1 = sum_i svcoeff(i) * K(x, sv_i) + bias, where K(x, sv) = exp(-||x - sv||^2 / (2*s^2))
Here x and the support vectors are d-dimensional feature vectors (d = data dimension, e.g., 294). numSupVecs, svcoeff (the alphas), and bias are learned during training.
Note: looking at which of your training examples become support vectors can be revealing! (Keep this in mind for the sunset detector and term project.)
This is a much easier computation than training; it was easy to implement on a device without MATLAB (e.g., a smartphone).
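The weighted sum above can be sketched in a few lines. This is a hedged Python stand-in for the MATLAB svmfwd routine, assuming the Gaussian kernel form exp(-||x - sv||^2 / (2*s^2)) implied by the slide's "s = kernel width"; your toolbox's exact kernel parameterization may differ.

```python
import math

def svmfwd(x, sup_vecs, svcoeff, bias, s):
    """Decision value y1 = sum_i svcoeff[i] * K(x, sv_i) + bias,
    with RBF kernel K(x, sv) = exp(-||x - sv||^2 / (2*s^2)).
    Sketch of svmfwd; not the actual MATLAB implementation."""
    y1 = bias
    for sv, alpha in zip(sup_vecs, svcoeff):
        dist2 = sum((xi - vi) ** 2 for xi, vi in zip(x, sv))
        y1 += alpha * math.exp(-dist2 / (2.0 * s * s))
    return y1

# Toy usage: two support vectors with opposite coefficients.
y = svmfwd([0.9, 1.0], [[0.0, 0.0], [1.0, 1.0]], [-1.0, 1.0], 0.0, 1.0)
```

A query near the positively weighted support vector gets a positive decision value, which is why inspecting the support vectors themselves can tell you which training examples are shaping the boundary.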

SVMs vs. Neural Nets
SVM:
- Training can take a long time with large data sets, due to choosing parameters.
- But classification runtime and space are O(sd), where s is the number of support vectors and d is the dimensionality of the feature vectors.
- In the worst case, s = the size of the whole training set (like nearest neighbor), a sign that overfitting is occurring; the number of support vectors can be used together with accuracy to choose a classifier.
- But this is no worse than implementing a neural net with s perceptrons in the hidden layer.
- Empirically shown to generalize well even with relatively small training sets and no domain knowledge.
Neural networks:
- Can tune the architecture.
- Lots of parameters!
(Q3 on old SVM quiz)

Common model of learning machines
- Training: labeled training images → extract features (color, texture) → normalize → statistical learning (svmtrain) → classifier (summary).
- Testing: test image → extract features (color, texture) → normalize → classifier (svmfwd) → label.
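The two pipelines above can be sketched end to end. This is a hedged toy version: extract_features, normalize, train, and classify are hypothetical stand-ins (a nearest-class-mean classifier replaces svmtrain/svmfwd) just to show how the training-time ranges get reused at test time.

```python
def extract_features(image):
    # Placeholder: real code would compute color/texture features.
    return image  # here "images" are already small feature vectors

def normalize(feats, lo, hi):
    # Scale each dimension to [0, 1] using training-set ranges.
    return [[(x - l) / (h - l) if h > l else 0.0
             for x, l, h in zip(f, lo, hi)] for f in feats]

def train(feats, labels):
    # Stand-in for svmtrain: store the per-class mean feature vector.
    by_class = {}
    for f, y in zip(feats, labels):
        by_class.setdefault(y, []).append(f)
    return {y: [sum(col) / len(fs) for col in zip(*fs)]
            for y, fs in by_class.items()}

def classify(model, f):
    # Stand-in for svmfwd: label of the nearest class mean.
    return min(model, key=lambda y: sum((a - b) ** 2
                                        for a, b in zip(f, model[y])))

# Toy run: two well-separated classes, labels 0 and 1.
train_f = [extract_features(x) for x in
           [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]]
train_y = [0, 0, 1, 1]
lo = [min(col) for col in zip(*train_f)]
hi = [max(col) for col in zip(*train_f)]
model = train(normalize(train_f, lo, hi), train_y)
# Test image goes through the same feature/normalize steps.
label = classify(model, normalize([extract_features([0.85, 0.85])], lo, hi)[0])
```

The key point the diagram makes is that the test image passes through exactly the same feature extraction and normalization as the training images before hitting the classifier.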

Sunset Process (I suggest writing as you go)
- Loop over the 4 folders of images: extract features.
- Normalize.
- Split into train and test sets, and label.
- Save.
- Loop over kernel params: train, test, record accuracy and # support vectors.
- For the SVM with the param giving the best accuracy: generate the ROC curve, find good images, do the extension.