School of Engineering and Computer Science, Victoria University of Wellington
Copyright: Peter Andreae, VUW
COMP 112, 2014, Lecture 18: Image Recognition

Menu
- Graphics
- Image recognition
- AI and Perceptrons
- Admin

Image recognition
Multiple tasks:
- Recognising an image as belonging to a category
- Determining whether an image contains an instance of some category
- Identifying the objects in an image
Examples: character recognition, red-eye correction, face detection and face recognition, pedestrian/hazard detection for cars, Google image indexing.

How? Matching against prototypes
- Classification: match the whole image, either distinguishing between fixed categories or identifying the category, if there is anything to identify at all.
- Detection: sweep the prototype across the image and match at each point (like a convolution filter).

COMP112 17: 5 Matching prototypes.

Matching prototypes
Problems:
- What prototypes? May need lots.
- How do you match? Average difference of pixel values?
- Issues to address: scaling, rotation, brightness, pose.
Red-eye detection: a simple prototype is a red circle surrounded by a dark region; may need several prototypes at different sizes.
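Below is a minimal sketch of these two ideas, matching by average difference of pixel values and detection by sweeping the prototype across the image. It assumes grey-level images stored as 2D arrays of doubles; the class and method names are illustrative, not course code.

```java
/** A minimal sketch (not the lecture's code) of prototype matching and detection. */
public class PrototypeMatching {

    /** Average absolute pixel difference between the prototype and the image region
     *  whose top-left corner is at (row, col). Smaller means a better match. */
    static double averageDifference(double[][] image, double[][] prototype, int row, int col) {
        double total = 0;
        for (int r = 0; r < prototype.length; r++)
            for (int c = 0; c < prototype[0].length; c++)
                total += Math.abs(image[row + r][col + c] - prototype[r][c]);
        return total / (prototype.length * prototype[0].length);
    }

    /** Detection: sweep the prototype over every position and return the best-matching
     *  position as {row, col} (like applying a convolution filter and picking the peak). */
    static int[] bestMatch(double[][] image, double[][] prototype) {
        int[] best = {0, 0};
        double bestScore = Double.MAX_VALUE;
        for (int row = 0; row + prototype.length <= image.length; row++) {
            for (int col = 0; col + prototype[0].length <= image[0].length; col++) {
                double score = averageDifference(image, prototype, row, col);
                if (score < bestScore) { bestScore = score; best = new int[]{row, col}; }
            }
        }
        return best;
    }
}
```

Handling the scaling issue would mean repeating bestMatch with prototypes at several sizes, as the red-eye example suggests.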

Features
- Don't use raw pixels: compute a set of feature values describing the image, removing variations in scale, rotation, brightness, etc. (e.g. SIFT).
- Compare the computed feature vector to the feature vectors of the prototypes.
- May still not be good enough!
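A minimal sketch of the comparison step, assuming each prototype has been reduced to a labelled feature vector and the nearest one (by Euclidean distance) wins; the names are illustrative, not course code.

```java
/** A minimal sketch: compare an image's feature vector to prototype feature vectors. */
public class FeatureMatching {
    static String nearestPrototype(double[] features, double[][] prototypeFeatures, String[] labels) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int p = 0; p < prototypeFeatures.length; p++) {
            double dist = 0;
            for (int i = 0; i < features.length; i++) {
                double d = features[i] - prototypeFeatures[p][i];
                dist += d * d;                     // squared Euclidean distance
            }
            if (dist < bestDist) { bestDist = dist; best = p; }
        }
        return labels[best];
    }
}
```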

Classifiers
It is nicer to have a "classifier" function: input = features, output = class label. For example, the feature vector of a photo goes into the classifier and the label "Young girl" comes out.
We can learn classifiers automatically from large datasets.
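In code, that idea is just a function from a feature vector to a label. A tiny sketch, with an assumed interface name:

```java
/** A minimal sketch of the classifier idea: input = features, output = class label. */
public interface Classifier {
    String classify(double[] features);
}
```

The perceptron on the next slide, wrapped to return a label instead of yes/no, is one way to implement such a function.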

Perceptrons
- A linear weighted threshold function; an early kind of "neural network"; very efficient to compute.
- Structure: one input node per feature, each connected by a weight to a single output node.
- Weights represent the importance of features (positive or negative).
- Output "Yes" if Σᵢ fᵢ·wᵢ > θ, i.e. if the weighted sum of the feature values exceeds the threshold θ.
- Need a perceptron for each possible class.
- Same formula as the convolution filter!
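A minimal sketch of that threshold function (not the lecture's code; names are illustrative):

```java
/** A perceptron: a linear weighted threshold function over the feature values. */
public class Perceptron {
    double[] weights;      // one weight per feature (positive or negative importance)
    double threshold;

    Perceptron(double[] weights, double threshold) {
        this.weights = weights;
        this.threshold = threshold;
    }

    /** "Yes" exactly when the weighted sum of the features exceeds the threshold. */
    boolean classify(double[] features) {
        double sum = 0;
        for (int i = 0; i < weights.length; i++) sum += features[i] * weights[i];
        return sum > threshold;
    }
}
```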

Learning Perceptrons
Problem: how do you work out the weights?
Very simple learning algorithm. Until it is right:
- Present an example (+ve or -ve).
- If the perceptron is correct, do nothing.
- If a -ve example is wrong (active weights too high / threshold too low), subtract the feature vector from the weight vector.
- If a +ve example is wrong (active weights too low / threshold too high), add the feature vector to the weight vector.
A very general learning mechanism!
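A minimal sketch of that update rule. The slide only says which way the threshold should move, so the step size of 1 used here is an assumption:

```java
/** A sketch of the perceptron learning rule: adjust weights only on mistakes. */
public class PerceptronLearning {
    /** Applies one update to the weights in place and returns the new threshold. */
    static double update(double[] weights, double threshold, double[] features, boolean positive) {
        double sum = 0;
        for (int i = 0; i < weights.length; i++) sum += features[i] * weights[i];
        boolean saidYes = sum > threshold;
        if (saidYes == positive) return threshold;          // correct: do nothing
        int sign = positive ? +1 : -1;                      // +ve and wrong: add; -ve and wrong: subtract
        for (int i = 0; i < weights.length; i++) weights[i] += sign * features[i];
        return threshold - sign;                            // move the threshold the other way (assumed step of 1)
    }
}
```

Presenting the training examples over and over, applying this update to each, is the "until it is right" loop; for linearly separable data it eventually stops making changes.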

Problems with Perceptrons
- The learning algorithm is guaranteed to converge, but only if the instances are linearly separable!
- Many interesting tasks are not linearly separable: tanks vs non-tanks, or "E" vs "not E" on raw pixels; what should the weights be? (The classic small example is XOR of two features: it would need w₁ > θ, w₂ > θ and 0 ≤ θ, yet also w₁ + w₂ ≤ θ, which is impossible.)
- Need to automatically learn higher-level features.

Neural Nets
- A standard "neural network" is a multi-layer perceptron: a layer of hidden nodes, each a perceptron-like unit over the input features, feeding a perceptron-like output node.
- Very much more powerful.
- Learning the weights is not so simple, but still possible.
- "Hidden" nodes represent new features.
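A minimal sketch of that structure, built from threshold units like the perceptron above (practical networks usually use smooth units, as in the training sketch after the next slide); the names and layout are assumptions:

```java
/** A two-layer network of perceptron-like units: hidden nodes compute new features
 *  from the inputs, and an output node combines the hidden features. */
public class TwoLayerNet {
    double[][] hiddenWeights;    // hiddenWeights[h][i]: weight from input i to hidden node h
    double[] hiddenThresholds;   // one threshold per hidden node
    double[] outputWeights;      // weight from hidden node h to the output node
    double outputThreshold;

    TwoLayerNet(double[][] hw, double[] ht, double[] ow, double ot) {
        hiddenWeights = hw; hiddenThresholds = ht; outputWeights = ow; outputThreshold = ot;
    }

    /** "Yes" if the output node fires, given the raw feature values. */
    boolean classify(double[] features) {
        double outputSum = 0;
        for (int h = 0; h < hiddenWeights.length; h++) {
            double sum = 0;
            for (int i = 0; i < features.length; i++) sum += hiddenWeights[h][i] * features[i];
            boolean hiddenFires = sum > hiddenThresholds[h];   // each hidden node is a perceptron
            if (hiddenFires) outputSum += outputWeights[h];    // hidden activations feed the output node
        }
        return outputSum > outputThreshold;
    }
}
```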

Learning Neural Networks
Learning NN classifiers:
- Present a training example (along with the correct answer).
- Compute how much to change each weight to bring the actual answer closer to the correct answer.
- Adjust the weights.
- Repeat a lot of times!
Neural nets are just one of many AI classifier learning mechanisms: Support Vector Machines, Decision Trees, Naïve Bayes, Bayesian Networks, .... All require good features to start with. The image recognition problem is not yet solved.
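One common way to "compute how much to change each weight" is gradient descent with backpropagation. The slide does not spell this out, so the following is a hedged sketch using sigmoid units, with an illustrative network layout and learning rate:

```java
import java.util.Random;

/** A minimal sketch (not the lecture's algorithm): one gradient-descent update step
 *  for a small two-layer network with sigmoid units. */
public class TinyNetTraining {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    double[][] hiddenWeights;   // hiddenWeights[h][i]: input i -> hidden node h
    double[] outputWeights;     // hidden node h -> output node

    TinyNetTraining(int inputs, int hidden, Random rnd) {
        hiddenWeights = new double[hidden][inputs];
        outputWeights = new double[hidden];
        for (int h = 0; h < hidden; h++) {
            outputWeights[h] = rnd.nextDouble() - 0.5;                 // small random starting weights
            for (int i = 0; i < inputs; i++) hiddenWeights[h][i] = rnd.nextDouble() - 0.5;
        }
    }

    /** One training step: present an example, work out the weight changes, adjust. */
    void trainOnExample(double[] features, double target, double rate) {
        // Forward pass: compute hidden activations and the output.
        int hidden = outputWeights.length;
        double[] acts = new double[hidden];
        double outSum = 0;
        for (int h = 0; h < hidden; h++) {
            double sum = 0;
            for (int i = 0; i < features.length; i++) sum += hiddenWeights[h][i] * features[i];
            acts[h] = sigmoid(sum);
            outSum += outputWeights[h] * acts[h];
        }
        double out = sigmoid(outSum);

        // Backward pass: how much, and in which direction, to change each weight.
        double outDelta = (target - out) * out * (1 - out);
        for (int h = 0; h < hidden; h++) {
            double hiddenDelta = outDelta * outputWeights[h] * acts[h] * (1 - acts[h]);
            outputWeights[h] += rate * outDelta * acts[h];
            for (int i = 0; i < features.length; i++)
                hiddenWeights[h][i] += rate * hiddenDelta * features[i];
        }
    }
}
```

Calling trainOnExample for every example in a large training set, and repeating many passes, gradually brings the outputs closer to the correct answers.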