CS292 Computational Vision and Language: Pattern Recognition and Classification

Classification and Recognition

Pattern Recognition This section gives a brief survey of methods used to recognise objects. These methods apply to the recognition of objects in images, but are equally applicable to other kinds of data. The basic approach views an instance to be recognised as a vector of measurements. We introduce some simple methods by which a machine can learn to recognise objects from labelled samples. Together with previous sessions, this should give us an understanding of the design of complete machine vision systems, and should let us experiment with building a complete set of algorithms for a simple, yet real, problem.

Pattern Recognition Problems In many practical problems, there is a need to make some decision about the content of an image, or about the classification of an object that it contains: e.g. handprinted character recognition, or a food market recognition system.

Common Model for Classification A common classification model has three components: a set of classes, a feature extractor, and a classifier.

Classes There is a set of m known classes of objects. These are known either by some description or by having a set of examples for each class. An ideal class is a set of objects having some important common properties; in practice, the class to which an object belongs is denoted by a class label. Classification is a process that assigns a label to an object according to some representation of the object's properties. A classifier is a device or algorithm that inputs an object representation and outputs a class label. A reject class is a generic class for objects that cannot be placed in any of the designated classes.
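As a minimal sketch of these definitions in Python (the names Classifier and REJECT are our illustrative conventions, not from the lecture):

```python
from typing import Callable, Sequence

# Illustrative convention: class labels are integers, and one special
# label stands for the reject class.
REJECT = -1

# A feature vector is a sequence of d measurements; a classifier is any
# function that maps an object's representation (its feature vector) to a
# class label, or to REJECT if no designated class fits well enough.
Classifier = Callable[[Sequence[float]], int]
```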

Feature Extractor The feature extractor extracts the information relevant to classification from the raw data supplied by the sensor.

Figure 1: Classification system diagram. Discriminant functions $f_i(x, K)$ perform distance or probability computations on the input feature vector $x = (x_1, x_2, \ldots, x_d)$, using knowledge K obtained from training, and pass their results $F_1(x, K), F_2(x, K), \ldots, F_m(x, K)$ to a final compare-and-decide stage that outputs the classification C(x).

Features Used for Representation A crucial issue for both theory and practice is: what representation or encoding of the object is used in the recognition process? In other words, what features are important for recognition?

Feature Vector Representation Objects may be compared for similarity based on their representation as vectors of measurements. Suppose each object is represented by exactly d measurements. The ith coordinate of such a feature vector then has the same meaning for every object.

Feature Vector Representation The similarity, or closeness, between the feature vector representations of two objects can then be described using the Euclidean distance between the vectors, defined in Equation 1. Sometimes the Euclidean distance between an observed vector and a stored class prototype can provide a useful classification function. Definition: the Euclidean distance between two d-dimensional feature vectors $x_1$ and $x_2$ is $\|x_1 - x_2\| = \sqrt{\sum_{i=1}^{d} (x_{1,i} - x_{2,i})^2}$ (Equation 1).
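As a minimal illustration of Equation 1, here is a small Python sketch (the function name euclidean_distance is ours, not from the lecture):

```python
import math
from typing import Sequence

def euclidean_distance(x1: Sequence[float], x2: Sequence[float]) -> float:
    """Euclidean distance between two d-dimensional feature vectors (Equation 1)."""
    assert len(x1) == len(x2), "vectors must have the same dimension d"
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)))

# For example, euclidean_distance([0.0, 0.0], [3.0, 4.0]) == 5.0.
```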

Figure 2: Two compact classes (samples marked x and o, each with its class mean): classification using the nearest mean will yield a low error rate.

Classification using the Nearest Class Mean A simple classification algorithm is to summarize the sample data from each class using the class mean vector, or centroid, $\bar{x}_i = \frac{1}{n_i} \sum_{j=1}^{n_i} x_{i,j}$, where $x_{i,j}$ is the jth of the $n_i$ sample feature vectors from class i. An unknown object with feature vector x is classified as class i if it is [much] closer to the mean vector of class i than to any other class mean vector. We also have the option of putting x into the reject class if it is not close enough to any of the class means.
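A minimal sketch of the nearest-class-mean rule in Python, reusing euclidean_distance and REJECT from the sketches above; the names class_means and reject_threshold are our illustrative assumptions, and the threshold test is one simple way to realise the reject option:

```python
from typing import Dict, List, Optional, Sequence

def class_mean(samples: List[Sequence[float]]) -> List[float]:
    """Centroid of one class's sample feature vectors (all of dimension d)."""
    d, n = len(samples[0]), len(samples)
    return [sum(s[k] for s in samples) / n for k in range(d)]

def nearest_mean_classify(x: Sequence[float],
                          class_means: Dict[int, Sequence[float]],
                          reject_threshold: Optional[float] = None) -> int:
    """Assign x the label of the nearest class mean, or reject if too far."""
    label, dist = min(((i, euclidean_distance(x, m))
                       for i, m in class_means.items()),
                      key=lambda pair: pair[1])
    if reject_threshold is not None and dist > reject_threshold:
        return REJECT  # not close enough to any of the class means
    return label
```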

Classification using the Nearest Class Mean This classification method is simple and fast, and will work well in problems where the sample vectors from each class are compact and far from those of the other classes. – A simple two-class example with feature vectors of dimension d = 2 is shown in Figure 2. – Since there are samples of each class that are equidistant from both class centroids, the error rate will not be zero, although we expect it to be very low if the structure of the samples represents well the structure of future sensed objects.

Classification using the Nearest Class Mean – We now have one concrete interpretation of the function boxes of Figure 1: the ith function box computes the distance between the unknown input x and the mean vector of the training samples from class i. The training samples constitute the knowledge K about the class.

Problem: Figure 3 shows three classes with complex structure (Class 1, Class 2, Class 3, plotted in the $(x_1, x_2)$ plane): classification using the nearest mean will yield poor results.

Classification Using the Nearest Neighbours A more flexible but more expensive method of classification is to classify the unknown feature vector x into the class of the individual sample closest to it. This is the nearest-neighbour rule. It can be effective even when classes have complex structure in d-space and when classes overlap. No assumptions need to be made about models for the distribution of feature vectors in space; the algorithm uses only the existing training samples.

Classification Using the Nearest Neighbours - algorithm Compute the k nearest neighbours of x and return the majority class. S is a set of n labelled class samples s_i, where s_i.x is a feature vector and s_i.c is its integer class label. x is the unknown input feature vector to be classified. A is an array capable of holding up to k samples, kept sorted by distance d. The value returned is a class label in the range [1, m].

Procedure K_Nearest_Neighbours(x, S) {
    make A empty;
    for all samples s_i in S {
        d = Euclidean distance between s_i.x and x;
        if A has fewer than k elements
            then insert (d, s_i) into A;
        else if d is less than the largest distance in A then {
            remove the sample with the largest distance from A;
            insert (d, s_i) into A;
        }
    }
    assert A now holds the k samples of S closest to x;
    if a majority of the labels s_i.c in A are some class c_0
        then classify x into class c_0;
        else classify x into the reject class;
    return the class of x;
}
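For comparison, here is a runnable Python version of the same procedure, reusing euclidean_distance and REJECT from the earlier sketches; representing S as a list of (feature_vector, class_label) pairs and rejecting when there is no strict majority are our interpretation of the pseudocode:

```python
import heapq
from collections import Counter
from typing import List, Sequence, Tuple

def k_nearest_neighbours(x: Sequence[float],
                         S: List[Tuple[Sequence[float], int]],
                         k: int = 3) -> int:
    """Classify x by majority vote among its k nearest labelled samples.

    Each element of S is a pair (s_i.x, s_i.c): a feature vector and its
    integer class label, as in the pseudocode above.
    """
    # The k closest samples play the role of the sorted array A.
    nearest = heapq.nsmallest(k, S,
                              key=lambda s: euclidean_distance(x, s[0]))
    votes = Counter(label for _, label in nearest)
    label, count = votes.most_common(1)[0]
    # Require a strict majority, as the pseudocode does; otherwise reject.
    return label if count > k // 2 else REJECT

# For example:
# k_nearest_neighbours([1.0, 2.0],
#                      [([1.1, 2.0], 1), ([0.9, 1.9], 1), ([5.0, 5.0], 2)],
#                      k=3) returns 1.
```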