Lecture 20 Object recognition I


Lecture 20 Object recognition I
- Pattern and pattern classes
- Classifiers based on Bayes decision theory
- Recognition based on decision-theoretic methods
- Optimum statistical classifiers
- Pattern recognition with MATLAB

Patterns and pattern classes
A pattern is an arrangement of descriptors (features). Three commonly used pattern arrangements are vectors, strings, and trees. A pattern class is a family of patterns that share some common properties. Pattern recognition is the task of assigning a given pattern to its respective class.

Example 1 Represent flower petals by the features width and length. The three types of iris flowers then fall into different pattern classes.

Example 2 Use signature as pattern vector

Example 3 Represent pattern by string

Example 4 Represent pattern by trees

2. Classifiers based on Bayesian decision theory
A fundamental statistical approach. It assumes the relevant probabilities are known: compute the probability of the observed event, then make the optimal decision.
Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
Example: Suppose at Laurier, 50% of students are girls and 30% are science students, and among science students, 20% are girls. If one meets a girl student at Laurier, what is the probability that she is a science student?
Let B = girl students and A = science students. Then P(A|B) = P(B|A) P(A) / P(B) = (0.2 × 0.3) / 0.5 = 0.12.
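The arithmetic of the Laurier example can be checked in a few lines (a Python sketch; the variable names are ours):

```python
# Bayes' theorem for the Laurier example: B = girl student, A = science student.
p_B = 0.5            # P(B): half the students are girls
p_A = 0.3            # P(A): 30% are science students
p_B_given_A = 0.2    # P(B|A): 20% of science students are girls

# P(A|B) = P(B|A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B    # ≈ 0.12
```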

Bayes theory Given x ∈ Rl and a set of classes ωi, i = 1, 2, . . . , c, the Bayes theory states that
P(ωi | x) = p(x | ωi) P(ωi) / p(x),
where P(ωi) is the a priori probability of class ωi, i = 1, 2, . . . , c; P(ωi | x) is the a posteriori probability of class ωi given the value of x; p(x) is the probability density function (pdf) of x; and p(x | ωi), i = 1, 2, . . . , c, is the class-conditional pdf of x given ωi (sometimes called the likelihood of ωi with respect to x).

Bayes classifier Let x ≡ [x(1), x(2), . . . , x(l)]T ∈ Rl be the corresponding feature vector, which results from some measurements. Also, let the number of possible classes be c, that is, ω1, . . . , ωc. Bayes decision theory: x is assigned to the class ωi if
P(ωi | x) > P(ωj | x) for all j ≠ i,
or equivalently, if p(x | ωi) P(ωi) > p(x | ωj) P(ωj) for all j ≠ i.

Multidimensional Gaussian PDF
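The formula itself did not survive the transcript; presumably the slide showed the standard l-dimensional Gaussian pdf for a class with mean vector m and covariance matrix S:

```latex
p(\mathbf{x}) = \frac{1}{(2\pi)^{l/2}\,|S|^{1/2}}
\exp\!\left( -\frac{1}{2}\,(\mathbf{x}-\mathbf{m})^{T} S^{-1} (\mathbf{x}-\mathbf{m}) \right)
```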

Example Consider a 2-class classification task in the 2-dimensional space, where the data in both classes, ω1, ω2, are distributed according to the Gaussian distributions N(m1,S1) and N(m2,S2), respectively. Let
m1 = [1, 1]T, m2 = [3, 3]T, S1 = S2 = I (the 2 × 2 identity matrix).
Assuming that P(ω1) = P(ω2) = 1/2, classify x = [1.8, 1.8]T into ω1 or ω2.

Solution
m1=[1 1]'; m2=[3 3]'; S=eye(2);
P1=0.5; P2=0.5;   % equal priors
x=[1.8 1.8]';
p1=P1*comp_gauss_dens_val(m1,S,x);
p2=P2*comp_gauss_dens_val(m2,S,x);
The resulting values are p1 = 0.042 and p2 = 0.0189. According to the Bayesian classifier, x is assigned to ω1.
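comp_gauss_dens_val comes from the MATLAB toolkit that accompanies the course text; a NumPy stand-in (our implementation of the standard Gaussian pdf) reproduces the numbers above:

```python
import numpy as np

def comp_gauss_dens_val(m, S, x):
    """Evaluate the multivariate Gaussian pdf N(m, S) at x."""
    l = len(m)
    d = x - m
    norm = 1.0 / ((2 * np.pi) ** (l / 2) * np.sqrt(np.linalg.det(S)))
    return norm * np.exp(-0.5 * d @ np.linalg.inv(S) @ d)

m1 = np.array([1.0, 1.0]); m2 = np.array([3.0, 3.0])
S = np.eye(2)
x = np.array([1.8, 1.8])
P1 = P2 = 0.5                               # equal priors
p1 = P1 * comp_gauss_dens_val(m1, S, x)     # ≈ 0.042
p2 = P2 * comp_gauss_dens_val(m2, S, x)     # ≈ 0.0189
```

Since p1 > p2, x is assigned to ω1, matching the MATLAB result.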

Decision-theoretic methods Decision (discriminant) functions Decision boundary

Minimum distance classifier

Example

Minimum Mahalanobis distance classifiers

Example
x=[0.1 0.5 0.1]';
m1=[0 0 0]'; m2=[0.5 0.5 0.5]';
m=[m1 m2];
z1=euclidean_classifier(m,x)
S=[0.8 0.01 0.01;0.01 0.2 0.01; 0.01 0.01 0.2];
z2=mahalanobis_classifier(m,S,x);
The results are z1 = 1 and z2 = 2: the Euclidean (minimum distance) classifier assigns x to ω1, while the Mahalanobis distance classifier assigns it to ω2.
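euclidean_classifier and mahalanobis_classifier are routines from the course's MATLAB toolkit; NumPy stand-ins (our implementations, returning 1-based class indices to match the slides) reproduce the example:

```python
import numpy as np

def euclidean_classifier(means, x):
    """Assign x to the class whose mean is nearest in Euclidean distance."""
    d = [np.linalg.norm(x - m) for m in means]
    return int(np.argmin(d)) + 1          # 1-based class index

def mahalanobis_classifier(means, S, x):
    """Assign x to the class whose mean is nearest in Mahalanobis distance."""
    Sinv = np.linalg.inv(S)
    d = [float((x - m) @ Sinv @ (x - m)) for m in means]
    return int(np.argmin(d)) + 1

x  = np.array([0.1, 0.5, 0.1])
m1 = np.zeros(3)
m2 = np.full(3, 0.5)
S  = np.array([[0.8, 0.01, 0.01],
               [0.01, 0.2, 0.01],
               [0.01, 0.01, 0.2]])
z1 = euclidean_classifier([m1, m2], x)       # 1: Euclidean picks omega_1
z2 = mahalanobis_classifier([m1, m2], S, x)  # 2: Mahalanobis picks omega_2
```

The two classifiers disagree because S stretches the x1 axis: weighting by S^-1 makes x effectively closer to m2.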

4. Matching by correlation Given a template w(s,t) (or mask), i.e. an m × n matrix, find the m × n submatrix of f(x,y) that best matches w, i.e. the one with the largest correlation.

Correlation theorem
[M, N] = size(f);
f = fft2(f);                  % transform the image
w = conj(fft2(w, M, N));      % zero-pad the template to M-by-N and conjugate
g = real(ifft2(w.*f));        % inverse transform gives the correlation
The peak of g marks the best match of w in the image.
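The same computation in NumPy, on a small synthetic image (the array sizes and values here are illustrative), shows the matching property: the correlation peak lands exactly where the template was embedded.

```python
import numpy as np

f = np.zeros((8, 8))                 # the "image"
w = np.array([[1.0, 2.0],
              [3.0, 4.0]])           # the template (mask)
f[3:5, 2:4] = w                      # embed the template at row 3, col 2

F = np.fft.fft2(f)
W = np.conj(np.fft.fft2(w, s=f.shape))   # zero-pad the template, conjugate
g = np.real(np.fft.ifft2(F * W))         # correlation via the FFT

peak = np.unravel_index(np.argmax(g), g.shape)   # (3, 2): the best match
```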

Example

Case study Optical character recognition (OCR)
Preprocessing
- Digitization, binarization
- Noise elimination, thinning, normalizing
Feature extraction (by character, word part, word)
- Segmentation (explicit or implicit)
- Detection of major features (top-down approach)
Matching
- Recognition of character
- Context verification from knowledge base
Understanding and action
See the reference

Example

3. Optimum statistical classifiers

Bayes classifier for Gaussian pattern classes Consider two pattern classes with Gaussian distributions

N-dimensional case

Example

A real example

Linear classifier Two classes: f(x) = 0 defines the separating hyperplane. How to obtain the coefficients, or weights, wi? By the perceptron algorithm.

How to obtain the coefficients, or weights, wi?

The Online Form of the Perceptron Algorithm
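A minimal sketch of the online perceptron (the toy data, learning rate, and initialization below are our own choices): cycle through the samples, updating the weight vector only when a sample lands on the wrong side of the current hyperplane.

```python
import numpy as np

def perceptron_train(X, y, eta=0.1, max_epochs=100):
    """Online perceptron: update w on each misclassified sample."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # absorb the bias term
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * (w @ xi) <= 0:    # wrong side of (or on) the hyperplane
                w += eta * yi * xi    # online update rule
                errors += 1
        if errors == 0:               # converged: every sample correct
            break
    return w

# Toy two-class problem with labels +1 / -1
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
pred = np.sign(np.hstack([X, np.ones((len(X), 1))]) @ w)
```

For linearly separable data such as this, the perceptron is guaranteed to converge in a finite number of updates.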

The Multiclass LS Classifier The classification rule is now as follows: given x, classify it to class ωi if
wiT x > wjT x for all j ≠ i,
that is, assign x to the class whose linear discriminant gives the largest response.
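One common way to obtain the weight vectors wi is least-squares fitting against one-hot targets; a sketch under that assumption (the data and helper names below are ours):

```python
import numpy as np

def ls_fit(X, labels, n_classes):
    """Fit one linear discriminant per class by least squares on one-hot targets."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # augment with a bias column
    Y = np.eye(n_classes)[labels]               # one-hot target matrix
    W, *_ = np.linalg.lstsq(Xa, Y, rcond=None)  # columns of W are the w_i
    return W

def ls_predict(X, W):
    Xa = np.hstack([X, np.ones((len(X), 1))])
    return np.argmax(Xa @ W, axis=1)            # largest discriminant wins

# Three well-separated clusters, two points each
X = np.array([[0.0, 0.0], [0.0, 0.0],
              [5.0, 0.0], [5.0, 0.0],
              [0.0, 5.0], [0.0, 5.0]])
labels = np.array([0, 0, 1, 1, 2, 2])
W = ls_fit(X, labels, 3)
pred = ls_predict(X, labels.size and W)
```

With more than two well-populated classes, least-squares discriminants can suffer from class masking; the toy data here is chosen so the fit is exact.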