Computer Vision Chapter 4

Computer Vision Chapter 4: Statistical Pattern Recognition
Presenter: 蔡玄中 (Cell phone: 0965412965, E-mail: r06922141@ntu.edu.tw)
Digital Camera and Computer Vision Laboratory, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan, R.O.C.

Pattern Discrimination
Also called pattern identification. Process:
- A unit is observed or measured.
- A category assignment is made that names or classifies the unit as a type of object.
- The category assignment is made only on the basis of the observed measurements (the pattern).

Introduction
- Units: image regions and projected segments.
- Each unit has an associated measurement vector.
- A decision rule is used to assign each unit to a class or category optimally.

Introduction (Cont.)
unit (image region or projected segment) -> measurement vector -> decision rule -> unit optimally assigned to a class

Introduction (Cont.)
unit (image region or projected segment) -> measurement vector -> decision rule -> unit optimally assigned to a class (smallest classification error)

Introduction (Cont.)
unit (image region or projected segment) -> measurement vector -> decision rule -> unit optimally assigned to a class (smallest classification error)
- measurement vector: how to reduce the dimensionality? Feature selection and extraction.
- decision rule: construction techniques; estimation of error.

Introduction (Cont.)
Statistical pattern recognition techniques:
- Feature selection and extraction techniques
- Decision rule construction techniques
- Techniques for estimating decision rule error

Economic Gain Matrix
Assigning a unit to a class may be correct or incorrect. The four events (t, a):

    (t, a)          Assigned Good   Assigned Bad
    True Good       (g, g)          (g, b)
    True Bad        (b, g)          (b, b)

Economic Gain Matrix (Cont.)
- We assume that the act of making a category assignment, i.e. each event (t, a, d), carries consequences, economically or in terms of utility.
- e(t, a): economic gain/utility with true category t and assigned category a.

Economic Gain Matrix (Cont.)

    e(t, a)         Assigned Good   Assigned Bad
    True Good       e(g, g)         e(g, b)
    True Bad        e(b, g)         e(b, b)

Jet Fan Blade

An Instance (Cont.)

Economic Gain Matrix (Cont.)
Identity gain matrix:

    e(t, a)         Assigned Good   Assigned Bad
    True Good       1               0
    True Bad        0               1

Recall Some Definitions
- t: true category identification, from set C
- a: assigned category, from set C
- d: observed measurement, from a set of measurements D
- (t, a, d): event of classifying the observed unit
- P(t, a, d): probability of the event (t, a, d)
- e(t, a): economic gain with true category t and assigned category a

Joke Time

Another Instance
- P(g, g): probability of true good, assigned good
- P(g, b): probability of true good, assigned bad, ...
- e(g, g): economic consequence for event (g, g), ...
- e positive: profit consequence
- e negative: loss consequence

Another Instance (cont.)

Another Instance (cont.)

Another Instance (cont.)
- Fraction of good objects manufactured: P(g) = P(g, g) + P(g, b)
- Fraction of bad objects manufactured: P(b) = P(b, g) + P(b, b)
- Expected profit per object:
  E = P(g, g) e(g, g) + P(g, b) e(g, b) + P(b, g) e(b, g) + P(b, b) e(b, b)
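
A minimal sketch of this computation, assuming hypothetical values for the joint probabilities P(t, a) and for the economic gain matrix e(t, a) (none of the numbers below come from the text):

    # Expected profit per object: E = sum over (t, a) of P(t, a) * e(t, a).
    P = {("g", "g"): 0.90, ("g", "b"): 0.05,
         ("b", "g"): 0.02, ("b", "b"): 0.03}   # hypothetical joint probabilities, sum to 1
    e = {("g", "g"): 1.0, ("g", "b"): -0.2,
         ("b", "g"): -5.0, ("b", "b"): 0.0}    # hypothetical economic gain matrix

    E = sum(P[ta] * e[ta] for ta in P)
    print(f"expected profit per object: {E:.3f}")

    # Marginals recovered from the joint table, as on the slide:
    P_g = P[("g", "g")] + P[("g", "b")]
    P_b = P[("b", "g")] + P[("b", "b")]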

Why Conditional Probability

Conditional Probability
Given that an object is good, the probability that it is assigned good:
P(g | g) = P(g, g) / P(g)
Note: P(g, g) + P(g, b) = P(g)

Conditional Probability
P(g, g) + P(g, b) = P(g)
P(b, g) + P(b, b) = P(b)

Conditional Probability (cont.)
The machine's performance is characterized by:
- P(b | g) = P(g, b) / P(g): false-alarm rate
- P(g | b) = P(b, g) / P(b): misdetection rate
Note: P(g, g) + P(g, b) = P(g)
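
Continuing the same hypothetical joint table (an assumption, not data from the text), a small sketch of how the two rates fall out of the joint probabilities:

    # False-alarm and misdetection rates from a hypothetical joint table P(t, a).
    P = {("g", "g"): 0.90, ("g", "b"): 0.05,
         ("b", "g"): 0.02, ("b", "b"): 0.03}

    P_g = P[("g", "g")] + P[("g", "b")]          # P(g)
    P_b = P[("b", "g")] + P[("b", "b")]          # P(b)

    false_alarm  = P[("g", "b")] / P_g           # P(b | g): good object assigned bad
    misdetection = P[("b", "g")] / P_b           # P(g | b): bad object assigned good
    print(f"P(b|g) = {false_alarm:.3f}, P(g|b) = {misdetection:.3f}")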

Conditional Probability (cont.)
Another formula for the expected profit per object:
E = P(g) [P(g | g) e(g, g) + P(b | g) e(g, b)] + P(b) [P(g | b) e(b, g) + P(b | b) e(b, b)]

Conditional Probability (cont.)
Another formula for the expected profit per object:
E = P(g) [P(g | g) e(g, g) + P(b | g) e(g, b)] + P(b) [P(g | b) e(b, g) + P(b | b) e(b, b)]
Recall: E = P(g, g) e(g, g) + P(g, b) e(g, b) + P(b, g) e(b, g) + P(b, b) e(b, b)

Example 4.1 P(g) = 0.95, P(b) = 0.05

Example 4.1 (cont.)

Example 4.2 P(g) = 0.95, P(b) = 0.05

Example 4.2 (cont.)

Recall Some Formulas
P(g, g) + P(g, b) = P(g)
P(b, g) + P(b, b) = P(b)
P(g | g) + P(b | g) = 1
P(b | b) + P(g | b) = 1

Recall Some Formulas
E = P(g, g) e(g, g) + P(g, b) e(g, b) + P(b, g) e(b, g) + P(b, b) e(b, b)
  = P(g) [P(g | g) e(g, g) + P(b | g) e(g, b)] + P(b) [P(g | b) e(b, g) + P(b | b) e(b, b)]

Recall
unit (image region or projected segment) -> measurement vector -> decision rule -> unit optimally assigned to a class (smallest classification error)
- measurement vector: how to reduce the dimensionality? Feature selection and extraction.
- decision rule: construction techniques; estimation of error.

Joke Time

Decision Rule Construction
- P(t, a): obtained by summing P(t, a, d) over every measurement d. Therefore,
  P(t, a) = sum over d of P(t, a, d)
- Average economic gain:
  E[e] = sum over t, a of e(t, a) P(t, a)

Decision Rule Construction (cont.)

Decision Rule Construction (cont.)
We can use the identity matrix as the economic gain matrix to compute the probability of correct assignment:
P(correct) = sum over t of P(t, t)

Economic Gain Matrix (Cont.)
Identity gain matrix:

    e(t, a)         Assigned Good   Assigned Bad
    True Good       1               0
    True Bad        0               1

Fair Game Assumption
- The decision rule uses only the measurement data in making an assignment; nature and the decision rule are not in collusion.
- In other words, P(a | t, d) = P(a | d)

Fair Game Assumption (cont.)
- From the definition of conditional probability: P(t, a, d) = P(a | t, d) P(t, d)
- Fair game assumption: P(a | t, d) = P(a | d)
- So P(t, a, d) = P(a | d) P(t, d)

Fair Game Assumption (cont.)
- By the fair game assumption, P(t, a, d) = P(a | d) P(t, d)
- By definition, P(t, d) = P(t | d) P(d), so P(t, a, d) = P(a | d) P(t | d) P(d)
- Equivalently, P(t, a | d) = P(t | d) P(a | d)

Fair Game Assumption (cont.)
The fair game assumption leads to the fact that conditioned on measurement d, the true category and the assigned category are independent.

Fair Game Assumption (cont.)
- P(t | d): a conditional probability that nature determines
- P(a | d): the conditional probability with which the decision rule assigns category a to an observed unit with measurement d
- To distinguish them, we will use f(a | d) for the conditional probability associated with the decision rule

Deterministic Decision Rule
- We use the notation f(a | d) to completely define a decision rule; f(a | d) represents all the conditional probabilities associated with the decision rule.
- A deterministic decision rule: f(a | d) takes only the values 0 or 1, so each measurement d is always assigned the same category.
- Decision rules that are not deterministic are called probabilistic/nondeterministic/stochastic.

Expected Value on f(a|d)
- Previous formula: E[e] = sum over t, a of e(t, a) P(t, a)
- By P(t, a) = sum over d of P(t, a, d) and P(t, a, d) = f(a | d) P(t, d)
  => E[e] = sum over t, a, d of e(t, a) f(a | d) P(t, d)

Expected Value on f(a|d) (cont.)
To analyze the dependence of E[e] on f(a | d), regroup:
E[e] = sum over d of [ sum over a of f(a | d) ( sum over t of e(t, a) P(t, d) ) ]

Bayes Decision Rules
- A Bayes decision rule maximizes the expected economic gain: it satisfies E[e; f] >= E[e; g] for every decision rule g.
- Constructing the optimal f: for each measurement d, set f(a | d) = 1 for a category a that maximizes sum over t of e(t, a) P(t, d), and f(a' | d) = 0 for every other category a'.
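
A minimal sketch of this construction over a finite measurement set, using a hypothetical joint table P(t, d) for two categories and three measurements and the identity gain matrix (all numbers are assumptions, not values from the text):

    # Bayes rule: for each measurement d, assign a category a that maximizes
    # sum over t of e(t, a) * P(t, d).
    categories   = ["c1", "c2"]
    measurements = ["d1", "d2", "d3"]

    P_td = {("c1", "d1"): 0.30, ("c1", "d2"): 0.20, ("c1", "d3"): 0.10,
            ("c2", "d1"): 0.05, ("c2", "d2"): 0.15, ("c2", "d3"): 0.20}
    e = {("c1", "c1"): 1.0, ("c1", "c2"): 0.0,
         ("c2", "c1"): 0.0, ("c2", "c2"): 1.0}     # identity gain matrix

    def bayes_rule(P_td, e):
        rule = {}
        for d in measurements:
            gains = {a: sum(e[(t, a)] * P_td[(t, d)] for t in categories)
                     for a in categories}
            rule[d] = max(gains, key=gains.get)    # f(a|d) = 1 for this a, 0 otherwise
        return rule

    rule = bayes_rule(P_td, e)
    expected_gain = sum(e[(t, rule[d])] * P_td[(t, d)]
                        for t in categories for d in measurements)
    print(rule, expected_gain)

With the identity gain matrix this simply assigns each d to the category with the larger P(t, d), and the expected gain equals the probability of correct assignment.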

Bayes Decision Rules (cont.)

Bayes Decision Rules (cont.)
P(c1, c1) = 0.48, P(c1, c2) = 0.12

Continuous Measurement
- For the same example, try continuous density functions for the measurements (one per class).
- Prove that they are indeed density functions.

Continuous Measurement (cont.)
- Suppose the prior probabilities of t1 and t2 are given.
- When the corresponding condition on the measurement x holds, a Bayes decision rule will assign an observed unit to t1; this implies a threshold on x.

Continuous Measurement (cont.)
Since 0.805 > 0.68, the continuous measurement gives a larger expected economic gain than the discrete one.

Prior Probability
- The Bayes rule maximizes sum over t of e(t, a) P(t, d).
- Replace P(t, d) with P(d | t) P(t).
- The Bayes rule can then be determined by assigning any category a that maximizes sum over t of e(t, a) P(d | t) P(t).

Economic Gain Matrix
- Identity matrix: a correct assignment gains 1, an incorrect one gains 0.
- A more balanced instance: an incorrect assignment loses 1 (gain -1).

Maximin Decision Rule
Maximizes the average gain under the worst-case prior probability.
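
A minimal sketch under two simplifying assumptions: only deterministic rules are considered (in general the maximin rule may need to randomize), and the class-conditional measurement probabilities and gain matrix are hypothetical. For a fixed rule the expected gain is linear in the prior p = P(c1), so its worst case over p in [0, 1] occurs at an endpoint:

    from itertools import product

    categories   = ["c1", "c2"]
    measurements = ["d1", "d2"]

    P_d_given_t = {("c1", "d1"): 0.7, ("c1", "d2"): 0.3,
                   ("c2", "d1"): 0.2, ("c2", "d2"): 0.8}   # hypothetical P(d | t)
    e = {("c1", "c1"): 1.0, ("c1", "c2"): 0.0,
         ("c2", "c1"): 0.0, ("c2", "c2"): 1.0}             # identity gain matrix

    def expected_gain(rule, p_c1):
        prior = {"c1": p_c1, "c2": 1.0 - p_c1}
        return sum(prior[t] * P_d_given_t[(t, d)] * e[(t, rule[d])]
                   for t in categories for d in measurements)

    best_rule, best_worst = None, float("-inf")
    for assignment in product(categories, repeat=len(measurements)):
        rule = dict(zip(measurements, assignment))          # a deterministic rule d -> a
        worst = min(expected_gain(rule, 0.0), expected_gain(rule, 1.0))
        if worst > best_worst:
            best_rule, best_worst = rule, worst

    print(best_rule, best_worst)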

Example 4.3

Example 4.3 (cont.)

Example 4.3 (cont.)

Example 4.3 (cont.)
The lowest Bayes gain is achieved at the worst-case prior probability; the lowest gain is 0.6714.

Example 4.3 (cont.)

Example 4.4

Example 4.4 (cont.)

Example 4.4 (cont.)

Example 4.4 (cont.)

Decision Rule Error
- The misidentification error αk
- The false-identification error βk
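
A small sketch, reading αk as the row-wise error rate of a confusion matrix (units of true category k assigned elsewhere) and βk as the column-wise error rate (units assigned to k whose true category is different); the counts below are hypothetical:

    # n[t][a]: number of test units with true category t assigned category a.
    n = {"c1": {"c1": 45, "c2": 5},
         "c2": {"c1": 8,  "c2": 42}}
    categories = list(n)

    # Misidentification error alpha_k: errors among units truly in category k.
    alpha = {k: sum(n[k][a] for a in categories if a != k) / sum(n[k].values())
             for k in categories}
    # False-identification error beta_k: errors among units assigned category k.
    beta  = {k: sum(n[t][k] for t in categories if t != k) /
                sum(n[t][k] for t in categories)
             for k in categories}
    print("alpha:", alpha)
    print("beta: ", beta)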

An Instance

Reserving Judgment
- The decision rule may withhold judgment for some measurements.
- The decision rule is then characterized by the fraction of time it withholds judgment and by the error rate on the measurements it does assign.
- Reserving judgment is an important technique for controlling the error rate.

Nearest Neighbor Rule
- Assign pattern x to the class of the closest vector in the training set.
- The definition of "closest": the training vector x_i that minimizes ρ(x, x_i), where ρ is a metric on the measurement space.
- Chief difficulty: the brute-force nearest neighbor algorithm has computational complexity proportional to the number of patterns in the training set.
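
A minimal brute-force sketch, using Euclidean distance as the metric and a hypothetical labelled training set:

    import math

    # Brute-force nearest neighbor: compare the query against every training
    # vector, so the cost grows linearly with the size of the training set.
    train = [((1.0, 2.0), "good"), ((4.0, 0.5), "bad"), ((3.0, 3.0), "good")]

    def euclidean(x, y):
        return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

    def nearest_neighbor(x, train, metric=euclidean):
        closest, label = min(train, key=lambda item: metric(x, item[0]))
        return label

    print(nearest_neighbor((2.0, 2.5), train))   # -> "good"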

Binary Decision Tree Classifier
Assigns a class by a hierarchical decision procedure.

Major Problems
- Choosing the tree structure
- Choosing the features used at each non-terminal node
- Choosing the decision rule at each non-terminal node

Decision Rules at the Non-terminal Node
- Thresholding the measurement component (see the sketch below)
- Fisher's linear decision rule
- Bayes quadratic decision rule
- Bayes linear decision rule
- Linear decision rule from the first principal component
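
As a minimal sketch of the first option, a hand-built tree whose non-terminal nodes threshold a single component of the measurement vector; the structure, feature indices, thresholds, and class labels are all hypothetical:

    # Non-terminal nodes threshold one measurement component; leaves hold a label.
    tree = {"feature": 0, "threshold": 2.5,
            "left":  {"label": "good"},                    # component 0 <= 2.5
            "right": {"feature": 1, "threshold": 1.0,      # component 0 >  2.5
                      "left":  {"label": "good"},
                      "right": {"label": "bad"}}}

    def classify(x, node):
        if "label" in node:                                # terminal node
            return node["label"]
        branch = "left" if x[node["feature"]] <= node["threshold"] else "right"
        return classify(x, node[branch])

    print(classify((3.1, 2.4), tree))                      # -> "bad"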

Error Estimation
- An important way to characterize the performance of a decision rule.
- The training data set must be independent of the testing data set.
- Hold-out method (a common technique): construct the decision rule with half the data set and test it with the other half (sketched below).
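
A minimal sketch of the hold-out method on a hypothetical data set, with the brute-force nearest-neighbor rule standing in for the decision rule being evaluated:

    import math, random

    def nearest_neighbor(x, train):
        return min(train, key=lambda item: math.dist(x, item[0]))[1]

    data = [((1.0, 2.0), "good"), ((1.2, 1.8), "good"), ((0.8, 2.3), "good"),
            ((4.0, 0.5), "bad"),  ((3.8, 0.7), "bad"),  ((4.2, 0.4), "bad")]

    random.seed(0)
    random.shuffle(data)
    half = len(data) // 2
    train, test = data[:half], data[half:]                 # independent halves

    # Construct the rule from `train` only; estimate its error on `test`.
    errors = sum(1 for x, label in test if nearest_neighbor(x, train) != label)
    print(f"hold-out error estimate: {errors / len(test):.2f}")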

Neural Network
A set of units, each of which takes a linear combination of values from either an input vector or the outputs of other units.

Neural Network (cont.)
- Has a training algorithm
- Responses observed
- Reinforcement algorithms
- Back propagation to change weights
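
A minimal sketch of such a network, with one hidden layer of sigmoid units trained by back-propagation of a squared-error gradient; the data set (XOR), layer sizes, learning rate, and iteration count are all hypothetical choices:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
    y = np.array([[0.], [1.], [1.], [0.]])                   # target responses

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)            # input -> hidden units
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)            # hidden -> output unit
    lr = 1.0

    for _ in range(10000):
        h = sigmoid(X @ W1 + b1)             # each unit: linear combination + sigmoid
        out = sigmoid(h @ W2 + b2)           # observed network response
        # Back-propagate the squared-error gradient to change the weights:
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

    print(np.round(out.ravel(), 2))          # should approach [0, 1, 1, 0]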

Summary
- Bayesian approach
- Maximin decision rule
- Misidentification and false-alarm error rates
- Nearest neighbor rule
- Construction of decision trees
- Estimation of decision rule error
- Neural networks