
Slide 1: Pattern Recognition Concepts (Chapter 4: Shapiro and Stockman)
How should objects be represented? Algorithms for recognition/matching:
* nearest neighbors
* decision trees
* decision functions
* artificial neural networks
How should learning/training be done?

Slide 2: Feature Vector Representation
* X = [x1, x2, …, xn], where each xj is a real number.
* xj may be an object measurement or a count of object parts.
* Example object representation: [#holes, area, moments, …].
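As a concrete illustration (not from the slides), here is a minimal Python sketch that builds such a feature vector from a binary character image; the hole-counting rule and the choice of two central moments are assumptions made for the example.

```python
import numpy as np
from scipy import ndimage

def extract_features(img):
    """Feature vector [#holes, area, mu20, mu02] for a 0/1 numpy image.

    A sketch: a 'hole' is a background component that does not touch
    the image border; mu20/mu02 are two central moments of the shape.
    """
    area = float(img.sum())
    ys, xs = np.nonzero(img)
    cy, cx = ys.mean(), xs.mean()
    mu20 = ((xs - cx) ** 2).mean()       # spread about the vertical axis
    mu02 = ((ys - cy) ** 2).mean()       # spread about the horizontal axis
    lbl, n = ndimage.label(img == 0)     # label background regions
    border = set(lbl[0]) | set(lbl[-1]) | set(lbl[:, 0]) | set(lbl[:, -1])
    holes = len(set(range(1, n + 1)) - border)
    return np.array([holes, area, mu20, mu02])
```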

Slide 3: Possible features for character recognition [figure]

Slide 4: Some Terminology
* Classes: a set of m known classes of objects; we might have (a) a known description for each class, or (b) a set of samples for each class.
* Reject class: a generic class for objects not in any of the designated known classes.
* Classifier: assigns an object to a class based on its features.

Slide 5: Classification paradigms [figure]

Slide 6: Discriminant Functions
* Functions f(x, K) perform some computation on the feature vector x.
* The knowledge K comes from training or programming.
* A final stage determines the class.
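A minimal sketch of this pipeline, assuming linear discriminants (the slide does not fix the form of f): here K is a weight matrix and bias obtained beforehand, and the final stage is an argmax with an optional reject threshold.

```python
import numpy as np

def classify(x, W, b, classes, reject_threshold=None):
    """Evaluate linear discriminants f_i(x, K) = W[i] @ x + b[i],
    then let the final stage pick the best class (or reject)."""
    scores = W @ x + b
    i = int(np.argmax(scores))
    if reject_threshold is not None and scores[i] < reject_threshold:
        return "reject"
    return classes[i]
```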

Slide 7: Decision-Tree Classifier
* Uses subsets of the features in sequence.
* Feature extraction may be interleaved with the classification decisions.
* Can be easy to design and efficient in execution (see the sketch after the next slide's example tree).

Slide 8: Decision Trees [figure: an example tree classifying the characters '-', '/', '1', 'x', 'w', '0', 'A', '8', 'B'; it tests #holes (0, 1, or 2) at the root, then #strokes, best axis direction (0, 60, or 90 degrees), and moment of inertia against a threshold t along the branches]
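The following Python sketch reconstructs a tree of this shape. The branch-to-character assignments and the threshold t are assumptions read off the garbled figure, not a verified copy of it, and the feature extractors (count_holes, count_strokes, best_axis, moment) are hypothetical callables supplied by the caller. Note that features are computed only along the path actually taken, i.e., extraction is interleaved with the decisions.

```python
def classify_char(img, count_holes, count_strokes, best_axis, moment, t=1.0):
    holes = count_holes(img)
    if holes == 0:
        s = count_strokes(img)
        if s == 1:                       # a single stroke: '-', '/', or '1'
            return {0: "-", 60: "/", 90: "1"}.get(best_axis(img), "reject")
        return {2: "x", 4: "w"}.get(s, "reject")
    if holes == 1:                       # '0' or 'A'
        return "0" if count_strokes(img) == 0 else "A"
    if holes == 2:                       # '8' or 'B'
        return "B" if moment(img) < t else "8"
    return "reject"
```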

Slide 9: Classification Using the Nearest Class Mean
* Compute the Euclidean distance between the feature vector X and the mean of each class.
* Choose the closest class, if it is close enough (reject otherwise).
* [figure: the example at left has a low error rate]
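A minimal sketch of the rule with a reject option (the threshold value is an assumption; training amounts to averaging each class's sample vectors).

```python
import numpy as np

def nearest_mean(x, means, classes, max_dist=np.inf):
    """means: one row per class (the average of that class's samples)."""
    d = np.linalg.norm(means - x, axis=1)   # Euclidean distance to each mean
    i = int(np.argmin(d))
    return classes[i] if d[i] <= max_dist else "reject"
```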

Slide 10: Nearest Mean Can Fail on Complex Structure
* The nearest mean might yield poor results when a class has complex structure.
* In the example, class 2 has two modes.
* If the modes are detected, two subclass mean vectors can be used in place of one.
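The slides do not say how modes are detected; one common choice is k-means clustering within each class, as in this sketch, whose output plugs straight into the nearest_mean routine above.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def subclass_means(samples_by_class, k=2):
    """Split each class into k subclasses; return subclass means and labels."""
    means, labels = [], []
    for c, X in samples_by_class.items():
        centers, _ = kmeans2(X, k, minit="++", seed=0)
        means.append(centers)
        labels += [c] * k
    return np.vstack(means), labels
```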

Slide 11: Scaling coordinates by standard deviation [figure]
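A minimal sketch of the scaling, assuming the means and standard deviations are estimated from the training set.

```python
import numpy as np

def standardize(X_train, X):
    """Scale each coordinate by its training-set standard deviation."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma[sigma == 0] = 1.0              # guard against constant features
    return (X - mu) / sigma
```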

Slide 12: Another Problem for Nearest-Mean Classification
* Unscaled, object X is equidistant from each class mean.
* With scaling, X is closer to the left distribution.
* The coordinate axes are not natural for this data; one-dimensional discrimination is possible after PCA.
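A minimal PCA sketch via the SVD (one of several equivalent formulations): project the centered data onto its first principal component to obtain the one-dimensional discriminating coordinate.

```python
import numpy as np

def pca_1d(X):
    """Coordinates of each row of X along the first principal component."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[0]
```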

Slide 13: Receiver Operating Curve (ROC)
* Plots the correct-detection rate versus the false-alarm rate.
* Generally, false alarms go up with attempts to detect a higher percentage of the known objects.
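A sketch of how such a curve can be traced, under an assumption about the setup: the detector emits a score per object, and we sweep a detection threshold over the observed scores.

```python
import numpy as np

def roc_points(scores, labels):
    """labels: 1 = known object, 0 = other. Returns (fpr, tpr) pairs."""
    pts = []
    for t in np.sort(np.unique(scores))[::-1]:
        detected = scores >= t
        tpr = detected[labels == 1].mean()   # correct-detection rate
        fpr = detected[labels == 0].mean()   # false-alarm rate
        pts.append((fpr, tpr))
    return pts
```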

Slide 14: The confusion matrix shows empirical performance [table]
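A minimal sketch of how such a matrix is tallied: entry (i, j) counts test objects of true class i that the classifier assigned to class j, so correct decisions sit on the diagonal.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, m):
    """m known classes, coded as integers 0..m-1."""
    C = np.zeros((m, m), dtype=int)
    for t, p in zip(y_true, y_pred):
        C[t, p] += 1
    return C
```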

Slide 15: Bayesian decision-making [figure]

Slide 16: Normal Distribution
* The standard normal has zero mean and unit standard deviation.
* A table of the standard normal enables us to fit histograms and represent them simply.
* A new observation of the variable x can then be translated into a probability.
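A sketch of that translation, replacing the printed table with math.erf (which computes the same standard-normal integral): standardize the observation with the fitted mean and standard deviation, then evaluate the standard-normal CDF.

```python
import math

def std_normal_cdf(z):
    """P(Z <= z) for the zero-mean, unit-std-deviation normal."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_at_most(x, mu, sigma):
    """Translate an observation x of a fitted normal into P(X <= x)."""
    return std_normal_cdf((x - mu) / sigma)
```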

Slide 17: Parametric models can be used [figure]

Slide 18: Cherry with Bruise
* Intensities measured at about 750 nanometers wavelength.
* Some overlap between the two classes is caused by the cherry surface turning away.

