Published by Christine Burke. Modified over 9 years ago.
1 Pattern Recognition Concepts
How should objects be represented?
Algorithms for recognition/matching:
* nearest neighbors
* decision trees
* decision functions
* artificial neural networks
How should learning/training be done?
2 Feature Vector Representation
X = [x1, x2, …, xn], each xj a real number
xj may be an object measurement
xj may be a count of object parts
Example: object represented as [#holes, area, moments, …]
A moment is, loosely speaking, a quantitative measure of the shape of a set of points. The second moment, for example, is widely used and measures the "width" of a set of points in one dimension.
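As an illustration of such a feature vector, here is a minimal sketch; the `shape_features` name, the use of pixel count as a crude area, and the example point set are assumptions for this illustration, not from the slides:

```python
def shape_features(points):
    # Toy feature vector for a set of 2-D pixel coordinates:
    # [area, second moment in x, second moment in y].
    # Pixel count stands in for area; the second central moments
    # measure the "width" of the point set along each axis.
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    m2x = sum((x - cx) ** 2 for x, _ in points) / n
    m2y = sum((y - cy) ** 2 for _, y in points) / n
    return [n, m2x, m2y]

# A 3x3 block of pixels: area 9, equal spread in x and y.
block = [(x, y) for x in range(3) for y in range(3)]
print(shape_features(block))
```

An elongated shape would give a larger second moment along its long axis than along its short one, which is what makes these moments useful features.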
3 Possible features for character recognition
Inertia is the resistance of any physical object to a change in its state of motion or rest, or the tendency of an object to resist any change in its motion.
Strokes are the individual pen movements required to draw a character.
4 Some Terminology
Classes: a set of m known classes of objects
(a) might have a known description for each
(b) might have a set of samples for each
Reject class: a generic class for objects not in any of the designated known classes
Classifier: assigns an object to a class based on its features
5 Classification paradigms
6 Discriminant functions
Functions f(x, K) perform some computation on the feature vector x
Knowledge K from training or programming is used
The final stage determines the class
7 Decision-Tree Classifier
Uses subsets of features in sequence
Feature extraction may be interleaved with classification decisions
Can be easy to design and efficient in execution
8 Decision Trees
[Figure: example decision trees for character recognition, branching on #holes, moment of inertia, #strokes, and best axis direction to distinguish characters such as -, /, 1, x, w, 0, A, 8, B]
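The flow of such a tree can be sketched as nested feature tests; the branch order, the stroke counts, and the handful of characters below are illustrative assumptions, not the actual tree from the slide's figure:

```python
def classify_char(holes, strokes):
    # Toy decision tree over two character features.
    # First split on the number of holes, then on stroke count;
    # later features are only examined on the branches that need them.
    if holes == 0:
        return '/' if strokes == 1 else 'X'  # one stroke vs. crossing strokes
    if holes == 1:
        return 'A' if strokes == 3 else '0'  # assume 'A' takes three strokes
    return 'B' if strokes == 2 else '8'      # two or more holes

print(classify_char(1, 3))  # a one-hole, three-stroke character
```

Note how a character with no holes never has its stroke-direction features examined: this interleaving of feature extraction with decisions is what makes tree classifiers efficient.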
9 Classification using nearest class mean
Compute the Euclidean distance between feature vector X and the mean of each class
Choose the closest class, if close enough (reject otherwise)
[Figure: low error rate for the class at left]
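A minimal sketch of this rule; the class labels, the example means, and the reject threshold are assumptions for illustration:

```python
import math

def nearest_mean(x, class_means, reject_dist=None):
    # Assign x to the class whose mean is closest in Euclidean
    # distance; reject (return None) if even the closest mean
    # is farther away than reject_dist.
    best_label, best_d = None, float('inf')
    for label, mean in class_means.items():
        d = math.dist(x, mean)
        if d < best_d:
            best_label, best_d = label, d
    if reject_dist is not None and best_d > reject_dist:
        return None
    return best_label

means = {'class1': (0.0, 0.0), 'class2': (4.0, 4.0)}
print(nearest_mean((1.0, 0.5), means))         # near class1's mean
print(nearest_mean((10.0, 10.0), means, 3.0))  # too far from both: reject
```

The reject threshold implements the "if close enough" clause: without it, every feature vector is forced into one of the known classes.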
10 Nearest mean might yield poor results with complex structure
Class 2 has two modes
If the modes are detected, two subclass mean vectors can be used
11 Scaling coordinates by standard deviation
12 Another problem for nearest mean classification
If unscaled, object X is equidistant from each class mean
With scaling, X is closer to the left distribution
The coordinate axes are not natural for this data
1-D discrimination is possible with PCA
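The scaling step can be sketched as per-axis standardization; the function name and the sample data below are assumptions for this illustration:

```python
def standardize(samples):
    # Scale each coordinate to zero mean and unit (population)
    # standard deviation, so distance comparisons are not
    # dominated by the axis with the largest spread.
    n = len(samples)
    dims = len(samples[0])
    means = [sum(s[d] for s in samples) / n for d in range(dims)]
    stds = [(sum((s[d] - means[d]) ** 2 for s in samples) / n) ** 0.5
            for d in range(dims)]
    return [[(s[d] - means[d]) / stds[d] for d in range(dims)]
            for s in samples]

# x spans hundreds of units while y spans only a few: unscaled,
# Euclidean distance would be dominated almost entirely by x.
data = [[0.0, 0.0], [100.0, 1.0], [200.0, 2.0], [300.0, 3.0]]
scaled = standardize(data)
print(scaled)
```

After scaling, each axis contributes comparably to the distance, which is what makes object X land closer to the correct class mean in the slide's example.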
13 Receiver Operating Characteristic (ROC) curve
Plots the correct detection rate versus the false alarm rate
Generally, false alarms go up with attempts to detect higher percentages of known objects
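The trade-off can be sketched by sweeping a decision threshold over classifier scores; the scores and thresholds below are made-up illustrative values:

```python
def roc_points(scores_pos, scores_neg, thresholds):
    # For each threshold t, declare "detected" when score >= t and
    # report (false alarm rate, correct detection rate).
    points = []
    for t in thresholds:
        detection = sum(s >= t for s in scores_pos) / len(scores_pos)
        false_alarm = sum(s >= t for s in scores_neg) / len(scores_neg)
        points.append((false_alarm, detection))
    return points

pos = [0.9, 0.8, 0.4]   # scores for true objects
neg = [0.5, 0.3, 0.1]   # scores for non-objects
print(roc_points(pos, neg, [1.0, 0.6, 0.0]))
```

Lowering the threshold catches more true objects but also lets more non-objects through, which is exactly the rising curve the slide describes.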
14 Confusion matrix shows empirical performance
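A confusion matrix can be built by counting (true class, predicted class) pairs; the labels and data below are illustrative assumptions:

```python
def confusion_matrix(true_labels, pred_labels, classes):
    # Entry [i][j] counts objects of true class i that the
    # classifier assigned to class j; the diagonal holds the
    # correct decisions, off-diagonal entries the confusions.
    index = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for t, p in zip(true_labels, pred_labels):
        m[index[t]][index[p]] += 1
    return m

true_c = ['A', 'A', 'B', 'B', 'B']
pred_c = ['A', 'B', 'B', 'B', 'A']
print(confusion_matrix(true_c, pred_c, ['A', 'B']))  # [[1, 1], [1, 2]]
```

Large off-diagonal entries point at specific class pairs the classifier mixes up, which is more informative than a single overall error rate.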
15 Bayesian decision-making
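One common form of this, chosen here as an illustration since the slide gives only the title: assign x to the class maximizing prior times likelihood, with Gaussian class-conditional densities. All labels, priors, and parameters below are assumptions:

```python
import math

def gauss_pdf(x, mean, std):
    # Density of a 1-D normal distribution at x.
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def bayes_classify(x, models):
    # models: {label: (prior, mean, std)}.
    # Choose the label maximizing prior * likelihood, which is
    # proportional to the posterior probability of the class.
    return max(models,
               key=lambda c: models[c][0] * gauss_pdf(x, *models[c][1:]))

models = {'class1': (0.5, 0.0, 1.0), 'class2': (0.5, 4.0, 1.0)}
print(bayes_classify(1.0, models))
```

With equal priors and equal spreads this reduces to nearest class mean; unequal priors shift the decision boundary toward the rarer class.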
16 Normal distribution
Zero mean and unit standard deviation
The table enables us to fit histograms and represent them simply
A new observation of variable x can then be translated into a probability
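The table lookup can be sketched with the error function, which gives the standard normal CDF directly; the function names and example parameters are assumptions:

```python
import math

def standard_normal_cdf(z):
    # P(Z <= z) for a zero-mean, unit-std-deviation normal variable.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_below(x, mean, std):
    # Translate an observation x of a fitted normal into a
    # probability via its z-score, as the slide's table would.
    return standard_normal_cdf((x - mean) / std)

print(prob_below(0.0, 0.0, 1.0))  # 0.5 at the mean
```

Because only the z-score matters, one table (or one function) serves every normal distribution, which is why fitting a histogram with a mean and a standard deviation is such a compact representation.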
17 Parametric Models can be used