1
Pattern Recognition
April 19, 2007
Suggested Reading: Horn, Chapter 14
2
Class and Label
A class is a particular "pattern" that one wants to detect in the input data, for example circles, squares, or ellipses. The set of classes defines the scope of the pattern recognition problem.
A labeling (of a data point) is the assignment of that data point to one of the known classes.
3
Pattern Recognition
Extract meaningful measurements from data points to form feature vectors. For face recognition, each pixel intensity can be considered one measurement.
The association data point → feature vector allows us to work directly with vectors in a vector space (mathematically tractable).
Second important ingredient: how to compare features, i.e., we need (good) metrics.
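As a minimal sketch of this first step (not from the slides; the function name and image size are illustrative assumptions), a grayscale face image can be flattened into a feature vector in NumPy:

```python
import numpy as np

# Illustrative sketch: an (h, w) grayscale image becomes a feature vector
# by stacking its pixel intensities into a single length h*w vector.
def image_to_feature_vector(image: np.ndarray) -> np.ndarray:
    """Flatten an (h, w) intensity image into a length h*w feature vector."""
    return image.astype(np.float64).ravel()

# Example (sizes assumed): a 64x64 image yields a vector in R^4096.
image = np.random.rand(64, 64)
x = image_to_feature_vector(image)
print(x.shape)  # (4096,)
```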
4
Geometry in High Dimension
Typically, this provides us with a collection of vectors in some high-dimensional vector space.
The L2 norm between two vectors x and y:
\(\|x - y\|_2 = \left( \sum_{i=1}^{d} (x_i - y_i)^2 \right)^{1/2}\)
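For concreteness, a short sketch of this distance in NumPy (the function name is an assumption; np.linalg.norm computes the same quantity):

```python
import numpy as np

# Sketch: the L2 (Euclidean) distance between two feature vectors,
# matching the norm defined on this slide.
def l2_distance(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.sqrt(np.sum((x - y) ** 2)))

# Equivalent built-in: np.linalg.norm(x - y)
```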
5
Types of Problems
Supervised learning problem: labeled training data are given.
Unsupervised learning problem: unlabeled data are given.
6
Supervised Learning Problem
[Figure: a decision boundary separating labeled training classes, with a test sample to be classified.]
The goal is generalization from the training data to unseen data.
A classifier is an algorithm that, given labeled training data, assigns a class to a test sample.
7
Nearest-Neighbor Classifier
Let \(x_{i,j}\) denote the training data, where i indexes the class and j the member within that class.
A sample x is assigned class k if its nearest training sample belongs to class k, i.e., for all i and j:
\(\min_{j'} \|x - x_{k,j'}\|_2 \le \|x - x_{i,j}\|_2\)
Drawback: all training samples must be stored and all distances to them computed (expensive for high-dimensional data).
Advantage: easy to understand and implement.
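A minimal sketch of the nearest-neighbor rule above, assuming the training data are stacked as a NumPy array train_X with labels train_y (names and shapes are illustrative, not from the slides):

```python
import numpy as np

# Sketch of the 1-nearest-neighbor rule.
# train_X: (n, d) array of training feature vectors; train_y: (n,) class labels.
def nearest_neighbor_classify(x, train_X, train_y):
    """Assign x the label of its closest training sample (L2 distance)."""
    distances = np.linalg.norm(train_X - x, axis=1)  # distance from x to every sample
    return train_y[np.argmin(distances)]
```

Note the drawback from the slide is visible here: every call recomputes all n distances and the full training set stays in memory.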
8
Nearest-Centroid Classifier
Each class is now represented by its centroid \(c_i = \frac{1}{n_i} \sum_{j=1}^{n_i} x_{i,j}\), and each test data point is compared against the centroids only.
Good for data in which each class forms a nice round cluster; there is a connection with the Gaussian distribution (the rule is optimal when each class is an isotropic Gaussian with the same covariance).
A sample x is assigned class k if, for all i:
\(\|x - c_k\|_2 \le \|x - c_i\|_2\)
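A sketch of the nearest-centroid rule under the same assumed data layout (train_X of shape (n, d), labels train_y); the helper names are illustrative:

```python
import numpy as np

# Sketch: precompute one centroid per class, then compare a test point
# against the centroids only (cheaper than nearest-neighbor at test time).
def fit_centroids(train_X, train_y):
    classes = np.unique(train_y)
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_classify(x, classes, centroids):
    """Assign x the label of the closest class centroid (L2 distance)."""
    return classes[np.argmin(np.linalg.norm(centroids - x, axis=1))]
```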
9
PCA-Based Classifiers
Data from each class form (approximately) a linear subspace.
Example 1: face images of one person under different lighting conditions. Assuming Lambertian reflectance for the faces (e.g., ignoring the oily forehead, etc.), the data lie close to a low-dimensional subspace.
10
PCA-Based Classifier
Each class can now be represented by a linear subspace (instead of just its centroid), and the test data point is compared against the subspaces.
A sample x is assigned class k if, for all i:
\(\|x - P_k x\|_2 \le \|x - P_i x\|_2\)
where \(P_i\) is the orthogonal projection onto the subspace of class i.
Widely used in face recognition and other pattern classification problems.
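A sketch of one way such a classifier could be implemented, assuming each class subspace is estimated from that class's (mean-centered) training data via SVD; the subspace dimension r and all function names are illustrative assumptions, not the slides' specific method:

```python
import numpy as np

# Sketch of a PCA (subspace) classifier: fit an r-dimensional subspace per
# class, then assign x to the class whose subspace reconstructs it best.
def fit_subspace(class_X, r):
    """Return the mean and top-r principal directions of one class's data."""
    mean = class_X.mean(axis=0)
    _, _, Vt = np.linalg.svd(class_X - mean, full_matrices=False)
    return mean, Vt[:r]                        # (d,) mean, (r, d) orthonormal rows

def subspace_residual(x, mean, basis):
    """Distance from x to the affine subspace mean + span(basis rows)."""
    centered = x - mean
    projection = basis.T @ (basis @ centered)  # orthogonal projection onto the subspace
    return np.linalg.norm(centered - projection)

def pca_classify(x, models):
    """models: list of (mean, basis) pairs, one per class; return best class index."""
    return int(np.argmin([subspace_residual(x, m, B) for m, B in models]))
```

At test time, x goes to the class whose subspace leaves the smallest reconstruction residual, which matches the decision rule stated on this slide.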