1
Pattern Classification
All materials in these slides were taken from Pattern Classification (2nd ed.) by R. O. Duda, P. E. Hart, and D. G. Stork, John Wiley & Sons, with the permission of the authors and the publisher.
2
Chapter 4: Nonparametric Techniques (Sections 1-6)
Introduction
Density Estimation
Parzen Windows
k_n-Nearest-Neighbor Estimation
The Nearest-Neighbor Rule
Metrics and Nearest-Neighbor Classification
3
1. Introduction
The classical parametric densities are unimodal (they have a single local maximum), whereas many practical problems involve multimodal densities.
Nonparametric procedures can be used with arbitrary distributions, without assuming that the forms of the underlying densities are known.
There are two types of nonparametric methods:
- Estimate the class-conditional densities p(x | ωj) without assuming a parametric model: Parzen windows
- Bypass density estimation and directly estimate the posteriors P(ωj | x): k-nearest neighbor (kNN)
Pattern Classification, Ch4
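To make the first approach concrete, here is a minimal sketch of a Parzen-window density estimate with a Gaussian window, p_n(x) = (1/n) Σ_i (1/h^d) φ((x - x_i)/h); the function name, window width h, and toy data below are illustrative, not from the slides.

```python
import numpy as np

def parzen_estimate(x, samples, h):
    """Parzen-window density estimate at point x with a Gaussian window:
    p_n(x) = (1/n) * sum_i (1/h^d) * phi((x - x_i) / h)."""
    samples = np.atleast_2d(samples)              # shape (n, d)
    n, d = samples.shape
    u = (x - samples) / h                         # scaled offsets to each sample
    phi = np.exp(-0.5 * np.sum(u**2, axis=1)) / (2 * np.pi) ** (d / 2)
    return phi.sum() / (n * h**d)

# Toy usage: a bimodal 1-D density that no unimodal parametric model fits well
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2, 0.5, 100),
                          rng.normal(2, 0.5, 100)]).reshape(-1, 1)
print(parzen_estimate(np.array([2.0]), samples, h=0.5))   # high density near a mode
print(parzen_estimate(np.array([0.0]), samples, h=0.5))   # low density between modes
```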
4
[Figure: Parzen-window and kNN density estimates] Pattern Classification, Ch4
5
The Nearest-Neighbor Rule
Let Dn = {x1, x2, …, xn} be a set of n labeled prototypes.
Let x′ ∈ Dn be the prototype closest to a test point x; the nearest-neighbor rule for classifying x is then to assign it the label associated with x′ (a code sketch follows this slide).
The nearest-neighbor rule leads to an error rate greater than the minimum possible, the Bayes rate.
However, if the number of prototypes is large (unlimited), the error rate of the nearest-neighbor classifier is never worse than twice the Bayes rate (this can be proved).
As n → ∞, it is always possible to find an x′ sufficiently close to x that P(ωi | x′) ≈ P(ωi | x).
If P(ωm | x) ≈ 1, then the nearest-neighbor selection is almost always the same as the Bayes selection.
Pattern Classification, Ch4
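A minimal sketch of the rule just stated, assuming a Euclidean metric (the role of the metric is the subject of Section 6); function and variable names are illustrative.

```python
import numpy as np

def nearest_neighbor_classify(x, prototypes, labels):
    """Nearest-neighbor rule: assign x the label of the closest
    labeled prototype x' in D_n (Euclidean metric assumed)."""
    dists = np.linalg.norm(prototypes - x, axis=1)  # distance from x to each prototype
    return labels[np.argmin(dists)]                 # label of the nearest prototype

# Toy usage: four labeled prototypes from two classes in 2-D
D_n = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y   = np.array([0, 0, 1, 1])
print(nearest_neighbor_classify(np.array([0.5, 0.2]), D_n, y))  # -> 0
print(nearest_neighbor_classify(np.array([5.4, 4.9]), D_n, y))  # -> 1
```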
6
The k-nearest-neighbor rule
Goal: classify x by assigning it the label most frequently represented among its k nearest samples, i.e., by a majority vote among those samples (a code sketch follows this slide).
k is usually chosen odd so that there are no voting ties in the two-class case.
Pattern Classification, Ch4
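A minimal sketch of the voting scheme just described, again assuming a Euclidean metric; names and toy data are illustrative.

```python
import numpy as np
from collections import Counter

def knn_classify(x, samples, labels, k=3):
    """k-nearest-neighbor rule: majority vote among the k samples
    closest to x; an odd k avoids ties in the two-class case."""
    dists = np.linalg.norm(samples - x, axis=1)
    nearest = np.argsort(dists)[:k]                  # indices of the k closest samples
    votes = Counter(labels[i] for i in nearest)      # tally their labels
    return votes.most_common(1)[0][0]

# Toy usage: six labeled samples from two classes, k = 3
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 0.0],
              [5.0, 5.0], [6.0, 5.0], [5.5, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_classify(np.array([0.7, 0.3]), X, y, k=3))  # -> 0
```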
7-10
[Figure-only slides; images not included in this transcript] Pattern Classification, Ch4
11
Step-by-step algorithm for finding the nearest-neighbor class decision regions and decision boundaries in 2D (a code sketch follows the list):
1. Find the midpoints between all pairs of points.
2. Find the perpendicular bisectors of the lines between all pairs of points (they pass through the midpoints found in step 1).
3. Find the point regions: the region surrounding each point consisting of everything closer to that point than to any other. Each region is outlined by the perpendicular-bisector segments that are perpendicular to the shortest line from the point to the segment. These regions are called Voronoi cells.
4. Merge adjoining point regions of the same class (e.g., a two-class problem of dog versus cat) to obtain the class decision regions; any point falling into a region is assigned to the class of that region. This is done by eliminating the boundary lines (perpendicular-bisector segments) between points of the same class. The resulting connected line segments delimiting the decision regions are called the decision boundaries.
Pattern Classification, Ch4
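Steps 1-3 construct exactly a Voronoi tessellation, so one way to sketch the procedure, assuming SciPy is available, is to let scipy.spatial.Voronoi compute the cells and then apply step 4 by keeping only the ridges (perpendicular-bisector segments) that separate points of different classes; the toy points and labels are illustrative.

```python
import numpy as np
from scipy.spatial import Voronoi

# Labeled 2-D prototypes for a toy two-class (dog vs. cat) problem
points = np.array([[0, 0], [1, 2], [2, 0], [3, 3], [4, 1]], dtype=float)
labels = np.array([0, 0, 0, 1, 1])

vor = Voronoi(points)  # steps 1-3: midpoints, bisectors, Voronoi cells

# Step 4: a ridge is a decision-boundary segment only if it separates
# points of different classes; ridges between same-class points are
# eliminated, which merges their cells into one decision region.
boundary = [(int(p), int(q))
            for p, q in vor.ridge_points
            if labels[p] != labels[q]]
print("decision-boundary ridges (pairs of point indices):", boundary)
```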
12
[Figure-only slide; image not included in this transcript] Pattern Classification, Ch4