Slide 1

Quadratic boundaries in N-N classifiers with dissimilarity-based representations

Yo Horikawa
Faculty of Engineering, Kagawa University, Takamatsu, Japan

Nearest neighbor (N-N) classifiers with dissimilarity-based representations produce quadratic decision boundaries. The dissimilarity-based representations are effective for classifying high-dimensional patterns whose within-class variances differ.
Slide 2
Dissimilarity-based representations
[Figure: a pattern x in the feature space, together with the prototypes x1, x2, …, xm, is mapped into the dissimilarity space by d(x) = (d(x, x1), d(x, x2), …, d(x, xm)).]
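The mapping above can be sketched in code. A minimal example, assuming the dissimilarity measure d is the Euclidean distance (the function name is illustrative, not from the slides):

```python
import numpy as np

def dissimilarity_representation(x, prototypes):
    """Map a feature vector x to d(x) = (d(x, x1), ..., d(x, xm)),
    the vector of distances from x to the m prototypes."""
    x = np.asarray(x, dtype=float)
    return np.array([np.linalg.norm(x - p) for p in prototypes])

# Two prototypes x1, x2 in a 2-D feature space
prototypes = np.array([[0.0, 0.0], [1.0, 0.0]])
d = dissimilarity_representation([0.0, 1.0], prototypes)
# d[0] = ||x - x1|| = 1, d[1] = ||x - x2|| = sqrt(2)
```

Classification then proceeds with any standard method (here, N-N) applied to the vectors d(x) instead of the original features.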
Slide 3
EX. 1

Prototypes (feature space → dissimilarity space):
x1 = (0, 0) ∈ C1, d1 = (0, 1, 1)
x2 = (1, 0) ∈ C2, d2 = (1, 0, 0)
x3 = (1, 0) ∈ C2, d3 = (1, 0, 0)

For a pattern x = (x, y), the dissimilarity representation is
d = (√(x² + y²), √((x−1)² + y²), √((x−1)² + y²)).

The N-N boundary in the feature space is a linear discriminant function. The N-N boundary in the dissimilarity space is
h(x, y) = d(d, d1)² − d(d, d2)² = −4√(x² − 2x + y² + 1) + 2√(x² + y²) + 1 = 0,
a quadratic discriminant function.
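The closed form of h(x, y) above can be checked numerically against the squared-distance difference in the dissimilarity space; a quick sketch assuming NumPy:

```python
import numpy as np

protos = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 0.0]])  # x1 in C1; x2 = x3 in C2

def diss(p):
    """Dissimilarity representation: Euclidean distances to the prototypes."""
    return np.linalg.norm(protos - np.asarray(p, dtype=float), axis=1)

d1, d2 = diss(protos[0]), diss(protos[1])  # d1 = (0, 1, 1), d2 = (1, 0, 0)

def h_nn(x, y):
    """d(d, d1)^2 - d(d, d2)^2 computed in the dissimilarity space."""
    d = diss([x, y])
    return np.sum((d - d1) ** 2) - np.sum((d - d2) ** 2)

def h_closed(x, y):
    """Closed form from the slide."""
    return -4 * np.sqrt(x**2 - 2*x + y**2 + 1) + 2 * np.sqrt(x**2 + y**2) + 1
```

The two functions agree everywhere; e.g. both vanish at (0.5, 0), the point where the feature-space boundary x = 0.5 crosses the x-axis.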
Slide 4
EX. 2

Prototypes:
x1 = (−2, −2) ∈ C1, x3 = (1, 0) ∈ C2
x2 = (0, 0) ∈ C1, x4 = (1.5, 0.5) ∈ C2

Decision boundary of the N-N classifier in the dissimilarity space:
min(d(d, d1)², d(d, d2)²) − min(d(d, d3)², d(d, d4)²) = 0,
a combination of quadratic curves.

In general, the decision boundary obtained with the dissimilarity-based representations surrounds the prototypes of the class with the smaller variance. The representations are therefore effective when the within-class variances differ.
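The min-based decision function above can be written directly; a sketch in which the sign convention (negative means C1) is a choice of this sketch:

```python
import numpy as np

protos = np.array([[-2.0, -2.0], [0.0, 0.0],   # x1, x2 in C1
                   [1.0, 0.0], [1.5, 0.5]])    # x3, x4 in C2

def diss(x):
    """Euclidean distances from x to the four prototypes."""
    return np.linalg.norm(protos - np.asarray(x, dtype=float), axis=1)

d1, d2, d3, d4 = (diss(p) for p in protos)

def g(x):
    """min(d(d,d1)^2, d(d,d2)^2) - min(d(d,d3)^2, d(d,d4)^2);
    g(x) < 0 assigns x to C1, g(x) > 0 to C2."""
    d = diss(x)
    to_c1 = min(np.sum((d - d1) ** 2), np.sum((d - d2) ** 2))
    to_c2 = min(np.sum((d - d3) ** 2), np.sum((d - d4) ** 2))
    return to_c1 - to_c2
```

Each prototype is mapped onto its own dissimilarity vector at zero distance, so g is negative at x1 and x2 and positive at x3 and x4, and the zero set g = 0 is the piecewise-quadratic boundary of the slide.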
Slide 5
Effects of the dimensionality of the patterns
EX. 1

Correct classification ratio of C1 with the N-N classifier.
Original patterns: x1, x2 ∈ C1 (Un(−0.5, 0.5)); x3, x4 (= 0) ∈ C2 (δn(0)).
Dissimilarity representations:
d1 = (0, |x1 − x2|, |x1|, |x1|)
d2 = (|x1 − x2|, 0, |x2|, |x2|)
d3 (= d4) = (|x1|, |x2|, 0, 0)

Fig. A1. Correct classification ratio for n-dimensional data C1 (Un(−0.5, 0.5)) and C2 (δn(0)) with the dissimilarity representations and the original patterns.
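The experiment behind Fig. A1 can be reproduced in outline. A hedged sketch: the sample size and the seed are choices of this sketch, not values from the figure:

```python
import numpy as np

rng = np.random.default_rng(0)

def correct_ratio(n, n_test=200):
    """EX. 1 sketch: x1, x2 drawn from C1 = Un(-0.5, 0.5), x3 = x4 = 0
    for C2 = delta_n(0); classify fresh C1 samples by 1-NN using the
    original patterns and the dissimilarity representations."""
    x1, x2 = rng.uniform(-0.5, 0.5, (2, n))
    protos = np.stack([x1, x2, np.zeros(n), np.zeros(n)])
    labels = np.array([0, 0, 1, 1])              # 0 -> C1, 1 -> C2

    def diss(x):
        return np.linalg.norm(protos - x, axis=1)

    d_protos = np.stack([diss(p) for p in protos])
    tests = rng.uniform(-0.5, 0.5, (n_test, n))  # C1 test patterns

    ok_orig = ok_diss = 0
    for x in tests:
        ok_orig += labels[np.argmin(diss(x))] == 0
        d = diss(x)
        ok_diss += labels[np.argmin(np.linalg.norm(d_protos - d, axis=1))] == 0
    return ok_orig / n_test, ok_diss / n_test
```

For large n the ratio with the original patterns collapses (a C1 sample is typically closer to the origin than to x1 or x2, since E|x|² = n/12 but E|x − x1|² = n/6), while the dissimilarity representations keep classifying correctly, as in Fig. A1.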
Slide 6
EX. 2

Correct classification ratio of C1 with the N-N classifier.
Original patterns: 10 prototypes ∈ C1 (Nn(0, 1²)); 10 prototypes ∈ C2 (Nn(0.5, 0.5²)).
Dissimilarity representations: a 20-dimensional dissimilarity space.

Fig. A2. Correct classification ratio for n-dimensional data C1 (Nn(0, 1²)) and C2 (Nn(0.5, 0.5²)) with the dissimilarity representations and the original patterns.
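The setting of Fig. A2 can be sketched the same way, with 10 random prototypes per class and a 20-dimensional dissimilarity space (the sample size and seed are again choices of the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)

def correct_ratio_c1(n, m=10, n_test=200):
    """EX. 2 sketch: m prototypes each from C1 = Nn(0, 1^2) and
    C2 = Nn(0.5, 0.5^2); 1-NN on fresh C1 samples in the original
    space and in the 2m-dimensional dissimilarity space."""
    protos = np.vstack([rng.normal(0.0, 1.0, (m, n)),
                        rng.normal(0.5, 0.5, (m, n))])
    labels = np.r_[np.zeros(m, dtype=int), np.ones(m, dtype=int)]
    d_protos = np.stack([np.linalg.norm(protos - p, axis=1) for p in protos])

    tests = rng.normal(0.0, 1.0, (n_test, n))    # C1 test patterns
    ok_orig = ok_diss = 0
    for x in tests:
        d = np.linalg.norm(protos - x, axis=1)
        ok_orig += labels[np.argmin(d)] == 0
        ok_diss += labels[np.argmin(np.linalg.norm(d_protos - d, axis=1))] == 0
    return ok_orig / n_test, ok_diss / n_test
```

In high dimensions a C1 sample is on average closer to the C2 prototypes than to the other C1 prototypes (E|x − p2|² = 1.5n versus E|x − p1|² = 2n), so the original-space N-N ratio drops, while the dissimilarity representation stays effective, consistent with Fig. A2.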
Slide 7
Texture classification
The dissimilarity-based representations are applied to the classification of texture images with bispectrum-based features. For each of Bark.0000 and Bark.0001, 5 prototypes are generated by random shifts in [−10, 10], rotations in [0°, 360°], scalings in [×0.5, ×1.0], and additive noise N(0, 100²). For each image, 100 test patterns subjected to the same kinds of random transformations and noise are classified with the N-N method, using the bispectrum-based invariant features (dimensionality 108) and their dissimilarity representations.

Correct classification ratio over ten trials:
0.82 with the original features
0.99 with the dissimilarity representations

(a) Bark.0000 (b) Bark.0001
Fig. 3. Texture image data in VisTex [12].
Slide 8
Summary
The dissimilarity-based representations make the decision boundaries of the N-N classifiers quadratic, close to those of the optimal Bayes rule when the patterns are normally distributed with different within-class variances. Further, the dissimilarity-based representations are effective for high-dimensional patterns with a small number of prototypes. This is attributed to the fact that the quadratic decision boundaries correctly divide the regions opposite to the prototypes, which are dominant in a high-dimensional space.

Dissimilarity-based representation → Quadratic boundary → Effective for high-dimensional patterns