Kernel nearest means
Usman Roshan
Feature space transformation

Let Φ(x) be a feature space transformation. For example, if we are in a two-dimensional vector space and x = (x_1, x_2), then a standard choice is the degree-2 polynomial map

Φ(x) = (x_1², √2·x_1·x_2, x_2²)

whose inner products can be computed directly in the original space, since Φ(x)·Φ(z) = (x·z)².
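A minimal sketch of this map in Python, assuming the degree-2 polynomial example above (the helper names phi and poly_kernel are illustrative, not from the slides):

```python
import numpy as np

def phi(x):
    """Explicit degree-2 polynomial feature map for a 2-D input x = (x_1, x_2)."""
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

def poly_kernel(x, z):
    """Same inner product computed in the original space: (x . z)^2."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
# The two values agree (up to floating point): phi(x) . phi(z) == (x . z)^2.
print(np.dot(phi(x), phi(z)), poly_kernel(x, z))
```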
Computing Euclidean distances in a different feature space

The advantage of kernels is that we can compute Euclidean (and other) distances in different feature spaces without explicitly performing the feature space conversion.
Computing Euclidean distances in a different feature space

First note that the squared Euclidean distance between two vectors can be written as

||x − y||² = x·x + y·y − 2·x·y

In feature space we have

||Φ(x) − Φ(y)||² = Φ(x)·Φ(x) + Φ(y)·Φ(y) − 2·Φ(x)·Φ(y) = K(x,x) + K(y,y) − 2·K(x,y)

where K is the kernel matrix, i.e. K(x,y) = Φ(x)·Φ(y).
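To make the identity concrete, here is a small check (reusing the hypothetical phi and poly_kernel from the earlier sketch) that the kernel-only expression matches the distance computed explicitly in feature space:

```python
import numpy as np

def phi(x):
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

def poly_kernel(x, z):
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.5])

# Explicit: transform to feature space, then take the squared Euclidean distance.
explicit = np.linalg.norm(phi(x) - phi(y)) ** 2

# Kernel trick: never build phi, use K(x,x) + K(y,y) - 2 K(x,y).
implicit = poly_kernel(x, x) + poly_kernel(y, y) - 2 * poly_kernel(x, y)

print(explicit, implicit)  # both are 18.75 (up to floating point)
```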
Computing distance to mean in feature space

Recall that the mean of a class (say C_1) is given by

m_1 = (1/|C_1|) Σ_{x ∈ C_1} x

In feature space the mean Φ_m would be

Φ_m = (1/|C_1|) Σ_{x ∈ C_1} Φ(x)
Computing distance to mean in feature space

The squared distance from a point x to the class mean in feature space expands the same way:

||Φ(x) − Φ_m||² = K(x,x) + K(m,m) − 2·K(m,x)

where the two mean terms reduce to sums of kernel evaluations over the training points of the class:

K(m,m) = Φ_m·Φ_m = (1/|C_1|²) Σ_i Σ_j K(x_i, x_j)
K(m,x) = Φ_m·Φ(x) = (1/|C_1|) Σ_i K(x_i, x)
Replace K(m,m) and K(m,x) with the calculations from the previous slides, so the distance to a class mean uses only kernel evaluations on the original data:

||Φ(x) − Φ_m||² = K(x,x) + (1/|C_1|²) Σ_i Σ_j K(x_i, x_j) − (2/|C_1|) Σ_i K(x_i, x)
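A minimal sketch of these two quantities, again assuming the degree-2 polynomial kernel from the earlier sketches; class_points stands for the training points of one class (the names are illustrative):

```python
import numpy as np

def poly_kernel(x, z):
    return np.dot(x, z) ** 2

def k_mean_mean(class_points):
    """K(m,m) = (1/|C|^2) * sum_i sum_j K(x_i, x_j)."""
    n = len(class_points)
    return sum(poly_kernel(a, b) for a in class_points for b in class_points) / (n * n)

def k_mean_x(class_points, x):
    """K(m,x) = (1/|C|) * sum_i K(x_i, x)."""
    return sum(poly_kernel(a, x) for a in class_points) / len(class_points)

C1 = [np.array([1.0, 2.0]), np.array([2.0, 1.0]), np.array([1.5, 1.5])]
x = np.array([0.0, 1.0])

# Squared distance from x to the mean of C1 in feature space, via kernels only.
d2 = poly_kernel(x, x) + k_mean_mean(C1) - 2 * k_mean_x(C1, x)
print(d2)
```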
Kernel nearest means algorithm

Let x_i (i = 0..n−1) be the training datapoints and y_i (i = 0..n'−1) the test datapoints.
- Compute the kernel.
- For each class mean m_j compute K(m_j, m_j).
- For each datapoint y_i in the test set:
  - For each mean m_j compute d_j = K(m_j, m_j) + K(y_i, y_i) − 2·K(m_j, y_i)
  - Assign y_i to the class with the minimum d_j.
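Putting it together, a compact sketch of the algorithm under the same assumptions (degree-2 polynomial kernel; function and variable names are illustrative, not from the slides):

```python
import numpy as np

def poly_kernel(x, z):
    return np.dot(x, z) ** 2

def kernel_nearest_means(train_X, train_y, test_X, kernel=poly_kernel):
    """Assign each test point to the class whose feature-space mean is closest."""
    classes = sorted(set(train_y))
    by_class = {c: [x for x, lab in zip(train_X, train_y) if lab == c] for c in classes}

    # Precompute K(m_j, m_j) for every class mean.
    k_mm = {}
    for c, pts in by_class.items():
        n = len(pts)
        k_mm[c] = sum(kernel(a, b) for a in pts for b in pts) / (n * n)

    predictions = []
    for y in test_X:
        k_yy = kernel(y, y)
        best_class, best_d = None, None
        for c, pts in by_class.items():
            k_my = sum(kernel(a, y) for a in pts) / len(pts)
            d = k_mm[c] + k_yy - 2 * k_my  # squared distance to the class mean
            if best_d is None or d < best_d:
                best_class, best_d = c, d
        predictions.append(best_class)
    return predictions

# Tiny usage example: two classes in 2-D.
train_X = [np.array([1.0, 0.1]), np.array([0.9, -0.1]),
           np.array([0.1, 1.0]), np.array([-0.1, 0.9])]
train_y = [0, 0, 1, 1]
test_X = [np.array([1.1, 0.0]), np.array([0.0, 1.1])]
print(kernel_nearest_means(train_X, train_y, test_X))  # expected: [0, 1]
```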