Kernel nearest means
Usman Roshan
Feature space transformation

Let Φ(x) be a feature space transformation. For example, if we are in a two-dimensional vector space and x = (x_1, x_2), then

Φ(x) = (x_1², √2 x_1 x_2, x_2²)
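As a quick sanity check, the quadratic map above satisfies Φ(x)·Φ(z) = (x·z)², so the kernel can be evaluated without ever forming Φ. A minimal sketch (the function name `phi` is illustrative):

```python
import numpy as np

# Quadratic feature map for 2-d input: phi(x) = (x1^2, sqrt(2) x1 x2, x2^2)
def phi(x):
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

# Dot product in feature space equals the squared dot product in input space
lhs = phi(x) @ phi(z)   # explicit feature-space inner product
rhs = (x @ z) ** 2      # kernel evaluation K(x, z) = (x.z)^2
```

Both sides agree, which is exactly why the explicit conversion is unnecessary.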
Computing Euclidean distances in a different feature space

The advantage of kernels is that we can compute Euclidean (and other) distances in different feature spaces without explicitly performing the feature space conversion.
First note that the squared Euclidean distance between two vectors can be written as

||x − y||² = xᵀx + yᵀy − 2xᵀy

In feature space we have

||Φ(x) − Φ(y)||² = K(x,x) + K(y,y) − 2K(x,y)

where K is the kernel matrix.
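The identity above can be checked numerically: the kernel-only expression and the explicit feature-space distance agree. A sketch, assuming the quadratic kernel K(x,z) = (x·z)² and its standard feature map (both illustrative choices, not the only ones):

```python
import numpy as np

# Kernel and its explicit feature map
def K(a, b):
    return (a @ b) ** 2

def phi(v):
    v1, v2 = v
    return np.array([v1**2, np.sqrt(2) * v1 * v2, v2**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

# Squared distance computed from kernel evaluations only
d2_kernel = K(x, x) + K(z, z) - 2 * K(x, z)
# Squared distance computed explicitly in feature space
d2_explicit = np.sum((phi(x) - phi(z)) ** 2)
```

Here `d2_kernel` never touches the 3-dimensional feature space, yet matches `d2_explicit` exactly.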
Computing distance to mean in feature space

Recall that the mean of a class (say C1) is given by

m = (1/|C1|) Σ_{x ∈ C1} x

In feature space the mean Φ_m would be

Φ_m = (1/|C1|) Σ_{x ∈ C1} Φ(x)
Computing distance to mean in feature space

The squared distance from a point x to the class mean in feature space is

||Φ(x) − Φ_m||² = K(x,x) + K(m,m) − 2K(m,x)

Replace K(m,m) and K(m,x) with the calculations from the previous slides:

K(m,m) = (1/|C1|²) Σ_{x ∈ C1} Σ_{x' ∈ C1} K(x,x')
K(m,x) = (1/|C1|) Σ_{x' ∈ C1} K(x',x)
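These two substitutions are the whole trick: the mean is never materialized, only kernel sums over the class members. A sketch under the same illustrative quadratic kernel (the class `C1` and point `x` are made-up toy data):

```python
import numpy as np

def K(a, b):
    return (a @ b) ** 2   # quadratic kernel, an illustrative choice

C1 = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 1.0])]
x = np.array([2.0, 1.0])
n = len(C1)

# K(m, m): average of all pairwise kernel values within the class
Kmm = sum(K(a, b) for a in C1 for b in C1) / n**2
# K(m, x): average kernel value between class members and x
Kmx = sum(K(a, x) for a in C1) / n
# Squared feature-space distance from x to the class mean
d2 = K(x, x) + Kmm - 2 * Kmx
```

Note the O(|C1|²) cost of K(m,m); in the algorithm on the next slide it is computed once per class and reused for every test point.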
Kernel nearest means algorithm

Compute the kernel matrix. Let x_i (i = 0..n−1) be the training datapoints and y_i (i = 0..n'−1) the test datapoints.
–For each mean m_j compute K(m_j, m_j)
–For each datapoint y_i in the test set do
–For each mean m_j do d_j = K(m_j, m_j) + K(y_i, y_i) − 2K(m_j, y_i)
–Assign y_i to the class with the minimum d_j
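The algorithm above can be sketched end to end as follows. This is a minimal illustration, not a reference implementation: the function name, the toy data, and the linear kernel are all assumptions made for the example.

```python
import numpy as np

def kernel_nearest_means(X_train, labels, X_test, K):
    """Classify each test point by feature-space distance to each class mean."""
    classes = sorted(set(labels))
    members = {c: [x for x, l in zip(X_train, labels) if l == c] for c in classes}
    # Precompute K(m_j, m_j) once per class (double sum over class members)
    Kmm = {c: sum(K(a, b) for a in members[c] for b in members[c]) / len(members[c]) ** 2
           for c in classes}
    preds = []
    for y in X_test:
        Kyy = K(y, y)
        d = {}
        for c in classes:
            # K(m_j, y_i): average kernel value between class members and y
            Kmy = sum(K(a, y) for a in members[c]) / len(members[c])
            d[c] = Kmm[c] + Kyy - 2 * Kmy
        # Assign y to the class with the minimum distance d_j
        preds.append(min(d, key=d.get))
    return preds

# Toy usage with a linear kernel (feature space = input space)
lin = lambda a, b: float(a @ b)
Xtr = [np.array([0.0, 0.0]), np.array([0.0, 1.0]),
       np.array([5.0, 5.0]), np.array([6.0, 5.0])]
ytr = [0, 0, 1, 1]
Xte = [np.array([0.5, 0.5]), np.array([5.5, 4.0])]
preds = kernel_nearest_means(Xtr, ytr, Xte, lin)
```

With the linear kernel this reduces to ordinary nearest means; swapping in a nonlinear kernel changes the decision boundary without any other code change.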