1. A Unified View of Kernel k-means, Spectral Clustering and Graph Cuts (Dhillon, Inderjit S., Yuqiang Guan, and Brian Kulis)
2. k-means and Kernel k-means
3. Weighted Kernel k-means: Matrix Form
4. Spectral Methods
6. Objectives represented with the matrix L: Normalized cut, Ratio association, Ratio cut, Normalized association
7. Weighted Graph Cuts: Weighted association, Weighted cut
8. Conclusion: Spectral methods are a special case of kernel k-means
9. Solving the unified problem. A standard result in linear algebra states that if we relax the trace maximization so that Y may be an arbitrary orthonormal matrix, then the optimal Y has the form V_k Q, where V_k consists of the leading k eigenvectors of W^{1/2} K W^{1/2} and Q is an arbitrary k × k orthogonal matrix. Since these eigenvectors are not indicator vectors, we must post-process them to obtain a discrete clustering of the points.
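The relaxation above can be sketched in NumPy. This is a minimal illustration, not the authors' code: the function name is mine, and the post-processing (row normalization followed by a simple Lloyd's iteration on the embedded points) is one common choice, stated here as an assumption.

```python
import numpy as np

def spectral_relaxation(K, w, k):
    """Relaxed trace maximization: take the leading k eigenvectors of
    W^{1/2} K W^{1/2}, then post-process them into a discrete clustering.
    K: (n, n) symmetric kernel matrix; w: (n,) positive weights; k: clusters."""
    W_half = np.diag(np.sqrt(np.asarray(w, float)))
    M = W_half @ K @ W_half
    # np.linalg.eigh returns eigenvalues in ascending order,
    # so the last k columns are the leading k eigenvectors.
    _, vecs = np.linalg.eigh(M)
    V = vecs[:, -k:]                                   # (n, k) embedding
    # Post-process: normalize each row, then run Lloyd's k-means on the rows.
    U = V / np.linalg.norm(V, axis=1, keepdims=True)
    # Deterministic init: row 0 and the row farthest from it.
    centers = np.stack([U[0], U[np.argmax(((U - U[0]) ** 2).sum(1))][:]])[:k] \
        if k == 2 else U[np.linspace(0, len(U) - 1, k).astype(int)]
    for _ in range(100):
        labels = np.argmin(((U[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([U[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels
```

With a linear kernel K = X Xᵀ and unit weights, this reduces to clustering the data's leading principal-component embedding.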
10. From Eigenvector to Cluster Indicator: normalize U so that each row has L2 norm equal to 1.
11. The Other Way: using k-means to solve the graph cut problem (random starting points plus EM-style iteration, which reaches only a local optimum). To guarantee that kernel k-means converges, the kernel matrix must be positive definite, which does not hold for an arbitrary kernel matrix.
12. The Effect of the Regularization: the distance from a point a_i to a cluster, for the cases where a_i is in the cluster and where a_i is not in the cluster.
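The positive-definiteness problem raised above is handled by a diagonal shift: replacing K with K' = σW⁻¹ + K makes the scaled kernel W^{1/2} K' W^{1/2} positive semidefinite while preserving the optimal partition. A minimal sketch; the function name and the automatic choice of σ from the most negative eigenvalue are my own assumptions, not taken from the slides.

```python
import numpy as np

def diagonal_shift(K, w, sigma=None):
    """Regularize a (possibly indefinite) kernel matrix by a diagonal shift,
    K' = sigma * W^{-1} + K, so that weighted kernel k-means is guaranteed
    to converge. If sigma is not given, use the magnitude of the most
    negative eigenvalue of W^{1/2} K W^{1/2} (just enough to reach PSD)."""
    w = np.asarray(w, float)
    if sigma is None:
        W_half = np.diag(np.sqrt(w))
        eigvals = np.linalg.eigvalsh(W_half @ K @ W_half)
        sigma = max(0.0, -eigvals.min())
    return K + sigma * np.diag(1.0 / w)
```

Since W^{1/2} K' W^{1/2} = σI + W^{1/2} K W^{1/2}, any σ at least as large as the magnitude of the most negative eigenvalue suffices.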
13. Experimental Results
14. Results (ratio association)
15. Results (normalized association)
16. Image Segmentation
17. Thank you. Any questions?