1
Distance Metric Learning with Spectral Clustering By Sheil Kumar
2
Spectral Clustering Based on the MinCut problem. Cuts deal with pairwise similarity measures, and can thus capture non-linear relationships.
3
Spectral Clustering Cont. sigma = 0.1
4
Spectral Clustering Cont. sigma = 0.1
5
Spectral Clustering Cont. sigma = 3
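The effect of sigma in the slides above can be sketched with a minimal RBF affinity computation. The data points and sigma values below are illustrative, not the ones from the slides: a small sigma makes only near neighbours look similar, while a large sigma makes almost everything look similar.

```python
import numpy as np

def rbf_affinity(X, sigma):
    """RBF (Gaussian) affinity: W[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T  # pairwise squared distances
    np.maximum(d2, 0, out=d2)  # guard against tiny negative values from rounding
    return np.exp(-d2 / (2 * sigma**2))

# Two points close together, one far away (made-up example data).
X = np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 0.0]])
W_small = rbf_affinity(X, sigma=0.1)  # only the near pair is similar
W_large = rbf_affinity(X, sigma=3.0)  # even the far point looks similar
```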
6
Motivation Finding the sigma parameter automatically and optimally will give us better clustering of the data. It is hard to formulate an RBF distance metric such that sigma can be easily isolated. Mahalanobis distance?
7
Defining a Distance Metric 1) The distance metric must represent similarities between data points. 2) Commonly, RBF kernels are used as distance metrics in SC. 3) The Mahalanobis distance must produce positive values representing similarity, NOT dissimilarity.
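A minimal sketch of point 3: a Mahalanobis distance parameterized by a positive semi-definite matrix A. The matrix A below is a made-up diagonal example, not a learned metric.

```python
import numpy as np

def mahalanobis_sq(x, y, A):
    """Squared Mahalanobis distance (x - y)^T A (x - y).
    A must be positive semi-definite so the value is non-negative."""
    d = x - y
    return float(d @ A @ d)

# Hypothetical metric: the first axis is weighted more heavily.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])
x = np.array([1.0, 0.0])
y = np.array([0.0, 0.0])
dist2 = mahalanobis_sq(x, y, A)  # 2.0
```

Plugging such a distance into an RBF kernel, exp(-d_A(x, y)^2), then yields positive values that represent similarity, with the learned matrix A, rather than a single sigma, shaping the affinities.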
8
The MinCut Problem The above equation, min_y y^T(D - W)y / y^T D y, is subject to the constraints that y_i must take on discrete binary values and that y^T D 1 = 0. If y is relaxed to take on real values, this minimization is equivalent to the generalized eigenvalue system (D - W) y = lambda D y. This is easily shown by substituting z = D^{1/2} y, which yields D^{-1/2}(D - W)D^{-1/2} z = lambda z. z_0 = D^{1/2} 1 is an eigenvector, with eigenvalue 0.
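The claim that z_0 = D^{1/2} 1 has eigenvalue 0 can be checked numerically. The affinity matrix below is a random symmetric stand-in, not the presentation's data; the key fact is that (D - W) 1 = 0 because D holds the row sums of W.

```python
import numpy as np

# Random symmetric affinity matrix W with zero diagonal (illustrative only).
rng = np.random.default_rng(0)
W = rng.random((5, 5))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)

deg = W.sum(axis=1)
D = np.diag(deg)
D_isqrt = np.diag(1.0 / np.sqrt(deg))
L_sym = D_isqrt @ (D - W) @ D_isqrt  # D^{-1/2} (D - W) D^{-1/2}

z0 = np.sqrt(deg)  # z0 = D^{1/2} 1
# (D - W) 1 = 0, so L_sym z0 = D^{-1/2} (D - W) 1 = 0: eigenvalue 0.
print(np.allclose(L_sym @ z0, 0.0))  # prints True
```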
9
Minimizing Eigenvectors D^{-1/2}(D - W)D^{-1/2} is a symmetric positive semi-definite matrix, because D - W (also known as the Laplacian matrix) is known to be symmetric positive semi-definite. z_0 is the eigenvector of D^{-1/2}(D - W)D^{-1/2} with the smallest eigenvalue, and all other eigenvectors are perpendicular to it. z_1, the eigenvector with the second smallest eigenvalue, has the property z_1^T z_0 = 0 = y_1^T D 1.
10
Minimizing Eigenvectors Cont. Thus we obtain arg min z^T D^{-1/2}(D - W)D^{-1/2} z / z^T z subject to z^T z_0 = 0, and equivalently arg min y^T(D - W)y / y^T D y subject to y^T D 1 = 0. The eigenvector with the second smallest eigenvalue is guaranteed to give us the normalized cut solution with the second constraint satisfied.
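A minimal sketch of recovering a cut from the second smallest eigenvector, on two synthetic well-separated blobs with sigma fixed at 1 (hypothetical data, not the presentation's): thresholding the relaxed real-valued solution y = D^{-1/2} z_1 at zero separates the two groups.

```python
import numpy as np

# Hypothetical data: two well-separated Gaussian blobs of 10 points each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (10, 2)), rng.normal(5, 0.2, (10, 2))])

sq = np.sum(X**2, axis=1)
W = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * 1.0**2))  # RBF, sigma = 1
d = W.sum(axis=1)
L_sym = np.diag(1 / np.sqrt(d)) @ (np.diag(d) - W) @ np.diag(1 / np.sqrt(d))

vals, vecs = np.linalg.eigh(L_sym)  # eigenvalues in ascending order
y = vecs[:, 1] / np.sqrt(d)         # y = D^{-1/2} z for the second smallest z
labels = (y > 0).astype(int)        # threshold the relaxed solution at 0
```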
11
Trace SDP Given that the lambdas are the eigenvalues of the matrix K, minimizing over the lambdas is equivalent to minimizing tr(KB). Minimizing our second eigenvector can be rewritten as a Procrustes problem. B = a weighted sum of outer products of the eigenvectors, where the eigenvectors are normalized and the weights are in strictly increasing order.
12
Trace SDP Cont. Because we want to minimize over only the smallest and second smallest eigenvectors, we can set alpha_n and alpha_{n-1} to 1, and the rest to 0.
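With those weights, tr(KB) picks out exactly the sum of the two smallest eigenvalues of K. A quick numerical check, using a random symmetric matrix as an illustrative stand-in for K:

```python
import numpy as np

# Random symmetric matrix standing in for K (illustrative only).
rng = np.random.default_rng(2)
A = rng.random((6, 6))
K = A + A.T

vals, vecs = np.linalg.eigh(K)  # ascending eigenvalues, orthonormal eigenvectors

# B: outer products of the two eigenvectors with smallest eigenvalues, weights 1.
V = vecs[:, :2]
B = V @ V.T
# tr(K B) = tr(V^T K V) = sum of the two smallest eigenvalues.
print(np.isclose(np.trace(K @ B), vals[0] + vals[1]))  # prints True
```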
13
Solving the SDP
14
The K Matrix
15
Solving the SDP Cont.
16
Solving the SDP
17
Some Results (more coming)
18
More results
19
Conclusions It is unclear as of right now whether linear transformations *help* clustering. Future work: more interesting distance metrics.