Clustering II CMPUT 466/551 Nilanjan Ray

Mean-shift Clustering Will show slides from:

Spectral Clustering. Let's revisit a serious issue with K-means: it looks for compact, hyperellipsoid-like structures. What if the clusters are not compact and ellipsoid-like? K-means fails. What can we do? Spectral clustering can be a remedy here.

Basic Spectral Clustering. Form a similarity matrix with entries w_ij for all pairs of observations i, j. This defines a dense graph with the data points as the vertex set; the strength of the edge between vertices i and j is w_ij, the similarity between the i-th and j-th observations. Clustering can then be conceived as partitioning this graph into connected components such that within a component the edge weights are large, whereas across components they are low.
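A minimal MATLAB sketch of this step, assuming a Gaussian similarity w_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)); the slide does not fix a particular similarity function, so both the functional form and the bandwidth sigma are assumptions:

% Build a Gaussian similarity matrix for an N x d data matrix X.
% sigma (assumed parameter) controls how fast similarity decays with distance.
function W = similarity_matrix(X, sigma)
    N = size(X, 1);
    sq = sum(X.^2, 2);                 % squared norm of each observation
    D2 = max(sq + sq' - 2*(X*X'), 0);  % pairwise squared distances (clipped at 0)
    W = exp(-D2 / (2*sigma^2));        % Gaussian similarity
    W(1:N+1:end) = 0;                  % no self-edges on the diagonal
end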

Basic Spectral Clustering… Form the Laplacian of this graph, L = G - W, where W = [w_ij] is the similarity matrix and G is a diagonal matrix with entries g_ii = Σ_j w_ij. L is positive semi-definite and has a constant eigenvector (all 1's) with zero eigenvalue. Find the m smallest eigenvectors Z = [z_1 z_2 … z_m] of L, ignoring the constant eigenvector. Cluster (say, by K-means) the N observations, using the rows of the matrix Z as features.
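Following the recipe above, a minimal MATLAB sketch of the full pipeline (kmeans is from the Statistics and Machine Learning Toolbox; the eigenvector count m and cluster count K are inputs the user must choose):

% Basic spectral clustering: Laplacian -> m smallest eigenvectors -> K-means.
function labels = spectral_cluster(W, m, K)
    G = diag(sum(W, 2));          % degree matrix, g_ii = sum_j w_ij
    L = G - W;                    % unnormalized graph Laplacian
    [V, D] = eig(L);              % L is symmetric, so V is orthogonal
    [~, order] = sort(diag(D));   % eigenvalues in ascending order
    Z = V(:, order(2:m+1));       % m smallest, skipping the constant eigenvector
    labels = kmeans(Z, K);        % cluster the rows of Z
end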

Why Spectral Clustering Works.
Insight 1: the graph cut cost for a label vector f is f^T L f = (1/2) Σ_{i,j} w_ij (f_i - f_j)^2. So a small value of f^T L f is obtained when pairs of points with large adjacencies w_ij get similar labels. The constant eigenvector corresponding to the 0 eigenvalue is actually a trivial solution: it suggests putting all N observations into a single cluster.
Insight 2: if a graph has K connected components, its nodes can be reordered so that L is block diagonal with K diagonal blocks. L then has the zero eigenvalue with multiplicity K, one for each connected component, and the corresponding eigenvectors are indicator variables identifying these components.
Combining Insights 1 and 2: in reality the components are never exactly disconnected; we only have weak and strong edges. So choose the eigenvectors corresponding to small (rather than exactly zero) eigenvalues and cluster them into K classes.
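The identity in Insight 1 follows in two lines from the definitions g_ii = Σ_j w_ij and L = G - W; in LaTeX:

\begin{aligned}
f^{\top} L f &= f^{\top} G f - f^{\top} W f
             = \sum_i g_{ii} f_i^2 - \sum_{i,j} w_{ij} f_i f_j \\
            &= \tfrac{1}{2} \sum_{i,j} w_{ij} \left( f_i^2 + f_j^2 - 2 f_i f_j \right)
             = \tfrac{1}{2} \sum_{i,j} w_{ij} (f_i - f_j)^2,
\end{aligned}

where the second line uses the symmetry w_{ij} = w_{ji} to split \sum_i g_{ii} f_i^2 = \tfrac{1}{2}\sum_{i,j} w_{ij}(f_i^2 + f_j^2).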

A Tiny Example: A Perfect World. W = [ ]. We observe two classes, each with 2 observations. W is a perfect block-diagonal matrix here, so the Laplacian L = G - W is block diagonal as well. Eigenvalues of L: 0, 0, 1, 1.6. The eigenvectors corresponding to the two 0 eigenvalues are, up to normalization, the class indicator vectors [1 1 0 0] and [0 0 1 1].
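The entries of W are not shown above, but any block-diagonal W whose 2-node blocks have within-class weights 0.5 and 0.8 reproduces the stated eigenvalues, since each 2-node block of L has eigenvalues 0 and 2w. A hypothetical reconstruction in MATLAB:

% Hypothetical block-diagonal W consistent with eigenvalues 0, 0, 1, 1.6
% (the within-class weights 0.5 and 0.8 are assumptions).
W = [0   0.5 0   0  ;
     0.5 0   0   0  ;
     0   0   0   0.8;
     0   0   0.8 0  ];
L = diag(sum(W, 2)) - W;   % graph Laplacian
sort(eig(L))'              % -> 0  0  1.0  1.6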

The Real World Tiny Example. W = [ ], the same two-class similarity matrix, but now with small nonzero similarities across the two classes, so it is no longer perfectly block diagonal. L = [ ]. In MATLAB, [V,D] = eig(L) returns the eigenvectors V and eigenvalues D. Notice that eigenvalue 0 has a constant eigenvector, and the next smallest eigenvalue has an eigenvector that clearly indicates the class memberships.
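A hypothetical version of this experiment (the cross-class weight 0.1 is an assumption; the slide's actual numbers are not reproduced here):

% Perturb the block-diagonal W with a small cross-class similarity.
W = [0   0.5 0.1 0.1;
     0.5 0   0.1 0.1;
     0.1 0.1 0   0.8;
     0.1 0.1 0.8 0  ];
L = diag(sum(W, 2)) - W;
[V, D] = eig(L);
[lam, order] = sort(diag(D));
lam'             % -> 0  0.4  1.2  1.8: one zero eigenvalue, graph is connected
V(:, order(2))'  % Fiedler vector: constant sign within a class, flips across

The zero eigenvalue now has multiplicity one (its eigenvector is constant, the trivial all-in-one-cluster solution), while the second eigenvector is proportional to [1 1 -1 -1] and reads off the two classes directly.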

Normalized Graph Cut for Image Segmentation. [The slide shows a cell image.] Similarity between pixels is defined using the pixel locations.
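The slide does not preserve the similarity formula, but a common choice in the normalized-cut literature combines spatial proximity with intensity; a sketch under that assumption (sigma_I, sigma_X, and the neighborhood radius r are assumed parameters):

% Affinity for a grayscale image I with values in [0, 1]:
% w_pq = exp(-(I_p - I_q)^2 / sigma_I^2) * exp(-||x_p - x_q||^2 / sigma_X^2)
% for pixels p, q within distance r of each other, and 0 otherwise.
function W = image_affinity(I, sigma_I, sigma_X, r)
    [h, w] = size(I);
    [X, Y] = meshgrid(1:w, 1:h);
    P = [X(:) Y(:)];                   % pixel coordinates, one row per pixel
    v = I(:);                          % pixel intensities
    N = h * w;
    W = sparse(N, N);
    for i = 1:N                        % simple but slow; fine for small images
        d2 = sum((P - P(i,:)).^2, 2);  % squared distances to pixel i
        nb = find(d2 <= r^2);          % neighbors within radius r
        W(i, nb) = (exp(-(v(i) - v(nb)).^2 / sigma_I^2) ...
                 .* exp(-d2(nb) / sigma_X^2))';
    end
end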

NGC Example. (a) A blood cell image. (b) Eigenvector corresponding to the second smallest eigenvalue. (c) Binary labeling via Otsu's method. (d) Eigenvector corresponding to the third smallest eigenvalue. (e) Ternary labeling via K-means clustering. Demo: NCUT.m
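The demo file NCUT.m is not reproduced here, but panels (c) and (e) could be generated from the eigenvectors along these lines (a sketch; z2 and z3 denote the second and third smallest eigenvectors, h and w the image size, and graythresh/kmeans come from the Image Processing and Statistics Toolboxes):

% Binary labeling: Otsu threshold on the second eigenvector.
E2 = reshape(mat2gray(z2), h, w);            % rescale eigenvector to [0, 1]
binaryLabels = E2 > graythresh(E2);          % panel (c): 2 labels

% Ternary labeling: K-means on the 2nd and 3rd eigenvectors jointly.
Z = [z2(:) z3(:)];                           % one 2-D spectral feature per pixel
ternaryLabels = reshape(kmeans(Z, 3), h, w); % panel (e): 3 labels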