Spectral partitioning works: Planar graphs and finite element meshes


Spectral partitioning works: Planar graphs and finite element meshes
Daniel A. Spielman, Shang-Hua Teng
Presented by Yariv Yaari

Paper Result: Spectral partitioning on an n-vertex planar graph of bounded degree will find a cut of ratio O(√(1/n)) (the constant depends on the degree bound). Similar results hold for k-nearest-neighbor graphs in a fixed dimension.

Outline: Introduction; Spectral partitioning; Bound on the Fiedler value using an embedding; Bound for planar graphs; Bisection from a low-ratio cut; Questions?

Introduction: Our goal is to find a good partition (also called a cut) of a graph. A partition of G=(V,E) is a pair (A, B) with A ∪ B = V and A ∩ B = ∅; we define E(A,B) = {(u,v) ∈ E : u ∈ A, v ∈ B}. A good partition has min(|A|, |B|) large and |E(A,B)| small. The cut ratio is φ(A,B) = |E(A,B)| / min(|A|, |B|).
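
As a small illustration (a sketch of mine, not from the slides), the cut ratio can be computed directly from an edge-list representation of the graph; the function name and the representation are assumptions:

```python
def cut_ratio(edges, A, B):
    """Cut ratio |E(A,B)| / min(|A|,|B|) of a partition (A, B) of the vertices."""
    A, B = set(A), set(B)
    crossing = sum(1 for u, v in edges
                   if (u in A and v in B) or (u in B and v in A))
    return crossing / min(len(A), len(B))
```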

The Laplacian: The Laplacian of a graph G, L(G), is an n×n matrix whose entries are determined by E. For G=(V,E), L(G) = D − A, i.e. L(G)_{ii} = deg(i), L(G)_{ij} = −1 if (i,j) ∈ E, and L(G)_{ij} = 0 otherwise. We are interested in the eigenvalues and eigenvectors of the Laplacian.
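
A minimal sketch of this definition, L = D − A, with NumPy (the edge-list form of the input is my assumption):

```python
import numpy as np

def laplacian(n, edges):
    """Graph Laplacian L = D - A of an undirected graph on vertices 0..n-1."""
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1          # degree terms on the diagonal
        L[v, v] += 1
        L[u, v] -= 1          # -1 for each edge, symmetrically
        L[v, u] -= 1
    return L
```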

Spectral Partitioning: Partition the graph using an eigenvector x of the Laplacian. Choose a number s and split V into A = {i : x_i ≤ s} and B = {i : x_i > s}. There are several approaches to choosing s; we will only consider the choice that optimizes the cut ratio, and only for an eigenvector of a specific eigenvalue, the Fiedler value.
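
A sketch of the procedure just described, reusing the laplacian helper above; fixing s = 0 here is only one of the possible choices, and the ratio-optimizing choice appears further below:

```python
import numpy as np

def spectral_split(n, edges, s=0.0):
    """Split the vertices by thresholding the Fiedler eigenvector at s."""
    L = laplacian(n, edges)
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = eigvecs[:, 1]                # eigenvector of the Fiedler value
    A = [i for i in range(n) if fiedler[i] <= s]
    B = [i for i in range(n) if fiedler[i] > s]
    return A, B
```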

The Laplacian - properties: For any vector x, xᵀL(G)x = Σ_{(u,v)∈E} (x_u − x_v)² ≥ 0. Therefore L(G) is a symmetric positive semidefinite matrix, and all its eigenvalues are non-negative reals. 0 is always an eigenvalue. If G is connected, the eigenspace of 0 is spanned by (1,1,…,1). The second smallest eigenvalue is called the Fiedler value.
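
The identity behind these properties, written out (the slide's formula did not survive the transcript):

```latex
x^\top L(G)\,x \;=\; \sum_{(u,v)\in E} (x_u - x_v)^2 \;\ge\; 0,
\qquad L(G)\,\mathbf{1} = 0 .
```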

Fiedler Value: Since L(G) is symmetric, the eigenvectors are orthogonal, and we get λ₂ = min { xᵀL(G)x / xᵀx : x ⊥ (1,…,1), x ≠ 0 }. The minimized quotient is called the Rayleigh quotient, and it can be used to find a low-ratio cut.
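
The variational characterization referred to here:

```latex
\lambda_2 \;=\; \min_{x \perp \mathbf{1},\, x \neq 0}
\frac{x^\top L(G)\,x}{x^\top x}
\;=\; \min_{x \perp \mathbf{1},\, x \neq 0}
\frac{\sum_{(u,v)\in E} (x_u - x_v)^2}{\sum_i x_i^2}.
```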

Rayleigh quotient (Mihail): For a graph G of maximum degree d and any vector x such that x ⊥ (1,…,1), there is an s such that the threshold cut at s has ratio at most √(2d · xᵀL(G)x / xᵀx). Therefore, we want to bound the Fiedler value.
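
The guaranteed s can be found by sweeping over all threshold cuts of the given vector; a sketch, reusing cut_ratio from above:

```python
def best_threshold_cut(n, edges, x):
    """Return the minimum-ratio cut among all threshold cuts of the vector x."""
    order = sorted(range(n), key=lambda i: x[i])
    best = None
    for k in range(1, n):                  # split after the k smallest entries
        A, B = order[:k], order[k:]
        r = cut_ratio(edges, A, B)
        if best is None or r < best[0]:
            best = (r, A, B)
    return best
```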

Embedding: We will find a bound using an embedding into Rᵐ. We use λ₂ ≤ Σ_{(i,j)∈E} ‖v_i − v_j‖² / Σ_i ‖v_i‖², where v_1,…,v_n ∈ Rᵐ and Σ_i v_i = 0. This is a direct result of the one-dimensional case.
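
Spelled out, the bound and its reduction to the one-dimensional case (each coordinate of the embedding is a test vector orthogonal to the all-ones vector):

```latex
\lambda_2 \;\le\;
\frac{\sum_{(i,j)\in E} \lVert v_i - v_j \rVert^2}{\sum_i \lVert v_i \rVert^2}
\quad\text{whenever } v_1,\dots,v_n \in \mathbb{R}^m,\; \sum_i v_i = 0,
```

because, writing v^(k) for the k-th coordinate vector,

```latex
\sum_{(i,j)\in E} \lVert v_i - v_j \rVert^2
= \sum_{k=1}^m (v^{(k)})^\top L\, v^{(k)}
\;\ge\; \lambda_2 \sum_{k=1}^m \lVert v^{(k)} \rVert^2
= \lambda_2 \sum_i \lVert v_i \rVert^2 .
```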

“Kissing Disk” embedding (Koebe–Andreev–Thurston): For any planar graph G=(V,E) there are disks D_1,…,D_n in the plane with pairwise disjoint interiors such that D_i and D_j are tangent if and only if (i,j) ∈ E.

Sphere Preserving Maps: A sphere-preserving map is a map f such that the image of any sphere under f is a sphere, and similarly for the pre-image. We will use sphere-preserving maps between a sphere and a hyperplane. For our purposes a hyperplane counts as a (degenerate) sphere, so a Möbius transformation is sphere preserving.

Bound for planar graphs: We will soon prove the existence of a sphere-preserving map from the plane to the unit sphere such that the centroid of the centers of the (images of the) disks is the origin. Denote by v_i the centers of the resulting caps (points on the unit sphere) and by r_i their radii; we then get Σ_i v_i = 0, and for every edge (i,j), ‖v_i − v_j‖ ≤ r_i + r_j. Therefore, using (r_i + r_j)² ≤ 2(r_i² + r_j²), we can split each edge term between i and j and get the bound Σ_{(i,j)∈E} ‖v_i − v_j‖² ≤ 2 Σ_{(i,j)∈E} (r_i² + r_j²) ≤ 2d Σ_i r_i². But the caps are disjoint, so Σ_i π r_i² ≤ 4π, i.e. Σ_i r_i² ≤ 4.

Bound for planar graphs - cont.: Also, every v_i lies on the unit sphere, so Σ_i ‖v_i‖² = n. So, summing everything, we get λ₂ ≤ Σ_{(i,j)∈E} ‖v_i − v_j‖² / Σ_i ‖v_i‖² ≤ 8d/n. Therefore λ₂ = O(d/n), and using its eigenvector one can find a cut of ratio O(d/√n), i.e. O(√(1/n)) for bounded degree.
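
Putting the pieces together, with the constants as reconstructed above:

```latex
\lambda_2 \;\le\;
\frac{\sum_{(i,j)\in E} \lVert v_i - v_j \rVert^2}{\sum_i \lVert v_i \rVert^2}
\;\le\; \frac{2\sum_{(i,j)\in E} \bigl(r_i^2 + r_j^2\bigr)}{n}
\;\le\; \frac{2d \sum_i r_i^2}{n}
\;\le\; \frac{8d}{n},
```

and Mihail's bound then gives a threshold cut of ratio at most √(2dλ₂) ≤ 4d/√n, which is O(√(1/n)) when the degree is bounded.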

Sphere preserving maps - cont: We want a sphere-preserving map that maps the centers of the caps to a set of points on the sphere whose centroid is the origin. First, we build a family of sphere-preserving maps. For a sphere S and a point p on S, denote by Π_p the stereographic projection of S onto the extended hyperplane tangent to S at the point antipodal to p (extended with a point at infinity). This is sphere preserving.
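
A minimal sketch of Π_p for the special case where p is the north pole of the unit sphere in R³, projecting onto the plane z = −1 tangent at the south pole; a general p can be handled by first rotating it to the north pole. All names here are mine, not from the slides:

```python
import numpy as np

def stereo_proj(x):
    """Project a point x on the unit sphere (x != north pole) from the north
    pole onto the tangent plane z = -1; return its (u, v) coordinates."""
    t = 2.0 / (1.0 - x[2])
    return np.array([t * x[0], t * x[1]])

def stereo_lift(uv):
    """Inverse map: lift a point (u, v) of the plane z = -1 back to the sphere."""
    u, v = uv
    t = 4.0 / (u * u + v * v + 4.0)
    return np.array([t * u, t * v, 1.0 - 2.0 * t])
```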

Sphere preserving maps - cont: Any Möbius transformation is sphere preserving; we will only use dilations. Denote a dilation of the hyperplane with factor α > 0 around the point of tangency by D_α. Any composition of sphere-preserving maps is sphere preserving. Our family of sphere-preserving maps will be f_{α,p} = Π_p⁻¹ ∘ D_α ∘ Π_p.
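
Composing the pieces gives one member of the family (again a sketch, reusing stereo_proj and stereo_lift from above and fixing p at the north pole):

```python
def dilate(uv, alpha):
    """Dilation of the tangent plane by a factor alpha about the tangency point."""
    return (alpha * uv[0], alpha * uv[1])

def sphere_map(x, alpha):
    """One f_{alpha,p}: project from the pole, dilate in the plane, lift back."""
    return stereo_lift(dilate(stereo_proj(x), alpha))
```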

Sphere preserving maps - cont: We now extend the definition to α = ∞ as f_{∞,p}(x) = p for all x (this is not continuous). Now, for a cap C on the unit sphere, denote its center by center(C); we would like to show that for every family of caps with pairwise disjoint interiors there is an f in the family such that Σ_i center(f(C_i)) = 0.

Sphere preserving maps - cont: We will need that no point is shared by the interiors of at least half of the caps (in the current case, the interiors are disjoint). We would like to use the map sending each parameter (α, p) to the centroid of the centers of the caps f_{α,p}(C_i); however, this map is not continuous at α = ∞. Choose α₀ and ε such that for all p and all α ≥ α₀, most of the caps f_{α,p}(C_i) are contained within a ball of radius ε around p.

Sphere preserving maps - cont: Now define a weight function on the parameter domain that is 0 for α ≤ α₀ and tends to 1 as α → ∞, and replace the centroid map by its weighted average with the boundary point p. And so we obtain a map defined on the whole compactified parameter domain.

Sphere preserving maps - cont: The weighted centroid term is continuous and approaches zero exactly where the centroid map is not continuous, so the combined map is continuous. Now, if α ≥ α₀, the centers lie either in the ball of radius ε around p or in the single exceptional cap, and most of them lie in the ball around p. Therefore the combined value lies on the line segment between the origin and p, close to p. This implies, by Brouwer's fixed point theorem, that the combined map takes the value 0 somewhere, and we're done.
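
The form of Brouwer's theorem being invoked is the standard no-retraction statement:

```latex
\text{There is no continuous map } r : B^m \to \partial B^m
\text{ with } r(x) = x \text{ for every } x \in \partial B^m .
```

Under the parameterization sketched above, if the combined map never took the value 0, normalizing it would give exactly such a retraction of the parameter ball onto its boundary sphere.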

Questions?