Laplacian Matrices of Graphs: Algorithms and Applications ICML, June 21, 2016 Daniel A. Spielman

Outline: Laplacians; interpolation on graphs; spring networks; clustering; isotonic regression; sparsification; solving Laplacian equations (best results, and the simplest algorithm).

Interpolation on Graphs (Zhu, Ghahramani, Lafferty ’03): interpolate the values of a function at all vertices from given values at a few vertices. Minimize $\sum_{(u,v) \in E} w_{u,v} (x(u) - x(v))^2$ subject to the given values. Taking derivatives shows that the minimizer solves a Laplacian equation, and in the solution every free variable is the average of its neighbors. (Figure: a protein-interaction network on ANAPC10, CDC27, ANAPC5, UBE2C, ANAPC2, CDC20, with two vertices fixed at values 1 and 0.)
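A minimal sketch of this computation in Python/NumPy (not code from the talk; the toy graph and fixed values are invented):

```python
# Harmonic interpolation on a graph: minimize x^T L x with some values fixed.
import numpy as np

# Adjacency matrix of a small connected graph on 6 vertices.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian L = D - A

labeled   = np.array([0, 5])            # vertices with known values
unlabeled = np.array([1, 2, 3, 4])
y = np.array([1.0, 0.0])                # known values at the labeled vertices

# Minimizing x^T L x with x fixed on `labeled` gives the linear system
#   L[unlabeled, unlabeled] x_u = -L[unlabeled, labeled] y.
Luu = L[np.ix_(unlabeled, unlabeled)]
Lul = L[np.ix_(unlabeled, labeled)]
x_u = np.linalg.solve(Luu, -Lul @ y)

print(x_u)  # each interpolated value is the average of its neighbors' values
```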

The Laplacian Quadratic Form: $x^T L_G x = \sum_{(u,v) \in E} w_{u,v} \big(x(u) - x(v)\big)^2$.

The Laplacian Matrix of a Graph: $L_G = D_G - A_G$, where $D_G$ is the diagonal matrix of weighted degrees and $A_G$ is the weighted adjacency matrix.

Spring Networks: view the edges as rubber bands, or ideal linear springs. Nail down some vertices and let the rest settle. When a spring of constant $w$ is stretched to length $\ell$, its potential energy is $\frac{1}{2} w \ell^2$. Physics: the positions minimize the total potential energy subject to the boundary constraints (the nails), and in equilibrium every free node is the average of its neighbors.

Spring Networks solve the same interpolation problem: interpolating the values of a function at all vertices from given values at a few vertices minimizes $\sum_{(u,v) \in E} w_{u,v} (x(u) - x(v))^2$, the total potential energy, and in equilibrium nodes are the averages of their neighbors. (Figure: the same protein-interaction network with fixed values 1 and 0.)

Drawing by Spring Networks (Tutte ’63): nail down the vertices of one face at the corners of a convex polygon, and let the spring network determine the positions of the remaining vertices, so that each is the average of its neighbors. If the graph is planar, then the spring drawing has no crossing edges! (A sequence of figures shows spring drawings of planar graphs.)
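A sketch of Tutte's construction in Python/NumPy; the cube graph and the choice of outer face are illustrative:

```python
# Tutte's spring drawing: fix one face on a convex polygon, place every other
# vertex at the average of its neighbors by solving a Laplacian system per
# coordinate.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0),   # outer face
         (4, 5), (5, 6), (6, 7), (7, 4),   # inner face
         (0, 4), (1, 5), (2, 6), (3, 7)]   # spokes (the cube graph)
n = 8
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
L = np.diag(A.sum(axis=1)) - A

outer = [0, 1, 2, 3]                        # nailed to a convex polygon
inner = [4, 5, 6, 7]
theta = 2 * np.pi * np.arange(len(outer)) / len(outer)
pos = np.zeros((n, 2))
pos[outer] = np.column_stack([np.cos(theta), np.sin(theta)])

# Each free vertex is the average of its neighbors: one grounded Laplacian
# solve per coordinate.
Lii = L[np.ix_(inner, inner)]
Lio = L[np.ix_(inner, outer)]
pos[inner] = np.linalg.solve(Lii, -Lio @ pos[outer])

print(pos)   # a crossing-free drawing of this planar graph
```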

Measuring boundaries of sets. Boundary: the edges leaving a set, $\partial(S) = \{(u,v) \in E : u \in S,\, v \notin S\}$. For the characteristic vector of $S$, with $x(u) = 1$ for $u \in S$ and $0$ otherwise, the quadratic form counts the boundary edges: $x^T L x = |\partial(S)|$.

Spectral Clustering and Partitioning: find large sets of small boundary. Heuristic: to find an $x$ with small $x^T L x$, compute an eigenvector of a small eigenvalue of $L$ and consider its level sets $\{u : x(u) \ge t\}$.
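A minimal illustration of the heuristic, assuming NetworkX is available; the planted-partition graph and the threshold at zero are invented for the example:

```python
# Spectral partitioning via the eigenvector of the second-smallest eigenvalue
# (the Fiedler vector).
import numpy as np
import networkx as nx

G = nx.planted_partition_graph(2, 50, p_in=0.3, p_out=0.02, seed=1)
L = nx.laplacian_matrix(G).toarray().astype(float)

vals, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
fiedler = vecs[:, 1]                # eigenvector of the second-smallest

# In practice one sweeps over level sets; here we just threshold at 0.
S = [u for u in G.nodes if fiedler[u] >= 0]
print(len(S), nx.cut_size(G, S))    # a large set with a small boundary
```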

The Laplacian Matrix of a Graph is symmetric, has non-positive off-diagonal entries, and is diagonally dominant.

Laplacian Matrices of Weighted Graphs: $L_G = \sum_{(u,v) \in E} w_{u,v} L_{u,v}$, where $L_{u,v}$ is the Laplacian of the single edge $(u,v)$. Equivalently, $L_G = B^T W B$, where $B$ is the signed edge-vertex incidence matrix, with one row for each edge, and $W$ is the diagonal matrix of edge weights; thus $x^T L_G x = \sum_{(u,v) \in E} w_{u,v} (x(u) - x(v))^2$.
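A quick NumPy check of the factorization $L = B^T W B$ on an invented toy graph:

```python
# Build a weighted-graph Laplacian from its signed incidence matrix and verify
# the quadratic form two ways.
import numpy as np

edges   = [(0, 1), (1, 2), (2, 0), (2, 3)]
weights = np.array([1.0, 2.0, 1.0, 3.0])
n, m = 4, len(edges)

B = np.zeros((m, n))                 # signed edge-vertex incidence matrix
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = 1.0, -1.0     # one row per edge

W = np.diag(weights)
L = B.T @ W @ B

x = np.random.randn(n)
q1 = x @ L @ x                       # x^T L x ...
q2 = sum(w * (x[u] - x[v])**2 for (u, v), w in zip(edges, weights))
assert np.isclose(q1, q2)            # ... equals the sum over the edges
```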

Quickly Solving Laplacian Equations $Lx = b$, where $m$ is the number of non-zeros and $n$ is the dimension. S, Teng ’04: $O(m \log^{O(1)} n \, \log(1/\epsilon))$, using low-stretch trees and sparsifiers. Koutis, Miller, Peng ’11: $\widetilde{O}(m \log n \, \log(1/\epsilon))$, using low-stretch trees and sampling. Cohen, Kyng, Pachocki, Peng, Rao ’14: $\widetilde{O}(m \log^{1/2} n \, \log(1/\epsilon))$. Good code: LAMG (lean algebraic multigrid) by Livne and Brandt, and CMG (combinatorial multigrid) by Koutis.

An $\epsilon$-accurate solution to $Lx = b$ is an $x$ satisfying $\|x - x^*\|_L \le \epsilon \|x^*\|_L$, where $x^* = L^+ b$ and $\|y\|_L = \sqrt{y^T L y}$. Such solvers also allow fast computation of eigenvectors corresponding to small eigenvalues.
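As a stand-in for the specialized solvers above, here is a sketch that solves a Laplacian system with SciPy's conjugate gradient on an invented path graph (a generic iterative method, not the nearly-linear-time algorithms from these papers):

```python
# Solve Lx = b for a Laplacian with conjugate gradient. L is singular with
# nullspace spanned by the all-ones vector, so we project b onto its range.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000                                 # Laplacian of a path graph
main = 2 * np.ones(n); main[0] = main[-1] = 1
L = sp.diags([-np.ones(n - 1), main, -np.ones(n - 1)], [-1, 0, 1], format='csr')

b = np.random.randn(n)
b -= b.mean()                            # project off the all-ones nullspace

x, info = spla.cg(L, b)
print(info, np.linalg.norm(L @ x - b))   # info == 0 means CG converged
```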

Laplacians in Linear Programming: Laplacians appear when solving linear programs on graphs by interior point methods. Maximum and min-cost flow (Daitch, S ’08; Mądry ’13); shortest paths (Cohen, Mądry, Sankowski, Vladu ’16); Lipschitz learning, i.e., regularized interpolation on graphs (Kyng, Rao, Sachdeva, S ’15); isotonic regression (Kyng, Rao, Sachdeva ’15).

Isotonic Regression (Ayer et al. ’55): a function $x$ is isotonic with respect to a directed acyclic graph if $x$ increases along its edges.

Isotonic Regression example: predict college GPA from high-school GPA and SAT scores (figure shows the data). Estimate by nearest neighbor? We want the estimate to be monotonically increasing in the inputs, so instead: given $y$, find the isotonic $x$ minimizing $\|x - y\|$.

Fast IPM for Isotonic Regression (Kyng, Rao, Sachdeva ’15): given $y$, find the isotonic $x$ minimizing $\|x - y\|_2$, or $\|x - y\|_p$ for any $p$, in time $\widetilde{O}(m^{3/2})$.
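The IPM itself is not in standard libraries; for the simplest DAG, a path (a total order), isotonic regression can be illustrated with scikit-learn's pool-adjacent-violators implementation (an assumed dependency, and a different algorithm than the one in the talk):

```python
# Isotonic regression on a path: minimize ||x - y||_2 over x increasing
# along the path's edges.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
t = np.arange(50)
y = 0.1 * t + rng.normal(0, 1, size=50)   # a noisy increasing signal

iso = IsotonicRegression()
x = iso.fit_transform(t, y)

assert np.all(np.diff(x) >= 0)            # x is isotonic on the path
print(np.linalg.norm(x - y))
```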

Linear Program for Isotonic Regression: let $B$ be the signed edge-vertex incidence matrix of the DAG, with each row oriented along its edge; then $x$ is isotonic if $Bx \ge 0$.

Given $y$, minimize $\|x - y\|$ subject to $Bx \ge 0$.

Minimize $\|x - y\|$ subject to $Bx \ge 0$: an interior point method solves a sequence of equations of the form $B^T D B\, z = c$, where the matrices $D$ are positive diagonal. But when $D$ is positive diagonal, $B^T D B$ is a Laplacian! Kyng, Rao, Sachdeva ’15: reduce isotonic regression to solving Laplacian systems to constant accuracy.
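A quick numerical check of the key fact that $B^T D B$ is a Laplacian whenever $D$ is positive diagonal, on an invented toy graph:

```python
# For a signed incidence matrix B and any positive diagonal D, B^T D B is
# the Laplacian of the same graph with edge weights given by D.
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n, m = 4, len(edges)
B = np.zeros((m, n))
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = 1.0, -1.0

d = np.random.rand(m) + 0.1          # an arbitrary positive diagonal
M = B.T @ np.diag(d) @ B

L = np.zeros((n, n))                 # Laplacian with weights d, built directly
for i, (u, v) in enumerate(edges):
    L[u, u] += d[i]; L[v, v] += d[i]
    L[u, v] -= d[i]; L[v, u] -= d[i]

assert np.allclose(M, L)             # B^T D B is exactly that Laplacian
```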

Spectral Sparsification: every graph can be approximated by a sparse graph with a similar Laplacian.

Approximating Graphs: a graph $H$ is an $\epsilon$-approximation of $G$ if $\frac{1}{1+\epsilon}\, x^T L_H x \le x^T L_G x \le (1+\epsilon)\, x^T L_H x$ for all $x$. That is, $\frac{1}{1+\epsilon} L_H \preceq L_G \preceq (1+\epsilon) L_H$, where $A \preceq B$ iff $x^T A x \le x^T B x$ for all $x$. Such an approximation preserves the boundaries of every set, and solutions to linear equations in $L_G$ and $L_H$ are similar.

Spectral Sparsification (Batson, S, Srivastava ’09): every graph $G$ has an $\epsilon$-approximation $H$ with $O(n/\epsilon^2)$ edges. For intuition: random regular graphs approximate complete graphs.

Fast Spectral Sparsification. S & Srivastava ’08: if we sample each edge with probability inversely proportional to its effective spring constant (i.e., proportional to its effective resistance), only $O(n \log n / \epsilon^2)$ samples are needed. Lee & Sun ’15: can find an $\epsilon$-approximation with $O(n/\epsilon^2)$ edges in $\widetilde{O}(m)$ time for every $\epsilon$.
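A minimal sketch of sampling by effective resistance, computing the resistances naively with a dense pseudoinverse (the fast algorithms avoid this); the random graph and constants are illustrative:

```python
# Sparsify a graph by sampling edges with probability proportional to
# weight * effective resistance, reweighting to keep the expectation.
import numpy as np

rng = np.random.default_rng(0)
n = 30
edges, weights = [], []
for u in range(n):                        # a random weighted graph
    for v in range(u + 1, n):
        if rng.random() < 0.3:
            edges.append((u, v)); weights.append(rng.random() + 0.5)
weights = np.array(weights)
m = len(edges)

B = np.zeros((m, n))
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = 1.0, -1.0
L = B.T @ np.diag(weights) @ B
Lpinv = np.linalg.pinv(L)

# Effective resistance of edge (u,v): (e_u - e_v)^T L^+ (e_u - e_v).
reff = np.einsum('ij,jk,ik->i', B, Lpinv, B)
p = weights * reff
p /= p.sum()                              # sampling probabilities

eps = 0.5
k = int(8 * n * np.log(n) / eps**2)       # sample count (large for a toy graph)
idx = rng.choice(m, size=k, p=p)
new_w = np.zeros(m)
for i in idx:
    new_w[i] += weights[i] / (k * p[i])   # reweight: E[new_w] = weights

LH = B.T @ np.diag(new_w) @ B             # Laplacian of the sparsifier
print((new_w > 0).sum(), "edges kept of", m)
```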

Approximate Gaussian Elimination (Kyng & Sachdeva ’16). Gaussian elimination: compute an upper triangular $U$ so that $U^T U = L$. Approximate Gaussian elimination: compute a sparse upper triangular $U$ so that $U^T U \approx L$.

Gaussian Elimination: 1. Find the rank-1 matrix that agrees with the current matrix on the first row and column. 2. Subtract it. 3. Repeat with the next row and column. The computation time is proportional to the sum of the squares of the numbers of nonzeros in the vectors defining these rank-1 matrices.
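A sketch of this view of elimination as repeated rank-1 subtraction, on the Laplacian of a triangle; the helper name is invented:

```python
# Gaussian elimination as rank-1 subtraction, yielding U with U^T U = L.
import numpy as np

def rank1_elimination(M):
    """Return U whose rows are the elimination vectors, so U^T U equals M
    (M symmetric PSD; rows with zero pivots, e.g. a Laplacian's nullspace,
    are skipped)."""
    A = M.astype(float).copy()
    rows = []
    for i in range(A.shape[0]):
        if A[i, i] <= 1e-12:
            continue                      # zero pivot: nothing left to subtract
        u = A[i, :] / np.sqrt(A[i, i])    # rank-1 term u u^T agrees with A
        A -= np.outer(u, u)               # ...on row/column i; subtract it
        rows.append(u)
    return np.array(rows)

L = np.array([[ 2, -1, -1],               # Laplacian of a triangle graph
              [-1,  2, -1],
              [-1, -1,  2]], dtype=float)
U = rank1_elimination(L)
assert np.allclose(U.T @ U, L)
```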

Gaussian Elimination of Laplacians: if the matrix is a Laplacian before an elimination step, then the matrix that remains is a Laplacian too. When we eliminate a node, we add a clique on its neighbors.

Approximate Gaussian Elimination (Kyng & Sachdeva ’16): 0. Initialize by randomly permuting the vertices and making copies of every edge. 1. When eliminating a node of degree $d$, add $d$ edges at random between its neighbors, sampled with probability proportional to the weight of the edge to the eliminated node; this sparsifies the elimination clique without ever constructing it. Total expected time is $O(m \log^3 n)$.

Analysis by Random Matrix Theory (Kyng & Sachdeva ’16): write $U^T U$ as a sum of random matrices; the random permutation and the edge copying control the variances of these matrices; then apply the Matrix Freedman inequality (Tropp ’11).

The running time can be improved by sacrificing some of the algorithm's simplicity.
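A sketch of one elimination step with clique sampling, simplified from the scheme above: no random permutation or edge copying, and the sampled-edge weight $w_u w_z / (w_u + w_z)$ is an assumed unbiased choice, not quoted from the talk:

```python
# One step of approximate Gaussian elimination on a Laplacian stored as a
# weighted adjacency map: eliminate a vertex, adding deg(v) sampled edges
# between its neighbors instead of the full clique.
import random
from collections import defaultdict

def eliminate(adj, v):
    """Eliminate vertex v in place, sampling one clique edge per edge at v."""
    nbrs = list(adj[v].items())                       # [(neighbor, weight), ...]
    wts = [w for _, w in nbrs]
    for u, wu in nbrs:                                # one sample per edge to v
        z, wz = random.choices(nbrs, weights=wts)[0]  # partner ~ weight to v
        if z != u:
            # Expected total weight on (u, z) is w_u*w_z / W, the true clique
            # weight produced by exact elimination.
            w_new = wu * wz / (wu + wz)
            adj[u][z] = adj[u].get(z, 0.0) + w_new
            adj[z][u] = adj[z].get(u, 0.0) + w_new
    for u, _ in nbrs:                                 # remove v from the graph
        del adj[u][v]
    del adj[v]

adj = defaultdict(dict)                               # a weighted star graph
for u, w in [(1, 1.0), (2, 2.0), (3, 1.0), (4, 0.5)]:
    adj[0][u] = w
    adj[u][0] = w
eliminate(adj, 0)                                     # eliminate the center
print({u: nb for u, nb in adj.items()})
```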

Recent Developments (Kyng, Lee, Peng, Sachdeva, S ’16): other families of linear systems, including complex-weighted Laplacians and connection Laplacians, and other optimization problems.

To learn more: my web page on Laplacian linear equations, sparsification, local graph clustering, low-stretch spanning trees, and so on; my class notes from "Graphs and Networks" and "Spectral Graph Theory"; Lx = b, by Nisheeth Vishnoi; and Laplacians.jl.