
Combinatorial and algebraic tools for multigrid
Yiannis Koutis
Computer Science Department, Carnegie Mellon University
Aladdin Lamps '05, Carnegie Mellon School of Computer Science, 05/11/2005

multilevel methods
- citations, 25 free software packages, 10 special conferences since 1983
- algorithms not always working
- limited theoretical understanding

multilevel methods: our goals
- provide theoretical understanding
- solve multilevel design problems
- small changes in current software
- study the structure of eigenspaces of Laplacians
- extensions for multilevel eigensolvers

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

quick definitions
Given a graph G with weights w_ij:
- Laplacian: A(i,j) = -w_ij for i ≠ j, with row sums = 0
- Normalized Laplacian: D^{-1/2} A D^{-1/2}, where D = diag(A) holds the weighted degrees
- κ(A,B) is a measure of how well B approximates A (and vice versa)
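
To make the definitions concrete, here is a minimal NumPy sketch (my illustration, not from the slides), assuming the graph is given as a symmetric weight matrix W:

```python
import numpy as np

def laplacian(W):
    """Graph Laplacian: off-diagonal entries -w_ij, rows summing to zero."""
    D = np.diag(W.sum(axis=1))           # weighted degree matrix
    return D - W

def normalized_laplacian(W):
    """Normalized Laplacian D^{-1/2} (D - W) D^{-1/2}."""
    d = W.sum(axis=1)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    return Dinv_sqrt @ laplacian(W) @ Dinv_sqrt

# Tiny example: a weighted path on 3 vertices with edge weights 2 and 1.
W = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
A = laplacian(W)
assert np.allclose(A.sum(axis=1), 0)     # row sums are zero
```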

linear systems: preconditioning
Goal: solve Ax = b via an iterative method, where A is a Laplacian of size n with m edges. The complexity depends on κ(A,I) and m.
Solution: solve B^{-1}Ax = B^{-1}b instead, where
- Bz = y must be easily solvable
- κ(A,B) is small
B is the preconditioner.
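
The following SciPy sketch (mine; the diagonal preconditioner is only a stand-in for a combinatorial one) shows the mechanics: the iterative method only ever needs multiplications by A and solves against B.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# Laplacian of a unit-weight path on n vertices, grounded at one node
# (dropping the first row/column makes it positive definite, a standard trick).
n = 1000
w = np.ones(n - 1)
A = diags([np.r_[w, 0] + np.r_[0, w], -w, -w], [0, 1, -1], format="csr")
A = A[1:, 1:]                                  # grounded Laplacian

b = np.random.randn(n - 1)

# One solve with B per iteration. Here B = diag(A) (Jacobi), just to show the
# interface; a combinatorial preconditioner would replace this with a solve
# against a subgraph or support graph.
d = A.diagonal()
M = LinearOperator(A.shape, matvec=lambda y: y / d)

x, info = cg(A, b, M=M)                        # info == 0 signals convergence
```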

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

combinatorial preconditioners: the Vaidya thread
B is a sparse subgraph of A, possibly with additional edges. Solving Bz = y is performed as follows:
1. Gaussian elimination on the degree ≤ 2 nodes of B.
2. This leaves a smaller system that must be solved.
3. Recursively call the same algorithm on the smaller system to get an approximate solution.
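
Step 1 has a clean graph interpretation: eliminating a degree-1 vertex simply deletes it, while eliminating a degree-2 vertex with incident weights w1, w2 replaces its two edges by a single edge of weight w1*w2/(w1+w2) (the series rule for conductances, which is exactly the Schur complement). A toy sketch of mine, assuming the graph is stored as an adjacency dict; a real solver would also record each elimination for back-substitution:

```python
def eliminate_low_degree(adj):
    """One round of Gaussian elimination on degree <= 2 vertices of a weighted
    graph, phrased purely in graph language.

    adj: dict mapping vertex -> {neighbor: weight}; modified in place.
    """
    queue = [v for v in adj if len(adj[v]) <= 2]
    while queue:
        v = queue.pop()
        if v not in adj or len(adj[v]) > 2:
            continue
        nbrs = list(adj[v].items())
        if len(nbrs) == 1:                       # degree 1: drop the vertex
            u, _ = nbrs[0]
            del adj[u][v]
            queue.append(u)
        elif len(nbrs) == 2:                     # degree 2: series rule
            (u, w1), (t, w2) = nbrs
            del adj[u][v], adj[t][v]
            w = w1 * w2 / (w1 + w2)              # Schur complement weight
            adj[u][t] = adj[u].get(t, 0.0) + w   # parallel edges merge by adding
            adj[t][u] = adj[t].get(u, 0.0) + w
            queue += [u, t]
        del adj[v]
```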

combinatorial preconditioners: the Vaidya thread
- Graph sparsification [Spielman, Teng]
- Low stretch trees [Elkin, Emek, Spielman, Teng]
- Near optimal O(m polylog n) complexity
The focus is on constructing a good B:
- κ(A,B) is well understood; B is sparser than A
- B can look complicated even for simple graphs A

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

combinatorial preconditioners: the Gremban–Miller thread
The support graph S is bigger than A. (figure: the support graph and its quotient)

combinatorial preconditioners: the Gremban–Miller thread
- The preconditioner S is often a natural graph
- S inherits the sparsity properties of A
- S is equivalent to a dense graph B of size equal to that of A: κ(A,S) = κ(A,B)
- Analysis of κ(A,S) is made easy by the work of [Maggs, Miller, Ravi, Woo, Parekh]
- Existence of a good S follows from the work of [Räcke]

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions; Other results

algebraic expressions
Suppose we are given m clusters in A. The clustering matrix R is n x m, with R(i,j) = 1 if the j-th cluster contains node i. The quotient is Q = R^T A R.
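
A small NumPy sketch (mine) of R and the quotient, for the 3-node weighted path from the earlier example, clustered as {0,1} and {2}:

```python
import numpy as np

def clustering_matrix(labels, m):
    """R is n x m with R[i, j] = 1 iff node i belongs to cluster j."""
    n = len(labels)
    R = np.zeros((n, m))
    R[np.arange(n), labels] = 1.0
    return R

# Laplacian of the path with edge weights 2 (nodes 0-1) and 1 (nodes 1-2).
A = np.array([[ 2.0, -2.0,  0.0],
              [-2.0,  3.0, -1.0],
              [ 0.0, -1.0,  1.0]])
R = clustering_matrix([0, 0, 1], m=2)
Q = R.T @ A @ R    # quotient: the Laplacian of the clustered graph
# Q = [[1, -1], [-1, 1]]: the internal edge collapses, the cut edge of weight 1 remains.
```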

algebraic expressions
- The inverse preconditioner
- The normalized version: R^T D^{1/2} is the weighted clustering matrix

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions; Other results

good partitions and low frequency invariant subspaces
Suppose the graph A has a good clustering defined by the clustering matrix R, and let y be any vector satisfying the associated constraint.
Theorem: The inequality is tight up to a constant for certain graphs.
Can this serve as a quality test?

good partitions and low frequency invariant subspaces
Let y be any vector satisfying the same constraint, and let x be mostly a linear combination of eigenvectors corresponding to eigenvalues near the bottom of the spectrum.
Theorem: such an x is close to some such y. To test a clustering, we can draw a random low frequency vector x and check its distance to the closest y.
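
One plausible reading of this test, sketched below (my interpretation: the exact constraint on y did not survive transcription, so I take the "closest y" to be the orthogonal projection onto the column span of D^{1/2} R, the transpose of the weighted clustering matrix R^T D^{1/2}):

```python
import numpy as np

def partition_quality_test(A, R, steps=50, seed=0):
    """Draw a random vector, smooth it into a low frequency vector x, and
    measure its distance to the span of D^{1/2} R: a small distance is
    consistent with the clustering capturing the low eigenspace of A."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    d = A.diagonal()                          # weighted degrees of the Laplacian
    x = rng.standard_normal(n)
    c = 1.0 / np.abs(A).sum(axis=1).max()     # step size below 1/lambda_max
    for _ in range(steps):                    # Richardson smoothing: damps high
        x = x - c * (A @ x)                   # frequencies, keeps low ones
    x = x / np.linalg.norm(x)
    Wc = np.sqrt(d)[:, None] * R              # columns span D^{1/2} R
    Qb, _ = np.linalg.qr(Wc)                  # orthonormal basis of that span
    return np.linalg.norm(x - Qb @ (Qb.T @ x))   # distance to the closest y
```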

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

multigrid – short introduction
A general class of algorithms. Richardson iteration: x_new = x_old + c (b - A x_old). The high frequency components of the error are reduced fastest.
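
A small demonstration of this effect (my sketch) on the 1D model problem: after ten Richardson steps an oscillatory error component has collapsed, while a smooth component is barely touched.

```python
import numpy as np

n = 64
# 1D second-difference (path) matrix; its spectrum lies in [0, 4].
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

i = np.arange(n)
smooth = np.cos(np.pi * i / n)              # low frequency error component
rough = (-1.0) ** i * np.cos(np.pi * i / n)  # high frequency error component

c = 0.25                                    # 1/lambda_max
for _ in range(10):                         # Richardson on the error: e <- (I - cA) e
    smooth = smooth - c * (A @ smooth)      # low frequency: damped only slightly
    rough = rough - c * (A @ rough)         # high frequency: damped strongly

print(np.linalg.norm(smooth), np.linalg.norm(rough))
```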

initial and smoothed error (figure: initial error vs. smoothed error)

the basic multigrid algorithm
Define a smaller graph Q, a projection operator R_project, and a lift operator R_lift.
1. Apply t rounds of smoothing (how many? which iteration?)
2. Take the residual r = b - A x_old
3. Solve Qz = R_project r (by recursion)
4. Form the new iterate x_new = x_old + R_lift z
5. Apply t rounds of smoothing (is this needed?)
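
A minimal two-grid rendering of these five steps (my sketch: Richardson smoothing, R_lift = R for a clustering matrix R, R_project = R^T, and a least squares solve on the coarse level standing in for the recursion):

```python
import numpy as np

def two_grid_cycle(A, R, b, x, t=3):
    """One cycle of the basic algorithm: pre-smooth, coarse-grid correction
    on the quotient Q = R^T A R, post-smooth."""
    c = 1.0 / np.abs(A).sum(axis=1).max()            # Richardson step size
    for _ in range(t):                               # 1. pre-smoothing
        x = x + c * (b - A @ x)
    r = b - A @ x                                    # 2. residual
    Q = R.T @ A @ R                                  # smaller (quotient) graph
    z = np.linalg.lstsq(Q, R.T @ r, rcond=None)[0]   # 3. coarse solve (Q is
                                                     #    singular for a Laplacian)
    x = x + R @ z                                    # 4. lift the correction
    for _ in range(t):                               # 5. post-smoothing
        x = x + c * (b - A @ x)
    return x
```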

algebraic multigrid (AMG)
Goals:
- The range of R_project must approximate the unreduced error very well.
- The error not reduced by smoothing must be reduced in the smaller grid.
Smoothing: Jacobi iteration or scaled Richardson.
Construction:
- Find a clustering [heuristic]
- R_project = (R_lift)^T [heuristic]
- Q = R_project A R_project^T = R^T A R
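
For concreteness, a weighted Jacobi smoother (my sketch; omega = 2/3 is the classical damping choice for model problems):

```python
import numpy as np

def jacobi_smooth(A, b, x, t=3, omega=2.0 / 3.0):
    """t steps of weighted Jacobi: x <- x + omega * D^{-1} (b - A x),
    where D is the diagonal of A."""
    d = np.asarray(A.diagonal()).ravel()
    for _ in range(t):
        x = x + omega * (b - A @ x) / d
    return x
```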

two level analysis
Analyze the maximum eigenvalue of the two-level error propagation matrix. The coarse-grid correction matrix T_1 eliminates the error in the range of the lift operator; a low frequency eigenvector has a significant component in that range.

two level analysis
Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than a threshold, and let Y be the null space of R_project. Assume the angle between X and Y is bounded.
Two level convergence: the error is reduced by a constant factor.
Proving the hypothesis? Only in limited cases.

current state
"there is no systematic AMG approach that has proven effective in any kind of general context" [BCFHJMMR, SIAM Journal on Scientific Computing, 2003]

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

our contributions – two level
- There exists a good clustering given by R; its quality is measured by the condition number κ(A,S)
- Q = R^T A R
- Richardson iterations, with projection matrix R_project = R^T D^{1/2}

our contributions – two level analysis
Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than a threshold, and let Y be the null space of R_project = R^T D^{1/2}. Assume the angle between X and Y is bounded.
Two level convergence: the error is reduced by a constant factor.
Proving the hypothesis? Yes! Using κ(A,S).
The result holds for t = 1 smoothing; additional smoothings do not help.

our contributions – recursion
There is a matrix M which characterizes the error reduction after one full multigrid cycle. We need to upper bound its maximum eigenvalue as a function of the two-level eigenvalues: the maximum eigenvalue of M is upper bounded by the sum of the maximum eigenvalues over all two-level operators.
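
To make "one full multigrid cycle" concrete, here is a recursive V-cycle variant of the earlier two-grid sketch (mine; it assumes a precomputed hierarchy As of operators and Rs of clustering matrices, with As[k+1] = Rs[k]^T As[k] Rs[k]):

```python
import numpy as np

def v_cycle(As, Rs, b, x, level=0, t=3):
    """Multilevel version of the two-grid cycle: the coarse system is handled
    by a recursive call instead of a direct solve."""
    A = As[level]
    if level == len(As) - 1:                      # coarsest level: direct solve
        return np.linalg.lstsq(A, b, rcond=None)[0]
    c = 1.0 / np.abs(A).sum(axis=1).max()
    for _ in range(t):                            # pre-smoothing
        x = x + c * (b - A @ x)
    R = Rs[level]
    r = R.T @ (b - A @ x)                         # restricted residual
    z = v_cycle(As, Rs, r, np.zeros(R.shape[1]), level + 1, t)
    x = x + R @ z                                 # lifted correction
    for _ in range(t):                            # post-smoothing
        x = x + c * (b - A @ x)
    return x
```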

towards full convergence
Goal: the error not reduced by smoothing must be reduced by the smaller grid.
A different point of view: the small grid does not reduce part of the error; rather, it changes its spectral profile.

full convergence for regular d-dimensional toroidal meshes
A simple change in the implementation of the algorithm: introduce a matrix T_2 with eigenvalues 1 and -1 that maps low frequencies to high frequencies, T_2 x_low = x_high.
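
One concrete operator with exactly these properties (my guess at an instance; the transcript does not specify the slides' T_2) is the diagonal sign flip on a ring of even length:

```python
import numpy as np

n = 8
T2 = np.diag((-1.0) ** np.arange(n))   # diagonal entries +/-1: eigenvalues 1 and -1

i = np.arange(n)
x_low = np.cos(2 * np.pi * i / n)      # smoothest nonconstant mode on the ring
x_high = T2 @ x_low                    # frequency shifts from 1 to n/2 - 1: oscillatory
```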

full convergence for regular d-dimensional toroidal meshes
- With t = O(log log n) smoothings and the recursive analysis: λ_max(M) ≤ 1/2
- Both pre-smoothings and post-smoothings are needed
- The result holds for perturbations of toroidal meshes

Overview: Quick definitions; Subgraph preconditioners; Support graph preconditioners; Algebraic expressions; Low frequency eigenvectors and good partitionings; Multigrid introduction and current state; Multigrid – Our contributions

Thanks!