Parallel Algorithm Design using Spectral Graph Theory

Presentation transcript:

Parallel Algorithm Design using Spectral Graph Theory
Gary L. Miller

Spectral Graph Theory
Use linear algebra to solve graph problems; use graph theory to solve linear algebra problems. We start with the first.
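As a small illustration of the first direction, the sketch below (assuming NumPy and a made-up 4-vertex graph, neither of which appears in the talk) builds the graph Laplacian L = D - A and reads connectivity off its spectrum: the smallest eigenvalue is always 0, and the second smallest is positive exactly when the graph is connected.

```python
import numpy as np

# Example graph on 4 vertices (made up for illustration).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

A = np.zeros((n, n))                  # adjacency matrix
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

L = np.diag(A.sum(axis=1)) - A        # Laplacian: degree matrix minus adjacency

eigvals = np.linalg.eigvalsh(L)       # symmetric, so real eigenvalues, sorted ascending
print(eigvals)                        # eigvals[0] is 0; eigvals[1] > 0 iff the graph is connected
```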

Talking Point
We need to develop fast parallel primitives that have many applications. An important primitive: linear system solvers.

The Oldest Computational Problem: solving systems of linear equations.

Image Denoising and Image Segmentation
Given an image plus noise, recover the image. [Example images: SpaceX moon launch]
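One standard way this becomes a linear-algebra problem (a minimal sketch, not necessarily the exact formulation used in the talk): model the denoised image x as the minimizer of ||x - s||^2 + lam * sum over grid edges (i,j) of (x_i - x_j)^2, whose minimizer solves the SDD system (I + lam*L)x = s, where L is the Laplacian of the pixel grid. The synthetic image, noise level, and lam below are made up.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
h, w, lam = 32, 32, 1.0
s = np.zeros((h, w))
s[8:24, 8:24] = 1.0                          # synthetic "clean" image: a bright square
s += 0.3 * rng.standard_normal((h, w))       # add noise

def path_laplacian(k):
    """Laplacian of the path graph on k vertices."""
    main = np.r_[1, 2 * np.ones(k - 2), 1]
    return sp.diags([main, -np.ones(k - 1), -np.ones(k - 1)], [0, 1, -1])

L = sp.kronsum(path_laplacian(w), path_laplacian(h))    # Laplacian of the h-by-w pixel grid
A = sp.identity(h * w) + lam * L                        # (I + lam*L): SDD and sparse

x = spla.spsolve(A.tocsc(), s.ravel()).reshape(h, w)    # the denoised image
```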

Convex Programs as a Higher Primitive
Linear programs: maximum flow in a graph, minimum cost maximum flow, single source shortest path.
General convex programs: image denoising, machine learning.

Interior Point Methods for Convex Programs
Karmarkar, Renegar, Ye, Nesterov, Nemirovski, Boyd, Vandenberghe, ...
INPUT: a convex program with m constraints.
RUNTIME: O(√m) linear system solves.
One of the major breakthroughs of the 20th century. How fast is the solver?
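To make "one linear system solve per iteration" concrete, here is a minimal log-barrier sketch for a tiny, made-up LP (not the talk's method, and with no attempt at the O(√m) iteration bound): each Newton centering step assembles a Hessian and solves one linear system.

```python
import numpy as np

# Toy LP: minimize x0 + x1 subject to x0 >= 0, x1 >= 0, x0 + x1 >= 1,
# written in the standard form  G x <= h.
c = np.array([1.0, 1.0])
G = np.array([[-1.0, 0.0], [0.0, -1.0], [-1.0, -1.0]])
h = np.array([0.0, 0.0, -1.0])

x = np.array([2.0, 2.0])                           # strictly feasible starting point
for t in [1.0, 4.0, 16.0, 64.0, 256.0, 1024.0]:    # increasing barrier weight
    for _ in range(30):                            # Newton centering steps for this t
        slack = h - G @ x                          # must stay strictly positive
        grad = t * c + G.T @ (1.0 / slack)         # gradient of t*c^T x - sum(log(slack))
        H = G.T @ np.diag(1.0 / slack**2) @ G      # Hessian of the barrier term
        dx = np.linalg.solve(H, -grad)             # <-- the linear system solve per iteration
        step = 1.0
        while np.any(h - G @ (x + step * dx) <= 0):
            step *= 0.5                            # backtrack to preserve strict feasibility
        x = x + 0.5 * step * dx                    # damped Newton update

print(x)   # close to the toy LP's optimum (0.5, 0.5)
```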

Special Linear Systems
A is symmetric diagonally dominant (SDD):
Symmetric: A equals its transpose.
Each diagonal entry is at least the sum of the absolute values of the off-diagonal entries in its row: A_ii ≥ Σ_{j≠i} |A_ij|.
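A small helper (illustrative, not part of the talk) that checks exactly this definition:

```python
import numpy as np

def is_sdd(A, tol=1e-12):
    """Check: symmetric, and each diagonal entry >= sum of |off-diagonals| in its row."""
    A = np.asarray(A, dtype=float)
    if not np.allclose(A, A.T):
        return False
    off_diag_sums = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    return bool(np.all(np.diag(A) >= off_diag_sums - tol))

# Example: a graph Laplacian is SDD (with equality in every row).
L = np.array([[ 2, -1, -1],
              [-1,  2, -1],
              [-1, -1,  2]], dtype=float)
print(is_sdd(L), is_sdd(L - np.eye(3)))   # True False
```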

Problems whose interior point pivot is an SDD system
Linear programs: maximum flow in a graph, minimum cost maximum flow, single source shortest path.
General convex programs: image denoising, several machine learning problems.

Fundamental Problem: Solving Linear Systems
Given an n-by-n matrix A with m non-zero entries and a vector b, find a vector x such that Ax = b.
Almost exact (approximate) solutions are acceptable.

Fast SDD Solvers
Input: an n-by-n SDD matrix A with m non-zeros, and a vector b.
Output: an approximate solution x to Ax = b.
Runtime: O(m log n log(1/ε)).
Parallel solver: O(m^{1/3}) depth and nearly-linear work.
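As a runnable stand-in for such a solver (not the nearly-linear-time algorithm referenced above), the sketch below applies conjugate gradient from SciPy to an SDD system built from a grid-graph Laplacian; the grid size and diagonal shift are made up.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def grid_laplacian(k):
    """Laplacian of the k-by-k grid graph."""
    main = np.r_[1, 2 * np.ones(k - 2), 1]
    P = sp.diags([main, -np.ones(k - 1), -np.ones(k - 1)], [0, 1, -1])
    return sp.kronsum(P, P)

k = 50
A = grid_laplacian(k) + 1e-3 * sp.identity(k * k)            # SDD and nonsingular
b = np.random.default_rng(0).standard_normal(k * k)

x, info = spla.cg(A.tocsr(), b)                              # iterative solve; info == 0 on success
print(info, np.linalg.norm(A @ x - b) / np.linalg.norm(b))   # relative residual
```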

Theoretical Applications of SDD Solvers: Multiple Iterations
Learning on graphical models. Planar graph embeddings. Finite element PDEs. Generating random spanning trees. Maximum flow.

The Graph Algorithms in the Solver
Low-stretch spanning trees.
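A minimal sketch of the quantity these trees are designed to keep small: for an unweighted graph and a chosen spanning tree, the stretch of an edge (u, v) is the length of the tree path between u and v, and the total stretch sums this over all edges. The cycle graph and path tree below are made up for illustration.

```python
from collections import deque

def tree_distance(tree_adj, u, v):
    """BFS distance between u and v using only tree edges."""
    dist, queue = {u: 0}, deque([u])
    while queue:
        w = queue.popleft()
        if w == v:
            return dist[w]
        for z in tree_adj[w]:
            if z not in dist:
                dist[z] = dist[w] + 1
                queue.append(z)
    raise ValueError("v not reachable in tree")

def total_stretch(n, edges, tree_edges):
    """Sum of tree-path lengths over all graph edges."""
    tree_adj = {i: [] for i in range(n)}
    for u, v in tree_edges:
        tree_adj[u].append(v)
        tree_adj[v].append(u)
    return sum(tree_distance(tree_adj, u, v) for u, v in edges)

# Cycle on 8 vertices with a path spanning tree: seven edges have stretch 1,
# and the closing edge (7, 0) is stretched to the full tree path of length 7.
n = 8
cycle = [(i, (i + 1) % n) for i in range(n)]
path_tree = [(i, i + 1) for i in range(n - 1)]
print(total_stretch(n, cycle, path_tree))   # 7 * 1 + 7 = 14
```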

A Better Tree: the Recursive 'C' Construction

Future Directions
Going from theory to practice.
Fundamental algorithm design is still missing.
More applications.
Work-efficient parallelizations?
Fast solvers for symmetric positive definite systems.

Thank You!