Bounding the Mixing Time via the Spectral Gap
Graph Random Walk Seminar, Fall 2009
Ilan Ben-Bassat, Omri Weinstein

Outline
- Why spectral gap?
- Undirected regular graphs.
- Directed reversible graphs.
- Example (the unit hypercube).
- Conductance.
- Reversible chains.

P is the transition matrix of an ergodic random walk on an undirected d-regular graph, i.e. P = A/d for the adjacency matrix A. P is real, stochastic, and symmetric, thus: all its eigenvalues are real, and P has n real, orthogonal eigenvectors.
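A minimal numerical sketch of these facts (assuming Python with numpy; the example graph K_4 is ours, chosen because it is regular, connected, and non-bipartite, hence ergodic):

```python
import numpy as np

# Simple random walk on K_4, a 3-regular graph.
n, d = 4, 3
A = np.ones((n, n)) - np.eye(n)          # adjacency matrix of K_4
P = A / d                                # transition matrix P = A/d

assert np.allclose(P, P.T)               # symmetric, since the graph is regular
assert np.allclose(P.sum(axis=1), 1.0)   # stochastic: each row sums to 1
print(np.linalg.eigvalsh(P))             # real spectrum: [-1/3, -1/3, -1/3, 1]
```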

P's eigenvalues satisfy $1 = \lambda_1 > \lambda_2 \ge \dots \ge \lambda_n > -1$. Why is 1 an eigenvalue? Why do all eigenvalues satisfy $|\lambda_i| \le 1$? Why is there no second eigenvalue 1? Why is there no eigenvalue $-1$?

Laplacian matrix: $L = I - P$. Since P is symmetric with eigenvalues in $[-1, 1]$, L is symmetric and positive semidefinite. L has eigenvalue 0 with eigenvector $\mathbf{1}$, the all-ones vector (P is stochastic, so $P\mathbf{1} = \mathbf{1}$). Claim: the multiplicity of 0 is 1, since the graph is connected.
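A quick check of the Laplacian claims on the same illustrative K_4 walk (a sketch, assuming numpy):

```python
import numpy as np

n, d = 4, 3
P = (np.ones((n, n)) - np.eye(n)) / d    # walk on K_4, as above
L = np.eye(n) - P                        # Laplacian L = I - P

w = np.linalg.eigvalsh(L)
assert np.all(w >= -1e-12)               # positive semidefinite
print(np.isclose(w, 0).sum())            # multiplicity of eigenvalue 0: 1 (connected)
print(L @ np.ones(n))                    # L·1 = 0: the all-ones vector spans the kernel
```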

So?...

Intuition: how could eigenvalues and mixing time be connected?

Spectral Gap and Mixing Time. The spectral gap determines the mixing rate: larger spectral gap = rapid mixing.

As for P's spectral decomposition: P has an orthonormal basis of eigenvectors $v_1, \dots, v_n$, so $P^t = \sum_i \lambda_i^t v_i v_i^T$. Setting $\lambda_* = \max(|\lambda_2|, |\lambda_n|)$, we can bound $|P^t(x, y) - \tfrac{1}{n}|$ by $\lambda_*^t$. So the mixing time is bounded by $\tau(\epsilon) \le \frac{1}{1 - \lambda_*} \ln\frac{n}{\epsilon}$, up to constants.
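A sketch comparing the true distance from stationarity with this spectral bound (assuming numpy; K_4 again, and using the crude form $\mathrm{TV} \le \tfrac{n}{2}\lambda_*^t$ implied by the pointwise bound above):

```python
import numpy as np

n, d = 4, 3
P = (np.ones((n, n)) - np.eye(n)) / d            # walk on K_4
lam = np.sort(np.abs(np.linalg.eigvalsh(P)))
lam_star = lam[-2]                               # largest |eigenvalue| other than 1
pi = np.full(n, 1.0 / n)                         # stationary distribution (uniform)

Pt = np.eye(n)
for t in range(1, 8):
    Pt = Pt @ P
    tv = 0.5 * np.abs(Pt - pi).sum(axis=1).max() # worst-case total-variation distance
    print(t, tv, (n / 2) * lam_star ** t)        # the bound dominates the true distance
```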

Now assume a directed reversible graph (or a general, non-regular undirected graph). P is no longer symmetric, so we have no direct spectral analysis. But P is similar to a symmetric matrix!

Proof. Let $D^{1/2}$ be the diagonal matrix with diagonal entries $\sqrt{\pi_i}$. Claim: $S = D^{1/2} P D^{-1/2}$ is a symmetric matrix. From reversibility, $\pi_i P_{ij} = \pi_j P_{ji}$, hence $S_{ij} = \sqrt{\pi_i / \pi_j}\, P_{ij} = \sqrt{\pi_j / \pi_i}\, P_{ji} = S_{ji}$.
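A sketch of this symmetrization (assuming numpy; the example chain, a triangle with a pendant vertex, is ours: it is an irregular undirected graph, so its walk is reversible with $\pi_i \propto \deg(i)$ but P is not symmetric):

```python
import numpy as np

# Triangle 0-1-2 plus a pendant edge 1-3: degrees (2, 3, 2, 1), irregular.
A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 0), (1, 3)]:
    A[u, v] = A[v, u] = 1.0
deg = A.sum(axis=1)
P = A / deg[:, None]                     # P_ij = 1/deg(i) for neighbours j; not symmetric
pi = deg / deg.sum()                     # stationary distribution pi_i = deg(i)/2|E|
assert np.allclose(pi @ P, pi)           # pi is stationary

D_half = np.diag(np.sqrt(pi))
D_half_inv = np.diag(1.0 / np.sqrt(pi))
S = D_half @ P @ D_half_inv              # S_ij = sqrt(pi_i / pi_j) * P_ij
assert np.allclose(S, S.T)               # symmetric, exactly by reversibility

# Same spectrum; eigenvectors transport via D^{-1/2}.
w, V = np.linalg.eigh(S)
for i in range(4):
    u = D_half_inv @ V[:, i]
    assert np.allclose(P @ u, w[i] * u)  # D^{-1/2} v is an eigenvector of P
print(w)                                 # real eigenvalues of the non-symmetric P
```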

S and P are similar, so they have the same eigenvalues. What about eigenvectors? If $v$ is an eigenvector of S, then $D^{-1/2} v$ is an eigenvector of P with the same eigenvalue.

Still: why do all eigenvalues satisfy $|\lambda_i| \le 1$? (same argument) Why is 1 an eigenvalue? (same) Why is it unique? (same goes for $-1$.) As for the uniqueness of 1, Omri will prove it.

According to the spectral decomposition of symmetric matrices, $S = \sum_i \lambda_i v_i v_i^T$ with orthonormal $v_i$, hence $P^t = D^{-1/2} S^t D^{1/2} = \sum_i \lambda_i^t D^{-1/2} v_i v_i^T D^{1/2}$. Note: the eigenvector of S for eigenvalue 1 is $v_1 = D^{1/2} \mathbf{1} = (\sqrt{\pi_1}, \dots, \sqrt{\pi_n})^T$.

Main Lemma: for every pair of states $x, y$, we define $\lambda_* = \max_{i \ge 2} |\lambda_i|$. So, for every $t$ we get: $\left|\frac{P^t(x, y)}{\pi(y)} - 1\right| \le \frac{\lambda_*^t}{\sqrt{\pi(x)\, \pi(y)}}$.

So, we can get $\|P^t(x, \cdot) - \pi\|_{TV} \le \frac{\lambda_*^t}{2\, \pi_{\min}}$, where $\pi_{\min} = \min_x \pi(x)$.

Now, we can bound the mixing time: $\tau(\epsilon) \le \frac{1}{1 - \lambda_*} \ln\frac{1}{\epsilon\, \pi_{\min}}$.
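A sketch putting the final bound to work on the same illustrative triangle-plus-pendant chain (assuming numpy; the "measured" mixing time is the worst-case total-variation distance falling below epsilon):

```python
import numpy as np

A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 0), (1, 3)]:
    A[u, v] = A[v, u] = 1.0
deg = A.sum(axis=1)
P = A / deg[:, None]
pi = deg / deg.sum()

lam_star = np.sort(np.abs(np.linalg.eigvals(P)))[-2]   # second-largest modulus
eps = 0.25
bound = np.log(1.0 / (eps * pi.min())) / (1.0 - lam_star)
print("spectral bound on tau(eps):", bound)

Pt, t = np.eye(4), 0                     # measure the actual mixing time
while 0.5 * np.abs(Pt - pi).sum(axis=1).max() > eps:
    Pt, t = Pt @ P, t + 1
print("measured tau(eps):", t)           # the spectral bound dominates it
```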

Summary: we have bounded the mixing time for irreducible, aperiodic, reversible chains. Note: reducible chains have no unique eigenvalue 1, and periodic chains (e.g., walks on bipartite graphs) fail too, since they have eigenvalue $-1$.

Graph Product. Let $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$. The product $G_1 \times G_2$ is defined on the vertex set $V_1 \times V_2$, with $(u_1, u_2) \sim (v_1, v_2)$ iff $u_1 = v_1$ and $u_2 \sim v_2$, or $u_2 = v_2$ and $u_1 \sim v_1$. [Figure: the product of two single edges is the unit square, on vertices (0,0), (0,1), (1,0), (1,1).]

In matrix form, $A(G_1 \times G_2) = A(G_1) \otimes I + I \otimes A(G_2)$: entry $((i, k), (j, l))$ is 1 iff $i = j$ and $(k, l) \in E_2$, or $k = l$ and $(i, j) \in E_1$.

So, only the permutations that were counted in the determinant of $A(G_1)$ are counted here; instead of $\lambda$ we get $\lambda + \mu$. So the eigenvalues of the product are exactly the sums $\lambda_i + \mu_j$ of eigenvalues of the two factors (with eigenvectors $u_i \otimes w_j$).
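A numerical check of this fact (a sketch, assuming numpy; the example takes the product of two single edges $K_2$, which is the square $Q_2$):

```python
import numpy as np

A1 = np.array([[0.0, 1.0], [1.0, 0.0]])  # K_2: eigenvalues +1 and -1
A2 = A1.copy()

# Adjacency matrix of the product: A1 (x) I + I (x) A2.
Aprod = np.kron(A1, np.eye(2)) + np.kron(np.eye(2), A2)

sums = sorted(l + m for l in np.linalg.eigvalsh(A1)
                    for m in np.linalg.eigvalsh(A2))
print(sorted(np.linalg.eigvalsh(Aprod))) # [-2, 0, 0, 2]
print(sums)                              # identical: eigenvalues add across factors
```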

The eigenvectors of $Q_n$ (the n-fold product of single edges) are the characters $\chi_S(x) = (-1)^{\sum_{i \in S} x_i}$, with adjacency eigenvalues $n - 2|S|$. We now recompute every eigenvalue by: adding n (self-loops), then dividing by 2n (to get a transition matrix). Now we get $\lambda_k = \frac{(n - 2k) + n}{2n} = 1 - \frac{k}{n}$, so $\lambda_* = 1 - \frac{1}{n}$, and the mixing time satisfies $\tau(\epsilon) \le n \ln\frac{2^n}{\epsilon} = O\big(n\, (n + \ln\tfrac{1}{\epsilon})\big)$.
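A sketch verifying the hypercube spectrum (assuming numpy; n = 3 is an illustrative choice):

```python
import numpy as np
from math import comb

n = 3
N = 2 ** n
A = np.zeros((N, N))
for u in range(N):
    for b in range(n):
        A[u, u ^ (1 << b)] = 1.0         # neighbours of u differ in exactly one bit
P = (A + n * np.eye(N)) / (2 * n)        # add n self-loops, divide by 2n

got = np.sort(np.linalg.eigvalsh(P))
want = np.sort([1 - k / n for k in range(n + 1) for _ in range(comb(n, k))])
print(np.allclose(got, want))            # True: eigenvalues are 1 - k/n,
                                         # each with multiplicity C(n, k)
```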