The Cover Time of Random Walks Uriel Feige Weizmann Institute

Random Walks Simple undirected graph. At each step, move to a neighbor of the current vertex chosen uniformly at random.
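As a concrete illustration (not part of the original slides), here is a minimal Python sketch of such a walk, with the graph given as an adjacency dictionary; the function name and the 4-cycle example are illustrative choices.

import random

def random_walk(adj, start, steps, rng=random):
    """Simple random walk: at each step, move to a uniformly random neighbor."""
    trajectory = [start]
    v = start
    for _ in range(steps):
        v = rng.choice(adj[v])   # uniform choice among the neighbors of v
        trajectory.append(v)
    return trajectory

# Example: ten steps on the 4-cycle, started at vertex 0.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(random_walk(c4, 0, 10))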

Hitting time and its variants Random variables associated with a random walk. Here we shall only deal with their expectations. Hitting time H(s,t): expected number of steps to reach t starting at s. Commute time (symmetric): C(s,t) = C(t,s) = H(s,t) + H(t,s). Difference time (anti-symmetric): D(s,t) = -D(t,s) = H(s,t) - H(t,s).

Cover time Cov(s,G): the expected number of steps it takes a walk that starts at s to visit all vertices. Cov(G): maximum over s of Cov(s,G). Cov+(G): cover and return to start. What characterizes the cover time of a graph? How large might it be? How small? Special families of graphs. Deterministic algorithms for estimating the cover time for general graphs.
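These quantities are easy to estimate by simulation. Below is a rough Monte Carlo sketch (illustrative, not from the talk): Cov(s,G) is estimated by averaging independent walks from s, and Cov(G) by taking the maximum of these estimates over all starting vertices.

import random

def cover_steps(adj, start, rng=random):
    """One walk from `start`; returns the number of steps until all vertices are visited."""
    unvisited = set(adj) - {start}
    v, steps = start, 0
    while unvisited:
        v = rng.choice(adj[v])
        unvisited.discard(v)
        steps += 1
    return steps

def estimate_cov(adj, start, trials=2000):
    """Monte Carlo estimate of Cov(start, G)."""
    return sum(cover_steps(adj, start) for _ in range(trials)) / trials

# Cov(G) takes the worst starting vertex; for the 5-path that is the middle vertex (about 20).
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(max(estimate_cov(path, s) for s in path))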

Computing the hitting time System of n linear equations: H(t,t) = 0 and H(v,t) = 1 + (1/deg(v)) Σ_{u ∈ N(v)} H(u,t). Compute all hitting times to t by one matrix inversion. (A related approach computes hitting times for all pairs [Tetali 1999].) Applies to arbitrary Markov chains. Corollary: hitting times are rational and computable in polynomial time.
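A direct numerical version of this system, sketched in Python with numpy (the function name and the adjacency-dictionary input are illustrative assumptions):

import numpy as np

def hitting_times(adj, t):
    """Solve H(t,t) = 0 and H(v,t) = 1 + average of H(u,t) over the neighbors u of v."""
    nodes = sorted(adj)
    idx = {v: i for i, v in enumerate(nodes)}
    A, b = np.eye(len(nodes)), np.zeros(len(nodes))
    for v in nodes:
        if v == t:
            continue                          # the row for t just says H(t,t) = 0
        i = idx[v]
        for u in adj[v]:
            A[i, idx[u]] -= 1.0 / len(adj[v])
        b[i] = 1.0
    h = np.linalg.solve(A, b)
    return {v: h[idx[v]] for v in nodes}

# On the path 0-1-2-3-4, the hitting time from one end to the other is (n-1)^2.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(hitting_times(path, 4)[0])              # 16.0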

Reducing cover time to hitting time Markov chain M on states (v,S): v is the current vertex, S the set of vertices already visited. A step in G from u to v corresponds to a step in M from (u,S) to (v, S∪{v}). Cov+(s,G) = H((s,{s}), (s,V)). Corollary: cover time is rational and computable in exponential time.
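This reduction can be carried out literally for small graphs. The sketch below (all names illustrative) sets up one linear system over the states (v,S) and reads off Cov+(s,G) as the hitting time of the state (s,V); the state space has size n·2^(n-1), so it is only feasible for tiny n.

from itertools import combinations
import numpy as np

def cover_and_return_time(adj, s):
    """Exact Cov+(s,G) = H((s,{s}), (s,V)) on the chain whose states are pairs (v, S)."""
    nodes = sorted(adj)
    V = frozenset(nodes)
    states = [(v, frozenset(S))
              for r in range(1, len(nodes) + 1)
              for S in combinations(nodes, r)
              for v in S]
    idx = {st: i for i, st in enumerate(states)}
    A, b = np.eye(len(states)), np.ones(len(states))
    b[idx[(s, V)]] = 0.0                      # (s, V) is absorbing: expected time 0
    for (v, S) in states:
        if (v, S) == (s, V):
            continue
        i = idx[(v, S)]
        for u in adj[v]:                      # stepping to u grows the visited set to S ∪ {u}
            A[i, idx[(u, S | {u})]] -= 1.0 / len(adj[v])
    return np.linalg.solve(A, b)[idx[(s, frozenset([s]))]]

# Triangle: one step out, expected 2 more steps to the last vertex, expected 2 steps back.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(cover_and_return_time(triangle, 0))     # 5.0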

A detour - electrical networks Many analogies between random walks on graphs and electrical networks. Can help (depending on a person's background) in transferring intuition and theorems from one area to the other.

Effective Resistance Every edge is a resistor of 1 ohm. Apply a voltage difference of 1 volt between u and v. R(u,v) is the inverse of the electrical current flowing from u to v.

Understanding the commute time Theorem [Chandra, Raghavan, Ruzzo, Smolensky, Tiwari 1989]: For every graph with m edges and every two vertices u and v, C(u,v) = 2m·R(u,v). Proof: by comparing the respective systems of linear equations, for random walks and for electrical current flows.
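A numerical check of this identity (illustrative code, not from the talk): the effective resistance is computed as (e_u - e_v)^T L^+ (e_u - e_v), with L^+ the pseudoinverse of the graph Laplacian, and compared with the commute time of the path example above.

import numpy as np

def effective_resistance(adj):
    """Return a function R(u, v) computed from the pseudoinverse of the Laplacian."""
    nodes = sorted(adj)
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    L = np.zeros((n, n))
    for v in nodes:
        L[idx[v], idx[v]] = len(adj[v])
        for u in adj[v]:
            L[idx[v], idx[u]] = -1.0
    Lp = np.linalg.pinv(L)
    def R(u, v):
        e = np.zeros(n)
        e[idx[u]], e[idx[v]] = 1.0, -1.0
        return float(e @ Lp @ e)
    return R

# Path 0-1-2-3-4: m = 4 edges, R(0,4) = 4, so C(0,4) = 2*4*4 = 32 = H(0,4) + H(4,0) = 16 + 16.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
R = effective_resistance(path)
print(2 * 4 * R(0, 4))                        # 32.0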

Easy useful principles Removing an edge increases its resistance to infinity. Adding/removing an edge anywhere in the graph can only reduce/increase the effective resistance. Contracting an edge reduces its resistance to 0. Contracting an edge anywhere in the graph can only reduce the effective resistance.

Series-parallel graphs In series: R = R1 + R2. In parallel: 1/R = 1/R1 + 1/R2.
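A tiny helper (purely illustrative) makes these reduction rules concrete; for instance, the resistance across one edge of a 4-cycle is that edge in parallel with the 3-edge path around it.

def series(*rs):
    """Resistors in series add."""
    return sum(rs)

def parallel(*rs):
    """Resistors in parallel add in reciprocal."""
    return 1.0 / sum(1.0 / r for r in rs)

# Across one edge of the 4-cycle: 1 ohm in parallel with a 3-ohm path.
print(parallel(1.0, series(1.0, 1.0, 1.0)))   # 0.75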

Foster's network theorem For every connected graph on n vertices, the sum of effective resistances taken over all neighboring pairs of vertices (the edges) is n-1.
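A quick numerical check of Foster's theorem (a self-contained sketch; the names and the 6-cycle example are illustrative):

import numpy as np

def foster_sum(adj):
    """Sum of effective resistances over the edges, via the Laplacian pseudoinverse."""
    nodes = sorted(adj)
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    L = np.zeros((n, n))
    for v in nodes:
        L[idx[v], idx[v]] = len(adj[v])
        for u in adj[v]:
            L[idx[v], idx[u]] = -1.0
    Lp = np.linalg.pinv(L)
    total = 0.0
    for u, v in {(min(u, v), max(u, v)) for u in adj for v in adj[u]}:
        e = np.zeros(n)
        e[idx[u]], e[idx[v]] = 1.0, -1.0
        total += float(e @ Lp @ e)
    return total

cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}     # 6-cycle, n = 6
print(foster_sum(cycle))                                      # 5.0 = n - 1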

Relating cover time to commute time [Aleliunas, Karp, Lipton, Lovasz, Rackoff 1979] The cover time is upper bounded by the sum of commute times along the edges of any spanning tree.

Spanning tree argument Arbitrary spanning tree [AKLLR, CRRST]: Cov(G) ≤ 2m(n-1). Best spanning tree [Feige 1995]: Cov(G) ≤ (4/27)n³ + o(n³). Lollipop graph (a clique on 2n/3 vertices with a path of n/3 vertices attached) shows this is essentially tight.

Coupon collector The spanning tree upper bound gives Cov(clique) = O(n²). Too pessimistic. Covering a clique is almost like throwing balls into bins at random until every bin has a ball; hence Cov(clique) ≈ n ln n. Observe that H(u,v) = n-1, so covering requires a ln n overhead over the maximum hitting time.
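A small self-contained simulation (illustrative) comparing the empirical cover time of K_n with the coupon-collector value (n-1)·H_{n-1}:

import random

def cover_steps(adj, start, rng=random):
    """Steps of one random walk from `start` until every vertex has been visited."""
    unvisited = set(adj) - {start}
    v, steps = start, 0
    while unvisited:
        v = rng.choice(adj[v])
        unvisited.discard(v)
        steps += 1
    return steps

n = 50
clique = {v: [u for u in range(n) if u != v] for v in range(n)}
estimate = sum(cover_steps(clique, 0) for _ in range(2000)) / 2000
exact = (n - 1) * sum(1.0 / k for k in range(1, n))           # (n-1) * H_{n-1}
print(estimate, exact)                                        # both about 220 for n = 50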

Relating cover time to hitting time [Matthews 1988]: Cov(G) ≤ H_n · max over u,v of H(u,v), where H_n = 1 + 1/2 + … + 1/n is the nth harmonic number.
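A worked instance (illustrative code): on the complete graph K_n every hitting time equals n-1, so Matthews' bound gives (n-1)·H_n, which has the same n ln n order as the coupon-collector answer above.

import math

def matthews_upper_bound(max_hitting_time, n):
    """H_n times the maximum hitting time."""
    harmonic = sum(1.0 / k for k in range(1, n + 1))
    return max_hitting_time * harmonic

n = 1000
print(matthews_upper_bound(n - 1, n))         # about 7.5e3
print(n * math.log(n))                        # about 6.9e3: same n log n order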

Proof of Matthews bound Arbitrarily order all vertices but s. Let Pr[i] denote the probability that i is the last vertex to be visited among {1, …, i}. The extra time spent covering i after {1, …, i-1} are already covered is nonzero only in this event, and is then at most max H(u,v); summing over i bounds the cover time by Σ Pr[i]·max H(u,v). For a random permutation, Pr[i] = 1/i, which gives the harmonic factor.

Lower bound on cover time [Feige 1995]: Cov(G) ≥ (1 - o(1)) n ln n for every connected graph on n vertices. Proof: either there is a pair of vertices that witnesses the lower bound through their mutual hitting times, or a generalization of the Matthews bound (applying it to subsets of vertices) works.

Some special classes of graphs Order of magnitude of cover time: Path: n²; Expanders: n log n; 2-dim grids: n log² n; 3-dim grids: n log n; Full d-ary tree: n log² n / log d. In many cases, much more is known.

Regularity and cover time [Kahn, Linial, Nisan, Saks 1989]: the cover time of a regular graph is at most 4n². [Coppersmith, Feige, Shearer 1996]: in a d-regular graph, every spanning tree has resistance at most 3n/d. [Feige 1997]: cover time at most 2n². Worst example known (necklace): 15n²/16.

Irregular graphs [Coppersmith, Feige, Shearer 1996]: every graph has a spanning tree of resistance at most O(n · avg(1/deg)). Proof: random spanning tree. Uses the fact that the fraction of spanning trees that use edge (u,v) is exactly R(u,v). Upper bound on Cov+(G) based on the irregularity avg(deg) · avg(1/deg) of G.
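The edge-probability fact can be checked with the matrix-tree theorem (a self-contained illustrative sketch): the fraction of spanning trees containing an edge equals its effective resistance. On the 4-cycle, 3 of the 4 spanning trees contain any given edge, matching the resistance 3/4 computed earlier.

import numpy as np

def num_spanning_trees(adj):
    """Matrix-tree theorem: determinant of the Laplacian with one row and column removed."""
    nodes = sorted(adj)
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    L = np.zeros((n, n))
    for v in nodes:
        L[idx[v], idx[v]] = len(adj[v])
        for u in adj[v]:
            L[idx[v], idx[u]] -= 1.0
    return float(np.linalg.det(L[1:, 1:]))

def edge_usage_fraction(adj, u, v):
    """Fraction of spanning trees that contain the edge (u, v)."""
    without = {w: [x for x in adj[w] if {w, x} != {u, v}] for w in adj}
    return 1.0 - num_spanning_trees(without) / num_spanning_trees(adj)

c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(edge_usage_fraction(c4, 0, 1))          # 0.75 = effective resistance across the edge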

Spanning tree - without return [Feige 1997] (proof essentially by induction): in every graph there is a vertex s with… The path is the most difficult tree to cover (starting at the middle).

Approximating Cov(G) max C(u,v) approximates Cov(G) within a factor of O(log n). Augmented Matthews lower bound (AMLB) [Kahn, Kim, Lovasz, Vu 2000]: the AMLB approximates Cov(G) within a factor of O((log log n)²), and can itself be efficiently approximated within a factor of 2.

Approximating Cov(s,G) Cov(s,G) might be much larger than max over v of H(s,v) (example: the "key" graph). [Chlamtac, Feige, Rabinovich 2003, 2005]: Cov(s,G) can be approximated within a ratio of O(log n · approx[Cov(G)]).

Tools used in proof Cycle identity for reversible Markov chains: H(u,v)+H(v,w)+H(w,u) = H(u,w)+H(w,v)+H(v,u). Transitivity of difference time: D(u,v) > 0 and D(v,w) > 0 imply D(u,w) > 0. This induces an order …, w, …, v, …, u, … on the vertices. Partition the order into homogeneous blocks. Upper bound Cov(s,G) by covering block after block.

Full d-ary trees Cover time known in great detail [Aldous]. The technique: compute the return time to the root r (easy); compute the expected number of returns to the root during covering (recursive formula); multiply the two to get Cov+(r,T).

Techniques for approximating the cover time Systems of linear equations (hitting times). Using identities involving the cover time (Aldous). Effective resistance (commute times, Foster's theorem, etc.). Spanning tree arguments and extensions. Matthews bounds and extensions. Graph partitioning (order induced by the difference time).

Open questions Deterministic approximation of Cov(G) and of Cov(s,G). (Conjecture: PTAS on trees soon.) Extremal problems. Which (regular) graphs have the largest/smallest cover times? (Conjectures exist.)

Additional topics Some results (e.g., the correspondence with effective resistance) extend to reversible Markov chains. Some results (e.g., Matthews bounds) extend to arbitrary Markov chains. This talk referred only to the expected cover time. More is known (and open) about the full distribution of the cover time.