Cover times, blanket times, and the GFF. Jian Ding (Berkeley-Stanford-Chicago), James R. Lee (University of Washington), Yuval Peres (Microsoft Research).


cover times, blanket times, and the GFF

Jian Ding (Berkeley-Stanford-Chicago), James R. Lee (University of Washington), Yuval Peres (Microsoft Research)

random walks on graphs

By putting conductances {c_uv} on the edges of the graph, we can get any reversible Markov chain.
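As a concrete sketch (the helper below is illustrative, not from the talk): from a vertex u, the walk crosses edge {u,v} with probability c_uv / c_u, where c_u is the total conductance at u; the resulting chain is reversible with stationary measure proportional to c_u.

```python
def transition_matrix(conductances):
    """Transition probabilities of the reversible chain defined by edge
    conductances: from u, cross edge {u, v} with probability c_uv / c_u,
    where c_u is the total conductance at u.
    `conductances` maps frozenset({u, v}) -> c_uv > 0."""
    nodes = sorted({x for e in conductances for x in e})
    P = {u: {} for u in nodes}
    for u in nodes:
        c_u = sum(c for e, c in conductances.items() if u in e)
        for e, c in conductances.items():
            if u in e:
                (v,) = e - {u}          # the other endpoint of the edge
                P[u][v] = c / c_u
    return P
```

Detailed balance π(u) P(u,v) = π(v) P(v,u), with π proportional to total conductance, can then be checked directly on a small example.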

hitting and covering

Hitting time: H(u,v) = expected number of steps to hit v starting at u.
Commute time: κ(u,v) = H(u,v) + H(v,u) (a metric).
Cover time: t_cov(G) = expected time to hit all vertices of G, starting from the worst vertex.
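t_cov can be estimated by straightforward simulation; a minimal Monte Carlo sketch (the function and the example below are mine, assuming an adjacency-list graph):

```python
import random

def cover_time_estimate(adj, start, trials=2000, seed=0):
    """Monte Carlo estimate of the expected time for simple random walk
    started at `start` to visit every vertex (`adj`: vertex -> neighbor list)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos, seen, steps = start, {start}, 0
        while len(seen) < len(adj):
            pos = rng.choice(adj[pos])   # one step of simple random walk
            seen.add(pos)
            steps += 1
        total += steps
    return total / trials
```

On the complete graph K4 the exact value from any start vertex is 1 + 3/2 + 3 = 5.5 (a coupon-collector computation), which the estimate reproduces.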

hitting and covering

Orders of magnitude of some cover times:

  path                  n²
  complete graph        n log n             [coupon collecting]
  expander              n log n             [Broder-Karlin 88]
  2-dimensional grid    n (log n)²          [Aldous 89, Zuckerman 90]
  3-dimensional grid    n log n             [Zuckerman 90]
  complete d-ary tree   n (log n)² / log d

hitting and covering

Asymptotically optimal bounds:

  regular trees              [Aldous 91]
  random graphs              [Cooper-Frieze 08]
  discrete torus, lattices   [Dembo-Peres-Rosen-Zeitouni 04]

hitting and covering

General bounds (n = # vertices, m = # edges):

  (1 − o(1)) n log n ≤ t_cov(G) ≤ 2nm

[Aleliunas-Karp-Lipton-Lovász-Rackoff'79] [Feige'95, Matthews'88]

electrical resistance

R_eff(u,v) = inverse of the electrical current flowing from u to v when a unit voltage is applied between them.

[Chandra-Raghavan-Ruzzo-Smolensky-Tiwari'89]: If G has m edges, then for every pair u, v:

  κ(u,v) = 2m · R_eff(u,v)

(endows κ with certain geometric properties)

computation

Hitting time: easy to compute in deterministic poly time by solving a system of linear equations.

Cover time: easy to compute in exponential time by a deterministic algorithm, and in polynomial time by a randomized algorithm.

Natural question [Aldous-Fill'94]: Does there exist a poly-time deterministically computable O(1)-approximation for general graphs?
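The hitting-time claim can be made concrete: h(x) = E[steps from x to v] satisfies h(v) = 0 and h(x) = 1 + (1/deg x) Σ_{y∼x} h(y), a linear system. A pure-Python sketch (the solver and the path example are mine), which also numerically confirms the commute-time identity κ(u,v) = 2m·R_eff(u,v) from the previous slide:

```python
def solve(A, b):
    """Solve Ax = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def hitting_time(adj, u, v):
    """H(u, v) for SRW: h(x) = 1 + mean of h over neighbors of x, h(v) = 0."""
    nodes = [x for x in adj if x != v]
    idx = {x: i for i, x in enumerate(nodes)}
    A = [[0.0] * len(nodes) for _ in nodes]
    b = [1.0] * len(nodes)
    for x in nodes:
        A[idx[x]][idx[x]] += 1.0
        for y in adj[x]:
            if y != v:
                A[idx[x]][idx[y]] -= 1.0 / len(adj[x])
    return solve(A, b)[idx[u]]
```

On the 3-vertex path, H(0,2) = H(2,0) = 4, so κ(0,2) = 8 = 2 · m · R_eff(0,2) with m = 2 and R_eff(0,2) = 2 (resistances in series).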

approximation in deterministic poly-time

Clearly max_{u,v} H(u,v) ≤ t_cov(G).

[Matthews'88] proved:

  t_cov(G) ≤ ( max_{u,v} H(u,v) ) · H_{n−1}   and   t_cov(G) ≥ max_{S⊆V} ( min_{u≠v∈S} H(u,v) ) · H_{|S|−1},

where H_k = 1 + 1/2 + … + 1/k.

[Kahn-Kim-Lovasz-Vu'99] show that Matthews' lower bound gives an O((log log n)²) approximation to t_cov.

[Feige-Zeitouni'09] give a (1 + ε)-approximation for trees, for every ε > 0, using recursion.

blanket times

Blanket times [Winkler-Zuckerman'96]: the β-blanket time t_blanket(G, β) is the expected first time T at which all the local times are within a factor of β of their stationary values: N_v(T) ≥ β · π(v) · T for every vertex v, where N_v(T) is the number of visits to v before time T and π is the stationary distribution.
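A sketch of the definition as code (naming and time-0 conventions are mine; this samples the first time the condition holds, whose mean is the blanket time):

```python
import random

def blanket_time_sample(adj, start, beta, rng):
    """One sample of the beta-blanket time for SRW: the first T at which every
    vertex v has been visited at least beta * pi(v) * T times, where
    pi(v) = deg(v) / 2m (a sketch of the Winkler-Zuckerman definition;
    the counting convention at time 0 is a choice made here)."""
    two_m = sum(len(adj[v]) for v in adj)
    pi = {v: len(adj[v]) / two_m for v in adj}
    visits = {v: 0 for v in adj}
    visits[start] = 1
    pos, T = start, 0
    while True:
        T += 1
        pos = rng.choice(adj[pos])
        visits[pos] += 1
        if all(visits[v] >= beta * pi[v] * T for v in adj):
            return T
```

Since the visit frequencies converge to π, the loop terminates almost surely, and quickly on small graphs.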


blanket time conjecture

Conjecture [Winkler-Zuckerman'96]: For every graph G and every 0 < β < 1, t_blanket(G, β) ≍ t_cov(G).

Proved for some special cases. True up to an O((log log n)²) factor by [Kahn-Kim-Lovasz-Vu'99].

Gaussian free field on a graph

A centered Gaussian process {g_v}_{v∈V} satisfying

  E[(g_u − g_v)²] = R_eff(u,v)  for all u, v ∈ V,

and g_{v0} = 0 for some fixed v0 ∈ V.

Gaussian free field on a graph

A centered Gaussian process {g_v}_{v∈V} with E[(g_u − g_v)²] = R_eff(u,v) for all u, v ∈ V, and g_{v0} = 0 for some fixed v0 ∈ V.

Density proportional to exp( −(1/2) Σ_{{u,v}∈E} (g_u − g_v)² ).

Gaussian free field on a graph

A centered Gaussian process {g_v}_{v∈V} with E[(g_u − g_v)²] = R_eff(u,v) for all u, v ∈ V, and g_{v0} = 0 for some fixed v0 ∈ V.

Covariance given by the Green function of the SRW killed at v0: Cov(g_u, g_v) = the expected number of visits to v by a random walk started at u before hitting v0 (normalized by the degree of v).
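As a concrete check (the 3-vertex example is mine, not from the talk): on the path 0-1-2 pinned at v0 = 0, the covariance of (g_1, g_2) is [[1, 1], [1, 2]], whose Cholesky factor is [[1, 0], [1, 1]]; sampling from two i.i.d. standard Gaussians then reproduces E(g_u − g_v)² = R_eff(u,v):

```python
import random

def sample_gff_path(rng):
    """Sample the GFF on the path 0 - 1 - 2 pinned at v0 = 0, via the Cholesky
    factor [[1, 0], [1, 1]] of the covariance [[1, 1], [1, 2]]
    (a hardcoded illustration, not a general sampler)."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return {0: 0.0, 1: z1, 2: z1 + z2}   # g_{v0} = 0
```

Empirically, E g_2² ≈ 2 = R_eff(0,2) and E(g_1 − g_2)² ≈ 1 = R_eff(1,2).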

main result

For every graph G = (V, E) and every 0 < β < 1,

  t_cov(G) ≍ t_blanket(G, β) ≍ m · ( E max_{v∈V} g_v )²,

where {g_v}_{v∈V} is the GFF on G. Positively resolves the Winkler-Zuckerman blanket time conjectures.

geometric interpretation

Equivalently, t_cov(G) ≍ m · (E D)², where D is the diameter of the projection of this embedding (the vertices placed in Euclidean space so that squared distances are effective resistances) onto a random line.

Gaussian processes

Consider a Gaussian process {X_u : u ∈ S} with E(X_u) = 0 for all u ∈ S. Such a process comes with a natural metric d(u,v) = ( E|X_u − X_v|² )^{1/2}, transforming (S, d) into a metric space.

PROBLEM: What is E max {X_u : u ∈ S}?

majorizing measures

Majorizing measures theorem [Fernique-Talagrand]: E max_{u∈S} X_u ≍ γ₂(S, d), where γ₂ is a functional on metric spaces, given by an explicit formula which takes exponential time to calculate from the definition.

main theorem, restated

THEOREM: For every graph G = (V, E),

  t_cov(G) ≍ γ₂(V, √κ)²,

where κ = commute time. We also construct a deterministic poly-time algorithm to approximate γ₂(V, √κ) within a constant factor.

COROLLARY: There is a deterministic poly-time O(1)-approximation for t_cov. Positively answers the question of Aldous and Fill.

majorizing measures

Majorizing measures theorem [Fernique-Talagrand]:

  E max_{u∈S} X_u ≍ inf_μ sup_{u∈S} ∫₀^∞ √( log 1/μ(B(u,ε)) ) dε,

where the infimum runs over probability measures μ on S and B(u,ε) is the ε-ball around u.

Gaussian processes

PROBLEM: What is E max {X_u : u ∈ S}?

If the random variables were "independent," we would expect the union bound to be tight. Gaussian concentration: expect the max of k independent standard Gaussians to be about √(2 log k).
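A quick numerical illustration (mine, not from the talk): the mean maximum of k i.i.d. standard Gaussians grows like √(2 log k).

```python
import math
import random

def mean_max_gaussians(k, trials, seed=0):
    """Average, over `trials` runs, of the max of k i.i.d. N(0,1) samples."""
    rng = random.Random(seed)
    return sum(max(rng.gauss(0, 1) for _ in range(k))
               for _ in range(trials)) / trials
```

For k = 10000, √(2 log k) ≈ 4.29, and the empirical mean maximum is slightly below it (the lower-order correction is negative).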

Gaussian processes

Gaussian concentration: P( |X_u − X_v| > λ · d(u,v) ) ≤ 2 e^{−λ²/2}.

Sudakov minoration: E max_{u∈S} X_u ≥ c · ε √( log N(S,d,ε) ) for every ε > 0.

chaining

The best possible tree upper bound yields [Dudley'67]:

  E max_{u∈S} X_u ≤ C ∫₀^∞ √( log N(S,d,ε) ) dε,

where N(S,d,ε) is the minimal number of ε-balls needed to cover S.
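For a finite metric space, the covering numbers in Dudley's bound can be upper-bounded greedily (a sketch, mine: the greedy pass returns a maximal ε-separated set, which is always an ε-cover, hence an upper bound on N(S,d,ε)):

```python
def covering_number_upper(points, dist, eps):
    """Greedy upper bound on N(S, d, eps): scan the points, opening a new
    eps-ball center whenever the current point is > eps from all centers."""
    centers = []
    for p in points:
        if all(dist(p, c) > eps for c in centers):
            centers.append(p)
    return len(centers)
```

For example, for five equally spaced points on a line, the bound correctly shrinks as ε grows.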

hints of a connection

Gaussian concentration: P( |X_u − X_v| > λ · d(u,v) ) ≤ 2 e^{−λ²/2}
Sudakov minoration: E max_{u∈S} X_u ≥ c · ε √( log N(S,d,ε) )

hints of a connection

Sudakov minoration: E max_{u∈S} X_u ≥ c · ε √( log N(S,d,ε) )
Matthews' bound (1988): t_cov(G) ≥ max_{S⊆V} ( min_{u≠v∈S} H(u,v) ) · log |S|

hints of a connection

Gaussian concentration: P( |X_u − X_v| > λ · d(u,v) ) ≤ 2 e^{−λ²/2}
KKLV'99 concentration: an analogous concentration estimate for cover times, underlying their (log log n)² approximation

hints of a connection

Dudley'67 entropy bound: E max_{u∈S} X_u ≤ C ∫₀^∞ √( log N(S,d,ε) ) dε
[Barlow-Ding-Nachmias-Peres'09]: an analogue of Dudley's bound for cover times

hints of a connection

Dynkin isomorphism theory

Ray-Knight (1960s): characterized the local times of Brownian motion.

Dynkin (1980): a general connection between Markov processes and Gaussian fields.

The version we use is due to Eisenbaum, Kaspi, Marcus, Rosen, and Shi (2000).

Generalized Ray-Knight theorem
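One standard formulation of the identity (following Eisenbaum-Kaspi-Marcus-Rosen-Shi; normalizations of local time vary by author): run the walk from v0 until its local time there reaches t (the inverse local time τ(t)); then, with {g_v} an independent GFF pinned at v0,

```latex
\Bigl\{ L^{\tau(t)}_v + \tfrac{1}{2}\, g_v^2 \;:\; v \in V \Bigr\}
\;\overset{\mathrm{law}}{=}\;
\Bigl\{ \tfrac{1}{2}\,\bigl(g_v + \sqrt{2t}\,\bigr)^2 \;:\; v \in V \Bigr\}
```

This is the bridge between local times (hence cover and blanket times) and the GFF.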

blanket time vs. GFF

the lower bound

only one Everest, but...

a problem on Gaussian processes Gaussian free field:

a problem on Gaussian processes

Gaussian free field: we need strong estimates on the size of this window (we want to get a point there with probability at least 0.1).

Problem: majorizing measures handle first moments, but we need second-moment bounds.

percolation on trees and the GFF

First and second moments agree for percolation on balanced trees.

Problem: general Gaussian processes behave nothing like percolation!

Resolution: the processes coming from the Isomorphism Theorem all arise from a GFF.

percolation on trees and the GFF

First and second moments agree for percolation on balanced trees. For DGFFs, using electrical network theory, we show that it is possible to select a subtree of the majorizing measures tree and a delicate filtration of the probability space so that the Gaussian process can be coupled to a percolation process.

example: precise asymptotics in 2D

Bolthausen, Deuschel, and Giacomin (2001): for the GFF on the n × n box, E max_v g_v ∼ 2√(2/π) · log n.

Dembo, Peres, Rosen, and Zeitouni (2004): the cover time of the n × n torus satisfies t_cov ∼ (4/π) n² (log n)².

open questions

The lower bound holds for:
- complete graph
- complete d-ary tree
- discrete 2D torus
- bounded-degree graphs [Ding'11]
The upper bound is true in general.

QUESTION: Is there a deterministic, polynomial-time (1 + ε)-approximation to the cover time, for every ε > 0?

QUESTION: Is the standard deviation of the time-to-cover bounded by the maximum hitting time?