1
The Poincaré Constant of a Random Walk in High-Dimensional Convex Bodies
Ivona Bezáková
Thesis Advisor: Prof. Eric Vigoda
2
Goal: An efficient algorithm for sampling points from high-dimensional convex bodies.
Approach: Random walks.
3
Overview of the talk
- Motivation & history
- Introduction to random walks and Markov chains
- Definition and basic properties of ball walks
- Overview of the algorithm for uniform sampling
- Analysis of the Poincaré constant
4
Motivation
Sampling points from convex bodies efficiently has many applications:
- computation of volume
- sampling contingency tables [Morris]
- universal portfolios [Kalai & Vempala]
- convex programs [Bertsimas & Vempala]
Why is computing volume difficult? Straightforward approach:
- find a bounding box
- sample sufficiently many points from the box
- compute the ratio: # points in the body vs. total # points
5
This algorithm correctly approximates the volume, so where is the catch? The ratio of the volume of the body to the volume of the bounding box decreases exponentially with the dimension, which results in exponential running time. Note: for small dimension (2D, 3D) this algorithm works well. Goal: find an algorithm running in time polynomial in the dimension. A code sketch of the straightforward approach follows.
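The following is a minimal sketch of the straightforward bounding-box approach described above, assuming the body is given by a membership oracle `in_body` and an axis-aligned bounding box `bounds` (both names are illustrative, not from the thesis):

```python
import random

def naive_volume(in_body, bounds, n_samples=100_000):
    """Estimate vol(K) by sampling uniformly from an axis-aligned bounding box.

    in_body: membership oracle, returns True iff the point lies in the convex body K.
    bounds:  list of (lo, hi) intervals, one per dimension, enclosing K.
    """
    box_volume = 1.0
    for lo, hi in bounds:
        box_volume *= hi - lo

    hits = 0
    for _ in range(n_samples):
        point = [random.uniform(lo, hi) for lo, hi in bounds]
        if in_body(point):
            hits += 1

    # The hit ratio is vol(K)/vol(box), which can shrink exponentially with the
    # dimension; for a fixed n_samples the estimate then becomes useless.
    return box_volume * hits / n_samples
```

In two or three dimensions the hit ratio is typically large enough for this to work well; in high dimension almost every sample misses the body, which is exactly the catch described above.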
6
How does efficient sampling help with volume?
- Intersect the body with balls whose volumes double; this defines a sequence of convex bodies.
- Sample points from the i-th body and compute the ratio of the i-th vs. the (i-1)-st body.
- Return the product of the ratios.
Why is this better than the bounding box? The volume at most doubles from one body to the next, so each ratio is bounded and can be estimated from a modest number of samples. The result is the volume of the original body relative to the volume of the smallest body (a ball), whose volume is known. A sketch of this multi-phase estimator follows.
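A minimal sketch of the multi-phase estimator, assuming membership oracles for the intermediate bodies and a (nearly) uniform sampler for each of them; the sampler is exactly the hard part that the rest of the talk addresses, and all names here are illustrative:

```python
def multiphase_volume(bodies, ball_volume, sample_from, n_samples=10_000):
    """Multi-phase estimator: vol(K_m) = vol(K_0) * prod_i vol(K_i) / vol(K_{i-1}).

    bodies:      membership oracles for K_0, K_1, ..., K_m, where K_0 is a ball
                 and K_m is the target body; consecutive volumes differ by at most 2x.
    ball_volume: vol(K_0), known in closed form.
    sample_from: sample_from(i) returns a (nearly) uniform random point of K_i.
    """
    estimate = ball_volume
    for i in range(1, len(bodies)):
        in_smaller = bodies[i - 1]  # membership oracle for K_{i-1}
        hits = sum(in_smaller(sample_from(i)) for _ in range(n_samples))
        # hits / n_samples estimates vol(K_{i-1}) / vol(K_i), which is at least 1/2,
        # so polynomially many samples per phase give good relative accuracy.
        estimate /= max(hits / n_samples, 1e-12)
    return estimate
```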
7
History
Negative results for volume computation:
- the volume cannot be approximated by a deterministic polynomial-time algorithm
- exact computation is #P-complete
  [Elekes, Bárány & Füredi, Dyer & Frieze, Khachiyan, Lawrence, Lovász & Simonovits]
Randomized approximation:
- Dyer, Frieze, Kannan '89
- improvements: combinations of [Applegate, Dyer, Frieze, Kannan, Lovász, Simonovits]
- Kannan, Lovász, Simonovits '97
8
Notation
- K: a convex body of diameter D (given by a membership oracle)
- δ: the step size
- B(x,δ): the ball of radius δ around x
- B: the unit ball
9
Ball Walks
Speedy walk: the next point is a uniformly random point of B(x,δ) ∩ K.
Metropolis walk: pick a uniformly random point y of B(x,δ); if y ∈ K then move to y, otherwise stay at x.
Problem: how do we implement the speedy walk efficiently? A sketch of both walks follows.
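A minimal sketch of both walks, assuming a membership oracle `in_body` for K; the helper for sampling a uniform point of a ball is standard and not taken from the thesis:

```python
import math
import random

def random_ball_point(center, delta):
    """Uniform random point in the ball of radius delta around center."""
    n = len(center)
    direction = [random.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(d * d for d in direction))
    radius = delta * random.random() ** (1.0 / n)  # radius ~ r^(1/n) gives a uniform point
    return [c + radius * d / norm for c, d in zip(center, direction)]

def metropolis_step(x, delta, in_body):
    """Metropolis ball walk: propose a uniform point of B(x, delta); if it falls
    outside the body, stay put.  The stationary distribution is uniform over the body."""
    y = random_ball_point(x, delta)
    return y if in_body(y) else x

def speedy_step(x, delta, in_body):
    """Speedy walk: the next point is uniform over B(x, delta) ∩ K.  This naive
    rejection implementation needs about 1/l(x) oracle calls per step (l is the
    local conductance), which is exactly the implementation problem on the slide."""
    while True:
        y = random_ball_point(x, delta)
        if in_body(y):
            return y
```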
10
Markov Chains
A Markov chain has a state space and a transition distribution P(x, ·), the likelihood of going from x to y.
For the speedy walk the state space is the convex body K itself, and the transition density from x is uniform over B(x,δ) ∩ K and zero elsewhere; see the display below.
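A hedged reconstruction of the speedy-walk transition density from the description above (the thesis may write it slightly differently):

```latex
P(x, dy) \;=\;
\begin{cases}
\dfrac{dy}{\operatorname{vol}\bigl(B(x,\delta)\cap K\bigr)} & \text{if } y \in B(x,\delta)\cap K,\\[1.5ex]
0 & \text{otherwise.}
\end{cases}
```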
11
Markov Chains 2
Stationary distribution = limiting distribution (a fixed point of the chain).
Mixing time: for a given ε, the mixing time is the number of steps needed to get within ε of the stationary distribution; see the display below.
Want: rapid mixing, i.e., a number of steps polynomial in the dimension and in log(1/ε).
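The standard definitions behind these notions, as a hedged reconstruction (the exact distance and notation used in the thesis may differ):

```latex
\pi \text{ is stationary:}\quad \pi(A) \;=\; \int_K P(x,A)\,d\pi(x)\ \ \text{for every measurable } A \subseteq K,
\qquad
\tau(\varepsilon) \;=\; \min\bigl\{\, t : \ \| \sigma P^{t} - \pi \|_{\mathrm{TV}} \le \varepsilon \,\bigr\}.
```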
12
Comparison: KLS vs. this work
Kannan, Lovász, and Simonovits study the so-called conductance to bound the mixing time.
We bound the so-called Poincaré constant (a generalization of conductance, equal to the spectral gap for finite chains) and obtain a mixing time cubic in the relevant parameter, up to poly-logarithmic factors.
13
New ideas in KLS
- Separated the analysis of the speedy walk (fast mixing in principle) from the Metropolis walk (efficient implementation).
- For volume computation: introduced isotropic position to reduce the diameter of the body.
Why the Poincaré constant?
- The generalization might lead to better analysis through other quantities (log-Sobolev constant, [Frieze & Kannan, Jerrum & Son]).
- It captures the same difficulty; our focus here is a survey of this approach.
14
Well-studied Quantities
Poincaré constant: the infimum, over non-constant functions, of the Dirichlet form divided by the variance; see the display below. The Dirichlet form is a local variance, and the Poincaré constant measures how fast the variance decays under the chain.
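A hedged reconstruction of the standard definitions the slide refers to (notation may differ from the thesis):

```latex
\operatorname{Var}_\pi(f) \;=\; \mathbb{E}_\pi\!\bigl[f^2\bigr] - \bigl(\mathbb{E}_\pi[f]\bigr)^2,
\qquad
\mathcal{E}(f,f) \;=\; \tfrac{1}{2}\int_K\!\int_K \bigl(f(x)-f(y)\bigr)^2\, P(x,dy)\, d\pi(x),
\qquad
\lambda \;=\; \inf_{f\ \text{non-constant}} \frac{\mathcal{E}(f,f)}{\operatorname{Var}_\pi(f)}.
```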
15
Well-studied Quantities 2 (properties of the Poincaré constant)
Thm: for a lazy, reversible Markov chain, the distance to the stationary distribution after t steps decays geometrically at a rate governed by the Poincaré constant, where σ_t denotes the distribution after t steps and π the stationary distribution; see the display below.
For Markov chains defined on finite state spaces, the Poincaré constant equals the spectral gap.
(Lazy: with probability ½ the chain stays at the same state; reversible corresponds to symmetric chains.)
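One standard form of this decay statement, given here as a hedged reconstruction (the thesis may measure the distance differently):

```latex
\operatorname{Var}_\pi\!\bigl(P^{t} f\bigr) \;\le\; (1-\lambda)^{2t}\,\operatorname{Var}_\pi(f)
\quad\text{for every } f,
\qquad\text{so roughly } t = O\!\bigl(\lambda^{-1}\log(1/\varepsilon)\bigr)\ \text{steps suffice from a warm start.}
```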
16
Thoughts about the Poincaré constant
If the chain moves to a fresh sample from the stationary distribution in a single step, then the Dirichlet form equals the variance. Thus, in this case the Poincaré constant equals 1 and the chain mixes (very) rapidly. Intuitively, this corresponds to a complete graph, where we can get from any point to any other point.
17
Well-studied Quantities 3
The conductance equals the Poincaré ratio restricted to indicator functions of sets, so trivially the Poincaré constant is at most (twice) the conductance. The Cheeger-type inequality in the other direction, bounding the Poincaré constant from below by half the squared conductance, is due to Jerrum and Sinclair '89; see the display below.
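A hedged reconstruction of the standard statements behind the slide (constants may be normalized differently in the thesis):

```latex
\Phi \;=\; \inf_{0 < \pi(S) \le 1/2}\ \frac{\displaystyle\int_S P\bigl(x,\, K\setminus S\bigr)\, d\pi(x)}{\pi(S)},
\qquad
\frac{\Phi^{2}}{2} \;\le\; \lambda \;\le\; 2\,\Phi.
```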
18
Properties of Ball Walks
Local conductance ℓ(x): the fraction of the δ-ball around x that lies inside the body.
Stationary distributions: the Metropolis walk is uniform over K; the speedy walk has density proportional to ℓ(x). Both walks are reversible. See the display below.
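A hedged reconstruction matching the usual ball-walk definitions:

```latex
\ell(x) \;=\; \frac{\operatorname{vol}\bigl(B(x,\delta)\cap K\bigr)}{\operatorname{vol}\bigl(B(x,\delta)\bigr)},
\qquad
\text{Metropolis walk: } \pi \text{ uniform on } K,
\qquad
\text{speedy walk: } d\pi_{s}(x) \;\propto\; \ell(x)\, dx.
```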
19
From Speedy Walk to Uniform Sampling (Overview)
- bound the Poincaré constant for the speedy walk
- mixing time for the speedy walk
- running time of the Metropolis walk (assuming a good starting distribution)
- obtain a good starting distribution
- from a sample point from the speedy distribution, obtain a sample point close to the uniform distribution
20
From Speedy Walk to Uniform Sampling
Poincaré inequality (for the speedy walk): if the step size is small enough, then the Poincaré constant of the speedy walk is bounded below, up to a dimension-independent constant, in terms of the step size and the diameter.
Mixing time of the speedy walk: for a given ε, the distribution after sufficiently many steps is within ε of the speedy distribution (assuming a reasonable starting distribution).
Thm: for a lazy, reversible Markov chain, the distance to stationarity decays at a rate governed by the Poincaré constant (as in the display on the "Well-studied Quantities 2" slide), where σ_t is the distribution after t steps and π is the stationary distribution.
22
From Speedy Walk to Uniform Sampling 2
From the speedy walk to the Metropolis walk: run the Metropolis walk until it has made the required number of speedy steps; the speedy walk is exactly the Metropolis walk observed at the times it actually moves (its proper steps).
Mixing time of the Metropolis walk: if the step size is small enough, then (except with small probability) the expected total number of steps (speedy plus wasted Metropolis steps) is at most roughly the speedy bound divided by the average local conductance. A sketch of this coupling follows.
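A minimal sketch of running the Metropolis walk until it has made a prescribed number of proper (speedy) steps; `propose` stands for a uniform sampler of the δ-ball, e.g. the `random_ball_point` helper from the earlier sketch, and is an illustrative name, not the thesis's:

```python
def run_until_proper_steps(x, n_proper, propose, in_body):
    """Run the Metropolis ball walk until it has made n_proper proper (moving) steps.

    The sequence of distinct points visited is a run of the speedy walk; the wasted,
    non-moving Metropolis steps are what the average local conductance controls.
    """
    total_steps = 0
    proper_steps = 0
    while proper_steps < n_proper:
        y = propose(x)      # uniform point of the delta-ball around x
        total_steps += 1
        if in_body(y):      # proposal inside the body: a proper (speedy) step
            x = y
            proper_steps += 1
    return x, total_steps
```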
24
From Speedy Walk to Uniform Sampling 3
Obtaining a good starting distribution: define a growing sequence of convex bodies, starting from a body that is easy to sample from directly and ending with K.
Algo: sample x_0 from the first body according to its easy distribution; for each subsequent body, obtain x_i by running the Metropolis walk in that body starting at x_{i-1}. A sketch of this bootstrapping pattern follows.
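A minimal sketch of the bootstrapping pattern described above; the construction of the intermediate bodies and the number of Metropolis steps per phase are assumptions here (the thesis fixes both), and `walk_step` stands for one Metropolis ball-walk step, e.g. `metropolis_step` from the earlier sketch:

```python
def bootstrap_start(bodies, sample_first, walk_step, steps_per_phase):
    """Warm-start each body of a growing sequence from a walk in the previous one.

    bodies:          membership oracles for K_1, ..., K_m = K (K_0 is handled by sample_first).
    sample_first:    draws a point from K_0, which is simple enough to sample directly.
    walk_step:       walk_step(x, in_body) performs one Metropolis ball-walk step in the
                     body described by the membership oracle in_body.
    steps_per_phase: how long to walk in each body, chosen so each phase starts warm.
    """
    x = sample_first()
    for in_body in bodies:
        for _ in range(steps_per_phase):
            x = walk_step(x, in_body)
    return x
```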
25
From Speedy Walk to Uniform Sampling 4
Good starting distribution for the Metropolis walk.
Thm: for a sufficiently small step size and error parameter, the distribution of the final point is within the required distance of the target distribution. The expected total number of oracle calls (except with small probability) is polynomially bounded.
27
From Speedy Walk to Uniform Sampling 5
From the speedy distribution to the uniform distribution.
Algo: shrink the step size, sample a point from the speedy distribution until an acceptance test succeeds, and return that point.
Thm: if the step size and the error parameter are sufficiently small, then the distribution of the returned point is within the required distance of the uniform distribution, and the expected number of speedy samples needed is small. A hedged sketch of one such acceptance scheme follows.
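One standard way to remove the ℓ(x) bias of the speedy distribution is rejection sampling; the sketch below is an assumption-laden illustration, not necessarily the acceptance rule used in the thesis. It assumes a lower bound `ell_min` on the local conductance over the body (which is what shrinking the step size helps with) and an oracle `local_conductance` for ℓ(x):

```python
import random

def speedy_to_uniform(sample_speedy, local_conductance, ell_min):
    """Rejection step: turn (near-)speedy samples into (near-)uniform samples.

    The speedy stationary density is proportional to the local conductance ell(x),
    so accepting a speedy sample x with probability ell_min / ell(x) removes the bias.
    """
    while True:
        x = sample_speedy()                               # approximate speedy sample
        if random.random() < ell_min / local_conductance(x):
            return x                                      # accepted samples are near-uniform
```

Under these assumptions, the expected number of speedy samples per accepted point is at most 1/ell_min, so the scheme is efficient only when the local conductance is bounded away from zero.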
29
Proof of the Poincaré Inequality
Restricted variance, Dirichlet form, and expected value (the same quantities as before, restricted to subsets of the body).
Poincaré inequality (for the speedy walk): if the step size is small enough, then for any function the Dirichlet form dominates the variance up to a dimension-independent constant and a factor depending on the step size and the diameter.
30
Idea of the proof:
- For a sufficiently small set on which the function does not vary much, the desired inequality holds.
- Assuming the Poincaré inequality does not hold, we find a set contradicting the above.
- Find a needle-like body such that, wlog, the violation is preserved.
- Chop the needle-like body to obtain the desired set.
31
Needle-like Body
Eliminate dimensions one by one (inductively): assume the body has several fat dimensions, and while it has at least two, project it onto two of them. There exists a point such that any line through it cuts the projection into approximately equal halves; take a hyperplane through this point, so that at least one of the two resulting halves must still satisfy the required property.
32
Shrinking the Last Dimension
Goal: find a piece whose extent in the last dimension is small and whose relevant ratio is controlled by a constant (dependent on the step size). How? Chop the needle-like body into slabs along the last dimension. Ideally a single slab would already do, but this need not be the case.
33
Under an additional assumption on the step size, the idea is to relate the quantity we need to bound to the corresponding quantities restricted to the slabs. This yields an intermediate bound, and the next goal is to bound the remaining term.
34
Chopping the Needle
What do we need? The function should not vary much within each slab, and the width of each slab should be bounded. We will show that this chopping allows us to bound the required quantity appropriately.
35
Properties of the Local Conductance
By the Brunn-Minkowski theorem, (an appropriate power of) the local conductance is concave over the body; see the display below. The local conductance is also Lipschitz over the body: nearby points have nearby local conductance.
Implications for the needle: the width first increases, then is at full width, and then decreases; for a sufficiently small step size, the width of any slab is bounded below.
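The concavity claim follows from the Brunn-Minkowski inequality; a hedged reconstruction of the statement (the exact power used in the thesis may differ):

```latex
\operatorname{vol}\bigl(\tfrac12 A + \tfrac12 B\bigr)^{1/n}
\;\ge\; \tfrac12\operatorname{vol}(A)^{1/n} + \tfrac12\operatorname{vol}(B)^{1/n}
\quad\Longrightarrow\quad
x \;\mapsto\; \operatorname{vol}\bigl(B(x,\delta)\cap K\bigr)^{1/n}\ \text{is concave on } K.
```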
36
From Dinghas' theorem: the middle (full-width) section is convex. Now we can split the sum into several parts and estimate them separately; the contribution of the middle section is controlled in terms of the number of slabs it contains. What do we do outside the middle section?
37
In the left section, the slab widths increase exponentially. This allows us to bound that part of the sum. We obtain similar bounds for the other parts of the sum; putting them together gives the bound we wanted, and thus the Poincaré inequality is proved.
38
THANK YOU