
1 Date: 2005/4/25 Advisor: Sy-Yen Kuo Speaker: Szu-Chi Wang

2 Outline
- Notation and Preliminaries
- Rapid Mixing Markov Chains
- Commonly Studied Models
- Conclusions and Future Work
- References

3 Notation
- A Markov chain is specified by its transition matrix P.
- Let π_0 be the initial distribution and π_t be the distribution after t steps. The dynamics follow π_{t+1} = π_t · P, i.e. π_t = π_0 · P^t.
- If P is irreducible and aperiodic (viz. ergodic), then π_t converges to a unique stationary distribution π such that π = π · P (independent of π_0).
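A minimal numerical sketch of this convergence (the 3-state transition matrix below is illustrative, not from the slides):

```python
import numpy as np

# A small ergodic (irreducible and aperiodic) transition matrix P.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

pi_t = np.array([1.0, 0.0, 0.0])      # initial distribution pi_0
for _ in range(200):
    pi_t = pi_t @ P                   # pi_{t+1} = pi_t * P

# Stationary distribution: the left eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

print(pi_t)   # distribution after 200 steps
print(pi)     # stationary distribution; the two agree to many decimal places
```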

4 Preliminaries
- Conceptually, M defines a random walk over the state space Ω (viz. moving from one configuration to another).
- Designing a Markov chain that converges quickly to the desired distribution provides a useful tool for hard sampling problems.
- Two questions immediately arise:
  1. How do we modify this chain in order to sample from a complicated distribution?
  2. How long do we have to simulate the walk before we can trust our samples (viz. before they are chosen from a distribution very close to π)?

5 The Metropolis Algorithm
- The most celebrated technique for assigning the transition probabilities of a Markov chain so that it converges to any chosen distribution.
- Let π be the desired probability distribution and d_i be the degree of node i. For each neighbor j of node i, the move from i to j is accepted with probability min(1, π_j / π_i); a laziness factor of 1/2 keeps the chain aperiodic, and the only required knowledge is the ratio π_j / π_i.
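The slide leaves the exact transition rule to a figure; one common textbook form is a lazy Metropolis walk on a graph, which with probability 1/2 holds, otherwise proposes one of Δ neighbor "slots" uniformly at random and accepts with probability min(1, π_j/π_i). A sketch under that assumption; the 4-cycle and the target π below are made up for illustration:

```python
import random

def metropolis_step(i, graph, pi, max_deg):
    """One lazy Metropolis move; only the ratio pi[j]/pi[i] is required knowledge."""
    # Laziness factor: with probability 1/2, stay put (ensures aperiodicity).
    if random.random() < 0.5:
        return i
    # Propose a uniformly random slot among max_deg slots; unused slots self-loop.
    k = random.randrange(max_deg)
    if k >= len(graph[i]):
        return i
    j = graph[i][k]
    # Accept with probability min(1, pi(j)/pi(i)).
    if random.random() < min(1.0, pi[j] / pi[i]):
        return j
    return i

# Illustrative 4-cycle with a non-uniform target distribution.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
pi = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
max_deg = max(len(nbrs) for nbrs in graph.values())

state, counts = 0, {v: 0 for v in graph}
for _ in range(200_000):
    state = metropolis_step(state, graph, pi, max_deg)
    counts[state] += 1
print({v: c / 200_000 for v, c in counts.items()})   # empirical frequencies ~ pi
```

Detailed balance holds because π_i · (1/(2Δ)) · min(1, π_j/π_i) = (1/(2Δ)) · min(π_i, π_j) is symmetric in i and j, so π is indeed the stationary distribution of this walk.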

6 The Convergence Time
- Thus the next question to ask is how quickly π_t converges to π.
- Relevant metrics:
  1. The total variation distance between π_t and π is ||π_t − π||_TV = (1/2) Σ_{x∈Ω} |π_t(x) − π(x)|.
  2. For ε > 0, the mixing time is defined as τ(ε) = min { t : ||π_{t'} − π||_TV ≤ ε for all t' ≥ t }.
- A Markov chain is called rapidly mixing if τ(ε) is bounded above by a polynomial in n and log(1/ε).
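A sketch of how these quantities can be measured numerically for a small chain; note that the true mixing time maximizes over all starting distributions, while this sketch checks a single π_0 (matrix, ε, and horizon are illustrative):

```python
import numpy as np

def total_variation(p, q):
    # ||p - q||_TV = (1/2) * sum_x |p(x) - q(x)|
    return 0.5 * np.abs(p - q).sum()

def empirical_mixing_time(P, pi0, pi, eps=0.01, horizon=10_000):
    """Smallest t with ||pi_t - pi||_TV <= eps for this particular pi_0."""
    p = pi0.copy()
    for t in range(horizon):
        if total_variation(p, pi) <= eps:
            return t
        p = p @ P
    return None   # did not mix within the horizon

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

print(empirical_mixing_time(P, np.array([1.0, 0.0, 0.0]), pi))
```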

7 Foundations of Algebraic Graph Theory
- Let G(V, E) be an n-vertex, undirected graph with maximum degree Δ.
- Given the canonical labeling of eigenvalues λ_1 ≥ λ_2 ≥ ... ≥ λ_n and orthonormal eigenvectors e_i of the adjacency matrix A(G):
  1. If G is connected, then λ_2 < λ_1.
  2. For 1 ≤ i ≤ n, |λ_i| ≤ Δ.
  3. Δ is an eigenvalue iff G is regular.
  4. If G is d-regular, then the eigenvalue λ_1 = d has the eigenvector (1/√n)(1, 1, ..., 1).
  5. G is bipartite iff for every eigenvalue λ there is an eigenvalue −λ (of the same multiplicity).
  6. Suppose that G is connected; then G is bipartite iff −λ_1 is an eigenvalue.
  7. If G is d-regular and bipartite, then λ_n = −d, and the corresponding eigenvector e_n has entries +1/√n on one side of the bipartition and −1/√n on the other.
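A quick numerical check of several of these facts on a small graph (the 4-cycle C_4, a 2-regular bipartite graph, is chosen for illustration):

```python
import numpy as np

# Adjacency matrix of C_4, a 2-regular bipartite graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

lam = np.sort(np.linalg.eigvalsh(A))[::-1]   # lambda_1 >= ... >= lambda_n
delta = A.sum(axis=1).max()                  # maximum degree

print(lam)                                   # [ 2.  0.  0. -2.]
print(np.isclose(lam[0], delta))             # regular graph: lambda_1 = Delta (facts 3-4)
print(np.isclose(lam[-1], -lam[0]))          # bipartite: -lambda_1 is an eigenvalue (facts 6-7)
```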

8 The Mixing Time
- It is well established that the eigenvalue gap of the transition matrix provides a good bound on the mixing rate.
- Let λ_0, λ_1, ..., λ_{|Ω|−1} be the eigenvalues of P, with 1 = λ_0 > |λ_1| ≥ |λ_i| for all i ≥ 2. Let λ_max = max_{i ≥ 1} |λ_i|; then for all x ∈ Ω we have τ_x(ε) ≤ (1 / (1 − λ_max)) · (ln(1/π(x)) + ln(1/ε)).
- Practically, determining the eigenvalues tends to be far too difficult.
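A sketch of evaluating this spectral bound for a small reversible chain (the symmetric matrix below is illustrative, so its stationary distribution is uniform):

```python
import numpy as np

P = np.array([[0.6, 0.2, 0.2],
              [0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6]])     # symmetric => reversible, pi uniform

eigs = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]   # |lambda_0| = 1 comes first
lam_max = eigs[1]                                    # largest non-trivial |eigenvalue|

pi_x, eps = 1.0 / 3.0, 0.01
bound = (1.0 / (1.0 - lam_max)) * (np.log(1.0 / pi_x) + np.log(1.0 / eps))
print(lam_max, bound)   # spectral-gap upper bound on the mixing time from state x
```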

9 Techniques for Bounding Mixing Times
- Conductance
  For any set S ⊆ Ω let Q(S, S̄) = Σ_{x∈S, y∉S} Q(x, y), where Q(x, y) = π(x) P(x, y) is regarded as the capacity of (x, y), and Φ_S = Q(S, S̄) / π(S).
  The conductance is defined as Φ = min_{S : π(S) ≤ 1/2} Φ_S.
- For a finite, reversible, ergodic Markov chain M with loop probabilities ≥ 1/2 for all states, the mixing time of M satisfies τ_x(ε) ≤ (2/Φ²) · (ln(1/π(x)) + ln(1/ε)).
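A brute-force sketch of the conductance for a tiny reversible chain (enumerating all subsets is exponential in |Ω|, so this only works for toy examples; the chain below is illustrative):

```python
import numpy as np
from itertools import combinations

def conductance(P, pi):
    """Phi = min over S with pi(S) <= 1/2 of Q(S, S-bar)/pi(S), Q(x, y) = pi(x)P(x, y)."""
    n = len(pi)
    Q = pi[:, None] * P
    best = np.inf
    for r in range(1, n):
        for S in combinations(range(n), r):
            S = list(S)
            piS = pi[S].sum()
            if piS > 0.5:
                continue
            Sbar = [x for x in range(n) if x not in S]
            best = min(best, Q[np.ix_(S, Sbar)].sum() / piS)
    return best

# A reversible 3-state chain with self-loop probability 1/2 everywhere.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.5, 0.25])    # its stationary distribution
print(conductance(P, pi))
```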

10 Techniques for Bounding Mixing Times (cont.)
- Coupling
  A coupling is a Markov chain (X_t, Y_t) on Ω × Ω defining a stochastic process with the properties:
  I. Each of the processes X_t and Y_t is a faithful copy of M (given initial states X_0 = x and Y_0 = y).
  II. If X_t = Y_t then X_{t+1} = Y_{t+1}.
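For concreteness, a simulation sketch of one classical coupling: two lazy random walks on the cycle Z_n, where a fair coin decides which copy moves before the walks meet. The chain, n, and the starting states are illustrative, not from the slides:

```python
import random

def coupled_step(x, y, n):
    """One step of a classic coupling for the lazy random walk on Z_n.
    Marginally, each copy holds with prob 1/2 and moves +/-1 with prob 1/4 each."""
    step = random.choice([-1, 1])
    if x == y:
        # After meeting, the two copies move together forever (property II).
        if random.random() < 0.5:
            x, y = (x + step) % n, (y + step) % n
        return x, y
    # Before meeting: a fair coin decides which copy takes the step; the other holds.
    if random.random() < 0.5:
        return (x + step) % n, y
    return x, (y + step) % n

n, x, y, t = 20, 0, 10, 0
while x != y:
    x, y = coupled_step(x, y, n)
    t += 1
print("coupling time:", t)   # the expected coupling time upper-bounds the mixing time
```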

11 Techniques for Bounding Mixing Times (cont.)
- Path Coupling
  Let δ be an integer-valued metric defined on Ω × Ω which takes values in {0, ..., D}.
  Let S be a subset of Ω × Ω such that for all (X_t, Y_t) ∈ Ω × Ω there exists a path X_t = Z_0, Z_1, ..., Z_r = Y_t between X_t and Y_t in which (Z_i, Z_{i+1}) ∈ S for 0 ≤ i < r and Σ_i δ(Z_i, Z_{i+1}) = δ(X_t, Y_t).
  Suppose a coupling (X, Y) → (X', Y') of the Markov chain M is defined on all pairs (X, Y) ∈ S such that there exists β < 1 s.t. E[δ(X', Y')] ≤ β · δ(X, Y) for all (X, Y) ∈ S; then the mixing time of M satisfies τ(ε) ≤ ln(D/ε) / (1 − β).

12 Commonly Studied Model
- For G = (V, E), let C = {1, ..., k} be the set of colors and N(v) denote the neighbors of v. A proper k-coloring is an assignment σ : V → C such that all adjacent vertices receive different colors.
- The positive-recurrent states of M are the proper k-colorings of G, and the chain is ergodic on these states.
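The slides do not spell out the transition rule of M; the chain usually studied for colorings is single-site (Glauber/Metropolis) dynamics: pick a vertex v and a color c uniformly at random and recolor v with c if the coloring stays proper. A minimal sketch under that assumption (the 5-cycle and k = 5 are illustrative):

```python
import random

def glauber_step(coloring, graph, k):
    """Pick v and c uniformly at random; recolor v with c if no neighbor uses c."""
    v = random.choice(list(graph))
    c = random.randrange(k)
    if all(coloring[w] != c for w in graph[v]):
        coloring[v] = c
    return coloring

# Illustrative: a 5-cycle with k = 5 colors (k >= 2*Delta + 1, so rapid mixing applies).
graph = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
coloring = {i: i for i in range(5)}           # start from a proper coloring
for _ in range(10_000):
    coloring = glauber_step(coloring, graph, 5)
print(coloring)                               # an (approximately uniform) proper 5-coloring
```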

13 Illustration of Path Coupling
(Figure: two chains X_t and Y_t that differ only at a single vertex u.) Only updates choosing z ∈ N(u) and a color c ∈ {c_x, c_y} may succeed or fail in exactly one chain; the path coupling argument then gives rapid mixing if k > 2Δ.

14 A Cutting-Edge Study
Non-uniform Random Membership Management in Peer-to-Peer Networks
Ming Zhong, Kai Shen, Joel Seiferas
INFOCOM 2005

15 Electrical Networks
(Figure: a small example network with labeled nodes a, b, c, branches, and resistances; injecting a 1.0 amp current and solving via Kirchhoff's Law and Ohm's Law gives the node voltages, 0.5 volt and 1.0 volt.)

16 Electrical Networks (cont.)
- Given G, let the electrical network N(G) be defined as follows: (1) it has a node for each vertex in V; (2) for every edge in E it has a 1.0 ohm resistance in N(G).
- Using the language of electrical network theory for N(G), the effective resistance R_uv between two nodes u and v is |volt(u) − volt(v)| when one amp is injected into u and removed from v.
- The commute time C_uv between two nodes u and v is the expected time for a random walk starting at u to return to u after at least one visit to v.
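A sketch computing R_uv from the graph Laplacian and the commute time via the standard identity C_uv = 2m·R_uv recalled in the corollaries on the next slide (the path graph below is illustrative):

```python
import numpy as np

def effective_resistance(A, u, v):
    """R_uv for unit resistances: inject 1 amp at u, remove it at v, read off voltages."""
    L = np.diag(A.sum(axis=1)) - A           # graph Laplacian
    Lplus = np.linalg.pinv(L)                # Moore-Penrose pseudoinverse
    current = np.zeros(len(A))
    current[u], current[v] = 1.0, -1.0       # 1 amp in at u, out at v
    volts = Lplus @ current                  # node potentials (Kirchhoff + Ohm)
    return volts[u] - volts[v]

# Path graph a - b - c with 1.0 ohm resistances.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
m = int(A.sum() / 2)                         # number of edges

R_ac = effective_resistance(A, 0, 2)
print(R_ac)                                  # 2.0 ohms (two unit resistors in series)
print(2 * m * R_ac)                          # commute time C_ac = 2m * R_ac = 8
```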

17 Electrical Networks (cont.)
- Corollaries
  1. For any two vertices u and v in G, the commute time satisfies C_uv = 2m · R_uv.
  2. Let T be any spanning tree of G and C(G) denote the cover time; then C(G) ≤ Σ_{(u,v)∈T} C_uv ≤ 2m(n − 1).
  3. The resistance R(G) = max_{u,v} R_uv characterizes the cover time: m·R(G) ≤ C(G) ≤ O(m·R(G)·log n).
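A Monte Carlo sketch comparing an estimated cover time against the 2m(n − 1) spanning-tree bound of corollary 2 (the cycle C_8 and the trial count are illustrative):

```python
import random

def cover_time_once(graph, start):
    """Steps until a simple random walk has visited every vertex at least once."""
    seen, v, steps = {start}, start, 0
    while len(seen) < len(graph):
        v = random.choice(graph[v])
        seen.add(v)
        steps += 1
    return steps

# Illustrative: the cycle C_8 (n = 8 vertices, m = 8 edges).
n = 8
graph = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
m = sum(len(nbrs) for nbrs in graph.values()) // 2

trials = 2_000
avg = sum(cover_time_once(graph, 0) for _ in range(trials)) / trials
print(avg)                  # ~ n*(n-1)/2 = 28 for a cycle
print(2 * m * (n - 1))      # spanning-tree bound: 112
```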

18 Conclusions and Future Work
- Markov chain Monte Carlo serves as a computational means for approximate sampling from large and complicated sets.
- Future directions might include:
  - Membership management in large-scale distributed networks
  - Information dissemination in sensor/mobile ad hoc networks
  - Reliable surveillance systems
  - Interdisciplinary studies (e.g. in statistical physics, the probability of a configuration is related to its energy)

19 References
[1] R. Motwani and P. Raghavan, Randomized Algorithms, Cambridge University Press, 1995.
[2] R. Bubley and M. Dyer, "Path Coupling: A Technique for Proving Rapid Mixing in Markov Chains," Proc. of the 38th IEEE FOCS, 1997.
[3] D. Randall, "Mixing," Proc. of the 44th IEEE FOCS, 2003.
[4] M. Zhong, K. Shen, and J. Seiferas, "Non-uniform Random Membership Management in Peer-to-Peer Networks," Proc. of IEEE INFOCOM, 2005.

20 Thanks for Your Attention

