Date: 2005/4/25
Advisor: Sy-Yen Kuo
Speaker: Szu-Chi Wang

Outline  Notation and Preliminaries  Rapid Mixing Markov Chains  Commonly Studied Models  Conclusions and Future Works  References

Notation  A Markov chain is specified by the transition matrix P  Let  0 be the initial distribution and  t be the distribution after t steps The dynamics follows  If P is irreducible and aperiodic (viz ergodic) then  t converges to a unique stationary distribution  such that (independent of  0 )

Preliminaries  Conceptually M defines a random walk over  (viz moving from one configuration to another)  Design a Markov chain that would converge quickly to the desired distribution provides a useful tool for hard sampling problems  Two questions immediately arise 1. How do we modify this chain in order to sample from a complicated distribution? 2. How long do we have to simulate the walk before we can trust our samples? (viz they are chose from a distribution very close to  )

The Metropolis Algorithm
- The most celebrated technique for assigning the transition probabilities of a Markov chain so that it converges to any chosen distribution.
- Let $\pi$ be the desired probability distribution and $d_i$ the degree of node i. For each neighbor j of node i, set
  $P(i,j) = \frac{1}{2 d_i} \min\left(1, \frac{\pi_j d_i}{\pi_i d_j}\right)$, and $P(i,i) = 1 - \sum_{j \neq i} P(i,j)$.
  The factor 1/2 is the laziness factor; the ratio $\pi_j / \pi_i$ is the only knowledge of $\pi$ required.
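A minimal sketch of a Metropolis chain of this form (Python; the graph, its adjacency-list encoding, and the target weights pi are hypothetical, and pi only needs to be known up to the ratio pi[j]/pi[i]):

```python
import random

def metropolis_step(i, graph, pi):
    """One lazy Metropolis step from state i.
    graph: dict mapping each node to its list of neighbors.
    pi:    dict of (unnormalized) target weights."""
    if random.random() < 0.5:          # laziness factor 1/2
        return i
    j = random.choice(graph[i])        # propose a uniformly random neighbor
    d_i, d_j = len(graph[i]), len(graph[j])
    accept = min(1.0, (pi[j] * d_i) / (pi[i] * d_j))
    return j if random.random() < accept else i

# Hypothetical example: a 4-cycle with non-uniform target weights.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
pi = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
state = 0
for _ in range(10000):
    state = metropolis_step(state, graph, pi)
```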

The Convergence Time
- The next question is how quickly $\pi_t$ converges to $\pi$.
- Relevant metrics:
  1. The total variation distance between $\pi_t$ and $\pi$ is $\|\pi_t - \pi\|_{TV} = \frac{1}{2} \sum_{x \in \Omega} |\pi_t(x) - \pi(x)|$.
  2. For $\varepsilon > 0$, the mixing time is $\tau(\varepsilon) = \min\{t : \|\pi_{t'} - \pi\|_{TV} \le \varepsilon \text{ for all } t' \ge t \text{ and all } \pi_0\}$.
- A Markov chain is called rapidly mixing if $\tau(\varepsilon)$ is bounded above by a polynomial in n and $\log(1/\varepsilon)$.
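A small sketch of these two metrics (Python/numpy; P and pi are assumed to be given as a transition matrix and stationary distribution for a small hypothetical chain):

```python
import numpy as np

def tv_distance(p, q):
    # ||p - q||_TV = (1/2) * sum_x |p(x) - q(x)|
    return 0.5 * np.abs(p - q).sum()

def mixing_time(P, pi, eps=0.25, max_t=10_000):
    """Smallest t such that every starting state is within eps of pi."""
    n = P.shape[0]
    dists = np.eye(n)                  # one point-mass distribution per start state
    for t in range(1, max_t + 1):
        dists = dists @ P
        if max(tv_distance(dists[i], pi) for i in range(n)) <= eps:
            return t
    return None
```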

Foundations of Algebraic Graph Theory
- Let G(V, E) be an n-vertex, undirected graph with maximum degree $\Delta$.
- Given the canonical labeling of eigenvalues $\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_n$ and orthonormal eigenvectors $e_i$ of the adjacency matrix A(G):
  1. If G is connected, then $\lambda_2 < \lambda_1$.
  2. For $1 \le i \le n$, $|\lambda_i| \le \Delta$.
  3. $\Delta$ is an eigenvalue iff G is regular.
  4. If G is d-regular, then the eigenvalue $\lambda_1 = d$ has the eigenvector $\frac{1}{\sqrt{n}}(1, 1, \dots, 1)$.
  5. G is bipartite iff for every eigenvalue $\lambda$ there is an eigenvalue $-\lambda$.
  6. Suppose G is connected; then G is bipartite iff $-\lambda_1$ is an eigenvalue.
  7. If G is d-regular and bipartite, then $\lambda_n = -d$, with an eigenvector taking the value $\frac{1}{\sqrt{n}}$ on one side of the bipartition and $-\frac{1}{\sqrt{n}}$ on the other.
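A quick numerical check of a few of these spectral facts (Python/numpy; the graph is a hypothetical 3-regular bipartite example, the complete bipartite graph K_{3,3}):

```python
import numpy as np

# Adjacency matrix of K_{3,3}: 3-regular and bipartite.
A = np.block([[np.zeros((3, 3)), np.ones((3, 3))],
              [np.ones((3, 3)),  np.zeros((3, 3))]])

evals = np.sort(np.linalg.eigvalsh(A))[::-1]   # eigenvalues in descending order
d = 3
print(np.isclose(evals[0], d))             # lambda_1 = d for a d-regular graph
print(np.isclose(evals[-1], -d))           # d-regular + bipartite => lambda_n = -d
print(np.all(np.abs(evals) <= d + 1e-9))   # |lambda_i| <= max degree
```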

The Mixing Time
- It is well established that the eigenvalue gap of the transition matrix gives a good bound on the mixing rate.
- Let $\lambda_0, \lambda_1, \dots, \lambda_{|\Omega|-1}$ be the eigenvalues of P, with $1 = \lambda_0 > |\lambda_1| \ge |\lambda_i|$ for all $i \ge 2$. Let $\lambda_{\max} = \max_{i \ge 1} |\lambda_i|$; then for all $x \in \Omega$ we have
  $\tau_x(\varepsilon) \le \frac{1}{1 - \lambda_{\max}} \left( \ln \frac{1}{\pi(x)} + \ln \frac{1}{\varepsilon} \right)$.
- In practice, determining the eigenvalues tends to be far too difficult.
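A sketch of this eigenvalue-gap bound (Python/numpy; P is a reversible transition matrix and pi its stationary distribution, both assumed given as numpy arrays). Note that it computes the spectrum by brute force, which is exactly what is infeasible for the exponentially large chains of interest:

```python
import numpy as np

def spectral_mixing_bound(P, pi, eps):
    """tau_x(eps) <= (1 / (1 - lambda_max)) * (ln(1/pi(x)) + ln(1/eps)),
    taking the worst starting state x."""
    evals = np.linalg.eigvals(P)
    lam_max = sorted(np.abs(evals))[-2]   # largest modulus other than lambda_0 = 1
    gap = 1.0 - lam_max
    return (np.log(1.0 / pi.min()) + np.log(1.0 / eps)) / gap
```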

Techniques for Bounding Mixing Times
- Conductance. For any set $S \subseteq \Omega$ let
  $\Phi_S = \frac{\sum_{x \in S,\, y \notin S} \pi(x) P(x, y)}{\pi(S)}$,
  where $\pi(x) P(x, y)$ is regarded as the capacity of (x, y) and $\pi(S) = \sum_{x \in S} \pi(x)$. The conductance is defined as $\Phi = \min_{S :\, \pi(S) \le 1/2} \Phi_S$.
- For a finite, reversible, ergodic Markov chain M with self-loop probability $\ge 1/2$ at every state, the mixing time of M satisfies
  $\tau_x(\varepsilon) \le \frac{2}{\Phi^2} \left( \ln \frac{1}{\pi(x)} + \ln \frac{1}{\varepsilon} \right)$.
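A brute-force sketch of the conductance $\Phi$ (Python/numpy; it enumerates all subsets S, so it is only feasible for a hypothetical chain with a handful of states):

```python
import itertools
import numpy as np

def conductance(P, pi):
    """Phi = min over S with pi(S) <= 1/2 of (sum_{x in S, y not in S} pi(x)P(x,y)) / pi(S)."""
    n = len(pi)
    best = np.inf
    for r in range(1, n):
        for S in itertools.combinations(range(n), r):
            S = list(S)
            if pi[S].sum() > 0.5:          # only sets with pi(S) <= 1/2
                continue
            notS = [x for x in range(n) if x not in S]
            flow = sum(pi[x] * P[x, y] for x in S for y in notS)
            best = min(best, flow / pi[S].sum())
    return best
```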

Techniques for Bounding Mixing Times (cont.)
- Coupling. A coupling is a Markov chain on $\Omega \times \Omega$ defining a stochastic process $(X_t, Y_t)$ with the properties:
  I. Each of the processes $X_t$ and $Y_t$ is a faithful copy of M (given initial states $X_0 = x$ and $Y_0 = y$).
  II. If $X_t = Y_t$ then $X_{t+1} = Y_{t+1}$.
- The coupling lemma then bounds the distance from stationarity by the probability that the two copies have not yet met: $\|\pi_t - \pi\|_{TV} \le \Pr[X_t \neq Y_t]$.
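A small simulation sketch of one valid (hypothetical) coupling for the lazy random walk on an n-cycle (Python): the two walks move independently until they meet and identically afterwards, so each is marginally a faithful copy of the chain.

```python
import random

def coupled_step(x, y, n):
    """One step of a coupling of two lazy random walks on the n-cycle."""
    if x == y:                              # property II: once equal, move together
        move = random.choice([-1, 0, 0, 1])
        return (x + move) % n, (x + move) % n
    mx = random.choice([-1, 0, 0, 1])       # independent lazy moves until they meet
    my = random.choice([-1, 0, 0, 1])
    return (x + mx) % n, (y + my) % n

# The (expected) coupling time upper-bounds the mixing time via the coupling lemma.
n, x, y, t = 20, 0, 10, 0
while x != y:
    x, y = coupled_step(x, y, n)
    t += 1
print("coupled after", t, "steps")
```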

Techniques for Bounding Mixing Times (cont.)
- Path Coupling. Let $\delta$ be an integer-valued metric on $\Omega \times \Omega$ taking values in $\{0, \dots, D\}$. Let S be a subset of $\Omega \times \Omega$ such that for all $(X_t, Y_t) \in \Omega \times \Omega$ there exists a path $X_t = Z_0, Z_1, \dots, Z_r = Y_t$ between $X_t$ and $Y_t$ with $(Z_i, Z_{i+1}) \in S$ for each i and $\sum_i \delta(Z_i, Z_{i+1}) = \delta(X_t, Y_t)$.
- Suppose a coupling of the Markov chain M is defined on all pairs in S such that there exists $\beta < 1$ with $\mathrm{E}[\delta(X_{t+1}, Y_{t+1})] \le \beta \, \delta(X_t, Y_t)$ for all $(X_t, Y_t) \in S$. Then the mixing time of M satisfies
  $\tau(\varepsilon) \le \frac{\ln(D \varepsilon^{-1})}{1 - \beta}$.

Commonly Studied Model
- For G = (V, E), let $\Delta$ denote the maximum degree and N(v) the neighbors of v. A proper k-coloring is an assignment $\sigma : V \to \{1, \dots, k\}$ such that all adjacent vertices receive different colors.
- The positive-recurrent states of M are the proper k-colorings of G, and the chain is ergodic on these states.

Illustration of Path Coupling
- Consider two colorings X_t and Y_t that differ only at a single vertex u, which has color $c_x$ in one chain and $c_y$ in the other.
- Only updates at a vertex $z \in N(u)$ with a proposed color $c \in \{c_x, c_y\}$ may succeed in one chain and fail in the other.
- Path coupling then yields rapid mixing when the number of colors satisfies $k \ge 2\Delta + 1$.
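A sketch of the coloring chain analyzed this way, assuming it is the standard Glauber dynamics (Python; the graph, the starting coloring, and k are hypothetical): pick a vertex and a color uniformly at random and recolor only if the coloring stays proper.

```python
import random

def glauber_step(coloring, graph, k):
    """One step: recolor a random vertex with a random color if the result is proper."""
    v = random.choice(list(graph))
    c = random.randrange(k)
    if all(coloring[u] != c for u in graph[v]):   # proposed color avoids N(v)
        coloring[v] = c
    return coloring

# Hypothetical example: a 5-cycle (max degree 2) with k = 5 >= 2*Delta + 1 colors.
graph = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
coloring = {0: 0, 1: 1, 2: 0, 3: 1, 4: 2}          # a proper starting coloring
for _ in range(10000):
    coloring = glauber_step(coloring, graph, 5)
```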

A Cutting-Edge Study
- Non-uniform Random Membership Management in Peer-to-Peer Networks, Ming Zhong, Kai Shen, and Joel Seiferas, INFOCOM 2005.

Electrical Networks
[Figure: a small example network with nodes a, b, c and 1.0-ohm branch resistances; injecting a 1.0-amp current and solving via Kirchhoff's Law and Ohm's Law gives node voltages of 1.0 volt and 0.5 volt.]

Electrical Networks (cont.)  Given G, let N(G) be defined as (1) it has a node for each vertex in V (2) for every edge in E it has a 1.0 ohm resistance in N(G)  Use the language of electrical network theory for N(G) The effective resistance R uv between two u, v is |volt (u) – volt (v)| when one amp is injected into u and removed from v  The commute time C uv between two nodes u and v is the expected time for a random walk starting at u to return u after at least one visit to v

Electrical Networks (cont.)  Corollaries 1. For any two vertices u and v in G the commute time satisfies 2. Let T be any spanning tree of G and C(G) denote the cover time 3. The resistance of G characterizes its cover time:

Conclusions and Future Work
- Markov chain Monte Carlo serves as a computational means for approximate sampling from large and complicated sets.
- Future directions might include:
  - Membership management in large-scale distributed networks
  - Information dissemination in sensor/mobile ad hoc networks
  - Reliable surveillance systems
  - Interdisciplinary studies (e.g. in statistical physics, the probability of a configuration is related to its energy)

References
[1] R. Motwani and P. Raghavan, Randomized Algorithms, Cambridge University Press, 1995.
[2] R. Bubley and M. Dyer, Path Coupling: A Technique for Proving Rapid Mixing in Markov Chains, Proc. of 38th IEEE FOCS, 1997.
[3] D. Randall, Mixing, Proc. of 44th IEEE FOCS, 2003.
[4] M. Zhong, K. Shen, and J. Seiferas, Non-uniform Random Membership Management in Peer-to-Peer Networks, Proc. of IEEE INFOCOM, 2005.

Thanks for Your Attention