CS774 Markov Random Field: Theory and Application
Lecture 15, Kyomin Jung, KAIST, Oct 29, 2009
Sampling
What is sampling? Given a probability distribution π, pick a point according to π.
e.g. the Monte Carlo method for integration: choose points uniformly at random from the integration domain and average the value of f at those points.
Sometimes we do not know π itself but can only evaluate ratios of π at given points (e.g. in an MRF, where the normalizing constant is unknown).
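The Monte Carlo integration idea above can be sketched as follows; the function name and the test integrand are illustrative choices, not from the slides:

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b] by averaging f at
    uniformly random points and scaling by the interval length."""
    rng = random.Random(seed)
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Estimate the integral of x^2 over [0, 1]; the exact value is 1/3.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The estimate's standard error shrinks like 1/sqrt(n), independently of the dimension of the domain, which is what makes the method attractive.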
How to use Sampling?
Volume computation in Euclidean space.
In the MRF setting, one can compute marginal probabilities from random samples of the MRF on G.
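As a minimal sketch of the second point: given samples of an MRF's configuration vector (however they were obtained), a marginal is estimated by the empirical frequency of each value of one component. The helper name is my own:

```python
from collections import Counter

def empirical_marginal(samples, i):
    """Estimate the marginal distribution of component i from a list of
    sampled configurations (each a tuple of variable assignments)."""
    counts = Counter(s[i] for s in samples)
    n = len(samples)
    return {value: c / n for value, c in counts.items()}

# Toy example: four samples of a 2-variable configuration.
samples = [(0, 1), (1, 1), (0, 0), (0, 1)]
print(empirical_marginal(samples, 0))  # → {0: 0.75, 1: 0.25}
```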
Some side questions
In most randomized algorithms (e.g. sampling), we assume that we can draw uniform random numbers from {0,1} polynomially many times. In practice, we usually use only a one-time random seed, e.g. from a time function. A truly random binary sequence is expensive to obtain. An interesting research topic called "pseudo-random generators" deals with this gap.
The Gibbs Sampling
Developed by Geman & Geman 1984, Gelfand & Smith 1990.
Consider a random vector X = (X_1, ..., X_d) with distribution π. Suppose that the full set of conditional distributions π(X_i | X_j, j ≠ i), 1 ≤ i ≤ d, is available. In an MRF, each such conditional depends only on the neighbors of i.
The Gibbs Sampling
We assume that these conditional distributions can be sampled (in an MRF, this is possible). Start at some value x^(0) = (x_1^(0), ..., x_d^(0)). The algorithm: sample x_1^(1) from π(X_1 | x_2^(0), ..., x_d^(0)), then x_2^(1) from π(X_2 | x_1^(1), x_3^(0), ..., x_d^(0)), and so on.
The Gibbs Sampling
Cycle through the components again and again, up to some time n. At time n, update the i-th component by sampling x_i^(n) from π(X_i | x_1^(n), ..., x_{i-1}^(n), x_{i+1}^(n-1), ..., x_d^(n-1)).
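The update rule above can be sketched concretely for a small Ising-model MRF on a cycle, where each conditional depends only on the two neighboring spins. The model parameters (cycle size d, coupling beta) are illustrative assumptions, not from the slides:

```python
import math
import random

def gibbs_ising_cycle(d=5, beta=0.5, n_sweeps=1000, seed=0):
    """Gibbs sampling for an Ising model on a d-cycle:
    pi(x) proportional to exp(beta * sum_i x_i * x_{i+1}), x_i in {-1, +1}.
    Each sweep cycles through the components, resampling x_i from its
    conditional given the current values of its two neighbors."""
    rng = random.Random(seed)
    x = [rng.choice([-1, 1]) for _ in range(d)]
    samples = []
    for _ in range(n_sweeps):
        for i in range(d):
            # Conditional of x_i given its neighbors on the cycle:
            # P(x_i = +1 | neighbors) = 1 / (1 + exp(-2 * beta * field)).
            field = x[(i - 1) % d] + x[(i + 1) % d]
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
            x[i] = 1 if rng.random() < p_plus else -1
        samples.append(tuple(x))
    return samples

samples = gibbs_ising_cycle()
# By the +/- symmetry of the model, E[X_0] = 0, so the sample mean
# of the first spin should be near zero.
mean_spin = sum(s[0] for s in samples) / len(samples)
```

Note that the conditional needs only the ratio of π at two configurations, so the unknown normalizing constant of π cancels, which is exactly the situation described on the first slide.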
Markov Chain
A chain of events whose transitions occur at discrete times. States S_1, S_2, .... X_t is the state that the chain is in at time t. Conditional probability: P(X_t = S_j | X_{t_1} = S_{i_1}, X_{t_2} = S_{i_2}, ..., X_{t_n} = S_{i_n}). The system is a Markov chain if the distribution of X_t is independent of all previous states except for its immediate predecessor X_{t-1}:
P(X_t = S_j | X_1 = S_{i_1}, X_2 = S_{i_2}, ..., X_{t-1} = S_{i_{t-1}}) = P(X_t = S_j | X_{t-1} = S_{i_{t-1}}).
Stationary distribution
If a Markov chain satisfies some mild conditions, it gradually forgets its initial state and eventually converges to a unique stationary distribution. Gibbs sampling satisfies the detailed balance equation π(x) P(x, y) = π(y) P(y, x), so Gibbs sampling has π as its stationary distribution. Hence, Gibbs sampling is a special case of MCMC (Markov Chain Monte Carlo).
Recall: Correlation Decay
An MRF on a graph G exhibits correlation decay (long-range independence) if X_u and X_v are nearly independent when the graph distance between u and v is large. Practically, Gibbs sampling works well (converges fast) when the MRF has correlation decay.
A similar algorithm: Hit and Run
The Hit and Run algorithm is used to sample from a convex set in an n-dimensional Euclidean space: from the current point, pick a uniformly random direction, intersect that line with the set, and move to a uniform random point on the resulting chord. It converges in polynomially many steps.
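A minimal sketch of one hit-and-run step, using the unit ball as the convex set (my own choice, since the chord endpoints then have a closed form):

```python
import math
import random

def hit_and_run_ball(n=3, steps=1000, seed=0):
    """Hit-and-run sampling of the unit ball in R^n: from the current
    point, pick a uniform random direction, intersect the line with the
    ball, and move to a uniform point on the resulting chord."""
    rng = random.Random(seed)
    x = [0.0] * n
    for _ in range(steps):
        # Uniform random direction via a normalized Gaussian vector.
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(v * v for v in d))
        d = [v / norm for v in d]
        # Solve |x + t*d| = 1, i.e. t^2 + 2*b*t + c = 0 (|d| = 1),
        # to get the chord endpoints t = -b +/- sqrt(b^2 - c).
        b = sum(xi * di for xi, di in zip(x, d))
        c = sum(xi * xi for xi in x) - 1.0
        disc = math.sqrt(b * b - c)
        t = rng.uniform(-b - disc, -b + disc)
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

point = hit_and_run_ball()
assert sum(v * v for v in point) <= 1.0  # the walk stays inside the ball
```

For a general convex body given only by a membership oracle, the chord endpoints would be found by binary search instead of the closed-form quadratic used here.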