Computational statistics 2009

Slide 1: Random walk
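A random walk like the one on this slide can be simulated in a few lines. The sketch below assumes the symmetric walk on the integers starting at 0; the step probability `p` and the seed are illustrative choices, not from the slides:

```python
import random

def random_walk(n_steps, p=0.5, seed=None):
    """Simulate a simple random walk on the integers.

    At each step the walker moves +1 with probability p and -1 otherwise.
    Returns the list of visited positions, starting at 0.
    """
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        step = 1 if rng.random() < p else -1
        path.append(path[-1] + step)
    return path

path = random_walk(1000, seed=42)
```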
Slide 2: Random walk with absorbing barrier
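Adding an absorbing barrier means the walk stops moving once it hits the barrier state. A minimal sketch, assuming a barrier at 0, a symmetric walk, and an illustrative starting point and step cap:

```python
import random

def random_walk_absorbing(start, barrier=0, p=0.5, max_steps=10_000, seed=None):
    """Simple random walk that is absorbed when it first hits `barrier`.

    Stops either at absorption or after max_steps moves, whichever comes first.
    """
    rng = random.Random(seed)
    path = [start]
    while path[-1] != barrier and len(path) <= max_steps:
        step = 1 if rng.random() < p else -1
        path.append(path[-1] + step)
    return path

path = random_walk_absorbing(start=5, seed=1)
```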
Slide 3: Discrete-time Markov chains

Let X_0, X_1, X_2, ... be a sequence of integer-valued random variables. Then {X_n}_n is said to be a Markov chain if, for all i, j, i_0, ..., i_{n-1}, n,

  P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)

Transition probabilities depend on the past history of the chain only through the current value.
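A finite-state chain with given transition probabilities can be simulated directly from its transition matrix. This is a sketch; the two-state matrix `P` and the chain length are illustrative values, not from the slides:

```python
import random

def simulate_chain(P, x0, n_steps, seed=None):
    """Simulate a discrete-time Markov chain.

    P is a list of rows, with P[i][j] the probability of moving from
    state i to state j. Returns the list of visited states.
    """
    rng = random.Random(seed)
    states = [x0]
    for _ in range(n_steps):
        u, cum, j = rng.random(), 0.0, 0
        for j, pij in enumerate(P[states[-1]]):
            cum += pij
            if u < cum:
                break
        states.append(j)
    return states

# Illustrative two-state chain: state 0 tends to stay put, state 1 tends to leave.
P = [[0.9, 0.1], [0.6, 0.4]]
chain = simulate_chain(P, x0=0, n_steps=5000, seed=0)
```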
Slide 4: Discrete-time Markov chains – examples
Slide 5: Discrete-time Markov chain – transition matrix
Slide 6: Discrete-time Markov chain – transition matrix (continued)
Slide 7: Ehrenfest's diffusion model

Suppose that M molecules are distributed among the compartments A and B. At each time point, change the distribution among A and B by selecting one molecule at random and placing it in a randomly selected compartment. Let X_n be the number of molecules in compartment A at time n. Do the limiting probabilities exist?
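The Ehrenfest dynamics described above are easy to simulate step by step; empirically, the chain spends most of its time near M/2. A sketch, with illustrative values M = 10 and 100,000 steps:

```python
import random

def ehrenfest_step(x, M, rng):
    """One Ehrenfest step: pick one of the M molecules uniformly at random,
    then place it in a uniformly chosen compartment.
    x is the current number of molecules in compartment A."""
    in_A = rng.random() < x / M   # the chosen molecule was in A
    to_A = rng.random() < 0.5     # it is placed in A
    if in_A and not to_A:
        return x - 1
    if (not in_A) and to_A:
        return x + 1
    return x                      # placed back in its own compartment

rng = random.Random(0)
M, x = 10, 0
counts = [0] * (M + 1)            # time spent in each state
for _ in range(100_000):
    x = ehrenfest_step(x, M, rng)
    counts[x] += 1
```

Even though the chain starts in the extreme state X_0 = 0, the occupation counts concentrate around M/2, which previews the limiting-distribution question the slide poses.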
Slide 8: Stationary distribution of a Markov chain

For an irreducible, aperiodic chain with states (x_1, ..., x_S) and transition probabilities p_ij = p(x_i, x_j), there is a unique probability distribution with mass probabilities π_j = π(x_j) satisfying

  π_j = Σ_i π_i p_ij,  j = 1, ..., S,  with Σ_j π_j = 1

This distribution is known as the stationary distribution of the Markov chain. If the initial distribution π^(0) of a chain is equal to its stationary distribution, the marginal distribution π^(n) of the state at time n is again given by the stationary distribution.
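For a finite chain, the fixed-point condition π = πP can be solved numerically by iterating π ← πP until it stops changing. A minimal sketch, with an illustrative two-state transition matrix whose stationary distribution works out to (6/7, 1/7):

```python
def stationary_distribution(P, tol=1e-12, max_iter=100_000):
    """Stationary distribution of an irreducible aperiodic finite chain,
    found by iterating pi <- pi P until the update is below tol."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi

P = [[0.9, 0.1], [0.6, 0.4]]   # illustrative chain
pi = stationary_distribution(P)
# pi satisfies pi = pi P; for this P, pi ≈ [6/7, 1/7]
```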
Slide 9: Limiting probabilities for irreducible aperiodic Markov chains

For an aperiodic, irreducible Markov chain with stationary distribution π, it can be shown that

  π^(n) → π as n → ∞

regardless of the initial distribution π^(0). The limiting probability that the process will be in state x_j at time n equals the long-run proportion of time that the process will be in state x_j. A way to generate values from a distribution f is to construct a Markov chain with f as its stationary distribution, and to run the chain from an arbitrary starting value until the distribution π^(n) converges to f.
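The claim that the limit does not depend on π^(0) can be checked numerically by propagating two very different initial distributions through the same chain via π^(n) = π^(0) P^n. A sketch with an illustrative two-state chain:

```python
def propagate(pi0, P, n):
    """Marginal distribution pi^(n) = pi^(0) P^n after n steps."""
    pi = list(pi0)
    for _ in range(n):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

P = [[0.9, 0.1], [0.6, 0.4]]       # illustrative chain
a = propagate([1.0, 0.0], P, 200)  # start surely in state 0
b = propagate([0.0, 1.0], P, 200)  # start surely in state 1
# Both marginals converge to the same limit, the stationary distribution.
```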
Slide 10: Ehrenfest's diffusion model – stationary distribution

Suppose that M molecules are distributed among the compartments A and B. At each time point, the distribution among A and B is changed by selecting one molecule at random and placing it in a randomly selected compartment. Let X_n be the number of molecules in compartment A at time n. The stationary distribution is then given by the balance equations

  π_j = π_{j-1} p_{j-1,j} + π_j p_{j,j} + π_{j+1} p_{j+1,j},  j = 1, ..., M−1,

and the equations relating π_1 to π_0 and π_M to π_{M-1}. The solution is the Binomial(M, 1/2) distribution, π_j = C(M, j) (1/2)^M.
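The binomial solution can be verified directly against the balance equations. Under the dynamics described above, the transition probabilities from state j work out to j/(2M) down, 1/2 stay, and (M−j)/(2M) up (derived from "pick a molecule, then pick a compartment"; not stated explicitly on the slide). A sketch with the illustrative value M = 10:

```python
from math import comb

def ehrenfest_stationary(M):
    """Candidate stationary distribution: Binomial(M, 1/2)."""
    return [comb(M, j) / 2 ** M for j in range(M + 1)]

def ehrenfest_row(j, M):
    """Transition probabilities out of state j: (down, stay, up)."""
    return j / (2 * M), 0.5, (M - j) / (2 * M)

M = 10
pi = ehrenfest_stationary(M)

# Check the balance equations pi_j = sum_i pi_i p_ij for every state j.
for j in range(M + 1):
    total = pi[j] * 0.5                                  # stayed in j
    if j > 0:
        total += pi[j - 1] * ehrenfest_row(j - 1, M)[2]  # moved up into j
    if j < M:
        total += pi[j + 1] * ehrenfest_row(j + 1, M)[0]  # moved down into j
    assert abs(total - pi[j]) < 1e-12
```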
Slide 11: A simple proposal-rejection method for random number generation

Target distribution: f
Proposal chain: simple random walk. Given x_t, next propose y = x_t + 1 or x_t − 1, each with probability 0.5.
Compute the "goodness ratio" r = f(y) / f(x_t).
Acceptance/rejection: let U ~ Uniform(0, 1). Accept (x_{t+1} = y) if U < r; reject (x_{t+1} = x_t) otherwise.
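The recipe above can be sketched in code. The slides do not show their target distribution, so the geometric-style weights f(x) ∝ (1/2)^x on the nonnegative integers used here are a hypothetical stand-in; the sampler itself follows the proposal-rejection steps above:

```python
import random

def metropolis_walk(f, x0, n_samples, seed=None):
    """Proposal-rejection sampler with a symmetric +/-1 random-walk proposal.

    f is an (unnormalized) target mass function on the integers.
    Because the proposal is symmetric, the "goodness ratio" is f(y)/f(x_t).
    """
    rng = random.Random(seed)
    x = x0
    out = []
    for _ in range(n_samples):
        y = x + (1 if rng.random() < 0.5 else -1)   # propose a move
        r = f(y) / f(x)                             # goodness ratio
        if rng.random() < r:                        # accept if U < r
            x = y
        out.append(x)
    return out

def f(x):
    """Hypothetical target: weights proportional to 0.5**x for x >= 0."""
    return 0.5 ** x if x >= 0 else 0.0

samples = metropolis_walk(f, x0=0, n_samples=20_000, seed=3)
```

Proposals that step outside the support (y = −1) get r = 0 and are always rejected, so the chain never leaves the support of f.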
Slide 12: Random number generation – an example of a Markov chain proposal-rejection method

Target distribution:
Slide 13: Random number generation – an example of a Markov chain proposal-rejection method (continued)

Target distribution:
Slide 14: Markov Chain Monte Carlo (MCMC) methods

Algorithms for sampling from probability distributions, based on constructing a Markov chain that has the desired distribution as its equilibrium distribution. The most common application of these algorithms is numerically calculating multi-dimensional integrals, such as expectations ∫ h(x) f(x) dx with respect to a target density f.
Slide 15: Random walk MCMC methods

Metropolis-Hastings algorithm: generates a random walk using a proposal density and a method for accepting or rejecting proposed moves.
Gibbs sampling: requires that all the conditional distributions of the target distribution can be sampled exactly.
Slice sampling: alternates uniform sampling in the vertical direction with uniform sampling from the horizontal "slice" defined by the current vertical position.
Slide 16: Random number generation – the Metropolis-Hastings algorithm

Start with any initial value x_0 and a proposal chain T(x, y). Suppose x_t has been drawn at time t.
Draw y ~ T(x_t, ·) (i.e. propose a move for the next step).
Compute the Metropolis ratio (or "goodness ratio")

  r = [f(y) T(y, x_t)] / [f(x_t) T(x_t, y)]

Acceptance/rejection: accept the move (x_{t+1} = y) with probability min(1, r); otherwise stay (x_{t+1} = x_t).
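The algorithm above can be sketched for a continuous target. With a symmetric Gaussian random-walk proposal, T(x_t, y) = T(y, x_t), so the proposal terms cancel and the ratio reduces to f(y)/f(x_t); working with log f avoids underflow. The standard-normal target, step size, and starting point below are illustrative choices:

```python
import math
import random

def metropolis_hastings(log_f, x0, n_samples, step=1.0, seed=None):
    """Metropolis-Hastings with a symmetric Gaussian random-walk proposal.

    log_f is the log of the (unnormalized) target density. Because the
    proposal is symmetric, the Metropolis ratio is f(y)/f(x_t), and we
    accept when log U < log f(y) - log f(x_t).
    """
    rng = random.Random(seed)
    x, lx = x0, log_f(x0)
    out = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)     # propose a move
        ly = log_f(y)
        if math.log(rng.random()) < ly - lx:   # accept with prob min(1, r)
            x, lx = y, ly
        out.append(x)
    return out

# Illustrative target: standard normal density, up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x,
                              x0=5.0, n_samples=50_000, seed=7)
mean = sum(samples) / len(samples)
```

Note that the deliberately bad starting value x_0 = 5 is quickly forgotten, echoing the earlier point that the limit does not depend on the initial distribution.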
Slide 17: The Gibbs sampler for bivariate distributions – a simple example

Let's look at simulating observations of a bivariate normal vector (X, Y) with zero mean and unit variance for the marginals, and a correlation of ρ between the two components. The full conditionals are

  X | Y = y ~ N(ρy, 1 − ρ²)
  Y | X = x ~ N(ρx, 1 − ρ²)

Let's start from, say, x = 10, y = 5. Draw new x's from X | Y = y and new y's from Y | X = x. (The slide shows 100 simulated values.)
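The example above translates directly into code: alternate draws from the two full conditionals. The correlation ρ = 0.8, the seed, and the sample count below are illustrative (the slides do not state a ρ value); the starting point (10, 5) is the one from the slide:

```python
import math
import random

def gibbs_bivariate_normal(rho, x0, y0, n_samples, seed=None):
    """Gibbs sampler for a bivariate normal with standard normal marginals
    and correlation rho, using the full conditionals
    X | Y=y ~ N(rho*y, 1-rho^2) and Y | X=x ~ N(rho*x, 1-rho^2)."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x, y = x0, y0
    out = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)   # draw new x from X | Y = y
        y = rng.gauss(rho * x, sd)   # draw new y from Y | X = x
        out.append((x, y))
    return out

samples = gibbs_bivariate_normal(rho=0.8, x0=10.0, y0=5.0,
                                 n_samples=50_000, seed=11)
```

The empirical correlation of the draws recovers ρ, and the far-off starting point is quickly forgotten.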
Slide 18: Gibbs sampler

The Gibbs sampler is a way to generate empirical distributions of a random vector (X, Y) when the conditional probability distributions F(X | Y) and G(Y | X) are known. Start with a random set of possible X's, draw Y's from G(·), then use those Y's to draw X's, and so on indefinitely. Keep track of the X's and Y's seen; this gives enough samples to find the unconditional distribution of (X, Y).