11. Markov Chains (MCs)
Courtesy of J. Bard, L. Page, and J. Heyl
11.2.1 n-step transition probabilities (review)
Transition probability matrix
The n-step transition probability from state i to state j is
$p_{ij}(n) = P(X_{m+n} = j \mid X_m = i)$.
The n-step transition matrix (for all states) is then $P(n) = P^n$. For instance, the two-step transition matrix is $P(2) = P \cdot P = P^2$.
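As a quick illustration (a minimal sketch, not part of the original slides), the n-step matrix can be computed by repeated matrix multiplication; the two-state silence/speech matrix from Example 11.10 later in this deck is assumed here.

```python
import numpy as np

# Two-state transition matrix (silence = state 0, speech = state 1),
# taken from Example 11.10 later in this deck.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Two-step transition matrix: P(2) = P * P
P2 = P @ P
print(P2)        # [[0.83 0.17]
                 #  [0.34 0.66]]

# General n-step transition matrix: P(n) = P^n
P4 = np.linalg.matrix_power(P, 4)
print(P4)
```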
Chapman-Kolmogorov equations
The probability of going from state i at t = 0, passing through state k at t = m, and ending at state j at t = m + n is $p_{ik}(m)\, p_{kj}(n)$; summing over the intermediate state k gives
$p_{ij}(m+n) = \sum_k p_{ik}(m)\, p_{kj}(n)$.
In matrix notation, $P(m+n) = P(m)\, P(n)$, i.e., $P^{m+n} = P^m P^n$.
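A small numerical check of the Chapman-Kolmogorov relation (again a sketch, using the same assumed two-state matrix):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

m, n = 3, 5
lhs = np.linalg.matrix_power(P, m + n)                              # P^(m+n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)   # P^m P^n
print(np.allclose(lhs, rhs))                                        # True
```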
11.2.2 state probabilities
State probability (pmf of an RV!)
Let $p(n) = \{p_j(n)\}$, for all $j \in E$, be the row vector of state probabilities at time n (i.e., the state probability vector). Then
$p_j(n) = \sum_i p_i(n-1)\, p_{ij}$,
and from the initial state,
$p_j(n) = \sum_i p_i(0)\, p_{ij}(n)$.
In matrix notation, $p(n) = p(n-1)\, P = p(0)\, P^n$.
How an MC changes (Ex. 11.10, 11.11)
A two-state system: silence (state 0) and speech (state 1), with transition matrix
$P = \begin{pmatrix} 0.9 & 0.1 \\ 0.2 & 0.8 \end{pmatrix}$
(stay silent 0.9, silence to speech 0.1; speech to silence 0.2, stay speaking 0.8).

Suppose p(0) = (0, 1). Then
p(1) = p(0)P = (0,1)P = (0.2, 0.8)
p(2) = (0.2, 0.8)P = (0,1)P^2 = (0.34, 0.66)
p(4) = (0,1)P^4 = (0.507, 0.493)
p(8) = (0,1)P^8 = (0.629, 0.371)
p(16) = (0,1)P^16 = (0.665, 0.335)
p(32) = (0,1)P^32 = (0.667, 0.333)
p(64) = (0,1)P^64 = (0.667, 0.333)

Suppose p(0) = (1, 0). Then
p(1) = p(0)P = (0.9, 0.1)
p(2) = (1,0)P^2 = (0.83, 0.17)
p(4) = (1,0)P^4 = (0.747, 0.253)
p(8) = (1,0)P^8 = (0.686, 0.314)
p(16) = (1,0)P^16 = (0.668, 0.332)
p(32) = (1,0)P^32 = (0.667, 0.333)
p(64) = (1,0)P^64 = (0.667, 0.333)
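The two sequences above can be reproduced with a few lines of numpy (a sketch, not from the original deck):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Iterate p(n) = p(0) P^n for the two initial conditions of Ex. 11.10/11.11.
for p0 in (np.array([0.0, 1.0]), np.array([1.0, 0.0])):
    print("p(0) =", p0)
    for n in (1, 2, 4, 8, 16, 32, 64):
        pn = p0 @ np.linalg.matrix_power(P, n)
        print(f"  p({n}) = {np.round(pn, 3)}")
```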
Independence of initial condition
The lesson to take away
No matter what assumptions you make about the initial probability distribution, after a large number of steps the state probability distribution is approximately (2/3, 1/3). See pp. 666-667.
11.2.3 steady state probabilities
State probabilities (pmf) converge
As $n \to \infty$, the transition probability matrix $P^n$ approaches a matrix whose rows are all equal to the same pmf. In matrix notation,
$\lim_{n \to \infty} P^n = \mathbf{1}\,\pi$,
where $\mathbf{1}$ is a column vector of all 1's and $\pi = (\pi_0, \pi_1, \ldots)$. The convergence of $P^n$ implies the convergence of the state pmfs, since $p(n) = p(0)\,P^n \to p(0)\,\mathbf{1}\,\pi = \pi$.
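The row convergence can be seen numerically (sketch, same assumed two-state matrix): a high power of P has nearly identical rows, each equal to the limiting pmf.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P64 = np.linalg.matrix_power(P, 64)
print(np.round(P64, 3))
# [[0.667 0.333]
#  [0.667 0.333]]   <- both rows equal the same pmf (pi_0, pi_1)
```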
Steady state probability
The system reaches "equilibrium" or "steady state": as $n \to \infty$, $p_j(n) \to \pi_j$ and $p_i(n-1) \to \pi_i$, so
$\pi_j = \sum_i \pi_i\, p_{ij}$.
In matrix notation, $\pi = \pi P$, where $\pi$ is the stationary state pmf of the Markov chain. To solve this, combine $\pi = \pi P$ with the normalization condition $\sum_j \pi_j = 1$.
Speech activity system
The steady state probabilities satisfy $\pi = \pi P$:
$(\pi_1, \pi_2) = (\pi_1, \pi_2) \begin{pmatrix} 0.9 & 0.1 \\ 0.2 & 0.8 \end{pmatrix}$,
i.e.,
$\pi_1 = 0.9\,\pi_1 + 0.2\,\pi_2$
$\pi_2 = 0.1\,\pi_1 + 0.8\,\pi_2$
together with $\pi_1 + \pi_2 = 1$. Solving gives
$\pi_1 = 2/3 = 0.667$ and $\pi_2 = 1/3 = 0.333$.
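The same steady state can be obtained numerically by solving the balance equations with the normalization constraint (a sketch; one balance equation is replaced by the constraint, since the balance equations alone are linearly dependent):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# pi (P - I) = 0  together with  sum(pi) = 1.
# Work with the column form: A pi^T = b, where A = (P - I)^T,
# and overwrite the last row with the normalization equation.
A = (P - np.eye(2)).T
A[-1, :] = 1.0
b = np.array([0.0, 1.0])

pi = np.linalg.solve(A, b)
print(np.round(pi, 3))    # [0.667 0.333]
```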
Question 11-1: Alice, Bob, and Carol are playing Frisbee. Alice always throws to Carol. Bob always throws to Alice. Carol throws to Bob 2/3 of the time and to Alice 1/3 of the time. In the long run, what percentage of the time does each player have the Frisbee?
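To attempt this question, one can set up the transition matrix over the three players and reuse either the power method from Example 11.10 or the balance-equation solver above (a sketch; the state ordering Alice, Bob, Carol is chosen here for illustration, not given in the slides):

```python
import numpy as np

# States: 0 = Alice, 1 = Bob, 2 = Carol (ordering assumed for illustration).
# Row i gives the probabilities of who receives the Frisbee next.
P = np.array([
    [0.0, 0.0, 1.0],    # Alice always throws to Carol
    [1.0, 0.0, 0.0],    # Bob always throws to Alice
    [1/3, 2/3, 0.0],    # Carol: Alice 1/3 of the time, Bob 2/3
])

# Long-run fractions: any initial pmf times a high power of P.
p0 = np.array([1.0, 0.0, 0.0])
print(np.round(p0 @ np.linalg.matrix_power(P, 100), 3))
```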