11. Markov Chains (MCs)
Courtesy of J. Bard, L. Page, and J. Heyl
11.2.1 n-step transition probabilities (review)
Transition prob. matrix
The n-step transition probability from state i to state j is p_ij(n) = P[X_{m+n} = j | X_m = i]. The n-step transition matrix (over all states) is then P(n) = {p_ij(n)}. For instance, the two-step transition matrix is P(2) = P·P = P^2.
Chapman-Kolmogorov equations
The probability of going from state i at t = 0, passing through some state k at t = m, and ending at state j at t = m + n satisfies p_ij(m+n) = Σ_k p_ik(m) p_kj(n). In matrix notation, P(m+n) = P(m)·P(n), so that P(n) = P^n.
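As a quick numerical check (a minimal sketch, assuming NumPy and using the two-state silence/speech matrix from Examples 11.10–11.11 later in this deck), the Chapman-Kolmogorov equations say that multiplying the m-step and n-step matrices gives the (m+n)-step matrix:

```python
import numpy as np

# One-step transition matrix (silence/speech example used later in the slides)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P2 = P @ P                        # two-step matrix P(2) = P * P
P3 = np.linalg.matrix_power(P, 3)
P5 = np.linalg.matrix_power(P, 5)

# Chapman-Kolmogorov: P(2+3) = P(2) P(3)
print(np.allclose(P5, P2 @ P3))   # True
```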
State probabilities
State probability (pmf of an RV!)
Let p(n) = {p_j(n)}, for all j ∈ E, be the row vector of state probabilities at time n (i.e., the state probability vector). Thus p(n) is given by p_j(n) = Σ_i p_i(n-1) p_ij, and from the initial state, p_j(n) = Σ_i p_i(0) p_ij(n). In matrix notation, p(n) = p(n-1)·P = p(0)·P^n.
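A minimal sketch of the recursion p_j(n) = Σ_i p_i(n-1) p_ij written element by element (the two-state matrix below is just an assumed example, the same one used in the next slide):

```python
import numpy as np

P = np.array([[0.9, 0.1],      # assumed two-state transition matrix
              [0.2, 0.8]])

def next_state_pmf(p_prev, P):
    """One step of the recursion: p_j(n) = sum_i p_i(n-1) * p_ij."""
    n_states = len(p_prev)
    return np.array([sum(p_prev[i] * P[i, j] for i in range(n_states))
                     for j in range(n_states)])

p0 = np.array([0.0, 1.0])      # start in state 1
p1 = next_state_pmf(p0, P)     # (0.2, 0.8)
p2 = next_state_pmf(p1, P)     # (0.34, 0.66)
print(p1, p2)
```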
How an MC changes (Ex. 11.10, 11.11)
A two-state system: Silence (state 0) and Speech (state 1), with transition probabilities
P(0→0) = 0.9, P(0→1) = 0.1, P(1→0) = 0.2, P(1→1) = 0.8.

Starting from p(0) = (0, 1):              Starting from p(0) = (1, 0):
p(1)  = (0,1)P    = (0.2, 0.8)            p(1)  = (1,0)P    = (0.9, 0.1)
p(2)  = (0,1)P^2  = (0.34, 0.66)          p(2)  = (1,0)P^2  = (0.83, 0.17)
p(4)  = (0,1)P^4  = (0.507, 0.493)        p(4)  = (1,0)P^4  = (0.747, 0.253)
p(8)  = (0,1)P^8  = (0.629, 0.371)        p(8)  = (1,0)P^8  = (0.686, 0.314)
p(16) = (0,1)P^16 = (0.665, 0.335)        p(16) = (1,0)P^16 = (0.668, 0.332)
p(32) = (0,1)P^32 = (0.667, 0.333)        p(32) = (1,0)P^32 = (0.667, 0.333)
p(64) = (0,1)P^64 = (0.667, 0.333)        p(64) = (1,0)P^64 = (0.667, 0.333)
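The table above can be reproduced with a few lines of code (a sketch, assuming NumPy and using only the matrix given in the example):

```python
import numpy as np

P = np.array([[0.9, 0.1],    # silence (state 0)
              [0.2, 0.8]])   # speech  (state 1)

for p0 in (np.array([0.0, 1.0]), np.array([1.0, 0.0])):
    for n in (1, 2, 4, 8, 16, 32, 64):
        pn = p0 @ np.linalg.matrix_power(P, n)   # p(n) = p(0) P^n
        print(f"p(0)={p0}, n={n:2d}: p(n)={np.round(pn, 3)}")
```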
Independence of initial condition
The lesson to take away: no matter what assumptions you make about the initial probability distribution, after a large number of steps the state probability distribution is approximately (2/3, 1/3). See pp. 666-667.
11.2.3 steady state probabilities
State probabilities (pmf) converge
As n → ∞, the transition probability matrix P^n approaches a matrix whose rows are all equal to the same pmf. In matrix notation, P^n → 1·π, where 1 is a column vector of all 1's and π = (π_0, π_1, …). The convergence of P^n implies the convergence of the state pmf's.
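To see this convergence directly (a sketch, again using the two-state silence/speech matrix), both rows of P^n approach the same pmf π ≈ (2/3, 1/3) as n grows:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

for n in (4, 16, 64):
    Pn = np.linalg.matrix_power(P, n)
    print(f"P^{n} =\n{np.round(Pn, 3)}")
# For large n, every row of P^n is approximately (0.667, 0.333) = pi
```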
Steady state probability
The system reaches "equilibrium" or "steady state": as n → ∞, p_j(n) → π_j and p_i(n-1) → π_i, so π_j = Σ_i π_i p_ij. In matrix notation, π = πP, where π is the stationary state pmf of the Markov chain. To solve this, combine π = πP with the normalization condition Σ_j π_j = 1.
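One standard way to solve π = πP together with Σ_j π_j = 1 is to rewrite it as the linear system π(P - I) = 0 and replace one of the (redundant) equations with the normalization constraint. A minimal sketch, assuming NumPy; the function name steady_state is just an illustration:

```python
import numpy as np

def steady_state(P):
    """Solve pi = pi P with sum(pi) = 1 as a linear system."""
    n = P.shape[0]
    # Transpose so we solve (P^T - I) pi^T = 0, then overwrite the
    # last equation with the normalization constraint sum(pi) = 1.
    A = P.T - np.eye(n)
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(steady_state(P))   # approximately [0.667, 0.333]
```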
Speech activity system
From the steady state equations π = πP with π = (π_1, π_2):
π_1 = 0.9 π_1 + 0.2 π_2
π_2 = 0.1 π_1 + 0.8 π_2
π_1 + π_2 = 1
Solving gives π_1 = 2/3 ≈ 0.667 and π_2 = 1/3 ≈ 0.333.
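The same answer can be cross-checked numerically as the left eigenvector of P associated with eigenvalue 1 (a sketch, assuming NumPy; this eigenvector method is an alternative to the linear-system approach above):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Left eigenvectors of P are right eigenvectors of P^T
eigvals, eigvecs = np.linalg.eig(P.T)
v = eigvecs[:, np.argmin(np.abs(eigvals - 1.0))].real
pi = v / v.sum()              # normalize so the entries sum to 1
print(pi)                     # approximately [0.667, 0.333]
```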
Question 11-1: Alice, Bob, and Carol are playing Frisbee. Alice always throws to Carol. Bob always throws to Alice. Carol throws to Bob 2/3 of the time and to Alice 1/3 of the time. In the long run, what percentage of the time does each player have the Frisbee?
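One way to check an answer to Question 11-1 (a sketch; the state ordering Alice, Bob, Carol is an assumption of this example) is to write the throws as a transition matrix and solve for the stationary pmf exactly as in the previous slides:

```python
import numpy as np

# States: 0 = Alice, 1 = Bob, 2 = Carol (assumed ordering)
# Alice always throws to Carol; Bob always throws to Alice;
# Carol throws to Bob 2/3 of the time and to Alice 1/3 of the time.
P = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0],
              [1/3, 2/3, 0.0]])

# Solve pi = pi P with sum(pi) = 1
A = P.T - np.eye(3)
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)    # long-run fractions of time each player holds the Frisbee
```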