1
Random Processes / Markov Processes
Pool example: The home I moved into came with an above-ground pool that was green. I spent big $s and got the pool clear again. Then the pump started leaking; I turned the pump off and eventually the pool turned green again. After fixing the pump, I finally got the pool to turn blue again. I have made the following observations: If I observe the pool each morning, it basically has three states: blue, blue/green, and green. If the pool is blue, the probability of it staying blue is about 80%; otherwise it turns blue/green. If the pool is blue/green, there is equal probability of remaining blue/green, turning blue, or turning green. If the pool is green, there is a 60% probability of remaining green; otherwise the pool turns blue/green.
2
Random Processes / Markov Processes
If the pool is blue, the probability of it staying blue is about 80%; otherwise it turns blue/green. If the pool is blue/green, there is equal probability of remaining blue/green, turning blue, or turning green. If the pool is green, there is a 60% probability of remaining green; otherwise the pool turns blue/green. [State diagram with states G, B/G, and B, arcs labeled with the transition probabilities above.]
3
Random Processes / Markov Processes
Probability Transition Matrix (P) – the probability of transitioning from some current state to some next state in one step. Ordering the states G, B/G, B, the pool observations give:

        state    G     B/G    B
          G   [ 0.6    0.4    0   ]
P =     B/G   [ 1/3    1/3    1/3 ]
          B   [ 0      0.2    0.8 ]

P is referred to as the probability transition matrix.
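As a minimal sketch (added here, not from the original slides), the matrix above can be written down in NumPy and used to push a state distribution forward one morning at a time; the state order [G, B/G, B] follows the slide.

```python
# Pool transition matrix, state order assumed to be [G, B/G, B].
import numpy as np

P = np.array([
    [0.6, 0.4, 0.0],   # G:   stays green 60%, else turns blue/green
    [1/3, 1/3, 1/3],   # B/G: equally likely to turn green, stay, or turn blue
    [0.0, 0.2, 0.8],   # B:   stays blue 80%, else turns blue/green
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

d = np.array([0.0, 0.0, 1.0])  # start with a blue pool
for day in range(1, 4):
    d = d @ P                  # distribution after one more morning
    print(f"day {day}: G={d[0]:.3f}, B/G={d[1]:.3f}, B={d[2]:.3f}")
```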
4
Random Processes / Markov Processes
What is a Markov Process? A stochastic (probabilistic) process which has the Markovian property. A process has the Markovian property if:

P{X_{t+1} = j | X_0 = k_0, X_1 = k_1, …, X_{t-1} = k_{t-1}, X_t = i} = P{X_{t+1} = j | X_t = i}

for t = 0, 1, … and every sequence i, j, k_0, k_1, …, k_{t-1}. In other words, any future state depends only on the current state, not on the path taken to reach it.
5
Markov Processes cont. This conditional probability,

P{X_{t+1} = j | X_t = i},

is called the one-step transition probability. And if

P{X_{t+1} = j | X_t = i} = P{X_1 = j | X_0 = i}

for all t = 1, 2, …, then the one-step transition probability is said to be stationary and is therefore referred to as the stationary transition probability.
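To make stationarity concrete, here is a hedged sketch (not in the original deck): simulate the pool chain for many steps and check that the empirical one-step transition frequencies approach P, regardless of when the transitions occur.

```python
# Empirical check that one-step transition frequencies match P.
# Pool matrix from the earlier sketch; state 0=G, 1=B/G, 2=B.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.6, 0.4, 0.0], [1/3, 1/3, 1/3], [0.0, 0.2, 0.8]])

state, counts = 2, np.zeros((3, 3))
for _ in range(100_000):
    nxt = rng.choice(3, p=P[state])  # next state depends only on the current one
    counts[state, nxt] += 1
    state = nxt

print(counts / counts.sum(axis=1, keepdims=True))  # row frequencies ≈ P
```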
6
Markov Processes cont. Let p_ij = P{X_{t+1} = j | X_t = i}. Then, for a chain with states 0, 1, 2, 3:

      state    0      1      2      3
        0  [ p_00   p_01   p_02   p_03 ]
P =     1  [ p_10   p_11   p_12   p_13 ]
        2  [ p_20   p_21   p_22   p_23 ]
        3  [ p_30   p_31   p_32   p_33 ]

P is referred to as the probability transition matrix.
7
Markov Processes cont. Suppose the probability you win a game depends on whether you won the last time you played. Say, if you won last time, there is a 70% chance of winning the next time; if you lost last time, there is a 60% chance you lose the next time. Can the process of winning and losing be modeled as a Markov process? Let state 0 be a win and state 1 be a loss. Then:

      state    0      1
P =     0  [ 0.70   0.30 ]
        1  [ 0.40   0.60 ]
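A small sketch (added here, not from the slides) of this chain in NumPy; the entries are read directly off the rules above, and multiplying the matrix by itself answers multi-step questions such as the chance of winning two games from now given a win today.

```python
# Win/lose chain: state 0 = win, state 1 = lose.
import numpy as np

P = np.array([
    [0.70, 0.30],   # won last game: 70% win again, 30% lose
    [0.40, 0.60],   # lost last game: 40% win, 60% lose again
])

# Probability of winning two games from now, given a win today:
print((P @ P)[0, 0])   # 0.7*0.7 + 0.3*0.4 = 0.61
```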
8
Markov Processes cont. See handout on n-step transition matrix.
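The handout itself is not reproduced here, but the central fact it covers is that the n-step transition matrix is the n-th matrix power, P(n) = P^n. A brief sketch using the win/lose chain:

```python
# n-step transition probabilities via matrix powers.
import numpy as np

P = np.array([[0.70, 0.30], [0.40, 0.60]])  # win/lose chain from the last slide

for n in (1, 2, 5, 20):
    Pn = np.linalg.matrix_power(P, n)       # P(n) = P^n
    print(f"n={n}:\n{Pn}")
# As n grows, the rows of P^n converge to the same vector: the steady state.
```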
9
Markov Processes cont. Let,

         state    0     1     2    …    N
           0   [ π_0   π_1   π_2   …   π_N ]
P^n  =     1   [ π_0   π_1   π_2   …   π_N ]     (as n → ∞)
           2   [ π_0   π_1   π_2   …   π_N ]
           ⋮
           N   [ π_0   π_1   π_2   …   π_N ]

Then π = [π_0, π_1, π_2, …, π_N] are the steady-state probabilities: for large n, every row of P^n converges to the same vector π.
10
Markov Processes cont. Observing that P(n) = P(n-1)P, as n → ∞ this becomes π = πP:

[π_0, π_1, …, π_N] = [π_0, π_1, …, π_N] ·
    [ p_00   p_01   …   p_0N ]
    [ p_10   p_11   …   p_1N ]
    [   ⋮      ⋮           ⋮  ]
    [ p_N0   p_N1   …   p_NN ]

Multiplying out gives N+1 equations in the N+1 unknowns π_0, …, π_N; however, the rank of this system is only N, so one equation is redundant. Note, though, that π_0 + π_1 + … + π_N = 1. Replacing the redundant equation with this normalization restores N+1 independent equations in N+1 unknowns.
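A minimal sketch of solving these equations numerically (assuming NumPy): transpose to (Pᵀ − I)π = 0, swap one redundant row for the normalization row of ones, and solve the resulting square system.

```python
# Steady-state probabilities: solve pi = pi P together with sum(pi) = 1.
import numpy as np

def steady_state(P):
    n = P.shape[0]
    A = P.T - np.eye(n)
    A[-1, :] = 1.0                 # replace one redundant equation with sum(pi) = 1
    b = np.zeros(n); b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.70, 0.30], [0.40, 0.60]])
print(steady_state(P))             # [0.5714..., 0.4285...] = [4/7, 3/7]
```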
11
Markov Processes cont. Show example of obtaining π = πP from a transition matrix, using the win/lose chain:

      state    0      1
P =     0  [ 0.70   0.30 ]
        1  [ 0.40   0.60 ]

π = πP gives π_0 = 0.70 π_0 + 0.40 π_1 and π_1 = 0.30 π_0 + 0.60 π_1 (one of which is redundant). Together with π_0 + π_1 = 1, this yields π_0 = 4/7 ≈ 0.571 and π_1 = 3/7 ≈ 0.429.
12
Markov Processes cont. Break for Exercise
13
Markov Processes cont. State diagrams:

      state    0      1
P =     0  [ .70    .30 ]
        1  [ .40    .60 ]

[State diagram: arcs 0→0 (.70), 0→1 (.30), 1→0 (.40), 1→1 (.60).]
14
Markov Processes cont. State diagrams:

      state    0     1     2     3
        0   [ .5    .5    0     0  ]
P =     1   [  …     …    …     …  ]
        2   [  …     …    …     …  ]
        3   [  …     …    …     …  ]
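Plain text makes the diagrams themselves hard to reproduce, but the arcs of a state diagram are exactly the nonzero entries of P. A small sketch (assuming NumPy; not from the original slides) that lists them for the pool chain:

```python
# List the arcs of a state diagram: one arc per nonzero transition probability.
import numpy as np

P = np.array([[0.6, 0.4, 0.0], [1/3, 1/3, 1/3], [0.0, 0.2, 0.8]])
labels = ["G", "B/G", "B"]   # pool chain state labels

for i, row in enumerate(P):
    for j, p in enumerate(row):
        if p > 0:
            print(f"{labels[i]} --{p:.2f}--> {labels[j]}")
```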