Discrete-Time Markov Chain (continuation)
Converting Some Non-Markov Chains to Markov Chains
Let us explain this using an example. Suppose we have a stochastic process Y_t with Y_t ∈ {-100, 100}.
Assume that the conditional probabilities for Y_{t+1} = -100 are

P(Y_{t+1} = -100 | Y_t = -100, Y_{t-1} = -100) = 0.1
P(Y_{t+1} = -100 | Y_t = 100, Y_{t-1} = -100) = 0.7
P(Y_{t+1} = -100 | Y_t = -100, Y_{t-1} = 100) = 0.8
P(Y_{t+1} = -100 | Y_t = 100, Y_{t-1} = 100) = 0.05
Likewise, the conditional probabilities for Y_{t+1} = 100 are

P(Y_{t+1} = 100 | Y_t = -100, Y_{t-1} = -100) = 0.9
P(Y_{t+1} = 100 | Y_t = 100, Y_{t-1} = -100) = 0.3
P(Y_{t+1} = 100 | Y_t = -100, Y_{t-1} = 100) = 0.2
P(Y_{t+1} = 100 | Y_t = 100, Y_{t-1} = 100) = 0.95
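As a quick sanity check, each conditioning pair (Y_t, Y_{t-1}) must give a valid distribution over Y_{t+1}, i.e., the probabilities of -100 and 100 must sum to 1. A minimal Python sketch (the dictionary names `p_minus` and `p_plus` are ours; the values just restate the probabilities above):

```python
# p_minus[(y_t, y_prev)] = P(Y_{t+1} = -100 | Y_t = y_t, Y_{t-1} = y_prev)
p_minus = {(-100, -100): 0.1, (100, -100): 0.7, (-100, 100): 0.8, (100, 100): 0.05}
# p_plus[(y_t, y_prev)]  = P(Y_{t+1} =  100 | Y_t = y_t, Y_{t-1} = y_prev)
p_plus = {(-100, -100): 0.9, (100, -100): 0.3, (-100, 100): 0.2, (100, 100): 0.95}

# For every conditioning pair, the two probabilities must sum to 1.
for pair in p_minus:
    assert abs(p_minus[pair] + p_plus[pair] - 1.0) < 1e-12
print("all conditional distributions sum to 1")
```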
Question: Is Y_t a Markov chain? No: given Y_t, the distribution of Y_{t+1} still depends on Y_{t-1}. For example, P(Y_{t+1} = -100 | Y_t = -100) is 0.1 when Y_{t-1} = -100 but 0.8 when Y_{t-1} = 100, so the Markov property fails.
Now, suppose we have a stochastic process X_t with X_t ∈ {1, 2, 3, 4}, defined as

X_t = 1 if Y_t = 100 and Y_{t-1} = -100
X_t = 2 if Y_t = -100 and Y_{t-1} = 100
X_t = 3 if Y_t = -100 and Y_{t-1} = -100
X_t = 4 if Y_t = 100 and Y_{t-1} = 100
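The case definition above is just a lookup from the pair (Y_t, Y_{t-1}) to a single state. A minimal sketch (the names `STATE_OF_PAIR` and `encode` are ours):

```python
# Map the pair (Y_t, Y_{t-1}) to the single state X_t in {1, 2, 3, 4},
# following the case definition above.
STATE_OF_PAIR = {(100, -100): 1, (-100, 100): 2, (-100, -100): 3, (100, 100): 4}

def encode(y_t, y_prev):
    """Return X_t for the given values of Y_t and Y_{t-1}."""
    return STATE_OF_PAIR[(y_t, y_prev)]

print(encode(100, -100))   # 1
print(encode(-100, -100))  # 3
```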
The transition probabilities into state 1 (i.e., Y_{t+1} = 100 and Y_t = -100) are

P(X_{t+1} = 1 | X_t = 1) = 0    (state 1 has Y_t = 100, but entering state 1 requires Y_t = -100)
P(X_{t+1} = 1 | X_t = 2) = 0.2  (= P(Y_{t+1} = 100 | Y_t = -100, Y_{t-1} = 100))
P(X_{t+1} = 1 | X_t = 3) = 0.9  (= P(Y_{t+1} = 100 | Y_t = -100, Y_{t-1} = -100))
P(X_{t+1} = 1 | X_t = 4) = 0    (state 4 has Y_t = 100, but entering state 1 requires Y_t = -100)
Similarly, the transition probabilities into state 2 (i.e., Y_{t+1} = -100 and Y_t = 100) are

P(X_{t+1} = 2 | X_t = 1) = 0.7   (= P(Y_{t+1} = -100 | Y_t = 100, Y_{t-1} = -100))
P(X_{t+1} = 2 | X_t = 2) = 0     (state 2 has Y_t = -100, but entering state 2 requires Y_t = 100)
P(X_{t+1} = 2 | X_t = 3) = 0     (state 3 has Y_t = -100, but entering state 2 requires Y_t = 100)
P(X_{t+1} = 2 | X_t = 4) = 0.05  (= P(Y_{t+1} = -100 | Y_t = 100, Y_{t-1} = 100))
Exercise: fill in the transition probabilities into state 3 (i.e., Y_{t+1} = -100 and Y_t = -100):

P(X_{t+1} = 3 | X_t = 1) = ____
P(X_{t+1} = 3 | X_t = 2) = ____
P(X_{t+1} = 3 | X_t = 3) = ____
P(X_{t+1} = 3 | X_t = 4) = ____
Exercise: fill in the transition probabilities into state 4 (i.e., Y_{t+1} = 100 and Y_t = 100):

P(X_{t+1} = 4 | X_t = 1) = ____
P(X_{t+1} = 4 | X_t = 2) = ____
P(X_{t+1} = 4 | X_t = 3) = ____
P(X_{t+1} = 4 | X_t = 4) = ____
Question: Is X_t a Markov chain? Yes: X_t encodes the pair (Y_t, Y_{t-1}), so the distribution of X_{t+1} depends only on the current state X_t; no further history is needed.
The transition matrix of the four-state chain X_t, with entry (i, j) equal to P(X_{t+1} = j | X_t = i), is

    P = | 0    0.7   ___  ___ |
        | 0.2  0     ___  ___ |
        | 0.9  0     ___  ___ |
        | 0    0.05  ___  ___ |

(the remaining entries are left as an exercise). Also, draw the state transition diagram.
Reflection: a Markov chain can still incorporate dependence on the past, but we need to enlarge the state space so that the relevant history becomes part of the current state.