Discrete-Time Markov Chains (continuation)
Converting Some Non-Markov Chains to Markov Chains

Let us explain this with an example. Suppose we have a stochastic process Y_t with Y_t ∈ {-100, 100}.
Assume that the transition probabilities depend on the two most recent values:

P(Y_{t+1} = -100 | Y_t = -100, Y_{t-1} = -100) = 0.1
P(Y_{t+1} = -100 | Y_t =  100, Y_{t-1} = -100) = 0.7
P(Y_{t+1} = -100 | Y_t = -100, Y_{t-1} =  100) = 0.8
P(Y_{t+1} = -100 | Y_t =  100, Y_{t-1} =  100) = 0.05

P(Y_{t+1} =  100 | Y_t = -100, Y_{t-1} = -100) = 0.9
P(Y_{t+1} =  100 | Y_t =  100, Y_{t-1} = -100) = 0.3
P(Y_{t+1} =  100 | Y_t = -100, Y_{t-1} =  100) = 0.2
P(Y_{t+1} =  100 | Y_t =  100, Y_{t-1} =  100) = 0.95
Question: Is Y_t a Markov chain? No: the distribution of Y_{t+1} depends on Y_{t-1} as well as on Y_t, so the Markov property fails.
Now define a new stochastic process X_t with X_t ∈ {1, 2, 3, 4} by

X_t = 1 if Y_t =  100 and Y_{t-1} = -100
X_t = 2 if Y_t = -100 and Y_{t-1} =  100
X_t = 3 if Y_t = -100 and Y_{t-1} = -100
X_t = 4 if Y_t =  100 and Y_{t-1} =  100
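The four-state encoding above can be made explicit with a small lookup table. This is a minimal Python sketch; the names `encode` and `decode` are ours, not from the lecture:

```python
# Map each pair (Y_t, Y_{t-1}) to the state label used above, and back.
encode = {(100, -100): 1, (-100, 100): 2, (-100, -100): 3, (100, 100): 4}
decode = {label: pair for pair, label in encode.items()}

# Sanity checks: the four pairs of values from {-100, 100} get the
# labels 1..4, and decoding inverts encoding.
assert sorted(encode.values()) == [1, 2, 3, 4]
assert all(encode[decode[k]] == k for k in (1, 2, 3, 4))
```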
The transition probabilities of X follow directly from those of Y. A transition to X_{t+1} = 1 requires (Y_{t+1}, Y_t) = (100, -100):

P(X_{t+1} = 1 | X_t = 1) = 0     (state 1 has Y_t = 100, so Y_t = -100 is impossible)
P(X_{t+1} = 1 | X_t = 2) = 0.2
P(X_{t+1} = 1 | X_t = 3) = 0.9
P(X_{t+1} = 1 | X_t = 4) = 0     (state 4 has Y_t = 100)
A transition to X_{t+1} = 2 requires (Y_{t+1}, Y_t) = (-100, 100):

P(X_{t+1} = 2 | X_t = 1) = 0.7
P(X_{t+1} = 2 | X_t = 2) = 0     (state 2 has Y_t = -100)
P(X_{t+1} = 2 | X_t = 3) = 0     (state 3 has Y_t = -100)
P(X_{t+1} = 2 | X_t = 4) = 0.05
A transition to X_{t+1} = 3 requires (Y_{t+1}, Y_t) = (-100, -100):

P(X_{t+1} = 3 | X_t = 1) = 0     (state 1 has Y_t = 100)
P(X_{t+1} = 3 | X_t = 2) = 0.8
P(X_{t+1} = 3 | X_t = 3) = 0.1
P(X_{t+1} = 3 | X_t = 4) = 0     (state 4 has Y_t = 100)
A transition to X_{t+1} = 4 requires (Y_{t+1}, Y_t) = (100, 100):

P(X_{t+1} = 4 | X_t = 1) = 0.3
P(X_{t+1} = 4 | X_t = 2) = 0     (state 2 has Y_t = -100)
P(X_{t+1} = 4 | X_t = 3) = 0     (state 3 has Y_t = -100)
P(X_{t+1} = 4 | X_t = 4) = 0.95
Question: Is X_t a Markov chain? Yes: X_t already encodes both Y_t and Y_{t-1}, so the distribution of X_{t+1} depends only on X_t.
The transition matrix of the four-state chain X_t, with entry (i, j) = P(X_{t+1} = j | X_t = i), is

    P = | 0    0.7   0    0.3  |
        | 0.2  0     0.8  0    |
        | 0.9  0     0.1  0    |
        | 0    0.05  0    0.95 |

Each row sums to 1. Exercise: draw the corresponding state transition diagram.
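Rather than filling the matrix entry by entry, it can be built mechanically from the second-order probabilities of Y. The sketch below (variable names `p_up` and `state` are ours) uses the fact that a transition from state i to state j has nonzero probability only when the two pairs agree on the shared Y_t coordinate:

```python
import numpy as np

# P(Y_{t+1} = 100 | Y_t, Y_{t-1}), keyed by (Y_t, Y_{t-1});
# the Y_{t+1} = -100 case is the complement.
p_up = {(-100, -100): 0.9, (100, -100): 0.3, (-100, 100): 0.2, (100, 100): 0.95}

# Augmented state label <-> (Y_t, Y_{t-1}), numbered as in the lecture.
state = {1: (100, -100), 2: (-100, 100), 3: (-100, -100), 4: (100, 100)}

P = np.zeros((4, 4))
for i, (y_t, y_tm1) in state.items():
    for j, (y_next, y_cur) in state.items():
        if y_cur != y_t:
            continue  # the next state must carry Y_t forward; otherwise prob. 0
        p = p_up[(y_t, y_tm1)]
        P[i - 1, j - 1] = p if y_next == 100 else 1 - p

# Each row is a probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)
```

Running this reproduces the matrix above, which doubles as a check that the hand-derived entries are consistent.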
Reflection: A Markov chain can still incorporate the past, but we need to enlarge the state space to carry that history along.
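This reflection can be seen in a short simulation (an illustrative sketch; the seed and the initial pair are arbitrary choices of ours): carrying the pair (Y_t, Y_{t-1}) along turns the second-order rule for Y into a first-order update.

```python
import random

random.seed(0)  # reproducible sketch

# P(Y_{t+1} = 100 | Y_t, Y_{t-1}), keyed by (Y_t, Y_{t-1}).
p_up = {(-100, -100): 0.9, (100, -100): 0.3, (-100, 100): 0.2, (100, 100): 0.95}

# Simulate Y through its second-order rule while tracking the pair
# (Y_t, Y_{t-1}).  That pair is exactly the enlarged state X_t: each
# step uses only the current pair, so the pair process is first-order Markov.
y_prev, y = -100, -100  # arbitrary initial pair (state 3)
path = [y_prev, y]
for _ in range(10):
    p = p_up[(y, y_prev)]
    y_prev, y = y, (100 if random.random() < p else -100)
    path.append(y)

print(path)  # one sample trajectory of Y
```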