Discrete-time Markov chain (continuation)


1 Discrete-time Markov chain (continuation)

2 Definition
[Diagram: three-state chain with states A, B, C and transition probabilities 0.4, 0.6, 0.2, 0.8]
State C is accessible from state A.
State B is NOT accessible from state C.

3 Definition
[Diagram: same three-state chain]
State C is accessible from state A: p_AC^(n) > 0 for some n ≥ 0.
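The accessibility condition above can be checked numerically by looking at powers of the transition matrix. A minimal sketch in Python follows; the matrix P is an assumption pieced together from the slide's edge labels (A stays with 0.4 or moves to B with 0.6; B stays with 0.2 or moves to C with 0.8; C absorbing), chosen so that it matches the slide's claims:

```python
import numpy as np

# Hypothetical transition matrix consistent with the slide's labels:
# A: stays with 0.4, moves to B with 0.6
# B: stays with 0.2, moves to C with 0.8
# C: absorbing (p_CC = 1)
P = np.array([[0.4, 0.6, 0.0],
              [0.0, 0.2, 0.8],
              [0.0, 0.0, 1.0]])

def accessible(P, i, j):
    """True if state j is accessible from state i,
    i.e. p_ij^(n) > 0 for some n >= 0."""
    n_states = P.shape[0]
    Pn = np.eye(n_states)        # n = 0: p_ii^(0) = 1
    for _ in range(n_states):    # n_states steps suffice in a finite chain
        if Pn[i, j] > 0:
            return True
        Pn = Pn @ P
    return Pn[i, j] > 0

A, B, C = 0, 1, 2
print(accessible(P, A, C))  # True:  C is accessible from A
print(accessible(P, C, B))  # False: B is not accessible from C
```

Note that n = 0 is included, so every state is accessible from itself, exactly as slide 4 points out.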

4 Definition
[Diagram: same three-state chain]
State B is accessible from state B:
p_BB^(0) = P(X_0 = B | X_0 = B) = 1.

5 Definition
[Diagram: same three-state chain]
States A and B communicate:
state B is accessible from state A AND state A is accessible from state B.

6 Definition
[Diagram: same three-state chain]
Any state communicates with itself.
"Communicate" is transitive.

7 Definition
[Diagram: three-state chain with transition probabilities 0.4, 0.6, 0.1, 0.2, 0.9, 0.8]
If ALL states communicate, the Markov chain is irreducible.
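Irreducibility can be tested directly from this definition: compute which states are reachable from which, and check that the reachability relation holds in both directions for every pair. A minimal sketch, where both example matrices are assumptions (not read off the slide's diagram):

```python
import numpy as np

def is_irreducible(P):
    """A finite Markov chain is irreducible iff every state is
    accessible from every other, i.e. all states communicate."""
    n = P.shape[0]
    reach = (P > 0) | np.eye(n, dtype=bool)   # one step, or stay put (n = 0)
    for _ in range(n):                        # iterate to a transitive closure
        reach = reach | ((reach.astype(int) @ reach.astype(int)) > 0)
    return reach.all()

# Hypothetical chains for illustration:
P_irr = np.array([[0.4, 0.6, 0.0],   # every state can reach every other
                  [0.1, 0.0, 0.9],
                  [0.2, 0.0, 0.8]])
P_red = np.array([[0.4, 0.6, 0.0],   # C is absorbing, so C cannot reach A or B
                  [0.0, 0.2, 0.8],
                  [0.0, 0.0, 1.0]])
print(is_irreducible(P_irr))  # True
print(is_irreducible(P_red))  # False
```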

8 Classifications
[Diagram: three-state chain with transition probabilities 0.4, 0.6, 0.5, 0.5, 1]
State B is transient iff there exists another state that is accessible from B but not vice versa.
In this chain, once the process exits B, it will never return to B.

9 Classifications
[Diagram: same chain]
State B is transient iff there exists another state that is accessible from B but not vice versa.
A transient state will be visited only a finite number of times.

10 Classifications
[Diagram: three-state chain with transition probabilities 0.4, 0.6, 0.1, 0.9, 0.5, 0.5]
A state which is not transient is called recurrent.
Once the process exits a recurrent state, it will definitely return to that state again.

11 Classifications
[Diagram: three-state chain with transition probabilities 0.4, 0.6, 0.5, 0.5]
State C is absorbing iff p_CC = 1.
Upon entering C, the process will never leave C.

12 Classifications
[Diagram: same chain, now with the edge p_CC = 1 shown]
State C is absorbing iff p_CC = 1.
Upon entering C, the process will never leave C.

13 Remark In a finite-state Markov chain, at least one state is recurrent (i.e., not all states can be transient).
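The transient/recurrent classification on slides 8–10 can be automated with the finite-chain characterization: state i is transient iff some state j is accessible from i while i is not accessible from j. A minimal sketch, using a hypothetical chain (not the slide's exact diagram) in which A and B drain into an absorbing C:

```python
import numpy as np

def reachability(P):
    """Boolean matrix R with R[i, j] True iff j is accessible from i."""
    n = P.shape[0]
    R = (P > 0) | np.eye(n, dtype=bool)
    for _ in range(n):
        R = R | ((R.astype(int) @ R.astype(int)) > 0)
    return R

def classify(P):
    """For a finite chain: state i is 'transient' iff some j is accessible
    from i but i is not accessible from j; otherwise 'recurrent'."""
    R = reachability(P)
    n = P.shape[0]
    return ['transient' if any(R[i, j] and not R[j, i] for j in range(n))
            else 'recurrent' for i in range(n)]

# Hypothetical chain: A -> B -> C, with C absorbing
P = np.array([[0.4, 0.6, 0.0],
              [0.0, 0.2, 0.8],
              [0.0, 0.0, 1.0]])
print(classify(P))  # ['transient', 'transient', 'recurrent']
```

Consistent with the remark above, at least one state comes out recurrent (here, the absorbing state C).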

14 Monte Carlo Simulation Exercise
[Diagram: three-state chain with states 1, 2, 3 and transition probabilities 0.6, 0.4, 0.3, 0.9, 0.1, 0.7]
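One way to attempt the exercise is to sample a long trajectory and count visits. The wiring below is a guess assembled from the slide's edge labels (0.6, 0.4, 0.3, 0.9, 0.1, 0.7), arranged so each row sums to 1; the actual diagram may connect the states differently:

```python
import random

# Guessed transition structure (an assumption, not read off the slide):
# state 1: stay with 0.6, -> 2 with 0.4
# state 2: stay with 0.7, -> 3 with 0.3
# state 3: stay with 0.1, -> 1 with 0.9
P = {1: [(1, 0.6), (2, 0.4)],
     2: [(2, 0.7), (3, 0.3)],
     3: [(3, 0.1), (1, 0.9)]}

def simulate(start, n_steps, rng=random):
    """Run one Monte Carlo trajectory of the chain."""
    state, path = start, [start]
    for _ in range(n_steps):
        u, cum = rng.random(), 0.0   # inverse-CDF sampling of the next state
        for nxt, p in P[state]:
            cum += p
            if u < cum:
                state = nxt
                break
        path.append(state)
    return path

random.seed(0)
path = simulate(start=1, n_steps=10_000)
# For this irreducible chain, the long-run fraction of time spent in each
# state approximates its stationary probability:
for s in (1, 2, 3):
    print(s, path.count(s) / len(path))
```

Under this assumed wiring, solving πP = π by hand gives π = (0.36, 0.48, 0.16), so the printed fractions should land near those values.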

