1 Introduction to Stochastic Models GSLM 54100

2 Outline
- discrete-time Markov chain
- transient behavior

3 Transient Behavior
- {X_n} describes the weather condition:
  0 if it rained both today and yesterday
  1 if it rained today but not yesterday
  2 if it rained yesterday but not today
  3 if it did not rain either yesterday or today
- it rained yesterday but not today, so X_0 = 2
- P(it will rain tomorrow | X_0 = 2) = P(X_1 = 1 | X_0 = 2) = 0.4

4 Transient Behavior
- {X_n} describes the weather condition:
  0 if it rained both today and yesterday
  1 if it rained today but not yesterday
  2 if it rained yesterday but not today
  3 if it did not rain either yesterday or today
- it rained yesterday but not today, so X_0 = 2
- P(it will rain 10 days from now | X_0 = 2) = P(X_10 = 0 | X_0 = 2) + P(X_10 = 1 | X_0 = 2), read from the 10-step transition matrix P^10 (see the sketch below)
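The 10-step probabilities are obtained by raising the one-step matrix to the 10th power. The slides give only p_{2,1} = 0.4; the remaining entries below are taken from Ross's four-state weather chain (rain tomorrow w.p. 0.7, 0.5, 0.4, 0.2 from states 0, 1, 2, 3) and should be read as an assumption for illustration. A minimal numpy sketch:

    import numpy as np

    # States: 0 = rained today and yesterday, 1 = rained today only,
    #         2 = rained yesterday only,      3 = rained on neither day
    P = np.array([[0.7, 0.0, 0.3, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.4, 0.0, 0.6],
                  [0.0, 0.2, 0.0, 0.8]])
    P10 = np.linalg.matrix_power(P, 10)
    # P(rain 10 days from now | X_0 = 2) = P(X_10 = 0 | X_0 = 2) + P(X_10 = 1 | X_0 = 2)
    print(P10[2, 0] + P10[2, 1])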

5 Example 4.10
- an urn contains 2 balls
- at each step one ball is picked at random and replaced by a ball of the same color w.p. 0.8 and of the opposite color w.p. 0.2
- initially both balls are red
- P(there are 2 red balls in the urn after 4 selections) = ?
- P(the fifth selected ball is red) = ?

6 Example 4.10
- X_n = the number of red balls in the urn after the nth selection and subsequent replacement
- X_0 = 2 (both balls are red initially), X_n ∈ {0, 1, 2}
- one-step transition probabilities (from the diagram): p_00 = p_11 = p_22 = 0.8, p_01 = p_21 = 0.2, p_10 = p_12 = 0.1

7 Example 4.10
- P(there are 2 red balls in the urn after 4 selections) = P(X_4 = 2 | X_0 = 2) = [P^4]_{2,2} = 0.4872
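As a numerical check (assuming numpy is available), raising the matrix from slide 6 to the 4th power reproduces the probabilities used here and on the next slide:

    import numpy as np

    # State = number of red balls in the urn (one-step matrix from slide 6)
    P = np.array([[0.8, 0.2, 0.0],
                  [0.1, 0.8, 0.1],
                  [0.0, 0.2, 0.8]])
    P4 = np.linalg.matrix_power(P, 4)
    print(P4[2, 2])   # P(X_4 = 2 | X_0 = 2) = 0.4872
    print(P4[2, 1])   # P(X_4 = 1 | X_0 = 2) = 0.4352, used on the next slide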

8 Example 4.10
- P(5th selection is red)
  = P(5th selection is red | X_4 = 0) P(X_4 = 0 | X_0 = 2)
    + P(5th selection is red | X_4 = 1) P(X_4 = 1 | X_0 = 2)
    + P(5th selection is red | X_4 = 2) P(X_4 = 2 | X_0 = 2)
  = (0) P(X_4 = 0 | X_0 = 2) + (0.5)(0.4352) + (1)(0.4872)
  = 0.7048

9 Example 4.11
- balls are distributed one at a time, each independently and at random, among 8 urns
- P(exactly 3 urns are nonempty after 9 balls have been distributed) = ?
- X_n = the number of nonempty urns after n balls have been distributed; X_n ∈ {0, 1, ..., 8}
- p_{i,i} = i/8 = 1 − p_{i,i+1}, i = 0, 1, ..., 8
- required answer = P(X_9 = 3 | X_0 = 0) = [P^9]_{0,3} (computed in the sketch below)
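The answer can be evaluated numerically; a minimal sketch (assuming numpy) that builds the 9-state matrix and reads off [P^9]_{0,3}:

    import numpy as np

    P = np.zeros((9, 9))                  # states 0..8 = number of nonempty urns
    for i in range(9):
        P[i, i] = i / 8                   # the new ball lands in an already nonempty urn
        if i < 8:
            P[i, i + 1] = 1 - i / 8       # the new ball opens a previously empty urn
    P9 = np.linalg.matrix_power(P, 9)
    print(P9[0, 3])                       # P(X_9 = 3 | X_0 = 0)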

10 Unconditional Probability
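For reference, the standard identity behind this slide, written with an initial distribution α_i = P(X_0 = i) (the α notation is mine, not from the slides):

    P(X_n = j) \;=\; \sum_i P(X_0 = i)\, P(X_n = j \mid X_0 = i) \;=\; \sum_i \alpha_i \, \bigl[P^n\bigr]_{ij}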

11 Example 4.12 of Ross
- {X_n} tracks the amount of money a pensioner has on hand
- he receives 2 (thousand dollars) at the beginning of each month
- his expenses during a month equal i w.p. 1/4, i = 1, 2, 3, 4
- if he does not have enough money on hand, he spends only what he has
- any amount in excess of 3 at the end of a month is disposed of
- at a particular month (taken as the time reference), he has 5 after receiving his payment
- find P(the pensioner's capital is ever 1 or less at any time within the following four months)

12 Example 4.12 of Ross
- X_n = the amount of money that the pensioner has at the end of month n
- X_{n+1} = min{[X_n + 2 − D_n]^+, 3}, where D_n ~ discrete uniform on {1, 2, 3, 4}
- starting with X_0 = 3, and X_n ∈ {0, 1, 2, 3}
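The one-step matrix of {X_n} follows directly from this recursion by enumerating the four equally likely values of D_n; a minimal sketch (assuming numpy):

    import numpy as np

    P = np.zeros((4, 4))                      # states 0, 1, 2, 3
    for x in range(4):
        for d in (1, 2, 3, 4):                # D_n is uniform on {1, 2, 3, 4}
            nxt = min(max(x + 2 - d, 0), 3)   # X_{n+1} = min{[X_n + 2 - D_n]^+, 3}
            P[x, nxt] += 0.25
    print(P)   # e.g. from state 3 the chain moves to 1, 2, 3 w.p. 0.25, 0.25, 0.5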

13 Example 4.12 of Ross
- to find: P(the capital of the pensioner is ever 1 or less at any time within the following 4 months)
- [transition diagram of {X_n} on states 0, 1, 2, 3; the matrix is printed by the sketch above]
- equivalently: starting with X_0 = 3, will the chain visit state 0 or state 1 at some n ≤ 4?

14 Example 4.12 of Ross
- starting with X_0 = 3, whether the chain has ever visited state 0 or 1 on or before time n depends only on the transitions within {2, 3}
- merge states 0 and 1 into a single absorbing super state A
- [diagram: the original chain on {0, 1, 2, 3} next to the merged chain on {A, 2, 3}, where A is absorbing]
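Numerically, the merged chain gives the answer to the original question; a minimal sketch (assuming numpy; state order A, 2, 3):

    import numpy as np

    # Merged chain: A collects states {0, 1} and is absorbing
    Q = np.array([[1.00, 0.00, 0.00],    # A -> A
                  [0.50, 0.25, 0.25],    # 2 -> A, 2, 3
                  [0.25, 0.25, 0.50]])   # 3 -> A, 2, 3
    Q4 = np.linalg.matrix_power(Q, 4)
    # P(capital is ever 1 or less within the 4 months | X_0 = 3) = P(W_4 = A | W_0 = 3)
    print(Q4[2, 0])   # ≈ 0.785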

15 Probability of Ever Visiting a Set of States by Period n
- a Markov chain with transition matrix [p_ij]
- A: a set of special states
- want: P(ever visiting a state in A on or before period n | X_0 = i)
- define a super state A indicating that A has already been visited, and the first visiting time N = min{n: X_n ∈ A}
- a new Markov chain: W_n = X_n for n < N, and W_n = A for n ≥ N

16 Probability of Ever Visiting a Set of States by Period n
- transition probability matrix Q = [q_ij] of {W_n}:
  q_ij = p_ij for i, j ∉ A
  q_iA = Σ_{j ∈ A} p_ij for i ∉ A
  q_AA = 1
- the required probability is P(W_n = A | W_0 = i) = [Q^n]_{i,A}
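A minimal sketch of the general construction (assuming numpy; the function and variable names are mine, not from the slides):

    import numpy as np

    def prob_ever_visit(P, A, i, n):
        """P(the chain visits the set A on or before period n | X_0 = i), for i not in A."""
        m = P.shape[0]
        keep = [s for s in range(m) if s not in set(A)]    # states outside A
        k = len(keep)
        Q = np.zeros((k + 1, k + 1))                       # last index = super state A
        Q[k, k] = 1.0                                      # q_AA = 1 (absorbing)
        for r, s_from in enumerate(keep):
            for c, s_to in enumerate(keep):
                Q[r, c] = P[s_from, s_to]                  # q_ij = p_ij for i, j not in A
            Q[r, k] = P[s_from, list(A)].sum()             # q_iA = sum over j in A of p_ij
        Qn = np.linalg.matrix_power(Q, n)
        return Qn[keep.index(i), k]

For the pensioner chain built above, prob_ever_visit(P, {0, 1}, 3, 4) reproduces the probability computed from the merged chain.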

17 Probability of Visiting a Particular State at n and Skipping a Particular Set of States for k ∈ {1, …, n − 1}
- for j ∉ A: P(X_n = j, X_k ∉ A for k = 1, …, n − 1 | X_0 = i) = P(W_n = j | X_0 = i) = P(W_n = j | W_0 = i)
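In the pensioner example, the same 4-step matrix of {W_n} also gives these "visit j while avoiding A" probabilities; a short sketch (assuming numpy; state order A, 2, 3 as before):

    import numpy as np

    Q = np.array([[1.00, 0.00, 0.00],
                  [0.50, 0.25, 0.25],
                  [0.25, 0.25, 0.50]])
    Q4 = np.linalg.matrix_power(Q, 4)
    # P(X_4 = 2 and the capital never drops to 1 or less in months 1..4 | X_0 = 3)
    print(Q4[2, 1])   # = P(W_4 = 2 | W_0 = 3) ≈ 0.082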

