1 Introduction to Stochastic Models GSLM 54100

2 Outline
- discrete-time Markov chain
  - motivation
  - example
  - transient behavior

3 Motivation
- What happens if the X_n's are dependent?
- many dependent systems, e.g.,
  - inventory across periods
  - state of a machine
  - customers unserved in a distribution system
(figure: the state of a machine over time, moving among excellent, good, fair, and bad)

4 Motivation
- any nice limiting results for dependent X_n's?
  - no such results for general dependent X_n's
  - nice results when the X_n's form a discrete-time Markov chain

5 Discrete-Time, Discrete-State Stochastic Process
- a stochastic process: a sequence of indexed random variables, e.g., {X_n}, {X(t)}
- a discrete-time stochastic process: {X_n}
- a discrete-state stochastic process, e.g.,
  - state ∈ {excellent, good, fair, bad}
  - set of states: {e, g, f, b}, {1, 2, 3, 4}, or {0, 1, 2, 3}
  - state to describe weather ∈ {windy, rainy, cloudy, sunny}

6 Markov Property
- a discrete-time, discrete-state stochastic process possesses the Markov property if
  P{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_1 = i_1, X_0 = i_0} = p_ij,
  for all i_0, i_1, ..., i_{n-1}, i, j and all n ≥ 0
- time frame: present n, future n+1, past {i_0, i_1, ..., i_{n-1}}
- meaning of the statement: given the present, the past and the future are conditionally independent
  - without conditioning on the present, the past and the future are in general dependent (see the sketch below)
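A minimal simulation sketch of this property (assuming NumPy is available; the 2-state chain and its numbers are illustrative, not from the slides): the next state is drawn using only the current state's row of the transition matrix, never the earlier history.

```python
import numpy as np

def simulate_dtmc(P, start, n_steps, rng=None):
    """Simulate a discrete-time Markov chain with transition matrix P.

    The next state is sampled from row P[current], i.e. it depends only on
    the current state, not on the earlier history (Markov property).
    """
    rng = rng or np.random.default_rng()
    P = np.asarray(P)
    path = [start]
    for _ in range(n_steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

# illustrative 2-state chain (hypothetical numbers)
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(simulate_dtmc(P, start=0, n_steps=10))
```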

7 One-Step Transition Probability Matrix
- P = [p_ij], where p_ij = P(X_{n+1} = j | X_n = i)
- p_ij ≥ 0 for all i, j ≥ 0, and each row sums to one: Σ_j p_ij = 1 for every i (checked in the sketch below)
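A quick sanity check of these two conditions, nonnegative entries and unit row sums (a sketch, not from the slides; the example matrices are made up):

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check p_ij >= 0 and that each row of P sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= -tol) and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_stochastic([[0.7, 0.3], [0.4, 0.6]]))   # True
print(is_stochastic([[0.7, 0.2], [0.4, 0.6]]))   # False: first row sums to 0.9
```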

8 Example 4-1 Forecasting the Weather
- state ∈ {rain, not rain}
- dynamics of the system
  - rains today → rains tomorrow w.p. α
  - does not rain today → rains tomorrow w.p. β
- the weather across the days forms the process {X_n} (matrix form sketched below)
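The same example in matrix form, as a sketch; the numeric values of α and β below are assumed for illustration only.

```python
import numpy as np

alpha, beta = 0.7, 0.4   # assumed illustrative values, not from the example

# state 0 = rain, state 1 = no rain
P = np.array([[alpha, 1 - alpha],
              [beta,  1 - beta]])
print(P)
print(np.linalg.matrix_power(P, 2)[0, 0])   # P(rain two days from now | rain today)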

9 Example 4-2 A Communication System
- digital signals taking values 0 and 1
- a signal remains unchanged with probability p on passing through a stage, independent of everything else
- state = value of the signal ∈ {0, 1}
- X_n: value of the signal before entering the nth stage (see the sketch below)
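A sketch of one natural question for this chain (the value of p and the number of stages are assumed for illustration): the chance that a 0 entering the system is still a 0 after n stages is the (0, 0) entry of P^n.

```python
import numpy as np

p = 0.9   # assumed probability the signal survives one stage unchanged

# state 0 and state 1 are the two signal values
P = np.array([[p, 1 - p],
              [1 - p, p]])

n = 5     # assumed number of stages
print(np.linalg.matrix_power(P, n)[0, 0])   # P(signal still 0 after n stages | started as 0)
```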

10 Example 4-3 The Mood of a Person
- mood ∈ {cheerful (C), so-so (S), glum (G)}
  - cheerful today → C, S, or G tomorrow w.p. 0.5, 0.4, 0.1
  - so-so today → C, S, or G tomorrow w.p. 0.3, 0.4, 0.3
  - glum today → C, S, or G tomorrow w.p. 0.2, 0.3, 0.5
- X_n: mood on the nth day, with mood ∈ {C, S, G}
- {X_n}: a 3-state Markov chain (state 0 = C, state 1 = S, state 2 = G; see the sketch below)
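The same example as a transition matrix, using the probabilities on the slide (a sketch): for instance, the mood distribution two days after a cheerful day is the first row of P².

```python
import numpy as np

# rows/columns ordered C, S, G, probabilities as on the slide
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

two_day = np.linalg.matrix_power(P, 2)
print(two_day[0])   # mood distribution two days after a cheerful day
```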

11 Example 4.4 Transforming a Process into a DTMC
- whether it rains tomorrow depends on the weather conditions of the last two days
  - rained for the past two days → will rain tomorrow w.p. 0.7
  - rained today but not yesterday → will rain tomorrow w.p. 0.5
  - rained yesterday but not today → will rain tomorrow w.p. 0.4
  - no rain in the past two days → will rain tomorrow w.p. 0.2

12 Example 4.4 Transforming a Process into a DTMC
- define the state by the weather over the last two days:
  - 0 if it rained both today and yesterday
  - 1 if it rained today but not yesterday
  - 2 if it rained yesterday but not today
  - 3 if it rained neither yesterday nor today
- with this enlarged state, {X_n} becomes a 4-state DTMC (transition matrix sketched below)
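A sketch of the resulting 4-state transition matrix, built from the probabilities on the previous slide. Note that each state can only move to two others, because today's weather becomes tomorrow's "yesterday" (e.g., state 0 can go only to state 0 or state 2).

```python
import numpy as np

# states: 0 = rain today & yesterday, 1 = rain today only,
#         2 = rain yesterday only,   3 = no rain either day
P = np.array([[0.7, 0.0, 0.3, 0.0],   # from state 0: rain tomorrow w.p. 0.7
              [0.5, 0.0, 0.5, 0.0],   # from state 1: rain tomorrow w.p. 0.5
              [0.0, 0.4, 0.0, 0.6],   # from state 2: rain tomorrow w.p. 0.4
              [0.0, 0.2, 0.0, 0.8]])  # from state 3: rain tomorrow w.p. 0.2
print(P.sum(axis=1))   # each row sums to 1
```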

13 Example 4.5 A Random Walk Model
- a discrete-time Markov chain with state space {..., -2, -1, 0, 1, 2, ...}
- random walk: for 0 < p < 1,
  p_{i,i+1} = p = 1 - p_{i,i-1}, i = 0, ±1, ±2, ... (simulation sketched below)
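A short simulation sketch of this chain (the value of p and the number of steps are assumed for illustration):

```python
import numpy as np

def random_walk(p=0.5, n_steps=20, rng=None):
    """Simple random walk on the integers: +1 w.p. p, -1 w.p. 1-p."""
    rng = rng or np.random.default_rng()
    steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
    return np.concatenate(([0], np.cumsum(steps)))   # path starting at state 0

print(random_walk(p=0.5, n_steps=10))
```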

14 Example 4.6 A Gambling Model
- on each play of a game, a gambler gains $1 w.p. p and loses $1 otherwise
- end of the game: the gambler either goes broke or accumulates $N
- transition probabilities:
  p_{i,i+1} = p = 1 - p_{i,i-1}, i = 1, 2, ..., N-1; p_00 = p_NN = 1
- example for N = 4
  - state: X_n, the gambler's fortune after the nth play, taking values in {0, 1, 2, 3, 4} (matrix sketched below)
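For N = 4, the 5×5 transition matrix with absorbing states 0 and N looks as follows (a sketch; the value of p is assumed for illustration):

```python
import numpy as np

def gambler_matrix(N=4, p=0.5):
    """Transition matrix of the gambler's ruin chain on states 0..N."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0          # broke and target $N are absorbing
    for i in range(1, N):
        P[i, i + 1] = p              # win $1
        P[i, i - 1] = 1 - p          # lose $1
    return P

print(gambler_matrix(N=4, p=0.5))
```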

15 Example 4.7
- the insurance premium paid in a year depends on the number of claims made last year

16 Example 4.7
- # of claims in a year ~ Poisson(λ)

17 Transient Behavior
- {X_n} for the weather condition of Example 4.4
  - 0 if it rained both today and yesterday
  - 1 if it rained today but not yesterday
  - 2 if it rained yesterday but not today
  - 3 if it rained neither yesterday nor today
- suppose it rained yesterday but not today; what is the weather forecast
  - for tomorrow?
  - for 10 days from now? (computation sketched below)
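A sketch of the two forecasts using the 4-state matrix of Example 4.4: starting from state 2 (rained yesterday but not today), tomorrow's distribution is row 2 of P, and the distribution 10 days ahead is row 2 of P^10; the chance of rain is the probability of landing in state 0 or 1 (the states in which it rained "today").

```python
import numpy as np

P = np.array([[0.7, 0.0, 0.3, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.4, 0.0, 0.6],
              [0.0, 0.2, 0.0, 0.8]])

start = 2                                     # rained yesterday but not today
tomorrow = P[start]
ten_days = np.linalg.matrix_power(P, 10)[start]

print("P(rain tomorrow)   =", tomorrow[0] + tomorrow[1])    # = 0.4, as on slide 11
print("P(rain in 10 days) =", ten_days[0] + ten_days[1])
```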

18 m-Step Transition Probability Matrix
- one-step transition probability matrix: P = [p_ij], where p_ij = P(X_1 = j | X_0 = i)
- m-step transition probability matrix: P^(m) = [p_ij^(m)], where p_ij^(m) = P(X_m = j | X_0 = i)
- claim: P^(m) = P^m

19 m-Step Transition Probability Matrix
- Markov chain {X_n} for the weather
- X_n ∈ {r, c, s}, where r = rainy, c = cloudy, s = sunny
(diagram: the two-step paths from state c at n = 0 to state r at n = 2, passing through r, c, or s at n = 1, with step probabilities p_cr, p_cc, p_cs and p_rr, p_cr, p_sr)

20 m-Step Transition Probability Matrix

21 m-Step Transition Probability Matrix
- claim: (P^2)_cr = (PP)_cr = p_cr p_rr + p_cc p_cr + p_cs p_sr
- in general, (P^2)_ij = Σ_k p_ik p_kj, so P^2 = P^(2)
- repeating the argument, P^m = P^(m)
(diagram: the three two-step paths from c to r, through intermediate state r, c, or s; a numeric check is sketched below)
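A quick numeric check of the claim (the 3×3 weather matrix below is assumed for illustration, since the slide gives no numbers): the (c, r) entry of P² equals the sum, over the intermediate state, of the two one-step probabilities along each path.

```python
import numpy as np

# assumed illustrative matrix over states (r, c, s); rows sum to 1
P = np.array([[0.6, 0.3, 0.1],    # from r
              [0.4, 0.4, 0.2],    # from c
              [0.2, 0.3, 0.5]])   # from s
r, c, s = 0, 1, 2

lhs = (P @ P)[c, r]
rhs = P[c, r] * P[r, r] + P[c, c] * P[c, r] + P[c, s] * P[s, r]
print(lhs, rhs, np.isclose(lhs, rhs))   # the two agree
```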

