
1 11. Markov Chains
Courtesy of J. Akinpelu, Anis Koubâa, Y. Wexler, & D. Geiger

2 Random Processes
A stochastic process {X(t), t ∈ I} is a collection of random variables. The index t is often interpreted as time, and X(t) is called the state of the process at time t; it may be discrete-valued or continuous-valued. The set I is called the index set of the process. If I is countable, the stochastic process is said to be a discrete-time process. If I is an interval of the real line, the stochastic process is said to be a continuous-time process. The state space E is the set of all possible values that the random variables X(t) can assume.

3 Discrete Time Random Process
If I is countable, the process is often denoted {Xn, n = 0, 1, 2, 3, …}. Events occur at specific points in time.
[Figure: a timeline with ticks at times 1, 2, 3, 4.]

4 Discrete Time Random Process
State space = {SUNNY, RAINY}. X(dayi): status of the weather observed each day.
[Figure: seven observed days, Day 1 (THU) through Day 7 (WED), each marked sunny or rainy.]

5 Markov processes A stochastic process is called a Markov process if
P(Xn+1 = j | Xn = i, Xn−1 = in−1, …, X0 = i0) = P(Xn+1 = j | Xn = i)
for all states i0, …, in−1, i, j and all n ≥ 0. If the Xn are integer-valued, Xn is called a Markov chain.

6 What is “Markov Property”?
Markov property: the probability that it will be SUNNY (FUTURE) on Day 6, given that it is RAINY on Day 5 (NOW), is independent of the PAST events (Days 1 through 4).
[Figure: the seven-day timeline, Day 1 (THU) through Day 7 (WED), with Days 1–5 observed and Day 6 marked "?"; the question is the probability of R or S on Day 6 given all previous states.]

7 Markov Chains
We restrict ourselves to Markov chains such that the conditional probabilities
pij = P(Xn+1 = j | Xn = i)
are independent of n, and for which
Σj∈E pij = 1 for every state i
(which is equivalent to saying that the state space E is finite or countable). Such a Markov chain is called homogeneous.

8 Markov Chains
Since probabilities are non-negative and the process must make a transition into some state at each time in I, we have
pij ≥ 0 and Σj∈E pij = 1 for every state i.
We can arrange the probabilities pij into a square matrix P = [pij], called the transition matrix.
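To make the two conditions concrete, here is a minimal Python sketch (an addition to these slides, not part of them) that checks both conditions for the weather matrix introduced on the following slides:

    import numpy as np

    # Transition matrix from the weather example: row i holds the
    # probabilities of moving from state i to every state.
    P = np.array([[0.4, 0.6],
                  [0.2, 0.8]])

    assert (P >= 0).all()                   # probabilities are non-negative
    assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to 1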

9 Markov Chain: A Simple Example
Weather:
- raining today: 40% rain tomorrow, 60% no rain tomorrow
- not raining today: 20% rain tomorrow, 80% no rain tomorrow
State transition diagram:
[Figure: two states, rain and no rain; rain stays rain with probability 0.4 and moves to no rain with probability 0.6; no rain stays no rain with probability 0.8 and moves to rain with probability 0.2.]

10 Rain (state 0), No rain (state 1)
Weather:
- raining today: 40% rain tomorrow, 60% no rain tomorrow
- not raining today: 20% rain tomorrow, 80% no rain tomorrow
The transition (probability) matrix P:
    P = | 0.4  0.6 |
        | 0.2  0.8 |
For a given current state, the corresponding row gives the transition probabilities to all states, so each row sums to 1.
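As an illustration (not from the original slides), this chain can be simulated in Python by repeatedly drawing the next state from the row of P belonging to the current state:

    import numpy as np

    # States: 0 = rain, 1 = no rain. P[i, j] = P(next = j | current = i).
    P = np.array([[0.4, 0.6],
                  [0.2, 0.8]])

    rng = np.random.default_rng(0)
    state = 0                                # start on a rainy day
    days = []
    for _ in range(10):
        state = rng.choice(2, p=P[state])    # draw next state from row `state`
        days.append("rain" if state == 0 else "no rain")
    print(days)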

11 Examples in textbook
Example 11.6, Example 11.7, and Figures 11.2 and 11.3.

12 Transition probability
Note that each entry of P is a one-step transition probability, i.e., the probability of moving from the current state to the next state in a single step. What about multiple steps? Let's start with two steps.

13 2-Step Transition Prob. of a 2-state system: states 0 and 1
Let pij(2) be the probability of going from i to j in 2 steps. Suppose i = 0, j = 0; then
P(X2 = 0 | X0 = 0) = P(X1 = 1 | X0 = 0) × P(X2 = 0 | X1 = 1) + P(X1 = 0 | X0 = 0) × P(X2 = 0 | X1 = 0)
p00(2) = p01p10 + p00p00
Similarly,
p01(2) = p01p11 + p00p01
p10(2) = p10p00 + p11p10
p11(2) = p10p01 + p11p11
In matrix form, P(2) = P(1)P(1) = P².
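A quick numerical check (an addition, not part of the slides): compute P @ P and compare it with the hand-derived entry p00(2) for the weather matrix:

    import numpy as np

    P = np.array([[0.4, 0.6],
                  [0.2, 0.8]])

    P2 = P @ P
    p00_2 = P[0, 1] * P[1, 0] + P[0, 0] * P[0, 0]   # p01*p10 + p00*p00
    assert np.isclose(P2[0, 0], p00_2)
    print(P2)   # [[0.28 0.72]
                #  [0.24 0.76]]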

14 In general, the 2-step transition is expressed as
pij(2) = Σk∈E pik pkj
Now note that this sum is exactly the (i, j) entry of the matrix product P·P, so P(2) = P².

15 Two-step transition prob.
State space E = {0, 1, 2}. The same rule applies to a three-state chain: pij(2) = Σk∈E pik pkj. Hence, P(2) = P², now a 3×3 matrix.
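A minimal sketch with a hypothetical 3×3 transition matrix (the numbers below are illustrative assumptions, not the slide's original example), showing that P(2) = P @ P works identically for three states:

    import numpy as np

    # Hypothetical transition matrix over E = {0, 1, 2}; each row sums to 1.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])

    P2 = P @ P
    print(P2)
    print(P2.sum(axis=1))   # each row of P(2) still sums to 1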

16 Chapman-Kolmogorov Equations
In general, let pij(n) = P(Xm+n = j | Xm = i) for all m, n ≥ 0 and all i, j ∈ E. This leads to the Chapman-Kolmogorov equations:
pij(m+n) = Σk∈E pik(m) pkj(n)
or, in matrix form, P(m+n) = P(m)P(n), so that P(n) = Pⁿ.
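A short Python sketch (illustrative, not from the slides) verifying the Chapman-Kolmogorov equations numerically for the weather matrix, using matrix powers:

    import numpy as np

    P = np.array([[0.4, 0.6],
                  [0.2, 0.8]])

    # Check P(m+n) = P(m) P(n), equivalently P(n) = P**n, for m = 3, n = 4.
    m, n = 3, 4
    Pm = np.linalg.matrix_power(P, m)
    Pn = np.linalg.matrix_power(P, n)
    Pmn = np.linalg.matrix_power(P, m + n)
    assert np.allclose(Pm @ Pn, Pmn)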

