1
Continuous-Time Markov Chains
Nur Aini Masruroh
2
Introduction
A continuous-time Markov chain is a stochastic process with the Markovian property: the conditional distribution of the future state at time t+s, given the present state at time s and all past states, depends only on the present state and is independent of the past. If P{X(t+s) = j | X(s) = i} is independent of s, then the continuous-time Markov chain is said to have stationary (homogeneous) transition probabilities. All Markov chains we consider are assumed to have stationary transition probabilities.
3
Properties
Suppose that a continuous-time Markov chain enters state i and the process does not leave state i (no transition occurs) during the next s time units. What is the probability that the process will not leave state i during the following t time units? Since the process is in state i at time s, by the Markovian property the probability that it remains in state i during the interval [s, s+t] is just the (unconditional) probability that it stays in state i for at least t time units. If τ_i denotes the amount of time that the process stays in state i before making a transition into a different state, then P{τ_i > s+t | τ_i > s} = P{τ_i > t} for all s, t ≥ 0. Hence the random variable τ_i is memoryless and must therefore be exponentially distributed.
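The exponential distribution is the only continuous distribution with this memoryless property; verifying one direction is a one-line computation (a step not shown on the slide): if τ_i is exponential with rate v_i, then

```latex
P\{\tau_i > s+t \mid \tau_i > s\}
  = \frac{P\{\tau_i > s+t\}}{P\{\tau_i > s\}}
  = \frac{e^{-v_i (s+t)}}{e^{-v_i s}}
  = e^{-v_i t}
  = P\{\tau_i > t\}.
```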
4
Properties
Based on the previous property, a continuous-time Markov chain is a stochastic process with the property that each time it enters a state i:
- The amount of time it spends in that state before making a transition into a different state is exponentially distributed with rate v_i, 0 ≤ v_i ≤ ∞
  - If v_i = ∞: instantaneous state; when entered, it is instantaneously left
  - If v_i = 0: absorbing state; once entered, it is never left
- When the process leaves state i, it next enters state j with some probability P_ij, where P_ii = 0 and Σ_j P_ij = 1
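The following is a minimal simulation sketch of this two-part description (hold an exponential time with rate v_i, then jump according to P_ij). The function name simulate_ctmc and the 3-state rates and jump matrix are illustrative choices, not taken from the slides.

```python
import random

def simulate_ctmc(v, P, start, t_end):
    """Simulate a continuous-time Markov chain up to time t_end.

    v[i] : rate of the exponential holding time in state i
    P[i] : jump probabilities out of state i (P[i][i] == 0, each row sums to 1)
    Returns the list of (time, state) pairs at which the chain enters each state.
    """
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        t += random.expovariate(v[state])   # exponential holding time with rate v_i
        if t >= t_end:
            break
        # when leaving state i, enter state j with probability P_ij
        state = random.choices(range(len(P[state])), weights=P[state])[0]
        path.append((t, state))
    return path

# illustrative 3-state chain; the rates and jump matrix are made up for the example
v = [1.0, 2.0, 0.5]
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]
print(simulate_ctmc(v, P, start=0, t_end=10.0))
```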
5
Properties
A continuous-time Markov chain is thus a stochastic process that moves from state to state in accordance with a discrete-time Markov chain, but such that the amount of time it spends in each state, before proceeding to the next state, is exponentially distributed. Moreover, the amount of time the process spends in state i and the next state visited must be independent random variables: if the next state visited depended on τ_i, then information about how long the process has already been in state i would be relevant to predicting the next state, which would contradict the Markovian assumption.
6
Properties
A continuous-time Markov chain is regular if the number of transitions in any finite length of time is finite. Let q_ij = v_i P_ij for all i ≠ j. Then q_ij is the rate, when in state i, at which the process makes a transition into state j; q_ij is called the transition rate from i to j. If P_ij(t) denotes the probability that a Markov chain presently in state i will be in state j after an additional time t, then P_ij(t) = P{X(t+s) = j | X(s) = i}.
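The slides do not say how to evaluate P_ij(t) numerically. One standard approach (an assumption here, not part of the slides) is to assemble the generator matrix Q, with off-diagonal entries q_ij = v_i P_ij and diagonal entries -v_i, and compute P(t) as the matrix exponential of Qt:

```python
import numpy as np
from scipy.linalg import expm

def transition_matrix(v, P, t):
    """Compute P_ij(t) = P{X(t+s) = j | X(s) = i} from holding rates v and jump matrix P.

    Builds the generator Q (q_ij = v_i * P_ij for i != j, q_ii = -v_i)
    and returns the matrix exponential exp(Q t).
    """
    v = np.asarray(v, dtype=float)
    P = np.asarray(P, dtype=float)
    Q = v[:, None] * P          # off-diagonal rates q_ij = v_i * P_ij
    np.fill_diagonal(Q, -v)     # rows of a generator sum to zero
    return expm(Q * t)

# same illustrative 3-state chain as before
v = [1.0, 2.0, 0.5]
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]
print(transition_matrix(v, P, t=2.0))
```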
7
Birth and Death Process
An important class of continuous-time Markov chains. A birth and death process is a continuous-time Markov chain with states 0, 1, … for which q_ij = 0 whenever |i - j| > 1, so a transition from state i can only go to either state i-1 or state i+1. The state of the process is usually thought of as the size of some population: an increase by 1 means a birth occurs, a decrease by 1 means a death occurs. Denote:
λ_i = q_{i,i+1} (birth rate)
μ_i = q_{i,i-1} (death rate)
8
Birth and death process
Since q_{i,i+1} = λ_i and q_{i,i-1} = μ_i, we have
v_i = λ_i + μ_i,  P_{i,i+1} = λ_i / (λ_i + μ_i),  P_{i,i-1} = μ_i / (λ_i + μ_i).
Hence we can think of a birth and death process as follows: whenever there are i people in the system, the time until the next birth is exponential with rate λ_i and is independent of the time until the next death, which is exponential with rate μ_i.
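A minimal simulation sketch of this description, assuming the birth and death rates are supplied as functions of the current state; the example rates at the bottom are made up for illustration.

```python
import random

def simulate_birth_death(birth_rate, death_rate, start, t_end):
    """Simulate a birth and death process up to time t_end.

    birth_rate(n), death_rate(n) give lambda_n and mu_n for the current state n.
    In state n the time to the next event is Exp(lambda_n + mu_n); the event is a
    birth with probability lambda_n / (lambda_n + mu_n), otherwise a death.
    """
    t, n = 0.0, start
    path = [(t, n)]
    while True:
        lam, mu = birth_rate(n), death_rate(n)
        if lam + mu == 0:                       # absorbing state: no more transitions
            break
        t += random.expovariate(lam + mu)       # time until the next transition
        if t >= t_end:
            break
        if random.random() < lam / (lam + mu):  # birth
            n += 1
        else:                                   # death
            n -= 1
        path.append((t, n))
    return path

# illustrative rates: constant birth rate, death rate proportional to population size
print(simulate_birth_death(lambda n: 2.0, lambda n: 0.5 * n, start=0, t_end=20.0))
```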
9
Example: two birth and death processes
The M/M/s queue: customers arrive at an s-server service station according to a Poisson process having rate λ. The service times are assumed to be independent exponential random variables with mean 1/μ. If X(t) denotes the number in the system at time t, then {X(t), t ≥ 0} is a birth and death process with
λ_n = λ for n ≥ 0
μ_n = nμ for 1 ≤ n ≤ s, and μ_n = sμ for n > s
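As a concrete illustration (the values s = 3, λ = 2.0, μ = 1.0 are assumptions for the example), the M/M/s rates can be written as two small rate functions:

```python
# Birth and death rates for an M/M/s queue; the numbers s = 3 servers,
# arrival rate lam = 2.0 and service rate mu = 1.0 are illustrative.
s, lam, mu = 3, 2.0, 1.0

def mms_birth(n):
    return lam                # lambda_n = lambda for all n >= 0

def mms_death(n):
    return mu * min(n, s)     # mu_n = n*mu for n <= s, and s*mu for n > s

print([(n, mms_birth(n), mms_death(n)) for n in range(6)])
```

These two rate functions can be passed directly to the birth and death simulator sketched above.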
10
Linear growth model with immigration
Such models occur naturally in the study of biological reproduction and population growth. Each individual in the population is assumed to give birth at an exponential rate λ; in addition, the population increases at an exponential rate θ due to an external source such as immigration. Deaths are assumed to occur at an exponential rate μ for each member of the population. This gives a birth and death process with λ_n = nλ + θ and μ_n = nμ, n ≥ 0.
11
Pure birth process
A birth and death process is said to be a pure birth process if μ_n = 0 for all n (death is impossible). The simplest example of a pure birth process is the Poisson process, which has a constant birth rate λ_n = λ, n ≥ 0. Yule process: a pure birth process in which each member acts independently, gives birth at an exponential rate λ, and no one ever dies. If X(t) represents the population size at time t, then {X(t), t ≥ 0} is a pure birth process with λ_n = nλ, n ≥ 0.
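A minimal sketch of a Yule process simulation, using the fact that with n individuals present the time to the next birth is the minimum of n independent Exp(λ) clocks, i.e. Exp(nλ); the parameter values are illustrative.

```python
import random

def simulate_yule(lam, start, t_end):
    """Simulate a Yule process (pure birth, lambda_n = n*lambda) up to time t_end.

    With n individuals present, each gives birth at rate lambda independently, so
    the time to the next birth is the minimum of n Exp(lambda) clocks, i.e. Exp(n*lambda).
    """
    t, n = 0.0, start
    path = [(t, n)]
    while True:
        t += random.expovariate(n * lam)   # next birth after an Exp(n*lambda) time
        if t >= t_end:
            break
        n += 1
        path.append((t, n))
    return path

# illustrative run starting from a single individual
print(simulate_yule(lam=0.5, start=1, t_end=10.0))
```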
12
Limiting probabilities
A continuous-time Markov chain is a semi-Markov process with F_ij(t) = 1 - e^{-v_i t}. Writing π_j for the stationary probabilities of the embedded discrete-time chain (with transition probabilities P_ij), and noting that the mean time spent in state j per visit is 1/v_j, the limiting probabilities are given by
P_j = (π_j / v_j) / Σ_i (π_i / v_i)
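A small numerical sketch of this formula, reusing the illustrative 3-state chain from the earlier examples: the stationary vector π of the embedded jump chain is taken as the left eigenvector of P for eigenvalue 1, and each state is then weighted by its mean holding time 1/v_j.

```python
import numpy as np

def limiting_probabilities(v, P):
    """Limiting probabilities P_j = (pi_j / v_j) / sum_i (pi_i / v_i),
    where pi is the stationary distribution of the embedded jump chain P."""
    v = np.asarray(v, dtype=float)
    P = np.asarray(P, dtype=float)
    # stationary distribution of the embedded chain: left eigenvector for eigenvalue 1
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()
    weights = pi / v                  # weight each state by its mean holding time 1/v_j
    return weights / weights.sum()

v = [1.0, 2.0, 0.5]
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]
print(limiting_probabilities(v, P))
```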
13
Limiting probabilities for the Birth and Death Process
Rate In = Rate Out principle: for any state n of the system, the mean rate at which entering transitions occur must equal the mean rate at which leaving transitions occur.
14
Balance equations
The equations for the rate diagram can be formulated as follows:
State 0: μ_1 p_1 = λ_0 p_0
State 1: λ_0 p_0 + μ_2 p_2 = (λ_1 + μ_1) p_1
State 2: λ_1 p_1 + μ_3 p_3 = (λ_2 + μ_2) p_2
…
State n: λ_{n-1} p_{n-1} + μ_{n+1} p_{n+1} = (λ_n + μ_n) p_n
…
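These equations can be solved recursively for p_n in terms of p_0 (as the next slide shows). The sketch below does exactly that on a state space truncated at n_max, then normalizes; the truncation level and the M/M/1 test rates (λ = 1, μ = 2) are assumptions for the numerical example.

```python
import numpy as np

def birth_death_limits(birth, death, n_max):
    """Limiting probabilities of a birth and death process from the balance equations.

    Uses the recursive solution p_n = p_{n-1} * lambda_{n-1} / mu_n (with p_0 set
    to 1 and normalized at the end); the state space is truncated at n_max.
    """
    p = np.empty(n_max + 1)
    p[0] = 1.0
    for n in range(1, n_max + 1):
        p[n] = p[n - 1] * birth(n - 1) / death(n)
    return p / p.sum()

# sanity check with an M/M/1 queue (lambda = 1, mu = 2): p_n should equal (1/2)*(1/2)^n
p = birth_death_limits(lambda n: 1.0, lambda n: 2.0, n_max=50)
print(p[:5])
```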
15
Balance equations (cont'd)
Solving the balance equations recursively and normalizing gives
p_n = p_0 (λ_0 λ_1 ⋯ λ_{n-1}) / (μ_1 μ_2 ⋯ μ_n), n ≥ 1,
p_0 = 1 / (1 + Σ_{n=1}^{∞} (λ_0 λ_1 ⋯ λ_{n-1}) / (μ_1 μ_2 ⋯ μ_n)).
The sum Σ_{n=1}^{∞} (λ_0 λ_1 ⋯ λ_{n-1}) / (μ_1 μ_2 ⋯ μ_n) should be < ∞ for the limiting probabilities to exist.
16
Example: job-shop problem
Consider a job shop consisting of M machines and a single repairman. Suppose the amount of time a machine runs before breaking down is exponentially distributed with rate λ, and the amount of time it takes the repairman to fix any broken machine is exponential with rate μ. If we say that the state is n whenever there are n machines down, then this system can be modeled as a birth and death process with parameters
λ_n = (M - n)λ for 0 ≤ n < M, and λ_n = 0 for n ≥ M
μ_n = μ for 1 ≤ n ≤ M
17
Example: job-shop problem (cont'd)
The limiting probability that n machines will not be in use, p_n, follows from the general birth and death solution:
p_n = [(λ/μ)^n M!/(M - n)!] / [Σ_{k=0}^{M} (λ/μ)^k M!/(M - k)!], n = 0, 1, …, M
The average number of machines not in use is given by Σ_{n=0}^{M} n p_n.
18
Example: job-shop problem (cont'd)
Suppose we want to know the long-run proportion of time that a given machine is working. Since all machines are equivalent, this equals the long-run expected fraction of machines that are working:
P{a given machine is working} = Σ_{n=0}^{M} ((M - n)/M) p_n = 1 - (Σ_{n=0}^{M} n p_n)/M
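A sketch that puts the whole job-shop example together numerically; the parameter values (M = 5 machines, λ = 0.2, μ = 1.0) are illustrative assumptions.

```python
import numpy as np

def job_shop_limits(M, lam, mu):
    """Limiting probabilities p_n for the job-shop (machine repair) model:
    breakdowns (births) at rate (M - n)*lam, repairs (deaths) at rate mu."""
    p = np.empty(M + 1)
    p[0] = 1.0
    for n in range(1, M + 1):
        p[n] = p[n - 1] * (M - n + 1) * lam / mu   # p_n = p_{n-1} * lambda_{n-1} / mu_n
    return p / p.sum()

# illustrative parameters: 5 machines, breakdown rate 0.2, repair rate 1.0
M, lam, mu = 5, 0.2, 1.0
p = job_shop_limits(M, lam, mu)
avg_down = sum(n * p[n] for n in range(M + 1))     # average number of machines not in use
working = 1 - avg_down / M                         # long-run fraction of time a machine works
print(p, avg_down, working)
```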
19
Time reversibility
Consider a continuous-time Markov chain going backwards in time. Since the forward process is a continuous-time Markov chain, it follows that given the present state X(t), the past state X(t - s) and the future states X(y), y > t, are independent. Therefore
P{X(t - s) = j | X(t) = i, X(y), y > t} = P{X(t - s) = j | X(t) = i},
so the reversed process is also a continuous-time Markov chain.
20
Time reversibility (cont'd)
Since the amount of time spent in a state is the same whether one goes forward or backward in time, the amount of time the reverse chain spends in state i on a visit is exponential with the same rate v_i as in the forward process. Indeed, suppose the process is in state i at time t; the probability that its backward time in i exceeds s is
P{process is in state i throughout [t - s, t] | X(t) = i} = P{X(t - s) = i} e^{-v_i s} / P{X(t) = i},
which equals e^{-v_i s} when the process is stationary.
21
Time reversibility (cont'd)
The sequence of states visited by the reverse process constitutes a discrete-time Markov chain with transition probabilities P*_ij given by
P*_ij = π_j P_ji / π_i
where (π_j, j ≥ 0) are the stationary probabilities of the embedded discrete-time Markov chain with transition probabilities P_ij. The chain is time reversible if P*_ij = P_ij for all i, j; equivalently, the rate at which the process goes directly from state i to state j equals the rate at which it goes directly from state j to state i.
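A small numerical check of this rate condition, using the truncated M/M/1 chain from the balance-equation sketch. That detailed balance holds here reflects the standard fact (not stated on the slides) that ergodic birth and death processes are time reversible in steady state.

```python
import numpy as np

def is_time_reversible(p, Q, tol=1e-9):
    """Check the reversibility condition p_i * q_ij == p_j * q_ji for all i != j,
    where p is the limiting distribution and Q holds the transition rates q_ij."""
    p = np.asarray(p, dtype=float)
    Q = np.asarray(Q, dtype=float)
    flow = p[:, None] * Q               # flow[i, j] = steady-state rate from i to j
    return np.allclose(flow, flow.T, atol=tol)

# truncated M/M/1 chain (lambda = 1, mu = 2) from the earlier balance-equation sketch
n_max, lam, mu = 50, 1.0, 2.0
Q = np.zeros((n_max + 1, n_max + 1))
for n in range(n_max):
    Q[n, n + 1] = lam                   # q_{n,n+1} = lambda_n
    Q[n + 1, n] = mu                    # q_{n+1,n} = mu_{n+1}
p = np.array([(1 - lam / mu) * (lam / mu) ** n for n in range(n_max + 1)])
p /= p.sum()
print(is_time_reversible(p, Q))
```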
22
So, can you now differentiate among a Discrete-Time Markov Chain, a Discrete-Time Semi-Markov Chain, and a Continuous-Time Markov Chain?