
Continuous-Time Markov Chains Nur Aini Masruroh

Introduction
 A continuous-time Markov chain is a stochastic process with the Markovian property: the conditional distribution of the future state at time t+s, given the present state at time s and all past states, depends only on the present state and is independent of the past.
 If P{X(t+s) = j | X(s) = i} is independent of s, then the continuous-time Markov chain is said to have stationary (or homogeneous) transition probabilities.
 All Markov chains we consider are assumed to have stationary transition probabilities.

Properties
 Suppose a continuous-time Markov chain enters state i and does not leave state i (no transition occurs) during the next s time units. What is the probability that the process will not leave state i during the following t time units?
 Since the process is in state i at time s, by the Markovian property the probability that it remains in state i during the interval [s, s+t] is just the unconditional probability that it stays in state i for at least t time units.
 If τi denotes the amount of time the process stays in state i before making a transition into a different state, then P{τi > s+t | τi > s} = P{τi > t} for all s, t ≥ 0.
 The random variable τi is therefore memoryless, and must thus be exponentially distributed.
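The memoryless identity above can be checked empirically. The sketch below (a Monte Carlo illustration, not part of the original slides; the rate and the values of s and t are arbitrary choices) samples exponential sojourn times and compares the conditional survival probability P{τ > s+t | τ > s} with the unconditional P{τ > t}:

```python
import random

def memoryless_check(rate, s, t, n=200_000, seed=1):
    """Empirically compare P(tau > s+t | tau > s) with P(tau > t)
    for exponentially distributed sojourn times tau with the given rate."""
    rng = random.Random(seed)
    samples = [rng.expovariate(rate) for _ in range(n)]
    survived_s = [x for x in samples if x > s]           # condition on tau > s
    cond = sum(x > s + t for x in survived_s) / len(survived_s)
    uncond = sum(x > t for x in samples) / n
    return cond, uncond

cond, uncond = memoryless_check(rate=2.0, s=0.3, t=0.5)
# both estimates should be close to exp(-2.0 * 0.5) ≈ 0.368
```

Any non-exponential sojourn distribution would make the two estimates diverge, which is why the memoryless property forces the exponential form.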

Properties
 Based on the previous property, a continuous-time Markov chain is a stochastic process with the following properties each time it enters state i:
 The amount of time it spends in that state before making a transition into a different state is exponentially distributed with rate vi, 0 ≤ vi < ∞.
If vi = ∞  instantaneous state: when entered, it is instantaneously left.
If vi = 0  absorbing state.
 When the process leaves state i, it next enters state j with some probability Pij, where Pii = 0 and Σj Pij = 1.

Properties
 A continuous-time Markov chain is thus a stochastic process that moves from state to state in accordance with a discrete-time Markov chain, but is such that the amount of time it spends in each state, before proceeding to the next state, is exponentially distributed.
 The amount of time the process spends in state i and the next state visited must be independent random variables.
 If the next state visited depended on τi, then information about how long the process has already been in state i would be relevant to predicting the next state  this would contradict the Markovian assumption.

Properties
 A continuous-time Markov chain is regular if the number of transitions in any finite length of time is finite.
 Let qij = vi Pij for all i ≠ j.
 Thus qij is the rate, when in state i, at which the process makes a transition into state j: qij is the transition rate from i to j.
 If Pij(t) is the probability that a Markov chain presently in state i will be in state j after an additional time t, then Pij(t) = P{X(t+s) = j | X(s) = i}.
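Given the transition rates, the matrix of Pij(t) can be computed as the matrix exponential of the generator Q (off-diagonal entries qij, diagonal entries −vi). A minimal pure-Python sketch using a truncated Taylor series, adequate only for small |Q|·t (production code would use uniformization or a library routine such as scipy.linalg.expm); the two-state up/down chain and its rates are illustrative choices:

```python
def transition_matrix(Q, t, terms=60):
    """Approximate P(t) = exp(Q t) by a truncated Taylor series.
    Q is the generator: Q[i][j] = q_ij for i != j, Q[i][i] = -v_i."""
    n = len(Q)
    # P starts as the identity matrix (the k = 0 term of the series)
    P = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in P]          # current term (Q t)^k / k!
    for k in range(1, terms):
        # multiply the previous term by Q t / k
        term = [[sum(term[i][m] * Q[m][j] * t / k for m in range(n))
                 for j in range(n)] for i in range(n)]
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

lam, mu = 1.0, 2.0
Q = [[-lam, lam], [mu, -mu]]              # two-state up/down chain
P = transition_matrix(Q, t=0.7)
# closed form: P[0][0] = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t)
```

For the two-state chain the closed form above gives a direct check on the series; each row of P(t) must also sum to 1.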

Birth and Death Process
 An important class of continuous-time Markov chains.
 A birth and death process is a continuous-time Markov chain with states 0, 1, … for which qij = 0 whenever |i − j| > 1.
 A transition from state i can only go to either state i−1 or state i+1.
 The state of the process is usually thought of as representing the size of some population:
Increase by 1  a birth occurs
Decrease by 1  a death occurs
 Denote: λi = qi,i+1  birth rate; μi = qi,i−1  death rate.

Birth and death process
 Since vi = λi + μi and Pi,i+1 = λi / (λi + μi) = 1 − Pi,i−1, we can think of a birth and death process as follows: whenever there are i people in the system, the time until the next birth is exponential with rate λi and is independent of the time until the next death, which is exponential with rate μi.

Example: two birth and death processes
 The M/M/s queue:
 Customers arrive at an s-server service station according to a Poisson process with rate λ.
 The service times are assumed to be independent exponential random variables with mean 1/μ.
 If X(t) denotes the number in the system at time t, then {X(t), t ≥ 0} is a birth and death process with λn = λ for n ≥ 0, and μn = nμ for 1 ≤ n ≤ s, μn = sμ for n > s.
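The limiting probabilities of the M/M/s queue follow directly from these birth and death rates. A minimal sketch (the truncation level max_n and the parameter values are arbitrary illustrative choices; the normalizing series converges only when λ < sμ):

```python
def mms_limiting_probs(lam, mu, s, max_n=60):
    """Limiting probabilities p_n for an M/M/s queue (requires lam < s*mu).
    Birth rate: lam_n = lam; death rate: mu_n = min(n, s) * mu."""
    # unnormalized terms r_n = prod_{k=1}^{n} lam / (min(k, s) * mu)
    r = [1.0]
    for n in range(1, max_n + 1):
        r.append(r[-1] * lam / (min(n, s) * mu))
    total = sum(r)
    return [x / total for x in r]

# M/M/2 with lam = 3, mu = 2: rho = lam / (s*mu) = 0.75 < 1
p = mms_limiting_probs(lam=3.0, mu=2.0, s=2)
```

For these values the exact Erlang-type formula gives p0 = 1/7, which the truncated computation reproduces to high accuracy since the tail is geometric with ratio 0.75.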

Linear growth model with immigration
 Occurs naturally in the study of biological reproduction and population growth.
 Each individual in the population is assumed to give birth at an exponential rate λ.
 There is an exponential rate of increase θ of the population due to an external source such as immigration.
 Deaths are assumed to occur at an exponential rate μ for each member of the population.
 This gives a birth and death process with λn = nλ + θ and μn = nμ, n ≥ 0.

Pure birth process
 A birth and death process is said to be a pure birth process if μn = 0 for all n (death is impossible).
 The simplest example of a pure birth process is the Poisson process, which has a constant birth rate λn = λ, n ≥ 0.
 Yule process: a pure birth process in which each member acts independently and gives birth at an exponential rate λ, and no one ever dies.
 If X(t) represents the population size at time t, then {X(t), t ≥ 0} is a pure birth process with λn = nλ, n ≥ 0.
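Both pure birth examples have closed-form transient distributions: the constant-rate process gives Poisson probabilities, and the Yule process started from a single individual gives a geometric distribution with parameter e^(−λt) (a standard result; stated here as an assumption from the general theory, not from these slides). A small sketch:

```python
import math

def poisson_pure_birth(lam, t, n):
    """P{N(t) = n} for the constant-rate pure birth (Poisson) process."""
    return math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)

def yule_pmf(lam, t, n):
    """P{X(t) = n | X(0) = 1} for the Yule process (lam_n = n * lam):
    geometric with success probability exp(-lam * t), support n = 1, 2, ..."""
    p = math.exp(-lam * t)
    return p * (1 - p) ** (n - 1)
```

A quick sanity check: the Yule probabilities sum to 1, and the mean Σ n·P{X(t) = n} = e^(λt), matching the exponential growth of the expected population.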

Limiting probabilities
 A continuous-time Markov chain is a semi-Markov process with Fij(t) = 1 − e^(−vi t).
 The limiting probabilities are therefore given by Pj = (πj / vj) / Σi (πi / vi), where (πj, j ≥ 0) are the stationary probabilities of the embedded discrete-time chain and 1/vj is the mean time spent in state j per visit.

Limiting probabilities for the Birth and Death Process
 Rate In = Rate Out principle:
 For any state n of the system, the mean rate at which entering incidents occur must equal the mean rate at which leaving incidents occur.

Balance equation
The equations for the rate diagram can be formulated as follows:
State 0: μ1 p1 = λ0 p0
State 1: λ0 p0 + μ2 p2 = (λ1 + μ1) p1
State 2: λ1 p1 + μ3 p3 = (λ2 + μ2) p2
…
State n: λn−1 pn−1 + μn+1 pn+1 = (λn + μn) pn
…

Balance equation (cont’d)
Solving the balance equations recursively gives
pn = (λ0 λ1 ··· λn−1) / (μ1 μ2 ··· μn) p0, n ≥ 1,
with p0 determined by normalization:
p0 = 1 / (1 + Σn≥1 (λ0 λ1 ··· λn−1) / (μ1 μ2 ··· μn)).
For the limiting probabilities to exist, this sum must be < ∞.
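The recursive solution of the balance equations can be computed directly from the rates. A minimal sketch (the rates are supplied as functions of n; truncating at max_n, or at the first state with zero birth rate, are illustrative choices):

```python
def birth_death_limiting_probs(lam, mu, max_n=None):
    """Solve the birth-death balance equations:
    p_n = (lam_0 ... lam_{n-1}) / (mu_1 ... mu_n) * p_0,
    normalized so the probabilities sum to 1.
    lam, mu are callables giving the rates lam_n and mu_n."""
    r, prod, n = [1.0], 1.0, 0      # r[n] is the unnormalized p_n
    while True:
        if max_n is not None and n >= max_n:
            break
        if lam(n) == 0:              # finite chain: no births beyond state n
            break
        prod *= lam(n) / mu(n + 1)
        r.append(prod)
        n += 1
    total = sum(r)
    return [x / total for x in r]

# M/M/1 with lam = 1, mu = 2: p_n = (1/2)^n * p_0, so p_0 = 1/2
p = birth_death_limiting_probs(lambda n: 1.0, lambda n: 2.0, max_n=80)
```

The M/M/1 example is a convenient check because its limiting distribution is geometric in closed form.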

Example: job-shop problem
 Consider a job-shop consisting of M machines and a single repairman. Suppose the amount of time a machine runs before breaking down is exponentially distributed with rate λ, and the amount of time it takes the repairman to fix any broken machine is exponential with rate μ. If we say that the state is n whenever there are n machines down, then this system can be modeled as a birth and death process with λn = (M − n)λ for 0 ≤ n < M and μn = μ for 1 ≤ n ≤ M.

Example: job-shop problem (cont’d)
 The limiting probability that n machines will not be in use, pn, follows from the balance equations:
pn = (M! / (M − n)!) (λ/μ)^n p0, 0 ≤ n ≤ M,
with p0 fixed by Σn pn = 1.
 The average number of machines not in use is given by Σn=0..M n pn.

Example: job-shop problem (cont’d)
 Suppose we want to know the long-run proportion of time that a given machine is working.
 The equivalent limiting probability of its working is
P{machine is working} = Σn=0..M ((M − n)/M) pn.
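Putting the job-shop formulas together, a small sketch that computes the limiting probabilities, the average number of machines down, and the long-run fraction of time a given machine is working (the values of M, λ, μ are arbitrary illustrations):

```python
import math

def machine_repair_probs(M, lam, mu):
    """Limiting probabilities for the single-repairman model:
    breakdown (birth) rate lam_n = (M - n) * lam, repair (death) rate mu_n = mu.
    p_n is proportional to M! / (M - n)! * (lam / mu) ** n."""
    r = [math.factorial(M) // math.factorial(M - n) * (lam / mu) ** n
         for n in range(M + 1)]
    total = sum(r)
    return [x / total for x in r]

M, lam, mu = 3, 0.5, 2.0
p = machine_repair_probs(M, lam, mu)
avg_down = sum(n * p[n] for n in range(M + 1))
frac_working = sum((M - n) / M * p[n] for n in range(M + 1))
```

Note that frac_working always equals (M − avg_down) / M, since the expected number of working machines is M minus the expected number down.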

Time reversibility
 Consider the continuous-time Markov process going backwards in time.
 Since the forward process is a continuous-time Markov chain, it follows that given the present state X(t), the past state X(t − s) and the future states X(y), y > t, are independent.
 Therefore P{X(t−s) = j | X(t) = i, X(y), y > t} = P{X(t−s) = j | X(t) = i}.
 So the reversed process is also a continuous-time Markov chain.

Time reversibility (cont’d)
 Since the amount of time spent in a state is the same whether one is going forward or backward in time, the amount of time the reversed chain spends in state i on a visit is exponential with the same rate vi as in the forward process.
 If the process is in state i at time t, the probability that its backward time in i exceeds s is e^(−vi s).

Time reversibility (cont’d)
 The sequence of states visited by the reversed process constitutes a discrete-time Markov chain with transition probabilities Pij* given by Pij* = πj Pji / πi, where (πj, j ≥ 0) are the stationary probabilities of the embedded discrete-time Markov chain with transition probabilities Pij.
 The chain is time reversible if Pij* = Pij, i.e. πi Pij = πj Pji: the rate at which the process goes directly from state i to state j equals the rate at which it goes directly from j to i.
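In terms of the transition rates, time reversibility is the detailed balance condition pi qij = pj qji for every pair of states, and every ergodic birth and death chain satisfies it. A small sketch checking the condition numerically on an illustrative 3-state birth-death chain (the rates chosen are arbitrary):

```python
def check_detailed_balance(p, q):
    """Check p_i * q_ij == p_j * q_ji for all pairs of states:
    the condition for time reversibility of a continuous-time Markov chain.
    p: limiting probabilities; q: rate matrix with q[i][j] for i != j."""
    n = len(p)
    return all(abs(p[i] * q[i][j] - p[j] * q[j][i]) < 1e-9
               for i in range(n) for j in range(n) if i != j)

# 3-state birth-death chain with lam_0 = lam_1 = 1 and mu_1 = mu_2 = 2;
# the balance equations give p = (4/7, 2/7, 1/7)
q = [[0, 1, 0], [2, 0, 1], [0, 2, 0]]
p = [4 / 7, 2 / 7, 1 / 7]
# check_detailed_balance(p, q) → True
```

A chain that allowed a one-way cycle (e.g. 0 → 1 → 2 → 0 with no reverse transitions) would fail this check, since flow from i to j could not be matched by flow from j to i.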

So, can you differentiate among a Discrete-Time Markov Chain, a Discrete-Time Semi-Markov Chain, and a Continuous-Time Markov Chain now?