IEG5300 Tutorial 5 Continuous-time Markov Chain Peter Chen Peng Adapted from Qiwen Wang’s Tutorial Materials.

Outline
- Continuous-time Markov chains
- Chapman-Kolmogorov equation
- Kolmogorov's backward and forward equations
- Limiting probabilities P_j
- Time-reversible Markov processes
- Summary

Continuous-time Markov chain ― Properties
- X(t) = i means that the process is in state i at time t.
- X(t) always has an embedded discrete-time Markov chain, with the constraint P_ii = 0 ∀ i.
- When the process X(t) is in state i at time t, the remaining time until it makes a transition to some other state (≠ i) is an exponential r.v. with rate v_i.
- By memorylessness of the exponential distribution, the remaining time until a transition is independent of how long the process has already spent in state i.

Continuous-time Markov chain ― Notation
- P{X(t+s) = j | X(s) = i, X(u) = x(u) ∀ u < s} = P{X(t+s) = j | X(s) = i} = P_ij(t) // stationary transition probabilities
- For the embedded chain, P_ii = 0 ∀ i; but P_ii(t) ≠ 0 for t ≥ 0.
- v_i = transition rate out of state i; q_ij = v_i P_ij = instantaneous transition rate from state i to state j // v_i = ∑_j q_ij and P_ij = q_ij / v_i
- T_i = waiting time until a transition out of state i, exponential with rate v_i.

Continuous-time Markov chain ― Example
Two machines are maintained by a single repairman, who repairs one machine at a time. Machine i functions for an exponential time with rate μ_i before breaking down, i = 1, 2. The repair time for either machine is exponential with rate μ. Analyze this as a Markov process.
Define 5 states:
0 ― both machines working
1 ― 1st down (under repair), 2nd working
2 ― 1st working, 2nd down (under repair)
3 ― both machines down, 1st under repair
4 ― both machines down, 2nd under repair
Draw the transition diagram. The transition rates are v_0 = μ_1 + μ_2, v_1 = μ + μ_2, v_2 = μ_1 + μ, v_3 = v_4 = μ.
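As a quick sanity check, the example's generator matrix can be built from the rates q_ij and the exit rates v_i read off the diagonal. This is a minimal sketch with hypothetical numeric rates; the state coding follows the slide above.

```python
import numpy as np

mu1, mu2, mu = 1.0, 2.0, 3.0   # hypothetical breakdown rates of machines 1, 2; repair rate

# States: 0 both up; 1 only machine 1 down; 2 only machine 2 down;
#         3 both down, machine 1 in repair; 4 both down, machine 2 in repair
Q = np.zeros((5, 5))
Q[0, 1], Q[0, 2] = mu1, mu2   # a machine breaks down
Q[1, 0], Q[1, 3] = mu, mu2    # machine 1 repaired / machine 2 also breaks
Q[2, 0], Q[2, 4] = mu, mu1    # machine 2 repaired / machine 1 also breaks
Q[3, 2] = mu                  # machine 1 repaired, machine 2 still down
Q[4, 1] = mu                  # machine 2 repaired, machine 1 still down
np.fill_diagonal(Q, -Q.sum(axis=1))   # q_ii = -v_i, so each row sums to 0

v = -np.diag(Q)
assert np.allclose(v, [mu1 + mu2, mu + mu2, mu1 + mu, mu, mu])  # matches the slide
```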

Birth-death Process
A special case of a Markov process in which, from state 0, q_01 = λ_0 (so P_01 = 1), and from every other state i ≥ 1, q_{i,i+1} = λ_i and q_{i,i−1} = μ_i, with all other q_ij = 0.
The Poisson process is a special case of the birth-death process, with constant birth rate λ and all death rates 0.
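A birth-death generator is tri-diagonal, and the embedded jump chain's probabilities follow directly from P_ij = q_ij / v_i. A minimal sketch with hypothetical rates, truncated to 4 states for illustration (mu[i] below stores μ_{i+1}, the death rate out of state i+1):

```python
import numpy as np

lam = [1.0, 2.0, 3.0]   # hypothetical birth rates λ_0, λ_1, λ_2
mu = [4.0, 5.0, 6.0]    # hypothetical death rates μ_1, μ_2, μ_3
n = 4                   # states 0..3 (truncated for illustration)

Q = np.zeros((n, n))
for i in range(n - 1):
    Q[i, i + 1] = lam[i]   # q_{i,i+1} = λ_i
    Q[i + 1, i] = mu[i]    # q_{i+1,i} = μ_{i+1}
np.fill_diagonal(Q, -Q.sum(axis=1))

v = -np.diag(Q)
emb = Q / v[:, None]
np.fill_diagonal(emb, 0)          # embedded chain: P_ij = q_ij / v_i
assert emb[0, 1] == 1.0           # P_01 = 1: only a birth is possible from 0
assert np.isclose(emb[1, 2], lam[1] / (lam[1] + mu[0]))   # λ_1 / (λ_1 + μ_1)
```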

Chapman-Kolmogorov equation
P_ij(t) is the probability that, given the current state is i, the process will be in state j a time t later:
P_ij(t) = P{X(t+s) = j | X(s) = i} ; ∑_j P_ij(t) = 1
Chapman-Kolmogorov equation:
P_ij(t+s) = ∑_k P_ik(t) P_kj(s) // by conditioning on the state after time period t
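In matrix form the Chapman-Kolmogorov equation reads P(t+s) = P(t)P(s), which can be checked numerically using P(t) = e^{Qt}. A sketch with a hypothetical 3-state generator:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state generator (off-diagonals are rates q_ij; rows sum to 0)
Q = np.array([[-3.0, 2.0, 1.0],
              [1.0, -4.0, 3.0],
              [2.0, 2.0, -4.0]])

def P(t):
    return expm(Q * t)          # matrix of transition probabilities P_ij(t)

t, s = 0.4, 0.7
assert np.allclose(P(t + s), P(t) @ P(s))   # P_ij(t+s) = sum_k P_ik(t) P_kj(s)
assert np.allclose(P(t).sum(axis=1), 1.0)   # each row of P(t) sums to 1
```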

Kolmogorov's Backward & Forward equations
By setting either t or s in the C-K equation to be infinitesimal, we get
Kolmogorov's Backward equation: P′_ij(t) = ∑_{k≠i} q_ik P_kj(t) − v_i P_ij(t)
Kolmogorov's Forward equation: P′_ij(t) = ∑_{k≠j} P_ik(t) q_kj − v_j P_ij(t)
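In matrix form the backward equation is P′(t) = QP(t) and the forward equation is P′(t) = P(t)Q. Both can be checked against a central finite difference of P(t) = e^{Qt}; the generator below is hypothetical.

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-3.0, 2.0, 1.0],
              [1.0, -4.0, 3.0],
              [2.0, 2.0, -4.0]])   # hypothetical 3-state generator

t, h = 0.5, 1e-6
dP = (expm(Q * (t + h)) - expm(Q * (t - h))) / (2 * h)   # numerical P'(t)
Pt = expm(Q * t)
assert np.allclose(dP, Q @ Pt, atol=1e-5)   # backward equation
assert np.allclose(dP, Pt @ Q, atol=1e-5)   # forward equation
```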

Kolmogorov's Backward & Forward equations ― Example
Find P_ij(t) by Kolmogorov's forward equation for the 2-state birth-death process with states 0 (off) and 1 (on), birth rate q_01 = λ, and death rate q_10 = μ.
P′_01(t) = P_00(t) q_01 − P_01(t) v_1 = (1 − P_01(t))λ − P_01(t)μ = λ − (λ+μ) P_01(t)
Solving this 1st-order DE: P_01(t) = A e^{−(λ+μ)t} + λ/(λ+μ).
The boundary condition P_01(0) = 0 gives A = −λ/(λ+μ).
∴ P_01(t) = λ/(λ+μ) (1 − e^{−(λ+μ)t}), P_00(t) = 1 − λ/(λ+μ) (1 − e^{−(λ+μ)t})

Kolmogorov's Backward & Forward equations ― Example (continued)
For the same 2-state birth-death process (q_01 = λ, q_10 = μ), the forward equation similarly gives
P′_10(t) = P_11(t) q_10 − P_10(t) v_0 = μ − (λ+μ) P_10(t),
so P_10(t) = μ/(λ+μ) (1 − e^{−(λ+μ)t}), P_11(t) = 1 − μ/(λ+μ) (1 − e^{−(λ+μ)t}).
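The closed-form solution for the 2-state chain can be verified against the matrix exponential of its generator. A sketch with hypothetical values of λ and μ:

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1.5, 2.5   # hypothetical rates: 0 -> 1 at rate lam, 1 -> 0 at rate mu
Q = np.array([[-lam, lam],
              [mu, -mu]])

for t in (0.1, 0.5, 2.0):
    e = np.exp(-(lam + mu) * t)
    P01 = lam / (lam + mu) * (1 - e)      # closed form from the derivation above
    P10 = mu / (lam + mu) * (1 - e)
    closed = np.array([[1 - P01, P01],
                       [P10, 1 - P10]])
    assert np.allclose(expm(Q * t), closed)
```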

Limiting Probability P_j of P_ij(t)
Definition. A continuous-time Markov chain is said to be ergodic when lim_{t→∞} P_ij(t) exists for all j and the limiting value is independent of the initial state i. Let P_j denote lim_{t→∞} P_ij(t).
Setting P′_ij(t) = 0 in the forward equation yields the balance equations (flow conservation: the rate into a state = the rate out of a state):
v_j P_j = ∑_{k≠j} q_kj P_k for all j, together with ∑_j P_j = 1.
Similar to π_j in the discrete case, P_j is the long-run proportion of time the ergodic Markov process spends in state j.
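The balance equations say the limiting distribution is the normalized solution of PQ = 0, so it can be found as a null-space vector of Qᵀ and compared with the long-run limit of P_ij(t). A sketch with a hypothetical generator:

```python
import numpy as np
from scipy.linalg import expm, null_space

Q = np.array([[-3.0, 2.0, 1.0],
              [1.0, -4.0, 3.0],
              [2.0, 2.0, -4.0]])   # hypothetical irreducible 3-state generator

p = null_space(Q.T)[:, 0]
p /= p.sum()                       # limiting distribution P_j, normalized to sum 1

assert np.allclose(p @ Q, 0, atol=1e-10)   # balance equations: v_j P_j = sum q_kj P_k
# every row of P(t) converges to (P_0, P_1, P_2) as t grows
assert np.allclose(expm(Q * 50.0), np.tile(p, (3, 1)), atol=1e-8)
```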

Long Run Probability π_j of P_ij
- When a continuous-time Markov chain is positive recurrent, so is the embedded discrete-time Markov chain.
- If a continuous-time Markov chain is irreducible and positive recurrent, then it is ergodic. In that case, the embedded discrete-time Markov chain has a unique long-run distribution {π_j}, which is the solution to
π_j = ∑_i π_i P_ij, ∑_j π_j = 1.
- Similarly, the two distributions are related by
P_j = (π_j / v_j) / ∑_i (π_i / v_i),
i.e., P_j weights π_j by the mean holding time 1/v_j.
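The relation between {π_j} and {P_j} can be checked numerically: compute the embedded chain's long-run distribution, weight it by the mean holding times, and confirm the result solves the CTMC balance equations. A sketch with a hypothetical generator:

```python
import numpy as np
from scipy.linalg import null_space

Q = np.array([[-3.0, 2.0, 1.0],
              [1.0, -4.0, 3.0],
              [2.0, 2.0, -4.0]])   # hypothetical 3-state generator
v = -np.diag(Q)

R = Q / v[:, None]
np.fill_diagonal(R, 0)            # embedded DTMC: P_ij = q_ij / v_i
pi = null_space(R.T - np.eye(3))[:, 0]
pi /= pi.sum()                    # pi = pi R, sum_j pi_j = 1

P = pi / v
P /= P.sum()                      # P_j = (pi_j / v_j) / sum_i (pi_i / v_i)

assert np.allclose(P @ Q, 0, atol=1e-10)   # P solves the CTMC balance equations
```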

Time Reversible Markov Process
For an ergodic Markov process {X(t)}_{0 ≤ t ≤ T}, the reversed process {Y(t)}_{0 ≤ t ≤ T}, Y(t) = X(T − t), is also a Markov process with the same rates v_i, and its embedded discrete-time Markov chain is the reverse of the original embedded chain. If the embedded discrete-time Markov chain is time reversible, so is the process, and then the rate from i to j equals the rate from j to i:
P_i q_ij = P_j q_ji for all i ≠ j.
An ergodic birth-death process is time reversible.
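For a birth-death chain, detailed balance P_i q_ij = P_j q_ji holds with the product-form stationary distribution P_j = P_0 ∏_{i<j} λ_i/μ_{i+1}; equivalently, the matrix with entries P_i q_ij is symmetric. A sketch with hypothetical rates on a truncated state space:

```python
import numpy as np

lam = [2.0, 1.5, 1.0]   # hypothetical birth rates λ_0, λ_1, λ_2
mu = [3.0, 2.5, 2.0]    # hypothetical death rates μ_1, μ_2, μ_3
n = 4

Q = np.zeros((n, n))
for i in range(n - 1):
    Q[i, i + 1] = lam[i]
    Q[i + 1, i] = mu[i]
np.fill_diagonal(Q, -Q.sum(axis=1))

# Stationary distribution via the product formula P_j = P_0 * prod λ_i / μ_{i+1}
P = np.ones(n)
for j in range(1, n):
    P[j] = P[j - 1] * lam[j - 1] / mu[j - 1]
P /= P.sum()

D = P[:, None] * Q            # D[i, j] = P_i q_ij
assert np.allclose(D, D.T)    # detailed balance: P_i q_ij = P_j q_ji
```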

Burke's Theorem
If λ < sμ, the stationary output process of an M/M/s queue is a Poisson process with intensity λ. // the stationary M/M/s queue is a time-reversible birth-death process, so its departure process is the arrival process of the reversed chain.

Summary
Chapman-Kolmogorov equation: P_ij(t+s) = ∑_k P_ik(t) P_kj(s)
Kolmogorov's Backward equation: P′_ij(t) = ∑_{k≠i} q_ik P_kj(t) − v_i P_ij(t)
Kolmogorov's Forward equation: P′_ij(t) = ∑_{k≠j} P_ik(t) q_kj − v_j P_ij(t)

Summary
Limiting probability: P_j = lim_{t→∞} P_ij(t), found from the balance equations v_j P_j = ∑_{k≠j} q_kj P_k and ∑_j P_j = 1.
Time-reversible Markov process: P_i q_ij = P_j q_ji.