
1 Chapter 5 Continuous time Markov Chains
Learning objectives:
- Introduce continuous time Markov chains
- Model manufacturing systems using Markov chains
- Be able to evaluate the steady-state performances
Textbook: C. Cassandras and S. Lafortune, Introduction to Discrete Event Systems, Springer, 2007

2 Plan
- Basic definitions of continuous time Markov chains
- Characteristics of CTMC
- Performance analysis of CTMC
- Poisson process
- Approximation of general distributions by phase type distribution

3 Basic definitions of continuous time Markov chains

4 Continuous Time Markov Chain (CTMC)
A stochastic process can be classified by its events (discrete or continuous), its time index (discrete or continuous time), and whether it is memoryless. A CTMC is a continuous time and memoryless discrete event stochastic process.

5 Definition: a stochastic process with discrete state space and continuous time {X(t), t ≥ 0} is a continuous time Markov chain (CTMC) iff
P[X(t+s) = j | X(u), 0 ≤ u ≤ s] = P[X(t+s) = j | X(s)], ∀t, ∀s, ∀j
Memoryless: in a CTMC, the past history impacts the future evolution of the system only via the current state of the system.

6 Continuous Time Markov Chain (CTMC): example
A single queue with Poisson arrivals and exponential service times. N(t): number of customers at time t.
[Figure: customer arrivals enter the queue, customer departures leave after service.]

7 Homogeneous CTMC
Definition: a CTMC {X(t), t ≥ 0} is homogeneous iff
P[X(t+s) = j | X(t) = i] = P[X(s) = j | X(0) = i] = p_ij(s)
Homogeneous + memoryless: in reliability, we say that "a machine that has not failed at age t is as good as new".
Only homogeneous CTMCs will be considered in this chapter.

8 Characteristics of CTMC

9 Behavior of a CTMC
A sample path X(t) is determined by two major components:
- T_i = sojourn time in state i (a random variable)
- p_ij = probability of moving to state j when leaving state i

10 Sojourn time in a state
Let T_i be the random variable corresponding to the time spent in state i.
The memoryless property of the homogeneous CTMC implies
P{T_i > t + s | T_i > s} = P{T_i > t}, ∀s, t ≥ 0.
The exponential distribution is the only continuous probability distribution having this property. In a CTMC, the sojourn time in any state is therefore exponentially distributed.

11 Exponential distribution
Let T be a continuous random variable with an exponential distribution of parameter λ.
- Distribution function: F_T(t) = P{T ≤ t} = 1 - exp(-λt)
- Probability density function: f_T(t) = dF_T(t)/dt = λ exp(-λt)
- Mean: E[T] = 1/λ
- Standard deviation: σ[T] = 1/λ
- Coefficient of variation: Cv(T) = σ[T]/E[T] = 1
The parameter λ often corresponds to some event rate (failure rate, repair rate, production rate, ...).

12 Exponential distribution
Memoryless property: P{T > t + s | T > s} = P{T > t}.
For a machine with exponentially distributed lifetime, we say that it is "as good as new" as long as it has not failed: the remaining lifetime of a used but UP machine has the same distribution as that of a new machine.
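
A small Monte Carlo check of this property (the rate and the times t, s below are arbitrary example values, not from the slides):

import numpy as np

rng = np.random.default_rng(0)
rate = 0.1                      # assumed failure rate (per day)
t, s = 5.0, 8.0
samples = rng.exponential(scale=1.0 / rate, size=1_000_000)

# P{T > t + s | T > s} should equal P{T > t} for an exponential lifetime.
survivors = samples[samples > s]
p_cond = np.mean(survivors > t + s)
p_uncond = np.mean(samples > t)
print(f"P(T > t+s | T > s) = {p_cond:.4f}, P(T > t) = {p_uncond:.4f}")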

13 Transition probability
When a CTMC leaves state i, it jumps to state j with probability p_ij. This probability is:
- independent of time, as the CTMC is homogeneous
- independent of the sojourn time T_i, as the process is Markovian (memoryless)

14 1st characterization of a CTMC
A CTMC is fully characterized by the following parameters:
- {μ_i}, i ∈ E, with μ_i the parameter of the exponential distribution of the sojourn time T_i
- {p_ij}, i ≠ j, with p_ij the transition probability from i to j when leaving state i

15 Classification of a CTMC
Each CTMC is associated with an underlying DTMC obtained by neglecting the sojourn times.
A state i of a CTMC is said to be transient (resp. recurrent, absorbing) if it is transient (resp. recurrent, absorbing) in the underlying DTMC.
A CTMC is irreducible if its underlying DTMC is irreducible.
Remark: the concept of periodicity is not relevant for CTMCs.

16 2nd characterization of a CTMC
Each state activates several potential events leading to different transitions.
A CTMC travels from state i to state j in time T_ij, an exponentially distributed random variable with parameter μ_ij. μ_ij is called the transition rate from i to j.

17 Equivalence of the two representations
Let T_i = MIN_j {T_ij} and p_ij = P{T_ij = T_i}.
Result to prove: T_i = EXP(Σ_{j≠i} μ_ij), p_ij = μ_ij / Σ_{k≠i} μ_ik, and p_ij is independent of T_i.
Proof tool: the moment generating function M_X(u) = E[exp(uX)].
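
A quick simulation sketch of this result (the three outgoing rates are arbitrary example values): drawing the competing clocks T_ij, the minimum should be exponential with rate Σ_j μ_ij and state j should win with probability μ_ij / Σ_k μ_ik.

import numpy as np

rng = np.random.default_rng(1)
rates = np.array([0.5, 1.0, 2.0])        # assumed transition rates mu_ij out of state i
n = 1_000_000
clocks = rng.exponential(1.0 / rates, size=(n, len(rates)))  # one clock per target state

t_i = clocks.min(axis=1)                 # sojourn time = minimum of the clocks
winner = clocks.argmin(axis=1)           # index of the transition actually taken

# The sojourn time should be exponential with rate sum(rates), i.e. mean 1/sum(rates).
print("mean sojourn:", t_i.mean(), "theory:", 1.0 / rates.sum())
# p_ij should equal mu_ij / sum(mu_ik).
print("empirical p_ij:", np.bincount(winner) / n, "theory:", rates / rates.sum())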

18 Performance analysis of CTMC

19 Probability distribution
State probability: π_i(t) = P{X(t) = i}
State probability vector, also called probability distribution: π(t) = (π_1(t), π_2(t), ...)

20 Transient analysis
By conditioning on X(t),
π_j(t+dt) = Σ_i π_i(t) P{X(t+dt) = j | X(t) = i}
with
P{X(t+dt) = j | X(t) = i} = μ_ij dt + o(dt), for j ≠ i
P{X(t+dt) = i | X(t) = i} = 1 - Σ_{j≠i} μ_ij dt + o(dt)

21 Transient analysis
It can be shown that
π_j(t+dt) = π_j(t) (1 - Σ_{k≠j} μ_jk dt) + Σ_{i≠j} π_i(t) μ_ij dt + o(dt).
Letting dt go to 0,
dπ_j(t)/dt = - π_j(t) Σ_{k≠j} μ_jk + Σ_{i≠j} π_i(t) μ_ij

22 Infinitesimal generator
Let
q_ij = μ_ij for i ≠ j, and q_ii = - Σ_{j≠i} μ_ij.
The matrix Q = [q_ij] is called the infinitesimal generator of the CTMC.
As a result,
dπ(t)/dt = π(t) Q
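
Since dπ(t)/dt = π(t)Q is a linear ODE, its solution is π(t) = π(0) exp(Qt). A minimal Python sketch for a two-state machine (the rates below are illustrative example values only):

import numpy as np
from scipy.linalg import expm

lam, mu = 0.1, 0.5              # assumed failure rate and repair rate
# States: 0 = DOWN, 1 = UP. Rows of Q sum to zero.
Q = np.array([[-mu,  mu],
              [ lam, -lam]])
pi0 = np.array([0.0, 1.0])      # start in the UP state

for t in (1.0, 5.0, 20.0):
    pi_t = pi0 @ expm(Q * t)    # pi(t) = pi(0) exp(Qt)
    print(t, pi_t)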

23 Transient state
Laplace transform of a function f(t): F(s) = ∫_0^∞ e^{-st} f(t) dt.
Example: two-state machine (state 1 = UP, state 0 = DOWN) with failure rate λ and repair rate μ.
[Figure: two-state Markov chain with rate λ from state 1 to state 0 and rate μ from state 0 to state 1.]

24 Transient state
[Figure: transient probability curves for four parameter settings: λ = 0.1, μ = 0.5; λ = 0.9, μ = 0.8; λ = 0.01, μ = 0.05; λ = 0.09, μ = 0.08.]

25 Transient state: numerical computation (uniformization)
1. Uniformization with rate Λ ≥ max_i |q_ii|
2. Derive the embedded discrete-time Markov chain with transition matrix P = I + Q/Λ
3. Determine the distribution π^D_n of (2) after n steps
4. Determine π(t) = Σ_{n≥0} e^{-Λt} (Λt)^n / n! · π^D_n
[Figure: the two-state chain (states 1 and 0) before and after uniformization, with the self-loops introduced by uniformization.]

26 Transient state: numerical computation
Example: λ = 0.1, μ = 1, uniformization with rate Λ = 1.
[Table: transient probabilities p_0(t) and p_1(t) and the number of uniformization steps N for increasing values of t; numerical values not recovered.]
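
A sketch of the four-step recipe above in Python, using the example's values λ = 0.1, μ = 1 and uniformization rate Λ = 1 (the truncation level n_max and the initial distribution are added implementation choices):

import numpy as np

lam, mu = 0.1, 1.0
Lam = 1.0                               # uniformization rate, Lam >= max_i |q_ii|
Q = np.array([[-mu,  mu],
              [ lam, -lam]])            # states: 0 = DOWN, 1 = UP
P = np.eye(2) + Q / Lam                 # embedded (uniformized) DTMC

def transient(pi0, t, n_max=100):
    """pi(t) = sum_n e^{-Lam t} (Lam t)^n / n! * pi0 P^n, truncated at n_max."""
    pi_n = np.array(pi0, dtype=float)
    weight = np.exp(-Lam * t)           # Poisson(Lam t) probability of n = 0
    total = weight * pi_n
    for n in range(1, n_max + 1):
        pi_n = pi_n @ P                 # distribution of the uniformized DTMC after n steps
        weight *= Lam * t / n           # recursive update of the Poisson weight
        total += weight * pi_n
    return total

print(transient([0.0, 1.0], t=5.0))     # starting from the UP state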

27 Steady state distribution of a CTMC
Theorem: for an irreducible CTMC with positive recurrent states, the probability distribution converges to a vector of stationary probabilities (π_1, π_2, ...) that is independent of the initial distribution π(0). Further, it is the unique solution of the following equation system:
π Q = 0 (flow balance equations, or equilibrium equations)
Σ_i π_i = 1 (normalization equation)

28 Flow balance equation
The balance equation is equivalent to:
Σ_{j≠i} π_j μ_ji = Σ_{j≠i} π_i μ_ij, for every state i.
Associate to each transition (i, j) a probability flow π_i μ_ij:
- Σ_{j≠i} π_j μ_ji : total flow into state i
- Σ_{j≠i} π_i μ_ij : total flow out of state i
Interpretation: total flow in = total flow out.

29 Flow balance equation for a set of states
Let E_1 be a subset of states.
Flow balance equation: total flow into E_1 = total flow out of E_1, i.e.
Σ_{i ∉ E_1} Σ_{j ∈ E_1} π_i μ_ij = Σ_{i ∈ E_1} Σ_{j ∉ E_1} π_i μ_ij.

30 A manufacturing system
Consider a machine which can be either UP or DOWN. The state of the machine is checked continuously. The average time to failure of an UP machine is 10 days. The average time to repair a DOWN machine is 1.5 days.
- Determine the conditions for the state of the machine {X(t)} to be a Markov chain.
- Draw the Markov chain model.
- Find the transient distribution starting from state UP and from state DOWN.
- Check whether the Markov chain is recurrent.
- Determine the steady-state distribution.
- Determine the availability of the machine.
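
A minimal Python sketch for the last two questions, assuming exponential times with failure rate λ = 1/10 per day and repair rate μ = 1/1.5 per day derived from the stated mean times:

import numpy as np

lam = 1 / 10.0                  # failure rate: mean time to failure = 10 days
mu = 1 / 1.5                    # repair rate: mean time to repair = 1.5 days
# States: 0 = DOWN, 1 = UP
Q = np.array([[-mu,  mu],
              [ lam, -lam]])

# Solve pi Q = 0 together with sum(pi) = 1 (least squares on the stacked system).
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady state:", pi)              # approx (0.1304, 0.8696)
print("availability:", pi[1])           # P{machine UP} = mu / (lam + mu)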

31 Poisson process

32 Poisson process
A Poisson process is a stochastic process N(t) such that:
- N(0) = 0
- N(t) increments by +1 after a random time T distributed according to an exponential distribution of parameter λ.
An arrival process is said to be Poisson if the inter-arrival times are exponentially distributed.

33 Properties of the Poisson process
A Poisson process is a CTMC.
N(t) has a Poisson distribution with parameter λt:
P{N(t) = n} = e^{-λt} (λt)^n / n!
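
A small simulation sketch of this property (λ = 2 and t = 3 are arbitrary example values): counting exponentially spaced arrivals up to time t should reproduce a Poisson(λt) count.

import numpy as np

rng = np.random.default_rng(2)
lam, t = 2.0, 3.0
n_runs, n_max = 100_000, 50              # 50 inter-arrival times >> expected lam*t arrivals

inter = rng.exponential(1.0 / lam, size=(n_runs, n_max))
arrival_times = np.cumsum(inter, axis=1)
counts = (arrival_times <= t).sum(axis=1)   # N(t) in each run

print("empirical mean / var of N(t):", counts.mean(), counts.var())
print("theory (Poisson with mean lam*t):", lam * t)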

34 Properties of the Poisson process
A Poisson process is a CTMC.
- Probability of one arrival in dt: P{N(t+dt) = k+1 | N(t) = k} = λ dt + o(dt)
- Probability of 0 arrivals in dt: P{N(t+dt) = k | N(t) = k} = 1 - λ dt + o(dt)
- Probability of more than one arrival in dt: P{N(t+dt) > k+1 | N(t) = k} = o(dt)

35 Properties of the Poisson process
The superposition of n Poisson processes of parameters λ_i is a Poisson process of parameter Σ_i λ_i.
Assume that a Poisson process of parameter λ is split into n processes, each arrival being routed to process i with probability p_i. These n processes are independent Poisson processes with parameters p_i λ.
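
A quick sketch checking the superposition property for two streams (the rates are arbitrary example values): merging two simulated Poisson streams gives inter-arrival times with mean 1/(λ_1 + λ_2).

import numpy as np

rng = np.random.default_rng(3)
lam1, lam2 = 1.0, 2.5
n = 100_000

t1 = np.cumsum(rng.exponential(1.0 / lam1, n))   # arrival times of stream 1
t2 = np.cumsum(rng.exponential(1.0 / lam2, n))   # arrival times of stream 2
T = min(t1[-1], t2[-1])                          # common observation horizon
merged = np.sort(np.concatenate([t1[t1 <= T], t2[t2 <= T]]))
gaps = np.diff(merged)

print("mean inter-arrival:", gaps.mean(), "theory:", 1.0 / (lam1 + lam2))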

36 Birth-Death process

37 Definition
Consider a population of individuals. Let N(t) be the size of the population, with N(t) = 0, 1, 2, ...
When N(t) = n, births arrive according to a Poisson process of birth rate λ_n > 0, and deaths arrive according to a Poisson process of death rate μ_n > 0.

38 Key issues
- Graphic representation of the Markov chain
- Relation with the Poisson process (also called a pure birth process)
- Condition for the existence of a steady-state distribution; sufficient condition: death rates larger than birth rates
- Steady-state distribution π_n
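
For the last point, the flow balance across the cut between states n and n+1 gives π_n λ_n = π_{n+1} μ_{n+1}, hence π_n = π_0 Π_{k=0}^{n-1} λ_k / μ_{k+1}. A minimal Python sketch, truncating at a finite level and using constant rates (the M/M/1 special case) as an assumed example:

import numpy as np

def birth_death_steady_state(birth, death, n_max):
    """pi_n proportional to prod_{k<n} birth(k)/death(k+1), truncated at n_max."""
    pi = np.empty(n_max + 1)
    pi[0] = 1.0
    for n in range(1, n_max + 1):
        pi[n] = pi[n - 1] * birth(n - 1) / death(n)
    return pi / pi.sum()

lam, mu = 0.8, 1.0                       # assumed constant birth (arrival) and death (service) rates
pi = birth_death_steady_state(lambda n: lam, lambda n: mu, n_max=200)
print("P{N = 0}:", pi[0], "theory (M/M/1): 1 - lam/mu =", 1 - lam / mu)
print("E[N]:", (np.arange(len(pi)) * pi).sum(), "theory:", (lam / mu) / (1 - lam / mu))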

39 Approximation of general distributions by phase type distribution

40 Phase-type distribution
A probability distribution that results from a system of one or more inter-related Poisson processes occurring in sequence, or phases. The sequence in which the phases occur may itself be a stochastic process.
Phase-type distribution = time until absorption of a CTMC with one absorbing state and m transient states. Each of the transient states of the Markov process represents one of the phases.
Phase-type distributions can be used to approximate any positive-valued distribution.

41 Definition
A CTMC with m+1 states, where m ≥ 1, such that states 1, ..., m are transient and state m+1 is absorbing.
An initial probability of starting in any of the m+1 phases is given by the probability vector (α, α_{m+1}).
The continuous phase-type distribution is the distribution of the time from the start of the above process until absorption in the absorbing state.
This process can be written in the form of a transition rate matrix
Q = [ S  S^0 ; 0  0 ]
where S is an m×m matrix and S^0 = -S·1, with 1 an m×1 vector with every element equal to 1.

42 Characterization
The time X until absorption is phase-type distributed, denoted PH(α, S).
The distribution function of X is F(x) = 1 - α exp(Sx) 1, and the density function is f(x) = α exp(Sx) S^0, for all x > 0.
Moments: E[X^n] = (-1)^n n! α S^{-n} 1.
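
These formulas translate directly into a small numerical sketch (the two-phase α and S below are arbitrary example values; scipy's expm computes the matrix exponential):

import numpy as np
from math import factorial
from scipy.linalg import expm

alpha = np.array([0.7, 0.3])            # initial distribution over the transient phases
S = np.array([[-2.0,  1.0],
              [ 0.0, -3.0]])            # sub-generator restricted to the transient phases
S0 = -S @ np.ones(2)                    # exit rates to the absorbing state

def ph_cdf(x):
    return 1.0 - alpha @ expm(S * x) @ np.ones(2)    # F(x) = 1 - alpha exp(Sx) 1

def ph_pdf(x):
    return alpha @ expm(S * x) @ S0                  # f(x) = alpha exp(Sx) S^0

def ph_moment(n):
    # E[X^n] = (-1)^n n! alpha S^{-n} 1
    return (-1) ** n * factorial(n) * alpha @ np.linalg.matrix_power(np.linalg.inv(S), n) @ np.ones(2)

print("F(1) =", ph_cdf(1.0), " f(1) =", ph_pdf(1.0), " E[X] =", ph_moment(1))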

43 Erlang distribution
E_k: k-stage Erlang distribution with parameter μ.
X = sum of k independent exponential random variables with parameter μ.
E[X] = k/μ, Var[X] = k/μ², cv_X = σ_X / E[X] = 1/k^{1/2}.
[Figure: k identical exponential phases with rate μ in series.]

44 Serial phase distribution
n exponential phases in series with rates μ_1, ..., μ_n (not necessarily identical phases).
To be proved: an exact two-moment matching of a r.v. X with an n-stage serial phase distribution exists iff 1/n ≤ cv_X² ≤ 1.
[Figure: n non-identical exponential phases in series.]

45 Hyper-exponential or mixture of exponential distribution
X = I_1 X_1 + I_2 X_2 + ... + I_n X_n, where exactly one I_i equals 1, P(I_i = 1) = α_i, P(I_i = 0) = 1 - α_i, α_1 + ... + α_n = 1, and X_i = EXP(μ_i).
E[X] = α_1/μ_1 + α_2/μ_2 + ... + α_n/μ_n
E[X²] = 2α_1/μ_1² + 2α_2/μ_2² + ... + 2α_n/μ_n²
Two-moment matching of a random variable X with mean m and variance σ² ≥ m²:
X = I_1 X_1 + I_2 X_2, where X_1 = EXP(μ_1) and X_2 = 0.
By Jensen's inequality, E[X²] = 2 Σ_i α_i/μ_i² ≥ 2 (Σ_i α_i/μ_i)² = 2 E[X]², hence Var[X] ≥ E[X]² and cv_X ≥ 1 for any hyper-exponential distribution.
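
A hedged sketch of this particular matching: requiring E[X] = α_1/μ_1 = m and E[X²] = 2α_1/μ_1² = σ² + m² gives μ_1 = 2m/(σ² + m²) and α_1 = m μ_1. The target moments below are arbitrary example values.

import numpy as np

def match_two_moments(m, var):
    """Return (alpha1, mu1) so that X = EXP(mu1) w.p. alpha1 and X = 0 otherwise
    has mean m and variance var (requires var >= m**2, i.e. cv >= 1)."""
    assert var >= m ** 2, "this mixture can only match cv >= 1"
    mu1 = 2 * m / (var + m ** 2)
    alpha1 = m * mu1
    return alpha1, mu1

m, var = 2.0, 10.0                       # assumed target mean and variance
alpha1, mu1 = match_two_moments(m, var)

rng = np.random.default_rng(4)
keep = rng.random(1_000_000) < alpha1
x = np.where(keep, rng.exponential(1.0 / mu1, size=1_000_000), 0.0)
print("sample mean/var:", x.mean(), x.var(), "targets:", m, var)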

46 Coxian distribution
[Figure: n exponential phases with rates μ_1, ..., μ_n in series; after phase i the process continues to phase i+1 with probability p_i and exits with probability 1 - p_i; the last phase always exits.]
The Coxian distribution can be used to approximate any distribution. Phases are not necessarily identical.
Theorem: there exists an exact two-moment matching with an n-stage Coxian distribution for a random variable X iff cv_X² ≥ 1/n.

47 Coxian distribution
[Figure: a special n-stage Coxian with rates μ_1, μ_2, ..., μ_2 and routing probabilities p_1 = p, p_2 = ... = p_{n-1} = 1.]
This structure also gives a two-moment matching with an n-stage serial distribution.

48 A manufacturing system
Consider a machine which can be either UP or DOWN. The state of the machine is checked continuously. The average time to failure of an UP machine is 10 days. The average time to repair a DOWN machine is 1.5 days. Assume that the UP time is E_2 and the DOWN time is E_3 (Erlang) distributed.
- Draw the Markov chain model.
- Homework: solve the same problem with Coxian distributions having the same means but different coefficients of variation.