Matrix Analytic methods in Markov Modelling


Matrix Analytic methods in Markov Modelling

Continuous Time Markov Models. X: R → X ⊆ Z (the integers); X(t) is the state at time t; X denotes the state space (discrete, i.e. countable); R is the real numbers (the continuous time line).

Continuous Time Markov Models. X is piecewise constant. X is typically càdlàg ("continue à droite, limite à gauche") = RCLL ("right continuous with left limits").

Transition probabilities. P(X(t+h)=j | X(t)=i) ≈ h λ_ij for i ≠ j. P(X(t+h)=j | X(t)=j) = 1 − Σ_{i≠j} P(X(t+h)=i | X(t)=j) ≈ 1 − h Σ_{i≠j} λ_ji. Unconditioning on the state at time t: P(X(t+h)=j) = Σ_i P(X(t+h)=j and X(t)=i) = Σ_i P(X(t+h)=j | X(t)=i) P(X(t)=i) ≈ Σ_{i≠j} λ_ij h P(X(t)=i) + (1 − h Σ_{i≠j} λ_ji) P(X(t)=j).

Transition probabilities. Collect the state probabilities in the row vector P(t) = [P(X(t)=0), P(X(t)=1), P(X(t)=2), …]. The expansion above then reads P(t+h) ≈ P(t)H with H = I + hQ, where Q_ij = λ_ij for i ≠ j and Q_jj = −Σ_{i≠j} λ_ji (so that H_jj = 1 − h Σ_{i≠j} λ_ji).

Taking limits. P(t+h) ≈ P(t)H, H = I + hQ. So P(t+h) ≈ P(t)(I + hQ) = P(t) + hP(t)Q, i.e. (P(t+h) − P(t))/h ≈ P(t)Q. Letting h → 0: d/dt P(t) = P(t)Q, with solution P(t) = P(0) exp(Qt).
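
As a concrete illustration (a minimal sketch; the 2-state generator and the rates a = 0.3, b = 0.7 are assumed here, not taken from the slides), P(t) = P(0) exp(Qt) can be evaluated with a matrix exponential:

```python
# Sketch: transient distribution P(t) = P(0) exp(Qt) for an assumed 2-state
# chain with rates a = 0.3 (state 0 -> 1) and b = 0.7 (state 1 -> 0).
import numpy as np
from scipy.linalg import expm

a, b = 0.3, 0.7
Q = np.array([[-a,  a],
              [ b, -b]])            # each row of a generator sums to 0
P0 = np.array([1.0, 0.0])           # start in state 0

def P(t):
    """Row vector of state probabilities at time t."""
    return P0 @ expm(Q * t)
```

For large t, P(t) approaches the stationary vector (b, a)/(a+b) = (0.7, 0.3), as the next slides discuss.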

Irreducibility. X is irreducible when all states are mutually reachable. Formally, X is irreducible iff for every i, j ∈ X there is a sequence i(1), i(2), …, i(N) ∈ X such that i(1) = i, i(N) = j and λ_{i(k),i(k+1)} > 0 for every k ∈ {1, …, N−1}.
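
For a finite state space this reachability condition is easy to test (a sketch; the two toy generators below are assumed examples): build the directed graph with an edge i → j whenever λ_ij > 0 and take its transitive closure.

```python
# Sketch: irreducibility of a finite generator = mutual reachability in the
# directed graph with an edge i -> j whenever lambda_ij > 0.
import numpy as np

def irreducible(Q):
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[0]
    reach = (Q > 0).astype(int) + np.eye(n, dtype=int)
    for _ in range(n):                        # repeated squaring reaches closure
        reach = np.minimum(reach @ reach, 1)
    return bool((reach > 0).all())

# assumed examples: a birth/death chain (mutually reachable) and a pure
# birth chain (can only move up, so not irreducible)
Q_bd = [[-1, 1, 0], [2, -3, 1], [0, 2, -2]]
Q_birth = [[-1, 1, 0], [0, -1, 1], [0, 0, 0]]
```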

Recurrence. If X(t_n−) = j and X(t_n) = i ≠ j then t_n is a transition time. Let {t_n} be the sequence of consecutive transition times; {X_n = X(t_n)} is called the embedded chain. Let t_0 = 0 and X_0 = i, and define k(i) = inf{k > 0: X_k = i}. Then T_i = t_{k(i)} is the time of the next visit to i. X is recurrent if P(T_i < ∞) = 1 for all i (almost certain return to every state). X is positive recurrent if E(T_i) < ∞ for all i.

Stationary probability. Recall d/dt P(t) = P(t)Q and P(t) = P(0) exp(Qt). When X is irreducible and positive recurrent there is a unique probability vector Π such that P(t) → Π, and Π solves ΠQ = 0 (together with Σ_i Π_i = 1).
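
Numerically, Π can be found for a finite chain by solving ΠQ = 0 together with the normalisation Σ_i Π_i = 1 as one overdetermined linear system (a sketch; the 3-state generator below is an assumed example):

```python
# Sketch: stationary vector of a finite generator from pi Q = 0, sum(pi) = 1.
import numpy as np

Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])   # assumed 3-state generator

# pi [Q | 1] = [0 | 1]: append the normalisation as an extra column
A = np.hstack([Q, np.ones((3, 1))])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A.T, b, rcond=None)
```

For this particular Q the balance equations give Π = (0.25, 0.5, 0.25).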

Stationary probability. Ergodicity: Π_i = E(D_i)/E(T_i), where D_i is the time spent in state i during one return cycle of length T_i. Π_i is thus the long-run fraction of the time in state i, which is statistically intuitively appealing. [Figure: a sample path of X(t) against time, marking a sojourn D_i in state i and the return time T_i.]
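
The identity Π_i = E(D_i)/E(T_i) can be checked by simulation. This sketch uses an assumed two-state chain (exit rates a = 0.3 and b = 0.7), where the embedded chain simply alternates between the states, so a cycle is one sojourn in each state:

```python
# Monte Carlo check that pi_i equals the long-run fraction of time in state i.
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.3, 0.7                           # exit rates of states 0 and 1
n = 100_000                               # number of alternating cycles
d0 = rng.exponential(1.0 / a, n)          # sojourn times D_0 in state 0
d1 = rng.exponential(1.0 / b, n)          # sojourn times D_1 in state 1
frac0 = d0.sum() / (d0.sum() + d1.sum())  # empirical fraction of time in 0
# theory: pi_0 = E(D_0)/E(T_0) = (1/a) / (1/a + 1/b) = b/(a+b) = 0.7
```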

Example: Poisson Counting Process. Q_{i,i+1} = λ. [Chain diagram: states 0, 1, 2, 3, … with rate-λ arrows to the right.] Counts Poisson events. A birth chain: not irreducible, not recurrent.

Example: Birth/Death (BD) chain. Q_{i,i+1} = λ, Q_{i+1,i} = μ. [Chain diagram: states 0, 1, 2, 3, … with rate-λ arrows to the right and rate-μ arrows to the left.] Models a queueing system with Poisson arrival process and independent exponentially distributed service times: λ is the arrival rate, μ is the service rate. Irreducible; positive recurrent for λ < μ.

Example: Birth/Death (BD) chain. With ρ = λ/μ the balance equations give Π_n = ρ Π_{n−1}, so Π_n = ρ^n Π_0. Normalising, Π_0 = 1/Σ_{n=0}^∞ ρ^n = 1 − ρ, hence Π_n = ρ^n (1 − ρ) and E(X) = Σ_{n=0}^∞ n Π_n = ρ/(1 − ρ).

Example: Birth/Death (BD) chain with level-dependent rates. [Chain diagram: rate-λ_n arrows to the right and rate-μ_n arrows to the left.] With ρ_n = λ_n/μ_n: Π_n = ρ_n Π_{n−1}, so Π_n = (∏_{i=1}^n ρ_i) Π_0 and Π_0 = 1/Σ_{n=0}^∞ ∏_{i=1}^n ρ_i (with the empty product for n = 0 equal to 1).
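
A quick numerical check of these product formulas, truncating the state space at N (the constant rates below are an assumption, which reduces the chain to M/M/1 with ρ = 0.5):

```python
# Sketch: stationary distribution of a birth/death chain by truncation at N.
import numpy as np

N = 400
lam = np.full(N, 1.0)                 # lambda_n (assumed constant -> M/M/1)
mu = np.full(N, 2.0)                  # mu_n
rho = lam / mu                        # rho_n = lambda_n / mu_n

prod = np.concatenate(([1.0], np.cumprod(rho)))   # prod_{i=1..n} rho_i, n = 0..N
pi = prod / prod.sum()                            # normalised: pi_n
mean_len = (np.arange(N + 1) * pi).sum()          # ~ rho/(1-rho) = 1 for M/M/1
```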

Markov Modulated Poisson Process. Has two modes: Modes = {ON, OFF}. M(t) ∈ Modes is a two-state CTMC. Transmits with rate λ in ON mode. The counting process combines the state spaces, i.e. X = Modes × {0, 1, 2, …}: Q_{(ON,i),(ON,i+1)} = λ, Q_{(ON,i),(OFF,i)} = β, Q_{(OFF,i),(ON,i)} = α, Q_{i,j} = 0 otherwise.

Markov Modulated Poisson Process. [Transition diagram: an OFF row and an ON row of counting states 0, 1, 2, 3, …, with rate-λ arrows along the ON row, rate-β arrows from ON down to OFF, and rate-α arrows from OFF up to ON.]

MMPP with exponential service. Q_{(ON,i),(ON,i+1)} = λ, Q_{(ON,i),(OFF,i)} = β, Q_{(OFF,i),(ON,i)} = α, Q_{(ON,i),(ON,i−1)} = μ, Q_{(OFF,i),(OFF,i−1)} = μ, Q_{i,j} = 0 otherwise (service continues in both modes). [Transition diagram: as for the MMPP, plus rate-μ arrows to the left in both rows.]

State ordering. For a state (i, M) we call i the level of the state. We order states so that equal levels are grouped together: (i,OFF), (i,ON), (i+1,OFF), (i+1,ON), (i+2,OFF), (i+2,ON), …

Generator matrix. [Slide shows the full generator Q written out; it is block tridiagonal in the level ordering.]

Submatrices. [Slide defines the blocks A0 (level 0), A (within a level), B (one level down) and C (one level up).]

Generator matrix by submatrices. Balance equations: P_0 A_0 + P_1 B = 0 (eq 0); P_0 C + P_1 A + P_2 B = 0 (eq 1); P_i C + P_{i+1} A + P_{i+2} B = 0 (eq i+1). We look for a matrix-geometric solution, i.e. P_1 = P_0 R and P_{i+1} = P_i R. Inserting in eq(0): P_0 A_0 + P_0 R B = 0, and in eq(i+1): P_i (C + R A + R² B) = 0 for all P_i.
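
To make the block structure concrete, here is a sketch assembling a truncated generator from submatrices A0, A, B, C. The 2×2 blocks use assumed MMPP/M/1-style rates (λ = 1, μ = 2, α = 0.5, β = 0.25), not values from the slides:

```python
# Sketch: block-tridiagonal generator Q from boundary block A0, up-block C,
# local block A and down-block B, truncated at `levels` levels.
import numpy as np

lam, mu, alpha, beta = 1.0, 2.0, 0.5, 0.25    # assumed rates
# phase order within a level: (OFF, ON)
C = np.array([[0.0, 0.0], [0.0, lam]])        # level i -> i+1 (arrival, ON only)
B = np.array([[mu, 0.0], [0.0, mu]])          # level i -> i-1 (departure)
A = np.array([[-(alpha + mu), alpha],
              [beta, -(beta + lam + mu)]])    # phase changes + diagonal
A0 = np.array([[-alpha, alpha],
               [beta, -(beta + lam)]])        # level 0: no departures

def build_Q(levels):
    Q = np.zeros((2 * levels, 2 * levels))
    for i in range(levels):
        s = slice(2 * i, 2 * i + 2)
        Q[s, s] = A0 if i == 0 else A
        if i + 1 < levels:
            Q[s, slice(2 * i + 2, 2 * i + 4)] = C
        if i > 0:
            Q[s, slice(2 * i - 2, 2 * i)] = B
    return Q

Q = build_Q(5)
# every row sums to 0 except the ON row of the top level, whose upward
# block C has been cut off by the truncation
```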

Solving for R. P_i (C + R A + R² B) = 0 for all P_i, so it is sufficient that C + R A + R² B = 0 (a matrix Riccati equation). Iterative solution: R_0 = 0; repeat R_{n+1} = −(C + R_n² B) A^{−1}. Converges for irreducible, positive recurrent Q.
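
A sketch of this iteration on assumed blocks (the illustrative MMPP/M/1-style rates λ = 1, μ = 2, α = 0.5, β = 0.25; the mean arrival rate λα/(α+β) = 2/3 is below μ, so the chain is positive recurrent and the iteration converges):

```python
# Fixed-point iteration R_{n+1} = -(C + R_n^2 B) A^{-1} for C + RA + R^2 B = 0.
import numpy as np

lam, mu, alpha, beta = 1.0, 2.0, 0.5, 0.25    # assumed rates, (OFF, ON) order
C = np.array([[0.0, 0.0], [0.0, lam]])
B = np.array([[mu, 0.0], [0.0, mu]])
A = np.array([[-(alpha + mu), alpha],
              [beta, -(beta + lam + mu)]])

Ainv = np.linalg.inv(A)
R = np.zeros((2, 2))
for _ in range(1000):
    R_next = -(C + R @ R @ B) @ Ainv
    if np.abs(R_next - R).max() < 1e-13:
        R = R_next
        break
    R = R_next

residual = C + R @ A + R @ R @ B      # ~ 0 at the fixed point
```

The iterates increase monotonically from 0 to the minimal nonnegative solution, whose spectral radius is below 1 in the stable case.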

MM service. Q_{(ON,i),(ON,i+1)} = λ, Q_{(OFF,i),(OFF,i+1)} = λ, Q_{(ON,i),(OFF,i)} = β, Q_{(OFF,i),(ON,i)} = α, Q_{(ON,i),(ON,i−1)} = μ, Q_{(OFF,i),(OFF,i−1)} = 0, Q_{i,j} = 0 otherwise (arrivals in both modes, service only in ON mode). [Transition diagram: rate-λ arrows to the right in both rows, rate-μ arrows to the left in the ON row only.]

Generator matrix. [Slide shows the generator Q for the Markov-modulated service model.]

Submatrices. [Slide defines the corresponding submatrices.]

Generally. We still look for a matrix-geometric solution. Now the equation is Σ_{i=0}^∞ R^i A_i = 0, i.e. A_0 + R A_1 + Σ_{i=2}^∞ R^i A_i = 0. Iteration: R_n = −(A_0 + Σ_{i=2}^∞ R_{n−1}^i A_i) A_1^{−1}. Solving for P_0: P_0 Σ_{i=0}^∞ R^i B_i = 0. Conditions for a solution: irreducibility and positive recurrence.
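
The general iteration can be sketched for a model whose block sequence terminates after finitely many terms. Here it is demonstrated on the tridiagonal special case A_0 = C, A_1 = A, A_2 = B with assumed MMPP/M/1-style blocks (λ = 1, μ = 2, α = 0.5, β = 0.25), so the result must satisfy C + RA + R²B = 0:

```python
# Sketch: R_n = -(A_0 + sum_{i>=2} R_{n-1}^i A_i) A_1^{-1} for a finite
# block list [A_0, A_1, A_2, ...].
import numpy as np

def solve_R(blocks, iters=1000):
    """Minimal solution of sum_i R^i A_i = 0 by fixed-point iteration."""
    A0, A1, *higher = blocks
    A1inv = np.linalg.inv(A1)
    R = np.zeros_like(A0)
    for _ in range(iters):
        acc = A0.copy()
        power = R @ R                  # R^2
        for Ai in higher:              # A_2, A_3, ...
            acc = acc + power @ Ai
            power = power @ R
        R = -acc @ A1inv
    return R

# assumed tridiagonal example: A_0 = C, A_1 = A, A_2 = B
lam, mu, alpha, beta = 1.0, 2.0, 0.5, 0.25
C = np.array([[0.0, 0.0], [0.0, lam]])
B = np.array([[mu, 0.0], [0.0, mu]])
A = np.array([[-(alpha + mu), alpha],
              [beta, -(beta + lam + mu)]])
R = solve_R([C, A, B])
```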

Miniproject (i). Let traffic be generated by an ON/OFF Markov process with ON rate λ = 1, mean rate 0.1 λ, and average ON time T = 1. Let service be exponential with rate μ = 0.2 λ. Construct the generator matrix for the data given above. Use the iterative algorithm to solve for the load matrix R. Solve for P_0 and P_i. Compute the mean queue length. Compare with M/M/1 results.
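
A hedged sketch of one possible solution path. The parameter derivation is my reading of the problem statement, not given on the slides: mean ON time T = 1 gives β = 1/T = 1, mean rate 0.1λ gives ON fraction α/(α+β) = 0.1 and hence α = 1/9, and μ = 0.2:

```python
# Sketch of miniproject (i): build the MMPP/M/1 blocks, iterate for R,
# solve P0 from P0 (A0 + R B) = 0 with P0 (I-R)^{-1} 1 = 1, and compute
# the mean queue length P0 R (I-R)^{-2} 1.
import numpy as np

lam, mu = 1.0, 0.2
beta = 1.0                      # 1 / (mean ON time)
alpha = beta * 0.1 / 0.9        # from ON fraction alpha/(alpha+beta) = 0.1
# phase order within a level: (OFF, ON); service runs in both modes
C = np.array([[0.0, 0.0], [0.0, lam]])
B = mu * np.eye(2)
A = np.array([[-(alpha + mu), alpha],
              [beta, -(beta + lam + mu)]])
A0 = np.array([[-alpha, alpha],
               [beta, -(beta + lam)]])

# load matrix R from the Riccati iteration
Ainv = np.linalg.inv(A)
R = np.zeros((2, 2))
for _ in range(5000):
    R = -(C + R @ R @ B) @ Ainv

I = np.eye(2)
ones = np.ones(2)
norm = np.linalg.inv(I - R) @ ones            # normalisation column
M = A0 + R @ B                                # boundary balance (singular)
lhs = np.column_stack([M[:, 0], norm])        # keep one balance eq + norm
P0 = np.linalg.solve(lhs.T, np.array([0.0, 1.0]))
mean_len = P0 @ R @ np.linalg.matrix_power(np.linalg.inv(I - R), 2) @ ones
# for comparison: M/M/1 at the same load has rho = 0.1/0.2 = 0.5 and
# mean queue length rho/(1-rho) = 1; the bursty source should do worse
```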

Miniproject (ii). Collect file size or web session duration data. Check for power tails and estimate the tail power. Find appropriate parameters for a hyperexponential approximation of the estimated reliability (survival) function. Construct the generator matrix of an equivalent ME/M/1 queue.
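
One possible starting point for the hyperexponential step (a sketch; the function and the parameter values are placeholders, and real parameters must be fitted to the collected data, e.g. by least squares against the empirical survival curve):

```python
# Two-phase hyperexponential survival (reliability) function
# R(t) = p exp(-r1 t) + (1 - p) exp(-r2 t): a mixture of a "fast" and a
# "slow" exponential phase, which can mimic a heavy tail over a finite range.
import numpy as np

def hyperexp_survival(t, p, r1, r2):
    t = np.asarray(t, dtype=float)
    return p * np.exp(-r1 * t) + (1 - p) * np.exp(-r2 * t)

# placeholder parameters, to be replaced by fitted values
p, r1, r2 = 0.9, 1.0, 0.01
```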