Burke's Theorem, Reversibility, and Jackson Networks of Queues

Presentation transcript:

1 Burke's Theorem, Reversibility, and Jackson Networks of Queues

2 Reverse CTMC
The basic idea is to run time in reverse: departures become arrivals and vice versa. The reverse chain is also a CTMC.
Some basic properties of the reverse chain:
- The fraction of time spent in state i in the forward chain is the same as the fraction of time spent in state i in the reverse chain: π*_i = π_i
- The rate of transitions from i to j in the reverse chain is equal to the rate of transitions from j to i in the forward chain: π*_i q*_ij = π_j q_ji, i.e., q*_ij = π_j q_ji / π_i
- If a CTMC is time-reversible, i.e., π_i q_ij = π_j q_ji and Σ_i π_i = 1, then the forward and reverse chains are statistically identical and q*_ij = q_ij
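The reversed-rate formula q*_ij = π_j q_ji / π_i can be checked numerically on a small chain. A minimal sketch, using a truncated M/M/1 queue (states 0..3) with illustrative rates λ = 1, μ = 2 that are assumptions, not from the slides:

```python
from fractions import Fraction as F

# Truncated M/M/1 birth-death chain, states 0..3; rates are illustrative.
lam, mu = F(1), F(2)
n = 4
q = [[F(0)] * n for _ in range(n)]   # q[i][j]: forward transition rates
for i in range(n - 1):
    q[i][i + 1] = lam                # arrival: i -> i+1
    q[i + 1][i] = mu                 # departure: i+1 -> i

# Stationary distribution: pi_i proportional to rho^i for this chain.
rho = lam / mu
w = [rho ** i for i in range(n)]
pi = [x / sum(w) for x in w]

# Reversed-chain rates: q*_ij = pi_j * q_ji / pi_i.
q_rev = [[pi[j] * q[j][i] / pi[i] for j in range(n)] for i in range(n)]

# Birth-death chains satisfy detailed balance (pi_i q_ij = pi_j q_ji),
# so the chain is time-reversible and the reversed rates equal q itself.
assert q_rev == q
```

The final assertion is exactly the time-reversibility statement above: for a birth-death chain the forward and reverse generators coincide.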

3 Burke's Theorem
Consider an M/M/k system with arrival rate λ. At steady state, the following holds:
- The departure process is Poisson(λ)
- At all times t, the number of jobs in the system at t, N(t), is independent of the sequence of departure times prior to time t
Implications:
- Tandem M/M/k queues can be analyzed as independent M/M/k queues
- Acyclic networks of M/M/k queues with probabilistic routing can be analyzed as networks of independent M/M/k queues, since the arrival process to each queue is Poisson
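A rough simulation sketch (an illustration, not a proof) of the k = 1 case: in a stable M/M/1 queue the long-run departure rate equals the arrival rate λ, consistent with the departure process being Poisson(λ). All numeric values below are illustrative assumptions:

```python
import random

# Simulate an M/M/1 queue with illustrative rates lam = 1, mu = 2.
random.seed(0)
lam, mu, n_jobs = 1.0, 2.0, 200_000

t_arr = 0.0       # arrival time of the current job
free_at = 0.0     # time at which the single server next becomes free
deps = []
for _ in range(n_jobs):
    t_arr += random.expovariate(lam)        # Poisson(lam) arrival stream
    start = max(t_arr, free_at)             # wait if the server is busy
    free_at = start + random.expovariate(mu)
    deps.append(free_at)                    # departure time of this job

# The mean inter-departure time should approach 1/lam = 1.0.
mean_interdep = (deps[-1] - deps[0]) / (len(deps) - 1)
```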

4 The Case of Cyclic Networks
Burke's Theorem does not hold for cyclic networks (the arrival process is not Poisson). Consider the following example:
- Very low external arrival rate: Poisson(λ ≈ 0)
- A very fast server with rate μ
- Most jobs repeat: after service, a job is fed back with probability 0.99 and leaves with probability 0.01
The independent increments property is clearly not valid (the probability of another arrival is much higher just following an arrival), so the arrival process is not Poisson.
The system can, however, still be represented by a Markov chain (a resubmitted job is indistinguishable from a new one), so there is hope.

5 Markov Chain for Cyclic Network
The chain has states 0, 1, 2, 3, 4, …, with transition rate λ from n to n+1 and rate 0.01μ from n+1 to n (a job truly departs only with probability 0.01): the same Markov chain as a standard M/M/1 queue with ρ = λ/(0.01μ).
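The equivalence means the cyclic system can be analyzed with plain M/M/1 formulas at the effective service rate 0.01μ. A small sketch with illustrative values of λ and μ (assumptions, not from the slides):

```python
# M/M/1 model of the cyclic network: effective service rate is 0.01*mu.
lam, mu, p_exit = 0.5, 100.0, 0.01           # illustrative numbers
rho = lam / (p_exit * mu)                    # utilization of the M/M/1 model
pi = [(1 - rho) * rho ** n for n in range(10)]   # P{n jobs in system}
EN = rho / (1 - rho)                         # mean number in system
```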

6 Jackson Networks
A Jackson network is a general set of k single-server queues with infinite waiting rooms, probabilistic routing, and exponentially distributed service times:
- External arrivals, if any (open networks), are Poisson
- The total arrival rate at each server is the sum of the external arrival rate and the internal transition rates into that server; these rates can be computed by solving a set of linear equations
Solving Jackson networks:
- Write the global balance equations: the rate of leaving state (n1,n2,…,nk) must equal the rate of entering it, with terms due to an outside arrival, a departure from a server (to the outside or to another server), and an arrival at a server from another server
- Identify "local" balance equations (the art is in figuring out what should balance what)
- Guess a solution for those balance equations and verify it

7 Jackson Networks – Main Results
An open Jackson network with k servers has product form:
P(n1,n2,…,nk) = ρ1^n1 (1-ρ1) ρ2^n2 (1-ρ2) … ρk^nk (1-ρk)
where ρi = λi/μi and λ1,λ2,…,λk is the solution of the set of equations λi = ri + Σj λj Pji (ri is the external arrival rate to server i, and λj Pji is the fraction of departures from server j going to server i).
All queues behave like M/M/1 queues even though the arrival process is not Poisson.
A closed Jackson network with k servers has product form:
P(n1,n2,…,nk) = C ρ1^n1 ρ2^n2 … ρk^nk
where ρi = λi/μi, C = [Σ_{Σi ni = N} ρ1^n1 ρ2^n2 … ρk^nk]^-1 is a normalization constant, and the λi's are any solution to the simultaneous rate equations λi = Σj λj Pji.
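The traffic equations λi = ri + Σj λj Pji can be solved by simple fixed-point iteration, which converges for open networks (every job eventually leaves). A sketch on a two-server network whose rates and routing matrix are illustrative assumptions, not from the slides:

```python
# Illustrative two-server open Jackson network.
r = [1.0, 0.0]                  # external Poisson arrival rates r_i
P = [[0.0, 0.5],                # P[i][j]: routing probability from i to j;
     [0.2, 0.0]]                # unrouted probability mass leaves the network
mu = [3.0, 2.0]
k = len(r)

lam = r[:]
for _ in range(200):            # iterate lam <- r + P^T lam to convergence
    lam = [r[i] + sum(lam[j] * P[j][i] for j in range(k)) for i in range(k)]
rho = [lam[i] / mu[i] for i in range(k)]

def prob(ns):
    """Product-form probability of the joint state (n1, ..., nk)."""
    p = 1.0
    for i, n in enumerate(ns):
        p *= (1 - rho[i]) * rho[i] ** n
    return p
```

Because each queue behaves like an independent M/M/1 queue, the joint probabilities sum to 1 over all states.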

8 Open Network Example
A CPU (server 1, rate μ1) receives external Poisson(λ) arrivals; after CPU service a job leaves with probability p and goes to the I/O server (server 2, rate μ2) with probability 1-p; I/O completions return to the CPU.
Rate equations:
λ1 = λ + λ2  →  λ1 = λ/p  →  ρ1 = λ/(p μ1)
λ2 = (1-p) λ1  →  λ2 = λ(1-p)/p  →  ρ2 = λ(1-p)/(p μ2)
State probabilities:
π(n1,n2) = ρ1^n1 ρ2^n2 (1-ρ1)(1-ρ2), n1, n2 = 0,1,2,…
E[N] = E[N1] + E[N2] = ρ1/(1-ρ1) + ρ2/(1-ρ2)
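The formulas above can be evaluated directly. A sketch with illustrative parameter values (λ, p, μ1, μ2 are assumptions chosen so both queues are stable, not values from the slides):

```python
# CPU/I-O loop with illustrative parameters.
lam, p = 1.0, 0.25
mu1, mu2 = 8.0, 4.0

lam1 = lam / p                  # from lam1 = lam + lam2, lam2 = (1-p)*lam1
lam2 = lam * (1 - p) / p
rho1 = lam1 / mu1
rho2 = lam2 / mu2
EN = rho1 / (1 - rho1) + rho2 / (1 - rho2)   # E[N] = E[N1] + E[N2]
```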

9 Closed Network Example
Two servers with μ1 = 5 and μ2 = 4 and N = 2 jobs in circulation; after service at server 1 a job returns to server 1 with probability 0.6 and moves to server 2 with probability 0.4; after service at server 2 it moves to server 1 with probability 0.5 and stays at server 2 with probability 0.5.
Rate equations:
λ1 = 0.6λ1 + 0.5λ2  →  λ2 = (4/5)λ1
λ2 = 0.4λ1 + 0.5λ2  →  λ2 = (4/5)λ1 (only one independent equation)
Choose λ1 = 5 and therefore λ2 = 4, so that ρ1 = ρ2 = 1.
State probabilities (for N = 2):
π(n1,n2) = C ρ1^n1 ρ2^n2, where n1 + n2 = 2, i.e., (0,2), (2,0), (1,1)
π(0,2) + π(2,0) + π(1,1) = 3C = 1  →  C = 1/3
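The normalization step can be carried out exactly with rationals. A sketch of this example's arithmetic (μ1 = 5, μ2 = 4, N = 2, as in the slide):

```python
from fractions import Fraction as F

# The slide's closed network worked out exactly.
lam1, lam2 = F(5), F(4)          # any solution of lam2 = (4/5)*lam1
mu1, mu2 = F(5), F(4)
rho1, rho2 = lam1 / mu1, lam2 / mu2   # both equal 1

N = 2
states = [(n1, N - n1) for n1 in range(N + 1)]   # (0,2), (1,1), (2,0)
weights = {s: rho1 ** s[0] * rho2 ** s[1] for s in states}
C = 1 / sum(weights.values())                    # normalization constant
pi = {s: C * w for s, w in weights.items()}      # each state has prob 1/3
```

Note that ρ1 = ρ2 = 1 is fine here: in a closed network the ρi are only weights, not utilizations, so they need not be below 1.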

10 Extensions – Open Classed Networks
The results extend to classed networks, i.e., networks with k servers and l job classes:
- Same service rate μi for all classes at server i, but
- Jobs can change class after service (from c to c')
- Different external arrival rates: ri(c) for class c at server i
- Different routing probabilities per class: Pij(c)(c') is the probability that, on completing service at server i, a class c job moves to server j as a class c' job
- This can be used to emulate different per-class job sizes
Arrival rate equations: the arrival rate for class c at server i is
λi(c) = ri(c) + Σ{j=1 to k} Σ{c'=1 to l} λj(c') Pji(c')(c)
Network state: z = (z1,z2,…,zk), where zi = (ci(1),ci(2),…,ci(ni)) and ci(j) is the class of the job in position j, j = 1,2,…,ni, at server i
State probabilities:
π(z1,z2,…,zk) = Π{i=1 to k} P{state at server i is zi},
where P{state at server i is zi} = (1-ρi) [λi(ci(1)) λi(ci(2)) … λi(ci(ni))] / μi^ni
Aggregate state probabilities:
P(n1,n2,…,nk) = Π{i=1 to k} P{ni jobs at server i} = Π{i=1 to k} ρi^ni (1-ρi), where ρi = λi/μi and λi = Σ{c=1 to l} λi(c)

11 Distribution of Job Classes – Example with Two Classes
In a two-class system, the probability of s jobs of class 1 and t jobs of class 2 at server i follows from the per-position class probabilities λi(1)/λi and λi(2)/λi:
P{server i has s class 1 jobs and t class 2 jobs} = C(s+t, s) (λi(1)/λi)^s (λi(2)/λi)^t ρi^(s+t) (1-ρi)
i.e., ρi^(s+t)(1-ρi) is the probability of s+t jobs at server i, and given that total, the class split is binomial.

12 CPU & I/O Bound System
A CPU (server 1, μ1 = 2) and an I/O device (server 2, μ2 = 1) with two job classes, CPU-bound (C) and I/O-bound (I):
- External arrivals: rC1 = 0.2 (class C at server 1), rI2 = 0.25 (class I at server 2)
- Class C routing: PC1,1 = 0.65, PC1,2 = 0.05, PC1,out = 0.3, PC2,1 = 1
- Class I routing: PI1,1 = 0.05, PI1,2 = 0.95, PI2,1 = 0.1, PI2,2 = 0.5, PI2,out = 0.4
Rate equations:
λ1C = rC1 + λ1C PC1,1 + λ2C PC2,1  →  λ1C = 0.2 + 0.65λ1C + λ2C
λ2C = rC2 + λ1C PC1,2 + λ2C PC2,2  →  λ2C = 0.05λ1C
λ1I = rI1 + λ1I PI1,1 + λ2I PI2,1  →  λ1I = 0.05λ1I + 0.1λ2I
λ2I = rI2 + λ1I PI1,2 + λ2I PI2,2  →  λ2I = 0.25 + 0.95λ1I + 0.5λ2I
Solving those two systems of equations yields:
λ1C = 2/3, λ2C = 1/30, λ1I = 5/76, λ2I = 5/8
λ1 = λ1C + λ1I = 167/228 ≈ 0.732, ρ1 = λ1/μ1 ≈ 0.366
λ2 = λ2C + λ2I = 79/120 ≈ 0.658, ρ2 = λ2/μ2 ≈ 0.658
Which immediately gives, for i = 1, 2:
E[Ni] = ρi/(1-ρi) and E[Ti] = E[Ni]/λi
More interestingly, what is E[TC] or E[TI]? With E[ViC] the expected number of visits a class C job makes to server i:
E[TC] = E[V1C] E[T1] + E[V2C] E[T2]
E[V1C] = 1 + 0.65 E[V1C] + E[V2C]
E[V2C] = 0.05 E[V1C]
E[V1C] = 3.333, E[V2C] = 0.167, so that E[TC] ≈ 3.117
Similarly, we can compute E[N1C]: E[N1C] = E[N1] · p, where p = λ1C/(λ1C + λ1I) is the fraction of CPU-bound jobs at server 1
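The two 2x2 class systems above are small enough to solve by substitution. A sketch reproducing the slide's arithmetic exactly with rationals:

```python
from fractions import Fraction as F

# CPU-bound class: lam1C = 0.2 + 0.65*lam1C + lam2C, lam2C = 0.05*lam1C;
# substituting the second equation into the first gives lam1C directly.
lam1C = F(2, 10) / (1 - F(65, 100) - F(5, 100))
lam2C = F(5, 100) * lam1C
assert (lam1C, lam2C) == (F(2, 3), F(1, 30))

# I/O-bound class: lam1I = 0.05*lam1I + 0.1*lam2I and
#                  lam2I = 0.25 + 0.95*lam1I + 0.5*lam2I;
# the first equation gives lam1I = (2/19)*lam2I; substitute into the second.
lam2I = F(25, 100) / (1 - F(50, 100) - F(95, 100) * F(2, 19))
lam1I = F(2, 19) * lam2I
assert (lam1I, lam2I) == (F(5, 76), F(5, 8))

# Per-server totals and utilizations (mu1 = 2, mu2 = 1).
lam1, lam2 = lam1C + lam1I, lam2C + lam2I    # 167/228 and 79/120
rho1, rho2 = lam1 / F(2), lam2 / F(1)
```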

13 Back to Closed Networks
Recall that in a closed Jackson network with k servers and N jobs, the state probabilities are of the form
P(n1,n2,…,nk) = C ρ1^n1 ρ2^n2 … ρk^nk
where ρi = λi/μi, and C = [Σ_{Σi ni = N} ρ1^n1 ρ2^n2 … ρk^nk]^-1 is a normalization constant.
Solving for the λi's calls for solving a system of k simultaneous rate equations λi = Σj λj Pji.
Computing C calls for adding up a total of C(N+k-1, k-1) terms, one per state with Σi ni = N. This number grows very quickly in N and k, so we need a better approach.
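The number of terms is the "stars and bars" count of ways to place N jobs on k servers, which a one-liner makes concrete:

```python
from math import comb

# Number of states (n1, ..., nk) with n1 + ... + nk = N: C(N+k-1, k-1),
# i.e., the number of terms in the sum defining the normalization constant.
def num_states(N, k):
    return comb(N + k - 1, k - 1)

assert num_states(2, 2) == 3    # the three states of the N = 2 example
# Even a modest network blows up: num_states(100, 10) is on the order of
# 10**12 terms, which is what motivates Mean Value Analysis.
```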

14 Arrival Theorem
In a closed Jackson network with M > 1 jobs, an arrival to (any) server j sees a distribution of the number of jobs at each server equal to the distribution of the number of jobs at each server in the same network with only M – 1 jobs. In particular, the mean number of jobs the arrival sees at server j is E[Nj(M – 1)].
We can use the Arrival Theorem to derive a recursion for the mean response time at server j.

15 Mean Value Analysis
A simple recursive approach to computing E[Ti(M)] (and E[Ni(M)]) in a closed system with M > 1 jobs and k servers:
E[Ti(M)] = (1/μi)(1 + pi λ(M-1) E[Ti(M-1)])
where λ(M-1) is the total arrival rate to all servers and pi = λi(M-1)/λ(M-1) is the fraction of those arrivals headed for server i. Note that pi is independent of M: pi = Vi/Σ{j=1 to k}Vj, where Vi is the number of visits to server i per job completion.
In a system with k servers, λ(M) is given by
λ(M) = Σ{i=1 to k} λi(M) = M/[Σ{i=1 to k} pi E[Ti(M)]]   (*)
(*) follows from Little's Law and the fact that M = Σ{i=1 to k} E[Ni(M)].

16 Mean Value Analysis – Recursion
Initial condition of the recursion: E[Tj(1)] = 1/μj
Recursive step:
E[Tj(M)] = 1/μj + E[number at server j seen by an arrival at j]/μj
         = 1/μj + E[Nj(M-1)]/μj            – by the Arrival Theorem
         = 1/μj + λj(M-1) E[Tj(M-1)]/μj    – by Little's Law
         = 1/μj + pj λ(M-1) E[Tj(M-1)]/μj  – since pj = λj(M-1)/λ(M-1)
The next step is to compute λ(M-1) using Little's Law and the fact that
M – 1 = Σ{j=1 to k} E[Nj(M-1)] = Σ{j=1 to k} λj(M-1) E[Tj(M-1)] = λ(M-1) Σ{j=1 to k} pj E[Tj(M-1)]
so λ(M-1) = (M–1)/[Σ{j=1 to k} pj E[Tj(M-1)]]
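The recursion can be sketched as a short function, assuming the visit fractions p[i] and service rates mu[i] are given as lists:

```python
# Mean Value Analysis: compute mean per-server response times and the
# total arrival rate for a closed network with M jobs.
def mva(p, mu, M):
    """Return ([E[T_i(M)] for each server], lam(M))."""
    k = len(mu)
    ET = [1.0 / mu_i for mu_i in mu]                 # E[T_i(1)] = 1/mu_i
    lam = 1.0 / sum(p[i] * ET[i] for i in range(k))  # lam(1), by (*)
    for m in range(2, M + 1):
        # E[T_i(m)] = (1/mu_i)(1 + p_i * lam(m-1) * E[T_i(m-1)])
        ET = [(1.0 + p[i] * lam * ET[i]) / mu[i] for i in range(k)]
        lam = m / sum(p[i] * ET[i] for i in range(k))
    return ET, lam
```

For the two-server cycle of the example slide (mu = [1, 2], p = [1/2, 1/2], M = 3) this returns E[T1(3)] = 17/7, E[T2(3)] = 11/14, and λ(3) = 28/15.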

17 MVA Example
Two servers in a cycle with μ1 = 1, μ2 = 2, and M = 3 jobs. Recall:
E[Ti(M)] = (1/μi)(1 + pi λ(M-1) E[Ti(M-1)])
λ(M) = M/Σ{i=1 to k} pi E[Ti(M)]   (*)
What are E[N1(3)] and E[N2(3)]? Note that in this system p1 = p2 = 1/2 (each job visits each server once per cycle, so each server experiences half of all job visits in the system).
Recursion for E[T1(i)] and E[T2(i)]:
E[T1(1)] = 1/μ1 = 1, E[T2(1)] = 1/μ2 = 1/2, λ(1) = 4/3 (by (*))
E[T1(2)] = 1 + (1/2 · 4/3 · 1)/1 = 5/3, E[T2(2)] = 1/2 + (1/2 · 4/3 · 1/2)/2 = 2/3, λ(2) = 12/7
E[T1(3)] = 1 + (1/2 · 12/7 · 5/3)/1 = 17/7, E[T2(3)] = 1/2 + (1/2 · 12/7 · 2/3)/2 = 11/14, λ(3) = 28/15
This gives E[N1(3)] = E[T1(3)] λ1(3) = E[T1(3)] p1 λ(3) = 17/7 · 1/2 · 28/15 = 34/15
(and similarly E[N2(3)] = 11/14 · 1/2 · 28/15 = 11/15, so E[N1(3)] + E[N2(3)] = 3 = M, as it should)
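The example's arithmetic can be repeated with exact rationals as a check (μ1 = 1, μ2 = 2, p1 = p2 = 1/2, M = 3, as in the slide):

```python
from fractions import Fraction as F

mu = [F(1), F(2)]
p = [F(1, 2), F(1, 2)]

ET = [1 / mu[0], 1 / mu[1]]                 # E[T_i(1)] = 1/mu_i
lam = 1 / (p[0] * ET[0] + p[1] * ET[1])     # lam(1) = 4/3
for m in (2, 3):
    ET = [(1 + p[i] * lam * ET[i]) / mu[i] for i in range(2)]
    lam = m / (p[0] * ET[0] + p[1] * ET[1])

assert ET == [F(17, 7), F(11, 14)] and lam == F(28, 15)
EN1 = ET[0] * p[0] * lam                    # E[N1(3)] = 34/15
EN2 = ET[1] * p[1] * lam                    # E[N2(3)] = 11/15
assert EN1 + EN2 == 3                       # all M = 3 jobs accounted for
```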

18 More on MVA
Note that λ(M) is NOT the system throughput when there are M jobs in circulation; it is the total arrival rate across all servers. The system throughput would be λ1(M).
Hence, while Little's Law holds and we have M = λ(M) E[T(M)], E[T(M)] is not the standard system response time. It is simply a quantity defined as E[T(M)] = Σi pi E[Ti(M)].
Consider the case M = 1 in the previous example:
We found E[T1(1)] = 1/μ1 = 1, E[T2(1)] = 1/μ2 = 1/2, λ(1) = 1/(1/2 · 1 + 1/2 · 1/2) = 4/3
We have E[T(1)] = 1/2 · 1 + 1/2 · 1/2 = 3/4, and λ(1) E[T(1)] = 4/3 · 3/4 = 1, as it should be according to Little's Law
However, we also know that the system's response time (one visit to each server) is E[R] = 1/μ1 + 1/μ2 = 3/2. Applying Little's Law with the throughput, we get λ1(1) E[R] = p1 λ(1) E[R] = 1/2 · 4/3 · 3/2 = 1, as it again should

