Erlang, Hyper-exponential, and Coxian distributions


Erlang, Hyper-exponential, and Coxian distributions These are mixtures of exponentials: each combines a number of exponential phases into a single service mechanism. Examples: the Erlang E4 (four exponential phases in series, each with rate μ), the hyper-exponential H3 (a customer is routed to one of three exponential phases with rates μ1, μ2, μ3, chosen with probabilities P1, P2, P3), and the Coxian C4 (four phases in series with early-exit branches). The reason people studied these distributions was to obtain a more realistic representation of real-life service times than a single exponential.

Erlang distribution: analysis Consider an E2: two exponential phases in series, each with mean 1/(2μ), i.e., rate 2μ. Mean service time: E[Y] = E[X1] + E[X2] = 1/(2μ) + 1/(2μ) = 1/μ. Variance: Var[Y] = Var[X1] + Var[X2] = 1/(4μ²) + 1/(4μ²) = 1/(2μ²).
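
A minimal simulation sketch (not part of the original slides) that checks these E2 moments by summing two independent exponential phases; the rate value μ = 2.0 is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0                      # overall service rate (arbitrary example value)
n = 1_000_000

# E2 service time: sum of two exponential phases, each with rate 2*mu
phases = rng.exponential(scale=1.0 / (2.0 * mu), size=(n, 2))
y = phases.sum(axis=1)

print("mean     :", y.mean(), "  expected", 1.0 / mu)             # ~ 1/mu
print("variance :", y.var(), "  expected", 1.0 / (2.0 * mu**2))   # ~ 1/(2 mu^2)
```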

Squared coefficient of variation: analysis The squared coefficient of variation is C² = Var[X]/E[X]². On the C² scale, a constant sits at 0, hypo-exponential distributions (such as the Erlang) lie between 0 and 1, and the exponential sits at 1. X is a constant, X = d: E[X] = d, Var[X] = 0 => C² = 0. X is an exponential r.v.: E[X] = 1/μ, Var[X] = 1/μ² => C² = 1. X has an Erlang r distribution (r phases, each with rate rμ): E[X] = 1/μ, Var[X] = 1/(rμ²) => C² = 1/r, with Laplace transform fX*(s) = [rμ/(s+rμ)]^r.

Probability density function of Erlang r Let Y have an Erlang r distribution. If r = 1, Y is an exponential random variable. As r grows, C² = 1/r gets closer to 0; as r tends to infinity, Y behaves like a constant. In practice, an E5 is often a good enough approximation of a deterministic service time.
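
A small sketch (added here for illustration) showing how the Erlang r distribution concentrates around its mean as r grows, so that C² = 1/r shrinks toward 0; the chosen rate and r values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 1.0          # target mean service time is 1/mu
n = 500_000

for r in (1, 2, 5, 20, 100):
    # Erlang r scaled to keep the mean at 1/mu: r phases, each with rate r*mu
    y = rng.exponential(scale=1.0 / (r * mu), size=(n, r)).sum(axis=1)
    c2 = y.var() / y.mean() ** 2
    print(f"r={r:3d}  mean={y.mean():.3f}  C^2={c2:.3f}  (theory 1/r = {1/r:.3f})")
```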

Generalized Erlang Er Classical Erlang r: all r phases have the same rate. With phase rate μ, E[Y] = r/μ and Var[Y] = r/μ² (if each phase has rate rμ instead, the mean stays at 1/μ, as in the E2 example above). Generalized Erlang r: the phases do not share the same rate; the service time Y is the sum of r independent exponential phases with rates μ1, μ2, …, μr.

Generalized Erlang Er: analysis The Laplace transform of a generalized Erlang Er is the product fY*(s) = [μ1/(s+μ1)] [μ2/(s+μ2)] … [μr/(s+μr)]. Consequently, if the Laplace transform of a r.v. Y has this particular structure, Y can be exactly represented by a generalized Erlang Er, where the service rates of the r phases are minus the roots of the denominator polynomial.

Hyper-exponential distribution A customer is routed to exactly one of k exponential phases: with probability Pi it receives service at rate μi, i = 1, …, k, where P1 + P2 + … + Pk = 1. What is the pdf of X? It is the probability-weighted mixture fX(x) = P1 μ1 e^(−μ1 x) + … + Pk μk e^(−μk x).
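
A brief sketch (not from the slides) of sampling a hyper-exponential and comparing it with the mixture pdf above; the phase rates and probabilities are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(2)
p = np.array([0.3, 0.7])          # branch probabilities (example values)
mu = np.array([0.5, 4.0])         # branch rates (example values)
n = 200_000

# Sample: pick a branch with probability p[i], then draw an Exp(mu[i]) service time
branch = rng.choice(len(p), size=n, p=p)
x = rng.exponential(scale=1.0 / mu[branch])

def hyperexp_pdf(t, p, mu):
    """Mixture pdf: sum_i p_i * mu_i * exp(-mu_i * t)."""
    t = np.asarray(t)[..., None]
    return (p * mu * np.exp(-mu * t)).sum(axis=-1)

print("sample mean :", x.mean(), "  theory:", (p / mu).sum())
print("pdf at t=1.0:", hyperexp_pdf(1.0, p, mu))
```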

Hyper-exponential distribution: 1st and 2nd moments E[X] = Σi Pi/μi and E[X²] = Σi 2Pi/μi². Example, H2: E[X] = P1/μ1 + P2/μ2 and E[X²] = 2P1/μ1² + 2P2/μ2².

Hyper-exponential: squared coefficient of variation C² = Var[X]/E[X]². For a hyper-exponential distribution, C² is at least 1, and strictly greater than 1 unless all phase rates coincide. Exercise: show that for an H2, C² > 1.
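
A quick numeric check (illustrative only) of this claim for an H2, using the moment formulas above; the parameter values are arbitrary.

```python
def h2_c2(p1, mu1, mu2):
    """Squared coefficient of variation of an H2 from its first two moments."""
    p2 = 1.0 - p1
    m1 = p1 / mu1 + p2 / mu2
    m2 = 2.0 * p1 / mu1**2 + 2.0 * p2 / mu2**2
    return (m2 - m1**2) / m1**2

for p1, mu1, mu2 in [(0.3, 0.5, 4.0), (0.5, 1.0, 1.0), (0.9, 10.0, 0.2)]:
    print(f"p1={p1}, mu1={mu1}, mu2={mu2}  ->  C^2 = {h2_c2(p1, mu1, mu2):.3f}")
```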

Coxian model: main idea Instead of forcing the customer to go through all r exponential phases, as in an Er model, the customer may receive 1, 2, …, or r phases of service. Example, C2: when the customer completes the first phase, he moves on to the 2nd phase with probability a, or he departs with probability b (where a + b = 1).

Coxian model (Diagram: Coxian servers of increasing order, up to a C4 with phase rates μ1, μ2, μ3, μ4; after phase i the customer continues to phase i+1 with probability ai or departs with probability bi = 1 − ai, for i = 1, 2, 3.)
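
An illustrative sketch (not from the slides) of sampling a Coxian-k service time: the customer starts in phase 1 and, after each phase, either continues with probability a[i] or exits; all parameter values are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_coxian(mu, a, size, rng):
    """Sample service times from a Coxian with phase rates mu[0..k-1] and
    continuation probabilities a[0..k-2] (the last phase always exits)."""
    out = np.empty(size)
    for s in range(size):
        t, i = 0.0, 0
        while True:
            t += rng.exponential(1.0 / mu[i])
            if i == len(mu) - 1 or rng.random() >= a[i]:
                break          # depart after this phase
            i += 1             # continue to the next phase
        out[s] = t
    return out

mu = [3.0, 2.0, 1.0]           # example phase rates for a C3
a = [0.6, 0.4]                 # example continuation probabilities
x = sample_coxian(mu, a, 100_000, rng)
print("mean:", x.mean(), "  C^2:", x.var() / x.mean() ** 2)
```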

Coxian distribution: Laplace transform The Laplace transform of a Ck is a ratio of two polynomials: the denominator is of order k and the numerator of order less than k. Implication: a Laplace transform with this structure can be represented by a Coxian distribution, where k (the order of the denominator) is the number of phases and the service rates are minus the roots of the denominator.
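
A small symbolic sketch (added for illustration, using sympy) that builds the Laplace transform of a C3 as a mixture over the possible exit points and confirms it is a ratio of polynomials with a degree-3 denominator. The exit-point decomposition used here is a standard way to write the Ck transform and is an assumption, not a formula reproduced from the slides.

```python
import sympy as sp

s = sp.symbols('s')
mu = sp.symbols('mu1 mu2 mu3', positive=True)
a1, a2 = sp.symbols('a1 a2', positive=True)

# Probability of exiting after phase 1, 2, 3 (a_i = continue, 1 - a_i = depart)
exit_p = [1 - a1, a1 * (1 - a2), a1 * a2]

def path(i):
    """Transform of the path through phases 1..i+1: product of mu_j/(s + mu_j)."""
    expr = sp.Integer(1)
    for j in range(i + 1):
        expr *= mu[j] / (s + mu[j])
    return expr

f = sum(p * path(i) for i, p in enumerate(exit_p))
num, den = sp.fraction(sp.cancel(sp.together(f)))
print("numerator degree  :", sp.degree(sp.expand(num), s))   # expected < 3
print("denominator degree:", sp.degree(sp.expand(den), s))   # expected   3
```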

Coxian model: conclusion The Laplace transforms of most distributions of interest are rational functions (ratios of polynomials). Therefore, essentially any such distribution can be represented, exactly or approximately, by a Coxian distribution.

Coxian model: dimensionality problem A Coxian model can grow too big, i.e., it may require a very large number of phases. To cope with this limitation, any Laplace transform can be approximated by a C2 (two phases with rates μ1 and μ2, continuation probability a, departure probability b = 1 − a). To obtain the unknowns (a, μ1, μ2), calculate the first 3 moments from the Laplace transform and match them against those of the C2.

C2: first three moments For a C2 with phase rates μ1, μ2 and continuation probability a: E[X] = 1/μ1 + a/μ2, E[X²] = 2/μ1² + 2a/(μ1 μ2) + 2a/μ2², E[X³] = 6/μ1³ + 6a/(μ1² μ2) + 6a/(μ1 μ2²) + 6a/μ2³.
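
A short sketch (illustrative, not from the slides) that computes these three C2 moments from (a, μ1, μ2) and cross-checks them against simulation; the parameter values are arbitrary.

```python
import numpy as np

def c2_moments(a, mu1, mu2):
    """First three moments of a Coxian-2 with rates mu1, mu2 and continuation prob a."""
    m1 = 1/mu1 + a/mu2
    m2 = 2/mu1**2 + 2*a/(mu1*mu2) + 2*a/mu2**2
    m3 = 6/mu1**3 + 6*a/(mu1**2*mu2) + 6*a/(mu1*mu2**2) + 6*a/mu2**3
    return m1, m2, m3

rng = np.random.default_rng(4)
a, mu1, mu2 = 0.4, 3.0, 1.0        # example parameters
n = 500_000
# X = phase-1 time, plus phase-2 time with probability a
x = rng.exponential(1/mu1, n) + (rng.random(n) < a) * rng.exponential(1/mu2, n)

print("theory    :", c2_moments(a, mu1, mu2))
print("simulation:", x.mean(), np.mean(x**2), np.mean(x**3))
```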

Rational polynomials approximated by a Coxian-2 (C2) The Laplace transforms of most pdfs are rational functions. Such a transform can be exactly represented by a Coxian k (Ck) distribution, where the number of stages equals the order of the denominator and the service rates are given by minus the roots of the denominator. If k is very large we face the dimensionality problem; the solution is to collapse the Ck into a C2 (rates μ1, μ2, continuation probability a, departure probability b = 1 − a).

3 moments method This is the first way of obtaining the 3 unknowns (μ1, μ2, a). Let m1, m2, m3 be the first three moments of the distribution we want to approximate by a C2. The first 3 moments of a C2 are given by the expressions above; by equating m1 = E(X), m2 = E(X²), and m3 = E(X³), the unknowns are obtained.

3 moments method: solution Solving the three equations yields closed-form expressions for μ1, μ2, and a. However, for the solution to be valid the condition X² − 4Y ≥ 0 must hold, which reduces to 3m2² < 2m1m3. In other words, this method of moments applies when the squared coefficient of variation c² of the original distribution (which we approximate by a C2) is greater than 1.
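
Since the closed-form expressions are not reproduced in this transcript, here is a hedged numerical sketch of the same idea: match the three C2 moment equations to target moments (m1, m2, m3) with a root finder. The solver, starting point, and target distribution (an H2 with c² > 1) are illustrative choices, not the slides' formulas.

```python
import numpy as np
from scipy.optimize import fsolve

def c2_moments(params):
    mu1, mu2, a = params
    m1 = 1/mu1 + a/mu2
    m2 = 2/mu1**2 + 2*a/(mu1*mu2) + 2*a/mu2**2
    m3 = 6/mu1**3 + 6*a/(mu1**2*mu2) + 6*a/(mu1*mu2**2) + 6*a/mu2**3
    return np.array([m1, m2, m3])

# Target: an H2 with p1 = 0.3 and rates 0.5, 4.0 (its c^2 > 1, so a C2 fit should exist)
p1, r1, r2 = 0.3, 0.5, 4.0
target = np.array([p1/r1 + (1-p1)/r2,
                   2*p1/r1**2 + 2*(1-p1)/r2**2,
                   6*p1/r1**3 + 6*(1-p1)/r2**3])

sol = fsolve(lambda q: c2_moments(q) - target, x0=[1.0, 1.0, 0.5])
mu1, mu2, a = sol
print("fitted C2 (mu1, mu2, a):", mu1, mu2, a)
print("moment check:", c2_moments(sol), "vs target", target)
```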

Two-moment fit If the previous condition does not hold, a two-moment fit can be used instead. General rule: use the 3 moments method; if it does not apply, fall back on the 2 moment fit approximation.
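
The two-moment expressions themselves are not in this transcript; as an illustrative stand-in, the sketch below uses a well-known two-moment C2 fit (μ1 = 2/m1, a = 1/(2c²), μ2 = 1/(m1 c²)), which matches the mean and the squared coefficient of variation whenever c² ≥ 1/2. Whether this is exactly the fit intended on the slide is an assumption.

```python
def two_moment_c2_fit(m1, c2):
    """A common two-moment Coxian-2 fit (matches mean m1 and squared CV c2).
    Valid for c2 >= 0.5 so that the continuation probability a stays in [0, 1]."""
    mu1 = 2.0 / m1
    a = 1.0 / (2.0 * c2)
    mu2 = 1.0 / (m1 * c2)
    return mu1, mu2, a

mu1, mu2, a = two_moment_c2_fit(m1=0.775, c2=3.14)   # example target values
m1_check = 1/mu1 + a/mu2
m2_check = 2/mu1**2 + 2*a/(mu1*mu2) + 2*a/mu2**2
print("fit (mu1, mu2, a):", mu1, mu2, a)
print("mean:", m1_check, "  c^2:", (m2_check - m1_check**2) / m1_check**2)
```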

M/C2/1 queue Poisson arrivals are served by a C2 server (two exponential phases, continuation probability a, departure probability b). What is the state of the system, and how many variables are needed to describe it? A 2-dimensional state (n1, n2) is needed: n1 is the number of customers in the queue, and n2 is the state of the server (0 => server idle; 1 => server busy in phase 1; 2 => server busy in phase 2).

M/C2/1 queue: analysis To analyze this queue, one goes through the following steps: draw the rate diagram that depicts the state transitions; write the steady-state (balance) equations; from these equations obtain the generating functions; and, using a recursive scheme, solve the M/C2/1 queue to determine P(n) = Pn, the probability of having n customers in the system.

M/C2/1: state diagram The state is (n1, n2), where n1 is the number of customers in the queue (excluding the one in service) and n2 is the state of the server (0: idle, 1: phase 1, 2: phase 2). The transition rates are λ (arrival), bμ1 (phase-1 completion followed by departure), aμ1 (phase-1 completion followed by entry into phase 2), and μ2 (phase-2 completion, i.e., departure). We start by constructing the flow balance equations. To do this, we first draw the state diagram that represents all possible transitions from one state to another. Starting at (0,0), an arrival takes the system to (0,1), because the arriving customer goes directly into service; further arrivals queue up customers. On a service completion from (0,1), the system goes to (0,0) with probability b (the customer departs) and to (0,2) with probability a (the customer continues to phase 2). The diagram consists of two columns of states, (0,1), (1,1), (2,1), … and (0,2), (1,2), (2,2), …: the first column contains the states where the server is busy in phase 1, the second column the states where the server is busy in phase 2. To obtain expressions for these probabilities, we will later introduce the corresponding generating functions g1(z) and g2(z).

M/C2/1: steady state equations 1st set of equations (1st column)

Balance equations The 2nd set of steady-state equations corresponds to the 2nd column (server busy in phase 2). We are interested in finding Pn, the probability of having n customers in the system, which can be obtained from the joint probabilities Pn1,n2.

Generating functions Let us define: the generating function g1(z) = Σi Pi,1 z^i, involving all probabilities Pi,1 where i customers are in the queue and the server is busy in phase 1; the generating function g2(z) = Σi Pi,2 z^i, based on {P0,2, P1,2, P2,2, …}, where the server is busy in phase 2; and the generating function g(z) = Σn Pn z^n, based on Pn, the probability of having n customers in the system.

Expressions for the generating functions Using the two sets of balance equations, we obtain expressions (1), (2), and (3) for the generating functions.

Finding P00 g1(1) = P0,1 + P1,1 + P2,1 + … is the probability that the server is busy in phase 1, which equals the traffic intensity of phase 1, ρ1 (and similarly g2(1) = ρ2). Since g(z) = f(g1(z), g2(z)), these values allow us to determine P00 = 1 − ρ1 − ρ2.

Pn: recursive scheme Using a recursive scheme, we then obtain the general expression for the probability Pn of having n customers in the system.
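
To make the end result concrete, here is an illustrative sketch (not the recursive scheme from the slides) that estimates Pn for an M/C2/1 queue by a simple discrete-event simulation; λ and the C2 parameters are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(5)

lam = 0.5                      # arrival rate (example)
mu1, mu2, a = 3.0, 2.0, 0.4    # C2 service parameters (example)
n_cust = 100_000

# C2 service times: phase 1 always, phase 2 with probability a
svc = rng.exponential(1/mu1, n_cust) + (rng.random(n_cust) < a) * rng.exponential(1/mu2, n_cust)
arr = np.cumsum(rng.exponential(1/lam, n_cust))          # Poisson arrival times

# FIFO departures: D[i] = max(A[i], D[i-1]) + S[i]
dep = np.empty(n_cust)
t = 0.0
for i in range(n_cust):
    t = max(arr[i], t) + svc[i]
    dep[i] = t

# Time-average distribution of the number in system, from the merged event sequence
events = np.concatenate([np.column_stack([arr, np.ones(n_cust)]),
                         np.column_stack([dep, -np.ones(n_cust)])])
events = events[np.argsort(events[:, 0])]
time_in_state = {}
prev_t, n = 0.0, 0
for time, delta in events:
    time_in_state[n] = time_in_state.get(n, 0.0) + (time - prev_t)
    n += int(delta)
    prev_t = time

total = sum(time_in_state.values())
for k in sorted(time_in_state)[:6]:
    print(f"P({k}) ~ {time_in_state[k] / total:.4f}")
```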