Basic Modeling Components

1 Modeling of Systems: Two Key Distributions (Exponential & Poisson)

2 Basic Modeling Components
Arrivals → Server (rate μ) → Departures
How "work" arrives to the system as a function of time: arrival process/rate (λ jobs/unit of time); average inter-arrival time (1/λ)
How much work does each arrival bring? Server(s) capacity or rate (μ) in jobs/unit of time; service times (average 1/μ)
Stable system: over time, the number of departures equals the number of "accepted" arrivals; long-term incoming work is less than the server capacity
ρ = λ/μ: system utilization
Note: a finite system is always stable

3 Arrival Process
Inter-arrival time: the time interval between consecutive jobs (Ti)
Arrival rate λ: the number of jobs per unit of time
Average inter-arrival time = 1/arrival rate = 1/λ
[Figure: timeline of arrivals with inter-arrival times T1, …, T7 and arrival count N(T1 + … + T7) = 8]

5 Service Process
Service time: the time for the server to process one job (Si)
Average service time: S̄
Service rate μ: the number of average-size jobs the server can handle per unit of time, μ = 1/S̄
[Figure: timeline of consecutive service times S0, …, S7]

7 Service Times
Service time distribution: P(S ≤ t)
Probability that once a job/customer/packet gets into service, it leaves before t time units
A function of the job "size" and the server "rate"
Main features of interest:
First and second moments (mean and variance), which affect service variability
Distribution of the residual service time P(S ≤ T + t | S > T), a conditional distribution
Impact on "waiting times" (when a new job arrives, it must wait for all the jobs waiting in front of it to be served and for the one currently in service to complete)

8 A "Nice" Candidate: The Exponential Distribution
Generates a continuous, non-negative, real-valued random variable X
Characterized by a single parameter λ
Distribution function: F(x) = P(X ≤ x) = 1 − e^(−λx), x ≥ 0
Density function: f(x) = dF(x)/dx = λe^(−λx), x ≥ 0
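The distribution function can be checked numerically. The short Python sketch below (an illustration, not part of the slides; the helper name exp_sample is mine) draws exponential samples by inverse-transform sampling and compares the empirical CDF against F(x) = 1 − e^(−λx):

```python
import math
import random

def exp_sample(lam, rng):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(U)/lam ~ Exp(lam)."""
    return -math.log(rng.random()) / lam

rng = random.Random(42)
lam = 2.0
n = 100_000
samples = [exp_sample(lam, rng) for _ in range(n)]

# Compare the empirical CDF with F(x) = 1 - exp(-lam*x) at a few points.
for x in (0.25, 0.5, 1.0):
    empirical = sum(s <= x for s in samples) / n
    exact = 1 - math.exp(-lam * x)
    print(f"x={x}: empirical={empirical:.4f}, exact={exact:.4f}")
```

With 100,000 samples the two columns agree to roughly two decimal places.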

9 Distribution & Density Functions
[Figure: plots of F(x) = 1 − e^(−x) and f(x) = e^(−x) for λ = 1]

10 First and Second Moments
First moment (mean): E[X] = 1/λ
Second moment: E[X²] = 2/λ²
Variance: Var(X) = E[X²] − (E[X])² = 1/λ²

11 First Moment
The standard derivation uses the integration-by-parts trick:
E[X] = ∫₀^∞ x λe^(−λx) dx = [−x e^(−λx)]₀^∞ + ∫₀^∞ e^(−λx) dx = 0 + 1/λ = 1/λ

12 Second Moment & Variance
Second moment (integrating by parts twice): E[X²] = ∫₀^∞ x² λe^(−λx) dx = 2/λ²
Variance: Var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ²
Note: the coefficient of variation C = σ/E[X] = (1/λ)/(1/λ) = 1
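These moment formulas are easy to verify by simulation. A minimal sketch (not from the slides; parameter values are arbitrary) estimates the mean, second moment, variance, and coefficient of variation from exponential samples:

```python
import math
import random

rng = random.Random(7)
lam = 0.5
n = 200_000
xs = [rng.expovariate(lam) for _ in range(n)]

mean = sum(xs) / n                   # theory: 1/lam = 2
second = sum(x * x for x in xs) / n  # theory: 2/lam**2 = 8
var = second - mean ** 2             # theory: 1/lam**2 = 4
cv = math.sqrt(var) / mean           # coefficient of variation, theory: 1
print(mean, second, var, cv)
```

Each printed estimate should land close to its theoretical value (2, 8, 4, and 1 here).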

13 The Memoryless Property
No distinction between residual and original service time distributions
Past history has no influence on the future
The probability that a service time lasts for another x secs given it has already lasted y secs is the same as the probability it was going to last x secs when it first started
More formally: P(X > x + y | X > y) = P(X > x), for all x, y ≥ 0

14 Establishing the Memoryless Property
Using P(X > x + y) = e^(−λ(x+y)) and P(X > y) = e^(−λy), we get
P(X > x + y | X > y) = P(X > x + y)/P(X > y) = e^(−λ(x+y))/e^(−λy) = e^(−λx) = P(X > x)

15 How Special is the Memoryless Property?
The exponential distribution is the only continuous distribution with this property
Discrete-time equivalent: the geometric distribution
Service time X is an integer number of "slots"
At the end of each slot, flip a coin: with probability p service ends; with probability (1 − p) service continues

16 Memoryless Property Geometric Distribution
Memorylessness is “intuitive” Past “coin flips” don’t affect future ones
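The coin-flip view makes the discrete case easy to test. This sketch (my illustration; the helper geometric is hypothetical) generates geometric service times slot by slot and checks that surviving 3 slots does not change the distribution of the remaining slots:

```python
import random

rng = random.Random(3)
p = 0.3
n = 100_000

def geometric(p, rng):
    """Number of slots until service ends (coin comes up 'end' w.p. p each slot)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

xs = [geometric(p, rng) for _ in range(n)]
# Memorylessness: P(X > 3 + 2 | X > 3) should equal P(X > 2) = (1-p)**2.
survivors = [x for x in xs if x > 3]
cond = sum(x > 5 for x in survivors) / len(survivors)
print(cond, (1 - p) ** 2)  # both ~0.49
```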

17 Other Properties of the Exponential Distribution
The exponential distribution has a constant failure rate, i.e., r(t) = λ
r(t) = f(t)/[1 − F(t)] (the probability that an item of age t fails in the next dt is r(t)dt)
If X1 ~ Exp(λ1) and X2 ~ Exp(λ2), with X1 and X2 independent, then P{X1 < X2} = λ1/(λ1 + λ2)
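The competition result P{X1 < X2} = λ1/(λ1 + λ2) is another one-liner to verify by simulation (sketch only; the rates chosen are arbitrary):

```python
import random

rng = random.Random(5)
lam1, lam2 = 2.0, 3.0
n = 100_000
# Fraction of trials where the Exp(lam1) sample "wins" the race.
wins = sum(rng.expovariate(lam1) < rng.expovariate(lam2) for _ in range(n))
print(wins / n, lam1 / (lam1 + lam2))  # both ~0.4
```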

18 Arrival Process
[Figure: timeline with inter-arrival times I1, I2, I3, I4 and arrival count N(T)]
Two perspectives:
Distribution of the time I between consecutive arrivals (P[I ≤ t])
Distribution of the number N(T) of arrivals within an interval of duration T (P[N(T) = k])
Main features of interest:
First and second moments (mean and variance)
Distribution of the residual inter-arrival time (time to the next arrival), P(I > t + t0 | I > t0), a conditional distribution
Do arrivals see the same system as a random "observer"? This is of practical importance, as we usually care about the experience of users (arrivals), which depends on the system state they see when they arrive

19 Arrivals (Can) See a Different System
Assume periodic arrivals (every 2 secs), with each job lasting 1 sec
By the time the next job arrives, the previous one is gone, i.e., arrivals always see an empty system: no waiting!
In contrast, if you look at the system at a random time, it is busy half of the time (probability 0.5)
This does not mean that 50% of customers will have to wait…

20 Building on the Exponential Distribution
Consider arrivals where the times between arrivals (inter-arrivals) are independent and exponentially distributed with parameter λ
λ is the arrival rate (larger rate → shorter inter-arrival times)
If the i-th arrival came at time ti, the probability that the (i+1)-th arrival takes place before time ti + δ is 1 − e^(−λδ)
If the i-th arrival came at time ti, and the (i+1)-th arrival has not arrived by time ti + τ, the probability it arrives by time ti + τ + δ is still 1 − e^(−λδ) (memoryless property of the exponential distribution)
Whether we look right after an arrival or at an arbitrary point in time, the distribution of the time to the next arrival is the same
Note: the discrete-time equivalent relies on geometric inter-arrival times
This arrival process defines what is called a Poisson arrival process
It has a number of "nice" properties (in part because of the memoryless property of the exponential distribution)

21 Poisson Arrival Process
How many arrivals in a time interval [0, t)? Building some insight:
Probability of at least 1 arrival: P[N(t) ≥ 1] = P[X ≤ t] = 1 − e^(−λt)
Probability of at least 2 arrivals: P[N(t) ≥ 2] = P[X1 + X2 ≤ t], where X1 and X2 are the inter-arrival times of the 1st and 2nd arrivals (both exponential with parameter λ), i.e., the sum of two independent exponential random variables
Generalizing, the time of the n-th arrival follows a Gamma (Erlang) distribution
Probability of exactly n arrivals in t: P(N(t) = n) = e^(−λt)(λt)ⁿ/n! ← the Poisson distribution
The derivation is by induction: we showed the expression holds for n = 1 and 2; assuming it holds up to n, one establishes that it must then also hold for n + 1, so the expression holds for all values of n
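The connection between exponential inter-arrivals and Poisson counts can be demonstrated directly. This sketch (illustrative; helper name count_arrivals is mine) builds arrival streams from exponential gaps and compares the distribution of N(t) with the Poisson pmf e^(−λt)(λt)ⁿ/n!:

```python
import math
import random

rng = random.Random(11)
lam, t = 3.0, 1.0
runs = 100_000

def count_arrivals(lam, t, rng):
    """Count arrivals in [0, t) when inter-arrival times are Exp(lam)."""
    clock, n = rng.expovariate(lam), 0
    while clock < t:
        n += 1
        clock += rng.expovariate(lam)
    return n

counts = [count_arrivals(lam, t, rng) for _ in range(runs)]
for n in range(6):
    empirical = counts.count(n) / runs
    poisson = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    print(f"P[N(t)={n}]: empirical={empirical:.4f}, Poisson={poisson:.4f}")
```

The empirical frequencies should track the Poisson probabilities, and the sample mean of the counts should be close to λt = 3.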

22 Three Definitions of the Poisson Process
A Poisson process of rate λ is a sequence of events satisfying any one of the following equivalent definitions:
Definition 1: N(0) = 0; the process has independent increments; and the number of events in any interval of length t is Poisson distributed with mean λt, i.e., ∀ t, s ≥ 0, P{N(t+s) − N(s) = n} = e^(−λt)(λt)ⁿ/n!
Definition 2: N(0) = 0 and the inter-arrival times are i.i.d. exponential random variables with rate λ
Definition 3: N(0) = 0; the process has stationary and independent increments; P{N(δ) = 1} = λδ + o(δ); and P{N(δ) ≥ 2} = o(δ)

23 Properties of the Poisson Distribution
Series relations: e^λ = Σₙ₌₀^∞ λⁿ/n!, so the Poisson probabilities e^(−λ)λⁿ/n! sum to 1
Moment derivations readily follow from these series
E[N] = λ and σ²[N] = λ in a unit time interval, i.e., for N(1)

24 Properties of the Poisson Process
The average # of arrivals in an interval of length τ is λτ (λ = arrival rate)
Probabilities of 0, 1, >1 arrivals in a small time interval [t, t+δ): P0 = 1 − λδ + o(δ), P1 = λδ + o(δ), and P>1 = o(δ), respectively
Uniformity: if k events of a Poisson process occurred in [0, t], the k events are distributed independently and uniformly in [0, t]
Merging Poisson processes of rates λk yields a Poisson process with rate λ = Σk λk (a sum of independent Poisson random variables is Poisson; the minimum of n exponential random variables is exponential)
Splitting (randomly) a Poisson process yields independent Poisson processes with rates λ1 = pλ and λ2 = (1 − p)λ (a geometric sum of i.i.d. exponential random variables is exponential: an arrival to, say, stream 1 occurs after time T1 + T2 + … + Tk, where k is geometrically distributed with parameter p and the Ti's are exponentially distributed with rate λ)
PASTA (Poisson Arrivals See Time Averages): Poisson arrivals sample the system state as if their arrival time was a random time (not true for other distributions)
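The splitting property is straightforward to check by simulation. This sketch (not from the slides; the rate and split probability are arbitrary) generates one Poisson stream and routes each arrival to stream 1 with probability p, then estimates the rates of the two sub-streams:

```python
import random

rng = random.Random(13)
lam, p, t = 4.0, 0.25, 10_000.0

# Generate a long Poisson sample path and split each arrival with prob p.
clock, n1, n2 = 0.0, 0, 0
while True:
    clock += rng.expovariate(lam)
    if clock >= t:
        break
    if rng.random() < p:
        n1 += 1
    else:
        n2 += 1

print(n1 / t, n2 / t)  # rates should be near p*lam = 1.0 and (1-p)*lam = 3.0
```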

25 Some Examples
Decay of (most) physical particles follows a Poisson process, i.e., the probability of surviving for more than t is e^(−λt)
The reliability of semiconductors is well described by an exponential distribution after an initial "infant mortality" period and during their useful operating life (the failure or hazard rate stays constant)
Arrivals of calls at a call center during an "average" hour of operation are well approximated by a Poisson process

26 Other General Results: The Level Crossing Law
In general, we are interested in a number of different system statistics Time averages, i.e., steady-state probability distribution that the system is in a given state System state seen by a random arriving customer System state seen by a random departing customer Basic question: How are those various statistics related (are they the same?) Let’s first focus on comparing system state as seen by arriving and departing customers

27 Arrival vs Departure Statistics
Consider a system where arrivals and departures are restricted to occur one at a time
Arriving (departing) customers record the number i of customers they see (leave behind) in the system
ai is the fraction of arriving customers that find i customers already in the system
di is the fraction of departing customers that leave behind i customers in the system
Ai(t) (Di(t)) is the number of arriving (departing) customers finding (leaving) i customers in the system in the time interval [0, t]
Ai(t)/A(t) → ai (almost surely) as t → ∞, i = 0, 1, …
Di(t)/D(t) → di (almost surely) as t → ∞, i = 0, 1, …

28 Arrival vs Departure Statistics (proof)
We have ai = lim(t→∞) Ai(t)/A(t) and di = lim(t→∞) Di(t)/D(t)
But |Ai(t) − Di(t)| ≤ 1 since
Each arrival to a system with i customers causes a transition to level i+1 (i → i+1, an upward crossing of level i)
Each departure from a system with i+1 customers causes a transition to level i (i+1 → i, a downward crossing of level i)
Arrivals and departures occur one at a time
So Ai(t)/A(t) and Di(t)/D(t) converge to the same limit, and ai = di, i = 0, 1, …: the Level Crossing Law
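The law ai = di can be observed in a simulated M/M/1 queue. The sketch below (my illustration; it exploits the memoryless property to decide each next event with a single coin flip) records what arrivals see and what departures leave behind:

```python
import random
from collections import Counter

rng = random.Random(17)
lam, mu = 1.0, 2.0
n = 0                      # current number in system
a, d = Counter(), Counter()
events = 500_000
for _ in range(events):
    # In state n > 0 the next event is an arrival w.p. lam/(lam+mu)
    # (race of two exponentials); in state 0 it is always an arrival.
    if n == 0 or rng.random() < lam / (lam + mu):
        a[n] += 1          # arrival sees n customers
        n += 1
    else:
        n -= 1
        d[n] += 1          # departure leaves n customers behind
total_a, total_d = sum(a.values()), sum(d.values())
for i in range(4):
    print(i, a[i] / total_a, d[i] / total_d)
```

For each i the two printed fractions should nearly coincide; with these Poisson arrivals (ρ = 0.5) they should also match the time-average probabilities (1 − ρ)ρ^i, anticipating PASTA.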

29 Further Results
We have just established the identity of the system statistics seen by arriving and departing customers
This facilitates the analysis of some queueing systems, as we can analyze them at arrival or departure times
But these are customer averages and not time averages
The fraction of time pi the system is in state i need not be equal to ai or di (arrival times are not random times)
[Figure: sample path N(t) of a D/D/1 queue, where the time averages differ from the averages seen by arriving or departing customers]

30 Poisson Arrivals See Time Averages (PASTA)
It turns out that time and customer averages are the same when arrivals occur according to a Poisson process (the only process for which this is true!)
This also requires that the service times of previously arrived customers be independent of future inter-arrival times: the Lack of Anticipation Assumption (LAA)
Two approaches to establishing the result: an intuitive approach and an explicit derivation
Consider a system with Poisson arrivals of rate λ and independent and generally distributed service times

31 Lack of Anticipation Assumption (LAA)
Why is it important? The Poisson process has independent increments, so what more do we need?
Absence of the LAA property can induce dependencies between the system state and arrivals
A simple example: Poisson arrivals, but the service time of customer i is α ≤ 1 times the inter-arrival time between customers i and i+1
Arrivals always see an empty system, but the system has 1 customer with probability α
[Figure: timeline with inter-arrival times Ai and service times Si = αAi]

32 PASTA (1)
Consider ta and tr to be randomly selected arrival and observation times, respectively
The arrival processes prior to times ta and tr are stochastically identical
The probability distributions of the time to the first arrival before ta and tr are both exponentially distributed with parameter λ
Extending this to the 2nd, 3rd, etc., arrivals before ta and tr establishes the result
But the state of the system at a given time t depends only on the arrivals (and associated service times) before t
Since the arrival processes before arrival times and random times are identical, so is the state of the system they see
QED

33 PASTA (2)
Let an(t) be the probability that an arrival at time t finds n customers in the system
Let δ be an arbitrarily small amount of time, with A(t, t+δ) the event corresponding to a single arrival in [t, t+δ)
Because of LAA and the memoryless arrival property, P{A(t, t+δ) | N(t) = n} = P{A(t, t+δ)}, so that
an(t) = lim(δ→0) P{N(t) = n | A(t, t+δ)} = lim(δ→0) P{A(t, t+δ) | N(t) = n} P{N(t) = n}/P{A(t, t+δ)} = P{N(t) = n} = pn(t)
The limit as t → ∞ gives an = pn

34 PASTA + Little’s Theorem
Consider a FIFO queue with Poisson arrivals, exponentially distributed service times, and a single server (an M/M/1 queue)
Let Na be the number of customers found in the system by an arrival
The system (queueing + service) time T of the arriving customer is given by T = (Na + 1)(1/μ) on average: the arrival waits for the Na customers present (including the residual service of the one in service, still exponential with mean 1/μ by the memoryless property) plus its own service time
But from PASTA we know that E[Na] = N, so that T = (N + 1)(1/μ)
From Little's theorem we also have N = λT
Combining the two: T = 1/(μ − λ) and N = λ/(μ − λ) = ρ/(1 − ρ), where ρ = λ/μ
Note the use of the exponential service times assumption
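The closed-form result T = 1/(μ − λ) can be checked against a direct simulation of the FIFO queue. This sketch (illustrative, not part of the slides) uses the Lindley recursion W(n+1) = max(0, W(n) + S(n) − A(n+1)) for the waiting time of successive customers:

```python
import random

rng = random.Random(19)
lam, mu = 1.0, 2.0
n = 500_000
w, total = 0.0, 0.0     # w: waiting time of the current customer
for _ in range(n):
    s = rng.expovariate(mu)       # this customer's service time
    total += w + s                # system (queueing + service) time
    a = rng.expovariate(lam)      # gap until the next arrival
    w = max(0.0, w + s - a)       # Lindley recursion for the next customer
print(total / n, 1 / (mu - lam))  # average system time vs. 1/(mu - lam)
```

With λ = 1 and μ = 2 both printed numbers should be close to 1, matching T = 1/(μ − λ).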

35 PASTA + Little’s Theorem (Discrete Version)
Consider a FIFO queue with geometric arrivals, geometrically distributed service times, and a single server (a Geom/Geom/1 queue)
The probability of an arrival in a slot is p, and the probability of a departure is q
Let Na be the number of customers found in the system by an arrival
The system (queueing + service) time T of the arriving customer is given by T = (Na + 1)(1/q) on average: each of the Na + 1 services takes 1/q slots on average, by the memoryless property of the geometric distribution
But from PASTA we know that E[Na] = N, so that T = (N + 1)(1/q)
From Little's theorem we also have N = pT
Combining the two: T = 1/(q − p) and N = p/(q − p) = ρ/(1 − ρ), where ρ = p/q
Note the use of the geometric service times assumption
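As a quick consistency check of the algebra (sketch only; the values p = 0.2 and q = 0.5 are arbitrary), the two closed forms and Little's theorem should agree:

```python
# Geom/Geom/1 closed forms: T = 1/(q - p), N = p/(q - p) = rho/(1 - rho).
p, q = 0.2, 0.5
rho = p / q
T = 1 / (q - p)
N = p / (q - p)
print(T, N, p * T, rho / (1 - rho))  # N, p*T, and rho/(1-rho) all coincide
```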

