Lecture 5: Introduction to Queuing Theory


Lecture 5 overview: Introduction to Queuing Theory; Queuing Theory Notation (reading: Bertsekas/Gallager, Section 3.3; Kleinrock, Book I); Basics of Markov Chains (reading: Bertsekas/Gallager, Appendix A; Markov Chains by J. R. Norris).

Queuing Theory Queuing Theory deals with systems of the following type: an input process feeds one or more server processes, which produce an output. [Diagram: Input Process → Server Process(es) → Output] Typically we are interested in how much queuing occurs or in the delays at the servers.

Queuing Theory Notation A standard notation is used in queuing theory to denote the type of system we are dealing with. Typical examples are:
M/M/1: Poisson input / Poisson (exponential) server / 1 server
M/G/1: Poisson input / General server / 1 server
D/G/n: Deterministic input / General server / n servers
E/G/∞: Erlangian input / General server / infinitely many servers
The first letter indicates the input process, the second letter is the server process and the number is the number of servers. (M = Memoryless = Poisson)

The M/M/1 Queue The simplest queue is the M/M/1 queue. Recall that a Poisson process has the following characteristic: P{A(t) = n} = e^(−λt) (λt)^n / n!, where A(t) is the number of events (arrivals) up to time t. Let us assume that the arrival process is Poisson with mean rate λ and the service process is Poisson with mean rate μ.
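As a quick sanity check on the M/M/1 model, we can simulate it with the Lindley recursion W[n+1] = max(0, W[n] + S[n] − A[n+1]) and compare the simulated mean waiting time in queue against the analytic value ρ/(μ − λ). The parameters below (λ = 1, μ = 2) are illustrative choices, not from the slides:

```python
import random

# Simulate an M/M/1 queue via the Lindley recursion and compare the
# mean waiting time in queue with the analytic value rho / (mu - lambda).
# Parameters are illustrative, not taken from the lecture.
random.seed(42)

lam, mu = 1.0, 2.0          # arrival and service rates
rho = lam / mu              # utilisation; must be < 1 for stability
n_customers = 200_000

wait, total_wait = 0.0, 0.0
for _ in range(n_customers):
    total_wait += wait
    service = random.expovariate(mu)       # exponential service time
    interarrival = random.expovariate(lam)  # exponential interarrival time
    wait = max(0.0, wait + service - interarrival)

mean_wait = total_wait / n_customers
analytic = rho / (mu - lam)   # = 0.5 for these parameters
print(f"simulated {mean_wait:.3f} vs analytic {analytic:.3f}")
```

With 200,000 simulated customers the two values typically agree to within a few percent.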

Poisson Processes (a refresher) Interarrival times are i.i.d. and exponentially distributed with parameter λ. If tn is the arrival time of packet n and τn = tn+1 − tn, then for every t ≥ 0 and δ ≥ 0: P{τn ≤ t + δ | τn > t} = P{τn ≤ δ} = 1 − e^(−λδ).
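The exponential interarrival law is easy to verify numerically: draw a large sample of exponential interarrival times and compare the empirical fraction below a threshold s with the CDF 1 − e^(−λs). The rate λ = 2 and threshold s = 0.7 below are arbitrary illustrative choices:

```python
import math
import random

# Check the empirical CDF of exponential interarrival times against
# P{tau <= s} = 1 - e^(-lam*s). Parameters are illustrative.
random.seed(0)
lam, s, n = 2.0, 0.7, 100_000

samples = [random.expovariate(lam) for _ in range(n)]
empirical = sum(1 for tau in samples if tau <= s) / n
theoretical = 1 - math.exp(-lam * s)
print(f"empirical {empirical:.3f}, theoretical {theoretical:.3f}")
```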

Poisson Processes (a refresher) If two or more Poisson processes (A1, A2, ..., Ak) with different means (λ1, λ2, ..., λk) are merged, then the resultant process is Poisson with mean λ given by: λ = λ1 + λ2 + ... + λk. If a Poisson process is split into two (or more) by independently assigning arrivals to streams then the resultant processes are all Poisson. Because of the memoryless property of the Poisson process, an ideal tool for investigating this type of system is the Markov chain.
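The merging property can also be checked by simulation: generate two independent Poisson streams, merge and sort their arrival times, and confirm the merged interarrival times have mean 1/(λ1 + λ2). The rates below are illustrative:

```python
import random

# Merge two independent Poisson streams and check that the merged
# interarrival times have mean 1 / (lam1 + lam2). Rates are illustrative.
random.seed(1)
lam1, lam2, horizon = 1.5, 2.5, 20_000.0

def poisson_arrivals(lam, horizon):
    """Arrival times of a Poisson process with rate lam, up to horizon."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

merged = sorted(poisson_arrivals(lam1, horizon) + poisson_arrivals(lam2, horizon))
gaps = [b - a for a, b in zip(merged, merged[1:])]
mean_gap = sum(gaps) / len(gaps)
print(f"mean gap {mean_gap:.4f}, expected {1 / (lam1 + lam2):.4f}")
```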

On the Buses (a paradoxical property of Poisson Processes) You are waiting for a bus. The timetable says that buses are every 30 minutes. (But who believes bus timetables?) As a mathematician, you have observed that, in fact, the buses form a Poisson process with a mean arrival rate such that the expected time between buses is 30 minutes. You arrive at a random time at the bus stop. What is your expected wait for a bus? What is the expected time since the last bus? The tempting answer is 15 minutes: after all, buses are, on average, 30 minutes apart. The correct answer is 30 minutes: as we have said, a Poisson process is memoryless, so logically the expected waiting time must be the same whether you arrive just after a previous bus or a full hour since the previous bus.
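The paradox is easy to confirm by simulation: generate a long Poisson bus schedule with mean gap 30 minutes, drop a passenger in at a uniformly random time, and measure the wait until the next bus. The simulation parameters (horizon, number of trials) are illustrative:

```python
import bisect
import random

# Waiting-time paradox: buses form a Poisson process with mean gap 30
# minutes; a passenger arriving at a random time still waits 30 minutes
# on average, not 15. Simulation parameters are illustrative.
random.seed(7)
mean_gap, horizon, trials = 30.0, 1_000_000.0, 20_000

# One long realisation of the bus arrival process.
t, buses = 0.0, []
while t < horizon:
    t += random.expovariate(1.0 / mean_gap)
    buses.append(t)

waits = []
for _ in range(trials):
    arrive = random.uniform(0.0, horizon * 0.99)
    nxt = buses[bisect.bisect_right(buses, arrive)]  # next bus after arrival
    waits.append(nxt - arrive)

mean_wait = sum(waits) / len(waits)
print(f"mean wait {mean_wait:.1f} minutes")  # close to 30, not 15
```

The intuition: a random arrival time is more likely to land inside a long gap than a short one, which exactly cancels the "halfway through a gap" effect for the exponential distribution.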

Introduction to Markov Chains Some process (or time series) {Xn | n = 0, 1, 2, ...} takes values in the nonnegative integers. The process is a Markov chain if, whenever it is in state i, the probability of being in state j next is pij, regardless of the earlier history. This is, of course, another way of saying that a Markov chain is memoryless. The pij are the transition probabilities.

Visualising Markov Chains (the confused hippy hitcher example) [Diagram: three towns A, B, C. From A he moves to B with probability 3/4 and to C with probability 1/4; from B to A with probability 1/3 and to C with probability 2/3; from C to A with probability 1/2 and to B with probability 1/2.] A hitchhiking hippy begins at town A. For some reason he has poor short-term memory and travels at random according to the probabilities shown. What is the chance he is back at A after 2 days? What about after 3 days? Where is he likely to end up?

The Hippy Hitcher (continued) After 1 day he will be in town B with probability 3/4 or town C with probability 1/4. The probability of returning to A after 2 days via B is 3/4 × 1/3 = 3/12, and via C it is 1/4 × 1/2 = 1/8, a total of 3/8. We can perform similar calculations for 3 or 4 days, but it quickly becomes tedious, and finding which city he is most likely to end up in by direct enumeration is impractical.
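The two-day calculation above can be done exactly with a few lines of code, using the transition probabilities read off the diagram (A→B 3/4, A→C 1/4, B→A 1/3, B→C 2/3, C→A 1/2, C→B 1/2):

```python
from fractions import Fraction as F

# Transition probabilities of the hitchhiker's chain, as exact fractions.
P = {
    'A': {'A': F(0), 'B': F(3, 4), 'C': F(1, 4)},
    'B': {'A': F(1, 3), 'B': F(0), 'C': F(2, 3)},
    'C': {'A': F(1, 2), 'B': F(1, 2), 'C': F(0)},
}

# P{back at A after 2 days} = sum over intermediate towns k of p_Ak * p_kA
two_day = sum(P['A'][k] * P[k]['A'] for k in 'ABC')
print(two_day)  # 3/8
```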

Transition Matrix Instead we can represent the transitions as a matrix, with one row per current state and one column per next state:

       A    B    C
  A [  0   3/4  1/4 ]
  B [ 1/3   0   2/3 ]
  C [ 1/2  1/2   0  ]

For example, the entry in row A, column B is the probability of going to B from A (3/4), and the entry in row C, column A is the probability of going to A from C (1/2).

Markov Chain Transition Basics The pij are the transition probabilities of a chain. They have the following properties: pij ≥ 0 for all i and j, and Σj pij = 1 for every i (each row sums to one). The corresponding probability matrix is P = (pij).

Transition Matrix Define n as a distribution vector representing the probabilities of each state at time step n. We can now define 1 step in our chain as: And clearly, by iterating this, after m steps we have:

The Return of the Hippy Hitcher What does this imply for our hippy? We know the initial state vector: π0 = (1, 0, 0), since he starts at A with certainty. So we can calculate πn with a little drudge work. (If you get bored raising P to the power n then you can use a computer.) But which city is the hippy likely to end up in? We want to know limn→∞ πn.

Invariant (or equilibrium) probabilities Assuming the limit exists, the limiting distribution vector π is known as the vector of invariant or equilibrium probabilities. We might think of them as being the proportion of the time that the system spends in each state or, alternatively, as the probability of finding the system in a given state at a particular time. They can be found by finding a distribution which solves the equation: π = πP (together with Σi πi = 1). We will formalise these ideas in a subsequent lecture.
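One simple way to approximate the equilibrium distribution is power iteration: repeatedly apply π ← πP until the vector stops changing. For the hitchhiker's chain this converges (by my calculation, not from the slides) to π = (16/55, 21/55, 18/55):

```python
# Power iteration for the equilibrium distribution of the hitchhiker's
# chain: iterate pi <- pi P until convergence.
P = [[0.0, 0.75, 0.25],
     [1 / 3, 0.0, 2 / 3],
     [0.5, 0.5, 0.0]]

pi = [1 / 3, 1 / 3, 1 / 3]   # any starting distribution works
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])
```

So town B is where the hippy is most likely to be found in the long run.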

Some Notation for Markov Chains Formally, a process Xn is a Markov chain with initial distribution λ and transition matrix P if: P{X0 = i} = λi (where λi is the ith element of λ), and P{Xn+1 = j | Xn = i, Xn−1 = xn−1, ..., X0 = x0} = P{Xn+1 = j | Xn = i} = pij. For short we say Xn is Markov(λ, P). We now introduce the notation for an n-step transition: pij(n) = P{Xm+n = j | Xm = i}, the (i, j) entry of P^n. And note in passing that: pij(m+n) = Σk pik(m) pkj(n). This is the Chapman-Kolmogorov equation.
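The Chapman-Kolmogorov equation says, in matrix form, that P^(m+n) = P^m P^n, which is easy to verify numerically for the example chain:

```python
# Numerical check of the Chapman-Kolmogorov equation for the
# hitchhiker's chain: P^(m+n) equals P^m times P^n.
P = [[0.0, 0.75, 0.25],
     [1 / 3, 0.0, 2 / 3],
     [0.5, 0.5, 0.0]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(A, p):
    result = A
    for _ in range(p - 1):
        result = matmul(result, A)
    return result

lhs = matpow(P, 5)                        # p_ij^(5)
rhs = matmul(matpow(P, 2), matpow(P, 3))  # sum_k p_ik^(2) p_kj^(3)
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
         for i in range(3) for j in range(3))
print(ok)  # True
```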