20. Extinction Probability for Queues and Martingales


20.1 Extinction Probability for Queues

(Refer to Section 15.6 of the text (Branching Processes) for a discussion of the extinction probability.)

A customer arrives at an empty server and immediately goes into service, initiating a busy period. During that service period other customers may arrive, and if so they wait for service. The server continues to be busy until the last waiting customer completes service, which marks the end of the busy period. An interesting question is whether busy periods are bound to terminate at some point.

Do busy periods continue forever, or do such queues come to an end sooner or later? If so, how?

Slow traffic ($\lambda < \mu$): steady-state solutions exist, and the probability of extinction equals 1. (Busy periods are bound to terminate with probability 1; follows from Section 15.6, Theorem 15-9.)

Heavy traffic ($\lambda > \mu$): steady-state solutions do not exist, and such queues can be characterized by their probability of extinction.

Steady-state solutions exist if the traffic rate $\rho = \lambda/\mu < 1$. What if too many customers rush in and/or the service rate is slow ($\rho > 1$)? How do we characterize such queues?

Extinction Probability for Population Models (Fig. 20.1)

Queues and Population Models

Population models: let $X_n$ denote the size of the $n$th generation, and let $Y_i$ denote the number of offspring for the $i$th member of the $n$th generation, so that, from Eq. (15-287), Text,

$$X_{n+1} = \sum_{i=1}^{X_n} Y_i.$$

Offspring moment generating function (Fig. 20.2): let

$$P(z) = E\{z^{Y_i}\} = \sum_{k=0}^{\infty} a_k z^k, \qquad a_k = P\{Y_i = k\}. \qquad (20\text{-}1)$$

The extinction probability $\pi_0$ is the smallest nonnegative root of the equation

$$z = P(z), \qquad (20\text{-}2)$$

which can be solved iteratively as follows:

$$x_1 = P(x_0), \quad x_0 = 0, \qquad (20\text{-}3)$$

and

$$x_{k+1} = P(x_k), \quad k = 1, 2, \ldots; \qquad (20\text{-}4)$$

the iterates increase monotonically to

$$\lim_{k \to \infty} x_k = \pi_0. \qquad (20\text{-}5)$$
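
The iteration (20-3)-(20-4) is easy to carry out numerically. Below is a minimal sketch in Python (the function name and the example offspring pmfs are illustrative, not from the text): it iterates $x \leftarrow P(x)$ from $x_0 = 0$, which converges to the smallest nonnegative root of $z = P(z)$.

```python
# Fixed-point iteration (20-3)-(20-4) for the extinction probability,
# given an offspring distribution as a list of probabilities a_k = P{Y = k}.
def extinction_probability(a, tol=1e-12, max_iter=10_000):
    """Smallest nonnegative root of z = P(z), P being the offspring pgf."""
    P = lambda z: sum(ak * z**k for k, ak in enumerate(a))
    x = 0.0                      # x_0 = 0, so x_1 = P(0) = a_0
    for _ in range(max_iter):
        x_next = P(x)
        if abs(x_next - x) < tol:
            break
        x = x_next
    return x_next

# Illustrative offspring pmfs (a_0, a_1, a_2):
print(extinction_probability([0.25, 0.25, 0.50]))  # mean 1.25 > 1: pi_0 = 0.5
print(extinction_probability([0.50, 0.25, 0.25]))  # mean 0.75 <= 1: pi_0 = 1
```

For the supercritical pmf the iteration settles at $\pi_0 = 0.5$, the smaller root of $0.5\pi^2 - 0.75\pi + 0.25 = 0$, while the subcritical pmf yields $\pi_0 = 1$, in agreement with (20-6)-(20-7) below.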

Let $\pi_0 = P\{\text{population eventually dies out}\}$. Left to themselves, in the long run, populations either (a) die out completely with probability $\pi_0$, or (b) explode with probability $1 - \pi_0$ (both unpleasant conclusions!). Review Theorem 15-9 (Text): with $\rho = E\{Y_i\} = P'(1)$,

$$\rho \le 1 \;\Rightarrow\; \pi_0 = 1, \qquad (20\text{-}6)$$

$$\rho > 1 \;\Rightarrow\; \pi_0 < 1 \ \text{is the unique root of } z = P(z) \text{ in } (0,1). \qquad (20\text{-}7)$$

(Fig. 20.3: graphs of $z$ and $P(z)$ in the two cases (a) $\rho \le 1$ and (b) $\rho > 1$.)

Queues: the first customer initiates a busy period, the customers that arrive during the first service time are served next, and so on (Fig. 20.4: service time of the first customer, of the second customer, ..., making up the busy period). The customers arriving during one service time thus play the role of that customer's "offspring". Note that the statistics $\{a_k\}$ depend both on the arrival and on the service phenomena.

Let $a_k = P\{Y = k\}$, the inter-departure statistics generated by arrivals (the number of customers arriving during one service time), and let

$$\rho = E\{Y\} = P'(1)$$

denote the traffic intensity: $\rho < 1$ corresponds to steady state, $\rho > 1$ to heavy traffic. Termination of busy periods corresponds to extinction of queues, and from the analogy with population models the extinction probability $\pi_0$ is the unique root of the equation

$$z = P(z).$$

Slow traffic ($\rho \le 1$): $\pi_0 = 1$. Heavy traffic ($\rho > 1$): $\pi_0 < 1$; i.e., unstable queues either terminate their busy periods with probability $\pi_0$, or they continue to be busy forever with probability $1 - \pi_0$. Interestingly, there is a finite probability of busy-period termination even for unstable queues; $\pi_0$ thus serves as a measure of stability for unstable queues.

Example 20.1: M/M/1 queue. From (15-221), Text, we have

$$a_k = P\{Y = k\} = \left(\frac{\lambda}{\lambda + \mu}\right)^{k} \frac{\mu}{\lambda + \mu}, \quad k = 0, 1, 2, \ldots, \qquad (20\text{-}8)$$

i.e., the number of arrivals between any two departures follows a geometric random variable, with moment generating function

$$P(z) = \frac{\mu}{\mu + \lambda(1 - z)} = \frac{1}{1 + \rho(1 - z)}, \qquad \rho = \frac{\lambda}{\mu}. \qquad (20\text{-}9)$$

Solving $\pi_0 = P(\pi_0)$ gives $\rho \pi_0^2 - (1 + \rho)\pi_0 + 1 = 0$, whose roots are $1$ and $1/\rho$; hence (Fig. 20.5)

$$\pi_0 = \begin{cases} 1, & \rho \le 1, \\ 1/\rho = \mu/\lambda, & \rho > 1. \end{cases} \qquad (20\text{-}10)$$
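
As a sanity check, the fixed-point iteration applied directly to (20-9) recovers $\pi_0 = 1/\rho$ for an unstable M/M/1 queue. A minimal sketch (function name and the value of $\rho$ are illustrative):

```python
# Check pi_0 = 1/rho for an unstable M/M/1 queue (rho > 1) by iterating
# x <- P(x) with the geometric-arrival pgf P(z) = 1 / (1 + rho*(1 - z)).
def mm1_extinction(rho, iters=10_000):
    x = 0.0
    for _ in range(iters):
        x = 1.0 / (1.0 + rho * (1.0 - x))
    return x

rho = 1.6
print(mm1_extinction(rho), 1 / rho)   # both ~0.625
```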

Example 20.2: Bulk arrivals, M$^{[x]}$/M/1 queue. Compound Poisson arrivals: departures are exponential random variables, as in a Poisson process with parameter $\mu$. Similarly, arrivals are Poisson with parameter $\lambda$; however, each arrival can contain multiple jobs (Fig. 20.6). Let $c_k = P\{X_i = k\}$ denote the distribution of the number of jobs in the $i$th arrival, and let

$$C(z) = E\{z^{X_i}\} = \sum_{k} c_k z^k$$

represent the bulk-arrival statistics.

Inter-departure statistics of arrivals: with $Y$ the total number of jobs arriving during one inter-departure (service) interval, conditioning on the number of Poisson arrivals in that interval and using (20-8),

$$P(z) = E\{z^{Y}\} = \sum_{n=0}^{\infty} [C(z)]^{n} \left(\frac{\lambda}{\lambda + \mu}\right)^{n} \frac{\mu}{\lambda + \mu} \qquad (20\text{-}11)$$

$$= \frac{\mu}{\mu + \lambda\,(1 - C(z))}. \qquad (20\text{-}12)$$
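
Given any bulk-size generating function $C(z)$, the extinction probability again follows by iterating $x \leftarrow P(x)$ with (20-12). A minimal sketch (the geometric bulk-size pgf with mean 2 is an assumed example, and the function name is illustrative):

```python
# Extinction probability for a bulk-arrival M[x]/M/1 queue via (20-12),
# iterating pi <- P(pi) with P(z) = mu / (mu + lam * (1 - C(z))).
def bulk_extinction(lam, mu, C, iters=10_000):
    x = 0.0
    for _ in range(iters):
        x = mu / (mu + lam * (1.0 - C(x)))
    return x

# Assumed bulk-size pgf: geometric sizes k >= 1 with P{k} = 0.5**k (mean 2).
C_geometric = lambda z: 0.5 * z / (1 - 0.5 * z)
print(bulk_extinction(lam=1.0, mu=1.5, C=C_geometric))
```

Here the total job arrival rate is $2\lambda = 2 > \mu = 1.5$, so the queue is unstable; the iteration converges to $\pi_0 = 6/7 \approx 0.857$, the smaller root of $7x^2 - 13x + 6 = 0$.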

Bulk arrivals (contd): for compound Poisson arrivals with geometric bulk size, taking

$$C(z) = \frac{(1 - \alpha)z}{1 - \alpha z} \qquad (20\text{-}13)$$

in (20-12) gives

$$P(z) = \frac{\mu (1 - \alpha z)}{\mu (1 - \alpha z) + \lambda (1 - z)}, \qquad (20\text{-}14)$$

and doubly Poisson arrivals, $C(z) = e^{-c(1-z)}$, give

$$P(z) = \frac{\mu}{\mu + \lambda\left(1 - e^{-c(1-z)}\right)}. \qquad (20\text{-}15)$$

(Fig. 20.7: extinction probabilities, (a) M/M/1 versus (b) bulk arrivals.)

Example 20.3: M/E$_n$/1 queue ($n$-phase exponential service). From (16-213), Text,

$$P(z) = \left(\frac{n\mu}{n\mu + \lambda(1 - z)}\right)^{n} = \left(1 + \frac{\rho(1 - z)}{n}\right)^{-n}, \qquad (20\text{-}16)$$

so that $\pi_0$ is the smallest positive root of

$$\pi_0 = \left(1 + \frac{\rho(1 - \pi_0)}{n}\right)^{-n}. \qquad (20\text{-}17)$$

Example 20.4: M/D/1 queue. Letting $n \to \infty$ in (16-213), Text, we obtain $P(z) = e^{-\rho(1 - z)}$, so that

$$\pi_0 = e^{-\rho(1 - \pi_0)}, \qquad \rho > 1. \qquad (20\text{-}18)$$
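
Eq. (20-18) is transcendental, but the same fixed-point iteration solves it; a minimal sketch (function name and the values of $\rho$ are illustrative):

```python
# Extinction probability of busy periods for an unstable M/D/1 queue,
# solving pi_0 = exp(-rho * (1 - pi_0)) of (20-18) by fixed-point iteration.
import math

def md1_extinction(rho, iters=10_000):
    x = 0.0
    for _ in range(iters):
        x = math.exp(-rho * (1.0 - x))
    return x

for rho in (1.2, 1.5, 2.0):
    print(rho, md1_extinction(rho))
```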

20.2 Martingales

Martingales refer to a specific class of stochastic processes that maintain a form of "stability" in an overall sense. Let $\{X_n\}$ refer to a discrete-time stochastic process. If $n$ refers to the present instant, then in any realization the random variables $X_0, X_1, \ldots, X_n$ are known, and the future values $X_{n+1}, X_{n+2}, \ldots$ are unknown. The process is "stable" in the sense that, conditioned on the available information (past and present), no change is expected on the average for the future values, and hence the conditional expectation of the immediate future value is the same as that of the present value. Thus, if

$$E\{X_{n+1} \mid X_n, X_{n-1}, \ldots, X_0\} = X_n \qquad (20\text{-}19)$$

for all $n$, then the sequence $\{X_n\}$ represents a martingale. Historically, "martingale" refers to the "doubling the stake" strategy in gambling, where the gambler doubles the bet on every loss until the almost-sure win eventually occurs, at which point the entire loss is recovered together with a modest profit. Problems 15-6 and 15-7, Chapter 15, Text, give examples of martingales [also refer to Section 15-5, Text].

If $\{X_n\}$ refers to a Markov chain, then, as we have seen, with $p_{ij} = P\{X_{n+1} = j \mid X_n = i\}$, Eq. (20-19) reduces to the simpler expression [Eq. (15-224), Text]

$$\sum_{j} j\, p_{ij} = i. \qquad (20\text{-}20)$$

For finite chains of size $N$, interestingly, Eq. (20-20) reads

$$P\,x = x, \qquad x = [1, 2, \ldots, N]^{T}, \qquad (20\text{-}21)$$

implying that $x$ is a right-eigenvector of the $N \times N$ transition probability matrix $P$ associated with the eigenvalue 1. However, the "all one" vector is always an eigenvector for any $P$ corresponding to the unit eigenvalue [see Eq. (15-179), Text], and from Perron's theorem and the discussion there [Theorem 15-8, Text] it follows that, for finite Markov chains that are also martingales, $P$ cannot be a primitive matrix, and the corresponding chains are in fact not irreducible. Hence every finite-state martingale has at least two closed sets embedded in it. (The closed sets in the two martingales in Example 15-13, Text, correspond to two absorbing states. The same is true for the branching processes discussed in the next example; refer to the remarks following Eq. (20-7).)
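
A concrete instance of such a chain is the symmetric gambler's ruin walk, whose two absorbing barriers form the two closed sets. A minimal sketch verifying the eigenvector property of (20-21) numerically (state labels $0, \ldots, N$ and variable names are illustrative):

```python
# A finite-state martingale chain: symmetric gambler's ruin with absorbing
# barriers at 0 and N. Verifies P x = x for the state vector x, as in (20-21).
import numpy as np

N = 5
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0              # two absorbing (closed) states
for i in range(1, N):
    P[i, i - 1] = P[i, i + 1] = 0.5  # fair one-step gains/losses

x = np.arange(N + 1, dtype=float)    # state labels 0, 1, ..., N
print(np.allclose(P @ x, x))         # True: right-eigenvector at eigenvalue 1
```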

Example 20.5: As another example, let $\{X_n\}$ represent the branching process discussed in Section 15-6, Eq. (15-287), Text. Then $Z_n$ given by

$$Z_n = \pi_0^{X_n} \qquad (20\text{-}22)$$

is a martingale, where the $Y_i$'s in $X_{n+1} = \sum_{i=1}^{X_n} Y_i$ are independent, identically distributed random variables, and $\pi_0$ refers to the extinction probability for that process [see Theorem 15-9, Text]. To see this, note that

$$E\{Z_{n+1} \mid Z_n, \ldots, Z_0\} = E\left\{\pi_0^{\sum_{i=1}^{X_n} Y_i} \,\Big|\, X_n, \ldots, X_0\right\} = \left[E\{\pi_0^{Y_i}\}\right]^{X_n} = [P(\pi_0)]^{X_n} = \pi_0^{X_n} = Z_n, \qquad (20\text{-}23)$$

where we have used the Markov property of the chain, the fact that the $Y_i$'s are independent of $X_n$ [see (15-2), Text], the common moment generating function $P(z)$ of the $Y_i$'s, and Theorem 15-9, Text, according to which $P(\pi_0) = \pi_0$.
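
A quick empirical check of (20-22)-(20-23): for an assumed supercritical offspring pmf with $\pi_0 = 0.5$ (the same illustrative example as before; all names below are likewise illustrative), the sample mean of $Z_n = \pi_0^{X_n}$ stays at $Z_0 = \pi_0$ across generations.

```python
# Monte Carlo check that Z_n = pi_0 ** X_n is a martingale: E{Z_n} = E{Z_0}.
import random

A = [0.25, 0.25, 0.50]        # assumed offspring pmf a_k = P{Y = k}, mean 1.25
PI0 = 0.5                     # smallest nonnegative root of z = P(z) for A

def z_means(n_gen=6, trials=20_000):
    means = [0.0] * (n_gen + 1)
    for _ in range(trials):
        x = 1                 # X_0 = 1
        means[0] += PI0 ** x
        for n in range(1, n_gen + 1):
            # next generation: one offspring draw per current member
            x = sum(random.choices(range(3), weights=A, k=x))
            means[n] += PI0 ** x
    return [round(m / trials, 3) for m in means]

print(z_means())              # all entries ~0.5 = pi_0 ** X_0
```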

Example 20.6 (DeMoivre's martingale): The gambler's ruin problem (see Example 3-15, Text) also gives rise to various martingales (see Problem 15-7 for an example). From there, if $S_n$ refers to player A's cumulative capital at stage $n$ (note that $S_0 = \$a$), then, as DeMoivre observed,

$$Y_n = \left(\frac{q}{p}\right)^{S_n} \qquad (20\text{-}24)$$

generates a martingale. This follows since the instantaneous gain or loss given by $Z_{n+1} = S_{n+1} - S_n$ obeys

$$P\{Z_{n+1} = 1\} = p, \qquad P\{Z_{n+1} = -1\} = q = 1 - p, \qquad (20\text{-}25)$$

and hence

$$E\{Y_{n+1} \mid Y_n, \ldots, Y_0\} = E\left\{\left(\frac{q}{p}\right)^{S_n + Z_{n+1}} \,\Big|\, S_n, \ldots, S_0\right\} = E\left\{\left(\frac{q}{p}\right)^{S_n + Z_{n+1}} \,\Big|\, S_n\right\}, \qquad (20\text{-}26)$$

since $\{S_n\}$ generates a Markov chain. Thus

$$E\{Y_{n+1} \mid Y_n, \ldots, Y_0\} = \left(\frac{q}{p}\right)^{S_n} \left[p \cdot \frac{q}{p} + q \cdot \frac{p}{q}\right] = \left(\frac{q}{p}\right)^{S_n} (q + p) = Y_n, \qquad (20\text{-}27)$$

i.e., $Y_n$ in (20-24) defines a martingale!

Martingales have excellent convergence properties in the long run. To start with, from (20-19), for any given $n$, taking expectations on both sides we get

$$E\{X_{n+1}\} = E\{X_n\} = \cdots = E\{X_0\}. \qquad (20\text{-}28)$$

Observe that, as stated, (20-28) is true only when $n$ is known, i.e., when $n$ is a given number. As the following result shows, martingales do not fluctuate wildly: there is in fact only a small probability that a large deviation of a martingale from its initial value will occur.
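
The one-step computation in (20-27) can be checked numerically for any capital $s$ (the values of $p$ and $s$ below are assumed for illustration):

```python
# Numerical check of (20-27): one step of DeMoivre's martingale. For any
# current capital s, E{(q/p)**S_{n+1} | S_n = s} equals (q/p)**s.
p, q = 0.6, 0.4
for s in (3, 7, 10):
    lhs = p * (q / p) ** (s + 1) + q * (q / p) ** (s - 1)
    print(lhs, (q / p) ** s)    # equal for every s
```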

Hoeffding's inequality: Let $\{X_n\}$ represent a martingale and $\sigma_1, \sigma_2, \ldots$ be a sequence of real numbers such that the random variables

$$Y_i = \frac{X_i - X_{i-1}}{\sigma_i} \quad \text{are bounded by unity, } |Y_i| \le 1, \text{ almost surely.} \qquad (20\text{-}29)$$

Then

$$P\{|X_n - X_0| \ge x\} \le 2\,e^{-x^2 / \left(2\sum_{i=1}^{n} \sigma_i^2\right)}. \qquad (20\text{-}30)$$

Proof: Eqs. (20-29)-(20-30) state that, so long as the martingale increments remain bounded almost surely, there is only a very small chance that a large deviation occurs between $X_n$ and $X_0$. We shall prove (20-30) in three steps.

(i) For any convex function $f(x)$ and $0 \le \alpha \le 1$, we have (Fig. 20.8)

$$f(\alpha x_1 + (1 - \alpha) x_2) \le \alpha f(x_1) + (1 - \alpha) f(x_2), \qquad (20\text{-}31)$$

which for $f(x) = e^{sx}$, $x_1 = -1$, $x_2 = 1$, and $\alpha = \frac{1 - a}{2}$ with $|a| \le 1$ gives

$$e^{sa} \le \frac{1 - a}{2}\, e^{-s} + \frac{1 + a}{2}\, e^{s}. \qquad (20\text{-}32)$$

Replacing $a$ in (20-32) with any zero-mean random variable $Y$ that is bounded by unity almost everywhere, and taking expected values on both sides, we get

$$E\{e^{sY}\} \le \frac{e^{s} + e^{-s}}{2} = \cosh s \le e^{s^2/2} \qquad (20\text{-}33)$$

(the last step follows by comparing the two Taylor series term by term, since $(2k)! \ge 2^k k!$). Note that the right side of (20-33) is independent of $Y$. On the other hand, the $Y_i$'s in (20-29) have zero conditional mean by the martingale property, and since they are bounded by unity, from (20-32) we get, as in (20-33),

$$E\left\{e^{s\sigma_i Y_i} \mid X_{i-1}, \ldots, X_0\right\} \le e^{s^2 \sigma_i^2 / 2}. \qquad (20\text{-}34)$$

(Fig. 20.8: a convex function lies below the chord joining any two of its points.)

Conditioning successively on $X_{n-1}, \ldots, X_0$ and applying (20-34) at each step [use (20-29)] yields

$$E\{e^{s(X_n - X_0)}\} = E\left\{e^{s(X_{n-1} - X_0)}\, E\{e^{s\sigma_n Y_n} \mid X_{n-1}, \ldots, X_0\}\right\} \le e^{s^2 \sigma_n^2/2}\, E\{e^{s(X_{n-1} - X_0)}\}. \qquad (20\text{-}35)$$

(ii) To make use of (20-35), refer back to the Markov inequality in (5-89), Text; it can be rewritten as

$$P\{X \ge a\} = P\{e^{sX} \ge e^{sa}\} \le e^{-sa}\, E\{e^{sX}\}, \quad s > 0, \qquad (20\text{-}36)$$

and with $X = X_n - X_0$ and $a = x$,

$$P\{X_n - X_0 \ge x\} \le e^{-sx}\, E\{e^{s(X_n - X_0)}\}. \qquad (20\text{-}37)$$

But iterating (20-35) down to $X_0$ gives

$$E\{e^{s(X_n - X_0)}\} \le e^{s^2 \sum_{i=1}^{n} \sigma_i^2 / 2}. \qquad (20\text{-}38)$$

Substituting (20-38) into (20-37) we get

$$P\{X_n - X_0 \ge x\} \le e^{-sx + s^2 \sum_{i=1}^{n} \sigma_i^2 / 2}. \qquad (20\text{-}39)$$

(iii) Observe that the exponent on the right side of (20-39) is minimized for $s = x / \sum_{i=1}^{n} \sigma_i^2$, and hence it reduces to

$$P\{X_n - X_0 \ge x\} \le e^{-x^2 / \left(2\sum_{i=1}^{n} \sigma_i^2\right)}. \qquad (20\text{-}40)$$

The same result holds when $X_n - X_0$ is replaced by $X_0 - X_n$, and adding the two bounds we get (20-30), Hoeffding's inequality.
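
To see the bound in action, here is a minimal empirical check of (20-30) for the fair $\pm 1$ random walk, a martingale with $\sigma_i = 1$ (parameter values and function name are illustrative):

```python
# Empirical check of Hoeffding's inequality (20-30) for a fair +/-1 random
# walk: a martingale with sigma_i = 1, so |X_i - X_{i-1}| <= 1 a.s.
import math
import random

def deviation_probability(n, x, trials=100_000):
    """Estimate P{|X_n - X_0| >= x} for the symmetric random walk."""
    count = 0
    for _ in range(trials):
        walk = sum(random.choice((-1, 1)) for _ in range(n))
        if abs(walk) >= x:
            count += 1
    return count / trials

n, x = 100, 25
bound = 2 * math.exp(-x**2 / (2 * n))      # (20-30) with sum sigma_i^2 = n
print(deviation_probability(n, x), "<=", bound)   # ~0.012 <= ~0.088
```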

From (20-28), for any fixed $n$, the mean value $E\{X_n\}$ equals $E\{X_0\}$. Under what conditions is this result true if we replace $n$ by a random time $T$? That is, if $T$ is a random variable, when is

$$E\{X_T\} = E\{X_0\}? \qquad (20\text{-}41)$$

The answer turns out to be that $T$ has to be a stopping time. What is a stopping time? A stochastic process may be known to assume a particular value, but the time at which it happens is in general unpredictable or random. In other words, the nature of the outcome is fixed but the timing is random. When that outcome actually occurs, the time instant corresponds to a stopping time. Consider a gambler starting with $\$a$, and let $T$ refer to the time instant at which his capital becomes $\$1$. The random variable $T$ represents a stopping time. When the capital becomes zero, it corresponds to the gambler's ruin, and that instant represents another stopping time (time to go home for the gambler!).

Recall that in a Poisson process the occurrences of the first, second, $\ldots$ arrivals correspond to stopping times $T_1, T_2, \ldots$. Stopping times refer to those random instants at which there is sufficient information to decide whether or not a specific condition is satisfied.

Stopping time: The random variable $T$ is a stopping time for the process $X(t)$ if, for all $t \ge 0$, the event $\{T \le t\}$ is a function of the values $\{X(\tau), \tau \le t\}$ of the process up to $t$; i.e., it should be possible to decide whether or not $T$ has occurred by time $t$, knowing only the values of the process $X(t)$ up to that time $t$. Thus the Poisson arrival times $T_1$ and $T_2$ referred to above are stopping times; however, $T_2 - T_1$ is not a stopping time. A key result on martingales states that, so long as

$T$ is a stopping time (under some additional mild restrictions),

$$E\{X_T\} = E\{X_0\}. \qquad (20\text{-}42)$$

Notice that (20-42) generalizes (20-28) to certain random time instants (stopping times) as well. Eq. (20-42) is an extremely useful tool in analyzing martingales. We shall illustrate its usefulness by rederiving the gambler's ruin probability of Example 3-15, Eq. (3-47), Text.

From Example 20.6, $Y_n$ in (20-24) is a martingale in the gambler's ruin problem. Let $T$ refer to the random instant at which the game ends, i.e., the instant at which either player A loses all his wealth, with associated ruin probability $P_a$, or player A gains all the wealth $\$(a+b)$, with probability $(1 - P_a)$. In that case $T$ is a stopping time, and hence from (20-42) we get

$$E\{Y_T\} = E\{Y_0\} = \left(\frac{q}{p}\right)^{a}, \qquad (20\text{-}43)$$

since player A starts with $\$a$ in Example 3-15. But

$$E\{Y_T\} = P_a \left(\frac{q}{p}\right)^{0} + (1 - P_a) \left(\frac{q}{p}\right)^{a+b}. \qquad (20\text{-}44)$$

Equating (20-43) and (20-44) and simplifying, we get

$$P_a = \frac{(q/p)^{a+b} - (q/p)^{a}}{(q/p)^{a+b} - 1}, \qquad p \ne q, \qquad (20\text{-}45)$$

which agrees with (3-47), Text. Eq. (20-45) can be used to derive other useful probabilities and advantageous plays as well [see Examples 3-16 and 3-17, Text]. Whatever the advantage, it is worth quoting the master Gerolamo Cardano (1501-1576) on this: "The greatest advantage in gambling comes from not playing at all."
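
As a closing illustration, (20-45) can be confirmed against a direct simulation of the game; a minimal sketch (function names and parameter values are illustrative):

```python
# The ruin probability (20-45) versus a direct simulation of the game.
# Player A starts with a, player B with b; A wins each round with probability p.
import random

def ruin_formula(a, b, p):
    r = (1 - p) / p                        # r = q/p, assuming p != q
    return (r**(a + b) - r**a) / (r**(a + b) - 1)

def ruin_simulated(a, b, p, trials=100_000):
    ruins = 0
    for _ in range(trials):
        s = a
        while 0 < s < a + b:               # play until one player is wiped out
            s += 1 if random.random() < p else -1
        ruins += (s == 0)
    return ruins / trials

print(ruin_formula(5, 5, 0.48), ruin_simulated(5, 5, 0.48))   # both ~0.60
```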