Probability and Statistics with Reliability, Queuing and Computer Science Applications: Chapter 6 on Stochastic Processes Kishor S. Trivedi Visiting Professor Dept. of Computer Science and Engineering Indian Institute of Technology, Kanpur

What is a Stochastic Process? A stochastic process is a family of random variables {X(t) | t ∈ T}, where T is an index set that may be discrete or continuous. The values assumed by X(t) are called states, and the set of all possible states is the state space I. A stochastic process is sometimes called a random process or a chance process. Recall that if X and Y are mutually independent, then p(y|x) = p(y).
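As an illustration (not from the slides), the following minimal Python sketch treats a discrete-time random walk as a stochastic process: fixing the index n turns X(n) into an ordinary random variable whose distribution can be estimated across sample paths. The path length and number of paths are arbitrary choices.

```python
import numpy as np

# Hypothetical illustration: a discrete-time random walk {X(n) | n = 0, 1, ..., N}
# is a stochastic process with a discrete index set T and an integer state space I.
rng = np.random.default_rng(0)

def random_walk_path(n_steps: int) -> np.ndarray:
    """One sample path: partial sums of iid +/-1 steps."""
    steps = rng.choice([-1, 1], size=n_steps)
    return np.concatenate(([0], np.cumsum(steps)))

paths = np.array([random_walk_path(100) for _ in range(5000)])
# Fixing the time index (here n = 50) gives an ordinary random variable X(50),
# whose moments are estimated across the sample paths.
print("E[X(50)] ~", paths[:, 50].mean(), " Var[X(50)] ~", paths[:, 50].var())
```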

Stochastic Process Characterization: At a fixed time t = t1 we have a random variable X(t1); similarly we have X(t2), …, X(tk). X(t1) can be characterized by its distribution function F(x1; t1) = P[X(t1) ≤ x1]. We can also consider the joint distribution function F(x1, …, xk; t1, …, tk) = P[X(t1) ≤ x1, …, X(tk) ≤ xk]. Discrete and continuous cases: the time parameter t may be discrete or continuous, and the state space I may be discrete or continuous.

Classification of Stochastic Processes: Combining a discrete or continuous state space with a discrete or continuous time parameter gives four classes of stochastic processes. A discrete-state process is called a chain; a discrete-time process is called a stochastic sequence {Xn | n ∈ T} (e.g., probing a system every 10 ms).

Example: a Queuing System. Interarrival times Y1, Y2, … are iid with a common distribution function FY; service times S1, S2, … are iid with a common CDF FS. The notation for such a queuing system is FY/FS/m, where m is the number of servers. Common interarrival/service-time distribution types are: M: memoryless (i.e., EXP); D: deterministic; Ek: k-stage Erlang; Hk: k-stage hyperexponential; G: general distribution; GI: general independent interarrival times. Thus M/M/1 denotes memoryless interarrival and service times with a single server.
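The following minimal sketch (an illustration, not part of the slides) simulates the M/M/1 queue named above using Lindley's recursion for the waiting time in queue; the rates lam and mu and the job count are assumed values chosen only for the demonstration.

```python
import numpy as np

# M/M/1 sketch: EXP interarrival and service times, single server.
# Lindley's recursion: W[k+1] = max(0, W[k] + S[k] - Y[k+1]).
rng = np.random.default_rng(1)
lam, mu, n_jobs = 0.5, 1.0, 100_000            # assumed illustrative values

Y = rng.exponential(1.0 / lam, n_jobs)         # interarrival times ("M")
S = rng.exponential(1.0 / mu, n_jobs)          # service times ("M")

W = np.zeros(n_jobs)
for k in range(n_jobs - 1):
    W[k + 1] = max(0.0, W[k] + S[k] - Y[k + 1])

rho = lam / mu
print("simulated mean wait in queue:", W.mean())
print("M/M/1 formula rho/(mu - lam):", rho / (mu - lam))
```

With lam = 0.5 and mu = 1.0 the simulated mean waiting time should be close to the formula value 1.0.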

Discrete/Continuous Stochastic Processes. Discrete time, discrete space: Nk is the number of jobs waiting in the system at the time of the kth job's departure. The stochastic process {Nk | k = 1, 2, …} is therefore a discrete-time, discrete-state process. (Figure: Nk versus k; both the state Nk and the index k are discrete.)

Continuous Time, Discrete Space: X(t) is the number of jobs in the system at time t. {X(t) | t ∈ T} forms a continuous-time, discrete-state stochastic process. (Figure: X(t) versus t; the state X(t) is discrete, the time t is continuous.)

Discrete Time, Continuous Space: Wk is the waiting time of the kth job. {Wk | k ∈ T} forms a discrete-time, continuous-state stochastic process. (Figure: Wk versus k; the state Wk is continuous, the index k is discrete.)

Continuous Time, Continuous Space: Y(t) is the total service time of all jobs in the system at time t. Y(t) forms a continuous-time, continuous-state stochastic process. (Figure: Y(t) versus t; both the state and the time are continuous.)

Further Classification: F(x; t) = P[X(t) ≤ x] is the first-order distribution and F(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2] is the second-order distribution; similarly we can define the nth-order distribution F(x1, …, xn; t1, …, tn) = P[X(t1) ≤ x1, …, X(tn) ≤ xn]. It is a formidable task to provide the nth-order distribution for all n.

Further Classification (contd.): Can the nth-order distribution be simplified? Yes, under some simplifying assumptions. Independence gives, for example, the renewal process: a discrete-time independent process {Xn | n = 1, 2, …} in which X1, X2, … are iid non-negative random variables, e.g., times between repair/replacement after a failure. A Markov process introduces a limited form of dependence: the stochastic process {X(t) | t ∈ T} is Markov if for any t0 < t1 < … < tn < t the conditional distribution satisfies the Markov property P[X(t) ≤ x | X(tn) = xn, …, X(t0) = x0] = P[X(t) ≤ x | X(tn) = xn]. Stationarity: when the pdf or CDF is invariant under time shifts, the process is said to be strictly stationary; if only the first moment satisfies this property (E[X(t)] = E[X], the ensemble average), the process is said to be stationary in the mean, and so on.

Markov Process: We will deal only with discrete-state Markov processes, i.e., Markov chains. In some situations a Markov chain may also exhibit time-homogeneity. The future of the process is (probabilistically) determined by its current state, independent of how it reached that state; in the non-homogeneous case the current time can also influence the future, whereas for a homogeneous Markov chain the current time is not needed to determine the future. Let Y be the time spent in a given state of a homogeneous CTMC.

Homogeneous CTMC, Sojourn Time: Since the future evolution depends only on the current state, the sojourn time Y has the memoryless property P[Y > s + t | Y > s] = P[Y > t]. This result says that for a homogeneous continuous-time Markov chain the sojourn time in a state follows an EXP(λ) distribution (this is not true for a non-homogeneous CTMC). The sojourn-time distribution of a homogeneous DTMC is geometric. A semi-Markov process is one in which the sojourn time in a state is generally distributed.
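A quick numerical check of the memoryless property behind the exponential sojourn time (an illustration only; the rate and the thresholds s and t are assumed values):

```python
import numpy as np

# Memorylessness: P[Y > s + t | Y > s] = P[Y > t] for Y ~ EXP(rate).
rng = np.random.default_rng(2)
rate, s, t = 2.0, 0.4, 0.7                  # assumed illustrative values
Y = rng.exponential(1.0 / rate, 1_000_000)

lhs = np.mean(Y[Y > s] > s + t)             # conditional survival past s + t, given Y > s
rhs = np.mean(Y > t)                        # unconditional survival past t
print(lhs, rhs, np.exp(-rate * t))          # all three should agree closely
```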

Bernoulli Process: A sequence of iid Bernoulli random variables {Yi | i = 1, 2, 3, …} with Yi = 1 or 0 forms a Bernoulli process, a discrete-time process and an example of a renewal process. Define another stochastic process {Sn | n = 1, 2, 3, …}, where Sn = Y1 + Y2 + … + Yn (the sequence of partial sums), or in recursive form Sn = Sn−1 + Yn. Then P[Sn = k | Sn−1 = k] = P[Yn = 0] = 1 − p and P[Sn = k | Sn−1 = k − 1] = P[Yn = 1] = p. {Sn | n = 1, 2, 3, …} forms a binomial process, an example of a homogeneous DTMC.
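A minimal sketch (not from the slides) of the Bernoulli process and its partial-sum binomial process; the success probability p, path length, and number of paths are assumed values.

```python
import numpy as np

# Bernoulli process {Y_i} and binomial process of partial sums S_n = Y_1 + ... + Y_n.
rng = np.random.default_rng(3)
p, n = 0.3, 50                                  # assumed illustrative values

Y = rng.binomial(1, p, size=(100_000, n))       # 100000 sample paths of {Y_i}
S = Y.cumsum(axis=1)                            # S_n along each path

# For fixed n, S_n should be Binomial(n, p): compare mean and variance.
print("E[S_n] ~", S[:, -1].mean(), "vs n*p =", n * p)
print("Var[S_n] ~", S[:, -1].var(), "vs n*p*(1-p) =", n * p * (1 - p))
```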

Renewal Counting Process: the number of renewals (repairs, replacements, arrivals) by time t, a continuous-time process. If the time interval between two renewals follows an EXP distribution, the result is a Poisson process.

Note: For a fixed t, N(t) is a random variable (in this case a discrete random variable, the Poisson random variable). The family {N(t), t ≥ 0} is a stochastic process, in this case the homogeneous Poisson process; {N(t), t ≥ 0} is a homogeneous CTMC as well.

Poisson Process: A continuous-time, discrete-state process. N(t) is the number of events occurring in time (0, t]. Events may be packets arriving at a router port, incoming telephone calls at a switch, jobs arriving at a file/compute server, or component failures. Events occur successively, and the intervals between successive events are iid random variables, each following EXP(λ). Here λ is the arrival rate (1/λ is the average time between arrivals) or the failure rate (1/λ is the average time between failures).

Poisson Process (contd.): N(t) forms a Poisson process provided N(0) = 0, events within non-overlapping intervals are independent, and in a very small interval h at most one event may occur (with probability approximately λh). Letting pn(t) = P[N(t) = n], we obtain pn(t) = e^(−λt)(λt)^n / n!, n = 0, 1, 2, …. For a Poisson process, interarrival times follow the EXP(λ) (memoryless) distribution. E[N(t)] = Var[N(t)] = λt. What about E[N(t)/t] as t → ∞? It equals λt/t = λ.
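A small simulation sketch (the rate lam, horizon t, and sample size are assumed values, not from the slides) that builds N(t) from EXP(lam) interarrival gaps and checks E[N(t)] = Var[N(t)] = λt:

```python
import numpy as np

# Poisson process built from iid EXP(lam) interarrival times; N(t) counts arrivals in (0, t].
rng = np.random.default_rng(4)
lam, t, n_paths = 2.0, 10.0, 20_000           # assumed illustrative values

def count_arrivals(lam: float, t: float) -> int:
    """Count renewals in (0, t] by summing EXP(lam) gaps until t is exceeded."""
    total, count = 0.0, 0
    while True:
        total += rng.exponential(1.0 / lam)
        if total > t:
            return count
        count += 1

N = np.array([count_arrivals(lam, t) for _ in range(n_paths)])
print("E[N(t)] ~", N.mean(), " Var[N(t)] ~", N.var(), " lam*t =", lam * t)
```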

Merged Multiple Poisson Process Streams: Consider a system in which k independent Poisson streams with rates λ1, λ2, …, λk are merged (superposed). The merged stream is again a Poisson process with rate λ = λ1 + λ2 + … + λk. Proof: using the z-transform of the Poisson pmf with α = λt, the product of the individual transforms is again the transform of a Poisson pmf, with parameter (λ1 + … + λk)t.
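A hedged numerical illustration of the merging result (the stream rates, horizon, and sample size are assumptions): the merged count over (0, t] is the sum of independent Poisson counts and should have mean and variance equal to (λ1 + … + λk)t.

```python
import numpy as np

# Superposition of k independent Poisson streams: merged counts in (0, t].
rng = np.random.default_rng(5)
rates, t, n_paths = [0.5, 1.2, 2.0], 5.0, 100_000   # assumed illustrative values

counts = sum(rng.poisson(lam * t, n_paths) for lam in rates)  # merged count per path
print("mean of merged counts:", counts.mean(), " expected:", sum(rates) * t)
print("variance of merged counts:", counts.var(), " expected:", sum(rates) * t)
```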

Decomposing a Poisson Stream: Decompose a Poisson process using a probabilistic switch that routes each arrival to substream i with probability pi (p1 + … + pk = 1). The N arrivals are decomposed into {N1, N2, …, Nk}, with N = N1 + N2 + … + Nk. The conditional pmf of (N1, …, Nk) given N = n is multinomial; since N is Poisson with parameter λt, the unconditional pmf makes each Ni Poisson with rate λ·pi, and the substreams are again Poisson processes.
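A sketch of the probabilistic switch (the rate, routing probabilities, and sample size are assumed values): given N = n arrivals, the substream counts are multinomial, and unconditionally each substream mean should match λ·pi·t.

```python
import numpy as np

# Decompose a Poisson(lam) stream over (0, t] with a probabilistic switch p.
rng = np.random.default_rng(6)
lam, t, p = 3.0, 8.0, [0.2, 0.5, 0.3]            # assumed illustrative values
n_paths = 50_000

sub_counts = np.zeros((n_paths, len(p)))
for j in range(n_paths):
    n = rng.poisson(lam * t)                     # total arrivals N in (0, t]
    sub_counts[j] = rng.multinomial(n, p)        # route each arrival independently

for i, pi in enumerate(p):
    print(f"substream {i}: mean {sub_counts[:, i].mean():.2f}, "
          f"expected lam*p*t = {lam * pi * t:.2f}")
```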

Generalizing the Poisson Process Non-Homogeneous Poisson Process (NHPP)

Non-Homogeneous Poisson Process (NHPP): If the expected number of events per unit time, λ, changes with age (time), we have a non-homogeneous Poisson model. We assume that: 1. For 0 ≤ t, the pmf of N(t) is given by P[N(t) = n] = e^(−m(t)) [m(t)]^n / n!, where m(t) ≥ 0 is the expected number of events in the time period [0, t]. 2. Counts of events in non-overlapping time periods are mutually independent. m(t) is the mean value function; λ(x) is the time-dependent rate of occurrence of events, or time-dependent failure rate.

NHPP (contd.): The mean value function and the rate function are related by m(t) = ∫_0^t λ(x) dx, so E[N(t)] = m(t).
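A simulation sketch of an NHPP using thinning of a homogeneous Poisson process (the rate function λ(x) below is purely an assumed example, and the bound lam_max must dominate it everywhere):

```python
import numpy as np

# NHPP by thinning: candidate events from a homogeneous Poisson process with rate
# lam_max are accepted with probability lam(x)/lam_max.
rng = np.random.default_rng(7)

def lam(x: float) -> float:
    return 1.0 + 0.5 * np.sin(x)                 # assumed time-dependent rate, lam(x) <= 1.5

def nhpp_events(t_end: float, lam_max: float = 1.5) -> list:
    events, x = [], 0.0
    while True:
        x += rng.exponential(1.0 / lam_max)      # candidate event time
        if x > t_end:
            return events
        if rng.random() < lam(x) / lam_max:      # accept with prob lam(x)/lam_max
            events.append(x)

# E[N(t)] should approximate m(t) = integral of lam(x) over (0, t].
t_end = 20.0
counts = [len(nhpp_events(t_end)) for _ in range(20_000)]
m_t = t_end + 0.5 * (1 - np.cos(t_end))          # closed-form integral of the assumed lam
print("simulated mean:", np.mean(counts), " m(t):", m_t)
```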

Generalizing Poisson Process Non-Homogeneous Poisson Process (NHPP) Renewal Counting Process

Renewal Counting Process: A Poisson process has EXP(λ)-distributed interarrival times. What if the EXP(λ) assumption is removed? We get a renewal process {Xi | i = 1, 2, …}, where the Xi are iid (not necessarily EXP) random variables and Xi is the time gap between the occurrence of the (i−1)st and the ith event. Sk = X1 + X2 + … + Xk is the time to occurrence of the kth event. The renewal counting process N(t) is a discrete-state, continuous-time stochastic process, where N(t) denotes the number of renewals in the interval (0, t].

Renewal Counting Process (contd.): For N(t), what is P[N(t) = n]? The event {N(t) = n} occurs when Sn ≤ t and Sn+1 > t (the nth renewal has occurred by time t but the (n+1)st has not; further arrivals are possible after t). Let Fn(t) denote the distribution function of Sn, so Fn+1(t) is the probability that the time taken for n renewals plus one more renewal does not exceed t. Then P[N(t) = n] = Fn(t) − Fn+1(t).

Renewal Counting Process Expectation: Let m(t) = E[N(t)]. Then m(t) is the mean number of arrivals in time (0, t]; it can also be written as m(t) = Σ_{n≥1} Fn(t). m(t) is called the renewal function.
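A sketch that estimates the renewal function m(t) by simulation for a non-exponential interarrival distribution (uniform gaps are an assumed example); by the elementary renewal theorem, m(t)/t → 1/E[X] for large t.

```python
import numpy as np

# Estimate m(t) = E[N(t)] when the gaps X_i are iid Uniform(0.5, 1.5), so E[X] = 1.
rng = np.random.default_rng(8)

def renewals_by(t: float) -> int:
    """Count renewals in (0, t] for uniform interarrival gaps."""
    total, count = 0.0, 0
    while True:
        total += rng.uniform(0.5, 1.5)
        if total > t:
            return count
        count += 1

t = 30.0
m_hat = np.mean([renewals_by(t) for _ in range(20_000)])
# Elementary renewal theorem: m(t)/t -> 1/E[X]; here E[X] = 1, so m(t) is roughly t.
print("estimated m(t):", m_hat, " t/E[X]:", t / 1.0)
```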

Renewal Density Function: The renewal density is d(t) = dm(t)/dt. For example, if the renewal interval X is EXP(λ), then d(t) = λ and m(t) = λt for t ≥ 0. In this case Sn has an n-stage Erlang distribution, so Fn(t) is the n-stage Erlang CDF and P[N(t) = n] = Fn(t) − Fn+1(t) = e^(−λt)(λt)^n / n!, i.e., the Poisson pmf.

Alternating Renewal Process: (Figure: the indicator I(t) equals 1 during operating periods and drops to 0 during restoration periods over time.) Failure times T1, T2, … are mutually independent with a common distribution function W; restoration times D1, D2, … are mutually independent with a common distribution function G; the sequences {Tn} and {Dn} are independent.

Availability Analysis: Availability is defined as the ability of a system to provide the desired service. If there is no repair/replacement, Availability(t) = Reliability(t); if repairs are possible, this is pessimistic. MTBF = E[Di + Ti+1] = E[Ti + Di] = E[Xi] = MTTF + MTTR. (Figure: a timeline of alternating up and down periods T1, D1, T2, D2, T3, D3, T4, D4, …, with MTBF spanning one complete up/down cycle.)

Availability Analysis (contd.): There are two mutually exclusive situations: (1) the system does not fail before time t, which contributes R(t) to A(t); (2) the system fails, but the repair (renewal) is completed at some time x before t and the system is then up at time t. Therefore A(t) is the sum of these two probabilities.

Availability Expression: dA(x) is the incremental availability, dA(x) = P(after a renewal at x the renewed lifetime exceeds t − x, and the renewal occurs in the interval (x, x + dx]) = R(t − x) dm(x), where m is the renewal function of the failure/repair cycles. Hence A(t) = R(t) + ∫_0^t R(t − x) dm(x).

Availability Expression (contd.): A(t) can also be expressed in the Laplace domain: L_A(s) = L_R(s) / (1 − L_w(s) L_g(s)), where w and g are the failure-time and repair-time densities. Since R(t) = 1 − W(t), L_R(s) = 1/s − L_W(s) = 1/s − L_w(s)/s. What happens when t becomes very large? R(t) → 0; however, A(t) approaches a nonzero steady-state value.

Availability, MTTF and MTTR: The steady-state availability is A = lim_{t→∞} A(t) = lim_{s→0} s·L_A(s). Taking the expression for s·L_A(s), evaluating the limit via L'Hospital's rule, and using the moment generating property of the Laplace transform, we get the required result for the steady state: A = MTTF/(MTTF + MTTR).

Availability Example: Assume exponential density functions, w(t) with failure rate λ and g(t) with repair rate μ. Then A(t) = μ/(λ + μ) + [λ/(λ + μ)] e^(−(λ+μ)t), and the steady-state availability is A = μ/(λ + μ) = MTTF/(MTTF + MTTR).
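A simulation sketch of this example (the rates λ and μ and the time grid are assumed values): the fraction of sample paths that are up at time t should track the closed-form A(t) above and settle at μ/(λ + μ).

```python
import numpy as np

# Alternating renewal process with EXP(lam) up times and EXP(mu) down times,
# starting in the up state; estimate instantaneous availability A(t).
rng = np.random.default_rng(9)
lam, mu = 0.1, 1.0                               # assumed failure rate, repair rate
t_grid = np.linspace(0.0, 30.0, 7)
n_paths = 20_000

def up_at(t: float) -> bool:
    """Simulate up/down cycles from time 0 until time t; return whether the system is up."""
    x, up = 0.0, True
    while True:
        x += rng.exponential(1.0 / lam if up else 1.0 / mu)
        if x > t:
            return up
        up = not up

for t in t_grid:
    a_sim = np.mean([up_at(t) for _ in range(n_paths)])
    a_exact = mu / (lam + mu) + lam / (lam + mu) * np.exp(-(lam + mu) * t)
    print(f"t={t:5.1f}  simulated A(t)={a_sim:.3f}  closed form={a_exact:.3f}")
```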

Generalizing the Poisson Process (taxonomy of related processes): Bernoulli process, Poisson process, compound Poisson process, non-homogeneous Poisson process (NHPP), renewal counting process, homogeneous discrete-time Markov chain, homogeneous continuous-time Markov chain, non-homogeneous continuous-time Markov chain, semi-Markov process, Markov regenerative process.