Generalized Semi-Markov Processes (GSMP)


Summary
- Some Definitions
- Markov and Semi-Markov Processes
- The Poisson Process
- Properties of the Poisson Process
  - Interarrival times
  - Memoryless property and the residual lifetime paradox
- Superposition of Poisson processes

Random Process
Let (Ω, F, P) be a probability space. A stochastic (or random) process {X(t)} is a collection of random variables defined on (Ω, F, P), indexed by t ∈ T (where t is usually time). X(t) is the state of the process at time t.
Continuous-time and discrete-time stochastic processes: If the set T is finite or countable, then {X(t)} is called a discrete-time process. In this case t ∈ {0, 1, 2, …}, and we may refer to the process as a stochastic sequence, also written {Xk}, k = 0, 1, 2, …. Otherwise, the process is called a continuous-time process.
Continuous-state and discrete-state stochastic processes: If {X(t)} takes values in a countable set, then the process is discrete-state, also referred to as a chain. Otherwise, the process is continuous-state.

Classification of Random Processes
The joint cdf of the random variables X(t0), …, X(tn) is
F(x0, …, xn; t0, …, tn) = Pr{X(t0) ≤ x0, …, X(tn) ≤ xn}.
Independent process: Let X1, …, Xn be a sequence of independent random variables; then the joint cdf factors into the product of the marginal cdfs,
F(x1, …, xn; t1, …, tn) = F(x1; t1) ⋯ F(xn; tn).
Stationary process (strict-sense stationarity): The sequence {Xn} is stationary if and only if, for any τ ∈ R,
F(x1, …, xn; t1 + τ, …, tn + τ) = F(x1, …, xn; t1, …, tn).

Classification of Random Processes
Wide-sense stationarity: Let C be a constant and g(τ) a function of τ but not of t. A process is wide-sense stationary if and only if
E[X(t)] = C and E[X(t) X(t + τ)] = g(τ).
Markov process: The future of the process does not depend on its past, only on its present:
Pr{X(t_{k+1}) ≤ x_{k+1} | X(tk) = xk, …, X(t0) = x0} = Pr{X(t_{k+1}) ≤ x_{k+1} | X(tk) = xk}.
This is also referred to as the Markov property.

Markov and Semi-Markov Property
The Markov property requires that the process has no memory of the past. This memoryless property has two aspects:
- All past state information is irrelevant in determining the future (no state memory).
- How long the process has been in the current state is also irrelevant (no state age memory).
The latter implies that the lifetimes between successive events (interevent times) must also have the memoryless property, i.e., be exponentially distributed.
Semi-Markov processes: For this class of processes, the requirement that the state age is irrelevant is relaxed; therefore, the interevent times are no longer required to be exponentially distributed.

Example
Consider a process with Pr{X0 = 0} = Pr{X0 = 1} = 0.5 and Pr{X1 = 0} = Pr{X1 = 1} = 0.5, in which the next value depends on the two most recent values. Is this a Markov process? No. Is it possible to make the process Markov? Define Yk = X_{k-1} and form the vector Zk = [Xk, Yk]^T; then {Zk} is a Markov chain, since Z_{k+1} depends only on Zk.
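The augmentation trick can be sketched in code. The specific second-order dynamic below is an assumed example (the slides' actual transition rule is not reproduced here); the point is only that stacking the last two values restores the Markov property.

```python
import random

# Assumed example dynamics (not from the slides): X_{k+1} depends on both
# X_k and X_{k-1}, so {X_k} alone is not Markov. The augmented vector
# Z_k = (X_k, Y_k) with Y_k = X_{k-1} carries everything the future needs.
def step(x_k, y_k, rng):
    base = x_k ^ y_k                      # depends on the last TWO values
    return base ^ 1 if rng.random() < 0.1 else base

rng = random.Random(1)
x0 = rng.randint(0, 1)                    # Pr{X0=0} = Pr{X0=1} = 0.5
x1 = rng.randint(0, 1)                    # Pr{X1=0} = Pr{X1=1} = 0.5
z = (x1, x0)                              # Z_1 = [X_1, Y_1]^T with Y_1 = X_0
for k in range(10):
    x_next = step(z[0], z[1], rng)        # next value is a function of Z_k only,
    z = (x_next, z[0])                    # so {Z_k} is Markov
print(z)
```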

Renewal Process
A renewal process is a chain {N(t)} with state space {0, 1, 2, …} whose purpose is to count state transitions. The time intervals between state transitions are assumed iid from an arbitrary distribution. Therefore, for any 0 ≤ t1 ≤ … ≤ tk ≤ …, the counts satisfy N(t1) ≤ N(t2) ≤ … ≤ N(tk).
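As a sketch, a renewal counting process can be simulated by accumulating iid interevent times; the uniform distribution below is an assumed example chosen precisely because it is not exponential. By the elementary renewal theorem, N(t)/t approaches 1/E[V].

```python
import random

def renewal_count(t, draw, rng):
    """N(t): number of renewals in [0, t) with iid interevent times from draw."""
    n, clock = 0, draw(rng)
    while clock < t:
        n += 1
        clock += draw(rng)
    return n

rng = random.Random(42)
uniform_draw = lambda r: r.uniform(0.5, 1.5)   # arbitrary (non-exponential), E[V] = 1
counts = [renewal_count(100.0, uniform_draw, rng) for _ in range(200)]
avg = sum(counts) / len(counts)
print(avg)   # should be near t / E[V] = 100
```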

Generalized Semi-Markov Processes (GSMP)
A GSMP is a stochastic process {X(t)} with state space X generated by a stochastic timed automaton, where:
- X is the countable state space,
- E is the countable event set,
- Γ(x) is the feasible event set at state x,
- f(x, e) is the state transition function,
- p0 is the probability mass function of the initial state, and
- G is a vector with the cdfs of the lifetimes of all events.
The semi-Markov property of a GSMP is due to the fact that, at a state transition instant, the next state is determined by the current state and the event that just occurred.
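A minimal sketch of such a stochastic timed automaton, instantiated for an assumed single-server queue (events 'a' = arrival, 'd' = departure). Exponential lifetimes are used here only for simplicity; any cdfs in G would do, which is exactly what distinguishes a GSMP from a continuous-time Markov chain.

```python
import random

# Sketch of a GSMP: state x = number of customers; the triggering event is
# the feasible event with the smallest residual clock.
def gamma(x):                    # Γ(x): feasible event set at state x
    return {'a'} | ({'d'} if x > 0 else set())

def f(x, e):                     # f(x, e): state transition function
    return x + 1 if e == 'a' else x - 1

def simulate(t_end, lam=1.0, mu=2.0, seed=0):
    rng = random.Random(seed)
    G = {'a': lambda: rng.expovariate(lam),   # lifetime samplers per event
         'd': lambda: rng.expovariate(mu)}
    x, t = 0, 0.0                             # initial state from p0 (here x = 0)
    clocks = {e: G[e]() for e in gamma(x)}    # residual lifetimes of feasible events
    path = [(t, x)]
    while t < t_end:
        e_star = min(clocks, key=clocks.get)  # triggering event
        dt = clocks.pop(e_star)
        t += dt
        clocks = {e: c - dt for e, c in clocks.items()}   # age surviving clocks
        x = f(x, e_star)
        # newly feasible events get fresh lifetimes; surviving ones keep aged clocks
        clocks = {e: clocks[e] if e in clocks else G[e]() for e in gamma(x)}
        path.append((t, x))
    return path

path = simulate(50.0)
print(len(path), max(x for _, x in path))
```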

The Poisson Counting Process
Let {N(t)} be the process that counts the number of events that have occurred in the interval [0, t). For any 0 ≤ t1 ≤ … ≤ tk ≤ …, write N(t_{k-1}, tk) = N(tk) − N(t_{k-1}) for the number of events in the interval (t_{k-1}, tk].
Process with independent increments: the random variables N(t1), N(t1, t2), …, N(t_{k-1}, tk), … are mutually independent.
Process with stationary independent increments: the distribution of N(t_{k-1}, tk) does not depend on t_{k-1} and tk separately, but only on the difference tk − t_{k-1}.
[Figure: a sample path of the counting process N(t).]

The Poisson Counting Process
Assumptions:
- At most one event can occur at any time instant (no two or more events can occur at the same time).
- The process has stationary independent increments.
Given that a process satisfies the above assumptions, find Pn(t) = Pr{N(t) = n}.

The Poisson Process
Step 1: Determine P0(t) = Pr{N(t) = 0}. Starting from the stationary independent increments property,
P0(t + s) = Pr{N(t) = 0, N(t, t + s) = 0} = P0(t) P0(s).
Lemma: Let g(t) be a differentiable function for all t ≥ 0 such that g(0) = 1 and g(t) ≤ 1 for all t > 0. Then, if g(t + s) = g(t) g(s) for any t, s ≥ 0, it follows that g(t) = e^(-λt) for some λ > 0.

The Poisson Process
Therefore, P0(t) = e^(-λt).
Step 2: Determine P0(Δt) for a small Δt:
P0(Δt) = e^(-λΔt) = 1 − λΔt + o(Δt).
Step 3: Determine Pn(Δt) for a small Δt. For n = 2, 3, …,
Pn(Δt) = o(Δt),
since by assumption no two events can occur at the same time. As a result, for n = 1,
P1(Δt) = 1 − P0(Δt) − o(Δt) = λΔt + o(Δt).

The Poisson Process
Step 4: Determine Pn(t + Δt) for any n. Conditioning on the number of events in [0, t),
Pn(t + Δt) = Pn(t) P0(Δt) + P_{n-1}(t) P1(Δt) + o(Δt) = Pn(t)(1 − λΔt) + P_{n-1}(t) λΔt + o(Δt).
Moving terms between sides and dividing by Δt,
(Pn(t + Δt) − Pn(t)) / Δt = −λPn(t) + λP_{n-1}(t) + o(Δt)/Δt.
Taking the limit as Δt → 0,
dPn(t)/dt = −λPn(t) + λP_{n-1}(t).


The Poisson Process
Step 5: Solve the differential equation to obtain
Pn(t) = e^(-λt) (λt)^n / n!, n = 0, 1, 2, …
This expression is known as the Poisson distribution, and it fully characterizes the stochastic process {N(t)} in [0, t) under the assumptions that
- no two events can occur at exactly the same time, and
- the process has stationary independent increments.
You should verify that E[N(t)] = λt and Var[N(t)] = λt. The parameter λ has the interpretation of the "rate" at which events arrive.
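A quick numerical check of this distribution (a sketch; event times are generated by summing exponential gaps, anticipating the interevent-time property derived on the following slides):

```python
import random, math

def poisson_count(t, lam, rng):
    """N(t) for a rate-lam Poisson process, built from exponential gaps."""
    n, clock = 0, rng.expovariate(lam)
    while clock < t:
        n += 1
        clock += rng.expovariate(lam)
    return n

rng = random.Random(7)
lam, t, runs = 2.0, 5.0, 20000
counts = [poisson_count(t, lam, rng) for _ in range(runs)]
mean = sum(counts) / runs                    # should be near lam * t = 10
p3 = counts.count(3) / runs                  # empirical Pr{N(t) = 3}
exact = math.exp(-lam * t) * (lam * t) ** 3 / math.factorial(3)
print(mean, p3, exact)
```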

Properties of the Poisson Process: Interevent Times
Let t_{k-1} be the time when the (k−1)th event has occurred, and let Vk denote the (random) interevent time between the (k−1)th and kth events. What is the cdf of Vk, Gk(t)? By the stationary independent increments property,
Gk(t) = Pr{Vk ≤ t} = 1 − Pr{Vk > t} = 1 − Pr{N(t_{k-1}, t_{k-1} + t) = 0} = 1 − e^(-λt),
the exponential distribution.

Properties of the Poisson Process: Exponential Interevent Times
The process {Vk}, k = 1, 2, …, that corresponds to the interevent times of a Poisson process is an iid stochastic sequence with cdf
G(t) = Pr{Vk ≤ t} = 1 − e^(-λt).
Therefore, the Poisson process is also a renewal process. The corresponding pdf is
g(t) = λe^(-λt).
One can easily show that E[Vk] = 1/λ and Var[Vk] = 1/λ².
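These two moments are easy to check numerically (a sketch):

```python
import random

# Sketch: sample exponential interevent times and compare the empirical
# mean and variance with 1/lam and 1/lam^2.
rng = random.Random(3)
lam, n = 2.0, 100000
v = [rng.expovariate(lam) for _ in range(n)]
mean = sum(v) / n
var = sum((x - mean) ** 2 for x in v) / n
print(mean, var)   # should be near 1/lam = 0.5 and 1/lam^2 = 0.25
```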

Properties of the Poisson Process: Memoryless Property
Let tk be the time when the previous event occurred, and let V denote the time until the next event. Assuming that we have already been at the current state for z time units, let Y = V − z be the remaining time until the next event. What is the cdf of Y?
Pr{Y ≤ t} = Pr{V ≤ z + t | V > z} = (Pr{V ≤ z + t} − Pr{V ≤ z}) / Pr{V > z} = (e^(-λz) − e^(-λ(z+t))) / e^(-λz) = 1 − e^(-λt).
Memoryless! It does not matter that we have already spent z time units at the current state.

Memoryless Property
This is a unique property of the exponential distribution: if a process has the memoryless property, then its interevent times must be exponential. In other words, the following are equivalent:
Poisson process with rate λ ⇔ exponential interevent times with G(t) = 1 − e^(-λt) ⇔ memoryless property.
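A numerical sketch of the memoryless property: conditioning an exponential lifetime on having survived z time units leaves a residual life with the same exponential distribution.

```python
import random, math

# Sketch: if V ~ exp(lam), then Y = V - z given V > z is again exp(lam).
rng = random.Random(11)
lam, z, n = 1.5, 0.7, 200000
samples = [rng.expovariate(lam) for _ in range(n)]
residuals = [v - z for v in samples if v > z]          # Y = V - z given V > z
frac = sum(1 for y in residuals if y <= 1.0) / len(residuals)
target = 1 - math.exp(-lam * 1.0)                      # Pr{Y <= 1} if memoryless
print(frac, target)
```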

Superposition of Poisson Processes
Consider a DES with m events, each modeled as a Poisson process with rate λi, i = 1, …, m. What is the resulting process? Suppose at time tk we observe event 1. Let Y1 be the time until the next event 1; its cdf is G1(t) = 1 − e^(-λ1 t). Let Y2, …, Ym denote the residual times until the next occurrence of the corresponding events, i.e., Yj = Vj − zj, where zj is the age of event j's clock. By the memoryless property, their cdfs are
Gj(t) = 1 − e^(-λj t), j = 2, …, m.
Let Y* = min{Y1, …, Ym} be the time until the next event of any type. Therefore, we need to find Pr{Y* ≤ t}.

Superposition of Poisson Processes
By independence,
Pr{Y* > t} = Pr{Y1 > t, …, Ym > t} = e^(-λ1 t) ⋯ e^(-λm t) = e^(-Λt), where Λ = λ1 + … + λm.
The superposition of m Poisson processes is also a Poisson process, with rate equal to the sum of the rates of the individual processes.
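A simulation sketch of the superposition result (the three rates below are assumed for illustration): merging independent Poisson streams and measuring the mean gap between consecutive merged events should give 1/Λ.

```python
import random

# Sketch: merge m independent Poisson streams; the merged stream should be
# Poisson with rate Lambda equal to the sum of the individual rates.
rng = random.Random(5)
rates = [0.5, 1.0, 1.5]                      # assumed example rates, Lambda = 3
t_end = 2000.0
events = []
for lam in rates:
    t = rng.expovariate(lam)
    while t < t_end:
        events.append(t)
        t += rng.expovariate(lam)
events.sort()
gaps = [b - a for a, b in zip(events, events[1:])]
mean_gap = sum(gaps) / len(gaps)
print(mean_gap)   # should be near 1/Lambda = 1/3
```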

Superposition of Poisson Processes
Suppose that at time tk an event has occurred. What is the probability that the next event to occur is event j? Without loss of generality, let j = 1 and define Y' = min{Yi : i = 2, …, m}, which is exponential with rate λ2 + … + λm. Then
Pr{Y1 < Y'} = ∫0∞ Pr{Y' > t} λ1 e^(-λ1 t) dt = ∫0∞ e^(-(λ2+…+λm)t) λ1 e^(-λ1 t) dt = λ1 / (λ1 + … + λm).
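The race between competing exponential clocks can be checked directly (a sketch with assumed rates):

```python
import random

# Sketch: among independent exponential clocks with rates lam_i, clock 1
# fires first with probability lam_1 / (lam_1 + ... + lam_m).
rng = random.Random(9)
rates = [1.0, 2.0, 3.0]                   # assumed example rates
n, wins = 100000, 0
for _ in range(n):
    samples = [rng.expovariate(lam) for lam in rates]
    if samples.index(min(samples)) == 0:  # event 1 fired first
        wins += 1
frac = wins / n
print(frac)   # should approach 1.0 / (1.0 + 2.0 + 3.0) = 1/6
```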

Residual Lifetime Paradox
Suppose that buses pass by the bus station according to a Poisson process with rate λ. A passenger arrives at the bus station at some random point p between consecutive buses bk and b_{k+1}; let V = b_{k+1} − bk be that interarrival interval, Z = p − bk the elapsed time, and Y = b_{k+1} − p the waiting time. How long does the passenger have to wait?
Solution 1: E[V] = 1/λ. Therefore, since the passenger will (on average) arrive in the middle of the interval, he has to wait for E[Y] = E[V]/2 = 1/(2λ). But by the memoryless property, the time until the next bus is exponentially distributed with rate λ, therefore E[Y] = 1/λ, not 1/(2λ)!
Solution 2: By the memoryless property, the time until the next bus is exponentially distributed with rate λ, therefore E[Y] = 1/λ. But note that E[Z] = 1/λ as well, therefore E[V] = E[Z] + E[Y] = 2/λ, not 1/λ!
The paradox is resolved by noting that the interval containing the passenger's arrival is not a typical interarrival interval: longer intervals are more likely to contain a randomly chosen point (length biasing), so for that covering interval E[V] = 2/λ.
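The length-biasing resolution can be seen in simulation (a sketch): sampling the bus interval that covers a random arrival instant yields mean 2/λ, while the residual wait Y still averages 1/λ.

```python
import random

# Sketch of length biasing: the interarrival interval that CONTAINS a
# randomly chosen instant has mean 2/lam, yet the residual wait Y has mean 1/lam.
rng = random.Random(13)
lam, t_end, trials = 1.0, 1000.0, 500
covering, residual = [], []
for _ in range(trials):
    arrivals, t = [], rng.expovariate(lam)
    while t < t_end:
        arrivals.append(t)
        t += rng.expovariate(lam)
    p = rng.uniform(t_end * 0.1, t_end * 0.9)   # passenger arrival instant
    nxt = min(a for a in arrivals if a > p)     # b_{k+1}
    prev = max(a for a in arrivals if a <= p)   # b_k
    covering.append(nxt - prev)                 # V for the covering interval
    residual.append(nxt - p)                    # Y, the waiting time
mean_v = sum(covering) / trials
mean_y = sum(residual) / trials
print(mean_v, mean_y)   # near 2/lam and 1/lam respectively
```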