Stochastic Processes Dr. Nur Aini Masruroh

Stochastic process X(t) is the state of the process (the measurable characteristic of interest) at time t. The state space of a stochastic process is defined as the set of all possible values that the random variables X(t) can assume. When the index set T is countable, the stochastic process is a discrete-time process, denoted by {Xn, n=0, 1, 2, …}. When T is an interval of the real line, the stochastic process is a continuous-time process, denoted by {X(t), t≥0}.

Stochastic process Hence, a stochastic process is a family of random variables that describes the evolution through time of some (physical) process. Usually, the random variables X(t) are dependent, and hence the analysis of stochastic processes is very difficult. The discrete-time Markov chain (DTMC) is a special type of stochastic process with a very simple dependence structure among the X(t), and it yields nice results in the analysis of {X(t), t ∈ T} under very mild assumptions.

Examples of stochastic processes
A stochastic process {X(t), t ∈ T} is a time-indexed collection of random variables; refer to X(t) as the state of the process at time t.
X(t) might equal the total number of customers that have entered a supermarket by time t.
X(t) might equal the number of customers in the supermarket at time t.
X(t) might equal the stock price of a company at time t.

Counting process  Definition:  A stochastic process {N(t), t≥0} is a counting process if N(t) represents the total number of “events” that have occurred up to time t

Counting process  Examples:  If N(t) equal the number of persons who have entered a particular store at or prior to time t, then {N(t), t≥0} is a counting process in which an event corresponds to a person entering the store If N(t) equal the number of persons in the store at time t, then {N(t), t≥0} would not be a counting process. Why?  If N(t) equals the total number of people born by time t, then {N(t), t≥0} is a counting process in which an event corresponds to a child is born  If N(t) equals the number of goals that Ronaldo has scored by time t, then {N(t), t≥0} is a counting process in which an event occurs whenever he scores a goal

Counting process  A counting process N(t) must satisfy  N(t)≥0  N(t) is integer valued  If s ≤t, then N(s) ≤ N(t)  For s<t, N(t)-N(s) equals the number of events that have occurred in the interval (s,t), or the increments of the counting process in (s,t)  A counting process has  Independent increments if the number of events which occur in disjoint time intervals are independent  Stationary increments if the distribution of the number of events which occur in any interval of time depends only on the length of the time interval

Independent increment  This property says that numbers of events in disjoint intervals are independent random variables.  Suppose that t 1 < t 2 ≤ t 3 < t 4. Then N(t 2 )-N(t 1 ), the number of events occurring in (t 1,t 2 ], is independent of N(t 4 )-N(t 3 ), the number of events occurring in (t 3, t 4 ].

Example: dependent increments
Suppose N(t) is the number of babies born by year t.
If N(t) is very large, then it is probable that there are many people alive at time t; this would then lead us to believe that the number of new births between times t and t+s would also tend to be large.
Hence {N(t), t≥0} does not have independent increments.

Stationary increment  This property states that the distribution of the number of events which occur in any interval of time depends only on the length of the time interval  Suppose t 1 0. Then  N(t 2 +s)-N(t 1 +s), number of events in the interval (t 1 +s, t 2 +s), has the same distribution as  N(t 2 )-N(t 1 ), the number of events in interval (t 1, t 2 ).

Example: non-stationary increments
The number of customers who have entered the university canteen by time t obviously does not have stationary increments: the arrival rate is higher during lunchtime.

Poisson process (definition 1)
The counting process {N(t), t≥0} is a Poisson process with rate λ, λ>0, if
1. N(0) = 0
2. The process has independent increments
3. The number of events in any interval of length t is Poisson distributed with mean λt. That is, for all s, t ≥ 0,
P{N(t+s) − N(s) = n} = e^(−λt) (λt)^n / n!, n = 0, 1, 2, …
From condition 3, the Poisson process has stationary increments and E[N(t)] = λt.
From definition 1, a Poisson process with rate λ means that for any t>0, the number of events in (0, t] follows a Poisson distribution with mean λt.

Example  The arrival of customers at a café is a Poisson process with rate 4 per minute. Find the probability that there is no less than 5 arrivals a) between time (0,2] (the first two minutes) b) between time (4,8].

Identifying a Poisson process
To determine whether an arbitrary counting process is a Poisson process, we need to show that conditions 1, 2, and 3 are satisfied.
Condition 1 states only that the counting of events begins at time 0.
Condition 2 can usually be verified directly from our knowledge of the process.
Condition 3 is hard to verify.

Inter-arrival time distribution  For a Poisson process, denote T 1 as the time of the first event, and T n as the time between the (n-1)st and nth event, for n>1. The sequence {T n, n=1, 2, …} is called the sequence of interarrival times. What is the distribution of T n ?

Inter-arrival time distribution  Note that the event {T 1 >t} takes place if and only if no events of the Poisson process occurs in the time interval [0,t], {T 1 >t} ↔ {N(t)=0} Thus P(T 1 >t) = P(N(t)=0)=e -λt Hence T 1 has exponential distribution with mean 1/λ

Inter-arrival time distribution  To obtain distribution of T 2, condition on T 1 P{T 2 >t | T 1 =s} = P{0 events in (s,s+t] | T 1 =s} = P{0 events in (s, s+t]} (indept increments) = e -λt (stationary increments) Repeating the same argument yields the following Proposition: T n, n=1, 2, … are independent identically distributed exponential random variables having mean 1/λ

Inter-arrival times
Remark 1: This proposition is actually quite intuitive.
The assumption of stationary and independent increments is basically equivalent to saying that, at any point in time, the process probabilistically restarts itself.
That is, the process from any point on is independent of all that has previously occurred (by independent increments), and also has the same distribution as the original process (by stationary increments).
In other words, the process has no memory, and hence exponential interarrival times are to be expected.

Inter-arrival times
Remark 2: Another way to obtain the distribution of T2, and so on, is to make use of the independent and stationary increments of the Poisson process. Because the process probabilistically restarts itself at any point in time, in the time-frame view the time origin can be ‘pushed’ forward, and T2, T3, … can be viewed in the same way as T1.

Inter-arrival times
Remark 3: The proposition gives us another way of defining a Poisson process.
Suppose we have a sequence {Tn, n≥1} of independent, identically distributed exponential random variables, each having mean 1/λ.
Then, by defining a counting process in which the nth event occurs at time Sn = T1 + T2 + T3 + … + Tn, the resulting counting process {N(t), t≥0} will be Poisson with rate λ.
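A small simulation sketch of this construction (the rate, horizon, and number of runs below are illustrative choices, not from the slides): events are generated by summing exponential interarrival times, and the resulting counts behave like a Poisson process.

```python
import random

def simulate_poisson_count(lam, t):
    """Count events in (0, t] by summing i.i.d. Exponential(mean 1/lam) interarrival times."""
    time, count = 0.0, 0
    while True:
        time += random.expovariate(lam)  # next interarrival time Tn
        if time > t:
            return count
        count += 1

lam, t, runs = 4.0, 2.0, 100_000
counts = [simulate_poisson_count(lam, t) for _ in range(runs)]
print(sum(counts) / runs)                   # close to E[N(t)] = lam * t = 8
print(sum(c >= 5 for c in counts) / runs)   # compare with the café example above
```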

Waiting time distribution  Another quantity of interest is S n, the arrival time of the nth event (also called the waiting time until the nth event)

Example  Suppose that people immigrate into a territory at a Poisson rate λ=1 per day. a) What is the expected time until the tenth immigrant arrives? b) What is the probability that the elapsed time between the tenth and the eleventh arrival exceeds two days?

Solution  Since the arrival time of the nth event is Gamma distributed with parameters n, λ, a) The expected time until 10th arrival is, E[S 10 ] = 10/λ = 10 days b) From before, we know that T 11 is exponential and independent of T 10, so P{T 11 >2} = e -2λ = e -2 ≈.133

Further properties of the Poisson process
Property 1a:
Consider a Poisson process {N(t), t≥0} having rate λ.
Each time an event occurs, it is classified as either a Type I or a Type II event with probability p and 1−p respectively, independently of all other events.
Let N1(t) and N2(t) denote respectively the number of Type I and Type II events occurring in [0, t].
Then {N1(t), t≥0} and {N2(t), t≥0} are Poisson processes having respective rates λp and λ(1−p).
The two processes are also independent.

Example  Customers arrive at a Starbucks at a Poisson rate of λ=10 per hour.  Suppose that each customer is a man with probability ½, and a woman with probability ½.  Suppose you observe 100 men arrive in the first 5 hours, how many women would we expect to have arrived in that 5 hours?

Is it right??  Because the number of male arrivals is 100, and because each arrival is male with probability ½, then the expected number of total arrivals should be 200  hence, the expected number of female arrivals in first 5 hours is 100.  Is this reasoning correct?

Absolutely not!!  We have just shown by the previous property that the two Poisson processes {Nmale(t), t≥0} & {Nfemale(t), t≥0} are independent!  So, the expected number of female arrivals in the first five hours is independent of the number of male arrivals in that period  E[Nfemale(5)] = λt(1-p) = (10)(5)(1/2) = 25

Further properties of the Poisson process
Property 1b:
Let {N(t), t≥0} be a Poisson process with rate λ.
Suppose that each event is of type i with probability pi, for i = 1 to M, where p1 + p2 + … + pM = 1.
Then the type i events {Ni(t), t≥0} form a Poisson process with rate piλ; moreover, {Ni(t), t≥0} and {Nj(t), t≥0} (i ≠ j) are independent of each other.

Further properties of the Poisson process
Property 2:
Let {N1(t), t≥0} and {N2(t), t≥0} be independent Poisson processes with rates λ1 and λ2 respectively.
Define N(t) = N1(t) + N2(t) for all t.
Then {N(t), t≥0} is a Poisson process with rate λ = λ1 + λ2.
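A brief simulation sketch of this superposition property (the rates, horizon, and sample size are illustrative assumptions): merging the counts of two independent Poisson processes gives a count whose mean matches (λ1 + λ2)t.

```python
import random

def poisson_count(lam, t):
    """Number of events of a rate-lam Poisson process in (0, t]."""
    time, n = 0.0, 0
    while True:
        time += random.expovariate(lam)
        if time > t:
            return n
        n += 1

lam1, lam2, t, runs = 2.0, 3.0, 1.0, 100_000
merged = [poisson_count(lam1, t) + poisson_count(lam2, t) for _ in range(runs)]
print(sum(merged) / runs)  # close to (lam1 + lam2) * t = 5
```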

Summary  Definition of Poisson Process  Definitions 1 & 2 (equivalent)  Counting process: Independent increments, stationary increments, Poisson distributed  Characteristics  Interarrival times Exponential  Waiting times Sum of Exponential  Gamma  Keys to obtaining these results and solving problems are (i) observing time frame & counting frame views (ii) identifying equivalent events (iii) stationary & independent increments (iv) properties of distributions

Summary  Properties  If a PP {N(t), t≥0} has rate λ, If each event is type I or type II with probability p & (1-p), then {N 1 (t), t≥0} and {N 2 (t), t≥0} are independent PP having respective rates λp and λ(1-p). If each event is type i with probability p i, i=1 to M, and then the type i events {N i (t), t≥0} form independent PPs with rate p i λ  If {N 1 (t), t≥0} and {N 2 (t), t≥0} are independent PPs with rate λ 1 and λ 2, and N(t) = N 1 (t)+N 2 (t) for all t, then {N(t), t≥0} is a PP with rate λ=λ 1 +λ 2.