Intro. to Stochastic Processes


Intro. to Stochastic Processes
Cheng-Fu Chou

Outline
- Stochastic Process
- Counting Process
- Poisson Process
- Markov Process
- Renewal Process

Stochastic Process
A stochastic process N = {N(t), t ∈ T} is a collection of random variables; for each t in the index set T, N(t) is a random variable.
- t: time
- N(t): state at time t
If T is a countable set, N is a discrete-time stochastic process; if T is a continuum, N is a continuous-time stochastic process.

Counting Process
A stochastic process {N(t), t ≥ 0} is said to be a counting process if N(t) is the total number of events that occurred up to time t. Hence, a counting process satisfies:
- N(t) ≥ 0
- N(t) is integer valued
- If s < t, then N(s) ≤ N(t)
- For s < t, N(t) − N(s) equals the number of events occurring in the interval (s, t]

Counting Process
- Independent increments: the numbers of events that occur in disjoint time intervals are independent.
- Stationary increments: the distribution of the number of events that occur in any interval of time depends only on the length of the interval.

Poisson Process
Def. A: the counting process {N(t), t ≥ 0} is said to be a Poisson process with rate λ, λ > 0, if
- N(0) = 0;
- the process has independent increments;
- the number of events in any interval of length t is Poisson distributed with mean λt, that is, for all s, t ≥ 0,
  P{N(t+s) − N(s) = n} = e^{−λt} (λt)^n / n!,  n = 0, 1, ….
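Definition A can be checked empirically: build N(t) from i.i.d. exponential interarrival times and compare the empirical mean count with λt. A minimal sketch (the parameter values and function name are illustrative, not from the slides):

```python
import random

def poisson_process_count(rate, t, rng):
    """Count arrivals in [0, t] by summing i.i.d. exponential interarrival times."""
    arrivals, clock = 0, 0.0
    while True:
        clock += rng.expovariate(rate)  # next exponential interarrival time
        if clock > t:
            return arrivals
        arrivals += 1

rng = random.Random(42)
rate, t, trials = 2.0, 5.0, 20000
mean_count = sum(poisson_process_count(rate, t, rng) for _ in range(trials)) / trials
print(mean_count)  # should be close to lambda * t = 10
```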

Poisson Process
Def. B: the counting process {N(t), t ≥ 0} is said to be a Poisson process with rate λ, λ > 0, if
- N(0) = 0;
- the process has stationary and independent increments;
- P[N(h) = 1] = λh + o(h);
- P[N(h) ≥ 2] = o(h).
A function f is said to be o(h) if lim_{h→0} f(h)/h = 0.
Def. A ⇔ Def. B, i.e., they are equivalent. We show Def. B ⇒ Def. A; Def. A ⇒ Def. B is HW.

Important Properties
- Property 1: for any t ≥ 0, the mean number of events is E[N(t)] = λt.
- Property 2: the interarrival-time distribution of a Poisson process with rate λ is an exponential distribution with parameter λ.
- Property 3: the superposition of two independent Poisson processes with rates λ1 and λ2 is a Poisson process with rate λ1 + λ2.

Properties (cont.)
- Property 4: if we perform Bernoulli trials to make independent random erasures from a Poisson process, the remaining arrivals also form a Poisson process.
- Property 5: the time until the rth arrival, i.e., t_r, known as the rth-order waiting time, is the sum of r independent exponential interarrival times and is described by an Erlang pdf.
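Property 4 (Bernoulli thinning) is easy to test by simulation: erase each arrival of a rate-λ process independently with probability 1 − p and check that the mean kept count matches λpt. A sketch with my own illustrative parameters:

```python
import random

def thinned_count(rate, keep_p, t, rng):
    """Generate a rate-`rate` Poisson process on [0, t] and keep each
    arrival independently with probability keep_p (Bernoulli erasure)."""
    clock, kept = 0.0, 0
    while True:
        clock += rng.expovariate(rate)
        if clock > t:
            return kept
        if rng.random() < keep_p:
            kept += 1

rng = random.Random(7)
rate, keep_p, t, trials = 3.0, 0.5, 4.0, 20000
mean_kept = sum(thinned_count(rate, keep_p, t, rng) for _ in range(trials)) / trials
# Property 4 predicts a Poisson process of rate lambda * p, so mean ~ 3 * 0.5 * 4 = 6
print(mean_kept)
```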

Ex 1
Suppose that X1 and X2 are independent exponential random variables with respective means 1/λ1 and 1/λ2. What is P{X1 < X2}?
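Conditioning on X1 gives the standard answer:

```latex
P\{X_1 < X_2\}
  = \int_0^\infty P\{X_2 > x\}\,\lambda_1 e^{-\lambda_1 x}\,dx
  = \int_0^\infty e^{-\lambda_2 x}\,\lambda_1 e^{-\lambda_1 x}\,dx
  = \frac{\lambda_1}{\lambda_1 + \lambda_2}.
```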


Conditional Distribution of the Arrival Time
Suppose we are told that exactly one event of a Poisson process has taken place by time t. What is the distribution of the time at which the event occurred?
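Using independent increments, for 0 < s ≤ t the first arrival time T_1 satisfies

```latex
P\{T_1 < s \mid N(t) = 1\}
  = \frac{P\{1 \text{ event in } (0,s],\ 0 \text{ events in } (s,t]\}}{P\{N(t)=1\}}
  = \frac{\lambda s\, e^{-\lambda s}\, e^{-\lambda(t-s)}}{\lambda t\, e^{-\lambda t}}
  = \frac{s}{t},
```

so, given N(t) = 1, the arrival time is uniformly distributed on (0, t].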


Ex 2
Consider the failure of a link in a communication network. Failures occur according to a Poisson process with rate 4.8 per day. Find:
- P[time between failures ≥ 10 days]
- P[5 failures in 20 days]
- the expected time between 2 consecutive failures
- P[0 failures in the next day]
- Supposing 12 hours have elapsed since the last failure, the expected time to the next failure
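These quantities follow directly from the exponential interarrival law and the Poisson pmf; a worked sketch (helper name `poisson_pmf` is mine):

```python
import math

rate = 4.8  # failures per day

def poisson_pmf(k, mu):
    # P{N = k} for N ~ Poisson(mu)
    return math.exp(-mu) * mu**k / math.factorial(k)

p_gap_10 = math.exp(-rate * 10)        # P[time between failures >= 10 days] = e^{-48}
p_5_in_20 = poisson_pmf(5, rate * 20)  # N(20) ~ Poisson(96)
mean_gap = 1 / rate                    # expected time between failures, in days
p_none_next_day = math.exp(-rate)      # P[0 failures in the next day] = e^{-4.8}
residual = 1 / rate                    # memorylessness: the elapsed 12 hours are irrelevant

print(mean_gap, p_none_next_day)
```

The last line is the point of the exercise: by memorylessness, the expected time to the next failure is still 1/4.8 day regardless of how long ago the last failure occurred.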


Poisson, Markov, Renewal Processes
Poisson process: a counting process with i.i.d. exponential times between arrivals.
- Relax the counting-process structure (keep exponential times between transitions): continuous-time Markov chain.
- Relax the exponential interarrival times (keep a counting process with i.i.d. times between arrivals): renewal process.

Markov Process

Markov Process
P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, …, X(t_1) = x_1] = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n]
The probabilistic future of the process depends only on the current state, not on the history. We are mostly concerned with discrete-space Markov processes, commonly referred to as Markov chains:
- discrete-time Markov chains
- continuous-time Markov chains

DTMC
Discrete-Time Markov Chain (discrete time, discrete space):
P[X_{n+1} = j | X_n = k_n, X_{n-1} = k_{n-1}, …, X_0 = k_0] = P[X_{n+1} = j | X_n = k_n]
- A DTMC is finite-state if its state space is finite.
- A DTMC is homogeneous if P[X_{n+1} = j | X_n = i] does not depend on n for all i, j, i.e., P_ij = P[X_{n+1} = j | X_n = i], where P_ij is the one-step transition probability.

Definition
P = [P_ij] is the transition matrix; its entries satisfy P_ij ≥ 0 and Σ_j P_ij = 1 for every i. A matrix that satisfies those conditions is called a stochastic matrix. The n-step transition probability is P_ij^(n) = P[X_{m+n} = j | X_m = i].

Chapman-Kolmogorov Equation
Def.: P_ij^(m+n) = Σ_k P_ik^(m) P_kj^(n), i.e., P^(m+n) = P^(m) P^(n).
Proof: condition on the state at time m and apply the Markov property.
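The matrix form P^(m+n) = P^(m) P^(n) can be verified numerically; a sketch on a toy two-state chain (the matrix is my own example):

```python
# Numerically check P^(m+n) = P^(m) P^(n) on a toy 2-state chain.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(A, n):
    R = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(n):
        R = matmul(R, A)
    return R

lhs = matpow(P, 5)                        # P^(2+3)
rhs = matmul(matpow(P, 2), matpow(P, 3))  # P^(2) P^(3)
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
print(ok)  # True
```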

Question
We have only been dealing with conditional probabilities, but what we want is to compute the unconditional probability that the system is in state j at time n, i.e., p_n(j) = P[X_n = j].

Result 1
For all n ≥ 1, p_n = p_0 P^n, where p_m = (p_m(0), p_m(1), …) for all m ≥ 0. From the above equation, we deduce that p_{n+1} = p_n P. Assume that lim_{n→∞} p_n(i) exists for all i, and refer to it as p(i). The remaining question is how to compute p.
- Reachable: a state j is reachable from i if P_ij^(n) > 0 for some n ≥ 0.
- Communicate: if j is reachable from i and i is reachable from j, then we say that i and j communicate (i ↔ j).

Result 1 (cont.)
- Irreducible: a M.C. is irreducible if i ↔ j for all i, j ∈ I.
- Aperiodic: for every state i ∈ I, define d(i) to be the greatest common divisor of all integers n such that P_ii^(n) > 0; the chain is aperiodic if d(i) = 1 for all i.

Result 2
Invariant measure of a M.C.: if a M.C. with transition matrix P is irreducible and aperiodic, and if the system of equations p = pP and p·1 = 1 has a strictly positive solution, then p(i) = lim_{n→∞} p_n(i), independently of the initial distribution.
- Invariant equation: p = pP
- Invariant measure: p
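Result 2 can be illustrated by iterating p ← pP on an irreducible, aperiodic chain until it converges to the invariant measure (the three-state matrix below is my own toy example):

```python
# Power iteration: p_{n+1} = p_n P converges to the invariant measure
# for an irreducible, aperiodic chain, independently of p_0.
P = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]

pi = [1/3, 1/3, 1/3]  # arbitrary initial distribution
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])  # invariant measure (0.25, 0.5, 0.25)
```

For this chain the balance equations give p = (1/4, 1/2, 1/4), and one can check directly that pP = p.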

Gambler’s Ruin Problem Consider a gambler who at each play of game has probability p of winning one unit and probability q=1-p of losing one unit. Assuming that successive plays of the game are independent, what is the probability that, starting with i units, the gambler’s fortune will reach N before reaching 0? P. 26

Ans
If we let X_n denote the player's fortune at time n, then the process {X_n, n = 0, 1, 2, …} is a Markov chain with transition probabilities:
- p_00 = p_NN = 1
- p_{i,i+1} = p = 1 − p_{i,i−1}, i = 1, …, N−1
This Markov chain has 3 classes of states: {0}, {1, 2, …, N−1}, and {N}.

Let P_i, i = 0, 1, 2, …, N, denote the probability that, starting with i, the gambler's fortune will eventually reach N. By conditioning on the outcome of the initial play of the game, we obtain
P_i = p P_{i+1} + q P_{i−1},  i = 1, 2, …, N−1.
Since p + q = 1,
P_{i+1} − P_i = (q/p)(P_i − P_{i−1}).
Also, P_0 = 0, so
P_2 − P_1 = (q/p)(P_1 − P_0) = (q/p) P_1
P_3 − P_2 = (q/p)(P_2 − P_1) = (q/p)^2 P_1
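Summing the telescoping differences and using the boundary condition P_N = 1 to pin down P_1 gives the familiar closed form:

```latex
P_i = P_1 \sum_{k=0}^{i-1}\left(\frac{q}{p}\right)^{k}
    = \begin{cases}
        \dfrac{1-(q/p)^{i}}{1-(q/p)^{N}}, & p \neq \tfrac12,\\[1.5ex]
        \dfrac{i}{N}, & p = \tfrac12.
      \end{cases}
```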


If p > ½, there is a positive probability that the gambler's fortune will increase indefinitely. Otherwise, the gambler will, with probability 1, go broke against an infinitely rich adversary.
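The ruin probability can be checked against simulation, using the standard closed form P_i = (1 − (q/p)^i)/(1 − (q/p)^N) for p ≠ ½ (the parameter values below are illustrative):

```python
import random

def ruin_prob_sim(i, N, p, trials, rng):
    """Estimate P(fortune reaches N before 0 | start at i) by simulation."""
    wins = 0
    for _ in range(trials):
        x = i
        while 0 < x < N:
            x += 1 if rng.random() < p else -1  # win or lose one unit
        wins += (x == N)
    return wins / trials

rng = random.Random(1)
i, N, p = 3, 10, 0.6
q = 1 - p
exact = (1 - (q / p)**i) / (1 - (q / p)**N)  # closed form for p != 1/2
est = ruin_prob_sim(i, N, p, 20000, rng)
print(round(exact, 3), round(est, 3))
```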

CTMC
Continuous-Time Markov Chain (continuous time, discrete state):
P[X(t) = j | X(s) = i, X(s_{n−1}) = i_{n−1}, …, X(s_0) = i_0] = P[X(t) = j | X(s) = i]
A continuous-time M.C. is homogeneous if
P[X(t+u) = j | X(s+u) = i] = P[X(t) = j | X(s) = i] = P_ij(t − s), where t > s.
Chapman-Kolmogorov equation: P_ij(t + s) = Σ_k P_ik(t) P_kj(s).

CTMC (cont.)
p(t) = p(0) e^{Qt}, where Q is called the infinitesimal generator.
Proof sketch: the Chapman-Kolmogorov equation gives p(t + h) = p(t) P(h); subtracting p(t), dividing by h, and letting h → 0 yields p′(t) = p(t) Q, whose solution is the matrix exponential.

Result 3
If a continuous-time M.C. with infinitesimal generator Q is irreducible, and if the system of equations pQ = 0 and p·1 = 1 has a strictly positive solution, then p(i) = lim_{t→∞} P(X(t) = i) for all i ∈ I, independently of the initial distribution.
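For a two-state chain the system pQ = 0, p·1 = 1 can be solved by hand and checked in code; a sketch (rates a, b are my own example):

```python
# Two-state CTMC with generator Q = [[-a, a], [b, -b]].
# Balance: pi0 * a = pi1 * b, so pi = (b, a) / (a + b).
a, b = 2.0, 3.0
pi = (b / (a + b), a / (a + b))

Q = [[-a, a], [b, -b]]
# Verify pi Q = 0 (global balance) componentwise.
balance = [sum(pi[i] * Q[i][j] for i in range(2)) for j in range(2)]
print(pi, balance)
```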

Renewal Process

Renewal Process
A counting process {N(t), t ≥ 0} is a renewal process if, with X_n the time between the (n−1)st and nth arrivals, the {X_n, n ≥ 1} are independent with the same distribution F. The time of the nth arrival is S_n = X_1 + … + X_n, with S_0 = 0. We can write N(t) = max{n : S_n ≤ t}, and if μ = E[X_n], n ≥ 1, then the strong law of large numbers says that S_n / n → μ as n → ∞, with probability 1.
Note: μ is now a time interval, not a rate; 1/μ will be called the rate of the renewal process.

Fundamental Relationship
N(t) ≥ n if and only if S_n ≤ t. It follows that
P{N(t) = n} = P{S_n ≤ t} − P{S_{n+1} ≤ t} = F_n(t) − F_{n+1}(t),
where F_n(t) is the n-fold convolution of F with itself. The mean value of N(t) is
m(t) = E[N(t)] = Σ_{n=1}^∞ F_n(t).
Condition on the time of the first renewal to get the renewal equation:
m(t) = F(t) + ∫_0^t m(t − x) dF(x).

Exercise 1 Is it true that:

Exercise 2
If the mean-value function of the renewal process {N(t), t ≥ 0} is given by
then what is P{N(5) = 0}?

Exercise 3
Consider a renewal process {N(t), t ≥ 0} having a gamma(r, λ) interarrival distribution with density f(x) = λ e^{−λx} (λx)^{r−1} / (r−1)!, x > 0.
Show that
Hint: use the relationship between the gamma(r, λ) distribution and the sum of r independent exponentials with rate λ to define N(t) in terms of a Poisson process with rate λ.

Limit Theorems
With probability 1, N(t)/t → 1/μ as t → ∞.
Elementary renewal theorem: m(t)/t → 1/μ as t → ∞.
Central limit theorem: for large t, N(t) is approximately normally distributed with mean t/μ and variance tσ²/μ³, where σ² is the variance of the time between arrivals; in particular,
P{ (N(t) − t/μ) / sqrt(tσ²/μ³) < x } → Φ(x) as t → ∞.
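The first limit theorem, N(t)/t → 1/μ, holds for any interarrival distribution with finite mean; a sketch using Uniform(1, 3) interarrivals (μ = 2, my own choice):

```python
import random

def renewal_count(t, sampler):
    """N(t) for interarrival times drawn from `sampler`."""
    clock, n = 0.0, 0
    while True:
        clock += sampler()
        if clock > t:
            return n
        n += 1

rng = random.Random(3)
mu = 2.0  # mean of Uniform(1, 3) interarrival times
t = 50000.0
n = renewal_count(t, lambda: rng.uniform(1.0, 3.0))
print(n / t)  # approaches 1/mu = 0.5 as t grows
```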

Exercise 4
A machine in use is replaced by a new machine either when it fails or when it reaches the age of T years. If the lifetimes of successive machines are independent with a common distribution F with density f, show that
- the long-run rate at which machines are replaced is 1 / E[min(X, T)], and
- the long-run rate at which machines in use fail equals F(T) / E[min(X, T)],
where X is a machine lifetime and E[min(X, T)] = ∫_0^T x f(x) dx + T(1 − F(T)).
Hint: condition on the lifetime of the first machine.

Renewal Reward Processes
Suppose that each time a renewal occurs we receive a reward. Assume R_n is the reward earned at the nth renewal and {R_n, n ≥ 1} are independent and identically distributed (R_n may depend on X_n). The total reward up to time t is
R(t) = Σ_{n=1}^{N(t)} R_n.
If E[R] = E[R_n] < ∞ and E[X] = E[X_n] < ∞, then with probability 1
R(t)/t → E[R]/E[X]  and  E[R(t)]/t → E[R]/E[X]  as t → ∞.

Age & Excess Life of a Renewal Process
The age at time t is A(t) = t − S_{N(t)}, the amount of time elapsed since the last renewal. The excess life Y(t) = S_{N(t)+1} − t is the time until the next renewal. What is the average value of the age?

Average Age of a Renewal Process
Imagine we receive payment at a rate equal to the current age of the renewal process. Our total reward up to time s is ∫_0^s A(t) dt, and the average reward up to time s is (1/s) ∫_0^s A(t) dt. If X is the length of a renewal cycle, then the total reward during the cycle is ∫_0^X t dt = X²/2. So, by the renewal reward theorem, the average age is E[X²] / (2E[X]).

Average Excess or Residual
Now imagine we receive payment at a rate equal to the current excess of the renewal process. Our total reward up to time s is ∫_0^s Y(t) dt, and the average reward up to time s is (1/s) ∫_0^s Y(t) dt. If X is the length of a renewal cycle, then the total reward during the cycle is ∫_0^X (X − t) dt = X²/2. So, the average excess is (also) E[X²] / (2E[X]).
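The renewal-reward value E[X²]/(2E[X]) can be checked by accumulating the age integral cycle by cycle; a sketch with Uniform(0, 2) cycle lengths (my own choice), for which E[X²]/(2E[X]) = (4/3)/2 = 2/3, strictly larger than E[X]/2 = 1/2:

```python
import random

rng = random.Random(9)
horizon = 100000.0

# Time-average age of a renewal process with Uniform(0, 2) interarrivals,
# averaged over whole cycles: contribution of one cycle is X^2 / 2.
clock, area = 0.0, 0.0
while clock < horizon:
    x = rng.uniform(0.0, 2.0)  # one full renewal cycle
    area += x * x / 2          # integral of the age over the cycle
    clock += x

avg_age = area / clock
print(avg_age)  # approaches E[X^2] / (2 E[X]) = 2/3
```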

Inspection Paradox
Suppose that the distribution of the time between renewals, F, is unknown. One way to estimate it is to choose some sampling times t_1, t_2, etc., and for each t_i, record the total amount of time between the renewals just before and just after t_i. This scheme will overestimate the inter-renewal times. Why? For each sampling time t, we will record X_{N(t)+1} = S_{N(t)+1} − S_{N(t)}. Find its distribution by conditioning on the time of the last renewal prior to time t.


Inspection Paradox (cont.)
For any s,
P{X_{N(t)+1} > x | S_{N(t)} = t − s} = P{X > x | X > s} ≥ P{X > x},
where X is an ordinary inter-renewal time. Intuitively, by choosing “random” times, it is more likely we will choose a time that falls in a long time interval.