CIS 2033 based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Instructor: Longin Jan Latecki. C12: The Poisson process

12.2 – Poisson Distribution

Definition: A discrete random variable X has a Poisson distribution with parameter µ, where µ > 0, if its probability mass function is given by

P(X = k) = (µ^k / k!) e^(−µ),  for k = 0, 1, 2, …,

where µ is the expected number of rare events occurring in the time interval [0, t], which is fixed for X. We can write µ = λt, where t is the length of the interval (e.g., a number of minutes), so λ = µ/t is the expected number of events per time unit; in the Bernoulli approximation described below, each small subinterval of length t/n is a trial with success probability p = λt/n. λ is also called the intensity or frequency of the Poisson process. We denote this distribution Pois(µ) = Pois(λt). Its expectation and variance are E[X] = µ = λt and Var(X) = µ = λt.
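To make the definition concrete, here is a minimal Python sketch (not from the slides; the value µ = 4 and the use of numpy/scipy are assumptions for illustration) that evaluates the Pois(µ) pmf and checks that both the mean and the variance equal µ.

```python
import numpy as np
from scipy.stats import poisson

mu = 4.0                                   # assumed: expected number of events in [0, t]
k = np.arange(0, 20)
pmf = poisson.pmf(k, mu)                   # P(X = k) = mu^k / k! * e^(-mu)
print(pmf[:5])                             # first few probabilities

print(poisson.mean(mu), poisson.var(mu))   # both equal mu

# Monte Carlo check that E[X] = Var(X) = mu
samples = np.random.default_rng(0).poisson(mu, size=100_000)
print(samples.mean(), samples.var())
```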

Let X1, X2, … be the arrival times, so that the number of arrivals in a given time interval [0, t] has a Poisson distribution Pois(λt):

P(Nt = k) = e^(−λt) (λt)^k / k!,  for k = 0, 1, 2, ….

The differences Ti = Xi − Xi−1 are called inter-arrival times or wait times. The inter-arrival times T1 = X1, T2 = X2 − X1, T3 = X3 − X2, … are independent random variables, each with an Exp(λ) distribution. Hence the expected inter-arrival time is E[Ti] = 1/λ. Since for the Poisson process λ = µ/t = (number of events) / (time unit), the exponential distribution gives E[Ti] = 1/λ = t/µ = (time unit) / (number of events) = expected wait time.
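The following sketch (the parameter values λ = 2 and t = 10 and the helper name count_arrivals are assumptions, not from the slides) simulates a Poisson process by drawing Exp(λ) inter-arrival times and counting how many arrivals fall in [0, t]; the sample mean and variance of the count should both be close to λt.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t = 2.0, 10.0                         # assumed: rate 2 events per time unit, horizon 10

def count_arrivals(lam, t, rng):
    """Count arrivals in [0, t] by summing Exp(lam) inter-arrival times."""
    total, count = 0.0, 0
    while True:
        total += rng.exponential(1.0 / lam)   # T_i ~ Exp(lam), mean 1/lam
        if total > t:
            return count
        count += 1

counts = np.array([count_arrivals(lam, t, rng) for _ in range(20_000)])
print(counts.mean(), counts.var())         # both should be close to lam * t = 20
```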

Let X1, X2, … be arrival times such that the number of arrivals in a given time interval [0, t] has a Poisson distribution Pois(λt):

P(Nt = k) = e^(−λt) (λt)^k / k!,  for k = 0, 1, 2, ….

Each arrival time Xi is a random variable with a Gam(i, λ) distribution, i.e., a gamma distribution with shape parameter α = i and rate λ, whose density is

f(x) = λ^α x^(α−1) e^(−λx) / Γ(α),  for x ≥ 0.

We also observe that Gam(1, λ) = Exp(λ): setting α = 1 gives f(x) = λ e^(−λx).
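A short simulation sketch of this fact (the values i = 5 and λ = 2 are assumed for illustration): the i-th arrival time is the sum of i independent Exp(λ) inter-arrival times, so its simulated mean and variance should match the Gam(i, λ) values i/λ and i/λ².

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(2)
lam, i = 2.0, 5                            # assumed rate and arrival index
# X_i is the sum of i independent Exp(lam) inter-arrival times
arrivals = rng.exponential(1.0 / lam, size=(100_000, i)).sum(axis=1)

print(arrivals.mean(), arrivals.var())     # simulated mean and variance
print(gamma.mean(a=i, scale=1.0 / lam),    # theoretical mean i/lam
      gamma.var(a=i, scale=1.0 / lam))     # theoretical variance i/lam^2
```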

Example from the Baron book (Probability and Statistics for Computer Scientists):

12.2 – Random arrivals

Example: telephone call arrival times. Calls arrive at random times X1, X2, X3, …

Homogeneity (also known as weak stationarity): the rate λ at which arrivals occur is constant over time; in a subinterval of length u the expected number of telephone calls is λu.

Independence: the numbers of arrivals in disjoint time intervals are independent random variables.

Let N(I) = total number of calls in an interval I and Nt = N([0, t]), so E[Nt] = λt. Divide the interval [0, t] into n subintervals, each of length t/n.

12.2 – Random arrivals (continued)

When n is large enough, every subinterval Ij,n = ((j−1)t/n, jt/n] contains either 0 or 1 arrivals. For such a large n (n > λt), let Rj = the number of arrivals in the time interval Ij,n; then Rj = 0 or 1, so Rj has a Ber(p) distribution for some p.

Recall that for a Bernoulli random variable E[Rj] = 0 · (1 − p) + 1 · p = p. By the homogeneity assumption, for each j, p = λ · (length of Ij,n) = λt/n.

The total number of calls is Nt = R1 + R2 + … + Rn. By the independence assumption the Rj are independent random variables, so Nt has a Bin(n, p) distribution with p = λt/n. As n goes to infinity, Bin(n, λt/n) converges to the Pois(λt) distribution.
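A small numerical sketch of this limit (the values λ = 2 and t = 3 are assumed, not from the slides): as n grows, the Bin(n, λt/n) probabilities approach the Pois(λt) probabilities.

```python
import numpy as np
from scipy.stats import binom, poisson

lam, t = 2.0, 3.0                          # assumed rate and interval length
k = np.arange(0, 15)
for n in (10, 100, 10_000):
    p = lam * t / n                        # success probability per subinterval
    gap = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam * t)))
    print(n, gap)                          # the largest pmf difference shrinks toward 0
```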