Lectures prepared by: Elchanan Mossel Yelena Shvets

Introduction to Probability, Stat 134, Fall 2005, Berkeley. Lectures prepared by: Elchanan Mossel, Yelena Shvets. Follows Jim Pitman's book: Probability, Section 3.5.

The Poisson(m) Distribution The Poisson distribution with parameter m, or Poisson(m) distribution, is the distribution of probabilities P_m(k) over {0, 1, 2, …} defined by: P_m(k) = e^(−m) m^k / k!, for k = 0, 1, 2, ….
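As a quick sanity check on the definition, the Poisson(m) probabilities can be computed directly. The following Python sketch (not part of the original lecture) evaluates P_m(k) and confirms numerically that the probabilities sum to 1:

```python
from math import exp, factorial

def poisson_pmf(k, m):
    # P_m(k) = e^(-m) * m^k / k!  for k = 0, 1, 2, ...
    return exp(-m) * m**k / factorial(k)

# The probabilities over {0, 1, 2, ...} sum to 1; numerically, the first
# 50 terms already account for essentially all of the mass when m = 3.
total = sum(poisson_pmf(k, 3.0) for k in range(50))
```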

Poisson Distribution Recall that Poisson(m) is a good approximation to the Bin(n, p) distribution when n is large, p is small, and m = np.
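To see how good this approximation is, one can compare the two probability mass functions numerically. This Python sketch (illustrative only; the values n = 1000 and p = 0.003 are arbitrary choices, not from the lecture) computes the total variation distance between Bin(n, p) and Poisson(np):

```python
from math import comb, exp, factorial

n, p = 1000, 0.003   # n large, p small (arbitrary illustrative values)
m = n * p            # m = np = 3.0

def binom_pmf(k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    return exp(-m) * m**k / factorial(k)

# Total variation distance; terms beyond k = 40 are negligible for both
# pmfs, so truncating the sum there loses essentially nothing.
tv = 0.5 * sum(abs(binom_pmf(k) - poisson_pmf(k)) for k in range(41))
```

By Le Cam's inequality the total variation distance is at most np^2 = 0.009 here, so the two distributions are nearly indistinguishable.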

Poisson Distribution Example: N_galaxies, the number of supernovae found in a square inch of the sky, is distributed according to the Poisson distribution.

Poisson Distribution Example: N_drops, the number of raindrops falling on a given square inch of the roof in a given period of time, has a Poisson distribution.

Poisson Distribution: Mean and SD Since N = Poisson(m) ~ Bin(n, m/n) for large n, we expect that E(N) = np = m and SD(N) = √(np(1 − p)) ≈ √m. Next we will verify this.

Mean of the Poisson Distribution By direct computation: E(N) = Σ_{k≥0} k · e^(−m) m^k / k! = m e^(−m) Σ_{k≥1} m^(k−1) / (k−1)! = m e^(−m) e^m = m.

SD of the Poisson Distribution Similarly, E(N(N−1)) = Σ_{k≥2} k(k−1) · e^(−m) m^k / k! = m² e^(−m) Σ_{k≥2} m^(k−2) / (k−2)! = m². Hence Var(N) = E(N(N−1)) + E(N) − (E(N))² = m² + m − m² = m, and SD(N) = √m.
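These two facts (mean m and SD √m) can also be checked by simulation. A minimal Python sketch follows; it samples Poisson variables with Knuth's multiplication method, which is an assumption of this example rather than a method from the lecture:

```python
import random
from math import exp, sqrt

random.seed(0)

def sample_poisson(m):
    # Knuth's multiplication method: count how many uniform draws keep
    # the running product above e^(-m).
    L, k, prod = exp(-m), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

m = 4.0
samples = [sample_poisson(m) for _ in range(100_000)]
mean = sum(samples) / len(samples)
sd = sqrt(sum((x - mean) ** 2 for x in samples) / len(samples))
# mean should be close to m = 4 and sd close to sqrt(m) = 2
```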

Sum of Independent Poissons Consider two independent random variables N1 ~ Poisson(m1) and N2 ~ Poisson(m2). We approximately have N1 ~ Bin(m1/p, p) and N2 ~ Bin(m2/p, p), where p is small. So if we pretend that m1/p and m2/p are integers, then: N1 + N2 ~ Bin((m1 + m2)/p, p) ~ Poisson(m1 + m2).

Sum of Independent Poissons Claim: if Ni ~ Poisson(mi) are independent, then N1 + N2 + … + Nr ~ Poisson(m1 + … + mr). Proof (for r = 2): P(N1 + N2 = k) = Σ_{j=0}^{k} P(N1 = j) P(N2 = k − j) = Σ_{j=0}^{k} e^(−m1) m1^j / j! · e^(−m2) m2^(k−j) / (k−j)! = (e^(−(m1+m2)) / k!) Σ_{j=0}^{k} C(k, j) m1^j m2^(k−j) = e^(−(m1+m2)) (m1 + m2)^k / k!, which is the Poisson(m1 + m2) probability.
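The claim can also be verified numerically: the convolution of the two pmfs should match the Poisson(m1 + m2) pmf term by term. A Python sketch (the values m1 = 2.0 and m2 = 3.5 are arbitrary choices for the check):

```python
from math import exp, factorial

def poisson_pmf(k, m):
    return exp(-m) * m**k / factorial(k)

m1, m2 = 2.0, 3.5  # arbitrary parameters for the check

def conv_pmf(k):
    # P(N1 + N2 = k) computed by direct convolution of the two pmfs.
    return sum(poisson_pmf(j, m1) * poisson_pmf(k - j, m2) for j in range(k + 1))

# The convolution should match Poisson(m1 + m2) term by term.
err = max(abs(conv_pmf(k) - poisson_pmf(k, m1 + m2)) for k in range(30))
```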

Poisson Scatter The Poisson Scatter Theorem considers particles hitting a square, where: (1) different particles hit different points, and (2) if we partition the square into n equal-area squares, then each small square is hit with the same probability p_n, independently of the hits on all other squares.

Poisson Scatter The Poisson Scatter Theorem states that under these two assumptions there exists a number λ > 0 such that: (1) for each set B, the number N(B) of hits in B satisfies N(B) ~ Poisson(λ × Area(B)); (2) for disjoint sets B1, …, Br, the numbers of hits N(B1), …, N(Br) are independent. The process defined by the theorem is called a Poisson scatter with intensity λ.
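The theorem's first conclusion can be illustrated by simulation: scatter a Poisson(λ) number of uniform points on the unit square and count the hits falling in the left half B; the count should have mean and variance both close to λ × Area(B). A Python sketch (λ = 50 is an arbitrary choice, and the Poisson sampler is an assumption of this example):

```python
import random
from math import exp

random.seed(1)

lam = 50.0  # intensity: expected number of hits per unit area (arbitrary)

def sample_poisson(m):
    # Knuth's multiplication method for Poisson sampling.
    L, k, prod = exp(-m), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

def scatter_unit_square(lam):
    # A Poisson scatter on the unit square: the total number of hits is
    # Poisson(lam * Area) = Poisson(lam); given the total, the positions
    # are i.i.d. uniform on the square.
    n = sample_poisson(lam)
    return [(random.random(), random.random()) for _ in range(n)]

# Hits in the left half B = [0, 0.5] x [0, 1] should be Poisson(lam / 2):
# mean and variance both close to 25.
counts = [sum(x < 0.5 for x, _ in scatter_unit_square(lam)) for _ in range(5000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
```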

Poisson Scatter (figure slides; the illustrations are not reproduced in this transcript)

Poisson Scatter Thinning Claim: Suppose that in a Poisson scatter with intensity λ each point is kept with probability p and erased with probability 1 − p, independently of its position and of the erasure of all other points. Then the scatter of kept points is a Poisson scatter with intensity pλ.
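The thinning claim predicts that keeping each point with probability p turns Poisson(λ) counts into Poisson(pλ) counts, so both the mean and the variance of the kept count should be near pλ. A Python sketch checking this (λ = 10 and p = 0.3 are arbitrary, and the Poisson sampler is an assumption of this example):

```python
import random
from math import exp

random.seed(2)

def sample_poisson(m):
    # Knuth's multiplication method for Poisson sampling.
    L, k, prod = exp(-m), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

lam, p = 10.0, 0.3  # arbitrary intensity and keep-probability

# Thin each Poisson(lam) count: keep each point independently with
# probability p, erase it otherwise.
kept = []
for _ in range(50_000):
    n = sample_poisson(lam)
    kept.append(sum(random.random() < p for _ in range(n)))

mean = sum(kept) / len(kept)
var = sum((x - mean) ** 2 for x in kept) / len(kept)
# Both should be close to p * lam = 3, as for a Poisson(3) count.
```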