9. Limit Theorems

Many times we do not need to calculate probabilities exactly. An approximate or qualitative estimate often suffices. P(magnitude 7+ earthquake within 10 years) = ?

I toss a coin 1000 times. The probability that I get a streak of 3 consecutive heads is: < 10%, ≈ 50%, or > 90%?

I toss a coin 1000 times. The probability that I get a streak of 14 consecutive heads is: < 10%, ≈ 50%, or > 90%?
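
A quick simulation makes both answers plausible (a sketch, not from the slides; the streak-testing helper is my own): a run of 3 heads is all but guaranteed in 1000 tosses, while a run of 14 shows up only a few percent of the time.

```python
import random

# Estimate the chance of seeing a streak of k consecutive heads in 1000 tosses.
def has_streak(tosses, k):
    run = 0
    for heads in tosses:
        run = run + 1 if heads else 0
        if run >= k:
            return True
    return False

random.seed(0)
trials = 2_000
for k in (3, 14):
    hits = sum(has_streak([random.random() < 0.5 for _ in range(1000)], k)
               for _ in range(trials))
    print(f"k={k}: P(streak of {k}) ~ {hits / trials:.3f}")
```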

Markov’s inequality. For every non-negative random variable X and every value a > 0: P(X ≥ a) ≤ E[X] / a. Proof: since X ≥ a·1{X ≥ a}, taking expectations gives E[X] ≥ a·P(X ≥ a).
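
As a numerical sanity check (my sketch, not part of the slides), here is Markov's inequality tested for an Exponential(1) variable, where E[X] = 1 and the bound is P(X ≥ a) ≤ 1/a:

```python
import random

# Monte Carlo check of Markov's inequality for X ~ Exponential(1).
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]

for a in [1, 2, 4, 8]:
    tail = sum(x >= a for x in samples) / len(samples)
    print(f"a={a}: P(X >= a) ~ {tail:.4f}  <=  E[X]/a = {1/a:.4f}")
```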

1000 people throw their hats in the air. What is the probability at least 100 people get their hat back?
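
One way to see what Markov's inequality says here: the number of matches X is the number of fixed points of a random permutation, and E[X] = 1 (each person has a 1/1000 chance, times 1000 people), so P(X ≥ 100) ≤ 1/100. A small simulation sketch (my own, not from the slides) of typical match counts:

```python
import random

# Each trial: shuffle 1000 hats and count how many land on their owner.
random.seed(0)
n, trials = 1000, 10_000
counts = []
for _ in range(trials):
    hats = list(range(n))
    random.shuffle(hats)
    counts.append(sum(i == h for i, h in enumerate(hats)))

print("average matches:", sum(counts) / trials)  # close to E[X] = 1
# Markov allows up to 1/100, but the event is in fact far rarer than that.
print("P(X >= 100) estimate:", sum(c >= 100 for c in counts) / trials)
```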

X = Uniform(0, 4). How does P(X ≥ x) compare with Markov’s inequality?
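
Here the comparison can be done exactly: E[X] = 2, so Markov gives P(X ≥ x) ≤ 2/x, while the true tail is (4 – x)/4 for 0 ≤ x ≤ 4. A few lines to tabulate both (an illustrative sketch):

```python
# X ~ Uniform(0, 4): E[X] = 2, and P(X >= x) = (4 - x)/4 for 0 <= x <= 4.
for x in [1.0, 2.0, 3.0, 4.0]:
    exact = (4 - x) / 4
    markov = 2 / x
    print(f"x={x}: exact={exact:.3f}, Markov bound={min(markov, 1):.3f}")
```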

I toss a coin 1000 times. What is the probability I get 3 consecutive heads (a) at least 700 times? (b) at most 50 times?

Chebyshev’s inequality. For every random variable X and every t > 0: P(|X – μ| ≥ tσ) ≤ 1/t², where μ = E[X] and σ = √Var[X].

Markov’s inequality: P(X ≥ a) ≤ μ/a. Chebyshev’s inequality: P(|X – μ| ≥ tσ) ≤ 1/t². [Figure: the p.m.f./p.d.f. of X, with Markov bounding the tail to the right of a and Chebyshev bounding the two tails outside μ – tσ and μ + tσ.]

I toss a coin 64 times. What is the probability I get at most 24 heads?
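
With μ = 32 and σ = 4, the event X ≤ 24 implies |X – μ| ≥ 2σ, so Chebyshev bounds its probability by 1/4. A sketch comparing the bound with the exact binomial tail:

```python
from math import comb

# X ~ Binomial(64, 1/2): mu = 32, sigma = 4.
# Chebyshev: P(X <= 24) <= P(|X - 32| >= 2*sigma) <= 1/2^2 = 0.25.
exact = sum(comb(64, k) for k in range(25)) / 2**64
print("Chebyshev bound:", 0.25)
print("exact P(X <= 24):", exact)   # about 0.03
```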

Polling. 👩🏻‍🦳👩🏻‍🎤👵🤰🏽🧔🏼👧🏻👨🏻‍🏫🧕👨‍✈️👨🏾‍🎨 answer 👍👍👎👍👎👎👎👎👍👎. Let Xi = 1 if person i answers 👍 and Xi = 0 otherwise, so X = X1 + … + Xn counts the 👍 responses.

Polling. How accurate is the pollster’s estimate X/n? With μ = E[Xi] and σ = √Var[Xi]: E[X] = μn and Var[X] = σ²n.

Polling. By Chebyshev’s inequality, P(|X/n – μ| ≥ ε) ≤ Var[X/n]/ε² = σ²/(nε²).

The weak law of large numbers. X1, …, Xn are independent with the same PMF/PDF, μ = E[Xi], σ = √Var[Xi], X = X1 + … + Xn. For every ε, δ > 0 and n ≥ σ²/(ε²δ): P(|X/n – μ| ≥ ε) ≤ δ.

We want confidence error δ = 10% and sampling error ε = 5%. How many people should we poll?
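
For yes/no answers each Xi is Bernoulli, so σ² = p(1 – p) ≤ 1/4. Plugging the worst case into n ≥ σ²/(ε²δ) gives the answer; a sketch of the arithmetic:

```python
# Weak-law bound: n >= sigma^2 / (eps^2 * delta), with sigma^2 <= 1/4 for Bernoulli.
eps, delta = 0.05, 0.10
n = (1 / 4) / (eps**2 * delta)
print("poll at least", round(n), "people")   # 1000
```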

1000 people throw their hats in the air. What is the probability at least 100 people get their hat back?

I toss a coin 1000 times. What is the probability I get 3 consecutive heads (a) at least 250 times? (b) at most 50 times?

A polling simulation. [Plot: the pollster’s estimate (X1 + … + Xn)/n against the number of people polled n, for X1, …, Xn independent Bernoulli(1/2).]

A polling simulation. [Plot: 20 simulated runs of the estimate (X1 + … + Xn)/n against the number of people polled n.]
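
A few lines of Python can reproduce this experiment (a sketch under the same Bernoulli(1/2) model; the slides’ plots come from their own simulation):

```python
import random

# One polling run: running estimate (X1 + ... + Xn)/n for Bernoulli(1/2) responses.
random.seed(1)
total = 0
for n in range(1, 10_001):
    total += random.random() < 0.5
    if n in (10, 100, 1000, 10_000):
        print(f"n={n}: estimate = {total / n:.3f}")
```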

X1, …, Xn are independent with the same PMF/PDF. Let’s assume n is large. Weak law of large numbers: X1 + … + Xn ≈ μn with high probability. Chebyshev’s inequality says P(|X – μn| ≥ tσ√n) ≤ 1/t², which suggests X1 + … + Xn ≈ μn + Tσ√n for some random T of constant order.

Some experiments. X = X1 + … + Xn. [Histogram of X for Xi independent Bernoulli(1/2), n = 6.]

[Histograms of X = X1 + … + Xn for Xi independent Poisson(1), n = 3 and n = 20.]

[Histograms of X = X1 + … + Xn for Xi independent Uniform(0, 1), n = 2 and n = 10.]
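
These experiments are easy to rerun. Here is a sketch for the Uniform(0, 1) case with n = 10, printing a crude text histogram in which the bell shape is already visible:

```python
import random
from collections import Counter

# Text histogram of X = X1 + ... + Xn for Xi ~ Uniform(0, 1), n = 10.
random.seed(0)
n, trials = 10, 20_000
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

hist = Counter(round(s * 2) / 2 for s in sums)   # bins of width 0.5
for bucket in sorted(hist):
    print(f"{bucket:4.1f} | {'#' * (hist[bucket] // 200)}")
```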

The standard normal density: f(t) = (2π)^(-1/2) e^(-t²/2).

The central limit theorem. X1, …, Xn are independent with the same PMF/PDF, μ = E[Xi], σ = √Var[Xi], X = X1 + … + Xn. For every t (positive or negative): lim(n → ∞) P(X ≤ μn + tσ√n) = P(N ≤ t), where N is a standard normal random variable.

eventually, everything is normal

Toss a die 100 times. What is the probability that the sum of the outcomes exceeds 400?
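
The CLT reduces this to a normal-tail computation: each toss has μ = 3.5 and σ² = 35/12, so the sum has mean 350 and standard deviation √(3500/12) ≈ 17.1, putting 400 almost three standard deviations above the mean. A sketch using the standard library:

```python
from math import sqrt
from statistics import NormalDist

# Sum of 100 die tosses: mean 100 * 3.5 = 350, variance 100 * 35/12.
mu, var = 350, 100 * 35 / 12
t = (400 - mu) / sqrt(var)                         # about 2.93
print("P(sum > 400) ~", 1 - NormalDist().cdf(t))   # about 0.0017
```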

We want confidence error δ = 1% and sampling error ε = 5%. How many people should we poll?
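
Under the CLT approximation, P(|X/n – μ| ≥ ε) ≈ 2(1 – Φ(ε√n/σ)) with Φ the standard normal CDF, and σ ≤ 1/2 for Bernoulli responses; solving for n gives a far smaller poll than Chebyshev required. A sketch of that calculation:

```python
from statistics import NormalDist

# CLT approximation: need eps * sqrt(n) / sigma >= z, where Phi(z) = 1 - delta/2.
eps, delta, sigma = 0.05, 0.01, 0.5
z = NormalDist().inv_cdf(1 - delta / 2)     # about 2.576
n = (z * sigma / eps) ** 2
print("poll roughly", round(n), "people")   # about 664
```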

Drop three points at random on a square. What is the probability that they form an acute triangle?
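
No simple closed form is needed to estimate this: Monte Carlo it. The sketch below (my own, not from the slides) samples three uniform points and tests acuteness via dot products at each vertex:

```python
import random

def is_acute(p, q, r):
    # A triangle is acute iff at every vertex the two edge vectors
    # have a positive dot product (every angle's cosine is positive).
    pts = [p, q, r]
    for i in range(3):
        a, b, c = pts[i], pts[(i + 1) % 3], pts[(i + 2) % 3]
        ab = (b[0] - a[0], b[1] - a[1])
        ac = (c[0] - a[0], c[1] - a[1])
        if ab[0] * ac[0] + ab[1] * ac[1] <= 0:
            return False
    return True

random.seed(0)
trials = 100_000
hits = sum(is_acute(*[(random.random(), random.random()) for _ in range(3)])
           for _ in range(trials))
print("P(acute) ~", hits / trials)   # roughly 0.27
```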

method                      requirements                  weakness
Markov’s inequality         E[X] only                     one-sided, often imprecise
Chebyshev’s inequality      E[X] and Var[X]               often imprecise
weak law of large numbers   pairwise independence         often imprecise
central limit theorem       independence of many samples  no rigorous bound

The strong law of large numbers

The strong law of large numbers. X1, X2, … are independent with the same PMF/PDF, μ = E[Xi], X = X1 + … + Xn. If E[Xi⁴] is finite, then P(lim(n → ∞) X/n = μ) = 1.
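
The strong law concerns a single infinite run: along one sequence of tosses, the running average itself settles at μ. A sketch watching one run of die tosses:

```python
import random

# One run of die tosses: the running average X/n should converge to mu = 3.5.
random.seed(0)
total = 0
for n in range(1, 1_000_001):
    total += random.randint(1, 6)
    if n in (10, 1000, 100_000, 1_000_000):
        print(f"n={n}: running average = {total / n:.4f}")
```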