Law of Large Numbers (Week 12)

Toss a coin n times. Suppose the X_i are Bernoulli random variables with p = ½, so E(X_i) = ½. The proportion of heads is X̄_n = (X_1 + … + X_n)/n. Intuitively, X̄_n approaches ½ as n → ∞.
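This intuition is easy to check numerically. A minimal simulation (not part of the original slides) tossing a fair coin for increasing n:

```python
import random

random.seed(0)

def proportion_of_heads(n):
    """Toss a fair coin n times and return the proportion of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n))
    return heads / n

# The proportion drifts toward 1/2 as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, proportion_of_heads(n))
```

For small n the proportion can sit noticeably away from ½; by n = 1,000,000 it is typically within a few thousandths of ½.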
Markov's Inequality

If X is a non-negative random variable with E(X) < ∞, then for any a > 0,

P(X ≥ a) ≤ E(X)/a.
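As an illustration (not part of the original slides), an empirical check of Markov's inequality for an Exponential(1) variable, which is non-negative with E(X) = 1, so the bound is P(X ≥ a) ≤ 1/a:

```python
import random

random.seed(1)

# Empirical check of Markov's inequality for X ~ Exponential(1):
# E(X) = 1, so P(X >= a) <= 1/a for every a > 0.
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

for a in (1.0, 2.0, 3.0):
    tail = sum(x >= a for x in samples) / n
    bound = 1.0 / a  # E(X) / a
    print(a, tail, bound)
```

The empirical tails (about e^{-a}) sit well below the bound 1/a, which is typical: Markov's inequality trades tightness for generality.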
Chebyshev's Inequality

For a random variable X with E(X) = μ and V(X) = σ² < ∞, and any a > 0,

P(|X − μ| ≥ a) ≤ σ²/a².

Proof: (X − μ)² is a non-negative random variable, so by Markov's inequality

P(|X − μ| ≥ a) = P((X − μ)² ≥ a²) ≤ E[(X − μ)²]/a² = σ²/a².
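An empirical check (added for illustration) using Uniform(0, 1), for which μ = ½ and σ² = 1/12:

```python
import random

random.seed(2)

# Empirical check of Chebyshev's inequality for X ~ Uniform(0, 1):
# mu = 1/2, sigma^2 = 1/12, so P(|X - mu| >= a) <= (1/12) / a^2.
n = 100_000
samples = [random.random() for _ in range(n)]
mu, var = 0.5, 1.0 / 12.0

for a in (0.25, 0.4, 0.45):
    tail = sum(abs(x - mu) >= a for x in samples) / n
    print(a, tail, var / a**2)
```

For a = 0.4 the true tail is 0.2 while the bound is about 0.52, so Chebyshev is loose here too, but it holds for every distribution with finite variance.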
Back to the Law of Large Numbers

We are interested in a sequence of random variables X_1, X_2, X_3, … that are independent and identically distributed (i.i.d.). Let X̄_n = (X_1 + … + X_n)/n. Suppose E(X_i) = μ and V(X_i) = σ²; then E(X̄_n) = μ and V(X̄_n) = σ²/n. Intuitively, V(X̄_n) → 0 as n → ∞, so X̄_n concentrates around μ.
Formally, the Weak Law of Large Numbers (WLLN) states the following: suppose X_1, X_2, X_3, … are i.i.d. with E(X_i) = μ < ∞ and V(X_i) = σ² < ∞. Then for any positive number a,

P(|X̄_n − μ| ≥ a) → 0 as n → ∞.

This is called convergence in probability.

Proof: apply Chebyshev's inequality to X̄_n, which has mean μ and variance σ²/n:

P(|X̄_n − μ| ≥ a) ≤ V(X̄_n)/a² = σ²/(na²) → 0 as n → ∞.
Example

Flip a coin 10,000 times, so E(X_i) = ½ and V(X_i) = ¼. Take a = 0.01; then by Chebyshev's inequality

P(|X̄_n − ½| ≥ 0.01) ≤ (¼)/(10,000 · 0.01²) = 0.25.

Chebyshev's inequality gives a very weak upper bound here, but it works regardless of the distribution of the X_i.
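To see how weak the bound 0.25 is, a small simulation (added for illustration) estimates the actual probability that the proportion of heads in 10,000 fair flips deviates from ½ by at least 0.01:

```python
import random

random.seed(3)

# Chebyshev bounds P(|Xbar - 1/2| >= 0.01) by 0.25 for n = 10,000 flips.
# Simulate repeated experiments to see how conservative that bound is.
n_flips, trials, a = 10_000, 1_000, 0.01
count = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if abs(heads / n_flips - 0.5) >= a:
        count += 1
freq = count / trials
print(freq)  # typically around 0.05, far below the Chebyshev bound of 0.25
```

The simulated frequency is roughly 0.05 (the normal approximation gives about 0.0455), a factor of five below the distribution-free Chebyshev bound.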
Strong Law of Large Numbers

Suppose X_1, X_2, X_3, … are i.i.d. with E(X_i) = μ < ∞. Then X̄_n converges to μ as n → ∞ with probability 1, that is,

P(lim_{n→∞} X̄_n = μ) = 1.

This is called convergence almost surely.
Continuity Theorem for MGFs

Let X be a random variable such that for some t_0 > 0 we have m_X(t) < ∞ for |t| ≤ t_0. Further, if X_1, X_2, … is a sequence of random variables with m_{X_n}(t) < ∞ and m_{X_n}(t) → m_X(t) for all |t| ≤ t_0, then {X_n} converges in distribution to X.

This theorem can also be stated as follows: let F_n be a sequence of cdfs with corresponding mgfs m_n, and let F be a cdf with mgf m. If m_n(t) → m(t) for all t in an open interval containing zero, then F_n(x) → F(x) at all continuity points of F.

Example: the Poisson distribution can be approximated by a normal distribution for large λ.
Example to Illustrate the Continuity Theorem

Let λ_1, λ_2, … be an increasing sequence with λ_n → ∞ as n → ∞, and let {X_n} be a sequence of Poisson random variables with the corresponding parameters. We know that E(X_n) = λ_n = V(X_n). Let

Z_n = (X_n − λ_n)/√λ_n;

then E(Z_n) = 0 and V(Z_n) = 1. We can show that the mgf of Z_n converges to the mgf of a standard normal random variable, so Z_n converges in distribution to Z ~ N(0, 1).
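The mgf computation behind this claim can be sketched as follows (a derivation consistent with the setup above, using the Poisson mgf m_X(t) = exp(λ(e^t − 1))):

```latex
\begin{align*}
m_{Z_n}(t) &= E\!\left[e^{tZ_n}\right]
            = e^{-t\sqrt{\lambda_n}}\,E\!\left[e^{tX_n/\sqrt{\lambda_n}}\right]
            = e^{-t\sqrt{\lambda_n}} \exp\!\left(\lambda_n\!\left(e^{t/\sqrt{\lambda_n}} - 1\right)\right) \\
           &= \exp\!\left(-t\sqrt{\lambda_n}
              + \lambda_n\!\left(\frac{t}{\sqrt{\lambda_n}} + \frac{t^2}{2\lambda_n}
              + O\!\left(\lambda_n^{-3/2}\right)\right)\right)
            = \exp\!\left(\frac{t^2}{2} + O\!\left(\lambda_n^{-1/2}\right)\right)
            \;\longrightarrow\; e^{t^2/2}.
\end{align*}
```

Since e^{t²/2} is the mgf of N(0, 1), the continuity theorem gives convergence in distribution.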
Example

Suppose X is a Poisson(900) random variable. Find P(X > 950).
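A worked solution sketch (added for illustration): X has mean 900 and standard deviation √900 = 30, so the normal approximation with continuity correction gives P(X > 950) ≈ 1 − Φ((950.5 − 900)/30). The snippet below also computes the exact Poisson tail in log space, since e^{−900} underflows directly:

```python
import math

# Normal approximation: X ~ Poisson(900) has mean 900 and sd sqrt(900) = 30.
mu, sigma = 900.0, 30.0

def std_normal_cdf(x):
    return 0.5 * math.erfc(-x / math.sqrt(2))

# P(X > 950) = P(X >= 951); with continuity correction use 950.5.
approx = 1.0 - std_normal_cdf((950.5 - mu) / sigma)

# Exact Poisson tail via the log of the pmf (avoids underflow of e^{-900}).
def poisson_cdf(k, lam):
    total = 0.0
    for j in range(k + 1):
        total += math.exp(-lam + j * math.log(lam) - math.lgamma(j + 1))
    return total

exact = 1.0 - poisson_cdf(950, 900)
print(approx, exact)  # both about 0.046
```

Both values come out near 0.046, confirming that the normal approximation is very accurate for λ this large.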
Central Limit Theorem

The central limit theorem is concerned with the limiting behaviour of sums of random variables. If X_1, X_2, … is a sequence of i.i.d. random variables with mean μ and variance σ², and S_n = X_1 + … + X_n, then by the WLLN we have S_n/n → μ in probability. The CLT is concerned not just with the fact of convergence but with how S_n/n fluctuates around μ. Note that E(S_n) = nμ and V(S_n) = nσ². The standardized version of S_n is

Z_n = (S_n − nμ)/(σ√n),

and we have E(Z_n) = 0, V(Z_n) = 1.
The Central Limit Theorem

Let X_1, X_2, … be a sequence of i.i.d. random variables with E(X_i) = μ < ∞ and Var(X_i) = σ² < ∞. Suppose the common distribution function F_X(x) and the common moment generating function m_X(t) are defined in a neighborhood of 0. Let

Z_n = (S_n − nμ)/(σ√n), where S_n = X_1 + … + X_n.

Then, for −∞ < x < ∞,

P(Z_n ≤ x) → Φ(x) as n → ∞,

where Φ(x) is the cdf of the standard normal distribution. This is equivalent to saying that Z_n converges in distribution to Z ~ N(0, 1). Also,

√n(X̄_n − μ)/σ converges in distribution to Z ~ N(0, 1).
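A quick numerical illustration (not part of the original slides): standardize sums of i.i.d. Uniform(0, 1) draws, for which μ = ½ and σ² = 1/12, and compare the empirical value of P(Z_n ≤ 1) with Φ(1):

```python
import math
import random

random.seed(4)

# Standardize sums of n i.i.d. Uniform(0,1) draws (mu = 1/2, sigma^2 = 1/12)
# and estimate P(Z_n <= 1), which the CLT says should be close to Phi(1).
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)
n, trials = 30, 20_000

count = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / (sigma * math.sqrt(n))
    if z <= 1.0:
        count += 1

phi_1 = 0.5 * math.erfc(-1.0 / math.sqrt(2))  # Phi(1), about 0.8413
print(count / trials, phi_1)
```

Even at n = 30 the empirical probability matches Φ(1) ≈ 0.8413 to about two decimal places, which is why the normal approximation is so widely used.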
Example

Suppose X_1, X_2, … are i.i.d. random variables, each with the Poisson(3) distribution, so E(X_i) = V(X_i) = 3. The CLT says that

(S_n − 3n)/√(3n) converges in distribution to N(0, 1) as n → ∞.
Examples

A very common application of the CLT is the normal approximation to the binomial distribution. Suppose X_1, X_2, … are i.i.d. random variables, each with the Bernoulli(p) distribution, so E(X_i) = p and V(X_i) = p(1 − p). The CLT says that

(S_n − np)/√(np(1 − p)) converges in distribution to N(0, 1) as n → ∞.

Let Y_n = X_1 + … + X_n; then Y_n has a Binomial(n, p) distribution, so for large n, Y_n is approximately N(np, np(1 − p)).

1. Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads.

2. Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair?
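A solution sketch for both questions (added for illustration), using the normal approximation with a continuity correction:

```python
import math

def std_normal_cdf(x):
    return 0.5 * math.erfc(-x / math.sqrt(2))

def binomial_tail_normal(n, p, k):
    """Normal approximation (with continuity correction) to P(Y >= k)
    for Y ~ Binomial(n, p)."""
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    return 1.0 - std_normal_cdf((k - 0.5 - mean) / sd)

# Q1: 1000 tosses with p = 0.6 -- probability of at least 550 heads.
# Mean 600, sd = sqrt(240), so 550 is more than 3 sd below the mean.
p_550 = binomial_tail_normal(1000, 0.6, 550)

# Q2: if the coin were fair (p = 0.5), how likely are 60 or more heads
# in 100 tosses?  Mean 50, sd 5, so 60 is about 2 sd above the mean.
p_60 = binomial_tail_normal(100, 0.5, 60)

print(p_550, p_60)  # about 0.999 and 0.029
```

For Q1 the probability is very close to 1 (about 0.999). For Q2, P(Y ≥ 60) ≈ 0.029 under fairness; a one-sided result this small casts some doubt on the coin being fair, though the two-sided probability (about 0.057) is borderline at the conventional 5% level.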