Parameter, Statistic and Random Samples

A parameter is a number that describes the population. It is a fixed number, but in practice we do not know its value. A statistic is a function of the sample data, i.e., it is a quantity whose value can be calculated from the sample data. It is a random variable with a distribution function. The random variables X_1, X_2, …, X_n are said to form a (simple) random sample of size n if the X_i's are independent random variables and each X_i has the same probability distribution. We say that the X_i's are iid.
Example

Toss a coin n times. Suppose the X_i's are Bernoulli random variables with p = ½, so E(X_i) = ½. The proportion of heads is X̄_n = (X_1 + X_2 + … + X_n)/n. It is a statistic.
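As a quick illustration, here is a minimal sketch (not part of the original slides; the sample size is an arbitrary choice) that computes this statistic from a simulated sample:

```python
import random

n = 100
# Simulate n Bernoulli(1/2) coin tosses: 1 = heads, 0 = tails.
tosses = [random.randint(0, 1) for _ in range(n)]

# The proportion of heads is a statistic: a function of the sample data.
prop_heads = sum(tosses) / n
print(prop_heads)  # varies from sample to sample, e.g. 0.47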
Sampling Distribution of a Statistic

The sampling distribution of a statistic is the distribution of values taken by the statistic in all possible samples of the same size from the same population. The distribution of a statistic is NOT the same as the distribution of the original population that generated the sample. Probability rules can be used to obtain the distribution of a statistic provided that it is a "simple" function of the X_i's and either there are relatively few different values in the population or else the population distribution has a "nice" form. Alternatively, we can perform a simulation experiment, as sketched below, to obtain information about the sampling distribution of a statistic.
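A hedged sketch of such a simulation experiment, assuming an Exponential(1) population and a sample size of 30 (both illustrative choices, not from the slides): draw many samples of the same size and summarize the values the statistic takes.

```python
import random
import statistics

random.seed(1)
n, reps = 30, 10_000  # sample size and number of replicated samples

# Collect the sample mean from many samples of size n drawn from an
# Exponential(1) population (mean 1, variance 1).
means = [statistics.fmean(random.expovariate(1.0) for _ in range(n))
         for _ in range(reps)]

# Summaries of the sampling distribution of the sample mean.
print(statistics.fmean(means))  # close to the population mean 1
print(statistics.stdev(means))  # close to 1/sqrt(30), about 0.183
```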
Markov's Inequality

If X is a non-negative random variable with E(X) < ∞, then for any a > 0,

P(X ≥ a) ≤ E(X)/a.

Proof: Since X ≥ 0, we have X ≥ a·1{X ≥ a}, where 1{X ≥ a} is the indicator of the event {X ≥ a}. Taking expectations gives E(X) ≥ a·P(X ≥ a), and dividing by a gives the result.
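A small numerical sanity check of Markov's bound, using an Exponential(1) variable (an illustrative choice; it is non-negative with E(X) = 1):

```python
import random

random.seed(2)
a = 3.0
N = 100_000
draws = [random.expovariate(1.0) for _ in range(N)]  # X >= 0, E(X) = 1

p_hat = sum(x >= a for x in draws) / N  # estimate of P(X >= a)
print(p_hat)    # exact value is exp(-3), about 0.0498
print(1.0 / a)  # Markov bound E(X)/a, about 0.333; valid but loose
```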
Chebyshev's Inequality

For a random variable X with E(X) = μ and V(X) = σ² < ∞, and any a > 0,

P(|X − μ| ≥ a) ≤ σ²/a².

Proof: Apply Markov's Inequality to the non-negative random variable (X − μ)²:

P(|X − μ| ≥ a) = P((X − μ)² ≥ a²) ≤ E((X − μ)²)/a² = σ²/a².
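The same kind of numerical check for Chebyshev's bound, assuming a standard normal X (illustrative; here μ = 0 and σ² = 1):

```python
import random

random.seed(3)
a = 2.0
N = 100_000
draws = [random.gauss(0.0, 1.0) for _ in range(N)]  # mu = 0, sigma^2 = 1

p_hat = sum(abs(x) >= a for x in draws) / N  # estimate of P(|X - mu| >= a)
print(p_hat)       # exact value is about 0.0455
print(1.0 / a**2)  # Chebyshev bound sigma^2/a^2 = 0.25; valid but loose
```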
Law of Large Numbers

We are interested in a sequence of random variables X_1, X_2, X_3, … that are independent and identically distributed (i.i.d). Let

X̄_n = (X_1 + X_2 + … + X_n)/n.

Suppose E(X_i) = μ and V(X_i) = σ². Then E(X̄_n) = μ and V(X̄_n) = σ²/n. Intuitively, as n → ∞, V(X̄_n) → 0, so X̄_n should concentrate around μ.
Formally, the Weak Law of Large Numbers (WLLN) states the following: suppose X_1, X_2, X_3, … are i.i.d with E(X_i) = μ < ∞ and V(X_i) = σ² < ∞. Then for any positive number a,

P(|X̄_n − μ| ≥ a) → 0 as n → ∞.

This is called convergence in probability.

Proof: By Chebyshev's Inequality applied to X̄_n,

P(|X̄_n − μ| ≥ a) ≤ V(X̄_n)/a² = σ²/(na²) → 0 as n → ∞.
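A sketch of the WLLN in action, assuming a Uniform(0,1) population (an illustrative choice): the estimated probability that X̄_n misses μ by at least a shrinks as n grows.

```python
import random
import statistics

random.seed(4)
mu, a, reps = 0.5, 0.05, 2_000  # Uniform(0,1) has mean 0.5

for n in (10, 100, 1000):
    misses = sum(
        abs(statistics.fmean(random.random() for _ in range(n)) - mu) >= a
        for _ in range(reps)
    )
    # The estimate of P(|mean - mu| >= a) should tend to 0 as n grows.
    print(n, misses / reps)
```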
Example

Flip a fair coin 10,000 times and let X_i indicate heads on toss i, so E(X_i) = ½ and V(X_i) = ¼. Take a = 0.01. Then by Chebyshev's Inequality,

P(|X̄_n − ½| ≥ 0.01) ≤ σ²/(na²) = (1/4)/(10,000 × 0.01²) = 0.25.

Chebyshev's Inequality gives a very weak upper bound, but it works regardless of the distribution of the X_i's. The WLLN states that the proportion of heads converges in probability to 0.5 as the number of tosses grows.
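A sketch comparing the 0.25 Chebyshev bound with a simulated estimate of the actual probability (the simulation itself is illustrative, not from the slides):

```python
import random

random.seed(5)
n, a, reps = 10_000, 0.01, 1_000

misses = 0
for _ in range(reps):
    heads = sum(random.randint(0, 1) for _ in range(n))
    if abs(heads / n - 0.5) >= a:
        misses += 1

# Roughly 0.046 (the CLT value is about 0.0455), far below the 0.25 bound.
print(misses / reps)
```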
Strong Law of Large Numbers

Suppose X_1, X_2, X_3, … are i.i.d with E(X_i) = μ < ∞. Then X̄_n converges to μ as n → ∞ with probability 1, that is,

P(lim_{n→∞} X̄_n = μ) = 1.

This is called convergence almost surely.
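A single realization of the running average illustrates the almost-sure statement (an illustrative sketch; the slides do not include this simulation):

```python
import random

random.seed(6)
total = 0.0
# Track the running mean of one long sequence of fair coin tosses.
for i in range(1, 100_001):
    total += random.randint(0, 1)
    if i in (10, 100, 1_000, 10_000, 100_000):
        print(i, total / i)  # the path of the running mean settles near 0.5
```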
Central Limit Theorem

The central limit theorem is concerned with the limiting behaviour of sums of random variables. If X_1, X_2, … is a sequence of i.i.d random variables with mean μ and variance σ², and S_n = X_1 + … + X_n, then by the WLLN we have that S_n/n → μ in probability. The CLT is concerned not just with the fact of convergence but with how S_n/n fluctuates around μ. Note that E(S_n) = nμ and V(S_n) = nσ². The standardized version of S_n is

Z_n = (S_n − nμ)/(σ√n),

and we have E(Z_n) = 0 and V(Z_n) = 1.
The Central Limit Theorem

Let X_1, X_2, … be a sequence of i.i.d random variables with E(X_i) = μ < ∞ and Var(X_i) = σ² < ∞, and let Z_n = (S_n − nμ)/(σ√n). Then, for −∞ < x < ∞,

lim_{n→∞} P(Z_n ≤ x) = Φ(x),

where Z is a standard normal random variable and Φ(x) is the cdf of the standard normal distribution. This is equivalent to saying that Z_n converges in distribution to Z ~ N(0,1). Also,

√n(X̄_n − μ)/σ = Z_n,

i.e. √n(X̄_n − μ)/σ converges in distribution to Z ~ N(0,1).
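A minimal sketch of the CLT, assuming Uniform(0,1) summands (an illustrative choice): the standardized sums should be approximately standard normal.

```python
import math
import random
import statistics

random.seed(7)
n, reps = 50, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)  # mean and sd of Uniform(0,1)

# Standardized sums Z_n = (S_n - n*mu) / (sigma * sqrt(n)).
zs = [(sum(random.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
      for _ in range(reps)]

# Compare the empirical P(Z_n <= 1) with Phi(1), about 0.8413.
print(sum(z <= 1.0 for z in zs) / reps)
print(statistics.NormalDist().cdf(1.0))
```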
Example

Suppose X_1, X_2, … are i.i.d random variables, each with the Poisson(3) distribution, so E(X_i) = V(X_i) = 3. The CLT says that

(S_n − 3n)/√(3n) → Z ~ N(0,1) in distribution as n → ∞.
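A hedged check of this statement; the Poisson sampler below uses Knuth's classical method, since Python's standard library has no Poisson generator (all parameters are illustrative):

```python
import math
import random
import statistics

random.seed(8)

def poisson(lam: float) -> int:
    """Knuth's method: count uniform multiplications until the product drops below exp(-lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n, reps = 30, 20_000
# Standardize S_n = X_1 + ... + X_n with E(S_n) = 3n and V(S_n) = 3n.
zs = [(sum(poisson(3.0) for _ in range(n)) - 3 * n) / math.sqrt(3 * n)
      for _ in range(reps)]

print(sum(z <= 0.0 for z in zs) / reps)  # close to Phi(0) = 0.5
print(statistics.NormalDist().cdf(0.0))
```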
Examples

A very common application of the CLT is the normal approximation to the binomial distribution. Suppose X_1, X_2, … are i.i.d random variables, each with the Bernoulli(p) distribution, so E(X_i) = p and V(X_i) = p(1 − p). The CLT says that

(S_n − np)/√(np(1 − p)) → Z ~ N(0,1) in distribution as n → ∞.

Let Y_n = X_1 + … + X_n; then Y_n has a Binomial(n, p) distribution. So for large n, Y_n is approximately N(np, np(1 − p)).

Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads.

Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair? (Both questions are worked in the sketch below.)
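A hedged worked sketch of both questions using the normal approximation (the continuity correction is omitted for simplicity):

```python
import math
import statistics

Z = statistics.NormalDist()

# Q1: n = 1000, p = 0.6, so Y is approximately N(600, 240). Find P(Y >= 550).
z1 = (550 - 1000 * 0.6) / math.sqrt(1000 * 0.6 * 0.4)
print(1 - Z.cdf(z1))  # z1 is about -3.23, so the probability is about 0.9994

# Q2: under fairness, n = 100, p = 0.5, so Y is approximately N(50, 25).
z2 = (60 - 100 * 0.5) / math.sqrt(100 * 0.5 * 0.5)
print(2 * (1 - Z.cdf(z2)))  # z2 = 2; two-sided p-value about 0.0455
```

By this approximation, getting at least 550 heads is nearly certain, and observing 60 heads in 100 tosses would happen with probability only about 0.05 under fairness, which is modest evidence against a fair coin.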
Sampling from a Normal Population

If the original population has a normal distribution, the sample mean is also normally distributed; we don't need the CLT in this case. In general, if X_1, X_2, …, X_n are i.i.d N(μ, σ²), then

S_n = X_1 + X_2 + … + X_n ~ N(nμ, nσ²) and X̄_n ~ N(μ, σ²/n).
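A quick sketch confirming the exact result, with illustrative parameters μ = 10, σ = 2, n = 16:

```python
import random
import statistics

random.seed(9)
mu, sigma, n, reps = 10.0, 2.0, 16, 20_000

means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

# The sample mean is exactly N(mu, sigma^2/n); no CLT needed here.
print(statistics.fmean(means))  # close to 10
print(statistics.stdev(means))  # close to sigma/sqrt(n) = 0.5
```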
Example