Section 5.6 Important Theorem in the Text: The Central Limit Theorem (Theorem 5.6-1)

Presentation transcript:

1. (a) Let $X_1, X_2, \ldots, X_n$ be a random sample from a $U(-2, 3)$ distribution. Define the random variable

$$Y = X_1 + X_2 + \cdots + X_n = \sum_{i=1}^{n} X_i .$$

Use the Central Limit Theorem to find $a$ and $b$, both depending on $n$, so that the limiting distribution of $(Y - a)/b$ is $N(0, 1)$.

$Y$ is the sum of $n$ independent and identically distributed random variables, each of which has mean $\mu = 1/2$ and variance $\sigma^2 = 25/12$. Therefore, the Central Limit Theorem tells us that the limiting distribution of

$$\frac{Y - n/2}{5\sqrt{n}/(2\sqrt{3})}$$

is $N(0, 1)$; that is, $a = n/2$ and $b = 5\sqrt{n}/(2\sqrt{3})$.
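A minimal simulation sketch (not from the original slides; it assumes NumPy is available, and the sample size and replication count are illustrative choices) that checks this standardization empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 100_000                    # illustrative choices

# Y = X_1 + ... + X_n with each X_i ~ U(-2, 3)
Y = rng.uniform(-2, 3, size=(reps, n)).sum(axis=1)

a = n / 2                                # E(Y) = n * mu = n/2
b = 5 * np.sqrt(n) / (2 * np.sqrt(3))    # sd(Y) = sqrt(n * 25/12)
Z = (Y - a) / b

# If the normalization is right, Z should be approximately N(0, 1).
print(round(Z.mean(), 3), round(Z.std(), 3))   # roughly 0 and 1
```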

(b) Use the Central Limit Theorem to find $a$ and $b$, with only $b$ depending on $n$, so that the limiting distribution of $(\overline{X} - a)/b$ is $N(0, 1)$.

$\overline{X}$ is the mean of $n$ independent and identically distributed random variables, each of which has mean $\mu = 1/2$ and variance $\sigma^2 = 25/12$. Therefore, the Central Limit Theorem tells us that the limiting distribution of

$$\frac{\overline{X} - 1/2}{5/(2\sqrt{3n})}$$

is $N(0, 1)$; that is, $a = 1/2$ and $b = 5/(2\sqrt{3n})$.
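The same empirical check works for the sample mean; this sketch (again with illustrative sample sizes and replication count) shows that only $b$ changes with $n$:

```python
import numpy as np

rng = np.random.default_rng(1)
reps = 100_000                           # illustrative

for n in (10, 100, 1000):                # illustrative sample sizes
    Xbar = rng.uniform(-2, 3, size=(reps, n)).mean(axis=1)
    a = 0.5                              # E(X-bar) = mu = 1/2, free of n
    b = 5 / (2 * np.sqrt(3 * n))         # sd(X-bar) = sigma / sqrt(n)
    Z = (Xbar - a) / b
    print(n, round(Z.mean(), 3), round(Z.std(), 3))   # roughly 0 and 1 for each n
```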

1. - continued
(c) Suppose $n = 25$. Use the Central Limit Theorem to approximate $P(Y \le 12)$.

$$P(Y \le 12) = P\!\left(\frac{Y - 25/2}{25/(2\sqrt{3})} \le \frac{12 - 25/2}{25/(2\sqrt{3})}\right) = P(Z \le -0.07) = \Phi(-0.07) = 1 - \Phi(0.07) = 1 - 0.5279 = 0.4721 .$$
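A quick numerical check of part (c), assuming SciPy is available (the slides use the printed normal table, so the last decimal place differs slightly):

```python
import numpy as np
from scipy.stats import norm

n = 25
a = n / 2                                # 12.5
b = 5 * np.sqrt(n) / (2 * np.sqrt(3))    # 25 / (2 * sqrt(3)) ≈ 7.217

z = (12 - a) / b                         # ≈ -0.069
print(round(z, 3), round(norm.cdf(z), 4))   # ≈ -0.069, ≈ 0.4724 (table: z = -0.07 gives 0.4721)
```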

2. (a) A random sample $X_1, X_2, \ldots, X_n$ is taken from a $N(100, 64)$ distribution. Find each of the following:

$P(96 < X_i < 104)$ for each $i = 1, 2, \ldots, n$.

$$P(96 < X_i < 104) = P\!\left(\frac{96 - 100}{8} < \frac{X_i - 100}{8} < \frac{104 - 100}{8}\right) = P(-0.50 < Z < 0.50)$$
$$= \Phi(0.50) - \Phi(-0.50) = \Phi(0.50) - (1 - \Phi(0.50)) = 0.6915 - (1 - 0.6915) = 0.3830 .$$

2. - continued
(b) $P(96 < \overline{X} < 104)$ when $n = 4$.

$$P(96 < \overline{X} < 104) = P\!\left(\frac{96 - 100}{8/\sqrt{4}} < \frac{\overline{X} - 100}{8/\sqrt{4}} < \frac{104 - 100}{8/\sqrt{4}}\right) = P(-1.00 < Z < 1.00)$$
$$= \Phi(1.00) - \Phi(-1.00) = \Phi(1.00) - (1 - \Phi(1.00)) = 0.8413 - (1 - 0.8413) = 0.6826 .$$

(c) $P(96 < \overline{X} < 104)$ when $n = 16$.

$$P(96 < \overline{X} < 104) = P\!\left(\frac{96 - 100}{8/\sqrt{16}} < \frac{\overline{X} - 100}{8/\sqrt{16}} < \frac{104 - 100}{8/\sqrt{16}}\right) = P(-2.00 < Z < 2.00)$$
$$= \Phi(2.00) - \Phi(-2.00) = \Phi(2.00) - (1 - \Phi(2.00)) = 0.9772 - (1 - 0.9772) = 0.9544 .$$
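One short sketch can reproduce all three parts of problem 2 (it assumes SciPy's norm; the loop over $n$ is just a compact way to cover parts (a)–(c), with $n = 1$ playing the role of a single $X_i$):

```python
from math import sqrt
from scipy.stats import norm

mu, sigma = 100, 8                       # X_i ~ N(100, 64), so sd = 8

for n in (1, 4, 16):                     # n = 1 corresponds to part (a)
    sd = sigma / sqrt(n)                 # standard deviation of the sample mean
    p = norm.cdf(104, mu, sd) - norm.cdf(96, mu, sd)
    print(n, round(p, 4))                # 0.3829, 0.6827, 0.9545 (table: 0.3830, 0.6826, 0.9544)
```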

3. (a) A random sample $X_1, X_2, \ldots, X_{25}$ is taken from the distribution defined by the p.d.f. $f(x) = x/50$ if $0 < x < 10$. Find $P(X_i < 6)$ for each $i = 1, 2, \ldots, 25$.

$$P(X_i < 6) = \int_0^6 \frac{x}{50}\,dx = \left.\frac{x^2}{100}\right|_0^6 = \frac{9}{25} = 0.36 .$$
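The exact probability in part (a) can be confirmed by numerical integration; a sketch assuming scipy.integrate.quad:

```python
from scipy.integrate import quad

f = lambda x: x / 50        # p.d.f. on 0 < x < 10
p, _ = quad(f, 0, 6)        # integral of x/50 from 0 to 6
print(p)                    # 0.36 = 9/25
```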

3. - continued
(b) Use the Central Limit Theorem to approximate $P(\overline{X} < 6)$.

$$\mu = \int_0^{10} x \cdot \frac{x}{50}\,dx = \frac{20}{3}, \qquad E(X^2) = \int_0^{10} x^2 \cdot \frac{x}{50}\,dx = 50, \qquad \sigma^2 = 50 - \frac{400}{9} = \frac{50}{9} .$$

Since $n = 25$, the standard deviation of $\overline{X}$ is $\sigma/\sqrt{n} = \sqrt{50/9}\,/\,5 = \sqrt{2}/3$, and

$$P(\overline{X} < 6) = P\!\left(\frac{\overline{X} - 20/3}{\sqrt{2}/3} < \frac{6 - 20/3}{\sqrt{2}/3}\right) = P(Z < -1.41) = \Phi(-1.41) = 1 - \Phi(1.41) = 1 - 0.9207 = 0.0793 .$$
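A sketch comparing this CLT approximation with a direct simulation of $\overline{X}$ (the simulation details are illustrative; sampling uses the inverse c.d.f. $F(x) = x^2/100$, so $F^{-1}(u) = 10\sqrt{u}$):

```python
import numpy as np
from scipy.stats import norm

n = 25
mu, var = 20 / 3, 50 / 9                 # mean and variance of f(x) = x/50 on (0, 10)

# CLT approximation: X-bar is approximately N(mu, var/n)
z = (6 - mu) / np.sqrt(var / n)          # = -sqrt(2) ≈ -1.41
print(round(norm.cdf(z), 4))             # ≈ 0.0786 (table answer with z = -1.41: 0.0793)

# Simulation check via inverse-c.d.f. sampling: F(x) = x^2/100, F^{-1}(u) = 10*sqrt(u)
rng = np.random.default_rng(2)
Xbar = (10 * np.sqrt(rng.random((100_000, n)))).mean(axis=1)
print(round((Xbar < 6).mean(), 4))       # close to the CLT value
```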