

Section 5.7

Suppose X₁, X₂, …, Xₙ is a random sample from a Bernoulli distribution with success probability p. Then Y = X₁ + X₂ + … + Xₙ has a b(n, p) distribution. According to the Central Limit Theorem,

(Y − np) / √(np(1 − p)) = (X̄ − p) / √(p(1 − p)/n)

can be treated as having an approximate N(0, 1) distribution for sufficiently large n. This approximation is generally considered to be good if both np ≥ 5 and n(1 − p) ≥ 5. When a normal distribution is used to approximate a distribution of the discrete type, a continuity correction can be employed for increased accuracy.
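The approximation above is easy to check numerically. The following Python sketch (an illustration, not part of the text; standard library only) compares the exact binomial CDF with its continuity-corrected normal approximation, computing Φ from the error function:

```python
# Illustration (not from the text): exact binomial CDF vs. the
# continuity-corrected normal approximation, standard library only.
from math import comb, erf, sqrt

def phi(z):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def binom_cdf(k, n, p):
    """Exact P(Y <= k) for Y ~ b(n, p)."""
    return sum(comb(n, y) * p**y * (1 - p)**(n - y) for y in range(k + 1))

def normal_approx_cdf(k, n, p):
    """Approximate P(Y <= k) by P(Z <= (k + 0.5 - np)/sqrt(np(1-p)))."""
    return phi((k + 0.5 - n * p) / sqrt(n * p * (1 - p)))

# Example values taken from Problem 2 below: n = 200, p = 0.4, k = 65.
print(binom_cdf(65, 200, 0.4), normal_approx_cdf(65, 200, 0.4))
```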

1. Suppose Y has a b(400, 0.001) distribution. Explain why the normal distribution should not be used to approximate P(Y > 3).

Since np = 400(0.001) = 0.4 < 5, the normal approximation should not be used. The Poisson approximation would be good, since p = 0.001 is so small.

2. (a) The random variable X has a b(200, 0.4) distribution. Explain why the normal distribution could be used to approximate P(X ≤ 65).

Since np = 200(0.4) = 80 > 5 and n(1 − p) = 200(0.6) = 120 > 5, the normal approximation would be good.
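The claim in Problem 1 can be illustrated numerically. This Python sketch (an illustration, not from the text) compares the exact value of P(Y > 3) for Y ~ b(400, 0.001) with the Poisson approximation of mean np = 0.4:

```python
# Illustration (not from the text): with n = 400 and p = 0.001, the
# Poisson(np) approximation to P(Y > 3) is excellent, even though the
# normal approximation is ruled out by np = 0.4 < 5.
from math import comb, exp, factorial

n, p = 400, 0.001
lam = n * p  # 0.4

exact = 1 - sum(comb(n, y) * p**y * (1 - p)**(n - y) for y in range(4))
poisson = 1 - sum(exp(-lam) * lam**y / factorial(y) for y in range(4))

print(exact, poisson)  # both are roughly 0.0008 and agree closely
```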

(b) Use the Central Limit Theorem to approximate P(X ≤ 65).

P(X ≤ 65) = P(X ≤ 65.5) =

P[(X − np)/√(np(1 − p)) ≤ (65.5 − 80)/√48] =

P(Z ≤ −2.09) = Φ(−2.09) = 1 − Φ(2.09) = 1 − 0.9817 = 0.0183

3. The random sample X₁, X₂, …, X₆₀₀ is taken from a U(−2, 3) distribution, and the following random variable is defined: Y = number of positive Xᵢ's in the random sample. Use the Central Limit Theorem to approximate P(Y > 370).

For each i, P(Xᵢ > 0) = (3 − 0)/(3 − (−2)) = 3/5 = 0.6, so Y has a b(600, 0.6) distribution. Since np = 600(0.6) = 360 > 5 and n(1 − p) = 600(0.4) = 240 > 5, the normal approximation would be good.

P(Y > 370) = P(Y > 370.5) =

P[(Y − np)/√(np(1 − p)) > (370.5 − 360)/12] =

P(Z > 0.875) = 1 − Φ(0.875) = 1 − 0.8092 = 0.1908
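As a sanity check (not part of the text), this Python sketch computes the exact binomial tail probability alongside the continuity-corrected normal approximation:

```python
# Illustration (not from the text): exact tail of b(600, 0.6) vs. the
# continuity-corrected normal approximation for P(Y > 370).
from math import comb, erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 600, 0.6
exact = sum(comb(n, y) * p**y * (1 - p)**(n - y) for y in range(371, n + 1))
# z = (370.5 - 360)/12 = 0.875, matching the worked computation
approx = 1 - phi((370.5 - n * p) / sqrt(n * p * (1 - p)))

print(exact, approx)
```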

4. (a) The random variable Y has a Poisson distribution with mean 85. Explain why we can think of the random variable Y as Y = X₁ + X₂ + … + X₈₅, where X₁, X₂, …, X₈₅ is a random sample from a Poisson distribution with mean 1 (one).

We know from the Text Exercise that the sum of n independent Poisson random variables with respective means λ₁, λ₂, …, λₙ is a Poisson random variable with mean λ₁ + λ₂ + … + λₙ.
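That additivity property can be verified numerically for a small case. The Python sketch below (an illustration with arbitrarily chosen means 2 and 3, not values from the text) convolves two Poisson pmfs and compares the result with the Poisson(5) pmf:

```python
# Illustration (not from the text): convolving the pmfs of independent
# Poisson(2) and Poisson(3) variables (means chosen arbitrarily here)
# reproduces the Poisson(5) pmf, term by term.
from math import exp, factorial

def pois_pmf(y, lam):
    """Poisson pmf P(Y = y) for mean lam."""
    return exp(-lam) * lam**y / factorial(y)

N = 60  # truncation point; the omitted tail mass is negligible here
conv = [sum(pois_pmf(x, 2.0) * pois_pmf(y - x, 3.0) for x in range(y + 1))
        for y in range(N)]
direct = [pois_pmf(y, 5.0) for y in range(N)]

print(max(abs(a - b) for a, b in zip(conv, direct)))  # essentially zero
```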

(b) Use the Central Limit Theorem to approximate P(86 < Y < 92).

P(86 < Y < 92) = P(86.5 < Y < 91.5) =

P[(86.5 − 85)/√85 < (Y − 85)/√85 < (91.5 − 85)/√85] =

P(0.16 < Z < 0.71) = Φ(0.71) − Φ(0.16) = 0.7611 − 0.5636 = 0.1975
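The exact Poisson probability is easy to compute directly, so the CLT value can be checked. In this Python sketch (an illustration, not from the text), the small difference from the worked answer 0.1975 comes from rounding the z-values to two decimals in the table lookup:

```python
# Illustration (not from the text): exact Poisson(85) probability vs. the
# CLT approximation for P(86 < Y < 92); unrounded z-values are used, so
# the result differs slightly from the worked answer 0.1975.
from math import erf, exp, factorial, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu = 85.0
exact = sum(exp(-mu) * mu**y / factorial(y) for y in range(87, 92))  # Y = 87..91
approx = phi((91.5 - mu) / sqrt(mu)) - phi((86.5 - mu) / sqrt(mu))

print(exact, approx)
```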

5. A child's game in an arcade allows the child to roll six balls (one at a time) down a slide into one of six slots labeled 1, 4, 2, 5, 3, and 6 in order from left to right. The game adds the numbers on the slots that the child rolls the six balls into. If this sum is under 19 or over 23, the child wins the game. Use the Central Limit Theorem to approximate the probability that a child who randomly rolls the six balls into the slots wins the game.

Let the random variable Xᵢ be the number of the slot that the child rolls the ith ball into, for i = 1, 2, 3, 4, 5, 6. Let the random variable Y be the sum of the numbers of the slots that the child rolls the six balls into, that is, Y = X₁ + X₂ + X₃ + X₄ + X₅ + X₆. For each i, the distribution of Xᵢ is the uniform distribution on the first six positive integers.

E(Xᵢ) = 3.5 and Var(Xᵢ) = 35/12, so E(Y) = 6(3.5) = 21 and Var(Y) = 6(35/12) = 35/2.

P(losing with random rolls) = P(19 ≤ Y ≤ 23) =

P(18.5 ≤ Y ≤ 23.5) =

P[(18.5 − 21)/√(35/2) ≤ (Y − 21)/√(35/2) ≤ (23.5 − 21)/√(35/2)] =

P(−0.60 < Z < 0.60) = Φ(0.60) − Φ(−0.60) = 0.7257 − (1 − 0.7257) = 0.4514

P(winning with random rolls) = 1 − 0.4514 = 0.5486
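Because there are only 6⁶ = 46,656 equally likely outcomes, the exact winning probability can also be found by brute force. This Python sketch (an illustration, not from the text) enumerates every outcome and compares the exact answer with the CLT value:

```python
# Illustration (not from the text): brute-force enumeration of all 6**6
# equally likely outcomes, compared with the CLT approximation. The CLT
# value here uses the unrounded z = 2.5/sqrt(35/2), so it differs slightly
# from the worked answer 0.5486.
from itertools import product
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

outcomes = list(product(range(1, 7), repeat=6))  # all 46656 ways to fill slots
losses = sum(1 for rolls in outcomes if 19 <= sum(rolls) <= 23)
exact_win = 1 - losses / len(outcomes)

z = 2.5 / sqrt(35 / 2)  # (23.5 - 21)/sqrt(Var(Y))
approx_win = 1 - (phi(z) - phi(-z))

print(exact_win, approx_win)
```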