Normal Distribution ch5.



Normal Distribution
The random variable X has a normal distribution, N(μ, σ²), if its p.d.f. is
  f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)),  −∞ < x < ∞,
where the mean μ ∈ (−∞, ∞) and the variance σ² ∈ (0, ∞). If Z is N(0, 1), Z is said to have the standard normal distribution. The moment-generating function is
  M(t) = exp(μt + σ²t²/2),  −∞ < t < ∞,
so M′(0) = μ and M″(0) − [M′(0)]² = σ², i.e. E(X) = μ and Var(X) = σ².

Examples
Ex.5.2-3: If Z is N(0,1), then from Table Va on p.686,
  Φ(1.24) = P(Z ≤ 1.24) = 0.8925,
  P(1.24 ≤ Z ≤ 2.37) = Φ(2.37) − Φ(1.24) = 0.9911 − 0.8925 = 0.0986,
  P(−2.37 ≤ Z ≤ −1.24) = P(1.24 ≤ Z ≤ 2.37) = 0.0986 (by symmetry).
From Table Vb (right tail) on p.687 (negative z values follow from −Z):
  P(Z > 1.24) = 0.1075,
  P(Z ≤ −2.14) = P(Z ≥ 2.14) = 0.0162.
Ex.5.2-4: If Z is N(0,1), find a and b from Tables Va and Vb:
  P(Z ≤ a) = 0.9147 ⟹ a = 1.37;  P(Z ≥ b) = 0.0526 ⟹ b = 1.62.
Percentiles: the 100(1−α)th percentile is z_α, where P(Z ≥ z_α) = α = P(Z ≤ −z_α); thus z_{1−α} = −z_α (z_α is the upper 100α percent point). The 100pth percentile is π_p, where P(X ≤ π_p) = p; with p = 1 − α, π_p = z_{1−p} = −z_p.
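The table lookups above can be reproduced with Python's standard library instead of the printed tables; this is a quick sketch, and the variable names are mine rather than the textbook's.

```python
# Ex.5.2-3 / Ex.5.2-4 via statistics.NormalDist (Python 3.8+)
from statistics import NormalDist

Z = NormalDist(0, 1)  # standard normal N(0, 1)

phi_124 = Z.cdf(1.24)                    # Φ(1.24) = P(Z ≤ 1.24)
p_between = Z.cdf(2.37) - Z.cdf(1.24)    # P(1.24 ≤ Z ≤ 2.37)
p_right_tail = 1 - Z.cdf(1.24)           # P(Z > 1.24)

# Inverse lookups: a with P(Z ≤ a) = 0.9147, b with P(Z ≥ b) = 0.0526
a = Z.inv_cdf(0.9147)
b = Z.inv_cdf(1 - 0.0526)
```

Rounded to table precision, these agree with the values quoted on the slide (0.8925, 0.0986, 0.1075, a ≈ 1.37, b ≈ 1.62).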

Conversion: N(μ,σ²) ⇒ N(0,1)
Thm5.2-1: If X is N(μ,σ²), then Z = (X − μ)/σ is N(0,1).
Usage:
Ex.5.2-6: X is N(3,16). P(4 ≤ X ≤ 8) = Φ((8−3)/4) − Φ((4−3)/4) = Φ(1.25) − Φ(0.25) = 0.8944 − 0.5987 = 0.2957.
Ex.5.2-7: X is N(25,36). Find c such that P(|X−25| ≤ c) = 0.9544. Since P(|Z| ≤ 2) = 0.9544, c/6 = 2 and c = 12: within 2σ about the mean, X and Z share the same probability.
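The standardization in Thm 5.2-1 can be checked numerically; a small sketch using the example numbers above (the names are mine):

```python
# Standardizing X ~ N(μ, σ²) to Z = (X − μ)/σ, per Thm 5.2-1
from statistics import NormalDist

Z = NormalDist(0, 1)

# X ~ N(3, 16), so σ = 4: P(4 ≤ X ≤ 8) = Φ((8−3)/4) − Φ((4−3)/4)
p = Z.cdf((8 - 3) / 4) - Z.cdf((4 - 3) / 4)

# X ~ N(25, 36), so σ = 6: P(|X − 25| ≤ c) = 0.9544 forces c/6 = 2,
# i.e. c = 12, because P(|Z| ≤ 2) = 0.9544 (two σ about the mean)
c = 2 * 6
p_check = Z.cdf(2) - Z.cdf(-2)
```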

Conversion: N(μ,σ²) ⇒ χ²(1)
Thm5.2-2: If X is N(μ,σ²), then V = [(X − μ)/σ]² = Z² is χ²(1).
Pf: Let V = Z², where Z is N(0,1). The distribution function of V for v ≥ 0 is
  G(v) = P(Z² ≤ v) = P(−√v ≤ Z ≤ √v) = 2Φ(√v) − 1;
differentiating gives g(v) = v^(−1/2) e^(−v/2) / √(2π), the χ²(1) p.d.f.
Ex.5.2-8: If Z is N(0,1), find k such that P(|Z| ≤ k) = 0.95, i.e. P(Z² ≤ k²) = 0.95. From the chi-square table with r = 1, k² = 3.841, so k = 1.96.

Q-Q Plot
Given a set of observations, how do we determine their distribution? If the sample is large, a relative-frequency histogram can be used; for small samples, a q-q plot is a useful check.
Q-Q plot, say against a "theoretical" normal distribution N(0,1):
  N(0,1) supplies the quantiles z_{1−p}; the sample, suspected to be N(μ,σ²), supplies the quantiles q_p.
  The ideal curve in the plot would be q_p = μ + σ z_{1−p}.
If the plotted points fall close to a straight line, the data are consistent with a normal distribution. With the theoretical quantiles on the vertical axis, the reciprocal of the slope is σ and the x-intercept is μ. (Ref. Ex.4.4-9 & Fig.4.4-3 on p.201)

Statistics on Normal Distributions
Thm5.3-1: X1,…,Xn are the outcomes of a random sample of size n from the normal distribution N(μ,σ²). Then the sample mean X̄ = (X1+…+Xn)/n is N(μ, σ²/n).
Pf: M_X̄(t) = Π M_{Xi}(t/n) = [exp(μt/n + σ²t²/(2n²))]^n = exp(μt + (σ²/n)t²/2), the m.g.f. of N(μ, σ²/n).
Thm5.3-2: Z1,…,Zn are independent and all have N(0,1); then W = Z1² + … + Zn² is χ²(n).
  By Thm5.2-2, each Zi² is χ²(1); by Thm4.6-3, a sum of independent chi-squares is χ²(r1+…+rn) = χ²(n) in this case.
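Thm 5.3-1 can be checked by simulation; a sketch with arbitrary replication counts and seed (my choices), using N(50, 16) as in Fig. 5.3-1:

```python
# Means of samples of size n from N(μ, σ²) should behave like N(μ, σ²/n)
import random
from statistics import fmean, pvariance

random.seed(1)
mu, sigma, n, reps = 50.0, 4.0, 16, 20000   # N(50, 16), n = 16

means = [fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

est_mean = fmean(means)      # should be near μ = 50
est_var = pvariance(means)   # should be near σ²/n = 16/16 = 1
```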

Example
Fig.5.3-1: p.d.f.s of means of samples from N(50,16): X̄ is N(50, 16/n), so the p.d.f. becomes narrower as n grows.

Theoretical Mean and Sample Mean
Cor.5.3-1: Z1,…,Zn are independent with Zi distributed N(μi,σi²), i=1..n; then W = [(Z1−μ1)/σ1]² + … + [(Zn−μn)/σn]² is χ²(n).
Thm5.3-3: X1,…,Xn are the outcomes of a random sample of size n from the normal distribution N(μ,σ²). Then (a) the sample mean X̄ and sample variance S² are independent, and (b) V = (n−1)S²/σ² is χ²(n−1), with m.g.f. M_V(t) = (1−2t)^(−(n−1)/2).
Pf: (a) omitted; (b): as the theoretical mean is replaced by the sample mean, one degree of freedom is lost!

Linear Combinations of N(μ,σ²)
Ex5.3-2: X1,X2,X3,X4 are a random sample of size 4 from the normal distribution N(76.4,383). With W = Σ(Xi−76.4)²/383 ~ χ²(4) and V = 3S²/383 ~ χ²(3),
  P(0.711 ≤ W ≤ 7.779) = 0.90 − 0.05 = 0.85, P(0.352 ≤ V ≤ 6.251) = 0.90 − 0.05 = 0.85.
Thm5.3-4: If X1,…,Xn are n mutually independent normal variables with means μ1,…,μn and variances σ1²,…,σn², then the linear function Y = Σ ciXi has the normal distribution N(Σ ciμi, Σ ci²σi²).
Pf: by the moment-generating function.
Ex5.3-3: X1: N(693.2, 22820) and X2: N(631.7, 19205) are independent. Find P(X1 > X2): Y = X1 − X2 is N(61.5, 42025), so P(X1 > X2) = P(Y > 0) = Φ(61.5/205) = Φ(0.30) = 0.6179.
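Ex 5.3-3 follows directly from Thm 5.3-4 and can be evaluated in a couple of lines (a stdlib sketch):

```python
# Y = X1 − X2 is normal with mean μ1 − μ2 and variance σ1² + σ2²
import math
from statistics import NormalDist

mu = 693.2 - 631.7          # 61.5
var = 22820 + 19205         # 42025, so σ = 205
Y = NormalDist(mu, math.sqrt(var))

p = 1 - Y.cdf(0)            # P(X1 > X2) = P(Y > 0) = Φ(61.5/205) = Φ(0.30)
```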

Box-Muller Transformation
Ex5.3-4: X1 and X2 have independent uniform distributions U(0,1). Consider
  Z1 = √(−2 ln X1) cos(2πX2), Z2 = √(−2 ln X1) sin(2πX2).
Then Z1 and Z2 are independent N(0,1): two independent U(0,1) ⇒ two independent N(0,1)!!

Distribution Function Technique
Ex.5.3-5: Z is N(0,1), U is χ²(r), and Z and U are independent, so the joint p.d.f. of Z and U is the product of the two marginal p.d.f.s. Applying the distribution function technique to T = Z/√(U/r), the inner integral is evaluated by recognizing a χ²(r+1) kernel.

Student's T Distribution
William Sealy Gosset published the "t-test" (under the pseudonym "Student") in Biometrika, 1908, to measure confidence intervals: the deviation of "small samples" from the "real" value. Suppose the underlying distribution is normal with unknown σ². If Z is N(0,1), U is χ²(r), and Z and U are independent, then T = Z/√(U/r) has the t distribution with r degrees of freedom and p.d.f.
  f(t) = Γ((r+1)/2) / (√(πr) Γ(r/2)) · (1 + t²/r)^(−(r+1)/2),  −∞ < t < ∞,
which depends only on r!
Fig.5.3-2: the T p.d.f. becomes closer to the N(0,1) p.d.f. as the number of degrees of freedom increases. t_α(r) is the 100(1−α) percentile, or the upper 100α percent point. [Table VI, p.658]

Examples
Ex: Suppose T has a t distribution with r = 7; probabilities can be read from Table VI on p.688.
Ex: Suppose T has a t distribution with r = 14. Find a constant c such that P(|T| < c) = 0.90: by symmetry c = t_{0.05}(14) = 1.761.

Central Limit Theorem
Ex4.6-2: X1,…,Xn are a random sample of size n from a distribution with mean μ and variance σ²; then the sample mean X̄ has E(X̄) = μ and Var(X̄) = σ²/n.
Thm5.4-1 (Central Limit Theorem): If X̄ is the mean of a random sample X1,…,Xn of size n from some distribution with a finite mean μ and a finite positive variance σ², then the distribution of
  W = (X̄ − μ)/(σ/√n) = (ΣXi − nμ)/(√n σ)
is N(0,1) in the limit as n → ∞, even if the Xi are not N(μ,σ²). Thus W is approximately N(0,1) if n is large.
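A quick empirical illustration of the theorem, using a non-normal parent distribution (U(0,1), with μ = 1/2 and σ² = 1/12); n, the replication count, and the seed are illustrative choices of mine.

```python
# Standardized sums W = (ΣXi − nμ)/(√n σ) from U(0,1) should look N(0,1)
import math
import random
from statistics import NormalDist

random.seed(3)
n, reps = 20, 20000
mu, var = 0.5, 1.0 / 12.0

ws = []
for _ in range(reps):
    s = sum(random.random() for _ in range(n))
    ws.append((s - n * mu) / math.sqrt(n * var))

# Compare the empirical P(W ≤ 1) with Φ(1)
frac = sum(1 for w in ws if w <= 1.0) / reps
target = NormalDist().cdf(1.0)
```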

More Examples
Ex: Let X̄ denote the mean of a random sample of size n = 15 from the distribution whose p.d.f. is f(x) = 3x²/2, −1 < x < 1; then μ = 0, σ² = 3/5.
Ex5.4-2: Let X1,…,X20 be a random sample of size 20 from the uniform distribution U(0,1); μ = ½, σ² = 1/12; Y = X1+…+X20.
Ex5.4-3: Let X̄ denote the mean of a random sample of size n = 25 from the distribution whose p.d.f. is f(x) = x³/4, 0 < x < 2; μ = 1.6, σ² = 8/75.

How large a sample size n is sufficient?
If n = 25, 30 or larger, the approximation is generally good.
If the original distribution is symmetric, unimodal, and of continuous type, n can be as small as 4 or 5.
If the original distribution is close to normal, n can be lowered to 2 or 3; if it is exactly normal, n = 1 or more already suffices.
However, if the original distribution is highly skewed, n must be quite large.
Ex5.4-4: Let X1,…,X4 be a random sample of size 4 from the uniform distribution U(0,1) with p.d.f. f(x) = 1, 0 < x < 1; μ = ½, σ² = 1/12. Compare the distributions of Y = X1+X2 and Y = X1+…+X4 with their normal approximations.

Graphic Illustration
Fig.5.4-1: the sum of n U(0,1) random variables has a p.d.f. approximately N(n/2, n/12).

Skewed Distributions
Suppose f(x) and F(x) are the p.d.f. and distribution function of a random variable X with mean μ and variance σ².
Ex5.4-5: Let X1,…,Xn be a random sample of size n from a chi-square distribution χ²(1). Then Y = X1+…+Xn is χ²(n), with E(Y) = n and Var(Y) = 2n; for n = 20 or 100, Y is approximately N(20, 40) or N(100, 200).

Graphic Illustration
Fig.5.4-2: the p.d.f.s of sums of χ²(1) variables, transformed so that their mean equals zero and variance equals one, become closer to the N(0,1) p.d.f. as the number of degrees of freedom increases.

Simulation of a R.V. X with f(x) & F(x)
A random number generator produces values y of Y ~ U(0,1). Since Y = F(X) is U(0,1), x = F⁻¹(y) is an observed or simulated value of X.
Ex.5.4-6: Let X1,…,Xn be a random sample of size n from the distribution with p.d.f. f(x), distribution function F(x), mean μ, and variance σ². 1000 random samples are simulated to compute the values of W; a histogram of these values is grouped into 21 classes of equal width and compared with the N(0,1) p.d.f. Two cases:
  f(x) = (x+1)/2, F(x) = (x+1)²/4, −1 < x < 1; μ = 1/3, σ² = 2/9;
  f(x) = 3x²/2, F(x) = (x³+1)/2, −1 < x < 1; μ = 0, σ² = 3/5.
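Inverse-transform sampling for the first case above can be sketched directly: F(x) = (x+1)²/4 on (−1, 1) inverts to x = 2√y − 1. Sample size and seed are my illustration choices.

```python
# Inverse-transform sampling for f(x) = (x+1)/2 on (−1, 1)
import math
import random
from statistics import fmean, pvariance

random.seed(4)

def simulate():
    y = random.random()              # y is an observation of Y ~ U(0,1)
    return 2.0 * math.sqrt(y) - 1.0  # x = F⁻¹(y) = 2√y − 1

xs = [simulate() for _ in range(20000)]
est_mean = fmean(xs)     # theory: μ = 1/3
est_var = pvariance(xs)  # theory: σ² = 2/9
```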

Approximation of Discrete Distributions
Let X1,…,Xn be a random sample from a Bernoulli distribution with mean p and variance pq, 0 < p < 1. Then Y = X1+…+Xn is binomial b(n,p), and Y is approximately N(np, npq) as n → ∞.
Rule: n is "sufficiently large" if np ≥ 5 and nq ≥ 5. If p deviates from 0.5 (skewness!), n needs to be larger.
Ex.5.5-1: Y ~ b(10, 1/2) can be approximated by N(5, 2.5).
Ex.5.5-2: Y ~ b(18, 1/6) can hardly be approximated by N(3, 2.5), since np = 3 < 5.
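A sketch of the approximation in Ex.5.5-1, including the half-unit continuity correction used when a normal curve approximates a discrete distribution (the choice to evaluate P(Y ≤ 6) is mine):

```python
# b(10, 1/2) ≈ N(5, 2.5); compare exact binomial CDF with the normal approx.
import math
from statistics import NormalDist

n, p = 10, 0.5
approx_dist = NormalDist(n * p, math.sqrt(n * p * (1 - p)))

def exact_cdf(k):
    # P(Y ≤ k) summed from the binomial p.m.f.
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k + 1))

exact = exact_cdf(6)           # P(Y ≤ 6) = 848/1024 = 0.828125
approx = approx_dist.cdf(6.5)  # continuity correction: evaluate at 6.5
```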

Another Example
Ex5.5-4: Y is b(36, 1/2). Comparing the exact binomial probability with the normal approximation of b(n,p) (with continuity correction): a good approximation!

Approximation of the Poisson Distribution
For a Poisson distribution Y with mean λ, (Y − λ)/√λ is approximately N(0,1) for large λ.
Ex5.5-5: X, which has a Poisson distribution with mean 20, can be seen as the sum Y of the observations of a random sample of size 20 from a Poisson distribution with mean 1, so the central limit theorem applies; the exact probability is close to the normal approximation.
Fig.5.5-3: normal approximation of the Poisson probability histogram.
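A sketch of Ex.5.5-5's comparison, again with the continuity correction (the particular event P(X ≤ 16) is my illustration choice):

```python
# Poisson(λ = 20) ≈ N(20, 20); compare exact CDF with the normal approx.
import math
from statistics import NormalDist

lam = 20.0
approx_dist = NormalDist(lam, math.sqrt(lam))

def poisson_cdf(k):
    # P(X ≤ k) summed from the Poisson p.m.f.
    return sum(math.exp(-lam) * lam**j / math.factorial(j)
               for j in range(k + 1))

exact = poisson_cdf(16)         # P(X ≤ 16)
approx = approx_dist.cdf(16.5)  # continuity correction: evaluate at 16.5
```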

Bivariate Normal Distribution
The joint p.d.f. of X ~ N(μX, σX²) and Y ~ N(μY, σY²) with correlation coefficient ρ is
  f(x,y) = 1/(2πσXσY√(1−ρ²)) · exp(−q(x,y)/2),
where q(x,y) = [((x−μX)/σX)² − 2ρ((x−μX)/σX)((y−μY)/σY) + ((y−μY)/σY)²] / (1−ρ²).
Therefore, the conditional distribution of Y, given X = x, is normal with
  mean μY + ρ(σY/σX)(x − μX), a linear function of x, and
  variance σY²(1 − ρ²), a constant w.r.t. x.

Examples Ex.5.6-1: Ex.5.6-2

Bivariate Normal: ρ = 0 ⇒ Independence
Thm5.6-1: For X and Y with a bivariate normal distribution with correlation coefficient ρ, X and Y are independent iff ρ = 0. The same holds for trivariate and multivariate normal distributions. When ρ = 0, the joint p.d.f. factors into the product of the marginal p.d.f.s of X and Y.