Standard Statistical Distributions

Most elementary statistics books provide a survey of commonly used statistical distributions. The reasons we study these distributions are that:
- they provide a comprehensive range of distributions for modelling practical applications,
- their mathematical properties are known,
- they are described in terms of a few parameters, which have natural interpretations.

1. Bernoulli Distribution. This is used to model a trial which gives rise to two outcomes: success/failure, male/female, 0/1. Let p be the probability that the outcome is one and q = 1 - p the probability that the outcome is zero.

E[X] = p(1) + (1 - p)(0) = p
VAR[X] = p(1)^2 + (1 - p)(0)^2 - E[X]^2 = p(1 - p).

2. Binomial Distribution. Suppose that we are interested in the number of successes X in n independent repetitions of a Bernoulli trial, where the probability of success in an individual trial is p. Then

Prob{X = k} = nCk p^k (1 - p)^(n-k),   k = 0, 1, ..., n
E[X] = n p
VAR[X] = n p (1 - p).

This is the appropriate distribution to use in modelling the number of boys in a family of n = 4 children, the number of defective components in a batch of n = 10 components, and so on.

[Figure: pmf of the Bernoulli(p) distribution and of the Binomial(n = 4, p = 0.2) distribution.]
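As a quick check of these formulas, here is a minimal Python sketch (not part of the original slides) that evaluates the binomial probabilities for the slide's n = 4, p = 0.2 example directly from nCk p^k (1 - p)^(n-k) and compares them, along with the mean and variance, against SciPy; the use of SciPy here is purely an illustrative choice.

```python
# Binomial pmf, mean and variance for the slide's n = 4, p = 0.2 example.
# SciPy is used only as an independent cross-check of the hand formulas.
from math import comb
from scipy.stats import binom

n, p = 4, 0.2
for k in range(n + 1):
    direct = comb(n, k) * p**k * (1 - p)**(n - k)   # nCk p^k (1-p)^(n-k)
    print(f"P(X = {k}) = {direct:.4f}  (SciPy: {binom.pmf(k, n, p):.4f})")

print("E[X]   =", binom.mean(n, p))   # n p       = 0.8
print("VAR[X] =", binom.var(n, p))    # n p (1-p) = 0.64
```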

3. Poisson Distribution. The Poisson distribution arises as a limiting case of the binomial distribution, where n → ∞ and p → 0 in such a way that np → λ (a constant). Its density is

Prob{X = k} = λ^k exp(-λ) / k!,   k = 0, 1, 2, ...

Note that exp(x) stands for e to the power of x, where e is approximately 2.71828.

E[X] = VAR[X] = λ.

The Poisson distribution is used to model the number of occurrences of a certain phenomenon in a fixed period of time or space, as in the number of
- particles emitted by a radioactive source in a fixed direction and period of time,
- telephone calls received at a switchboard during a given period,
- defects in a fixed length of cloth or paper,
- people arriving in a queue in a fixed interval of time,
- accidents that occur on a fixed stretch of road in a specified time interval.

4. Geometric Distribution. This arises as the "time", or number of steps k, to the first success in a series of independent Bernoulli trials. The density is

Prob{X = k} = p(1 - p)^(k-1),   k = 1, 2, ...

E[X] = 1/p
VAR[X] = (1 - p)/p^2
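The limiting relationship and the geometric moments can be illustrated with a short sketch (again using SciPy as an assumed tool, with made-up parameter values): the Binomial(n, λ/n) probabilities approach the Poisson(λ) probabilities as n grows, and the geometric mean and variance match 1/p and (1 - p)/p^2.

```python
# Poisson as a limit of the binomial: n large, p small, with n p = lam held fixed.
from scipy.stats import binom, poisson, geom

lam = 2.0
for n in (10, 100, 10_000):
    p = lam / n
    # P(X = 3) under Binomial(n, lam/n) versus Poisson(lam)
    print(n, binom.pmf(3, n, p), poisson.pmf(3, lam))

# Geometric distribution: number of trials up to and including the first success.
p = 0.25
print("E[X]   =", geom.mean(p), "= 1/p       =", 1 / p)
print("VAR[X] =", geom.var(p),  "= (1-p)/p^2 =", (1 - p) / p**2)
```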

5. Negative Binomial Distribution. This is used to model the number of failures k that occur before the r-th success in a series of independent Bernoulli trials. The density is

Prob{X = k} = (r+k-1)Ck p^r (1 - p)^k,   k = 0, 1, 2, ...

Note: E[X] = r(1 - p)/p
VAR[X] = r(1 - p)/p^2

6. Hypergeometric Distribution. Consider a population of M items, of which W are deemed to be successes. Let X be the number of successes that occur in a sample of size n, drawn without replacement from the population. The density is

Prob{X = k} = WCk (M-W)C(n-k) / MCn,   k = 0, 1, 2, ...

Then E[X] = nW/M
VAR[X] = nW(M - W)(M - n) / {M^2 (M - 1)}

7. Uniform Distribution. A random variable X has a uniform distribution on the interval [a, b] if X has density

f(x) = 1/(b - a)   for a < x < b
     = 0   otherwise.

Then E[X] = (a + b)/2
VAR[X] = (b - a)^2 / 12

Uniformly distributed random numbers occur frequently in simulation models. However, computer-based algorithms, such as linear congruential generators, can only approximate this distribution, so great care is needed in interpreting the output of simulation models.

[Figure: density of the uniform distribution on [a, b], constant at 1/(b - a) between a and b.]
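These moment formulas can be checked numerically. The sketch below is illustrative only: the parameter values r, p, M, W, n, a, b are made up, and it assumes SciPy's parameter ordering for the hypergeometric and uniform families. It compares each closed-form expression with the value SciPy reports.

```python
# Cross-checking the negative binomial, hypergeometric and uniform moments.
from scipy.stats import nbinom, hypergeom, uniform

r, p = 3, 0.4                       # failures before the r-th success
print(nbinom.mean(r, p), r * (1 - p) / p)       # E[X]   = r(1-p)/p
print(nbinom.var(r, p),  r * (1 - p) / p**2)    # VAR[X] = r(1-p)/p^2

M, W, n = 50, 20, 10                # population size, successes in it, sample size
rv = hypergeom(M, W, n)             # SciPy order: (population, successes, sample size)
print(rv.mean(), n * W / M)
print(rv.var(),  n * W * (M - W) * (M - n) / (M**2 * (M - 1)))

a, b = 2.0, 5.0                     # uniform on [a, b]; SciPy uses loc=a, scale=b-a
rv = uniform(loc=a, scale=b - a)
print(rv.mean(), (a + b) / 2)
print(rv.var(),  (b - a)**2 / 12)
```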

8. Continuous Random Variables. If X is a continuous random variable, then the probability that X takes a value in the range [a, b] is the area under the frequency function f(x) between these points:

Prob{a < X < b} = F(b) - F(a) = ∫_a^b f(x) dx.

In practical work, these integrals are evaluated by looking up entries in statistical tables.

9. Gaussian or Normal Distribution. A random variable X has a normal distribution with mean μ and standard deviation σ if it has density

f(x) = (1 / (σ √(2π))) exp{ -(x - μ)^2 / (2σ^2) },   -∞ < x < ∞,  σ > 0.

E[X] = μ
VAR[X] = σ^2

As described below, the normal distribution arises naturally as the limiting distribution of the average of a set of independent, identically distributed random variables with finite variances. It plays a central role in sampling theory and is a good approximation to a large class of empirical distributions. For this reason, a default assumption in many empirical studies is that the distribution of each observation is approximately normal. Therefore, statistical tables of the normal distribution are of great importance in analysing practical data sets. X is said to be a standardised normal variable if μ = 0 and σ = 1.

[Figure: density of the standard normal distribution (μ = 0, σ = 1).]
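In place of tables, the probability F(b) - F(a) can of course be computed in software. The sketch below (illustrative values of μ, σ, a and b; SciPy assumed) evaluates it with the normal cdf and confirms it by numerically integrating the density written out exactly as above.

```python
# Prob{a < X < b} = F(b) - F(a) for a normal variable, via the cdf and via
# direct numerical integration of the density.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma = 10.0, 2.0
a, b = 9.0, 13.0

prob_cdf = norm.cdf(b, mu, sigma) - norm.cdf(a, mu, sigma)

def f(x):                                    # the normal density from the slide
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

prob_int, _ = quad(f, a, b)                  # area under f between a and b
print(prob_cdf, prob_int)                    # the two agree to numerical precision

# standardising: (X - mu)/sigma is a standard normal variable
print(norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma))
```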

10. Gamma Distribution. The Gamma distribution arises in queueing theory as the time to the arrival of the n-th customer in a single-server queue, where the average arrival rate is λ. The frequency function is

f(x) = λ(λx)^(n-1) exp(-λx) / (n - 1)!,   x ≥ 0, λ > 0, n = 1, 2, ...
     = 0,   otherwise.

E[X] = n/λ
VAR[X] = n/λ^2

11. Exponential Distribution. This is a special case of the Gamma distribution with n = 1 and so is used to model the interarrival time of customers, or the time to the arrival of the first customer, in a simple queue. The frequency function is

f(x) = λ exp(-λx),   x ≥ 0, λ > 0
     = 0,   otherwise.

12. Chi-Square Distribution. A random variable X has a chi-square distribution with n degrees of freedom, written χ^2_n (where n is a positive integer), if it has the Gamma-type frequency function

f(x) = x^(n/2 - 1) exp(-x/2) / (2^(n/2) Γ(n/2)),   x ≥ 0
     = 0,   otherwise.

[Figure: density of the χ^2_n distribution.]
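The queueing interpretation suggests a simple simulation check: the time to the n-th arrival is the sum of n independent exponential interarrival times, so its sample mean and variance should be close to n/λ and n/λ^2. The sketch below assumes SciPy/NumPy and uses made-up values of λ and n; note that SciPy parameterises these families by a scale equal to 1/λ.

```python
# Gamma (Erlang) waiting time as a sum of exponential interarrival times.
import numpy as np
from scipy.stats import gamma, expon

rng = np.random.default_rng(0)
lam, n = 1.5, 4                                   # arrival rate and customer number

# time to the n-th arrival = sum of n exponential interarrival times
inter = expon.rvs(scale=1 / lam, size=(100_000, n), random_state=rng)
arrival_times = inter.sum(axis=1)

print(arrival_times.mean(), n / lam)              # E[X]   = n / lam
print(arrival_times.var(),  n / lam**2)           # VAR[X] = n / lam^2

# the same distribution via the Gamma family directly (shape n, scale 1/lam)
print(gamma.mean(n, scale=1 / lam), gamma.var(n, scale=1 / lam))
```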

Chi-square Distribution (continued). The chi-square distribution arises in two important applications:
- If X_1, X_2, ..., X_n is a sequence of independently distributed standardised normal random variables, then the sum of squares X_1^2 + X_2^2 + ... + X_n^2 has a chi-square distribution with n degrees of freedom.
- If x_1, x_2, ..., x_n is a random sample from a normal distribution with mean μ and variance σ^2, and we let x̄ = Σ x_i / n and S^2 = Σ (x_i - x̄)^2 / σ^2, then S^2 has a chi-square distribution with n - 1 degrees of freedom, and the random variables S^2 and x̄ are independent.

13. Beta Distribution. A random variable X has a Beta distribution with parameters α > 0 and β > 0 if it has frequency function

f(x) = [Γ(α + β) / (Γ(α) Γ(β))] x^(α-1) (1 - x)^(β-1),   0 < x < 1
     = 0,   otherwise.

E[X] = α / (α + β)
VAR[X] = αβ / {(α + β)^2 (α + β + 1)}

If n is an integer, Γ(n) = (n - 1)! with Γ(1) = 1, and Γ(n + 1/2) = (n - 1/2)(n - 3/2) ... (1/2) Γ(1/2) with Γ(1/2) = √π.
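Both chi-square constructions, and the Beta moments, can be verified by simulation. The following sketch assumes NumPy/SciPy; the sample sizes and the parameters n, μ, σ, α, β are arbitrary illustrative choices.

```python
# Chi-square facts checked by simulation, plus the Beta mean and variance.
import numpy as np
from scipy.stats import chi2, beta

rng = np.random.default_rng(1)
n = 5

# sum of squares of n standard normals ~ chi-square with n degrees of freedom
z = rng.standard_normal((200_000, n))
print((z**2).sum(axis=1).mean(), chi2.mean(n))       # both close to n

# S^2 = sum (x_i - xbar)^2 / sigma^2 ~ chi-square with n - 1 degrees of freedom
mu, sigma = 3.0, 2.0
x = rng.normal(mu, sigma, size=(200_000, n))
S2 = ((x - x.mean(axis=1, keepdims=True))**2).sum(axis=1) / sigma**2
print(S2.mean(), chi2.mean(n - 1))                   # both close to n - 1

a, b = 2.0, 5.0                                      # Beta(alpha, beta) parameters
print(beta.mean(a, b), a / (a + b))
print(beta.var(a, b),  a * b / ((a + b)**2 * (a + b + 1)))
```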

14. Student's t Distribution. A random variable X has a t distribution with n degrees of freedom (t_n) if it has density

f(x) = [Γ((n + 1)/2) / (√(nπ) Γ(n/2))] (1 + x^2/n)^(-(n+1)/2),   -∞ < x < ∞.

The t distribution is symmetrical about the origin, with

E[X] = 0
VAR[X] = n / (n - 2)   (for n > 2).

For small values of n, the t_n distribution is very flat. As n is increased, the density assumes a bell shape. For values of n ≥ 25, the t_n distribution is practically indistinguishable from the standard normal curve.

- If X and Y are independent random variables, X has a standard normal distribution and Y has a χ^2_n distribution, then X / √(Y/n) has a t_n distribution.
- If x_1, x_2, ..., x_n is a random sample from a normal distribution with mean μ and variance σ^2, and if we define s^2 = (1/(n - 1)) Σ (x_i - x̄)^2, then (x̄ - μ) / (s/√n) has a t_{n-1} distribution.
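The second construction above is the familiar one-sample t statistic; a short simulation (SciPy/NumPy assumed, arbitrary μ, σ and n) compares its tail behaviour with the t_{n-1} distribution and illustrates how close t_25 already is to the standard normal.

```python
# (xbar - mu) / (s / sqrt(n)) behaves as a t variable with n - 1 degrees of freedom.
import numpy as np
from scipy.stats import t, norm

rng = np.random.default_rng(2)
mu, sigma, n = 5.0, 3.0, 8

x = rng.normal(mu, sigma, size=(200_000, n))
xbar = x.mean(axis=1)
s = x.std(axis=1, ddof=1)                      # s^2 = sum (x_i - xbar)^2 / (n - 1)
t_stat = (xbar - mu) / (s / np.sqrt(n))

# tail probability of the simulated statistic versus the t_{n-1} distribution
print((t_stat > 2.0).mean(), t.sf(2.0, df=n - 1))

# for n around 25 the t density is practically the standard normal curve
print(t.pdf(0.0, df=25), norm.pdf(0.0))
print(t.sf(1.96, df=25), norm.sf(1.96))
```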

15. F Distribution. A random variable X has an F distribution with m and n degrees of freedom if it has density

f(x) = [Γ((m + n)/2) m^(m/2) n^(n/2) x^(m/2 - 1)] / [Γ(m/2) Γ(n/2) (n + mx)^((m+n)/2)],   x > 0
     = 0,   otherwise.

Note: E[X] = n / (n - 2)   (for n > 2)
VAR[X] = 2n^2 (m + n - 2) / {m (n - 4)(n - 2)^2}   (for n > 4)

- If X and Y are independent random variables, X has a χ^2_m distribution and Y has a χ^2_n distribution, then (X/m) / (Y/n) has an F_{m,n} distribution.
- One consequence of this is that the F distribution represents the distribution of the ratio of certain independent quadratic forms which can be constructed from random samples drawn from normal distributions: if x_1, x_2, ..., x_m (m ≥ 2) is a random sample from a normal distribution with mean μ_1 and variance σ_1^2, and if y_1, y_2, ..., y_n (n ≥ 2) is a random sample from a normal distribution with mean μ_2 and variance σ_2^2, then

[Σ (x_i - x̄)^2 / (σ_1^2 (m - 1))] / [Σ (y_i - ȳ)^2 / (σ_2^2 (n - 1))]   has an F_{m-1, n-1} distribution.
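This variance-ratio construction is easy to check by simulation. The sketch below (SciPy/NumPy assumed; the sample sizes, means and standard deviations are made-up illustrative values) builds the ratio from normal samples and compares its mean and a tail probability with SciPy's F distribution.

```python
# Ratio of scaled sums of squares from two normal samples versus F_{m-1, n-1}.
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(3)
m, n = 6, 11                                      # sample sizes
mu1, s1, mu2, s2 = 0.0, 2.0, 1.0, 3.0             # means and standard deviations

x = rng.normal(mu1, s1, size=(200_000, m))
y = rng.normal(mu2, s2, size=(200_000, n))

num = ((x - x.mean(axis=1, keepdims=True))**2).sum(axis=1) / (s1**2 * (m - 1))
den = ((y - y.mean(axis=1, keepdims=True))**2).sum(axis=1) / (s2**2 * (n - 1))
ratio = num / den                                 # should follow F_{m-1, n-1}

# mean of F_{m-1, n-1} is (n-1)/((n-1) - 2); here 10/8 = 1.25
print(ratio.mean(), f.mean(m - 1, n - 1))
print((ratio > 2.5).mean(), f.sf(2.5, m - 1, n - 1))
```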