Chapter 3 Some Special Distributions Math 6203 Fall 2009 Instructor: Ayona Chatterjee.
3.1 The Binomial and Related Distributions

Bernoulli Distribution A Bernoulli experiment is a random experiment in which the outcome can be classified in one of two mutually exclusive and exhaustive ways. – Example: defective/non-defective, success/failure. In a sequence of independent Bernoulli trials, the probability of success p is the same for each trial.

Let X be a random variable associated with a Bernoulli trial. – X = 1 indicates a success. – X = 0 indicates a failure. The pmf of X can be written as: p(x) = p^x (1 − p)^(1−x), x = 0, 1. In a sequence of n Bernoulli trials, we are often interested in X, the total number of successes.

Binomial Experiment n independent and identical trials. The probability of success p is fixed for each trial. Only two possible outcomes for each trial. The number of trials n is fixed.

Binomial Distribution The random variable X that counts the number of successes in a binomial experiment is said to have a binomial distribution with parameters n and p, and its pmf is given by:
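The pmf itself was an image on the slide and is missing from the transcript; the standard binomial pmf is:

```latex
p(x) = \binom{n}{x} p^{x} (1-p)^{n-x}, \qquad x = 0, 1, \ldots, n.
```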

Theorem
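The theorem on this slide was an image and is missing from the transcript. In this position, right after the binomial pmf, it is presumably the binomial mgf and the moments that follow from it:

```latex
M(t) = \left[(1-p) + p e^{t}\right]^{n}, \qquad
E(X) = np, \qquad \operatorname{Var}(X) = np(1-p).
```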

Negative Binomial Distribution Consider a sequence of independent repetitions of a random experiment with constant probability p of success. Let the random variable Y denote the total number of failures in this sequence before the rth success; that is, Y + r is the number of trials required to produce exactly r successes. The pmf of Y is called the negative binomial distribution.

Thus the probability of getting r − 1 successes in the first y + r − 1 trials, and then the rth success on the (y + r)th trial, gives the pmf of Y as
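The pmf was an image on the slide; written out, the argument above gives

```latex
p_Y(y) = \binom{y + r - 1}{r - 1}\, p^{r} (1-p)^{y}, \qquad y = 0, 1, 2, \ldots
```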

Geometric Distribution The special case r = 1 of the negative binomial, that is, finding the first success on trial y, gives the geometric distribution. Thus we can write P(Y = y) = p q^(y−1) for y = 1, 2, 3, …, where q = 1 − p. Let's find the mean and variance of the geometric distribution.
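The derivation invited on this slide is not in the transcript; a standard sketch, with q = 1 − p:

```latex
E(Y) = \sum_{y=1}^{\infty} y\, p q^{y-1}
     = p\,\frac{d}{dq}\sum_{y=1}^{\infty} q^{y}
     = \frac{p}{(1-q)^{2}} = \frac{1}{p},
\qquad
\operatorname{Var}(Y) = E(Y^{2}) - [E(Y)]^{2} = \frac{q}{p^{2}}.
```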

Multinomial Distribution Let C1, C2, …, Ck be k mutually exclusive and exhaustive outcomes of the experiment, and suppose the experiment is repeated n times. Define the random variable Xi to be the number of outcomes that are elements of Ci, i = 1, 2, …, k − 1. The multinomial distribution is
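The pmf image is missing from the transcript; the standard multinomial pmf, with pi the probability of Ci, xk = n − x1 − ⋯ − x(k−1), and pk = 1 − p1 − ⋯ − p(k−1), is:

```latex
p(x_1, \ldots, x_{k-1}) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\;
p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}.
```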

Trinomial Distribution Let k = 3 in the multinomial distribution, and let X1 = X and X2 = Y, so that X3 = n − X − Y. Then we have a trinomial distribution, with the joint pmf of X and Y given as

3.2 The Poisson Distribution A random variable that has a pmf of the form p(x) as given below is said to have a Poisson distribution with parameter m.
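The pmf referenced here was an image on the slide; the standard Poisson pmf is:

```latex
p(x) = \frac{m^{x} e^{-m}}{x!}, \qquad x = 0, 1, 2, \ldots, \quad m > 0.
```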

Poisson Postulates Let g(x, w) denote the probability of x changes in an interval of length w, and let the symbol o(h) represent any function such that o(h)/h → 0 as h → 0. The postulates are – g(1, h) = λh + o(h), where λ is a positive constant and h > 0. – The probability of two or more changes in an interval of length h is o(h). – The numbers of changes in nonoverlapping intervals are independent.

Note The number of changes X in an interval of length w has a Poisson distribution with parameter m = λw.
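As a quick sanity check on the note above, here is a minimal sketch in plain Python (no external libraries; the pmf is the standard Poisson pmf, which appeared only as an image on the slides, and the values of λ and w are illustrative) verifying that the pmf sums to 1 and has mean m:

```python
import math

def poisson_pmf(x, m):
    """Standard Poisson pmf: p(x) = m^x e^{-m} / x!"""
    return m ** x * math.exp(-m) / math.factorial(x)

# Illustrative choice: rate lambda = 1.5 changes per unit length,
# interval length w = 2, so m = lambda * w per the note above.
lam, w = 1.5, 2.0
m = lam * w

# Truncate the infinite sums at x = 99; the tail is negligible for m = 3.
total = sum(poisson_pmf(x, m) for x in range(100))      # should be ~1
mean = sum(x * poisson_pmf(x, m) for x in range(100))   # should be ~m
```

Truncating at 100 terms is safe here because the Poisson tail decays faster than geometrically once x exceeds m.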

Theorem
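The theorem on this slide is missing from the transcript; following the Poisson pmf, it is presumably the Poisson mgf and the moments it yields:

```latex
M(t) = e^{m\left(e^{t}-1\right)}, \qquad E(X) = m, \qquad \operatorname{Var}(X) = m.
```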

3.3 The Gamma, Chi-Square, and Beta Distributions The gamma function of α can be written as
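The formula is missing from the transcript; the gamma function is defined by the integral

```latex
\Gamma(\alpha) = \int_{0}^{\infty} y^{\alpha - 1} e^{-y}\, dy, \qquad \alpha > 0.
```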

The Gamma Distribution A random variable X that has a pdf of the form below is said to have a gamma distribution with parameters α and β.
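The pdf image is missing from the transcript; the standard gamma pdf with shape α and scale β (and zero elsewhere) is:

```latex
f(x) = \frac{1}{\Gamma(\alpha)\,\beta^{\alpha}}\; x^{\alpha-1} e^{-x/\beta}, \qquad 0 < x < \infty.
```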

Exponential Distribution The gamma distribution is used to model waiting times: the waiting time W until the kth change in a Poisson process with rate λ has a gamma distribution with α = k and β = 1/λ. If W is the waiting time until the first change, that is, k = 1, the pdf of W is the exponential distribution with parameter λ, and its density is given as
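The density image is missing from the transcript; the exponential density with parameter λ (mean 1/λ) is:

```latex
f(w) = \lambda e^{-\lambda w}, \qquad 0 < w < \infty.
```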

Chi-Square Distribution A special case of the Gamma distribution with α=r/2 and β=2 gives the Chi-Square distribution. Here r is a positive integer called the degrees of freedom.
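Substituting α = r/2 and β = 2 into the gamma pdf (which appeared as an image on the slides) gives the chi-square pdf:

```latex
f(x) = \frac{1}{\Gamma(r/2)\, 2^{r/2}}\; x^{r/2 - 1} e^{-x/2}, \qquad 0 < x < \infty.
```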

Theorem
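The theorem on this slide is missing from the transcript; the standard chi-square results that likely appeared here are:

```latex
M(t) = (1-2t)^{-r/2}\ \ \left(t < \tfrac{1}{2}\right), \qquad
E(X) = r, \qquad \operatorname{Var}(X) = 2r.
```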

Beta Distribution A random variable X is said to have a beta distribution with parameters α and β if its density is given as follows
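The density image is missing from the transcript; the standard beta pdf with parameters α and β is:

```latex
f(x) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\;
x^{\alpha-1} (1-x)^{\beta-1}, \qquad 0 < x < 1.
```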

The Normal Distribution We say a random variable X has a normal distribution if its pdf is given as below. The parameters μ and σ² are the mean and variance of X, respectively. We write X ~ N(μ, σ²).
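The pdf was an image on the slide; the normal density is:

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,
\exp\!\left(-\frac{(x-\mu)^{2}}{2\sigma^{2}}\right), \qquad -\infty < x < \infty.
```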

The mgf The moment generating function of X ~ N(μ, σ²) is
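The mgf itself is missing from the transcript; for X ~ N(μ, σ²) it is:

```latex
M(t) = \exp\!\left(\mu t + \frac{\sigma^{2} t^{2}}{2}\right).
```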

Theorems
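The theorems on this slide are images and missing from the transcript; at this point in the chapter they are presumably the standard normal results: for X ~ N(μ, σ²),

```latex
Z = \frac{X-\mu}{\sigma} \sim N(0,1), \qquad
\left(\frac{X-\mu}{\sigma}\right)^{2} \sim \chi^{2}(1),
```

and, for independent Xi ~ N(μi, σi²), the linear combination Σ ai Xi is N(Σ ai μi, Σ ai² σi²).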

3.6 t and F-Distributions

The t-distribution Let W be a random variable with the N(0, 1) distribution, and let V be a random variable with a chi-square distribution with r degrees of freedom, with W and V independent. Then T = W / √(V/r) has a t-distribution with pdf
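The pdf was an image on the slide; the standard t density with r degrees of freedom is:

```latex
f(t) = \frac{\Gamma\!\left(\frac{r+1}{2}\right)}{\sqrt{\pi r}\;\Gamma\!\left(\frac{r}{2}\right)}
\left(1 + \frac{t^{2}}{r}\right)^{-(r+1)/2}, \qquad -\infty < t < \infty.
```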

The F-distribution Consider two independent chi-square random variables U and V with r 1 and r 2 degrees of freedom, respectively. Let F = (U/r 1 )/(V/r 2 ). The variable F has an F-distribution with parameters r 1 and r 2, and its pdf is
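The pdf was an image on the slide; the standard F density with parameters r1 and r2 is:

```latex
f(w) = \frac{\Gamma\!\left(\frac{r_1 + r_2}{2}\right) (r_1/r_2)^{r_1/2}}
{\Gamma\!\left(\frac{r_1}{2}\right)\Gamma\!\left(\frac{r_2}{2}\right)}\;
\frac{w^{r_1/2 - 1}}{\left(1 + \frac{r_1 w}{r_2}\right)^{(r_1 + r_2)/2}},
\qquad 0 < w < \infty.
```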

Student’s Theorem
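The statement of Student's theorem was an image on the slide. The standard statement: if X1, …, Xn are iid N(μ, σ²) with sample mean X̄ and sample variance S², then

```latex
\bar{X} \sim N\!\left(\mu, \frac{\sigma^{2}}{n}\right), \qquad
\bar{X} \text{ and } S^{2} \text{ are independent}, \qquad
\frac{(n-1)S^{2}}{\sigma^{2}} \sim \chi^{2}(n-1),
```

and consequently T = (X̄ − μ)/(S/√n) has a t-distribution with n − 1 degrees of freedom.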