Chapter 3. Discrete Probability Distributions


Chapter 3. Discrete Probability Distributions
3.1 The Binomial Distribution
3.2 The Geometric and Negative Binomial Distributions
3.3 The Hypergeometric Distribution
3.4 The Poisson Distribution
3.5 The Multinomial Distribution

3.1 The Binomial Distribution
3.1.1 Bernoulli Random Variables (1/2)
A Bernoulli random variable is used to model
- the outcome of a coin toss,
- whether a valve is open or shut,
- whether an item is defective or not,
- any other process that has only two possible outcomes.
The outcomes are labeled 0 and 1. The random variable is defined by the parameter p, 0 ≤ p ≤ 1, which is the probability that the outcome is 1.

3.1.1 Bernoulli Random Variables (2/2) Expectations: if X is a Bernoulli random variable with parameter p, then E(X) = p and Var(X) = E(X^2) - E(X)^2 = p - p^2 = p(1 - p).

3.1.2 Definition of the Binomial Distribution (1/5) Consider an experiment consisting of n Bernoulli trials X1, ..., Xn that are independent and that each have a constant probability p of success. Then the total number of successes X = X1 + ... + Xn is a random variable that has a binomial distribution with parameters n and p, which is written X ~ B(n, p).

3.1.2 Definition of the Binomial Distribution (2/5) The probability mass function of a B(n, p) random variable is P(X = x) = C(n, x) p^x (1 - p)^(n-x) for x = 0, 1, ..., n, with the binomial coefficient C(n, x) = n!/(x!(n - x)!). The expectation and variance are E(X) = np and Var(X) = np(1 - p).
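The pmf above translates directly into a few lines of Python. This is a minimal sketch, not part of the slides; the helper names are illustrative and math.comb requires Python 3.8+.

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = C(n, x) * p**x * (1 - p)**(n - x) for X ~ B(n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binomial_mean(n, p):
    return n * p            # E(X) = np

def binomial_var(n, p):
    return n * p * (1 - p)  # Var(X) = np(1 - p)

if __name__ == "__main__":
    n, p = 8, 0.5
    # The probabilities sum to 1 and are symmetric about n/2 = 4 (see the next slides).
    print([round(binomial_pmf(x, n, p), 3) for x in range(n + 1)])
    print(sum(binomial_pmf(x, n, p) for x in range(n + 1)))
```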

3.1.2 Definition of the Binomial Distribution (3/5) Ex) X ~ B(8, 0.5)

3.1.2 Definition of the Binomial Distribution (4/5) Ex) X ~ B(8, 0.5)

x           0      1      2      3      4      5      6      7      8
P(X = x)    0.004  0.031  0.109  0.219  0.273  0.219  0.109  0.031  0.004
P(X <= x)   0.004  0.035  0.144  0.363  0.636  0.855  0.965  0.996  1.000

3.1.2 Definition of the Binomial Distribution (5/5) Symmetric Binomial Distributions: A B(n, 0.5) distribution is a symmetric probability distribution for any value of the parameter n. The distribution is symmetric about the expected value n/2.

Example 24 : Air Force Scrambles (1/3) There are 16 planes, and there is a probability of 0.25 that the engines of a particular plane do not start at a given attempt. The number of planes successfully launched then has a binomial distribution with parameters n = 16 and p = 0.75. The expected number of planes launched is E(X) = np = 16 × 0.75 = 12.

Example 24 : Air Force Scrambles (2/3) The variance is Var(X) = np(1 - p) = 16 × 0.75 × 0.25 = 3. The probability that exactly 12 planes scramble successfully is P(X = 12) = C(16, 12) × 0.75^12 × 0.25^4 ≈ 0.225. The probability that at least 14 planes scramble successfully is P(X ≥ 14) = P(X = 14) + P(X = 15) + P(X = 16) ≈ 0.134 + 0.053 + 0.010 = 0.197.
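As a sanity check on the arithmetic in this example, the same quantities can be reproduced with a short script (a sketch only, not from the original slides):

```python
from math import comb

n, p = 16, 0.75  # 16 planes, each launches successfully with probability 0.75
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)

print(n * p)                                          # expected number launched: 12.0
print(n * p * (1 - p))                                # variance: 3.0
print(round(pmf(12), 3))                              # P(X = 12) ~ 0.225
print(round(sum(pmf(x) for x in range(14, 17)), 3))   # P(X >= 14) ~ 0.197
```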

Example 24 : Air Force Scrambles(3/3) 0.000 0.052 0.180 0.208 0.054 0.006 9 7 5 1 3 11 13 15 x Probability 10 8 6 2 4 12 14 16 0.001 0.020 0.110 0.134 0.010 0.225 9 7 5 1 3 11 13 15 10 8 6 2 4 12 14 16 0.000 0.079 0.369 0.802 0.990 0.007 0.001 0.027 0.189 0.936 0.100 0.594

Proportion of successes in Bernoulli trials: Let X ~ B(n, p) and let Y = X/n be the proportion of successes in the n trials. Then E(Y) = p and Var(Y) = p(1 - p)/n, so if n is large the observed proportion of successes is concentrated close to p.

3.2 The Geometric and Negative Binomial Distributions
3.2.1 Definition of the Geometric Distribution (1/2)
The number of trials up to and including the first success in a sequence of independent Bernoulli trials with a constant success probability p has a geometric distribution with parameter p. The probability mass function is P(X = x) = (1 - p)^(x-1) p for x = 1, 2, 3, ...

3.2.1 Definition of the Geometric Distribution (2/2) The cumulative distribution function is P(X ≤ x) = 1 - (1 - p)^x. The expectation is E(X) = 1/p.

Expectation of X: E(X) = 1/p. Variance of X: Var(X) = (1 - p)/p^2.
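A minimal sketch of these geometric formulas in standard-library Python (helper names are illustrative):

```python
def geometric_pmf(x, p):
    """P(X = x) = (1 - p)**(x - 1) * p for x = 1, 2, 3, ..."""
    return (1 - p)**(x - 1) * p

def geometric_cdf(x, p):
    """P(X <= x) = 1 - (1 - p)**x."""
    return 1 - (1 - p)**x

def geometric_mean(p):
    return 1 / p             # E(X) = 1/p

def geometric_var(p):
    return (1 - p) / p**2    # Var(X) = (1 - p)/p^2

# Numerical check: a truncated sum of x * P(X = x) approaches 1/p.
p = 0.3
print(sum(x * geometric_pmf(x, p) for x in range(1, 1001)), geometric_mean(p))
```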

Example 24 : Air Force Scrambles (1/2) If the mechanics are unsuccessful in starting the engines, then they must wait 5 minutes before trying again. The number of attempts needed to start a plane's engine therefore has a geometric distribution with p = 0.75. The probability that the engines start on the third attempt is P(X = 3) = (1 - 0.75)^2 × 0.75 = 0.25^2 × 0.75 ≈ 0.047.

Example 24 : Air Force Scrambles (2/2) Within 10 minutes of the first attempt there is time for at most three attempts, so the probability that the plane is launched within 10 minutes of the first attempt to start the engines is P(X ≤ 3) = 1 - (1 - 0.75)^3 = 1 - 0.25^3 ≈ 0.984. The expected number of attempts required to start the engines is E(X) = 1/p = 1/0.75 ≈ 1.33.
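The three numbers in this example can be reproduced directly (a sketch, assuming the geometric model above with p = 0.75):

```python
p = 0.75  # probability that the engines start on any single attempt

print(round((1 - p)**2 * p, 3))   # starts on the third attempt: ~0.047
print(round(1 - (1 - p)**3, 3))   # launched within three attempts (10 minutes): ~0.984
print(round(1 / p, 2))            # expected number of attempts: 1.33
```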

3.2.2 Definition of the Negative Binomial Distribution (1/2) The number of trials up to and including the rth success in a sequence of independent Bernoulli trials with a constant success probability p has a negative binomial distribution with parameters p and r. The probability mass function is P(X = x) = C(x - 1, r - 1) (1 - p)^(x-r) p^r for x = r, r + 1, r + 2, ...

3.2.2 Definition of the Negative Binomial Distribution (2/2) The expectation and variance are E(X) = r/p and Var(X) = r(1 - p)/p^2.
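A short sketch of the negative binomial pmf and moments as defined above. Note that this convention counts the total number of trials up to and including the rth success; some libraries count the number of failures instead, so the helper is written out explicitly (names are illustrative):

```python
from math import comb

def neg_binomial_pmf(x, r, p):
    """P(X = x) = C(x - 1, r - 1) * (1 - p)**(x - r) * p**r for x = r, r + 1, ..."""
    return comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

def neg_binomial_mean(r, p):
    return r / p                  # E(X) = r/p

def neg_binomial_var(r, p):
    return r * (1 - p) / p**2     # Var(X) = r(1 - p)/p^2

# Sanity check: the probabilities over x = r, r + 1, ... sum to (essentially) 1.
r, p = 3, 0.6
print(sum(neg_binomial_pmf(x, r, p) for x in range(r, 200)))
```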

Example 12 : Personnel Recruitment (1/2) Suppose that a company wishes to hire three new workers and that each applicant interviewed has a probability of 0.6 of being found acceptable. The total number of applicants that the company needs to interview then has a negative binomial distribution with parameters p = 0.6 and r = 3. The probability that exactly six applicants need to be interviewed is P(X = 6) = C(5, 2) × 0.4^3 × 0.6^3 ≈ 0.138.

Example 12 : Personnel Recruitment (2/2) If the company has a budget that allows up to six applicants to be interviewed, then the probability that the budget is sufficient is P(X ≤ 6) = P(X = 3) + P(X = 4) + P(X = 5) + P(X = 6) = 0.216 + 0.259 + 0.207 + 0.138 ≈ 0.821. The expected number of interviews required is E(X) = r/p = 3/0.6 = 5.
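Checking the interview example numerically (a sketch using the trial-counting convention above):

```python
from math import comb

r, p = 3, 0.6  # need 3 acceptable applicants; each is acceptable with probability 0.6
pmf = lambda x: comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

print(round(pmf(6), 3))                             # exactly six interviews: ~0.138
print(round(sum(pmf(x) for x in range(3, 7)), 3))   # at most six interviews: ~0.821
print(r / p)                                        # expected number of interviews: 5.0
```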

3.3 The Hypergeometric Distribution
3.3.1 Definition of the Hypergeometric Distribution (1/3)
Consider a collection of N items of which r are of a certain kind. If one of the items is chosen at random, the probability that it is of the special kind is clearly r/N. Consequently, if n items are chosen at random with replacement, then the distribution of X, the number of items of the special kind that are chosen, is clearly B(n, r/N).

3.3.1 Definition of the Hypergeometric Distribution (2/3) However, if n items are chosen at random without replacement, then the distribution of X is the hypergeometric distribution. The hypergeometric distribution has a probability mass function given by P(X = x) = C(r, x) C(N - r, n - x) / C(N, n) for max(0, n - N + r) ≤ x ≤ min(n, r).

3.3.1 Definition of the Hypergeometric Distribution (3/3) The expectation and variance are E(X) = nr/N and Var(X) = n (r/N)(1 - r/N)(N - n)/(N - 1). The distribution represents the number of items of a certain kind in a random sample of size n drawn without replacement from a population of size N that contains r items of this kind.
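A minimal sketch of the hypergeometric pmf and moments in the (N, r, n) parametrization used above (helper names are illustrative):

```python
from math import comb

def hypergeom_pmf(x, N, r, n):
    """P(X = x): x special items in a sample of n drawn without replacement
    from a population of N items that contains r special items."""
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

def hypergeom_mean(N, r, n):
    return n * r / N                                       # E(X) = nr/N

def hypergeom_var(N, r, n):
    return n * (r / N) * (1 - r / N) * (N - n) / (N - 1)   # finite-population variance

# The support runs from max(0, n - (N - r)) to min(n, r); the probabilities sum to 1.
N, r, n = 16, 6, 5
support = range(max(0, n - (N - r)), min(n, r) + 1)
print(sum(hypergeom_pmf(x, N, r, n) for x in support))
```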

Let X and Y be independent binomial random variables with X ~ B(n1, p) and Y ~ B(n2, p), so that X + Y ~ B(n1 + n2, p). Let us consider the conditional probability mass function of X given that X + Y = n: P(X = k | X + Y = n) = P(X = k) P(Y = n - k) / P(X + Y = n) = C(n1, k) C(n2, n - k) / C(n1 + n2, n), which does not depend on p. That is, the conditional distribution of X given the value of X + Y is hypergeometric!
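This identity is easy to confirm numerically. The sketch below uses arbitrary illustrative values of n1, n2, p, and n, and compares the conditional probabilities of X given X + Y = n with the hypergeometric probabilities:

```python
from math import comb

n1, n2, p, n = 6, 8, 0.3, 5   # arbitrary values for illustration

b = lambda k, m: comb(m, k) * p**k * (1 - p)**(m - k)   # B(m, p) pmf

# Conditional probabilities P(X = k | X + Y = n), using independence of X and Y.
denom = sum(b(j, n1) * b(n - j, n2) for j in range(n + 1))
cond = [b(k, n1) * b(n - k, n2) / denom for k in range(n + 1)]

# Hypergeometric probabilities C(n1, k) C(n2, n - k) / C(n1 + n2, n).
hyper = [comb(n1, k) * comb(n2, n - k) / comb(n1 + n2, n) for k in range(n + 1)]

print(all(abs(c - h) < 1e-12 for c, h in zip(cond, hyper)))   # True: p has dropped out
```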

Expectation of X: Let Xi be the following random variable: Xi = 1 if the ith selection is acceptable and Xi = 0 otherwise, so that P(Xi = 1) = r/N. Let X be the hypergeometric random variable with parameters (r, N, n). Then X = X1 + ... + Xn. This gives E(X) = E(X1) + ... + E(Xn) = n r/N. Variance of X: Var(X) = Σi Var(Xi) + 2 Σi<j Cov(Xi, Xj).

Since Xi is a Bernoulli random variable, Var(Xi) = (r/N)(1 - r/N). Also, for i < j, Cov(Xi, Xj) = E(Xi Xj) - E(Xi)E(Xj) = r(r - 1)/(N(N - 1)) - (r/N)^2 = -(r/N)(1 - r/N)/(N - 1). Hence, Var(X) = n(r/N)(1 - r/N) - n(n - 1)(r/N)(1 - r/N)/(N - 1) = n(r/N)(1 - r/N)(N - n)/(N - 1). Cf. Let p = r/N. Then E(X) = np and Var(X) = np(1 - p)(N - n)/(N - 1). As N goes to infinity, Var(X) tends to np(1 - p), the variance of the B(n, p) distribution.

Example 17 : Milk Container Contents (1/3) Suppose that milk is shipped to retail outlets in boxes that hold 16 milk containers. One particular box, which happens to contain six underweight containers, is opened for inspection, and five containers are chosen at random. The number of underweight milk containers in the sample chosen by the inspector then has a hypergeometric distribution with N = 16, r = 6, and n = 5.

Example 17 : Milk Container Contents (2/3) The probability that the inspector chooses exactly two underweight containers is P(X = 2) = C(6, 2) C(10, 3) / C(16, 5) = (15 × 120)/4368 ≈ 0.412. The expected number of underweight containers chosen by the inspector is E(X) = nr/N = 5 × 6/16 = 1.875.
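The same two numbers, computed directly (a sketch with this example's N = 16, r = 6, n = 5):

```python
from math import comb

N, r, n = 16, 6, 5   # 16 containers, 6 underweight, sample of 5

pmf = lambda x: comb(r, x) * comb(N - r, n - x) / comb(N, n)

print(round(pmf(2), 3))   # exactly two underweight containers: ~0.412
print(n * r / N)          # expected number of underweight containers: 1.875
```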

Example 17 : Milk Container Contents(3/3) 0.058 0.034 0.002 0.206 0.412 0.288 4 3 2 1 5 Number of underweight milk containers found by inspector Probability

3.4 The Poisson Distribution
3.4.1 Definition of the Poisson Distribution (1/3)
The Poisson distribution models the distribution of
- the number of defects in an item,
- the number of radioactive particles emitted by a substance,
- the number of telephone calls received by an operator within a certain time limit,
that is, the number of "events" that occur within certain specified boundaries.

3.4.1 Definition of the Poisson Distribution (2/3) A random variable X distributed as a Poisson random variable with parameter λ, which is written X ~ P(λ), has a probability mass function P(X = x) = e^(-λ) λ^x / x! for x = 0, 1, 2, ...

3.4.1 Definition of the Poisson Distribution (3/3) The Poisson distribution is often useful for modeling the number of times that a certain event occurs per unit of time, distance, or volume, and it has a mean and variance both equal to the parameter value λ: E(X) = λ and Var(X) = λ.
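A minimal sketch of the Poisson pmf, with a truncated-sum check that the mean and variance both come out close to λ (helper names are illustrative):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = exp(-lam) * lam**x / x! for x = 0, 1, 2, ..."""
    return exp(-lam) * lam**x / factorial(x)

lam = 3.0
xs = range(100)   # truncation; the tail beyond x = 100 is negligible for lam = 3
mean = sum(x * poisson_pmf(x, lam) for x in xs)
var = sum((x - mean)**2 * poisson_pmf(x, lam) for x in xs)
print(round(mean, 6), round(var, 6))   # both ~3.0
```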

The Poisson random variable can be used as an approximation to a binomial random variable with parameters (n, p) when n is large and p is small. Let X be a binomial random variable with parameters (n, p) and let λ = np. Then P(X = x) = C(n, x) p^x (1 - p)^(n-x) = [n!/(x!(n - x)!)] (λ/n)^x (1 - λ/n)^(n-x). For large n and small p, (1 - λ/n)^n ≈ e^(-λ), (1 - λ/n)^(-x) ≈ 1, and n!/((n - x)! n^x) ≈ 1. Hence, P(X = x) ≈ e^(-λ) λ^x / x!, which is the P(λ) probability mass function.
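The quality of this approximation is easy to see numerically. The sketch below compares B(n, p) and P(np) probabilities for one illustrative choice of a large n and a small p:

```python
from math import comb, exp, factorial

n, p = 500, 0.01      # large n, small p (illustrative values)
lam = n * p           # lambda = np = 5

binom = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)
poisson = lambda x: exp(-lam) * lam**x / factorial(x)

for x in range(11):
    print(x, round(binom(x), 4), round(poisson(x), 4))   # the two columns nearly agree
```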

Example 3 : Software Errors (1/3) Suppose that the number of errors in a piece of software has a Poisson distribution with parameter λ = 3. Then [the expected number of errors] = [the variance in the number of errors] = 3. The probability that a piece of software has no errors is P(X = 0) = e^(-3) ≈ 0.050.

Example 3 : Software Errors (2/3) The probability that there are three or more errors in a piece of software is P(X ≥ 3) = 1 - P(X = 0) - P(X = 1) - P(X = 2) = 1 - e^(-3)(1 + 3 + 4.5) ≈ 1 - 0.423 = 0.577.
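Reproducing the two probabilities in this example (a sketch with λ = 3):

```python
from math import exp

lam = 3.0
print(round(exp(-lam), 3))                          # no errors: ~0.050
p_at_most_two = exp(-lam) * (1 + lam + lam**2 / 2)  # P(X <= 2)
print(round(1 - p_at_most_two, 3))                  # three or more errors: ~0.577
```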

Example 3 : Software Errors(3/3) 0.101 0.050 0.168 0.001 0.008 9 7 5 1 3 11 13 Number of software errors Probability 10 8 6 2 4 12 0.022 0.003 0.149 0.224 9 7 5 1 3 11 13 10 8 6 2 4 12 0.101 0.050 0.168 0.001 0.008 0.022 0.003 0.149 0.224 0.000

3.5 The Multinomial Distribution
3.5.1 Definition of the Multinomial Distribution (1/2)
Consider a sequence of n independent trials where each individual trial can have k outcomes that occur with constant probability values p1, ..., pk with p1 + ... + pk = 1. The random variables X1, ..., Xk that count the number of occurrences of each outcome are said to have a multinomial distribution. The joint probability mass function is P(X1 = x1, ..., Xk = xk) = [n!/(x1! ... xk!)] p1^x1 ... pk^xk for nonnegative integer values of the xi satisfying x1 + ... + xk = n.

3.5.1 Definition of the Multinomial Distribution (2/2) The random variables X1, ..., Xk have expectations and variances given by E(Xi) = n pi and Var(Xi) = n pi(1 - pi), but they are not independent. (Why? Because X1 + ... + Xk = n, so the counts constrain one another.)
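A minimal sketch of the multinomial joint pmf as defined above (helper name is illustrative, standard-library Python only):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(X1 = x1, ..., Xk = xk) = n!/(x1! ... xk!) * p1**x1 * ... * pk**xk."""
    n = sum(counts)
    coef = factorial(n)
    prob = 1.0
    for x, p in zip(counts, probs):
        coef //= factorial(x)   # exact at every step: each quotient is an integer
        prob *= p**x
    return coef * prob

# Each Xi on its own is B(n, pi), so E(Xi) = n*pi and Var(Xi) = n*pi*(1 - pi),
# but the counts are not independent because they must sum to n.
print(round(multinomial_pmf([3, 5, 2], [0.2, 0.5, 0.3]), 4))   # ~0.0567
```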

Example 1 : Machine Breakdowns (1/4) Suppose that the machine breakdowns are attributable to electrical faults, mechanical faults, and operator misuse, and these causes occur with probabilities of 0.2, 0.5, and 0.3, respectively. The engineer is interested in predicting the causes of the next ten breakdowns.
X1: the number of breakdowns due to electrical reasons.
X2: the number of breakdowns due to mechanical reasons.
X3: the number of breakdowns due to operator misuse.

Example 1 : Machine Breakdowns (2/4) X1 + X2 + X3 = 10. If the breakdown causes can be assumed to be independent of one another, then the joint probability mass function is P(X1 = x1, X2 = x2, X3 = x3) = [10!/(x1! x2! x3!)] 0.2^x1 0.5^x2 0.3^x3. The probability that there will be three electrical breakdowns, five mechanical breakdowns, and two misuse breakdowns is P(X1 = 3, X2 = 5, X3 = 2) = [10!/(3! 5! 2!)] × 0.2^3 × 0.5^5 × 0.3^2 = 2520 × 0.008 × 0.03125 × 0.09 ≈ 0.057.

Example 1 : Machine Breakdowns (3/4) The expected number of electrical breakdowns is E(X1) = 10 × 0.2 = 2. The expected number of mechanical breakdowns is E(X2) = 10 × 0.5 = 5. The expected number of misuse breakdowns is E(X3) = 10 × 0.3 = 3.

Example 1 : Machine Breakdowns (4/4) If the engineer is interested in the probability of there being no more than two electrical breakdowns, then this can be calculated by noting that X1 ~ B(10, 0.2), so that P(X1 ≤ 2) = P(X1 = 0) + P(X1 = 1) + P(X1 = 2) ≈ 0.678.
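The multinomial and marginal-binomial calculations in this example can be checked with a few lines (a sketch only):

```python
from math import comb, factorial

# Joint probability of 3 electrical, 5 mechanical, and 2 misuse breakdowns.
coef = factorial(10) // (factorial(3) * factorial(5) * factorial(2))
print(round(coef * 0.2**3 * 0.5**5 * 0.3**2, 3))   # ~0.057

# The marginal distribution of X1 is B(10, 0.2), so no more than two electrical breakdowns:
print(round(sum(comb(10, x) * 0.2**x * 0.8**(10 - x) for x in range(3)), 3))   # ~0.678
```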