Some Discrete Probability Distributions Part 2

Presentation transcript:

Some Discrete Probability Distributions, Part 2
EGR 252, Walpole Chapter 5 (JMB, rev. 2018, 9th ed.)

Multinomial Experiments
What if there are more than 2 possible outcomes (e.g., acceptable, scrap, rework)? That is, suppose we have n independent trials and k outcomes that are mutually exclusive (e.g., ♠, ♣, ♥, ♦) and exhaustive (i.e., the k probabilities sum to 1, Σ pi = 1). Then
f(x1, x2, …, xk; p1, p2, …, pk, n) = (n choose x1, x2, …, xk) p1^x1 p2^x2 … pk^xk = [n! / (x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk, where x1 + x2 + … + xk = n.

Multinomial Examples
Example 5.7 (see page 150); Problem 22, page 152. Convert the ratio 8:4:4 to probabilities: p1 = 8/16 = 0.50, p2 = 4/16 = 0.25, p3 = 4/16 = 0.25. With x1 = 5, x2 = 2, x3 = 1, and n = 8:
f(5, 2, 1; 0.5, 0.25, 0.25, 8) = (8 choose 5, 2, 1)(0.5)^5(0.25)^2(0.25)^1 = [8!/(5! 2! 1!)](0.5)^5(0.25)^2(0.25)^1 = 21/256 ≈ 0.0820
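
As a quick check of the arithmetic above, the same probability can be computed numerically. This is a minimal sketch (not part of the original slides) assuming SciPy is available; scipy.stats.multinomial implements the pmf directly, and the variable names are illustrative.

```python
from math import factorial

from scipy.stats import multinomial

# Counts of each outcome in n = 8 trials and their probabilities
x = [5, 2, 1]
p = [0.50, 0.25, 0.25]
n = sum(x)  # 8

# Direct evaluation: f(x1, x2, x3; p1, p2, p3, n) = n!/(x1! x2! x3!) * p1^x1 * p2^x2 * p3^x3
coef = factorial(n) // (factorial(x[0]) * factorial(x[1]) * factorial(x[2]))
prob = coef * p[0] ** x[0] * p[1] ** x[1] * p[2] ** x[2]
print(prob)                          # 0.08203125  (= 21/256)

# Same result from SciPy's multinomial distribution
print(multinomial.pmf(x, n=n, p=p))  # ≈ 0.0820
```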

Binomial vs. Hypergeometric Distribution: Replacement and Independence
The binomial distribution assumes sampling "with replacement," while the hypergeometric assumes sampling "without replacement." Consequently, the binomial assumes independent trials, while the hypergeometric does not. The hypergeometric gives the probability of getting x successes in the sample, given k successes in the lot.

Hypergeometric Example (from Complete Business Statistics, 4th ed., McGraw-Hill)
Automobiles arrive at a dealership in lots of 10, and 5 out of each 10 are inspected. For one lot, it is known that 2 of the 10 do not meet prescribed safety standards. What is the probability that at least 1 of the 5 tested from that lot will be found not to meet safety standards? This example follows a hypergeometric distribution: a random sample of size n is selected without replacement from N items, k of the N items are classified as "successes" and N - k as "failures," and the probability of getting x successes in the sample (given k successes in the lot) is
h(x; N, n, k) = (k choose x)(N - k choose n - x) / (N choose n)

Solution: Hypergeometric Example
In our example: k = number of "successes" in the lot = 2, n = sample size = 5, N = lot size = 10, and x = number found = 1 or 2.
P(X ≥ 1) = h(1; 10, 5, 2) + h(2; 10, 5, 2) = 0.556 + 0.222 = 0.778
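
The same result can be reproduced numerically. A minimal sketch assuming SciPy's scipy.stats.hypergeom, whose arguments are (in order) the population size, the number of successes in the population, and the sample size.

```python
from scipy.stats import hypergeom

N, k, n = 10, 2, 5  # lot size, nonconforming cars in the lot, sample size

# P(X >= 1) = h(1; 10, 5, 2) + h(2; 10, 5, 2)
p_at_least_one = hypergeom.pmf(1, N, k, n) + hypergeom.pmf(2, N, k, n)
print(round(p_at_least_one, 3))            # 0.778

# Equivalently, via the survival function: P(X >= 1) = P(X > 0)
print(round(hypergeom.sf(0, N, k, n), 3))  # 0.778
```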

Expectations: Hypergeometric Distribution
The mean and variance of the hypergeometric distribution are given by
μ = nk/N and σ² = [(N - n)/(N - 1)] n (k/N)(1 - k/N)
What is the expected number of cars that fail inspection in our example? What is the standard deviation?
μ = nk/N = 5*2/10 = 1
σ² = (5/9)(5*2/10)(1 - 2/10) = 0.444, so σ = 0.667
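
A small numeric check of the mean and variance, again assuming scipy.stats.hypergeom (its stats method returns the mean and variance directly).

```python
from scipy.stats import hypergeom

N, k, n = 10, 2, 5  # lot size, successes in the lot, sample size

# Closed-form expressions from the slide
mean = n * k / N
var = (N - n) / (N - 1) * n * (k / N) * (1 - k / N)
print(mean, round(var, 3))                 # 1.0 0.444

# SciPy check (argument order: population size, successes, sample size)
mu, sigma2 = hypergeom.stats(N, k, n, moments='mv')
print(float(mu), round(float(sigma2), 3))  # 1.0 0.444
```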

Additional problems …
A worn machine tool produced defective parts for a period of time before the problem was discovered. Normal sampling of each lot of 20 parts involves testing 6 parts and rejecting the lot if 2 or more are defective. Suppose a lot from the worn tool contains 3 defective parts (N = 20, n = 6, k = 3).
What is the expected number of defective parts in a sample of six from the lot? μ = nk/N = 6*3/20 = 18/20 = 0.9
What is the expected variance? σ² = (14/19)(6*3/20)(1 - 3/20) = 0.5637
What is the probability that the lot will be rejected? P(X ≥ 2) = P(2) + P(3) = [(3 choose 2)(17 choose 4) + (3 choose 3)(17 choose 3)] / (20 choose 6) = 0.1842 + 0.0175 = 0.2018, or equivalently P(X ≥ 2) = 1 - [P(0) + P(1)] = 1 - 0.7982 = 0.2018
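
A minimal sketch of the rejection-probability calculation, assuming math.comb and scipy.stats.hypergeom.

```python
from math import comb

from scipy.stats import hypergeom

N, k, n = 20, 3, 6  # lot size, defectives in the lot, sample size

# P(reject) = P(X >= 2) = P(2) + P(3)
p_reject = (comb(3, 2) * comb(17, 4) + comb(3, 3) * comb(17, 3)) / comb(20, 6)
print(round(p_reject, 4))                  # 0.2018

# Same value via the survival function: sf(1) = P(X > 1) = P(X >= 2)
print(round(hypergeom.sf(1, N, k, n), 4))  # 0.2018
```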

Binomial Approximation
Note: if N >> n, we can approximate the hypergeometric distribution with the binomial distribution. Example: automobiles arrive at a dealership in lots of 100, 5 out of each 100 are inspected, and 2 out of every 10 (p = 0.2) are below safety standards. What is the probability that at least 1 of the 5 inspected will not meet safety standards? Recall P(X ≥ 1) = 1 - P(X < 1) = 1 - P(X = 0).
Hypergeometric distribution: h(0; 100, 5, 20) = (20 choose 0)(80 choose 5)/(100 choose 5) = 0.3193, so P(X ≥ 1) = 1 - 0.3193 = 0.6807
Binomial distribution (Table A.1, n = 5, p = 0.2): b(0; 5, 0.2) = 0.3277, so P(X ≥ 1) = 1 - 0.3277 = 0.6723
Note: if N = 200, p = 0.2, and n = 5, the hypergeometric distribution yields P(X ≥ 1) = 0.676 (in Excel, 0.676 = 1 - HYPGEOMDIST(0,5,40,200)). The binomial approximation gets very close to the hypergeometric value as N becomes large relative to n. (See also Example 5.12, pp. 155-156.)
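
A quick comparison of the exact hypergeometric probabilities with the binomial approximation, assuming scipy.stats.hypergeom and scipy.stats.binom.

```python
from scipy.stats import binom, hypergeom

n, p = 5, 0.2  # sample size and defective fraction

for N in (100, 200):                   # lot sizes; defectives in the lot = p * N
    k = int(p * N)
    exact = hypergeom.sf(0, N, k, n)   # P(X >= 1), exact hypergeometric
    approx = binom.sf(0, n, p)         # P(X >= 1), binomial approximation
    print(N, round(exact, 4), round(approx, 4))

# N = 100: exact ≈ 0.6807, approximation 0.6723
# N = 200: exact ≈ 0.676,  approximation 0.6723
```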

Negative Binomial Distribution b*
A binomial experiment in which trials are repeated until a fixed number of successes occurs. Example: historical data indicate that 30% of all bits transmitted through a digital transmission channel are received in error. An engineer is running an experiment to try to classify these errors and will start by gathering data on the first 10 errors encountered. What is the probability that the 10th error will occur on the 25th trial?

Negative Binomial Equation
This example follows a negative binomial distribution: repeated independent trials, probability of success = p and probability of failure = q = 1 - p, and the random variable X is the number of the trial on which the kth success occurs. The probability that the kth success occurs on trial x is
b*(x; k, p) = (x - 1 choose k - 1) p^k q^(x - k), x = k, k + 1, k + 2, …
where k = "success number," x = trial number on which the kth success occurs, p = probability of success (error), and q = 1 - p.

Example: Negative Binomial Distribution
What is the probability that the 10th error will occur on the 25th trial? Here k = "success number" = 10, x = trial number on which the kth success occurs = 25, p = probability of success (error) = 0.3, and q = 1 - p = 0.7.
b*(25; 10, 0.3) = (24 choose 9)(0.3)^10(0.7)^15 = 0.0367
In Excel, 0.03665 = NEGBINOMDIST(15,10,0.3); note that Excel is parameterized by the number of failures (x - k = 15) rather than the trial number.
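
The same value can be reproduced either from the formula or with SciPy; a minimal sketch, noting that scipy.stats.nbinom (like Excel's NEGBINOMDIST) is parameterized by the number of failures before the kth success.

```python
from math import comb

from scipy.stats import nbinom

k, x, p = 10, 25, 0.3  # successes required, trial number, P(success)

# Direct formula: b*(x; k, p) = C(x-1, k-1) * p^k * (1-p)^(x-k)
prob = comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)
print(round(prob, 4))                     # 0.0367

# SciPy counts failures before the k-th success, i.e. x - k = 15
print(round(nbinom.pmf(x - k, k, p), 4))  # 0.0367
```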

Geometric Distribution
Continuing with our example in which p = probability of success (error) = 0.3: what is the probability that the 1st bit received in error will occur on the 5th trial? This is an example of the geometric distribution, which is the special case of the negative binomial in which k = 1. The probability that the 1st success occurs on trial x is
g(x; p) = p q^(x - 1), x = 1, 2, 3, …
so P = (0.3)(0.7)^4 = 0.072 (in Excel, 0.07203 = NEGBINOMDIST(4,1,0.3)).
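
A corresponding check for the geometric case, assuming scipy.stats.geom (whose pmf is indexed by the trial on which the first success occurs).

```python
from scipy.stats import geom

p = 0.3  # P(bit received in error)

# P(first error on the 5th trial) = p * (1 - p)^4
print(round(p * (1 - p) ** 4, 5))  # 0.07203

# Same value from SciPy
print(round(geom.pmf(5, p), 5))    # 0.07203
```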

Additional problems …
A worn machine tool produces 1% defective parts. If we assume that parts produced are independent:
What is the probability that the 2nd defective part will be the 6th one produced? (Negative binomial, with x = 6 and k = 2.) b*(6; 2, 0.01) = (5 choose 1)(0.01)^2(0.99)^4 = 0.00048
What is the probability that the 1st defective part will be seen before 3 are produced? (Geometric.) P(X < 3) = P(X = 1) + P(X = 2) = (0.01)(0.99)^0 + (0.01)(0.99)^1 = 0.0199 (in Excel, 0.01 = NEGBINOMDIST(0,1,0.01) and 0.0099 = NEGBINOMDIST(1,1,0.01))
How many parts can we expect to produce before we see the 1st defective part? (Geometric; Theorem 5.3, pg. 160: μ = 1/p.) Expected value μ = 1/p = 1/0.01 = 100
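
The three answers can be checked numerically; a minimal sketch assuming scipy.stats.nbinom and scipy.stats.geom.

```python
from scipy.stats import geom, nbinom

p = 0.01  # defective rate

# Problem 1: 2nd defective on the 6th part produced (x = 6, k = 2);
# nbinom is indexed by failures before the k-th success, here x - k = 4
print(round(nbinom.pmf(4, 2, p), 5))  # 0.00048

# Problem 2: first defective seen before 3 parts are produced, P(X < 3) = P(X <= 2)
print(round(geom.cdf(2, p), 4))       # 0.0199

# Problem 3: expected number of parts until the first defective, 1/p
print(geom.mean(p))                   # 100.0
```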

Poisson Process
The number of occurrences in a given interval or region, with the following properties: the process is "memoryless," i.e., the number in one interval is independent of the number in a different interval; the probability of an occurrence during a very short interval or small region is proportional to the size of the interval and does not depend on the number occurring outside that region or interval; and the probability of more than one occurrence in a very short interval is negligible.

Poisson Process Situations
Number of bits transmitted per minute. Number of calls to customer service in an hour. Number of bacteria present in a given sample. Number of hurricanes per year in a given region.

Poisson Distribution Probabilities
The probability of x occurrences in a given period of time or region is
p(x; λt) = e^(-λt) (λt)^x / x!, x = 0, 1, 2, …
where λ = average number of outcomes per unit time or region and t = the size of the time interval or region.
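
A minimal sketch of this pmf in code, using the service-call rate from the next slide as an example; it assumes SciPy (scipy.stats.poisson), and the helper function name is illustrative.

```python
from math import exp, factorial

from scipy.stats import poisson

def poisson_pmf(x: int, lam: float, t: float = 1.0) -> float:
    """P(x occurrences in an interval of size t) for a rate of lam per unit."""
    mu = lam * t
    return exp(-mu) * mu**x / factorial(x)

# Example: rate of 2.7 calls per minute, probability of exactly 2 calls in 1 minute
print(round(poisson_pmf(2, 2.7), 4))  # 0.245
print(round(poisson.pmf(2, 2.7), 4))  # same value from SciPy
```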

Service Call Example - Poisson Process
An average of 2.7 service calls per minute is received at a particular maintenance center, and the calls correspond to a Poisson process. To determine the personnel and equipment needed to maintain a desired level of service, the plant manager needs to be able to determine the probabilities associated with various numbers of service calls. Here λ = 2.7 and t = 1 minute.

Our Example: λ = 2.7 and t = 1 minute
What is the probability that fewer than 2 calls will be received in any given minute?
P(X < 2) = P(X = 0) + P(X = 1) = e^(-2.7)(2.7)^0/0! + e^(-2.7)(2.7)^1/1! = 0.2487
The mean and variance are both λt, so μ = λt = 2.7(1) = 2.7. Note: Table A.2, pp. 732-734, gives the cumulative sums Σ p(x; μ) for x = 0 to r.
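
A numeric check of this calculation, assuming scipy.stats.poisson.

```python
from scipy.stats import poisson

mu = 2.7 * 1  # lambda * t

# P(X < 2) = P(X = 0) + P(X = 1), or equivalently the cdf evaluated at 1
print(round(poisson.pmf(0, mu) + poisson.pmf(1, mu), 4))  # 0.2487
print(round(poisson.cdf(1, mu), 4))                       # 0.2487
```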

Service Call Example (Part 2)
If more than 6 calls are received in a 3-minute period, an extra service technician will be needed to maintain the desired level of service. What is the probability of that happening?
μ = λt = (2.7)(3) = 8.1. Since 8.1 is not in the table, we must either use the basic equation or approximate with λt ≈ 8. From Table A.2 (page 733 in the 9th ed.) with μ = 8 and r = 6, P(X ≤ 6) = 0.3134, so
P(X > 6) = 1 - P(X ≤ 6) = 1 - 0.3134 = 0.6866
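
Computing the same probability with the exact mean of 8.1 rather than the rounded table value of 8; a minimal sketch assuming scipy.stats.poisson.

```python
from scipy.stats import poisson

mu = 2.7 * 3  # lambda * t = 8.1

# P(X > 6) = 1 - P(X <= 6); the survival function sf(6) gives this directly
print(round(poisson.sf(6, mu), 4))   # ≈ 0.699 with the exact mean of 8.1
print(round(poisson.sf(6, 8.0), 4))  # 0.6866, matching the Table A.2 value for mu = 8
```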

Poisson Distribution