Multinomial Experiments


Multinomial Experiments
What if there are more than 2 possible outcomes? (e.g., acceptable, scrap, rework)
That is, suppose we have:
n independent trials
k outcomes that are mutually exclusive (e.g., ♠, ♣, ♥, ♦) and exhaustive (i.e., Σ over all k of pi = 1)
Then the probability of observing x1 outcomes of type 1, x2 of type 2, …, xk of type k is
f(x1, x2, …, xk; p1, p2, …, pk, n) = [n!/(x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk, where Σ xi = n.
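As a quick sanity check, here is a minimal Python sketch of this pmf using only the standard library (the helper name multinomial_pmf is ours, not from the text):

```python
from math import factorial, prod

def multinomial_pmf(x, p):
    """f(x1,...,xk; p1,...,pk, n): probability of observing the counts in x
    given category probabilities p, with n = sum(x) independent trials."""
    n = sum(x)
    coeff = factorial(n) // prod(factorial(xi) for xi in x)   # n!/(x1!...xk!)
    return coeff * prod(pi**xi for pi, xi in zip(p, x))
```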

Problem 22, pg. 152
Convert the ratio 8:4:4 to probabilities: p1 = 0.50, p2 = 0.25, p3 = 0.25. With n = 8 trials and observed counts x1 = 5, x2 = 2, x3 = 1:
f(5, 2, 1; 0.5, 0.25, 0.25, 8) = (8 choose 5, 2, 1)(0.5)^5(0.25)^2(0.25)^1
= [8!/(5! 2! 1!)](0.5)^5(0.25)^2(0.25)^1 = 21/256 ≈ 0.0820
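The same answer can be reproduced with scipy.stats.multinomial (assuming SciPy is available; this is only a cross-check of the hand calculation above):

```python
from scipy.stats import multinomial

# Problem 22: ratio 8:4:4 -> p = (0.5, 0.25, 0.25), n = 8 trials
prob = multinomial.pmf([5, 2, 1], n=8, p=[0.5, 0.25, 0.25])
print(prob)          # 0.08203125
print(21 / 256)      # same value as the hand calculation
```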

Binomial vs. Hypergeometric Distribution: Replacement and Independence
The binomial distribution assumes sampling with replacement, while the hypergeometric assumes sampling without replacement. Consequently, the binomial assumes independent trials and the hypergeometric does not. The hypergeometric gives the probability of getting x successes in the sample, given k successes in the lot.

Hypergeometric Example
Example from Complete Business Statistics, 4th ed. (McGraw-Hill): Automobiles arrive at a dealership in lots of 10. Five out of each 10 are inspected. For one lot, it is known that 2 out of 10 do not meet prescribed safety standards. What is the probability that at least 1 of the 5 tested from that lot will be found not meeting safety standards?
This situation follows a hypergeometric distribution: a random sample of size n is selected without replacement from N items, k of which are classified as "successes" and N − k as "failures." The probability of getting x successes in the sample is
h(x; N, n, k) = (k choose x)(N − k choose n − x) / (N choose n)

Solution: Hypergeometric Example
In our example: k = number of "successes" in the lot = 2, n = sample size = 5, N = lot size = 10, and x = number found = 1 or 2.
P(X ≥ 1) = P(X = 1) + P(X = 2) = 0.556 + 0.222 = 0.778
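A short cross-check with scipy.stats.hypergeom (note that SciPy's parameter order is population size, number of successes in the population, sample size):

```python
from scipy.stats import hypergeom

# Lot of N = 10 cars, k = 2 below standard, sample n = 5 without replacement
p1 = hypergeom.pmf(1, 10, 2, 5)          # P(X = 1) ~ 0.556
p2 = hypergeom.pmf(2, 10, 2, 5)          # P(X = 2) ~ 0.222
print(p1 + p2)                           # ~0.778
print(1 - hypergeom.pmf(0, 10, 2, 5))    # same answer via the complement
```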

Expectations: Hypergeometric Distribution
The mean and variance of the hypergeometric distribution are
μ = nk/N and σ^2 = [(N − n)/(N − 1)] · n(k/N)(1 − k/N)
What is the expected number of cars that fail inspection in our example? What is the standard deviation?
μ = nk/N = 5·2/10 = 1
σ^2 = (5/9)(5·2/10)(1 − 2/10) = 0.444, so σ = 0.667
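These moments can be verified directly, for example with SciPy (a cross-check, not part of the original slides):

```python
from scipy.stats import hypergeom

# Arguments: population size N = 10, successes k = 2, sample size n = 5
mean, var = hypergeom.stats(10, 2, 5, moments="mv")
print(mean)              # 1.0  (= nk/N)
print(var, var**0.5)     # ~0.444, ~0.667
```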

Your turn …
A worn machine tool produced defective parts for a period of time before the problem was discovered. Normal sampling of each lot of 20 parts involves testing 6 parts and rejecting the lot if 2 or more are defective. Suppose a lot from the worn tool contains 3 defective parts. Here N = 20, n = 6, k = 3.
What is the expected number of defective parts in a sample of six from the lot?
μ = nk/N = 6·3/20 = 18/20 = 0.9
What is the variance?
σ^2 = (14/19)(6·3/20)(1 − 3/20) = 0.5637
What is the probability that the lot will be rejected?
P(X ≥ 2) = 1 − [P(0) + P(1)] = 1 − 0.7982 = 0.2018
Equivalently, P(2) + P(3) = (3 choose 2)(17 choose 4)/(20 choose 6) + (3 choose 3)(17 choose 3)/(20 choose 6) = 0.1842 + 0.0175 = 0.2018
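A sketch that reproduces these numbers from the hypergeometric pmf, using only the Python standard library (the helper h is ours):

```python
from math import comb

N, n, k = 20, 6, 3          # lot size, sample size, defectives in the lot

def h(x):
    """Hypergeometric pmf: probability of x defectives in the sample of n."""
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

mu = n * k / N                                            # 0.9
var = (N - n) / (N - 1) * n * (k / N) * (1 - k / N)       # ~0.5637
p_reject = 1 - (h(0) + h(1))                              # P(X >= 2) ~ 0.2018
print(mu, var, p_reject)
print(h(2) + h(3))                                        # same answer, ~0.2018
```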

Binomial Approximation
Note that if N >> n, we can approximate the hypergeometric with the binomial distribution.
For example: Automobiles arrive at a dealership in lots of 100. Five out of each 100 are inspected, and 20% (p = 0.2), i.e., 20 cars per lot, are indeed below safety standards. What is the probability that at least 1 of the 5 inspected will be found not meeting safety standards?
Recall: P(X ≥ 1) = 1 − P(X < 1) = 1 − P(X = 0)
Hypergeometric distribution: h(0; 100, 5, 20) = (20 choose 0)(80 choose 5)/(100 choose 5) = 0.3193, so P(X ≥ 1) = 1 − 0.3193 = 0.6807
Binomial distribution: from Table A.1 with n = 5 and p = 0.2, b(0; 5, 0.2) = 0.3277, so P(X ≥ 1) = 1 − 0.3277 = 0.6723
NOTE: If N = 200 (still with 20% below standard), the hypergeometric distribution yields P(X ≥ 1) = 0.676. Comparing to Example 5.14 (pg. 129), we can see that the binomial approximation gets very close as N gets large relative to n.
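The comparison can be reproduced numerically (a sketch assuming SciPy; the count of 40 defectives for the N = 200 case is our reading of the 20% rate):

```python
from scipy.stats import hypergeom, binom

# N = 100 cars, 20 below standard, sample of n = 5
p_hyper = 1 - hypergeom.pmf(0, 100, 20, 5)       # exact: ~0.6807
p_binom = 1 - binom.pmf(0, 5, 0.2)               # approximation: ~0.6723

# Larger lot, same 20% defective rate: the approximation improves
p_hyper_200 = 1 - hypergeom.pmf(0, 200, 40, 5)   # ~0.676
print(p_hyper, p_binom, p_hyper_200)
```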

Negative Binomial Distribution b*
A binomial-type experiment in which independent trials are repeated until a fixed number of successes occurs.
Example: Historical data indicate that 30% of all bits transmitted through a digital transmission channel are received in error. An engineer is running an experiment to try to classify these errors and will start by gathering data on the first 10 errors encountered. What is the probability that the 10th error will occur on the 25th trial?

Negative Binomial Example
This example follows a negative binomial distribution: repeated independent trials, each with probability of success p and probability of failure q = 1 − p. The random variable X is the number of the trial on which the kth success occurs. The probability that the kth success occurs on trial x is
b*(x; k, p) = (x − 1 choose k − 1) p^k q^(x − k), for x = k, k + 1, k + 2, …

Negative Binomial Distribution
In our example:
k = "success number" = 10
x = trial number on which the kth success occurs = 25
p = probability of success (error) = 0.3
q = 1 − p = 0.7
b*(25; 10, 0.3) = (24 choose 9)(0.3)^10(0.7)^15 = 0.037
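A minimal sketch of this calculation (the helper neg_binom_pmf is ours; note that SciPy's nbinom counts failures rather than the trial number):

```python
from math import comb

def neg_binom_pmf(x, k, p):
    """b*(x; k, p): probability that the k-th success occurs on trial x."""
    return comb(x - 1, k - 1) * p**k * (1 - p)**(x - k)

print(neg_binom_pmf(25, 10, 0.3))      # ~0.037

# Equivalent with SciPy, which parameterizes by the number of failures
# before the k-th success (here 25 - 10 = 15 failures):
# from scipy.stats import nbinom; nbinom.pmf(25 - 10, 10, 0.3)
```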

Geometric Distribution
Example: In our example, what is the probability that the 1st bit received in error will occur on the 5th trial?
This is an example of the geometric distribution, which is the special case of the negative binomial with k = 1. The probability that the 1st success occurs on trial x is
g(x; p) = p q^(x − 1), for x = 1, 2, 3, …
Here, g(5; 0.3) = (0.3)(0.7)^4 = 0.072
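Cross-check with scipy.stats.geom, whose pmf is exactly p·q^(x − 1):

```python
from scipy.stats import geom

# Probability the first error occurs on trial 5, with p = 0.3
print(geom.pmf(5, 0.3))        # ~0.072  (= 0.3 * 0.7**4)
```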

Your turn …
A worn machine tool produces 1% defective parts. Assuming the parts produced are independent:
What is the probability that the 2nd defective part will be the 6th one produced?
b*(6; 2, 0.01) = (5 choose 1)(0.01)^2(0.99)^4 = 0.00048
What is the probability that the 1st defective part will be seen before 3 are produced?
P(X < 3) = P(1) + P(2) = (0.01)(0.99)^(1−1) + (0.01)(0.99)^(2−1) = 0.0199
How many parts can we expect to produce before we see the 1st defective part? (Hint: see Theorem 5.4, pg. 161)
μ = 1/p = 1/0.01 = 100
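These three answers can be reproduced as follows (a sketch assuming SciPy for the geometric parts):

```python
from math import comb
from scipy.stats import geom

p = 0.01

# 2nd defective on the 6th part: negative binomial b*(6; 2, 0.01)
print(comb(5, 1) * p**2 * (1 - p)**4)    # ~0.00048

# 1st defective before 3 parts are produced: P(X = 1) + P(X = 2)
print(geom.cdf(2, p))                    # ~0.0199

# Expected trial number of the 1st defective: mu = 1/p
print(geom.mean(p))                      # 100.0
```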

Poisson Process
The number of occurrences in a given interval or region, with the following properties:
"Memoryless": the number of occurrences in one interval is independent of the number in a different interval.
The probability of an occurrence during a very short interval or small region is proportional to the size of the interval and does not depend on the number occurring outside the region or interval.
The probability of more than one occurrence in a very short interval is negligible.

Poisson Process Situations
Number of bits transmitted per minute.
Number of calls to customer service in an hour.
Number of bacteria in a given sample.
Number of hurricanes per year in a given region.

Service Call Example - Poisson Process
An average of 2.7 service calls per minute is received at a particular maintenance center, and the calls follow a Poisson process. To determine the personnel and equipment needed to maintain a desired level of service, the plant manager needs to be able to determine the probabilities associated with various numbers of service calls.

Poisson Distribution Probabilities
The probability of x occurrences in a given interval of time or region is
p(x; λt) = e^(−λt)(λt)^x / x!, for x = 0, 1, 2, …
where λ = average number of outcomes per unit time or region, and t = the length of the time interval or size of the region.

Our Example: λ = 2.7 and t = 1 minute
What is the probability that fewer than 2 calls will be received in any given minute?
P(X < 2) = P(X = 0) + P(X = 1) = e^(−2.7)(2.7)^0/0! + e^(−2.7)(2.7)^1/1! = 0.2487
The mean and variance are both λt, so μ = λt = 2.7.
Note: Table A.2, pp. 748-749, gives the cumulative sums Σ (x = 0 to r) p(x; μ).
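A quick numerical check with scipy.stats.poisson:

```python
from scipy.stats import poisson

mu = 2.7 * 1                                       # lambda * t
print(poisson.cdf(1, mu))                          # P(X < 2) ~ 0.2487
print(poisson.pmf(0, mu) + poisson.pmf(1, mu))     # same answer, term by term
```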

Service Call Example - Part 2
If more than 6 calls are received in a 3-minute period, an extra service technician will be needed to maintain the desired level of service. What is the probability of that happening?
μ = λt = (2.7)(3) = 8.1
Since 8.1 is not in the table, either use the basic equation or approximate with μ = 8. From Table A.2 with μ = 8 and r = 6, P(X ≤ 6) = 0.3134, so
P(X > 6) = 1 − P(X ≤ 6) = 1 − 0.3134 = 0.6866
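Because 8.1 is not tabulated, the exact value is easy to compute directly; the sketch below compares it with the μ = 8 table approximation (assuming SciPy):

```python
from scipy.stats import poisson

print(1 - poisson.cdf(6, 8.1))   # exact value with mu = 8.1: ~0.699
print(1 - poisson.cdf(6, 8.0))   # table approximation with mu = 8: ~0.6866
```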


Poisson Distribution
The effect of λ on the shape of the Poisson distribution (figure).