Probability Concepts and Applications
Learning Objectives
1. Understand the basic foundations of probability analysis.
2. Describe statistically dependent and independent events.
3. Use Bayes’ theorem to establish posterior probabilities.
4. Describe and provide examples of both discrete and continuous random variables.
5. Explain the difference between discrete and continuous probability distributions.
6. Calculate expected values and variances and use the normal table.
Introduction Life is uncertain; we are not sure what the future will bring. Probability is a numerical statement about the likelihood that an event will occur.
Fundamental Concepts
The probability, P, of any event or state of nature occurring is greater than or equal to 0 and less than or equal to 1. That is:
0 ≤ P (event) ≤ 1
The sum of the simple probabilities for all possible outcomes of an activity must equal 1.
Diversey Paint Example
Demand for white latex paint at Diversey Paint and Supply has always been either 0, 1, 2, 3, or 4 gallons per day. Over the past 200 days, the owner has observed the following frequencies of demand. Notice the individual probabilities are all between 0 and 1, 0 ≤ P (event) ≤ 1, and the total of all event probabilities equals 1, ∑ P (event) = 1.00.

QUANTITY DEMANDED   NUMBER OF DAYS   PROBABILITY
0                   40               0.20 (= 40/200)
1                   80               0.40 (= 80/200)
2                   50               0.25 (= 50/200)
3                   20               0.10 (= 20/200)
4                   10               0.05 (= 10/200)
Total               200              1.00 (= 200/200)
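The relative-frequency calculation above can be sketched in a few lines of Python; the variable names are illustrative, but the counts come straight from the table:

```python
# Relative-frequency (objective) probabilities from the Diversey Paint data.
days_demanded = {0: 40, 1: 80, 2: 50, 3: 20, 4: 10}  # quantity -> number of days

total_days = sum(days_demanded.values())              # 200 observed days
prob = {q: n / total_days for q, n in days_demanded.items()}

# Every probability lies in [0, 1] and they sum to 1.
assert all(0 <= p <= 1 for p in prob.values())
assert abs(sum(prob.values()) - 1.0) < 1e-9

print(prob[1])  # P(demand = 1) = 80/200 = 0.4
```

The two assertions encode the two fundamental concepts from the previous slide: each probability is between 0 and 1, and the probabilities of all outcomes sum to 1.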
Determining Objective Probability
Relative frequency: typically based on historical data.
P (event) = Number of occurrences of the event / Total number of trials or outcomes
Classical or logical method: logically determine probabilities without trials.
P (head) = 1/2 = Number of ways of getting a head / Number of possible outcomes (head or tail)
Subjective probability is based on the experience and judgment of the person making the estimate. Examples include opinion polls, the judgment of experts, and the Delphi method. The Delphi method is a structured communication technique, originally developed as a systematic, interactive forecasting method that relies on a panel of experts. The experts answer questionnaires in two or more rounds. After each round, a facilitator provides an anonymous summary of the experts’ forecasts from the previous round, along with the reasons they provided for their judgments. Experts are thus encouraged to revise their earlier answers in light of the replies of other members of the panel. It is believed that during this process the range of the answers will decrease and the group will converge toward the "correct" answer.
Mutually Exclusive Events Events are said to be mutually exclusive if only one of the events can occur on any one trial. Tossing a coin will result in either a head or a tail. Rolling a die will result in only one of six possible outcomes.
Collectively Exhaustive Events
Events are said to be collectively exhaustive if the list of outcomes includes every possible outcome. Examples: both heads and tails as possible outcomes of a coin flip; all six possible outcomes of the roll of a die.

OUTCOME OF ROLL   PROBABILITY
1                 1/6
2                 1/6
3                 1/6
4                 1/6
5                 1/6
6                 1/6
Total             1
Drawing a Card Draw one card from a deck of 52 playing cards P (drawing a 7) = 4/52 = 1/13 P (drawing a heart) = 13/52 = 1/4 These two events are not mutually exclusive since a 7 of hearts can be drawn These two events are not collectively exhaustive since there are other cards in the deck besides 7s and hearts
Table of Differences
DRAWS                                    MUTUALLY EXCLUSIVE   COLLECTIVELY EXHAUSTIVE
1. Draw a spade and a club               Yes                  No
2. Draw a face card and a number card    Yes                  Yes
3. Draw an ace and a 3                   Yes                  No
4. Draw a club and a nonclub             Yes                  Yes
5. Draw a 5 and a diamond                No                   No
6. Draw a red card and a diamond         No                   No
Adding Mutually Exclusive Events We often want to know whether one or a second event will occur. When two events are mutually exclusive, the law of addition is: P (event A or event B) = P (event A) + P (event B) P (spade or club) = P (spade) + P (club) = 13/52 + 13/52 = 26/52 = 1/2 = 0.50
Adding Not Mutually Exclusive Events The equation must be modified to account for double counting. The probability is reduced by subtracting the chance of both events occurring together. P (event A or event B) = P (event A) + P (event B) – P (event A and event B both occurring) P (A or B) = P (A) + P (B) – P (A and B) P(five or diamond) = P(five) + P(diamond) – P(five and diamond) = 4/52 + 13/52 – 1/52 = 16/52 = 4/13
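Both forms of the addition law can be checked numerically; this short sketch uses the card examples above (variable names are illustrative):

```python
# Addition law on a 52-card deck.
# Mutually exclusive: spade or club -- no card is both.
p_spade, p_club = 13/52, 13/52
p_spade_or_club = p_spade + p_club                  # 26/52 = 0.50

# Not mutually exclusive: five or diamond -- the five of diamonds would be
# counted twice unless we subtract the joint probability.
p_five, p_diamond = 4/52, 13/52
p_five_and_diamond = 1/52
p_five_or_diamond = p_five + p_diamond - p_five_and_diamond   # 16/52

print(p_spade_or_club)   # 0.5
```

Forgetting the subtraction term would give 17/52 for "five or diamond", overstating the probability by exactly the doubly counted five of diamonds.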
Venn Diagrams
Figure 2.1: Events that are mutually exclusive; the circles for P (A) and P (B) do not overlap.
P (A or B) = P (A) + P (B)
Figure 2.2: Events that are not mutually exclusive; the circles overlap in the region P (A and B).
P (A or B) = P (A) + P (B) – P (A and B)
Statistically Independent Events Events may be either independent or dependent. For independent events, the occurrence of one event has no effect on the probability of occurrence of the second event.
Which Sets of Events Are Independent?
1. (a) Your education (b) Your income level: Dependent events
2. (a) Draw a jack of hearts from a full 52-card deck (b) Draw a jack of clubs from a full 52-card deck: Independent events
3. (a) Chicago Cubs win the National League pennant (b) Chicago Cubs win the World Series: Dependent events
4. (a) Snow in Santiago, Chile (b) Rain in Tel Aviv, Israel: Independent events
Three Types of Probabilities
Marginal (or simple) probability is just the probability of a single event occurring: P (A).
Joint probability is the probability of two or more events occurring; for independent events it is equal to the product of their marginal probabilities: P (AB) = P (A) x P (B).
Conditional probability is the probability of event B given that event A has occurred; for independent events, P (B | A) = P (B), and likewise P (A | B) = P (A).
Joint Probability Example The probability of tossing a 6 on the first roll of the die and a 2 on the second roll: P (6 on first and 2 on second) = P (tossing a 6) x P (tossing a 2) = 1/6 x 1/6 = 1/36 = 0.028
Independent Events
A bucket contains 3 black balls and 7 green balls. Draw a ball from the bucket, replace it, and draw a second ball.
The probability of a black ball drawn on the first draw is:
P (B) = 0.30 (a marginal probability)
The probability of two green balls drawn is:
P (GG) = P (G) x P (G) = 0.7 x 0.7 = 0.49 (a joint probability for two independent events)
Independent Events
The probability of a black ball drawn on the second draw if the first draw is green is:
P (B | G) = P (B) = 0.30 (a conditional probability, but equal to the marginal because the two draws are independent)
The probability of a green ball drawn on the second draw if the first draw is green is:
P (G | G) = P (G) = 0.70 (likewise a conditional probability equal to the marginal)
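Because the ball is replaced, the draws are independent, and a quick simulation should land near P (GG) = 0.49. This is a sketch with an illustrative seed and trial count:

```python
# Simulating the bucket example: draws *with replacement* are independent,
# so the second draw's probabilities ignore the first draw.
import random

random.seed(42)
bucket = ["B"] * 3 + ["G"] * 7        # 3 black, 7 green

trials = 100_000
green_green = 0
for _ in range(trials):
    first = random.choice(bucket)     # draw, then replace
    second = random.choice(bucket)    # independent of the first draw
    if first == "G" and second == "G":
        green_green += 1

estimate = green_green / trials       # should be near P(GG) = 0.7 * 0.7 = 0.49
print(round(estimate, 3))
```

If the first ball were not replaced, the draws would become dependent and the simulated frequency would drift away from 0.49, which previews the next section.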
Statistically Dependent Events
The marginal probability of an event occurring is computed in the same way: P (A).
Calculating conditional probabilities is slightly more complicated. The probability of event A given that event B has occurred is:
P (A | B) = P (AB) / P (B)
The formula for the joint probability of two events is:
P (AB) = P (B | A) P (A)
When Events Are Dependent
Assume that we have an urn containing 10 balls of the following descriptions:
4 are white (W) and lettered (L)
2 are white (W) and numbered (N)
3 are yellow (Y) and lettered (L)
1 is yellow (Y) and numbered (N)
P (WL) = 4/10 = 0.4    P (WN) = 2/10 = 0.2    P (YL) = 3/10 = 0.3    P (YN) = 1/10 = 0.1
P (W) = 6/10 = 0.6    P (Y) = 4/10 = 0.4    P (L) = 7/10 = 0.7    P (N) = 3/10 = 0.3
Figure 2.3: The urn’s 10 balls grouped by description: 4 white and lettered (WL), 2 white and numbered (WN), 3 yellow and lettered (YL), 1 yellow and numbered (YN), with probabilities P (WL) = 4/10, P (WN) = 2/10, P (YL) = 3/10, P (YN) = 1/10.
When Events Are Dependent
The conditional probability that the ball drawn is lettered, given that it is yellow, is:
P (L | Y) = P (YL) / P (Y) = 0.3 / 0.4 = 0.75
We can verify P (YL) using the joint probability formula:
P (YL) = P (L | Y) x P (Y) = (0.75)(0.4) = 0.3
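The urn calculation can be sketched directly from the joint probabilities; the dictionary keys below are illustrative labels for color and marking:

```python
# Conditional probability from the urn's joint probabilities.
# Keys are (color, marking); values are the joint probabilities from the text.
joint = {("W", "L"): 0.4, ("W", "N"): 0.2, ("Y", "L"): 0.3, ("Y", "N"): 0.1}

# Marginal P(Y): sum the joints over every marking.
p_yellow = sum(p for (color, _), p in joint.items() if color == "Y")   # 0.4

# Conditional P(L | Y) = P(YL) / P(Y) = 0.3 / 0.4
p_lettered_given_yellow = joint[("Y", "L")] / p_yellow

# Verify the joint-probability formula: P(YL) = P(L | Y) * P(Y)
assert abs(p_lettered_given_yellow * p_yellow - joint[("Y", "L")]) < 1e-9
print(round(p_lettered_given_yellow, 2))   # 0.75
```

Summing joints over one variable to get a marginal, then dividing, is exactly the two-step recipe the formulas above describe.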
Joint Probabilities for Dependent Events If the stock market reaches 12,500 point by January, there is a 70% probability that Tubeless Electronics will go up. You believe that there is only a 40% chance the stock market will reach 12,500. Let M represent the event of the stock market reaching 12,500 and let T be the event that Tubeless goes up in value. P (MT) = P (T | M) x P (M) = (0.70)(0.40) = 0.28
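The stock-market calculation is a single application of P (MT) = P (T | M) x P (M); a minimal sketch, with illustrative variable names:

```python
# Joint probability for dependent events: P(MT) = P(T | M) * P(M).
p_m = 0.40            # P(market reaches 12,500)
p_t_given_m = 0.70    # P(Tubeless goes up | market reaches 12,500)

p_mt = p_t_given_m * p_m          # P(market reaches 12,500 AND Tubeless up)
print(round(p_mt, 2))   # 0.28
```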
Revising Probabilities with Bayes’ Theorem Bayes’ theorem is used to incorporate additional information and help create posterior probabilities. Prior Probabilities New Information Bayes’ Process Posterior Probabilities
Posterior Probabilities
A cup contains two dice identical in appearance, but one is fair (unbiased) and the other is loaded (biased). The probability of rolling a 3 on the fair die is 1/6 or 0.166. The probability of rolling a 3 on the loaded die is 0.60. We select one die by chance, toss it, and get a 3. What is the probability that the die rolled was fair? What is the probability that the loaded die was rolled?
Posterior Probabilities We know the probability of the die being fair or loaded is: P (fair) = 0.50 P (loaded) = 0.50 And that P (3 | fair) = 0.166 P (3 | loaded) = 0.60 We compute the probabilities of P (3 and fair) and P (3 and loaded): P (3 and fair) = P (3 | fair) x P (fair) = (0.166)(0.50) = 0.083 P (3 and loaded) = P (3 | loaded) x P (loaded) = (0.60)(0.50) = 0.300
Posterior Probabilities
The sum of these probabilities gives us the unconditional probability of tossing a 3:
P (3) = P (3 and fair) + P (3 and loaded) = 0.083 + 0.300 = 0.383
Posterior Probabilities
If a 3 does occur, the probability that the die rolled was the fair one is:
P (fair | 3) = P (fair and 3) / P (3) = 0.083 / 0.383 = 0.22
The probability that the die was loaded is:
P (loaded | 3) = P (loaded and 3) / P (3) = 0.300 / 0.383 = 0.78
These are the revised or posterior probabilities for the next roll of the die. We use these to revise our prior probability estimates.
Bayes’ Calculations
Given event B has occurred (Table 2.2):
STATE OF NATURE   P (B | STATE OF NATURE)   PRIOR PROBABILITY   JOINT PROBABILITY   POSTERIOR PROBABILITY
A                 P (B | A)                 x P (A)             = P (B and A)       P (B and A) / P (B) = P (A | B)
A'                P (B | A')                x P (A')            = P (B and A')      P (B and A') / P (B) = P (A' | B)
                                                                P (B)
Given a 3 was rolled:
STATE OF NATURE   P (3 | STATE OF NATURE)   PRIOR PROBABILITY   JOINT PROBABILITY   POSTERIOR PROBABILITY
Fair die          0.166                     x 0.5               = 0.083             0.083 / 0.383 = 0.22
Loaded die        0.600                     x 0.5               = 0.300             0.300 / 0.383 = 0.78
                                                                P (3) = 0.383
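The columns of Table 2.2 map directly onto a few lines of code: multiply priors by likelihoods to get joints, sum the joints to get P (3), then divide. A sketch with illustrative names:

```python
# Bayes' table in code: prior x likelihood -> joint -> posterior.
priors = {"fair": 0.5, "loaded": 0.5}
likelihood_of_3 = {"fair": 0.166, "loaded": 0.600}

joint = {d: likelihood_of_3[d] * priors[d] for d in priors}   # P(3 and die)
p_3 = sum(joint.values())                                     # P(3) = 0.383
posterior = {d: joint[d] / p_3 for d in joint}                # P(die | 3)

print(round(posterior["fair"], 2), round(posterior["loaded"], 2))   # 0.22 0.78
```

Note that the posteriors always sum to 1 because each joint is divided by their common total, so the division step is just a normalization.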
General Form of Bayes’ Theorem
We can compute revised probabilities more directly by using:
P (A | B) = P (B | A) P (A) / [P (B | A) P (A) + P (B | A') P (A')]
where A' is the complement of the event A; for example, if A is the event "fair die", then A' is "loaded die".
General Form of Bayes’ Theorem
This is basically what we did in the previous example:
Replace A with "fair die"
Replace A' with "loaded die"
Replace B with "3 rolled"
We get
P (fair die | 3 rolled) = P (3 | fair) P (fair) / [P (3 | fair) P (fair) + P (3 | loaded) P (loaded)] = (0.166)(0.5) / [(0.166)(0.5) + (0.600)(0.5)] = 0.083 / 0.383 = 0.22
Further Probability Revisions
We can obtain additional information by performing the experiment a second time. If you can afford it, perform experiments several times. Suppose we roll the die again and again get a 3.
Copyright ©2012 Pearson Education, Inc. publishing as Prentice Hall
Further Probability Revisions After the first roll of the die: probability the die is fair = 0.22 probability the die is loaded = 0.78 After the second roll of the die: probability the die is fair = 0.067 probability the die is loaded = 0.933
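Repeating the Bayesian update is just running the same prior-times-likelihood normalization again, with the first roll's posteriors serving as the new priors. The sketch below does this in a loop; small differences from the slide's figures can arise depending on how intermediate values are rounded:

```python
# Sequential Bayesian revision after observing a 3 twice.
prior = {"fair": 0.5, "loaded": 0.5}
like3 = {"fair": 0.166, "loaded": 0.600}   # P(3 | die)

for _ in range(2):                          # two observed 3s
    joint = {d: like3[d] * prior[d] for d in prior}
    total = sum(joint.values())             # P(3) under the current prior
    prior = {d: joint[d] / total for d in joint}   # posterior becomes prior

print(round(prior["fair"], 3), round(prior["loaded"], 3))
```

Each additional 3 pushes the probability mass further toward the loaded die, since a 3 is far more likely under the loaded die (0.60) than under the fair one (0.166).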
Random Variables
A random variable assigns a real number to every possible outcome or event in an experiment, for example X = number of refrigerators sold during the day.
Discrete random variables can assume only a finite or limited set of values.
Continuous random variables can assume any one of an infinite set of values.
Random Variables – Numbers
EXPERIMENT                                   OUTCOME                                               RANDOM VARIABLE   RANGE OF RANDOM VARIABLE
Stock 50 Christmas trees                     Number of Christmas trees sold                        X                 0, 1, 2, …, 50
Inspect 600 items                            Number of acceptable items                            Y                 0, 1, 2, …, 600
Send out 5,000 sales letters                 Number of people responding to the letters            Z                 0, 1, 2, …, 5,000
Build an apartment building                  Percent of building completed after 4 months          R                 0 ≤ R ≤ 100
Test the lifetime of a lightbulb (minutes)   Length of time the bulb lasts, up to 80,000 minutes   S                 0 ≤ S ≤ 80,000
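The discrete/continuous distinction shows up directly in how one would sample these variables in code; a sketch using two rows of the table above (the specific sampling calls are illustrative, not part of the text):

```python
# Sampling a discrete vs. a continuous random variable.
import random

random.seed(1)

# Discrete: number of Christmas trees sold, X in {0, 1, ..., 50}.
# Only whole numbers are possible outcomes.
x = random.randint(0, 50)

# Continuous: percent of building completed, 0 <= R <= 100.
# Any real value in the interval is a possible outcome.
r = random.uniform(0.0, 100.0)

print(x, r)
```

A discrete variable's outcomes can be listed, so probabilities attach to individual values; a continuous variable's outcomes cannot, which is why the next topic shifts from probability values to probability distributions.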