Statistics for Management and Economics Chapter 6

Presentation transcript:

Statistics for Management and Economics, Chapter 6: Probability

Objectives
Assigning probability to events
Joint, marginal, and conditional probability
Probability rules and trees
Bayes’ Law
Identifying the correct method

Probability
There’s a 99% chance that we’ll discuss probability in today’s class.
Probability quantifies chance; it is a critical component of statistical inference and is used for decision making.

Random Experiment
A random experiment is an action or process that leads to one of several possible outcomes. For example:
Experiment        Outcomes
Flip a coin       Heads, Tails
Exam marks        Numbers: 0, 1, 2, ..., 100
Assembly time     t > 0 seconds
Course grades     F, D, C, B, A, A+

Probabilities
List the outcomes of a random experiment. The list must be exhaustive, i.e. ALL possible outcomes must be included:
Die roll {1, 2, 3, 4, 5} is not exhaustive (the outcome 6 is missing)
Die roll {1, 2, 3, 4, 5, 6} is exhaustive
The list must also be mutually exclusive, i.e. no two outcomes can occur at the same time:
Die roll {odd number, even number} is mutually exclusive
Die roll {number less than 4, even number} is not mutually exclusive (rolling a 2 satisfies both)

Sample Space
A list of exhaustive and mutually exclusive outcomes is called a sample space and is denoted by S. The outcomes are denoted by O1, O2, …, Ok.
Using notation from set theory, we can represent the sample space and its outcomes as S = {O1, O2, …, Ok}.

Requirements of Probabilities
Given a sample space S = {O1, O2, …, Ok}, where P(Oi) represents the probability of outcome i, the probabilities assigned to the outcomes must satisfy two requirements:
1. The probability of any outcome is between 0 and 1, i.e. 0 ≤ P(Oi) ≤ 1 for each i, and
2. The sum of the probabilities of all the outcomes equals 1, i.e. P(O1) + P(O2) + … + P(Ok) = 1.

Approaches to Assigning Probabilities
Classical approach: make certain assumptions (such as equally likely outcomes or independence) about the situation.
Relative frequency approach: assign probabilities based on experimentation or historical data.
Subjective approach: assign probabilities based on judgment or prior experience.

Classical Approach If an experiment has n possible outcomes, this method would assign a probability of 1/n to each outcome. Experiment: Rolling a die Sample Space: S = {1, 2, 3, 4, 5, 6} Probabilities: Each sample point has a 1/6 chance of occurring.

Classical Approach
Experiment: Rolling two dice and recording the sum
Sample Space: S = {2, 3, …, 12}
Probability examples: P(2) = 1/36, P(6) = 5/36, P(10) = 3/36
What are the underlying, unstated assumptions?
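These classical probabilities can be verified by brute-force enumeration. Below is a minimal sketch (not part of the original slides) that assumes two fair, independent six-sided dice:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Classical approach: all 36 (die1, die2) outcomes are assumed equally likely.
counts = Counter(d1 + d2 for d1, d2 in product(range(1, 7), repeat=2))

# Probability of a sum = favourable outcomes / total outcomes.
for total in (2, 6, 10):
    print(total, Fraction(counts[total], 36))   # 2 -> 1/36, 6 -> 5/36, 10 -> 3/36
```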

Relative Frequency Approach
Bits & Bytes Computer Shop tracks the number of desktop computer systems it sells over a month (30 days):
Desktops Sold     # of Days
0                 1
1                 2
2                 10
3                 12
4                 5
For example, on 10 of the 30 days, 2 desktops were sold. From this we can construct the probabilities of an event (i.e. the number of desktops sold on a given day)…

Relative Frequency Approach
Desktops Sold     # of Days     Probability
0                 1             1/30 = .03
1                 2             2/30 = .07
2                 10            10/30 = .33
3                 12            12/30 = .40
4                 5             5/30 = .17
                  30            ∑ = 1.00
“There is a 40% chance Bits & Bytes will sell 3 desktops on any given day”
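The relative-frequency calculation above is just counting; here is a small sketch (the dictionary name is mine, not from the slides):

```python
# Sales record: number of days on which 0, 1, 2, 3, or 4 desktops were sold.
days_by_desktops = {0: 1, 1: 2, 2: 10, 3: 12, 4: 5}
total_days = sum(days_by_desktops.values())          # 30 days in the month

# Relative frequency approach: P(outcome) = days observed / total days.
probabilities = {sold: days / total_days for sold, days in days_by_desktops.items()}

print(probabilities[3])                        # 0.4 -> "a 40% chance of selling 3 desktops"
print(round(sum(probabilities.values()), 10))  # 1.0, as required of any probability assignment
```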

Subjective Approach “In the subjective approach we define probability as the degree of belief that we hold in the occurrence of an event” E.g. weather forecasting’s “P.O.P.”: “Probability of Precipitation” (or P.O.P.) is defined in different ways by different forecasters, but basically it’s a subjective probability based on past observations combined with current weather conditions. POP 60% – based on current conditions, there is a 60% chance of rain (say).

Events & Probabilities
An individual outcome of a sample space is called a simple event, while an event is a collection or set of one or more simple events in a sample space.
Roll of a die: S = {1, 2, 3, 4, 5, 6}
Simple event: the number “3” will be rolled
Event: an even number (one of 2, 4, or 6) will be rolled

Events & Probabilities The probability of an event is the sum of the probabilities of the simple events that constitute the event. E.g. (assuming a fair die) S = {1, 2, 3, 4, 5, 6} P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6 SO… P(EVEN) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 3/6 = 1/2

Interpreting Probability
One way to interpret probability is this: if a random experiment is repeated an infinite number of times, the relative frequency of any given outcome is the probability of that outcome.
For example, the probability of heads in a flip of a balanced coin is .5, determined using the classical approach. The probability is interpreted as the long-term relative frequency of heads if the coin is flipped an infinite number of times.
So why don’t we just compute the probability of heads from a few flips?

The Coin Toss
The result of any single coin toss is random. But the result over many tosses is predictable, as long as the trials are independent (i.e., the outcome of a new coin toss is not influenced by the result of the previous toss). The probability of heads is 0.5 = the proportion of times you get heads in many repeated trials.
[Figure: the proportion of heads in two separate series of tosses, approaching 0.5 as the number of tosses grows]
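A quick simulation (a sketch, not from the slides; the seed and toss counts are arbitrary choices) shows the proportion of heads settling near 0.5:

```python
import random

random.seed(1)  # arbitrary seed, only so the run is reproducible

def proportion_heads(n_tosses):
    """Proportion of heads in n_tosses independent flips of a fair coin."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

for n in (10, 100, 10_000, 1_000_000):
    print(n, proportion_heads(n))   # drifts toward 0.5 as n grows
```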

Independence of Events
One important first step in dealing with probabilities of more than one event is to establish whether the events are independent (or not). There are mathematical ways to determine this, which we will learn.
Two events are independent if the probability that one event occurs on any given trial of an experiment is not affected or changed by the occurrence of the other event.
When are events not independent?

Combinations of Events
Intersection of two events
Marginal probability
Conditional probability
Independence
Union of two events

“Credit Scorecards”
Credit scorecards are used by banks and financial institutions to determine whether applicants will receive loans. The scorecard is the product of a statistical technique that converts questions about income, residence, and other variables into a score. The higher the score, the higher the probability that the applicant will repay. A cutoff score is used to predict those who will repay and those who will default. Because no scorecard is perfect, it is possible to make two types of error: granting credit to those who will default, and not lending money to those who would have repaid.

Intersection of Two Events
The intersection of two events A and B occurs when both A and B occur. We denote this event as “A and B”, or sometimes A ∩ B. A joint probability is the probability of the intersection, P(A and B).
Credit scorecards are used by financial institutions to help decide to whom loans should be granted. An analysis of the records of one bank produced the following probabilities:
                Score Under 400     Score of 400 or More
Fully Repaid    0.19                0.64
Defaulted       0.13                0.04
The entry 0.19, for example, is the joint probability that an applicant has a credit score under 400 AND fully repaid the loan.

A Little Notation
Shorthand notation is often used to represent the events:
R = Applicant fully repaid the loan
D = Applicant defaulted on the loan
U4 = Applicant’s score was under 400
O4 = Applicant’s score was 400 or more
        U4      O4
R       0.19    0.64
D       0.13    0.04
For example, P(D and U4) = 0.13 is the probability that an applicant defaulted on the loan and had a score under 400.

Marginal Probabilities
Marginal probabilities are computed by adding across rows and down columns; that is, they are calculated in the margins of the table:
                U4      O4      P(Row i)
R               0.19    0.64    0.83
D               0.13    0.04    0.17
P(Column j)     0.32    0.68    1.00
“What’s the probability an applicant defaulted on a loan?”  P(D) = .13 + .04 = .17
“What’s the probability an applicant had a score under 400?”  P(U4) = .19 + .13 = .32
Both sets of marginal probabilities must add to 1 (a useful error check).
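As a small sketch (the dictionary layout and variable names are mine, not from the slides), the marginal probabilities fall straight out of the joint table:

```python
# Joint probabilities P(payment outcome and score group) from the bank's records.
joint = {
    ("R", "U4"): 0.19, ("R", "O4"): 0.64,   # fully repaid
    ("D", "U4"): 0.13, ("D", "O4"): 0.04,   # defaulted
}

# Marginals: sum the joint probabilities over the other variable.
p_repaid = sum(p for (pay, _), p in joint.items() if pay == "R")          # P(R) = 0.83
p_under_400 = sum(p for (_, score), p in joint.items() if score == "U4")  # P(U4) = 0.32

print(round(p_repaid, 2), round(p_under_400, 2))
print(round(sum(joint.values()), 2))   # all four joint probabilities sum to 1.00
```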

Conditional Probability
Conditional probability is used to determine how two events are related; that is, we can determine the probability of one event given the occurrence of another related event.
Conditional probabilities are written as P(A | B), read as “the probability of A given B”, and calculated as:
P(A | B) = P(A and B) / P(B)

Conditional Probability
The probability of an event given that another event has occurred is called a conditional probability:
P(A | B) = P(A and B) / P(B)
P(B | A) = P(A and B) / P(A)
Note how “A given B” and “B given A” are related: both have the joint probability P(A and B) in the numerator, but they condition on (divide by) different events.

Conditional Probability
“What’s the probability that a credit score was under 400, given that the applicant fully repaid the loan?”
Thus, we want to know “what is P(U4 | R)?” The | symbol represents “given”.

Conditional Probability
We want to calculate P(U4 | R):
            U4      O4      P(Row i)
R           0.19    0.64    0.83
D           0.13    0.04    0.17
P(Col j)    0.32    0.68    1.00
P(U4 | R) = P(U4 and R) / P(R) = 0.19 / 0.83 = 0.229
Thus, there is a 22.9% chance that an applicant has a credit score under 400, given that he or she has repaid the loan.
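The same division in code, as a minimal sketch (variable names are mine):

```python
# P(U4 | R) = P(U4 and R) / P(R): condition on the applicants who repaid.
p_u4_and_r = 0.19
p_r = 0.19 + 0.64             # marginal probability of repaying

p_u4_given_r = p_u4_and_r / p_r
print(round(p_u4_given_r, 3))   # 0.229
```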

Independence Another use for calculating conditional probability is to determine whether two events are related. In particular, we would like to know whether they are independent, that is, if the probability of one event is not affected by the occurrence of the other event. Two events A and B are said to be independent if P(A|B) = P(A) or P(B|A) = P(B) WHY?

Independence
For example, we saw that P(U4 | R) = .229, while the marginal probability for U4 is P(U4) = 0.32.
Since P(U4 | R) ≠ P(U4), payment status (repaid vs. defaulted) and credit score (over vs. under 400) are not independent events. Stated another way, they are dependent: the probability of one event (payment) is affected by the occurrence of the other event (credit score).
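A short sketch of the same independence check (the helper function is hypothetical, not from the slides):

```python
def is_independent(p_a_given_b, p_a, tol=1e-9):
    """Events are independent when the conditional probability equals the marginal."""
    return abs(p_a_given_b - p_a) < tol

p_u4 = 0.19 + 0.13                    # marginal P(U4) = 0.32
p_u4_given_r = 0.19 / (0.19 + 0.64)   # P(U4 | R), about 0.229

print(is_independent(p_u4_given_r, p_u4))   # False: credit score and repayment are dependent
```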

Union of Events
The union of two events A and B is the event that occurs when either or both of the events occur. It is denoted as “A or B”, or sometimes A ∪ B.
We can use this concept to answer questions like: “Determine the probability that an applicant fully repays a loan or has a credit score under 400.”

Union of Events
Determine the probability that an applicant fully repays the loan (R) or has a credit score under 400 (U4).
R or U4 occurs whenever R and U4 occurs, R and O4 occurs, or D and U4 occurs:
            U4      O4      P(Row i)
R           0.19    0.64    0.83
D           0.13    0.04    0.17
P(Col j)    0.32    0.68    1.00
P(R or U4) = .19 + .64 + .13 = .96

Union of Events
Determine the probability that an applicant fully repays the loan (R) or has a credit score of 400 or more (O4).
R or O4 occurs whenever R and U4 occurs, R and O4 occurs, or D and O4 occurs. Be sure not to count the (R and O4) cell twice!
            U4      O4      P(Row i)
R           0.19    0.64    0.83
D           0.13    0.04    0.17
P(Col j)    0.32    0.68    1.00
P(R or O4) = .19 + .64 + .04 = .87

Alternative Calculation: Union
Take 100% and subtract off “when does R or O4 not occur?” It fails to occur only in the cell D and U4:
            U4      O4      P(Row i)
R           0.19    0.64    0.83
D           0.13    0.04    0.17
P(Col j)    0.32    0.68    1.00
P(R or O4) = 1 – P(D and U4) = 1 – .13 = .87

Probability Rules
There are three rules that enable us to calculate the probability of more complex events from the probabilities of simpler events:
The Complement Rule
The Multiplication Rule
The Addition Rule

The Complement Rule The complement of an event A is the event that occurs when A does not occur. The complement rule gives us the probability of an event NOT occurring. That is: P(AC) = 1 – P(A) For example, in the simple roll of a fair die, the probability of the number “1” being rolled is 1/6. The probability that some number other than “1” will be rolled is 1 – 1/6 = 5/6.

Contract Bidding
An aerospace company has submitted bids on two separate federal government defense contracts. The company president believes that there is a 40% probability of winning the first contract. If they win the first contract, the probability of winning the second is 70%. However, if they lose the first contract, the president thinks that the probability of winning the second contract decreases to 50%.
Let’s say that…
F+ = the event of winning the first contract
F- = the event of losing the first contract
S+ = the event of winning the second contract
S- = the event of losing the second contract
So we know that P(F+) = 0.40 and P(F-) = 0.60. What else do we know?

The Multiplication Rule
The multiplication rule is used to calculate the joint probability of two events. It is based on the formula for conditional probability defined earlier:
P(A | B) = P(A and B) / P(B)
If we multiply both sides of the equation by P(B), we have:
P(A and B) = P(A | B) • P(B)
Likewise, P(A and B) = P(B | A) • P(A)
If A and B are independent events, then P(A and B) = P(A) • P(B)

Contract Bidding
F+ = the event of winning the first contract
F- = the event of losing the first contract
S+ = the event of winning the second contract
S- = the event of losing the second contract
We know (from the problem): “If they win the first contract, the probability of winning the second is 70%.” Let’s reword this as “given that they’ve won the first contract, the probability of winning the second is 70%”, i.e. P(S+ | F+) = 0.70.
Likewise, “…if they lose the first contract, the probability of winning the second… [is] 50%”, i.e. P(S+ | F-) = 0.50.
And… P(S- | F+) = 0.30 and P(S- | F-) = 0.50. WHY?
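Applying the multiplication rule to these numbers gives the joint probabilities; a sketch (variable names are mine, not from the slides):

```python
# Probabilities stated in the contract-bidding problem.
p_f_win = 0.40                    # P(F+)
p_s_win_given_f_win = 0.70        # P(S+ | F+)
p_s_win_given_f_lose = 0.50       # P(S+ | F-)

# Multiplication rule: P(A and B) = P(A | B) * P(B).
p_win_both = p_s_win_given_f_win * p_f_win              # P(F+ and S+)
p_lose_then_win = p_s_win_given_f_lose * (1 - p_f_win)  # P(F- and S+)

print(round(p_win_both, 2), round(p_lose_then_win, 2))  # 0.28 0.3
print(round(p_win_both + p_lose_then_win, 2))           # 0.58 = P(winning the second contract)
```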

The Addition Rule
Recall: the addition rule provides a way to compute the probability of event A or B or both occurring, i.e. the union of A and B:
P(A or B) = P(A) + P(B) – P(A and B)
Why must we subtract the joint probability P(A and B) from the sum of P(A) and P(B)? Because P(A) + P(B) alone would count the outcomes in the intersection twice.

Addition Rule
P(R) = .19 + .64 = .83 and P(O4) = .64 + .04 = .68
By adding P(R) plus P(O4) we add P(R and O4) twice. To correct, we subtract P(R and O4) from P(R) + P(O4):
            U4      O4      P(Row i)
R           0.19    0.64    0.83
D           0.13    0.04    0.17
P(Col j)    0.32    0.68    1.00
P(R or O4) = P(R) + P(O4) – P(R and O4) = 0.83 + 0.68 – 0.64 = .87
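A minimal check of the addition rule against the table (a sketch, with my own variable names):

```python
# Joint probabilities from the credit-score table.
p_r_and_u4, p_r_and_o4 = 0.19, 0.64
p_d_and_u4, p_d_and_o4 = 0.13, 0.04

p_r = p_r_and_u4 + p_r_and_o4     # P(R)  = 0.83
p_o4 = p_r_and_o4 + p_d_and_o4    # P(O4) = 0.68

# Addition rule: subtract the joint probability so it is not counted twice.
p_r_or_o4 = p_r + p_o4 - p_r_and_o4
print(round(p_r_or_o4, 2))        # 0.87

# Cross-check via the complement: 1 - P(D and U4).
print(round(1 - p_d_and_u4, 2))   # 0.87
```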

Addition Rule for Mutually Exclusive Events
If A and B are mutually exclusive, the occurrence of one event makes the other impossible. This means that P(A and B) = 0.
The addition rule for mutually exclusive events is therefore P(A or B) = P(A) + P(B).
We often use this form when we add joint probabilities calculated from a probability tree.

Probability Trees
A probability tree is a simple and effective method of applying the probability rules by representing the events in an experiment as branches. The resulting figure resembles a tree.
Example (two students selected, without replacement, from a group of 10 containing 3 females and 7 males):
First selection: P(F) = 3/10, P(M) = 7/10
Second selection if a female was chosen first: P(F|F) = 2/9, P(M|F) = 7/9
Second selection if a male was chosen first: P(F|M) = 3/9, P(M|M) = 6/9
Here P(F) is the probability of selecting a female student first, and P(F|F) is the probability of selecting a female student second, given that a female was already chosen first.
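A sketch of the tree’s joint probabilities, multiplying along each pair of branches (the group of 3 females and 7 males is inferred from the branch probabilities shown above):

```python
from fractions import Fraction

females, males = 3, 7
total = females + males

# First-selection branches.
p_f = Fraction(females, total)    # P(F) = 3/10
p_m = Fraction(males, total)      # P(M) = 7/10

# Second-selection branches are conditional on the first pick (no replacement),
# so each joint probability is the product along the branch.
p_ff = p_f * Fraction(females - 1, total - 1)   # P(F and F) = 3/10 * 2/9
p_fm = p_f * Fraction(males, total - 1)         # P(F then M)
p_mf = p_m * Fraction(females, total - 1)       # P(M then F)
p_mm = p_m * Fraction(males - 1, total - 1)     # P(M and M)

print(p_ff, p_fm, p_mf, p_mm)        # 1/15 7/30 7/30 7/15
print(p_ff + p_fm + p_mf + p_mm)     # 1: the four joint probabilities sum to one
```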