Slides by John Loucks, St. Edward’s University


Chapter 21: Decision Analysis
- Problem Formulation
- Decision Making with Probabilities
- Decision Analysis with Sample Information
- Computing Branch Probabilities Using Bayes’ Theorem

Problem Formulation
The first step in the decision analysis process is problem formulation. We begin with a verbal statement of the problem. Then we identify:
- the decision alternatives
- the states of nature (uncertain future events)
- the payoffs (consequences) associated with each specific combination of a decision alternative and a state of nature

Problem Formulation
A decision problem is characterized by decision alternatives, states of nature, and resulting payoffs. The decision alternatives are the different strategies the decision maker can employ. The states of nature are the future events, beyond the decision maker's control, that may occur. States of nature should be defined so that they are mutually exclusive and collectively exhaustive.

Payoff Tables
The consequence resulting from a specific combination of a decision alternative and a state of nature is a payoff. A table showing the payoffs for all combinations of decision alternatives and states of nature is a payoff table. Payoffs can be expressed in terms of profit, cost, time, distance, or any other appropriate measure.

Decision Trees
A decision tree provides a graphical representation of the sequential nature of the decision-making process. Each decision tree has two types of nodes:
- round nodes correspond to the states of nature
- square nodes correspond to the decision alternatives

Decision Trees
The branches leaving each round node represent the different states of nature, while the branches leaving each square node represent the different decision alternatives. At the end of each limb of the tree is the payoff attained from the series of branches making up that limb.

Decision Making with Probabilities
Once we have defined the decision alternatives and the states of nature for the chance events, we focus on determining probabilities for the states of nature. The classical method, relative frequency method, or subjective method of assigning probabilities may be used. Because one and only one of the N states of nature can occur, the probabilities must satisfy two conditions:
- P(sj) ≥ 0 for all states of nature
- P(s1) + P(s2) + ... + P(sN) = 1

Decision Making with Probabilities
Then we use the expected value approach to identify the best (recommended) decision alternative. The expected value of each decision alternative is calculated (explained on the next slide), and the decision alternative yielding the best expected value is chosen.

Expected Value Approach
The expected value of a decision alternative is the sum of weighted payoffs for that alternative. The expected value (EV) of decision alternative di is defined as

  EV(di) = P(s1)Vi1 + P(s2)Vi2 + ... + P(sN)ViN

where:
  N = the number of states of nature
  P(sj) = the probability of state of nature sj
  Vij = the payoff corresponding to decision alternative di and state of nature sj
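As a minimal sketch, the EV formula translates directly into a few lines of Python (the function name is illustrative, not from the chapter):

```python
# Expected value of a decision alternative:
# EV(di) = sum over j of P(sj) * Vij
def expected_value(probs, payoffs):
    """Probability-weighted sum of payoffs across the N states of nature."""
    assert abs(sum(probs) - 1.0) < 1e-9, "state probabilities must sum to 1"
    return sum(p * v for p, v in zip(probs, payoffs))
```

For example, `expected_value([0.4, 0.2, 0.4], [10000, 15000, 14000])` returns 12600.0.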

Expected Value Approach
Example: Burger Prince
Burger Prince Restaurant is considering opening a new restaurant on Main Street. It has three different restaurant layout models (A, B, and C), each with a different seating capacity. Burger Prince estimates that the average number of customers served per hour will be 80, 100, or 120. The payoff table for the three models is on the next slide.

Expected Value Approach
Payoff Table

                 Average Number of Customers Per Hour
                 s1 = 80     s2 = 100    s3 = 120
  Model A        $10,000     $15,000     $14,000
  Model B        $ 8,000     $18,000     $12,000
  Model C        $ 6,000     $16,000     $21,000

Expected Value Approach
Calculate the expected value for each decision. The decision tree on the next slide can assist in this calculation. Here d1, d2, and d3 represent the decision alternatives of Models A, B, and C, and s1, s2, and s3 represent the states of nature of 80, 100, and 120 customers per hour.

Expected Value Approach
Decision Tree (node 1 is the decision node; nodes 2-4 are chance nodes)

  Node 1
    d1 (Model A) -> Node 2:  s1 (.4) -> $10,000   s2 (.2) -> $15,000   s3 (.4) -> $14,000
    d2 (Model B) -> Node 3:  s1 (.4) -> $8,000    s2 (.2) -> $18,000   s3 (.4) -> $12,000
    d3 (Model C) -> Node 4:  s1 (.4) -> $6,000    s2 (.2) -> $16,000   s3 (.4) -> $21,000

Expected Value Approach
  EMV(d1, Model A) = .4(10,000) + .2(15,000) + .4(14,000) = $12,600
  EMV(d2, Model B) = .4(8,000) + .2(18,000) + .4(12,000) = $11,600
  EMV(d3, Model C) = .4(6,000) + .2(16,000) + .4(21,000) = $14,000
Choose the model with the largest expected value: Model C.
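The three EMV computations above can be checked with a short script (the dictionary layout is mine; the payoffs are from the payoff table):

```python
# Reproduce the Burger Prince EMV calculation.
probs = [0.4, 0.2, 0.4]  # P(s1 = 80), P(s2 = 100), P(s3 = 120)
payoffs = {
    "Model A": [10000, 15000, 14000],
    "Model B": [8000, 18000, 12000],
    "Model C": [6000, 16000, 21000],
}
# EMV of each model: probability-weighted sum of its payoffs
emv = {model: sum(p * v for p, v in zip(probs, row)) for model, row in payoffs.items()}
best = max(emv, key=emv.get)
print(emv)   # {'Model A': 12600.0, 'Model B': 11600.0, 'Model C': 14000.0}
print(best)  # Model C
```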

Expected Value of Perfect Information
Frequently, information is available that can improve the probability estimates for the states of nature. The expected value of perfect information (EVPI) is the increase in expected profit that would result if we knew with certainty which state of nature will occur. The EVPI provides an upper bound on the expected value of any sample or survey information.

Expected Value of Perfect Information
The expected value of perfect information is defined as

  EVPI = |EVwPI - EVwoPI|

where:
  EVPI = expected value of perfect information
  EVwPI = expected value with perfect information about the states of nature
  EVwoPI = expected value without perfect information

Expected Value of Perfect Information
EVPI Calculation
  Step 1: Determine the optimal return corresponding to each state of nature.
  Step 2: Compute the expected value of these optimal returns.
  Step 3: Subtract the EV of the optimal decision from the amount determined in Step 2.

Expected Value of Perfect Information
Calculate the expected value of the optimal payoff for each state of nature, then subtract the EV of the optimal decision:

  EVPI = .4(10,000) + .2(18,000) + .4(21,000) - 14,000 = $2,000
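The three-step EVPI procedure can be sketched as follows (payoffs from the Burger Prince table):

```python
# EVPI = (expected value with perfect information) - (best EMV without it).
probs = [0.4, 0.2, 0.4]
payoff_rows = [
    [10000, 15000, 14000],  # Model A
    [8000, 18000, 12000],   # Model B
    [6000, 16000, 21000],   # Model C
]
# Step 1: optimal payoff for each state of nature (best entry in each column)
best_per_state = [max(col) for col in zip(*payoff_rows)]  # [10000, 18000, 21000]
# Step 2: expected value of these optimal payoffs
ev_with_pi = sum(p * v for p, v in zip(probs, best_per_state))
# Step 3: subtract the EV of the optimal decision
ev_without_pi = max(sum(p * v for p, v in zip(probs, row)) for row in payoff_rows)
evpi = ev_with_pi - ev_without_pi
print(ev_with_pi, evpi)  # 16000.0 2000.0
```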

Decision Analysis With Sample Information
Knowledge of sample (survey) information can be used to revise the probability estimates for the states of nature. Prior to obtaining this information, the probability estimates for the states of nature are called prior probabilities. With knowledge of conditional probabilities for the outcomes (indicators) of the sample or survey information, these prior probabilities can be revised using Bayes’ Theorem. The results of this analysis are called posterior probabilities, or branch probabilities for decision trees.

Decision Analysis With Sample Information
Decision Strategy
A decision strategy is a sequence of decisions and chance outcomes, where the decisions chosen depend on the yet-to-be-determined outcomes of the chance events. The approach used to determine the optimal decision strategy is a backward pass through the decision tree.

Decision Analysis With Sample Information
Backward Pass Through the Decision Tree
- At chance nodes: compute the expected value by multiplying the payoff at the end of each branch by the corresponding branch probability and summing.
- At decision nodes: select the decision branch that leads to the best expected value. This expected value becomes the expected value at the decision node.
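The backward pass lends itself to a short recursive sketch (the node representation is mine, not from the text):

```python
# Minimal decision-tree rollback: chance nodes average, decision nodes maximize.
def payoff(v):           return {"kind": "payoff", "value": v}
def chance(*branches):   return {"kind": "chance", "branches": branches}
def decision(*branches): return {"kind": "decision", "branches": branches}

def rollback(node):
    if node["kind"] == "payoff":
        return node["value"]
    if node["kind"] == "chance":
        # expected value: payoff at the end of each branch times branch probability
        return sum(p * rollback(child) for p, child in node["branches"])
    # decision node: select the branch with the best expected value
    return max(rollback(child) for _, child in node["branches"])

# The Burger Prince tree without sample information:
tree = decision(
    ("d1", chance((0.4, payoff(10000)), (0.2, payoff(15000)), (0.4, payoff(14000)))),
    ("d2", chance((0.4, payoff(8000)),  (0.2, payoff(18000)), (0.4, payoff(12000)))),
    ("d3", chance((0.4, payoff(6000)),  (0.2, payoff(16000)), (0.4, payoff(21000)))),
)
print(rollback(tree))  # 14000.0
```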

Decision Analysis With Sample Information
Example: Burger Prince
Burger Prince must decide whether to purchase a marketing survey from Stanton Marketing for $1,000. The result of the survey is either "favorable" or "unfavorable". The conditional probabilities are:
  P(favorable | 80 customers per hour) = .2
  P(favorable | 100 customers per hour) = .5
  P(favorable | 120 customers per hour) = .9

Decision Analysis With Sample Information
Decision Tree (top half): survey result I1 = favorable, P(I1) = .54

  Node 2 (decision, reached from node 1 when I1 occurs)
    d1 -> Node 4:  s1 (.148) -> $10,000   s2 (.185) -> $15,000   s3 (.667) -> $14,000
    d2 -> Node 5:  s1 (.148) -> $8,000    s2 (.185) -> $18,000   s3 (.667) -> $12,000
    d3 -> Node 6:  s1 (.148) -> $6,000    s2 (.185) -> $16,000   s3 (.667) -> $21,000

Decision Analysis With Sample Information
Decision Tree (bottom half): survey result I2 = unfavorable, P(I2) = .46

  Node 3 (decision, reached from node 1 when I2 occurs)
    d1 -> Node 7:  s1 (.696) -> $10,000   s2 (.217) -> $15,000   s3 (.087) -> $14,000
    d2 -> Node 8:  s1 (.696) -> $8,000    s2 (.217) -> $18,000   s3 (.087) -> $12,000
    d3 -> Node 9:  s1 (.696) -> $6,000    s2 (.217) -> $16,000   s3 (.087) -> $21,000

Decision Analysis With Sample Information
If the survey result is favorable (node 2):
  EMV(d1) = .148(10,000) + .185(15,000) + .667(14,000) = $13,593
  EMV(d2) = .148(8,000) + .185(18,000) + .667(12,000) = $12,518
  EMV(d3) = .148(6,000) + .185(16,000) + .667(21,000) = $17,855  (best)
If the survey result is unfavorable (node 3):
  EMV(d1) = .696(10,000) + .217(15,000) + .087(14,000) = $11,433  (best)
  EMV(d2) = .696(8,000) + .217(18,000) + .087(12,000) = $10,518
  EMV(d3) = .696(6,000) + .217(16,000) + .087(21,000) = $9,475

Expected Value of Sample Information
The expected value of sample information (EVSI) is the additional expected profit possible through knowledge of the sample or survey information. It is defined as

  EVSI = |EVwSI - EVwoSI|

where:
  EVSI = expected value of sample information
  EVwSI = expected value with sample information about the states of nature
  EVwoSI = expected value without sample information

Expected Value of Sample Information
EVwSI Calculation
  Step 1: Determine the optimal decision and its expected return for each possible outcome of the sample, using the posterior probabilities for the states of nature.
  Step 2: Compute the expected value of these optimal returns.

Decision Analysis With Sample Information
Rolling back the tree:
  Favorable (I1, .54): EMV(d1) = $13,593, EMV(d2) = $12,518, EMV(d3) = $17,855 (best)
  Unfavorable (I2, .46): EMV(d1) = $11,433 (best), EMV(d2) = $10,518, EMV(d3) = $9,475

  EVwSI = .54(17,855) + .46(11,433) = $14,900.88

Expected Value of Sample Information
If the outcome of the survey is "favorable", choose Model C. If the outcome of the survey is "unfavorable", choose Model A.

  EVwSI = .54($17,855) + .46($11,433) = $14,900.88

Expected Value of Sample Information
EVSI Calculation
Subtract the EVwoSI (the expected value of the optimal decision obtained without the sample information) from the EVwSI:

  EVSI = .54($17,855) + .46($11,433) - $14,000 = $900.88

Conclusion: Because the EVSI ($900.88) is less than the cost of the survey ($1,000), the survey should not be purchased.
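The full EVwSI/EVSI calculation can be reproduced end to end. One caveat: with exact (unrounded) posterior probabilities the figures come out to $14,900 and $900 even; the $14,900.88 and $900.88 above reflect the three-decimal posteriors used on the slides.

```python
# EVSI for Burger Prince, computed with exact posterior probabilities.
priors = {80: 0.4, 100: 0.2, 120: 0.4}
p_favorable = {80: 0.2, 100: 0.5, 120: 0.9}  # P(favorable | state)
payoffs = {
    "Model A": {80: 10000, 100: 15000, 120: 14000},
    "Model B": {80: 8000, 100: 18000, 120: 12000},
    "Model C": {80: 6000, 100: 16000, 120: 21000},
}

ev_wsi = 0.0
for outcome in ("favorable", "unfavorable"):
    cond = p_favorable if outcome == "favorable" else {s: 1 - p for s, p in p_favorable.items()}
    joint = {s: priors[s] * cond[s] for s in priors}     # Step 1: joint probabilities
    marginal = sum(joint.values())                       # Step 2: P(outcome)
    posterior = {s: joint[s] / marginal for s in joint}  # Step 3: posterior distribution
    # best EMV for this survey outcome, weighted by the outcome probability
    best_emv = max(sum(posterior[s] * pay[s] for s in priors) for pay in payoffs.values())
    ev_wsi += marginal * best_emv

ev_wosi = 14000  # best EMV without sample information (Model C)
print(round(ev_wsi, 2), round(ev_wsi - ev_wosi, 2))  # 14900.0 900.0
```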

Computing Branch Probabilities Using Bayes’ Theorem
Bayes’ Theorem can be used to compute branch probabilities for decision trees. For the computations we need to know:
- the initial (prior) probabilities for the states of nature, and
- the conditional probabilities for the outcomes (indicators) of the sample information given each state of nature.
A tabular approach is a convenient method for carrying out the computations.

Computing Branch Probabilities Using Bayes’ Theorem
  Step 1: For each state of nature, multiply the prior probability by the conditional probability of the indicator given that state. This gives the joint probabilities for the states and the indicator.
  Step 2: Sum these joint probabilities over all states. This gives the marginal probability of the indicator.
  Step 3: For each state, divide its joint probability by the marginal probability of the indicator. This gives the posterior probability distribution.
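The three steps translate directly into a few lines of Python (the function name is illustrative):

```python
# Tabular Bayes' Theorem: priors and conditionals in, posteriors and marginal out.
def branch_probabilities(priors, conditionals):
    joint = [p * c for p, c in zip(priors, conditionals)]  # Step 1: joint probabilities
    marginal = sum(joint)                                  # Step 2: P(indicator)
    posteriors = [j / marginal for j in joint]             # Step 3: posterior distribution
    return posteriors, marginal

# Burger Prince, "favorable" survey outcome:
post, p_fav = branch_probabilities([0.4, 0.2, 0.4], [0.2, 0.5, 0.9])
print([round(p, 3) for p in post], round(p_fav, 2))  # [0.148, 0.185, 0.667] 0.54
```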

Decision Analysis With Sample Information
Example: Burger Prince
Recall that Burger Prince is considering purchasing a marketing survey from Stanton Marketing. The result of the survey is either "favorable" or "unfavorable". The conditional probabilities are:
  P(favorable | 80 customers per hour) = .2
  P(favorable | 100 customers per hour) = .5
  P(favorable | 120 customers per hour) = .9
Compute the branch (posterior) probability distribution.

Posterior Probabilities
Favorable survey result:

  State    Prior    Conditional    Joint    Posterior
  80       .4       .2             .08      .148  (= .08/.54)
  100      .2       .5             .10      .185
  120      .4       .9             .36      .667
                    Total          .54      1.000

  P(favorable) = .54

Posterior Probabilities
Unfavorable survey result:

  State    Prior    Conditional    Joint    Posterior
  80       .4       .8             .32      .696  (= .32/.46)
  100      .2       .5             .10      .217
  120      .4       .1             .04      .087
                    Total          .46      1.000

  P(unfavorable) = .46

End of Chapter 21