Chapter 13 Decision Analysis


Chapter 13 Decision Analysis Md. Abdullah Al Mahmud Assistant Professor Manarat International University

Chapter 13 Decision Analysis
- Decision Making without Probabilities
- Decision Making with Probabilities
- Risk Analysis and Sensitivity Analysis
- Decision Analysis with Sample Information
- Computing Branch Probabilities
- Utility and Decision Making

Decision Analysis: Making Justifiable, Defensible Decisions
Decision analysis is the discipline of evaluating complex alternatives in terms of values and uncertainty. It provides insight into how the defined alternatives differ from one another and suggests new and improved alternatives.

Decision Analysis: Making Justifiable, Defensible Decisions
Humans can understand, compare, and manipulate numbers, so quantifying subjective values and uncertainties helps us understand the decision situation. Building a decision analysis model therefore means creating the model structure and assigning probabilities and values so the model can be computed. The numerical results must then be translated back into words to generate qualitative insight.

Decision Analysis: Making Justifiable, Defensible Decisions
The complexity of the modern world, along with the quantity of information, uncertainty, and risk, makes a rational decision-making framework necessary. The goal of decision analysis is to give guidance, information, insight, and structure to the decision-making process in order to make better, more 'rational' decisions.

Elements of Decision Analysis Models
A sole individual is designated as the decision-maker: for example, the CEO of a company, who is accountable to the shareholders. There is a finite number of possible (future) events called the states of nature (a set of possible scenarios); they are the circumstances under which a decision is made. The states of nature are identified and grouped in a set S, whose members are denoted sj. Set S is a collection of mutually exclusive events, meaning that exactly one state of nature will occur.

Elements of Decision Analysis Models
A finite number of possible decision alternatives (i.e., actions) is available to the decision-maker, and only one action may be taken. The payoff is the return from a specific combination of a decision and a state of nature; different combinations of decisions and states of nature (uncertainty) generate different payoffs. Payoffs are usually shown in tables. In decision analysis a payoff is represented by a positive (+) value for net revenue, income, or profit, and a negative (-) value for expense, cost, or net loss.

Decision Categories
There are different types of decision models that help to analyze different scenarios. Depending on the amount and degree of knowledge we have, the three most widely used types are:
- decision-making under pure uncertainty
- decision-making under risk
- decision-making by buying information (pushing the problem towards the deterministic "pole")

Decision Making without Probabilities
In decision-making under pure uncertainty, the decision maker has absolutely no knowledge, not even about the likelihood of occurrence of any state of nature. In such situations, the decision-maker's behavior is based purely on his/her attitude toward the unknown. Typical attitudes are optimism, pessimism, and least regret.

Decision Making without Probabilities Optimist: The glass is half-full. Pessimist: The glass is half-empty. Manager: The glass is twice as large as it needs to be. Or, as in the following metaphor of a captain in a rough sea: The pessimist complains about the wind; the optimist expects it to change; the realist adjusts the sails.

Decision Making without Probabilities Optimists are right; so are the pessimists. It is up to you to choose which you will be. The optimist sees opportunity in every problem; the pessimist sees problem in every opportunity. Both optimists and pessimists contribute to our society.

Optimistic Approach The optimistic approach would be used by an optimistic decision maker. The decision with the largest possible payoff is chosen. If the payoff table was in terms of costs, the decision with the lowest cost would be chosen.

Conservative Approach The conservative approach would be used by a conservative decision maker. For each decision the minimum payoff is listed and then the decision corresponding to the maximum of these minimum payoffs is selected. (Hence, the minimum possible payoff is maximized.) If the payoff was in terms of costs, the maximum costs would be determined for each decision and then the decision corresponding to the minimum of these maximum costs is selected. (Hence, the maximum possible cost is minimized.)

Minimax Regret Approach The minimax regret approach requires the construction of a regret table or an opportunity loss table. This is done by calculating for each state of nature the difference between each payoff and the largest payoff for that state of nature. Then, using this regret table, the maximum regret for each possible decision is listed. The decision chosen is the one corresponding to the minimum of the maximum regrets.

Example
Consider the following problem with three decision alternatives and three states of nature, with the following payoff table representing profits:

                 States of Nature
                 s1     s2     s3
Decisions  d1     4      4     -2
           d2     0      3     -1
           d3     1      5     -3

Example: Optimistic Approach
An optimistic decision maker would use the optimistic (maximax) approach: choose the decision that has the largest single value in the payoff table.

Decision    Maximum Payoff
  d1              4
  d2              3
  d3              5   <-- maximax decision (maximax payoff = 5)

Example: Conservative Approach
A conservative decision maker would use the conservative (maximin) approach: list the minimum payoff for each decision and choose the decision with the maximum of these minimum payoffs.

Decision    Minimum Payoff
  d1             -2
  d2             -1   <-- maximin decision (maximin payoff = -1)
  d3             -3

Example: Minimax Regret Approach
For the minimax regret approach, first compute a regret table by subtracting each payoff in a column from the largest payoff in that column. In this example, in the first column subtract 4, 0, and 1 from 4; and so on. The resulting regret table is:

          s1     s2     s3
  d1       0      1      1
  d2       4      2      0
  d3       3      0      2

Example: Minimax Regret Approach (continued)
For each decision, list the maximum regret. Choose the decision with the minimum of these values.

Decision    Maximum Regret
  d1              1   <-- minimax decision (minimax regret = 1)
  d2              4
  d3              3
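The three criteria above can be verified with a short script. This is a minimal Python sketch (not part of the original slides); the payoff values are taken from the example table.

```python
# Decision making without probabilities: maximax, maximin, minimax regret.
payoffs = {                      # payoffs[decision][j] = payoff under state s(j+1)
    "d1": [4, 4, -2],
    "d2": [0, 3, -1],
    "d3": [1, 5, -3],
}

# Optimistic (maximax): decision with the largest single payoff.
maximax = max(payoffs, key=lambda d: max(payoffs[d]))

# Conservative (maximin): decision whose worst payoff is largest.
maximin = max(payoffs, key=lambda d: min(payoffs[d]))

# Minimax regret: build the regret table column by column, then minimize the maximum regret.
best_per_state = [max(row[j] for row in payoffs.values()) for j in range(3)]
regret = {d: [best_per_state[j] - row[j] for j in range(3)] for d, row in payoffs.items()}
minimax_regret = min(regret, key=lambda d: max(regret[d]))

print(maximax, maximin, minimax_regret)   # d3, d2, d1 for this payoff table
```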

Decision Making with Probabilities Expected Value Approach If probabilistic information regarding the states of nature is available, one may use the expected value (EV) approach. Here the expected return for each decision is calculated by summing the products of the payoff under each state of nature and the probability of the respective state of nature occurring. The decision yielding the best expected return is chosen.

Expected Value of a Decision Alternative
The expected value of a decision alternative is the sum of weighted payoffs for the decision alternative. The expected value (EV) of decision alternative di is defined as:

EV(di) = Σ (j = 1 to N) P(sj) Vij

where:
N = the number of states of nature
P(sj) = the probability of state of nature sj
Vij = the payoff corresponding to decision alternative di and state of nature sj

Example: Burger Prince Burger Prince Restaurant is contemplating opening a new restaurant on Main Street. It has three different models, each with a different seating capacity. Burger Prince estimates that the average number of customers per hour will be 80, 100, or 120. The payoff table for the three models is on the next slide.

Example: Burger Prince
Payoff Table (Average Number of Customers Per Hour)

            s1 = 80     s2 = 100    s3 = 120
Model A     $10,000     $15,000     $14,000
Model B     $ 8,000     $18,000     $12,000
Model C     $ 6,000     $16,000     $21,000

Example: Burger Prince Expected Value Approach Calculate the expected value for each decision. The decision tree on the next slide can assist in this calculation. Here d1, d2, d3 represent the decision alternatives of models A, B, C, and s1, s2, s3 represent the states of nature of 80, 100, and 120.

Example: Burger Prince
Decision Tree
Decision node 1 branches to the three alternatives; each leads to a chance node with state probabilities P(s1) = .4, P(s2) = .2, P(s3) = .4 and the payoffs from the table:
  d1 (Model A), node 2: payoffs 10,000 / 15,000 / 14,000
  d2 (Model B), node 3: payoffs  8,000 / 18,000 / 12,000
  d3 (Model C), node 4: payoffs  6,000 / 16,000 / 21,000

Example: Burger Prince
Expected Value for Each Decision
EMV(d1, Model A) = .4(10,000) + .2(15,000) + .4(14,000) = $12,600
EMV(d2, Model B) = .4(8,000) + .2(18,000) + .4(12,000) = $11,600
EMV(d3, Model C) = .4(6,000) + .2(16,000) + .4(21,000) = $14,000
Choose the model with the largest EV: Model C.
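As an illustrative check (not part of the original slides), the expected values can be computed directly from the payoff table and the prior probabilities:

```python
# Expected value approach for the Burger Prince example.
priors = [0.4, 0.2, 0.4]                 # P(s1 = 80), P(s2 = 100), P(s3 = 120)
payoffs = {
    "Model A": [10_000, 15_000, 14_000],
    "Model B": [8_000, 18_000, 12_000],
    "Model C": [6_000, 16_000, 21_000],
}

ev = {model: sum(p * v for p, v in zip(priors, row)) for model, row in payoffs.items()}
best = max(ev, key=ev.get)

for model, value in ev.items():
    print(f"{model}: EV = ${value:,.0f}")
print("Recommended decision:", best)     # Model C with EV = $14,000
```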

Expected Value of Perfect Information Frequently information is available which can improve the probability estimates for the states of nature. The expected value of perfect information (EVPI) is the increase in the expected profit that would result if one knew with certainty which state of nature would occur. The EVPI provides an upper bound on the expected value of any sample or survey information.

Expected Value of Perfect Information EVPI Calculation Step 1: Determine the optimal return corresponding to each state of nature. Step 2: Compute the expected value of these optimal returns. Step 3: Subtract the EV of the optimal decision from the amount determined in step (2).

Example: Burger Prince
Expected Value of Perfect Information
Calculate the expected value of the optimum payoff for each state of nature and subtract the EV of the optimal decision:
EVPI = .4(10,000) + .2(18,000) + .4(21,000) - 14,000 = $2,000
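A sketch of the same EVPI calculation in Python, assuming the payoff table and priors used above:

```python
# Expected value of perfect information (EVPI) for the Burger Prince example.
priors = [0.4, 0.2, 0.4]
payoffs = {
    "Model A": [10_000, 15_000, 14_000],
    "Model B": [8_000, 18_000, 12_000],
    "Model C": [6_000, 16_000, 21_000],
}

# Without extra information we take the best expected value (Model C, $14,000).
ev_without_info = max(sum(p * v for p, v in zip(priors, row)) for row in payoffs.values())

# With perfect information we would always pick the best payoff for the realized state.
ev_with_info = sum(p * max(row[j] for row in payoffs.values())
                   for j, p in enumerate(priors))

evpi = ev_with_info - ev_without_info
print(f"EVPI = ${evpi:,.0f}")            # $2,000
```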

Risk Analysis Risk analysis helps the decision maker recognize the difference between: the expected value of a decision alternative, and the payoff that might actually occur The risk profile for a decision alternative shows the possible payoffs for the decision alternative along with their associated probabilities.

Example: Burger Prince
Risk Profile for the Model C Decision Alternative
[Bar chart of probability versus profit ($ thousands): the possible payoffs for Model C are $6,000 with probability .4, $16,000 with probability .2, and $21,000 with probability .4.]

Sensitivity Analysis
Sensitivity analysis can be used to determine how changes to the following inputs affect the recommended decision alternative:
- probabilities for the states of nature
- values of the payoffs
If a small change in the value of one of the inputs causes a change in the recommended decision alternative, extra effort and care should be taken in estimating the input value.
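One simple way to explore this is to sweep one probability and watch the recommended decision. The sketch below is not from the slides; it holds P(s2) = .2 fixed, varies P(s1), and sets P(s3) = 1 - .2 - P(s1) so the probabilities still sum to 1.

```python
# Sensitivity of the Burger Prince recommendation to P(s1), with P(s2) held at .2.
payoffs = {
    "Model A": [10_000, 15_000, 14_000],
    "Model B": [8_000, 18_000, 12_000],
    "Model C": [6_000, 16_000, 21_000],
}

for p1 in [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]:
    priors = [p1, 0.2, 0.8 - p1]         # P(s3) absorbs the change so the total stays 1
    ev = {m: sum(p * v for p, v in zip(priors, row)) for m, row in payoffs.items()}
    best = max(ev, key=ev.get)
    print(f"P(s1) = {p1:.1f}: recommend {best} (EV = ${ev[best]:,.0f})")
```

For this payoff table the recommendation switches from Model C to Model A somewhere between P(s1) = .5 and P(s1) = .6, which is exactly the kind of threshold sensitivity analysis is meant to reveal.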

Bayes’ Theorem and Posterior Probabilities Knowledge of sample or survey information can be used to revise the probability estimates for the states of nature. Prior to obtaining this information, the probability estimates for the states of nature are called prior probabilities. With knowledge of conditional probabilities for the outcomes or indicators of the sample or survey information, these prior probabilities can be revised by employing Bayes' Theorem. The outcomes of this analysis are called posterior probabilities or branch probabilities for decision trees.

Computing Branch Probabilities Branch (Posterior) Probabilities Calculation Step 1: For each state of nature, multiply the prior probability by its conditional probability for the indicator -- this gives the joint probabilities for the states and indicator. Step 2: Sum these joint probabilities over all states -- this gives the marginal probability for the indicator. Step 3: For each state, divide its joint probability by the marginal probability for the indicator -- this gives the posterior probability distribution.

Expected Value of Sample Information The expected value of sample information (EVSI) is the additional expected profit possible through knowledge of the sample or survey information.

Expected Value of Sample Information EVSI Calculation Step 1: Determine the optimal decision and its expected return for the possible outcomes of the sample using the posterior probabilities for the states of nature. Step 2: Compute the expected value of these optimal returns. Step 3: Subtract the EV of the optimal decision obtained without using the sample information from the amount determined in step (2).

Efficiency of Sample Information Efficiency of sample information is the ratio of EVSI to EVPI. As the EVPI provides an upper bound for the EVSI, efficiency is always a number between 0 and 1.

Example: Burger Prince Sample Information Burger Prince must decide whether or not to purchase a marketing survey from Stanton Marketing for $1,000. The results of the survey are "favorable" or "unfavorable". The conditional probabilities are: P(favorable | 80 customers per hour) = .2 P(favorable | 100 customers per hour) = .5 P(favorable | 120 customers per hour) = .9 Should Burger Prince have the survey performed by Stanton Marketing?

Example: Burger Prince
Influence Diagram
[Decision nodes: Market Survey, Restaurant Size. Chance nodes: Market Survey Results, Avg. Number of Customers Per Hour. Consequence node: Profit.]

Example: Burger Prince
Posterior Probabilities (Favorable)

State    Prior    Conditional    Joint    Posterior
  80       .4         .2          .08       .148
 100       .2         .5          .10       .185
 120       .4         .9          .36       .667
                        Total:    .54      1.000

P(favorable) = .54

Example: Burger Prince
Posterior Probabilities (Unfavorable)

State    Prior    Conditional    Joint    Posterior
  80       .4         .8          .32       .696
 100       .2         .5          .10       .217
 120       .4         .1          .04       .087
                        Total:    .46      1.000

P(unfavorable) = .46
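The three branch-probability steps can be reproduced with a short script. This is a minimal sketch assuming the priors and survey conditionals given above:

```python
# Posterior (branch) probabilities via Bayes' theorem for the Burger Prince survey.
states = [80, 100, 120]
priors = [0.4, 0.2, 0.4]
p_fav_given_state = [0.2, 0.5, 0.9]      # P(favorable | state)

def posteriors(priors, conditionals):
    joints = [p * c for p, c in zip(priors, conditionals)]       # step 1: joint probabilities
    marginal = sum(joints)                                       # step 2: marginal of the indicator
    return marginal, [j / marginal for j in joints]              # step 3: posterior distribution

p_fav, post_fav = posteriors(priors, p_fav_given_state)
p_unf, post_unf = posteriors(priors, [1 - c for c in p_fav_given_state])

print(f"P(favorable) = {p_fav:.2f}, posteriors = {[round(p, 3) for p in post_fav]}")
print(f"P(unfavorable) = {p_unf:.2f}, posteriors = {[round(p, 3) for p in post_unf]}")
```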

Example: Burger Prince
Decision Tree (top half)
If the survey result is favorable, I1 (probability .54), node 2 branches to d1, d2, d3 (chance nodes 4, 5, 6), each with posterior probabilities s1 (.148), s2 (.185), s3 (.667) and the same payoffs as before:
  d1 (node 4): $10,000 / $15,000 / $14,000
  d2 (node 5): $ 8,000 / $18,000 / $12,000
  d3 (node 6): $ 6,000 / $16,000 / $21,000

Example: Burger Prince
Decision Tree (bottom half)
If the survey result is unfavorable, I2 (probability .46), node 3 branches to d1, d2, d3 (chance nodes 7, 8, 9), each with posterior probabilities s1 (.696), s2 (.217), s3 (.087):
  d1 (node 7): $10,000 / $15,000 / $14,000
  d2 (node 8): $ 8,000 / $18,000 / $12,000
  d3 (node 9): $ 6,000 / $16,000 / $21,000

Example: Burger Prince
Favorable survey result, I1 (.54) — best decision d3 with EMV $17,855:
EMV(node 4, d1) = .148(10,000) + .185(15,000) + .667(14,000) = $13,593
EMV(node 5, d2) = .148(8,000) + .185(18,000) + .667(12,000) = $12,518
EMV(node 6, d3) = .148(6,000) + .185(16,000) + .667(21,000) = $17,855
Unfavorable survey result, I2 (.46) — best decision d1 with EMV $11,433:
EMV(node 7, d1) = .696(10,000) + .217(15,000) + .087(14,000) = $11,433
EMV(node 8, d2) = .696(8,000) + .217(18,000) + .087(12,000) = $10,518
EMV(node 9, d3) = .696(6,000) + .217(16,000) + .087(21,000) = $9,475

Example: Burger Prince
Expected Value of Sample Information
If the outcome of the survey is "favorable", choose Model C. If it is "unfavorable", choose Model A.
EVSI = .54($17,855) + .46($11,433) - $14,000 = $900.88
Since this is less than the $1,000 cost of the survey, the survey should not be purchased.

Example: Burger Prince Efficiency of Sample Information The efficiency of the survey: EVSI/EVPI = ($900.88)/($2000) = .4504
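Tying the pieces together, here is a minimal sketch (assuming the payoffs, priors, and survey conditionals used above) that computes EVSI and its efficiency; with exact posteriors it gives EVSI ≈ $900, slightly below the slides' $900.88, which comes from rounded posterior probabilities.

```python
# EVSI and efficiency of sample information for the Burger Prince example.
priors = [0.4, 0.2, 0.4]
p_fav_given_state = [0.2, 0.5, 0.9]      # P(favorable | 80, 100, 120 customers/hour)
payoffs = {
    "Model A": [10_000, 15_000, 14_000],
    "Model B": [8_000, 18_000, 12_000],
    "Model C": [6_000, 16_000, 21_000],
}

def best_ev(probs):
    """Expected value of the best decision under the given state probabilities."""
    return max(sum(p * v for p, v in zip(probs, row)) for row in payoffs.values())

ev_no_info = best_ev(priors)                                   # $14,000 (Model C)

evsi = 0.0
for conds in (p_fav_given_state, [1 - c for c in p_fav_given_state]):
    joints = [p * c for p, c in zip(priors, conds)]
    marginal = sum(joints)                                     # P(indicator)
    posteriors = [j / marginal for j in joints]
    evsi += marginal * best_ev(posteriors)                     # best decision for each survey outcome
evsi -= ev_no_info

ev_with_perfect = sum(p * max(row[j] for row in payoffs.values())
                      for j, p in enumerate(priors))
evpi = ev_with_perfect - ev_no_info

# Exact posteriors give EVSI = $900.00; the slides' $900.88 uses rounded posteriors.
print(f"EVSI = ${evsi:,.2f}, EVPI = ${evpi:,.2f}, efficiency = {evsi / evpi:.4f}")
```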

Meaning of Utility Utilities are used when the decision criteria must be based on more than just expected monetary values. Utility is a measure of the total worth of a particular outcome, reflecting the decision maker’s attitude towards a collection of factors. Some of these factors may be profit, loss, and risk. This analysis is particularly appropriate in cases where payoffs can assume extremely high or extremely low values.

Example: Weiss Advisors
Weiss Advisors have analyzed the profit potential of five different investments. The probabilities of the gains on $1,000 are as follows:

                        Gain
Investment     $0     $200    $500    $1000
    A          .9       0       0       .1
    B           0      .8      .2        0
    C          .05     .9       0      .05
    D           0      .8      .1       .1
    E          .6       0      .3       .1

One of Weiss' investors informs Weiss that he is indifferent between investments A, B, and C.

Example: Weiss Advisors
Developing Utilities for Payoffs
Assign a utility of 10 to a $1000 gain and a utility of 0 to a gain of $0. Let x = the utility of a $200 gain and y = the utility of a $500 gain. The expected utility of investment A is then .9(0) + .1(10) = 1. Since the investor is indifferent between investments A and C, the expected utility of investment C must equal the expected utility of investment A, namely 1. But the expected utility of investment C is .05(0) + .90x + .05(10). Setting this equal to 1 and solving for x gives x = 5/9.

Example: Weiss Advisors
Developing Utilities for Payoffs (continued)
Also, since the investor is indifferent between A, B, and C, the expected utility of investment B must be 1. Thus, 0(0) + .8(5/9) + .2y + 0(10) = 1. Solving for y gives y = 25/9. Thus the utility values for gains of $0, $200, $500, and $1000 are 0, 5/9, 25/9, and 10, respectively.
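These values can be checked numerically. A small sketch (using only the indifference conditions stated above):

```python
# Solve for the utilities of the $200 and $500 gains from the indifference conditions.
# Scale: U($0) = 0, U($1000) = 10.
u0, u1000 = 0.0, 10.0

eu_a = 0.9 * u0 + 0.1 * u1000                     # expected utility of A = 1
x = (eu_a - 0.05 * u0 - 0.05 * u1000) / 0.90      # from EU(C) = EU(A): .05(0) + .9x + .05(10) = 1
y = (eu_a - 0.8 * x) / 0.2                        # from EU(B) = EU(A): .8x + .2y = 1

print(f"U($200) = {x:.4f} (5/9 = {5/9:.4f}), U($500) = {y:.4f} (25/9 = {25/9:.4f})")
```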

Expected Utility Approach
Once a utility function has been determined, the optimal decision can be chosen using the expected utility approach. Here, the expected utility for each decision alternative is computed as:

EU(di) = Σ (j = 1 to N) P(sj) Uij

where Uij is the utility of the payoff corresponding to decision alternative di and state of nature sj. The decision alternative with the highest expected utility is chosen.

Example: Risk Avoider
Consider a three-state, three-decision problem with the following payoff table in dollars:

             s1         s2         s3
  d1     +100,000    +40,000    -60,000
  d2      +50,000    +20,000    -30,000
  d3      +20,000    +20,000    -10,000

The probabilities for the three states of nature are P(s1) = .1, P(s2) = .3, and P(s3) = .6.

Example: Risk Avoider
Utility Table for the Decision Maker
[Table of the decision maker's utilities Uij for each decision d1, d2, d3 under states s1, s2, s3; the values appeared on the original slide.]

Example: Risk Avoider
Expected Utility
With state probabilities P(s1) = .1, P(s2) = .3, P(s3) = .6, the expected utility of each decision is computed from the utility table. The decision maker should choose decision d3.
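The original utility values are not reproduced here, so the sketch below substitutes a hypothetical exponential (risk-averse) utility function purely to illustrate the mechanics; the specific function and its risk-tolerance parameter R are assumptions, not the slide's numbers.

```python
import math

# Expected utility approach with a *hypothetical* risk-averse utility function.
# U(x) = 1 - exp(-x / R) is a common exponential form; R (risk tolerance) is assumed.
R = 50_000

def utility(x):
    return 1 - math.exp(-x / R)

probs = [0.1, 0.3, 0.6]
payoffs = {
    "d1": [100_000, 40_000, -60_000],
    "d2": [50_000, 20_000, -30_000],
    "d3": [20_000, 20_000, -10_000],
}

eu = {d: sum(p * utility(v) for p, v in zip(probs, row)) for d, row in payoffs.items()}
best = max(eu, key=eu.get)
print({d: round(u, 3) for d, u in eu.items()}, "-> choose", best)   # d3 for this utility
```

For this risk-averse utility, d3 comes out on top, consistent with the slide's conclusion that the decision maker should choose d3.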

End of Chapter 13 Md. Abdullah Al Mahmud Assistant Professor Manarat International University