1 CHAPTER 6 (handout) Decision Trees

2 6.1. Introduction
Sequential decision making:
- A sequence of chance-dependent decisions
- Presentation of the analysis can be complex
Decision Trees:
- A pictorial device to represent the problem and the calculations
- Useful for problems with a small number of sequential decisions

3 Another Decision Tree Example
Two boxes, externally identical; we must decide which box we hold:
- a1: guess box 1 (6 black balls, 4 white balls)
- a2: guess box 2 (8 black balls, 2 white balls)
- Correct guess: receive $100
- Wrong guess: receive $0
Prior probabilities: P(θ1) = 0.5, P(θ2) = 0.5

4 Decision Tree
- A connected set of nodes and arcs
- Nodes: join arcs
- Arcs: have direction (left to right)
- Branch: an arc and all elements that follow it
- Two branches from the same initial node cannot have elements in common
- Two nodes cannot be joined by more than one arc

5 Example of a Decision Tree

6 A diagram which is not a tree

7 Types of nodes
- Decision point: choosing the next action (branch)
- Chance node: an uncontrollable probabilistic event
- Terminal node: specifies the final payoff

8 Example of a Sequential Decision Problem: Car Exchange Problem
A person must decide whether to keep or exchange his car at a showroom. There are 2 decisions:
- a1: keep; cost = 1400 SR
- a2: exchange; has 2 possibilities:
  - good buy: P(G) = 0.6, cost = 1200 SR
  - bad buy: P(B) = 0.4, cost = 1600 SR
A good or bad buy can be identified only after buying and using the car. What should he do to minimize his expected cost?

9 Car Exchange Problem (no information)
Payoff (Cost) Matrix

  θ          P(θ)   a1: keep   a2: exchange
  θ1: Good   0.6    1400       1200
  θ2: Bad    0.4    1400       1600
  EV                1400       1360

10 Car exchange decision tree
- Keep → G: 0.6 or B: 0.4 → $1400 either way
- Exchange → G: 0.6 → $1200; B: 0.4 → $1600

11 Car exchange decision tree
- Keep: expected cost = $1400
- Exchange: expected cost = 0.6(1200) + 0.4(1600) = $1360
Exchange has the lower expected cost.
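The rollback arithmetic on this small tree can be sketched in a few lines (Python used purely for illustration; the numbers are the slide's):

```python
# Expected-cost rollback for the car exchange tree.
P_GOOD, P_BAD = 0.6, 0.4

cost_keep = 1400                                # keep: 1400 SR either way
cost_exchange = P_GOOD * 1200 + P_BAD * 1600    # chance node: expectation

best = min(cost_keep, cost_exchange)            # decision node: minimize cost
print(cost_exchange, best)                      # 1360.0 1360.0
```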

12 A Sequential Test Problem: Car Exchange Problem
Assume the person has 5 options for deciding whether to keep or exchange his car:
(i) Decide without extra information
(ii) Decide on the basis of a free road (driving) test
(iii) Decide after an oil consumption test costing $25
(iv) Decide after a combined road/oil test costing $10
(v) Decide sequentially: road test, then possibly an oil test costing $10
In (iv), both tests must be taken. In (v), the oil test is optional, depending on the road test.

13 Car Exchange Problem (with information)
- The decision tree is complicated and cannot fit on one slide
- 5 branches: one per option
- Probabilities after extra information are conditional (posterior)
- To illustrate, we choose the branch of option (v): road test, then, depending on the result, a possible oil test costing $10

14 Car Exchange Problem (with information)
Result of the road test:
- y1: fair, P(y1) = 0.5
- y2: poor, P(y2) = 0.5
Result of the oil consumption test:
- Z1: high, P(Z1|y)
- Z2: medium, P(Z2|y)
- Z3: low, P(Z3|y)

15 Car exchange decision tree (with information)
Road test → y1: 0.5 or y2: 0.5; after each result, choose No test or Oil test (outcomes Z1, Z2, Z3).

16 Car exchange decision tree with information (y1 branch)
y1: 0.5 → choose No test or Oil test. Oil test outcomes: Z1: 0.28, Z2: 0.24, Z3: 0.48, each followed by a choice between a1 and a2.

17 Car exchange decision tree with information (y1 branch)
(Figure: the same y1 branch with the rolled-back expected costs filled in.)

18 Car exchange decision tree with information (y2 branch)
y2: 0.5 → choose No test or Oil test. Oil test outcomes: Z1: 0.32, Z2: 0.26, Z3: 0.42, each followed by a choice between a1 and a2.

19 Car exchange decision tree with information (y2 branch)
(Figure: the same y2 branch with the rolled-back expected costs filled in.)

20 Decision Tree Calculations
- The tree is developed from left to right
- The calculations are made from right to left
- Many calculations are redundant: they belong to inferior solutions and are not needed in the final solution
- Probabilities after extra information (road or oil tests) are conditional (posterior), calculated by Bayes' theorem
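The right-to-left evaluation can be sketched as a small recursion over the three node types (the dict-based tree encoding here is illustrative, not from the slides; for cost trees like the car example, replace `max` with `min`):

```python
# Right-to-left rollback: terminal nodes return their payoff, chance nodes
# a probability-weighted average, decision nodes the best branch.

def rollback(node):
    kind = node["kind"]
    if kind == "terminal":
        return node["payoff"]
    if kind == "chance":
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        return max(rollback(child) for _, child in node["branches"])
    raise ValueError(kind)

# The no-information box tree: pick box 1 or box 2, then chance decides.
tree = {"kind": "decision", "branches": [
    ("a1", {"kind": "chance", "branches": [
        (0.5, {"kind": "terminal", "payoff": 100}),
        (0.5, {"kind": "terminal", "payoff": 0})]}),
    ("a2", {"kind": "chance", "branches": [
        (0.5, {"kind": "terminal", "payoff": 0}),
        (0.5, {"kind": "terminal", "payoff": 100})]}),
]}
print(rollback(tree))  # 50.0
```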

21 Initial Payoff Data (no information)
Payoff (Reward) Matrix

  θ           P(θ)   a1: Box 1   a2: Box 2
  θ1: Box 1   0.5    100         0
  θ2: Box 2   0.5    0           100
  EV                 50          50

22 Initial Probability Data (no information)
Prior Probability Matrix

  θ           P(θ)   B: Black   W: White
  θ1: Box 1   0.5    0.6        0.4
  θ2: Box 2   0.5    0.8        0.2

23 Decision tree without information
- Box 1 (a1): θ1: 0.5 → $100, θ2: 0.5 → $0 → EV = $50
- Box 2 (a2): θ1: 0.5 → $0, θ2: 0.5 → $100 → EV = $50
Either choice has an expected payoff of $50.

24 Decision Tree Example with information
- Samples from the box can be taken; the ball is returned to the box (sampling with replacement)
- Up to 2 samples are allowed
- Cost = $3 per sample
- What is the optimal plan?

25 Posterior probabilities for sample 1
Probability Calculations

  θ      P(θ)   P(B)   P(W)   Joint B   Joint W   Post B   Post W
  θ1     0.5    0.6    0.4    0.30      0.20      0.43     0.67
  θ2     0.5    0.8    0.2    0.40      0.10      0.57     0.33
  Sum                         0.70      0.30
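The table above is Bayes' theorem applied once: multiply prior by likelihood, then normalize by the marginal. A minimal sketch:

```python
# Bayes' theorem for one ball draw: prior x likelihood, normalized.
def posterior(prior, likelihood):
    """prior[i] = P(theta_i); likelihood[i] = P(observation | theta_i)."""
    joint = [p * l for p, l in zip(prior, likelihood)]
    total = sum(joint)                  # marginal probability of the observation
    return total, [j / total for j in joint]

prior = [0.5, 0.5]                      # theta1: box 1, theta2: box 2
p_black, post_black = posterior(prior, [0.6, 0.8])
p_white, post_white = posterior(prior, [0.4, 0.2])
print(round(p_black, 2), [round(x, 2) for x in post_black])  # 0.7 [0.43, 0.57]
print(round(p_white, 2), [round(x, 2) for x in post_white])  # 0.3 [0.67, 0.33]
```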

26 Decision tree with information
First decision: No sample (no information; value $50) or Sample 1 (cost $3).
After sample 1: B: 0.7 or W: 0.3; in each case choose Sample 2 or No sample, then a1 or a2.

27 Posterior probabilities for sample 2 when sample 1 is Black
Probability Calculations

  θ      P(θ)   P(B)   P(W)   Joint B   Joint W   Post B   Post W
  θ1     0.43   0.6    0.4    0.26      0.17      0.36     0.61
  θ2     0.57   0.8    0.2    0.46      0.11      0.64     0.39
  Sum                         0.72      0.28

28 Sample 1 Black, No Sample 2
Posteriors θ1: 0.43, θ2: 0.57; net payoffs $97 (correct guess) and $-3 (wrong):
- a1: 0.43(97) + 0.57(-3) = 40
- a2: 0.57(97) + 0.43(-3) = 54
Best: a2, value $54.

29 Samples 1 & 2 Both Black
Posteriors θ1: 0.36, θ2: 0.64; net payoffs $94 and $-6:
- a1: 0.36(94) + 0.64(-6) = 30
- a2: 0.64(94) + 0.36(-6) = 58
Best: a2, value $58 (vs $54 for stopping after one sample).

30 Sample 1 Black, Sample 2 White
Posteriors θ1: 0.61, θ2: 0.39; net payoffs $94 and $-6:
- a1: 0.61(94) + 0.39(-6) = 55
- a2: 0.39(94) + 0.61(-6) = 33
Best: a1, value $55.

31 Posterior probabilities for sample 2 when sample 1 is White
Probability Calculations

  θ      P(θ)   P(B)   P(W)   Joint B   Joint W   Post B   Post W
  θ1     0.67   0.6    0.4    0.40      0.27      0.61     0.79
  θ2     0.33   0.8    0.2    0.26      0.07      0.39     0.21
  Sum                         0.66      0.34

32 Sample 1 White, No Sample 2
Posteriors θ1: 0.67, θ2: 0.33; net payoffs $97 and $-3:
- a1: 0.67(97) + 0.33(-3) = 64
- a2: 0.33(97) + 0.67(-3) = 30
Best: a1, value $64.

33 Sample 1 White, Sample 2 Black
Posteriors θ1: 0.61, θ2: 0.39; net payoffs $94 and $-6:
- a1: 0.61(94) + 0.39(-6) = 55
- a2: 0.39(94) + 0.61(-6) = 33
Best: a1, value $55.

34 Samples 1 & 2 Both White
Posteriors θ1: 0.79, θ2: 0.21; net payoffs $94 and $-6:
- a1: 0.79(94) + 0.21(-6) = 73
- a2: 0.21(94) + 0.79(-6) = 15
Best: a1, value $73.

35 Decision tree summary of results
(a1: 6B, 4W; a2: 8B, 2W)
- No samples: $50 (a1 or a2)
- Sample 1: B: 0.7 or W: 0.3
  - After B: no 2nd sample → $54 (a2); 2nd sample → 0.72(58) + 0.28(55) = $57.2 (B, 0.72: a2; W, 0.28: a1) — take the 2nd sample
  - After W: no 2nd sample → $64 (a1); 2nd sample → $61.1 — stop after one sample
- Optimal sequential plan: sample once, and sample again only if the first ball is black; value 0.7(57.2) + 0.3(64) ≈ $59.2

36 Decision Tree with Fixed Costs
- Example of a fixed cost: the $3-per-sample cost in the previous example
- If the objective is to maximize expected payoff, constant costs can be deducted either from:
  - the terminal node payoffs, or
  - the expected values

37 Example: Including fixed costs
Sample 1 Black, cost = $3; two equivalent treatments (recall slide 28):
- Deduct from the expected value: θ1: 0.43 → $100, θ2: 0.57 → $0; a1: EV = 43, then 43 - 3 = 40
- Deduct from the terminal payoffs: θ1: 0.43 → $97, θ2: 0.57 → $-3; a1: EV = 40
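With a linear (expected-payoff) objective the two treatments must agree, since E[X - c] = E[X] - c. A quick check with the slide's numbers:

```python
probs = [0.43, 0.57]
payoffs = [100, 0]      # a1 payoffs before the $3 sampling cost
cost = 3

# Deduct the cost from the expected value...
ev_then_deduct = sum(p * x for p, x in zip(probs, payoffs)) - cost
# ...or from each terminal payoff first:
deduct_then_ev = sum(p * (x - cost) for p, x in zip(probs, payoffs))

print(round(ev_then_deduct, 6), round(deduct_then_ev, 6))  # 40.0 40.0
```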

38 Fixed Costs & Utilities
- Utilities can be used instead of payoffs
- If the objective is to maximize expected utility:
  - constant costs must be deducted from the terminal node payoffs
  - net payoffs are converted to net utilities
  - expected values are taken of the utilities of the net payoffs

39 Including fixed costs with utilities
Sample 1 Black, cost = $3:
- Incorrect: θ1: 0.43 → U(100), θ2: 0.57 → U(0), then take EU - U(3)
- Correct: θ1: 0.43 → U(97), θ2: 0.57 → U(-3), then take EU
With a nonlinear utility, E[U(X - c)] ≠ E[U(X)] - U(c) in general, so only the second treatment is valid.
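The two treatments really do diverge once U is nonlinear. A sketch with an exponential (risk-averse) utility — the particular U here is an arbitrary illustration, not from the slides:

```python
import math

def U(x):
    return 1 - math.exp(-x / 50)       # arbitrary risk-averse utility (assumed)

probs, payoffs, cost = [0.43, 0.57], [100, 0], 3

# Correct: expected utility of the net payoffs.
correct = sum(p * U(x - cost) for p, x in zip(probs, payoffs))
# Incorrect: expected utility of gross payoffs, minus U(cost).
incorrect = sum(p * U(x) for p, x in zip(probs, payoffs)) - U(cost)

print(round(correct, 4), round(incorrect, 4))   # the two values differ
```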

40 Allowing an optional 3rd sample
- Suppose now a 3rd sample is allowed, at a cost of $3
- The decision whether or not to take sample 3 depends on the results of samples 1 and 2
- What is the optimal plan?

41 Posterior probabilities for sample 3 after 2 blacks (posteriors from slide 27)
Probability Calculations

  θ      P(θ)   P(B)   P(W)   Joint B   Joint W   Post B   Post W
  θ1     0.36   0.6    0.4    0.22      0.14      0.30     0.53
  θ2     0.64   0.8    0.2    0.51      0.13      0.70     0.47
  Sum                         0.73      0.27

42 Decision tree with optional sample 3
- Sample 1: B: 0.7 or W: 0.3
- B branch: No 2nd sample → $54; Sample 2 → $57.2; after sample 2, choose Sample 3 or No 3rd sample
- W branch: No 2nd sample → $64; Sample 2 → $61.1; after sample 2, choose Sample 3 or No 3rd sample
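The whole sequential plan can be reproduced by a short recursion: at each stage, compare stopping (guess the more likely box now) with paying $3 for another ball and updating by Bayes' theorem. This sketch works in exact arithmetic, so its values differ slightly from the slides' two-decimal rounding (it gives about 58.9 at the root, where the rounded slide values give 59.2):

```python
# p: current P(theta1); n: samples already taken ($3 each; payoff $100 or $0).
LIK_B = (0.6, 0.8)   # P(Black | box 1), P(Black | box 2)

def stop_value(p, n):
    # Guess the more likely box; net payoff 100 - 3n if right, -3n if wrong.
    return max(p, 1 - p) * 100 - 3 * n

def value(p, n, max_samples):
    best = stop_value(p, n)
    if n < max_samples:
        pB = p * LIK_B[0] + (1 - p) * LIK_B[1]       # marginal P(Black)
        post_B = p * LIK_B[0] / pB                   # Bayes update on Black
        post_W = p * (1 - LIK_B[0]) / (1 - pB)       # Bayes update on White
        sample = pB * value(post_B, n + 1, max_samples) \
                 + (1 - pB) * value(post_W, n + 1, max_samples)
        best = max(best, sample)
    return best

print(round(value(0.5, 0, 2), 1))   # about 58.9 with at most two samples
print(round(value(0.5, 0, 3), 1))   # unchanged: a 3rd sample never helps
```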

43 Fixing the number of samples
- Again a 3rd sample is allowed, at a cost of $3 per sample
- Now assume we must decide the number of samples in advance: 0, 1, 2, or 3
- What is the optimal plan?

44 Zero samples
- a1: Box 1: θ1: 0.5 → $100, θ2: 0.5 → $0 → EV = $50
- a2: Box 2: θ1: 0.5 → $0, θ2: 0.5 → $100 → EV = $50
Value with no samples: $50.

45 One Sample
- B: 0.7 → θ1: 0.43, θ2: 0.57; a1: 40, a2: 54 → a2: $54
- W: 0.3 → θ1: 0.67, θ2: 0.33; a1: 64, a2: 30 → a1: $64
Value of the one-sample plan: 0.7(54) + 0.3(64) = $57.

46 Posterior probabilities for 2 samples
Examples (the order of the draws does not matter):
- P(BB|θ1) = 0.6(0.6) = 0.36
- P(BW|θ1) = 0.6(0.4) + 0.4(0.6) = 0.48 (orders BW and WB)
- P(WW|θ1) = 0.4(0.4) = 0.16

  θ          P(θ)   BB     BW     WW
  θ1         0.5    0.36   0.48   0.16
  θ2         0.5    0.64   0.32   0.04
  Marginal          0.50   0.40   0.10
  Post θ1           0.36   0.60   0.80
  Post θ2           0.64   0.40   0.20
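Because sampling is with replacement, the count of black balls is binomial, so this table generalizes to any number of draws. A sketch using the standard library's `math.comb`:

```python
from math import comb

def posteriors_after_n(n, prior=(0.5, 0.5), p_black=(0.6, 0.8)):
    """P(theta1 | k black balls out of n draws), for k = 0..n."""
    out = {}
    for k in range(n + 1):
        # Binomial likelihood of k blacks under each box.
        lik = [comb(n, k) * p**k * (1 - p)**(n - k) for p in p_black]
        joint = [pr * l for pr, l in zip(prior, lik)]
        out[k] = joint[0] / sum(joint)        # posterior P(theta1)
    return out

post = posteriors_after_n(2)
print({k: round(v, 2) for k, v in post.items()})  # {0: 0.8, 1: 0.6, 2: 0.36}
```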

47 Two Samples
- BB: 0.5 → θ1: 0.36, θ2: 0.64; a1: 30, a2: 58 → a2: $58
- BW: 0.4 → θ1: 0.6, θ2: 0.4; a1: 54, a2: 34 → a1: $54
- WW: 0.1 → θ1: 0.8, θ2: 0.2; a1: 74, a2: 14 → a1: $74
Value of the two-sample plan: 0.5(58) + 0.4(54) + 0.1(74) = $58.

48 Posterior probabilities for 3 samples
- P(BBB|θ1) = 0.6³ = 0.216
- P(2 blacks, 1 white|θ1) = 3(0.6)²(0.4) = 0.432 (orders BBW, BWB, WBB)
- P(1 black, 2 whites|θ1) = 3(0.6)(0.4)² = 0.288 (orders BWW, WBW, WWB)
- P(WWW|θ1) = 0.4³ = 0.064

  θ          P(θ)   BBB     BBW     BWW     WWW
  θ1         0.5    0.216   0.432   0.288   0.064
  θ2         0.5    0.512   0.384   0.096   0.008
  Marginal          0.364   0.408   0.192   0.036
  Post θ1           0.30    0.53    0.75    0.89
  Post θ2           0.70    0.47    0.25    0.11

49 Three Samples
- BBB: 0.36 → θ1: 0.30, θ2: 0.70; a1: 21, a2: 61 → a2: $61
- BBW: 0.41 → θ1: 0.53, θ2: 0.47; a1: 44, a2: 38 → a1: $44
- BWW: 0.19 → θ1: 0.75, θ2: 0.25; a1: 66, a2: 16 → a1: $66
- WWW: 0.04 → θ1: 0.89, θ2: 0.11; a1: 80, a2: 2 → a1: $80
Value of the three-sample plan: 0.36(61) + 0.41(44) + 0.19(66) + 0.04(80) ≈ $55.7.

50 Summary of results with fixed number of samples
- 0 samples: $50
- 1 sample: $57
- 2 samples: $58
- 3 samples: $55.7
Best fixed-size plan: take 2 samples ($58).
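These summary values can be recomputed directly: for a fixed n, average the best terminal decision over all possible sample outcomes. This sketch uses exact arithmetic, so the 3-sample value comes out 55.8 rather than the slides' rounded 55.7:

```python
from math import comb

PRIOR = (0.5, 0.5)
P_BLACK = (0.6, 0.8)    # P(Black | box 1), P(Black | box 2)

def fixed_n_value(n, payoff=100, cost=3):
    total = 0.0
    for k in range(n + 1):                        # k black balls observed
        joint = [pr * comb(n, k) * p**k * (1 - p)**(n - k)
                 for pr, p in zip(PRIOR, P_BLACK)]
        # Guess the more likely box; payoffs are net of n sampling costs.
        total += max(joint) * (payoff - cost * n) + min(joint) * (-cost * n)
    return total

print([round(fixed_n_value(n), 1) for n in range(4)])  # [50.0, 57.0, 58.0, 55.8]
```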

51 Value of Sample (New) Information
Results of the previous example:
- With sequential samples (slide 35): ≈ $59.2
- With a fixed number of samples (slide 50): $58
- A 3rd sample is never needed
Questions:
- How many samples should be taken?
- Is it better to decide immediately or only after more information?

52 Expected Value of Information
Assume P(θ1) = p, P(θ2) = 1 - p. Then:

  θ      P(θ)    P(B)   P(W)   Joint B     Joint W
  θ1     p       0.6    0.4    0.6p        0.4p
  θ2     1-p     0.8    0.2    0.8(1-p)    0.2(1-p)
  Sum                          (4-p)/5     (1+p)/5

Posteriors:
  P(θ1|B) = 3p/(4-p),      P(θ1|W) = 2p/(1+p)
  P(θ2|B) = 4(1-p)/(4-p),  P(θ2|W) = (1-p)/(1+p)

53 Expected payoff
- Best expected payoff if Black = 100 max{3p/(4-p), 4(1-p)/(4-p)}
- Best expected payoff if White = 100 max{2p/(1+p), (1-p)/(1+p)}
Expected payoff with one (free) sample:
F(p) = 100 [(4-p)/5] max{3p/(4-p), 4(1-p)/(4-p)} + 100 [(1+p)/5] max{2p/(1+p), (1-p)/(1+p)}
     = 100 [ max{0.6p, 0.8(1-p)} + max{0.4p, 0.2(1-p)} ]
     = max{60p, 80(1-p)} + max{40p, 20(1-p)}
     = max{a, b} + max{c, d}
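F(p) is easy to evaluate directly from its max form, and the piecewise simplification on the next slides can be spot-checked at a point inside each range:

```python
def F(p):
    # Expected payoff when one free sample is observed before guessing.
    return max(60*p, 80*(1 - p)) + max(40*p, 20*(1 - p))

# One point inside each linear piece:
print(F(0.2), 100*(1 - 0.2))   # 0 <= p <= 1/3:   F(p) = 100(1-p)
print(F(0.5), 80 - 40*0.5)     # 1/3 <= p <= 4/7: F(p) = 80 - 40p
print(F(0.8), 100*0.8)         # 4/7 <= p <= 1:   F(p) = 100p
```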

54 Graph of expected payoff
(Figure: F(p) is piecewise linear in p, with breakpoints at p = 1/3 and p = 4/7.)

55 Maximum Expected Payoff
Maximizing F(p) on 0 ≤ p ≤ 1, the graphical solution gives:
- 0 ≤ p ≤ 1/3:   F(p) = 100(1-p)   (= b + d)
- 1/3 ≤ p ≤ 4/7: F(p) = 80 - 40p   (= b + c)
- 4/7 ≤ p ≤ 1:   F(p) = 100p       (= a + c)
For the 1st and 3rd ranges, this equals the expected payoff given only the prior P(θ1) = p, P(θ2) = 1 - p.
Only the 2nd range shows an improvement in expected payoff, so a sample is worth taking only if 1/3 < p < 4/7.

56 Expected Value of Sample Information
Value of sample information = expected improvement in payoff:
= 80 - 40p - (100 - 100p) = 60p - 20,  0 < p < 0.5
= 80 - 40p - 100p = 80 - 140p,         0.5 < p < 1
(This is positive only on the range 1/3 < p < 4/7.)

57 Range of p for sample cost = $3
A sample should be taken only if the improvement exceeds the cost:
- 60p - 20 > 3 → p > 23/60 ≈ 0.383
- 80 - 140p > 3 → p < 0.55
Thus, 0.383 < p < 0.55.
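The EVSI expressions and the cost-$3 cutoffs above can be checked numerically (exact cutoffs: p = 23/60 ≈ 0.383 and p = 77/140 = 0.55):

```python
def evsi(p):
    # Expected value of one sample: F(p) minus the best no-information payoff.
    with_sample = max(60*p, 80*(1 - p)) + max(40*p, 20*(1 - p))
    no_info = max(100*p, 100*(1 - p))
    return with_sample - no_info

print(round(evsi(0.45), 2))             # 7.0  (= 60(0.45) - 20)
print(round(evsi(0.52), 2))             # 7.2  (= 80 - 140(0.52))
print(evsi(0.38) > 3, evsi(0.39) > 3)   # False True: cutoff near 23/60
```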

58 For a fixed number of samples
Posterior probabilities P(θ1|·) after 2 samples (slide 46):
- BB: 0.36, BW: 0.60, WW: 0.80
All of these lie outside the range 0.383 < p < 0.55, so a 3rd sample should not be taken.

59 How many samples?
- So far, the analysis gives the value of 1 sample
- The value of several samples can be estimated similarly
- Maximum number of samples worth considering:
  - Expected payoff with no information = 50
  - Payoff with perfect information = 100
  - Max. no. of samples = (100 - 50)/3 ≈ 16
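The bound in the last bullet is simple arithmetic: information can never be worth more than the gap between the perfect-information and no-information payoffs, so at $3 per sample:

```python
no_info = 50    # expected payoff with no information
perfect = 100   # payoff with perfect information
cost = 3        # cost per sample

max_samples = (perfect - no_info) // cost   # floor of 50/3
print(max_samples)  # 16
```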