1 Chapter 8 Revising Judgments in the Light of New Information

2 In this chapter we will look at the process of revising initial probability estimates in the light of new information.

3 Bayes' theorem: prior probability + new information → posterior probability

4 The components problem (Fig. 8.1)

5 In total, we would expect 410 (i.e. 140 + 270) components to fail the test. Now the component you selected is one of these 410 components. Of these, only 140 are 'OK', so your posterior probability that the component is 'OK' should be 140/410, which is 0.341, i.e. p(component OK | failed test) = 140/410 = 0.341.

6 Applying Bayes' theorem to the components problem (Fig. 8.2)

7 The steps in the process which we have just applied are summarized below:
(1) Construct a tree with branches representing all the possible events which can occur, and write the prior probabilities for these events on the branches.
(2) Extend the tree by attaching to each branch a new branch which represents the new information you have obtained. On each branch write the conditional probability of obtaining this information given the circumstance represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability by the conditional probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the 'appropriate' joint probability by the sum of the joint probabilities to obtain the required posterior probability.
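These steps are mechanical enough to script. A minimal Python sketch is given below; the prior p(OK) = 0.7 and the likelihoods p(fails test | OK) = 0.2 and p(fails test | defective) = 0.9 are inferred from the counts in the components example (they would imply 700 OK components of which 140 fail, and 300 defective ones of which 270 fail) and should be read as assumptions:

    # Tree-based Bayesian revision: steps (1)-(5) above.
    def revise(priors, likelihoods):
        # Step (3): joint probability = prior x conditional probability.
        joints = {event: priors[event] * likelihoods[event] for event in priors}
        total = sum(joints.values())                                # step (4)
        return {event: j / total for event, j in joints.items()}    # step (5)

    # Components example, with the assumed figures above.
    print(revise({"OK": 0.7, "defective": 0.3}, {"OK": 0.2, "defective": 0.9}))
    # -> {'OK': 0.341..., 'defective': 0.658...}, matching 140/410.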

8 Example: An engineer makes a cursory inspection of a piece of equipment and estimates that there is a 75% chance that it is running at peak efficiency. He then receives a report that the operating temperature of the machine is exceeding 80°C. Past records of operating performance suggest that there is only a 0.3 probability of this temperature being exceeded when the machine is working at peak efficiency. The probability of the temperature being exceeded if the machine is not working at peak efficiency is 0.8. What should be the engineer's revised probability that the machine is operating at peak efficiency? Refer to Fig. 8.3.
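The tree itself is in Fig. 8.3, but the arithmetic is short enough to check directly with Bayes' theorem:

    p(peak efficiency | temperature exceeded)
        = (0.75 × 0.3) / (0.75 × 0.3 + 0.25 × 0.8)
        = 0.225 / 0.425 ≈ 0.529

So the temperature report should lower the engineer's probability that the machine is at peak efficiency from 0.75 to about 0.53.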

9 Another example (more than two events): A company's sales manager estimates that there is a 0.2 probability that sales in the coming year will be high, a 0.7 probability that they will be medium and a 0.1 probability that they will be low. She then receives a sales forecast from her assistant and the forecast suggests that sales will be high. By examining the track record of the assistant's forecasts she is able to obtain the following probabilities:

10 p(high sales forecast given that the market will generate high sales) = 0.9
p(high sales forecast given that the market will generate only medium sales) = 0.6
p(high sales forecast given that the market will generate only low sales) = 0.3
Refer to Fig. 8.4.

11 We obtain the following posterior probabilities:
p(high sales) = 0.18/0.63 = 0.286
p(medium sales) = 0.42/0.63 = 0.667
p(low sales) = 0.03/0.63 = 0.048
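The same routine handles any number of events. A short Python sketch using the figures above:

    # Three-state revision for the sales example.
    priors = {"high": 0.2, "medium": 0.7, "low": 0.1}
    # p(forecast of high sales | actual sales level), from the assistant's track record.
    likelihoods = {"high": 0.9, "medium": 0.6, "low": 0.3}

    joints = {s: priors[s] * likelihoods[s] for s in priors}
    total = sum(joints.values())                  # 0.18 + 0.42 + 0.03 = 0.63
    posteriors = {s: j / total for s, j in joints.items()}
    print(posteriors)  # {'high': 0.286, 'medium': 0.667, 'low': 0.048} (rounded)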

12 The effect of new information on the revision of probability judgments
It is interesting to explore the relative influence which prior probabilities and new information have on the resulting posterior probabilities. Consider a situation where a geologist is not very confident about his prior probabilities and where the test drilling is very reliable.

13 Vague priors and very reliable information

14 The posterior probabilities depend only upon the reliability of the new information. The 'vague' prior probabilities have had no influence on the result.

15 A more general view of the relationship between the 'vagueness' of the prior probabilities and the reliability of the new information can be seen in Figure 8.6.

16 The effect of the reliability of information on the modification of prior probabilities (Fig. 8.6)

17 If the test drilling has only a 50% probability of giving a correct result then its result will not be of any interest and the posterior probability will equal the prior, as shown by the diagonal line on the graph.

18 The more reliable the new information, the greater will be the modification of the prior probabilities. For any given level of reliability, however, this modification is relatively small either where the prior probability is high, or where the prior probability is very small.
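The behaviour described in the last two slides is easy to reproduce numerically. A sketch assuming a symmetric test, i.e. one whose reliability r is the probability of a correct indication whichever state actually holds (the symmetry is an assumption made here for simplicity):

    # Posterior p(event | test indicates event) for a prior p and a
    # symmetric test with reliability r.
    def posterior(p, r):
        return p * r / (p * r + (1 - p) * (1 - r))

    for r in (0.5, 0.7, 0.9, 0.99):
        print(r, [round(posterior(p, r), 3) for p in (0.0, 0.1, 0.5, 0.9, 1.0)])
    # r = 0.5 returns every prior unchanged (the diagonal line in Fig. 8.6),
    # and priors of 0 or 1 are never moved, however reliable the test.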

19 At the extreme, if your prior probability of an event occurring is zero then the posterior probability will also be zero. In general, assigning prior probabilities of zero or one is unwise.

20 Applying Bayes' theorem to a decision problem

Payoff table:
Decision             Low sales    High sales
Hold small stocks    $80 000      $…
Hold large stocks    $20 000      $…

Utility function:
Profit     $20 000    $80 000    $…    $…
Utility    …          …          …     …

21 The retailer estimates that there is a 0.4 probability that sales will be low and a 0.6 probability that they will be high. What level of stocks should he hold? In Figure 8.7(a) it can be seen that his expected utility is maximized if he decides to hold a small stock of the commodity.

22 The retailer's problem with prior probabilities

23 Before implementing his decision the retailer receives a sales forecast which suggests that sales will be high.
p(forecast of high sales | high sales) = 0.75
p(forecast of high sales | low sales) = 0.2

24 Applying Bayes' theorem to the retailer's problem
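The figure carries the tree, but the calculation is quick to verify:

    p(high sales | forecast of high sales)
        = (0.6 × 0.75) / (0.6 × 0.75 + 0.4 × 0.2)
        = 0.45 / 0.53 ≈ 0.849
    p(low sales | forecast of high sales) = 0.08 / 0.53 ≈ 0.151

The forecast shifts the retailer's probability of high sales from 0.6 to about 0.85.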

25 Applying posterior probabilities to the retailer's problem

26 Assessing the value of new information
New information can remove or reduce the uncertainty involved in a decision and thereby increase the expected payoff. The questions that follow are whether it is worth obtaining the information in the first place and, if there are several potential sources of information, which one is to be preferred.

27 The expected value of perfect information
Perfect information is rarely obtainable in practice, but the concept of the expected value of perfect information (EVPI) can still be useful. A problem is used below to show how the value of perfect information can be measured. For simplicity, we will assume that the decision maker is risk neutral, so that the expected monetary value criterion can be applied. Refer to the following figure (described on page 227).

28 Determining the EVPI (Fig. 8.8)

29 Calculating the EVPI

30 If the test is perfectly accurate it would not be worth paying them more than $15 000. It is likely that the test will be less than perfect, in which case the information it yields will be of less value. Nevertheless, the EVPI can be very useful in giving an upper bound to the value of new information.

31 If the manager is risk averse or risk seeking or if he also has non-monetary objectives then it may be worth him paying more or less than this amount.
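The payoff figures for this problem live in Fig. 8.8 and on page 227, so the sketch below uses purely hypothetical numbers; the point is only the mechanics of the EVPI calculation:

    # EVPI = expected payoff with perfect information
    #        - expected payoff of the best action without it.
    # All payoffs and probabilities here are hypothetical.
    payoffs = {"act A": {"state 1": 100_000, "state 2": 40_000},
               "act B": {"state 1": 60_000,  "state 2": 80_000}}
    probs = {"state 1": 0.3, "state 2": 0.7}

    # Without information: choose the action with the highest expected payoff.
    ev_without = max(sum(probs[s] * row[s] for s in probs)
                     for row in payoffs.values())
    # With perfect information: in each state, take that state's best action.
    ev_with = sum(probs[s] * max(row[s] for row in payoffs.values())
                  for s in probs)
    print(ev_with - ev_without)   # 86 000 - 74 000 = 12 000 with these numbers

Substituting the farm problem's actual payoffs should reproduce the $15 000 EVPI quoted above.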

32 The expected value of imperfect information
Suppose that, after making further enquiries, the farm manager discovers that the Ceres test is not perfectly reliable. If the virus is still present in the soil, the test has only a 90% chance of detecting it, while if the virus has been eliminated there is a 20% chance that the test will incorrectly indicate its presence. How much would it now be worth paying for the test?

33 Deciding whether to buy imperfect information

34 If test indicates virus is present

35 If test indicates virus is absent

36 Determining the EVII

37 Expected profit with imperfect information = $…
Expected profit without the information = $…
Expected value of imperfect information (EVII) = $5 155
Refer to page 232.

38 It would not, therefore, be worth paying Ceres more than $5 155 for the test. You will recall that the expected value of perfect information was $15 000, so the value of the information from this test is much less than that of a perfectly reliable test. Of course, the more reliable the new information, the closer its expected value will be to the EVPI.

39 A summary of the main stages
(1) Determine the course of action which would be chosen using only the prior probabilities, and record the expected payoff of this course of action;
(2) Identify the possible indications which the new information can give;
(3) For each indication:
(a) Determine the probability that this indication will occur;
(b) Use Bayes' theorem to revise the probabilities in the light of this indication;
(c) Determine the best course of action in the light of this indication (i.e. using the posterior probabilities) and the expected payoff of this course of action;

40 (4) Multiply the probability of each indication occurring by the expected payoff of the course of action which should be taken if that indication occurs, and sum the resulting products. This gives the expected payoff with imperfect information;
(5) The expected value of the imperfect information is equal to the expected payoff with imperfect information (derived in stage 4) less the expected payoff of the course of action which would be selected using the prior probabilities (derived in stage 1).
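These five stages translate line for line into a short script. As before, the payoff table and test reliabilities below are hypothetical, chosen only to show the mechanics:

    # Expected value of imperfect information (EVII), stages (1)-(5).
    # All numbers are hypothetical.
    payoffs = {"act A": {"s1": 100_000, "s2": 40_000},
               "act B": {"s1": 60_000,  "s2": 80_000}}
    priors = {"s1": 0.3, "s2": 0.7}
    # p(indication | state) for an imperfect test.
    likelihood = {"says s1": {"s1": 0.9, "s2": 0.2},
                  "says s2": {"s1": 0.1, "s2": 0.8}}

    def best_ev(probs):
        # Expected payoff of the best course of action under these probabilities.
        return max(sum(probs[s] * row[s] for s in probs)
                   for row in payoffs.values())

    ev_prior = best_ev(priors)                                  # stage (1)
    ev_with_info = 0.0
    for indication, lik in likelihood.items():                  # stage (2)
        p_ind = sum(priors[s] * lik[s] for s in priors)         # stage (3a)
        posteriors = {s: priors[s] * lik[s] / p_ind
                      for s in priors}                          # stage (3b): Bayes
        ev_with_info += p_ind * best_ev(posteriors)             # stages (3c), (4)

    print(ev_with_info - ev_prior)                              # stage (5): the EVII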