Civil Systems Planning Benefit/Cost Analysis — Scott Matthews. Courses: 12-706 / 19-702.

Announcements
- Recitation Friday
- HW 3 due today (now)

Risk Profiles (“pmf”)
- A risk profile shows the distribution of possible payoffs associated with a particular strategy: the chances associated with each possible consequence.
- A strategy is what you plan to do going into the decision. It holds your plans constant and allows the chance events to occur.
- Only eliminate branches YOU wouldn't choose, not things “they” might not do (you can't control them).
- Risk profiles are centered around the decision (not chance) nodes in the tree.

Risk Profiles (cont.)
There are only 3 “decision strategies” in the base Texaco case:
- Accept the $2 billion offer (topmost branch of the 1st decision node)
- Counteroffer $5 billion, but plan to refuse the counteroffer (lower branch of the 1st node, upper branch of the 2nd)
- Counteroffer $5 billion, but plan to accept the counteroffer (lower branch of both decision nodes)

Risk Profiles (cont.)
- Key concept: you do not have complete control over the outcome of “the game” or “the lottery” represented by the tree. You can plan a strategy (i.e., which branches to choose), but the other side may make choices such that you do not move through the tree exactly as you intended.
- BUT considering the risk profile for each strategy cuts out part of the original tree.
- The risk profile for “Accept $2 Billion” is obvious: you get $2B with 100% chance.

Profile for “Counteroffer $5B, Refuse Counteroffer”
- Below is just the part of the original tree to consider when calculating this risk profile:

Solving the Risk Profile
- Solve for the discrete probabilities of the outcomes, then make the risk profile.
- e.g., a 25% chance of $0
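The path-multiplication step above can be sketched in a few lines of code. The branch probabilities below (Texaco accepts the counteroffer 0.17, refuses 0.50, counteroffers 0.33; court awards of $10.3B/$5B/$0 with probabilities 0.2/0.5/0.3) are the standard numbers from the Clemen Texaco–Pennzoil case and are an assumption here; the slides themselves only state the resulting 25% chance of $0.

```python
# Risk profile for "Counteroffer $5B, plan to refuse Texaco's counteroffer".
# Branch probabilities are the standard Clemen Texaco-Pennzoil numbers
# (assumed; the slides only give the resulting 25% chance of $0).
from collections import defaultdict

p_accept, p_refuse, p_counter = 0.17, 0.50, 0.33  # Texaco's response to $5B
court = [(0.2, 10.3), (0.5, 5.0), (0.3, 0.0)]     # (prob, award in $billions)

profile = defaultdict(float)
profile[5.0] += p_accept                 # Texaco accepts: Pennzoil gets $5B
for p, award in court:                   # otherwise (we refuse) we go to court
    profile[award] += (p_refuse + p_counter) * p

for payoff in sorted(profile):
    print(f"${payoff}B: {profile[payoff]:.3f}")
```

Multiplying along each path and collecting identical payoffs gives roughly a 25% chance of $0, 58.5% of $5B, and 16.6% of $10.3B under these assumed numbers.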

Cumulative Risk Profiles
- A CRP shows the percent chance that “payoff is less than x.”
- The RP and CRP for “Accept $2B” are below (easy): the CRP jumps from 0 to 1 at $2B, since there is a 0% chance the payoff is below $2B and a 100% chance it is below anything greater than $2B.

CRPs for the Other 2 Strategies

Dominance
- To pick between strategies, it is useful to have rules by which to eliminate options.
- Let's construct an example: assume the minimum “court award” expected is $2.5B (instead of $0). Now there are no “zero endpoints” in the decision tree.

Stochastic Dominance: Example #1
- The CRPs below for the 2 strategies show that “Accept $2 Billion” is dominated by the other strategy.

Stochastic Dominance “Defined”
- A is better than B if: Pr(Profit > $z | A) ≥ Pr(Profit > $z | B) for all possible values of $z.
- Or (by complementarity): Pr(Profit ≤ $z | A) ≤ Pr(Profit ≤ $z | B) for all possible values of $z.
- Equivalently, A FOSD B iff F_A(z) ≤ F_B(z) for all z.

Example
- L1 = (0, 1/6; 1, 1/3; 2, 1/2)
- L2 = (0, 1/3; 1, 1/3; 2, 1/3)
- Given these 2 lotteries, does one first-order stochastically dominate the other?
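The FOSD condition from the previous slide can be checked mechanically by comparing the CDFs of the two lotteries. A minimal sketch (assuming both lotteries are defined over the same outcome grid, as they are here):

```python
# Check first-order stochastic dominance between the two lotteries above.
from itertools import accumulate

# (outcome, probability) pairs, from the slide.
L1 = [(0, 1/6), (1, 1/3), (2, 1/2)]
L2 = [(0, 1/3), (1, 1/3), (2, 1/3)]

def cdf(lottery):
    """Cumulative risk profile: F(z) evaluated at each outcome, in order."""
    probs = [p for _, p in sorted(lottery)]
    return list(accumulate(probs))

def fosd(a, b):
    """True if a FOSD b per the slide's definition: F_a(z) <= F_b(z) for all z
    (assumes a and b share the same outcome grid)."""
    return all(fa <= fb for fa, fb in zip(cdf(a), cdf(b)))

print(fosd(L1, L2), fosd(L2, L1))  # True False
```

Here F_L1 = (1/6, 1/2, 1) lies at or below F_L2 = (1/3, 2/3, 1) everywhere, so L1 first-order stochastically dominates L2, and not vice versa.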

Value of Information
- We have been doing decision analysis with best guesses of probabilities: building trees with chance and decision nodes and finding expected values.
- It is relevant and interesting to determine how important information might be in our decision problems.
- Information could come in the form of paying an expert, a fortune teller, etc. The goal is to reduce or eliminate uncertainty in the decision problem.

Willingness to Pay = EVPI
- We're interested in knowing our WTP for (perfect) information about our decision.
- The book shows this with Bayesian probabilities, but think of it this way: we consider the advice of “an expert who is always right.” If they say it will happen, it will. If they say it will not happen, it will not. They are never wrong.
- Bottom line: receiving their advice means we have eliminated the uncertainty about the event.

Notes on EVPI
- The key is understanding what the relevant information is and how it affects the tree.
- Quotes from pp. 501 and 509 of Clemen:
  - “Redraw the tree so that the uncertainty nodes for which perfect information is (now) available come before the decision node(s).”
  - (When multiple uncertain nodes exist:) “Move those chance nodes for which information is to be obtained so that they (all) precede the decision node.”
- Note: by “before” or “precede” we mean in the tree, from left to right (as opposed to in the tree-solving process).

[Two slides showing the decision tree solved without and then with perfect information.]

Discussion
- The difference between the 2 trees (decision scenarios) is the EVPI: $1,000 − $580 = $420.
- That is the amount up to which you would be willing to pay for advice on how to invest. If you pay less than $420, you would expect to come out ahead, net of the cost of the information; if you pay $425 for the info, you would expect to lose $5 overall!
- Finding EVPI is really simple to do with the PrecisionTree plug-in.
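The $1,000 − $580 = $420 calculation can be reproduced directly. The payoff table below is the standard Clemen investment example (high-risk stock, low-risk stock, savings across Up/Flat/Down markets) and is an assumption here; the slides only show the resulting expected values.

```python
# EVPI sketch for the investment example behind the $580 / $1,000 figures.
# Payoffs and priors are assumed (standard Clemen investment example).
priors = {"Up": 0.5, "Flat": 0.3, "Down": 0.2}
payoffs = {  # payoff of each alternative in each market state, in dollars
    "High-risk stock": {"Up": 1500, "Flat": 100, "Down": -1000},
    "Low-risk stock":  {"Up": 1000, "Flat": 200, "Down": -100},
    "Savings":         {"Up": 500,  "Flat": 500, "Down": 500},
}

def emv(alt):
    return sum(priors[s] * payoffs[alt][s] for s in priors)

# Without information: commit to the alternative with the best EMV ($580).
ev_no_info = max(emv(a) for a in payoffs)

# With perfect information: learn the state first, then pick the best payoff
# in that state, and take the expectation over states ($1,000).
ev_perfect = sum(priors[s] * max(payoffs[a][s] for a in payoffs)
                 for s in priors)

evpi = ev_perfect - ev_no_info
print(ev_no_info, ev_perfect, evpi)
```

Note the structure mirrors the "redraw the tree" quote: with perfect information the chance node comes first, so the inner `max` is taken state by state.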

Is EVPI Additive? Pair Group Exercise
- Let's look at the handout for a simple “2 parts uncertainty” problem considering the choice of where to go for a date, and the utility associated with whether it is fun or not and whether the weather is good or not.
- What is the expected value in this case?
- What is the EVPI for “fun”? The EVPI for “weather”?
- What do the revised decision trees look like?
- What is the EVPI for “fun and weather”?
- Is EVPI(fun) + EVPI(weather) = EVPI(fun and weather)?

Additivity (cont.)
- Now look at the p, q labels on the handout for the decision problem (top values in the tree).
- Is it additive if instead p = 0.3 and q = 0.8? What if p = 0.2 and q = 0.2?
- This should make us think about sensitivity analysis, i.e., how much do answers/outcomes change if we change the inputs?

EVPI: Why Care?
- For information to “have value,” it has to affect our decision.
- Just as tornado diagrams showed us which variables were the most sensitive, EVPI analysis shows us which of our uncertainties is the most important, and thus which to focus further effort on.
- If we can spend some time/money to further understand or reduce an uncertainty, it is worth it when the EVPI is relatively high.

Final Thoughts on Plug-ins
- You can combine the decision trees and the sensitivity plug-ins: do “Sensitivity of Expected Values” by varying the probabilities (see the end of Chapter 5). You can also do EVPI.
- You don't need to do everything by hand, but it helps to be able to.

Visualizing Decision Tree Results
- EMV(Outdoors) = 100p + 70q
- EMV(Indoors) = 40p + 50q + 60(1 − p − q)
- EMV(Outdoors) > EMV(Indoors) when p > −(2/3)q + 1/2
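The inequality follows from 100p + 70q > 40p + 50q + 60(1 − p − q), which simplifies to 120p + 80q > 60, i.e., p > 1/2 − (2/3)q. A quick numerical check of that algebra (the sample (p, q) points are arbitrary):

```python
# Verify that the direct EMV comparison agrees with the derived threshold
# p > 1/2 - (2/3) q for the Outdoors-vs-Indoors decision.
def emv_outdoors(p, q):
    return 100 * p + 70 * q

def emv_indoors(p, q):
    return 40 * p + 50 * q + 60 * (1 - p - q)

for p, q in [(0.5, 0.3), (0.2, 0.2)]:
    direct = emv_outdoors(p, q) > emv_indoors(p, q)
    via_threshold = p > 0.5 - (2 / 3) * q
    print(p, q, direct, via_threshold)
```

At (0.5, 0.3) Outdoors wins (71 vs 47); at (0.2, 0.2) Indoors wins (34 vs 54); in both cases the threshold test gives the same answer.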

Similar: EVII
- Imperfect, rather than perfect, information (because information is rarely perfect).
- Example: our expert acknowledges she is not always right, so we use conditional probabilities (rather than the assumption she is 100% correct all the time) to solve the trees.
- Ideally, the expert is “almost always right” and “almost never wrong”: e.g., P(“Up” predicted | Up) is less than but close to 1, and P(“Up” predicted | Down) is greater than but close to 0.

Assessing the Expert

Expert Side of the EVII Tree
- This is more complicated than EVPI because we do not know whether the expert is right or not. We have to decide whether to believe her.

Use Bayes' Theorem
- “Flip” the probabilities: we know P(“Up” | Up) but instead need P(Up | “Up”).
- P(Up | “Up”) = P(“Up” | Up)·P(Up) / [P(“Up” | Up)·P(Up) + P(“Up” | Flat)·P(Flat) + P(“Up” | Down)·P(Down)]
- = (0.8 × 0.5) / (0.8 × 0.5 + 0.15 × 0.3 + 0.2 × 0.2) = 0.4 / 0.485 = 0.8247
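The “flip” generalizes to the whole posterior distribution over market states. The likelihoods below (P(“Up” predicted | state) of 0.80/0.15/0.20) and the priors (0.5/0.3/0.2) are assumptions consistent with the 0.8247 result on the slide:

```python
# Bayes "flip": from P("Up" predicted | state) to P(state | "Up" predicted).
# Likelihoods and priors are assumed, chosen to match the slide's 0.8247.
priors = {"Up": 0.5, "Flat": 0.3, "Down": 0.2}
p_says_up = {"Up": 0.80, "Flat": 0.15, "Down": 0.20}  # P("Up" pred. | state)

# Posterior: P(state | "Up") = P("Up" | state) P(state) / normalizing constant
total = sum(p_says_up[s] * priors[s] for s in priors)
posterior = {s: p_says_up[s] * priors[s] / total for s in priors}

for s, p in posterior.items():
    print(f"P({s} | \"Up\") = {p:.4f}")
```

The denominator (0.485) is just P(“Up” predicted), so the same loop handles any prediction once its likelihoods are assessed.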

EVII Tree Excerpt

Rolling Back to the Top

Transition
- Speaking of information: the facility case study is for Monday.