1 Selecting Better Portfolios Based on Uncertainty-Adjusted Performance Estimates
Ahti Salo, Juuso Liesiö and Eeva Vilkkumaa
Department of Mathematics and Systems Analysis
Aalto University School of Science and Technology
P.O. Box 11000, Aalto, FINLAND

2 Characteristics of project portfolio selection
• Large number of proposals
– Typically dozens or even hundreds of proposals
• Only a fraction can be selected with the available resources
– Resources other than money may matter as well (e.g., critical competences)
• “Value” may be measured with regard to several criteria
– International collaboration, innovativeness, feasibility of plans
• Reliable information about value is hard to obtain
– Different experts may give different ratings
– The accuracy of evaluation information may vary from one project to the next

3 Logic behind the optimizer’s curse
• Projects offer different amounts of value (e.g., NPV)
• Estimates of the projects’ values are inherently uncertain
• Yet decisions must be based on these uncertain estimates
• In reality, projects whose values have been overestimated have a higher chance of being selected
• Thus the decision maker should expect to be disappointed with the performance of the selected portfolio

4 Example – choose 5 out of 12 projects

5 Value of information and optimality in DA
• Optimizer’s curse: skepticism and postdecision surprise in decision analysis (Smith and Winkler, 2006)
– Choose one out of many alternatives
– Normally distributed values and errors
– Positively correlated errors aggravate the curse
• Value of information in project portfolio selection (Keisler, 2004)
– For some selection rules, the selected portfolio has a much higher value than for other selection rules
– It pays off to devote attention to the design of the selection process

7 Emphasis in the Priority-Setting Process
Salo, A. & J. Liesiö (2006). A Case Study in Participatory Priority-Setting for a Scandinavian Research Programme. International Journal of Information Technology and Decision Making 5(1).

8 Relevance to funding agencies
• High expectations may not necessarily be met
– E.g., biotechnological research in Finland has not led to the emergence of a strong industrial sector
• Management questions:
– Should projects with higher evaluation uncertainties be selected together with projects with lower evaluation uncertainties?
– Should the level of uncertainties be explicitly accounted for in project selection decisions?

9 What if evaluation uncertainties vary across projects?
• Projects whose value has been overestimated are more likely to be selected
• When the competition is strong, more selections are likely to be made from projects with high evaluation errors ⇒ these projects become overrepresented
• Thus, one should pay attention not only to the estimates but also to the uncertainty of the estimates
• How can such uncertainties be systematically accounted for?

10 Example – choose 5 projects out of 12
[Table: true value V and estimate V+e for each of the 12 projects, with the portfolio of the five highest estimates marked against the optimal portfolio. The max-estimates portfolio has an estimated value of 87.56 but an actual value of only 70.19, whereas the optimal portfolio’s actual value is 73.53.]
[Figure: estimates plotted against values; solid markers denote projects with low evaluation uncertainties, dashed markers projects with high evaluation uncertainties.]

11 Selection process
• Select $k$ out of $n$ projects with the aim of maximizing the sum of the projects’ ‘true’ values $\mu_i$, $i = 1,\dots,n$
• The values $\mu_i$ are generally unknown
• Decisions are made based on estimates $V_i$ of $\mu_i$
[Figure: timeline – estimates, then portfolio selection, then realized values over time t]
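The selection rule above is simply a top-k choice on the estimates. A minimal sketch (the function name and the use of numpy are illustrative, not from the slides):

```python
import numpy as np

def select_portfolio(estimates: np.ndarray, k: int) -> np.ndarray:
    """Return the index set S of the k projects with the highest estimates V_i."""
    return np.argsort(estimates)[-k:]

# Example: choose k = 2 out of n = 4 projects.
print(select_portfolio(np.array([0.4, 1.2, -0.3, 0.9]), k=2))  # -> [3 1]
```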

12 Optimizer’s curse in project portfolio selection
• Assume that the estimates are unbiased
• Overestimated projects are more likely to be selected ⇒ the resulting portfolio value is less than what the estimates suggest (optimizer’s curse; cf. Smith and Winkler, 2006):
$E\left[\sum_{i \in S} \mu_i - \sum_{i \in S} V_i\right] \le 0$, where $S$ is the index set of the selected projects.

13 Optimizer’s curse
• Choose 10 projects out of 100
• Values i.i.d. with $\mu_i \sim N(0,1)$
• Unbiased estimates $V_i = \mu_i + \varepsilon_i$, $\varepsilon_i \sim N(0,\sigma^2)$
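A Monte Carlo sketch of this setup (parameters as on the slide; the simulation itself is an illustration, not part of the original deck). On average, the chosen portfolio’s estimated value clearly exceeds its realized value:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, sigma, runs = 100, 10, 1.0, 10_000
gap = np.empty(runs)
for r in range(runs):
    mu = rng.normal(0.0, 1.0, n)        # true values mu_i ~ N(0,1)
    v = mu + rng.normal(0.0, sigma, n)  # unbiased estimates V_i = mu_i + eps_i
    s = np.argsort(v)[-k:]              # select the k largest estimates
    gap[r] = v[s].sum() - mu[s].sum()   # estimated minus realized portfolio value
print(f"average post-decision disappointment: {gap.mean():.2f}")  # > 0
```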

14 Optimal revision of the estimates
• The raw estimates do not account for the uncertainties
• Use Bayesian revised estimates $V_i^*$ instead as a basis for project selection
• For a normal prior $\mu_i \sim N(m_i, \tau_i^2)$ and estimate $V_i = \mu_i + \varepsilon_i$, $\varepsilon_i \sim N(0, \sigma_i^2)$, the estimate and the prior mean $m_i$ are weighted according to their respective levels of uncertainty:
$V_i^* = \alpha_i V_i + (1 - \alpha_i)\, m_i$, where $\alpha_i = \tau_i^2 / (\tau_i^2 + \sigma_i^2)$.
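A sketch of this revision rule (the function is illustrative; the formula is the standard normal–normal posterior mean assumed above):

```python
import numpy as np

def revised_estimates(v, m, tau2, sigma2):
    """V* = alpha*V + (1 - alpha)*m with alpha = tau^2 / (tau^2 + sigma^2)."""
    v, m, tau2, sigma2 = map(np.asarray, (v, m, tau2, sigma2))
    alpha = tau2 / (tau2 + sigma2)
    return alpha * v + (1.0 - alpha) * m

# The noisier the estimate (larger sigma^2), the more strongly it is
# shrunk towards the prior mean:
print(revised_estimates(v=[2.0, 2.0], m=0.0, tau2=1.0, sigma2=[0.25, 4.0]))
# -> [1.6 0.4]
```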

15 Selections based on revised estimates
[Table: true value V, estimate V+e, and revised estimate V* for each of the 12 projects, with the portfolios chosen by maximizing the estimates, by maximizing the revised estimates, and the optimal portfolio. Estimated values: 87.56 (estimates) and 69.27 (revised estimates); realized values: 70.19 (max estimates), 70.91 (max revised estimates), 73.53 (optimal).]
[Figure: estimates vs. values; solid markers denote projects with low evaluation uncertainties, dashed markers projects with high evaluation uncertainties.]

16 Elimination of the optimizer’s curse
• With revised estimates, the optimizer’s curse is eliminated, that is, $E\left[\sum_{i \in S^*} \mu_i - \sum_{i \in S^*} V_i^*\right] = 0$, where $S^*$ is the index set of the projects selected based on the revised estimates
• Previous example
– Choosing 10 projects out of 100
– True values i.i.d. with $\mu_i \sim N(0,1)$
– Unbiased estimates $V_i = \mu_i + \varepsilon_i$, $\varepsilon_i \sim N(0,\sigma^2)$
[Figure: portfolio value as a function of the standard deviation of the estimation error]
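A numerical check of this claim under the setup above (prior mean m = 0 and prior variance tau^2 = 1 are assumed, as in the example): selecting on revised estimates drives the average disappointment to zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, sigma, runs = 100, 10, 1.0, 20_000
alpha = 1.0 / (1.0 + sigma**2)          # alpha = tau^2/(tau^2 + sigma^2), tau^2 = 1
gap = np.empty(runs)
for r in range(runs):
    mu = rng.normal(0.0, 1.0, n)
    v = mu + rng.normal(0.0, sigma, n)
    v_star = alpha * v                  # revised estimates (prior mean m = 0)
    s = np.argsort(v_star)[-k:]
    gap[r] = v_star[s].sum() - mu[s].sum()
print(f"average disappointment with revised estimates: {gap.mean():.3f}")  # ~ 0
```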

17 Revised estimates and portfolio composition
• In this example, the projects’ values were identically distributed and the estimation errors had equal variances
• In this case, the prioritization of projects remains unchanged when using revised estimates, because all estimates are then shrunk towards the common mean by the same increasing affine transformation, which preserves their ranking
• In general, revised estimates may result in a different project prioritization than the initial estimates

18 Example on the revision of estimates
• Choose 3 projects out of 8
• True values are i.i.d. with $\mu_i \sim N(0,1)$
• Left: all projects are equally difficult to evaluate ⇒ equal error variances, $V_i = \mu_i + \varepsilon_i$, $\varepsilon_i \sim N(0, 0.5^2)$
• Right: four projects are harder to evaluate and have a higher error variance, $\varepsilon_i \sim N(0,1)$ ⇒ steeper correction slopes

19 Revision of estimates and portfolio selections
• Left: for equal variances, all estimates are revised towards the mean in the same way
• Right: the more uncertain “dashed” estimates are revised towards the mean more strongly
• Thus different portfolios of three projects would be selected, depending on whether or not the estimates are revised

20 Portfolio for revised estimates
• Revised estimates tend to yield a higher portfolio value
• Example
– Select 10 out of 100 projects with values $\mu_i \sim N(3, 1^2)$
– Two sub-populations: 1) $\varepsilon_i \sim N(0, 0.1)$ – small errors; 2) $\varepsilon_i \sim N(0, 1)$ – large errors
[Figure: realized portfolio values for the optimal portfolio, selection by estimates, and selection by revised estimates]
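A sketch of this two-population example (parameters as read from the slide; whether 0.1 and 1 denote error variances or standard deviations is an assumption — standard deviations are used below). It also reports the share of correct choices discussed on the next slide:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, runs = 100, 10, 5_000
m, tau2 = 3.0, 1.0
sd = np.r_[np.full(n // 2, 0.1), np.full(n // 2, 1.0)]  # small / large errors
alpha = tau2 / (tau2 + sd**2)
val_raw = val_rev = share_raw = share_rev = 0.0
for r in range(runs):
    mu = rng.normal(m, np.sqrt(tau2), n)                # true values
    v = mu + rng.normal(0.0, sd)                        # noisy estimates
    v_star = alpha * v + (1.0 - alpha) * m              # revised estimates
    best = set(np.argsort(mu)[-k:])                     # optimal portfolio K
    s_raw, s_rev = np.argsort(v)[-k:], np.argsort(v_star)[-k:]
    val_raw += mu[s_raw].sum() / runs
    val_rev += mu[s_rev].sum() / runs
    share_raw += len(best & set(s_raw)) / (k * runs)
    share_rev += len(best & set(s_rev)) / (k * runs)
print(f"realized value   estimates: {val_raw:.2f}  revised: {val_rev:.2f}")
print(f"correct choices  estimates: {share_raw:.1%}  revised: {share_rev:.1%}")
```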

21 Share of correct choices
• Using revised estimates increases the share of selected projects that belong to the optimal portfolio, i.e., $|S \cap K| / k$, where $K$ is the index set of the projects in the optimal portfolio and $S$ that of the selected ones
• There is a statistically significant difference between the portfolios ($\alpha = 0.05$) when the share of projects with high evaluation uncertainties is between 25% and 55%
[Figure: share of correct choices [%] vs. share of projects with large error variance [%]]

22 Conclusion
• Selection based on unrevised estimates
– Optimizer’s curse: the value of the portfolio will, on average, be lower than expected
– If the proposals come from populations with different levels of estimation errors, the selected portfolio is likely to contain too many projects from the population with high uncertainties
• Improving the selection process
– Account for evaluation uncertainties by using revised estimates
– Build separate portfolios for sub-populations with different levels of evaluation errors (e.g., a separate budget for ‘high-risk’ projects)
– But do we know how uncertain the evaluations are?