Optimizer's Curse in Project Portfolio Selection

Ahti Salo and Juuso Liesiö
Systems Analysis Laboratory, Helsinki University of Technology

Bert de Reyck
Management Science and Operations, London Business School

Characteristics of project portfolio selection
- Large number of proposals
  – Typically dozens or even hundreds of proposals
- Only a fraction can be selected with the available resources
  – Resources other than money may also matter (e.g., critical competences)
- "Value" may be measured with regard to several criteria
  – International collaboration, innovativeness, feasibility of plans
- Reliable information about value is hard to obtain
  – Different experts may give different ratings

Logic behind the optimizer's curse
- Projects offer different amounts of value (e.g., NPV)
- Estimates of projects' values are uncertain
- Decisions are based on these uncertain value estimates
- Projects whose values have been overestimated have a higher chance of getting selected
- Thus the DM should expect to be disappointed with the performance of the selected portfolio

Example: choosing 6 out of 12 projects
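The table behind this example is not reproduced in the transcript, but the mechanism can be illustrated with a minimal Monte Carlo sketch (hypothetical numbers: 12 projects with true values on a 1-5 scale, unbiased estimation errors, and the six highest-rated projects selected). Even though every individual estimate is unbiased, the estimated value of the selected portfolio systematically exceeds its realized value:

```python
import numpy as np

rng = np.random.default_rng(1)
n_projects, n_select, n_runs = 12, 6, 10_000
estimated_means, realized_means = [], []

for _ in range(n_runs):
    true_value = rng.uniform(1, 5, n_projects)               # unknown "true" project values
    estimate = true_value + rng.normal(0, 0.5, n_projects)   # unbiased but noisy estimates
    chosen = np.argsort(estimate)[-n_select:]                 # select the 6 best-looking projects
    estimated_means.append(estimate[chosen].mean())           # value the DM expects to get
    realized_means.append(true_value[chosen].mean())          # value the DM actually gets

print(f"average estimated value of selected projects: {np.mean(estimated_means):.2f}")
print(f"average true value of selected projects:      {np.mean(realized_means):.2f}")
# The first figure exceeds the second: the DM is systematically disappointed.
```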

Value of information and optimality in decision analysis
- The optimizer's curse: skepticism and postdecision surprise in decision analysis (Smith and Winkler, 2006)
  – Positively correlated errors aggravate the curse
- Value of information in project portfolio selection (Keisler, 2004)
  – Different selection rules have an impact on the quality of the selected portfolio
- How bad is the optimizer's curse in project portfolio selection?
- Which selection rules are better than others?

Approach and research questions
- Key questions
  – How do (i) the number and (ii) the quality of evaluation statements affect the quality of the selected project portfolio?
  – What kinds of evaluation and selection procedures outperform others?
- Concepts
  – True value: the value (e.g., quality, research output) that would be produced if the project were funded
  – Estimated value: the value that the expert reports in his/her evaluation statement
  – Optimal portfolio: the portfolio that maximizes the sum of true values (typically not known; it can be determined only if the true values are known)
  – Selected portfolio: the portfolio that maximizes the sum of estimated values
- Results are based on simulation and optimization models
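To make the last two concepts concrete, here is a small sketch (with made-up data, not the study's model) that computes how much of the optimal portfolio's value the selected portfolio attains when all projects have equal cost:

```python
import numpy as np

rng = np.random.default_rng(0)
n_proposals, n_funded = 100, 20

true_value = rng.integers(1, 6, size=n_proposals).astype(float)       # true values on 1..5
estimate = np.clip(true_value + rng.normal(0, 1, n_proposals), 1, 5)  # noisy estimated values

optimal_value = np.sort(true_value)[-n_funded:].sum()        # optimal portfolio: best true values
selected = np.argsort(estimate)[-n_funded:]                  # selected portfolio: best estimates
selected_value = true_value[selected].sum()                  # true value actually realized

print(f"selected portfolio attains {100 * selected_value / optimal_value:.1f} % of the optimum")
```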

Illustration of project evaluation and selection
- 100 project proposals
  – 20 of these will be selected (approval rate 20 %)
- At least one statement on each proposal
  – All statements have the same cost (e.g., about 0.5 % of project costs)
- The "true" underlying value is distributed on the range 1-5
- Evaluation statements convey information about the true value
  – Statements are given on the same 1-5 scale but involve uncertainties
- Statements inform decision making
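These assumptions can be gathered into a single simulation configuration; the names below are illustrative placeholders rather than code from the study:

```python
from dataclasses import dataclass

@dataclass
class EvaluationSetting:
    n_proposals: int = 100          # project proposals submitted
    n_funded: int = 20              # approval rate 20 %
    value_scale: tuple = (1, 5)     # both true value and ratings lie on 1-5
    statement_cost: float = 0.005   # each statement costs about 0.5 % of project cost

setting = EvaluationSetting()
print(setting)
```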

Examples of selection mechanisms
- One-phase ("batch mode")
  – Equally many evaluations (one or several) on each proposal
  – Projects are selected on the basis of the average of the reported ratings
- Two-phase
  1. Discard 50 % of the proposals based on a single evaluation statement
  2. Acquire additional statements on the remaining 50 %
  3. Select projects on the basis of the average of the ratings on the reported statements
(The original slide depicts both mechanisms as flow diagrams: proposals → statements → choose 20 %, and proposals → discard 50 % based on one statement → additional statements on the remaining 50 % → choose 20 %.)
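A minimal sketch of both mechanisms, assuming the rating data are already available as NumPy arrays (the function names and array layout are illustrative assumptions, not the study's code):

```python
import numpy as np

def one_phase(ratings, n_funded):
    """ratings: (n_proposals, n_statements) array; select by average rating."""
    return np.argsort(ratings.mean(axis=1))[-n_funded:]

def two_phase(first_rating, extra_ratings, n_funded):
    """first_rating: one statement per proposal; extra_ratings: further statements,
    acquired (and paid for) only for the shortlisted half in practice."""
    n = len(first_rating)
    shortlist = np.argsort(first_rating)[-(n // 2):]            # discard the weaker 50 %
    combined = np.column_stack([first_rating[shortlist],
                                extra_ratings[shortlist]])       # all statements on the shortlist
    return shortlist[np.argsort(combined.mean(axis=1))[-n_funded:]]
```

In simulations of this kind, the two rules would be compared at equal evaluation cost, since the two-phase rule spends its additional statements only on the shortlisted half.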

Distributions of underlying value and statements
- The "true" underlying value is modelled through a probability distribution
- Evaluation statements depend on the true value
  – "Good" proposals are likely to receive a higher rating on the 1-5 scale
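One way to encode this dependence is a conditional probability table over the 1-5 rating scale; the numbers below are illustrative assumptions only, not the distributions shown on the slide:

```python
import numpy as np

rng = np.random.default_rng(2)

# P(rating | true value): row i gives the rating distribution for true value i+1.
# Mass is concentrated near the true value, so "good" proposals are likely,
# but not certain, to receive a high rating.
rating_given_true = np.array([
    [0.60, 0.25, 0.10, 0.04, 0.01],
    [0.20, 0.50, 0.20, 0.08, 0.02],
    [0.05, 0.20, 0.50, 0.20, 0.05],
    [0.02, 0.08, 0.20, 0.50, 0.20],
    [0.01, 0.04, 0.10, 0.25, 0.60],
])

def draw_statement(true_value: int) -> int:
    """Sample one evaluation statement (1-5) for a proposal with the given true value."""
    return int(rng.choice(np.arange(1, 6), p=rating_given_true[true_value - 1]))

print([draw_statement(5) for _ in range(10)])   # mostly 4s and 5s
```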

Optimizer's curse in the average quality of projects (based on the distributions on the preceding slide)
(Figure: average quality of the selected projects vs. evaluation cost (% of project cost); series: 1-phase and 2-phase selection, each with estimated and realized ("real") values.)

Evaluations help approach the societal optimum
(Figure: value of the selected portfolio as % of the optimal portfolio vs. evaluation cost (% of project cost), for 1-phase and 2-phase selection under small and large uncertainties.)

But justice to the individual is difficult to guarantee
(Figure: share of selected projects (%) that are also in the optimal portfolio vs. evaluation cost (% of project cost), for 1-phase and 2-phase selection under small and large uncertainties.)

Impact of competitive tendering on productivity 1(3)
- Include the effort of proposal preparation
  – Approval rate 20 % (select 20 projects out of 100 proposals)
- When do the benefits of further statements exceed the cost of obtaining them?
  – Evaluation costs are estimated here at 0.5 % of project costs
  – A statement on a 100,000 € project costs 500 €
- Account for the effort required by proposal preparation, too
  – Preparation effort is estimated at 5 % of project costs (100,000 € × 0.05 = 5,000 €)
  – If one statement is obtained on every proposal, the total cost is 20 × 100,000 € + 100 × 5,500 € = 2.55 M€
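The totals on this slide follow directly from the stated percentages; a quick check (assuming, as implied by the 500 € statement cost, a project cost of 100,000 €):

```python
project_cost = 100_000                    # € per funded project (implied by the figures above)
n_proposals, n_funded = 100, 20

statement_cost = 0.005 * project_cost     # 0.5 % -> 500 € per evaluation statement
preparation_cost = 0.05 * project_cost    # 5 %   -> 5,000 € per proposal

total = n_funded * project_cost + n_proposals * (preparation_cost + statement_cost)
print(f"total expenditure: {total / 1e6:.2f} M€")   # 2.55 M€
```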

Impact of competitive tendering on productivity 2(3) (based on larger uncertainties)
(Figure: average value (research production) per unit of total expenditure vs. aggregate preparation and evaluation cost (% of project cost); series: 1-phase selection, 2-phase selection, and random selection with no tendering, at 0 %, 5 %, and 10 % preparation costs.)

Impact of competitive tendering on productivity 3(3)
- Competitive tendering enhances productivity when
  – there is high variability in the quality of proposals
  – the approval rate is high enough
  – proposal preparation does not require excessive effort
  – evaluation statements are reasonably good (i.e., correlated with actual quality)
- Current situation
  – Has the productivity of Finnish research declined?
- Observations
  – The preceding results merely exemplify what kinds of questions can be answered
  – Parameters can be estimated from data (databases, expert judgements)
  – This lends support to improving evaluation and selection processes