
Slide 1: Incomplete Holistic Comparisons in Value Tree Analysis
Antti Punkka and Ahti Salo
Systems Analysis Laboratory, Helsinki University of Technology
P.O. Box 1100, 02015 HUT, Finland
http://www.sal.hut.fi/

Slide 2: Value tree analysis
[Figure: a value tree with the overall goal (a_0) at the root, attributes a_1-a_5 beneath it, and alternatives x_1-x_3 evaluated against the attributes.]

Slide 3: Value tree analysis
- m alternatives X = {x_1, ..., x_m}, n attributes A = {a_1, ..., a_n}
- Additive value function V(x_j) = Σ_{i=1}^{n} w_i v_i(x_j), where v_i(x_j) is the normalized score of x_j with regard to attribute a_i
- Least and most preferred achievement levels
  - all attributes relevant
  - attribute weight w_i represents the improvement in overall value when an alternative's achievement with regard to attribute a_i changes from the least to the most preferred level
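To make the additive model concrete, here is a minimal Python sketch (not part of the original slides) that evaluates alternatives from given weights and normalized scores; all numbers are illustrative only.

```python
import numpy as np

def overall_value(weights, scores):
    """Additive overall value V(x_j) = sum_i w_i * v_i(x_j).

    weights: array (n,), nonnegative attribute weights summing to 1
    scores:  array (m, n), normalized scores v_i(x_j) in [0, 1]
    Returns an array (m,) with the overall value of each alternative.
    """
    weights = np.asarray(weights, dtype=float)
    scores = np.asarray(scores, dtype=float)
    return scores @ weights

# Example: three alternatives, three attributes
w = np.array([0.5, 0.3, 0.2])
v = np.array([[1.0, 0.2, 0.6],
              [0.4, 1.0, 0.0],
              [0.0, 0.7, 1.0]])
print(overall_value(w, v))  # overall values of x_1, x_2, x_3
```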

Slide 4: Weight elicitation
- Complete information
  - captured by point estimates
  - e.g., SMART (Edwards 1977)
- Incomplete information
  - weight and weight ratio intervals
  - e.g., PAIRS (Salo and Hämäläinen 1992), PRIME (Salo and Hämäläinen 2001)
- Ordinal information
  - ask the DM to rank the attributes in terms of importance
  - e.g., rank sum weights (Stillwell et al. 1981)
  - incomplete ordinal information (RICH; Salo and Punkka 2003)

Slide 5: Incomplete information
- Complete information may be hard to acquire
  - alternatives and their impacts?
  - relative importance of attributes?
- Alternatives' overall values can be represented as intervals
  - e.g., the smallest and the largest possible overall value of x_j can be solved from the LPs min / max Σ_i w_i v_i(x_j) subject to w ∈ S, where S is the feasible region for the attribute weights based on the DM's preference statements
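The value-interval LPs can be sketched with scipy's linprog as below; the constraint format (A_ub w ≤ b_ub on top of w ≥ 0, Σ w_i = 1) and the example statement w_1 ≥ w_2 are illustrative assumptions, not data from the slides.

```python
import numpy as np
from scipy.optimize import linprog

def value_interval(scores_row, A_ub, b_ub):
    """Min and max overall value of one alternative over the feasible weights.

    scores_row: array (n,) of the alternative's normalized scores v_i(x_j)
    A_ub, b_ub: inequality constraints A_ub @ w <= b_ub encoding the DM's
                preference statements (in addition to w >= 0, sum w = 1)
    """
    n = len(scores_row)
    A_eq = np.ones((1, n))        # weights sum to one
    b_eq = np.array([1.0])
    bounds = [(0.0, 1.0)] * n
    lo = linprog(scores_row, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                 bounds=bounds, method="highs")
    hi = linprog(-scores_row, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                 bounds=bounds, method="highs")
    return lo.fun, -hi.fun

# Illustrative feasible region with two attributes: the statement w_1 >= w_2
# becomes the row -w_1 + w_2 <= 0.
A_ub = np.array([[-1.0, 1.0]])
b_ub = np.array([0.0])
print(value_interval(np.array([0.8, 0.1]), A_ub, b_ub))
```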

Slide 6: Pairwise dominance
- Alternative x_k dominates x_j in the sense of pairwise dominance if V(x_k) ≥ V(x_j) for all weights and scores in the feasible region S, with strict inequality for some, where S is the feasible region for scores and weights
  - a dominated alternative is non-optimal whenever the DM's preference statements are fulfilled => it can be discarded
  - e.g., in a problem with two attributes and feasible weights 0.4 ≤ w_1 ≤ 0.7 (w_2 = 1 - w_1), x_1 dominates x_2
- Alternatives may remain non-dominated, however
  - decision rules assist the DM in selecting the most preferred one
[Figure: overall values V of x_1 and x_2 plotted over w_1 from 0.4 to 0.7 (w_2 from 0.6 to 0.3); the value of x_1 exceeds that of x_2 throughout, so x_1 dominates x_2.]
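A possible dominance check when the scores are fixed and only the weights are incomplete: minimize and maximize the value difference over the feasible weights. The helper below reuses the constraint format of the previous sketch; the numbers are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def dominates(scores_k, scores_j, A_ub, b_ub, tol=1e-9):
    """True if x_k dominates x_j: V(x_k) - V(x_j) >= 0 for all feasible
    weights and > 0 for some feasible weights (scores assumed fixed)."""
    d = np.asarray(scores_k, dtype=float) - np.asarray(scores_j, dtype=float)
    n = len(d)
    A_eq, b_eq = np.ones((1, n)), np.array([1.0])
    bounds = [(0.0, 1.0)] * n
    worst = linprog(d, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=bounds, method="highs")   # min of the value difference
    best = linprog(-d, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                   bounds=bounds, method="highs")    # max of the value difference
    return worst.fun >= -tol and -best.fun > tol

# Two attributes, feasible weights 0.4 <= w_1 <= 0.7 (with w_1 + w_2 = 1)
A_ub = np.array([[1.0, 0.0], [-1.0, 0.0]])
b_ub = np.array([0.7, -0.4])
print(dominates([0.9, 0.6], [0.5, 0.4], A_ub, b_ub))  # expected: True
```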

Slide 7: Decision rules
- Maximax: alternative with the greatest maximum overall value
- Maximin: alternative with the greatest minimum overall value
- Minimax regret: alternative for which the greatest possible loss of value against some other alternative is the smallest
- Central values: alternative with the greatest sum of maximum and minimum overall value
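One way to sketch the four rules is to evaluate them on a finite sample of feasible weight vectors, as below; the slides compute value intervals and regrets exactly with LPs, so this sampling version is only an approximation for illustration.

```python
import numpy as np

def recommend(values):
    """values: array (s, m) of overall values, one row per sampled feasible
    weight vector, one column per alternative. Returns the index of the
    recommended alternative under each decision rule."""
    vmax, vmin = values.max(axis=0), values.min(axis=0)
    # regret of x_j under weights w: best value under w minus the value of x_j
    regret = values.max(axis=1, keepdims=True) - values
    return {
        "maximax": int(np.argmax(vmax)),
        "maximin": int(np.argmax(vmin)),
        "minimax_regret": int(np.argmin(regret.max(axis=0))),
        "central_values": int(np.argmax(vmax + vmin)),
    }

# Example: 3 alternatives evaluated under 4 sampled feasible weight vectors
vals = np.array([[0.62, 0.55, 0.40],
                 [0.48, 0.59, 0.52],
                 [0.70, 0.50, 0.45],
                 [0.55, 0.57, 0.60]])
print(recommend(vals))
```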

Slide 8: Use of ordinal preference statements
- Complete ordinal information
  - ask the DM to rank the attributes in terms of importance
  - derive a representative weight vector from the ranking (see the sketch below)
    - rank sum weights (Stillwell et al. 1981)
    - rank reciprocal weights (Stillwell et al. 1981)
    - rank-order centroid weights (Barron 1992)
- Incomplete ordinal preference information
  - the DM may be unable to rank the attributes
    - "which is more important - economy or environmental impacts?"
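The three representative weighting schemes have standard closed forms (rank sum, rank reciprocal, rank-order centroid) applied to a ranking r_1, ..., r_n of the n attributes; a small sketch:

```python
import numpy as np

def rank_based_weights(ranks):
    """ranks: array (n,) with rank 1 = most important attribute.
    Returns rank sum, rank reciprocal and rank-order centroid weights."""
    r = np.asarray(ranks, dtype=float)
    n = len(r)
    rank_sum = 2.0 * (n + 1 - r) / (n * (n + 1))
    reciprocal = (1.0 / r) / np.sum(1.0 / r)
    # rank-order centroid: w_i = (1/n) * sum_{k = r_i}^{n} 1/k
    roc = np.array([np.sum(1.0 / np.arange(ri, n + 1)) / n for ri in r.astype(int)])
    return rank_sum, reciprocal, roc

for name, w in zip(("rank sum", "rank reciprocal", "ROC"),
                   rank_based_weights([2, 1, 3])):
    print(name, np.round(w, 3))
```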

Slide 9: Rank Inclusion in Criteria Hierarchies (RICH)
- Associate possible rankings with sets of attributes
  - e.g., "economy and environmental impacts are among the three most important attributes"
  - presumably easier and faster to give than numerical statements
  - easy to understand
  - statements define possibly non-convex feasible regions
- Supported by the decision support tool RICH Decisions, http://www.rich.hut.fi
[Figure example: "The most important of the three attributes is either attribute 1 or 2"]

Slide 10: Ordinal information in evaluation of the alternatives
- Numerical evaluation may be difficult
  - may lead to erroneous approximations of the alternatives' properties (Payne et al. 1993)
  - allow the DM to use incomplete ordinal information instead
- Score elicitation
  - associate sets of rankings with sets of alternatives
    - e.g., alternatives 1 and 2 are the two least preferred with regard to environmental impacts
    - e.g., alternatives 3 and 4 are the two most preferred with regard to environmental impacts and cost together
  - rank two alternatives in relative terms
    - e.g., alternative 1 is better than alternative 2 with regard to environmental impacts
  - statements can refer to
    - all attributes (holistic comparisons)
    - a (sub)set of attributes or a single attribute

Slide 11: Incomplete holistic comparisons
- Evaluate some alternatives without decomposition into "subproblems"
  - comparisons are interpreted as (pairwise) dominance relations
  - e.g., alternative x_1 is better than alternative x_2
  - e.g., alternative x_4 is not the most preferred one
- Constraints on the feasible region
  - e.g., with three attributes and normalized scores known, the statement "x_1 is preferred to x_2" becomes the linear constraint Σ_{i=1}^{3} w_i v_i(x_1) ≥ Σ_{i=1}^{3} w_i v_i(x_2)
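As a sketch, a holistic statement with known normalized scores can be appended to the feasible region as one inequality row; the helper below follows the constraint format used in the earlier LP sketches, with illustrative score values.

```python
import numpy as np

def holistic_constraint(scores_preferred, scores_other):
    """Return one row (A_ub, b_ub) encoding V(x_preferred) >= V(x_other),
    i.e. (v(x_other) - v(x_preferred)) . w <= 0, for the weight LPs above."""
    row = np.asarray(scores_other, dtype=float) - np.asarray(scores_preferred, dtype=float)
    return row, 0.0

# Three attributes, known normalized scores of x_1 and x_2
v_x1 = np.array([0.9, 0.3, 0.5])
v_x2 = np.array([0.4, 0.8, 0.6])
row, rhs = holistic_constraint(v_x1, v_x2)
print(row, "<=", rhs)   # row appended to A_ub, b_ub of the feasible region
```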

Slide 12: Different forms of incomplete ordinal information
[Diagram: (incomplete) ordinal information about the importance of attributes (RICH), (incomplete) ordinal information about the alternatives together with score information in the form of intervals, and (incomplete) holistic comparisons all translate into constraints on the feasible region; LPs then produce 1) overall value intervals and 2) pairwise dominance relations, from which decision recommendations are derived.]

Slide 13: Rank-orderings (1/2)
- Rank the alternatives subject to their properties
  - the most preferred alternative has ranking one, etc.
  - e.g., alternatives x_1, x_2 and x_3 ranked with regard to cost: r = (r(x_1), r(x_2), r(x_3)) = (1, 2, 3)
    - alternative x_1 is preferred to x_2, which is preferred to x_3
  - an alternative with a smaller rank with regard to some attributes has a greater sum of scores with regard to these attributes
  - mathematically, a rank-ordering r^{A'}_{X'} is a bijection from X' ⊆ X onto {1, ..., m'}, where |X'| = m'
- Compatible rank-orderings (see the enumeration sketch below)
  - I ⊆ X' is a set of alternatives, J ⊆ {1, ..., m'} a set of rankings
    - if |I| < |J|, the rankings of the alternatives in I are in J
    - if |I| ≥ |J|, the rankings in J are attained by alternatives in I
    - many rank-orderings can be compatible
    - e.g., if m = 3, I = {x_1}, J = {1} for A' = {a_1, a_2}, then the compatible rank-orderings are r^{A'}_{X} = (1, 2, 3) and (1, 3, 2)
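A brute-force sketch of enumerating the rank-orderings compatible with a statement (I, J), following the inclusion conditions above; enumerating all permutations is only workable for small m and is not presented as the slides' computational procedure.

```python
from itertools import permutations

def compatible_rank_orderings(m, I, J):
    """Enumerate rank-orderings over alternatives 0..m-1 compatible with (I, J):
    if |I| < |J|, the rankings of alternatives in I lie in J;
    if |I| >= |J|, every ranking in J is attained by some alternative in I.
    Each rank-ordering is returned as a tuple (r(x_0), ..., r(x_{m-1}))."""
    result = []
    for ranks in permutations(range(1, m + 1)):
        ranks_of_I = {ranks[i] for i in I}
        if len(I) < len(J):
            ok = ranks_of_I <= set(J)
        else:
            ok = set(J) <= ranks_of_I
        if ok:
            result.append(ranks)
    return result

# Slide example: m = 3, I = {x_1}, J = {1}  (x_1 has 0-based index 0)
print(compatible_rank_orderings(3, I={0}, J={1}))
# -> [(1, 2, 3), (1, 3, 2)]
```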

Slide 14: Rank-orderings (2/2)
- The feasible region associated with a single rank-ordering r^{A'}_{X'} is convex
  - it can be used as an elementary set
- R(I, J) contains the rank-orderings that are compatible with the sets I and J
  - the feasible regions defined by R(I, J) are not necessarily convex
- Statements are expressed as pairs I_i, J_i, i = 1, ..., k
  - the feasible region is the intersection of the corresponding sets S(I_i, J_i)

Slide 15: Efficiency of preference statements
- Monte Carlo study
  - randomly generated problem instances (e.g., Barron and Barrett 1996); the preference statements are derived from them
    - e.g., the weight vector w = (0.32, 0.60, 0.08) is approximated through the rank-ordering r = (2, 1, 3)
  - the "correct choice" x_C(i) at round i (i.e., the alternative with the highest overall value) can be obtained
  - x_e(i) is the alternative recommended by a decision rule at round i
- Measures (see the sketch below)
  - expected loss of value (ELV)
  - percentage of correct choices (PCC) = C/s, where C is the number of rounds with x_e(i) = x_C(i) and s is the number of simulation rounds
  - average number of non-dominated alternatives
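A sketch of the two measures, assuming the simulation records the true overall values per round and the recommended alternative; taking ELV as the average value difference between the correct choice and the recommendation is an assumption, since the slide does not spell out the formula.

```python
import numpy as np

def elv_and_pcc(true_values, recommended):
    """true_values: array (s, m), true overall values per round and alternative.
    recommended:  array (s,), index of the alternative chosen by a decision rule.
    Returns (ELV, PCC in percent)."""
    true_values = np.asarray(true_values, dtype=float)
    recommended = np.asarray(recommended, dtype=int)
    s = len(recommended)
    correct = true_values.argmax(axis=1)                        # x_C(i)
    loss = true_values.max(axis=1) - true_values[np.arange(s), recommended]
    elv = loss.mean()                                           # expected loss of value
    pcc = 100.0 * np.count_nonzero(recommended == correct) / s  # correct choices
    return elv, pcc

vals = np.array([[0.70, 0.65, 0.40],
                 [0.55, 0.60, 0.52]])
print(elv_and_pcc(vals, recommended=[0, 0]))  # the second round misses the best one
```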

Slide 16: Efficiency of holistic comparisons
- Questions
  - how effective are holistic comparisons?
  - are there differences between strategies for choosing the compared alternatives?
- Randomly generated problems
  - n = 5, 7, 10 attributes; m = 5, 7, 10, 15, 50 alternatives
  - each feasible weight vector has the same probability
  - scores completely known, randomly generated
    - uniform distribution, Uni[0, 1]
    - triangular distribution, Tri(0, 1/2, 1)
- Three strategies for choosing the alternatives for pairwise comparisons
  - each applied in two different ways
    - "disconnected comparisons": x_1 vs. x_2, x_3 vs. x_4, etc.
    - "chained comparisons": x_1 vs. x_2, x_2 vs. x_3, etc.

Slide 17: Simulation layout
- Elicitation strategies (see the sketch below)
  - A. arrange the alternatives in descending order by the sum of their scores (strategies SoS1 (disconnected) and SoS2 (chained))
  - B. arrange the alternatives in descending order by the score on the most important attribute (strategies MIA1 and MIA2)
  - C. arrange the alternatives randomly (strategies Rnd1 and Rnd2)
- ELV and PCC were studied using the central values, maximax, maximin and minimax regret decision rules
- 100 problem instances (simulation rounds)
  - several linear programs are needed per instance
  - results are therefore indicative
  - varying the parameters (m, n and the number of comparisons) leads to 114 experiments
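The elicitation strategies could be prototyped as below; the orderings (sum of scores, most important attribute, random) follow the slide, while the pairing details and parameter names are assumptions for illustration.

```python
import numpy as np

def comparison_pairs(scores, weights, strategy="SoS", chained=True, k=3, rng=None):
    """Return k pairs of alternative indices to present for pairwise comparison.

    scores:  array (m, n) of known normalized scores
    weights: true attribute weights, used only to identify the most important
             attribute for the MIA strategy
    strategy: "SoS" (sum of scores), "MIA" (most important attribute), "Rnd"
    The strategy only selects which alternatives are compared; the DM then
    states which of the two is preferred.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = scores.shape[0]
    if strategy == "SoS":
        order = np.argsort(-scores.sum(axis=1))
    elif strategy == "MIA":
        order = np.argsort(-scores[:, np.argmax(weights)])
    else:  # "Rnd"
        order = rng.permutation(m)
    if chained:          # chained: x_1 vs x_2, x_2 vs x_3, ...
        return [(order[i], order[i + 1]) for i in range(k)]
    else:                # disconnected: x_1 vs x_2, x_3 vs x_4, ...
        return [(order[2 * i], order[2 * i + 1]) for i in range(k)]

scores = np.random.default_rng(1).random((6, 3))
print(comparison_pairs(scores, weights=[0.5, 0.3, 0.2],
                       strategy="SoS", chained=False, k=2))
```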

Slide 18: Simulation results (1/2)
- Sum of Scores is the best strategy
  - SoS1 outperforms MIA1 in 113 of the 114 experiments in terms of ELV
    - in 82 of these the difference in loss of value is significant (one-tailed t-test, 2.5% risk level)
  - SoS1 outperforms Rnd1 in every experiment in terms of ELV
    - in 92 of these the difference in loss of value is significant
  - no clear difference between MIA1 and Rnd1
- Chained comparisons are better than disconnected comparisons in terms of ELV and the percentage of correct choices

Slide 19: Simulation results (2/2)
- Holistic comparisons reduce the number of non-dominated alternatives efficiently
  - e.g., with m = 50, n = 5 and 10 comparisons, the average number of non-dominated alternatives was between 3.92 and 9.17, depending on the strategy
  - e.g., with m = 50, n = 5 and only one comparison, the average number of non-dominated alternatives was between 20.57 and 23.69
    - although the comparison discards only one alternative directly, an average of almost 30 alternatives were eliminated
- The triangularity assumption on the scores increases efficiency

Slide 20: Conclusion
- Incomplete ordinal information enhances the possibilities in preference elicitation
  - presumably easier and faster to give than numerical statements
  - easy to understand
- Screening of alternatives
  - holistic comparisons are efficient in discarding non-optimal alternatives
  - useful especially in problems with many alternatives
    - the consequences of the alternatives may be time-consuming to obtain
    - comparisons constrain the feasible region
- Further research directions
  - efficient computational procedures
  - a simulation study on the efficiency of incomplete holistic comparisons
  - implementation of a decision support system
  - case studies

Slide 21: Related references
Barron, F. H., "Selecting a Best Multiattribute Alternative with Partial Information about Attribute Weights", Acta Psychologica 80 (1992) 91-103.
Barron, F. H. and Barrett, B. E., "Decision Quality Using Ranked Attribute Weights", Management Science 42 (1996) 1515-1523.
Edwards, W., "How to Use Multiattribute Utility Measurement for Social Decision Making", IEEE Transactions on Systems, Man, and Cybernetics 7 (1977) 326-340.
Payne, J. W., Bettman, J. R. and Johnson, E. J., "The Adaptive Decision Maker", Cambridge University Press, New York (1993).
Salo, A. and Hämäläinen, R. P., "Preference Assessment by Imprecise Ratio Statements", Operations Research 40 (1992) 1053-1061.
Salo, A. and Hämäläinen, R. P., "Preference Ratios in Multiattribute Evaluation (PRIME) - Elicitation and Decision Procedures under Incomplete Information", IEEE Transactions on Systems, Man, and Cybernetics 31 (2001) 533-545.
Salo, A. and Punkka, A., "Rank Inclusion in Criteria Hierarchies", (submitted manuscript; 2003).
Stillwell, W. G., Seaver, D. A. and Edwards, W., "A Comparison of Weight Approximation Techniques in Multiattribute Utility Decision Making", Organizational Behavior and Human Performance 28 (1981) 62-77.

