Efficiency and Sensitivity Analyses in the Evaluation of University Departments
Ahti Salo and Antti Punkka
Helsinki University of Technology, Systems Analysis Laboratory, 02015 TKK, Finland
firstname.lastname@tkk.fi
INFORMS 2007, Seattle
Context
- National efforts to increase the efficiency of universities
  - "Productivity programme": ~1,500 positions to be cut in 2007-10
  - Efficiency studies commissioned by the Ministry of Finance
    - "Measurable Productivity in Universities" by the Government Economic Research Centre, 09/2006
- Developments at Helsinki University of Technology (TKK)
  - The Rector asked us to comment on the above report
  - TKK has used various resource allocation models over the years
  - Considerable dissatisfaction with many of these models
  - The Resources Committee asked for different principles to be developed
- Tasks
  - Develop value efficiency models in support of resource allocation
  - Explore methodological extensions in view of decision-making needs
Efficiency of University Departments
- Departments consume inputs in order to produce outputs
- Valuation of inputs and outputs involves subjective preferences
- [Figure: a department transforms inputs x1 (budget funding) and x2 (project funding) into outputs y1 (Master's theses), y2 (doctoral theses) and y3 (international publications)]
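To make the input-output structure concrete, the sketch below computes a single department's efficiency ratio as value-weighted outputs over value-weighted inputs. The figures and multipliers are invented placeholders, not TKK data or elicited committee preferences.

```python
# Illustrative only: the department figures and the multipliers are invented
# placeholders, not TKK data or elicited committee preferences.

def efficiency_ratio(outputs, inputs, u, v):
    """Value-weighted outputs divided by value-weighted inputs."""
    return (sum(u[k] * outputs[k] for k in outputs)
            / sum(v[k] * inputs[k] for k in inputs))

dept = {
    "inputs":  {"x1_budget_funding": 10.0, "x2_project_funding": 4.0},
    "outputs": {"y1_msc_theses": 30.0, "y2_phd_theses": 5.0,
                "y3_intl_publications": 20.0},
}
u = {"y1_msc_theses": 1.0, "y2_phd_theses": 4.0, "y3_intl_publications": 0.5}  # output valuation
v = {"x1_budget_funding": 1.0, "x2_project_funding": 1.0}                      # input valuation

print(f"Efficiency ratio: {efficiency_ratio(dept['outputs'], dept['inputs'], u, v):.2f}")
```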
Data Envelopment Analysis (Charnes et al., 1978)
- Approach
  - Multiple inputs x_i and outputs y_r of decision making units (DMUs) are aggregated with non-negative multipliers ("weights")
  - The efficiency ratio of each DMU is maximized, subject to the condition that this ratio does not exceed one for any DMU
- Observations
  - Extending the set of inputs cannot worsen the efficiency of any DMU
  - In Value Efficiency Analysis (VEA; Halme et al., 1999; Korhonen and Syrjänen, 1998), the DMs' preferences are explicitly modelled
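As a concrete reference point, the following sketch solves the standard input-oriented CCR multiplier model of Charnes et al. (1978) with SciPy, using the Charnes-Cooper normalization (fix the evaluated DMU's weighted inputs to one and maximize its weighted outputs). The department data are invented; the weight restrictions implied by the committee's feasible valuations, which the talk relies on, would enter as additional linear constraints on u and v.

```python
# A minimal sketch of the CCR multiplier-form LP (Charnes et al., 1978) solved
# with scipy.optimize.linprog.  Decision variables are the output multipliers u
# and input multipliers v; the Charnes-Cooper normalization fixes v.x_o = 1.
# The department data below are invented placeholders.
import numpy as np
from scipy.optimize import linprog

X = np.array([[10.0, 4.0],            # inputs: budget funding, project funding
              [ 8.0, 7.0],
              [12.0, 3.0]])
Y = np.array([[30.0, 5.0, 20.0],      # outputs: MSc theses, PhD theses, int'l publications
              [25.0, 6.0, 18.0],
              [28.0, 3.0, 25.0]])

def ccr_efficiency(o):
    """max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = X.shape                                        # number of DMUs, inputs
    r = Y.shape[1]                                        # number of outputs
    c = np.concatenate([-Y[o], np.zeros(m)])              # linprog minimizes, so negate u.y_o
    A_ub = np.hstack([Y, -X])                             # u.y_j - v.x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(r), X[o]])[None, :]   # v.x_o = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun

for o in range(len(X)):
    print(f"DMU {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```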
Valuation of Inputs and Outputs
- Preferences elicited from the Resources Committee
- How valuable are the different outputs in relative terms?
  - What is the value of an MSc degree relative to a PhD degree, etc.?
  - 44 outputs from the reporting system, using 3-year annual averages
    - Degrees granted, publication activity, international activities
    - Averaging mitigates the impact of large annual fluctuations
- How important are budget funding and project funding in producing these outputs?
Feasible Valuations and Efficiencies
- Feasible valuations
  - Responses of the individual respondents plus convex combinations thereof
- Efficient departments (efficiency = 1)
  - For some feasible valuation of inputs and outputs, the efficiency ratio of the department is greater than or equal to that of all other departments
- Inefficient departments (efficiency < 1)
  - For every feasible valuation, the efficiency ratio of some other department is strictly greater
- If the aim is to maximize overall efficiency and departments increase their outputs in proportion to their use of inputs, resources should be shifted from inefficient to efficient departments
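This classification can be illustrated with a small sampling experiment over the valuation set: draw convex combinations of the respondents' weight vectors and check which departments attain the largest efficiency ratio for at least one draw. This is only an approximation of the exact test, which optimizes over the whole valuation polytope, and all weights and department figures below are invented.

```python
# A sampling sketch of the efficiency classification: feasible valuations are
# convex combinations of the respondents' (input, output) weight vectors, and a
# department is efficient if its efficiency ratio is largest for some feasible
# valuation.  Sampling can only confirm efficiency for the valuations it happens
# to draw; an exact classification would optimize over the valuation polytope.
# All numbers are invented placeholders.
import numpy as np

rng = np.random.default_rng(1)

X = np.array([[10.0, 4.0], [8.0, 7.0], [12.0, 3.0], [6.0, 6.0]])         # inputs per dept
Y = np.array([[30.0, 5.0, 20.0], [25.0, 6.0, 18.0],
              [28.0, 3.0, 25.0], [15.0, 4.0, 10.0]])                     # outputs per dept
V_resp = np.array([[1.0, 1.0], [1.0, 0.5], [0.8, 1.2]])                  # respondents' input weights
U_resp = np.array([[1.0, 4.0, 0.5], [1.0, 6.0, 0.3], [1.2, 3.0, 0.8]])   # respondents' output weights

can_be_top = np.zeros(len(X), dtype=bool)
for _ in range(20000):
    lam = rng.dirichlet(np.ones(len(V_resp)))        # convex combination of respondents
    v, u = lam @ V_resp, lam @ U_resp
    eff = (Y @ u) / (X @ v)                          # efficiency ratio of every department
    can_be_top[np.isclose(eff, eff.max())] = True

for d, flag in enumerate(can_be_top):
    print(f"Dept {d}: {'efficient for some sampled valuation' if flag else 'never best in sample'}")
```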
Efficiencies of Departments
- Very significant differences in departmental efficiencies
- Results still in alignment with the resource allocation models
Motivations for Methodological Extensions
- Results of Value Efficiency Analysis may not be robust
  - Introduction of an outlier may produce radical changes in the efficiency results
  - Hence the results may appear counterintuitive to DMs
- Pairwise dominance relations among DMUs
  - It may be of interest to enable comparisons among all DMUs
  - Efficient DMUs need not be of greatest relevance for very inefficient DMUs
- Rank-based information about relative efficiencies
  - Ranking lists (e.g., that of Shanghai Jiao Tong University) have been influential
  - Yet this list (and many others) does not account for the value of inputs
  - Hence the interest in examining efficiencies in terms of rankings, too
Pairwise Efficiency Dominance of DMUs
- For any feasible input and output valuation (v, u), the efficiency of DMU_s is the ratio of its value-weighted outputs to its value-weighted inputs,
  $E_s(v,u) = \frac{\sum_r u_r y_{rs}}{\sum_i v_i x_{is}}$
- Definition: If DMU_s and DMU_t are such that $E_s(v,u) \geq E_t(v,u)$ for all feasible input and output valuations (with strict inequality for some feasible valuation), then DMU_s dominates DMU_t.
Pairwise Efficiency Dominance of DMUs (cont.)
- Definition: If the efficiency ratio of DMU_s is greater than or equal to that of DMU_t for all feasible valuations $(v,u)$ (with strict inequality for some feasible valuation), then DMU_s dominates DMU_t.
- This dominance holds if the minimum of $E_s(v,u) - E_t(v,u)$ over the feasible valuations is positive
Pairwise Dominance
- This minimization problem gives a lower bound on how much more efficient DMU_s is in comparison with DMU_t
- [Figure: illustration of the efficiency gap between DMU_s and DMU_t]
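A rough numerical stand-in for this minimization is sketched below: the difference E_s - E_t is evaluated over sampled convex combinations of the respondents' weight vectors. A negative sampled value rules out dominance of DMU_s over DMU_t, while a positive sampled minimum is only suggestive, since the true minimum over the valuation polytope could be lower. The department data and respondent weights are invented.

```python
# Approximates the slide's minimization of E_s - E_t by sampling convex
# combinations of respondents' weight vectors.  A negative sampled difference
# disproves dominance of s over t; a positive sampled minimum only suggests
# dominance, since the exact minimum over the valuation polytope could be lower.
# All department data and respondent weights are invented placeholders.
import numpy as np

rng = np.random.default_rng(2)

X = np.array([[10.0, 4.0], [8.0, 7.0], [12.0, 3.0], [6.0, 6.0]])         # inputs per dept
Y = np.array([[30.0, 5.0, 20.0], [25.0, 6.0, 18.0],
              [28.0, 3.0, 25.0], [15.0, 4.0, 10.0]])                     # outputs per dept
V_resp = np.array([[1.0, 1.0], [1.0, 0.5], [0.8, 1.2]])                  # respondents' input weights
U_resp = np.array([[1.0, 4.0, 0.5], [1.0, 6.0, 0.3], [1.2, 3.0, 0.8]])   # respondents' output weights

def sampled_min_difference(s, t, n_samples=20000):
    """Smallest sampled value of E_s - E_t over convex combinations of respondent weights."""
    lam = rng.dirichlet(np.ones(len(V_resp)), size=n_samples)   # points of the weight simplex
    V, U = lam @ V_resp, lam @ U_resp                           # sampled valuations
    E_s = (U @ Y[s]) / (V @ X[s])
    E_t = (U @ Y[t]) / (V @ X[t])
    return float((E_s - E_t).min())

print(f"min sampled E_0 - E_3: {sampled_min_difference(0, 3):.3f}")
```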
Ranking of DMUs' Efficiencies
- Definition: Let $(v,u)$ be a feasible valuation. The ranking of DMU_t among the DMUs in S is
  $\mathrm{rank}_t(v,u) = 1 + |\{ s \in S \mid E_s(v,u) > E_t(v,u) \}|$
  - The ranking of the most efficient DMU is 1
  - If several DMUs have the same efficiency ratio, they tie and share the same ranking
- Different feasible valuations assign different rankings to the DMUs
- Best and worst possible rankings are computed with an MILP model
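The talk obtains the exact best and worst rankings with an MILP; the sketch below only approximates the ranking ranges by sampling feasible valuations and recording each DMU's rank at every draw, so the sampled ranges can never be wider than the exact ones. All data are invented placeholders.

```python
# Monte Carlo approximation of ranking ranges: for each sampled feasible
# valuation, the rank of DMU t is 1 + |{s : E_s > E_t}|.  The talk computes the
# exact best/worst rankings with an MILP; sampled ranges can only be narrower.
# All department data and respondent weights are invented placeholders.
import numpy as np

rng = np.random.default_rng(3)

X = np.array([[10.0, 4.0], [8.0, 7.0], [12.0, 3.0], [6.0, 6.0]])         # inputs per dept
Y = np.array([[30.0, 5.0, 20.0], [25.0, 6.0, 18.0],
              [28.0, 3.0, 25.0], [15.0, 4.0, 10.0]])                     # outputs per dept
V_resp = np.array([[1.0, 1.0], [1.0, 0.5], [0.8, 1.2]])                  # respondents' input weights
U_resp = np.array([[1.0, 4.0, 0.5], [1.0, 6.0, 0.3], [1.2, 3.0, 0.8]])   # respondents' output weights

n = len(X)
best = np.full(n, n)                  # best (smallest) sampled rank per DMU
worst = np.ones(n, dtype=int)         # worst (largest) sampled rank per DMU

for _ in range(20000):
    lam = rng.dirichlet(np.ones(len(V_resp)))                 # convex combination of respondents
    eff = (Y @ (lam @ U_resp)) / (X @ (lam @ V_resp))         # efficiency ratio of every DMU
    ranks = 1 + (eff[None, :] > eff[:, None] + 1e-12).sum(axis=1)
    best, worst = np.minimum(best, ranks), np.maximum(worst, ranks)

for d in range(n):
    print(f"Dept {d}: sampled ranking range [{best[d]}, {worst[d]}]")
```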
[Figure: Ranges of rankings for the TKK departments]
[Figure: Ranking list of Shanghai Jiao Tong University]
[Figure: Weight sensitivity of rankings]
Conclusions
- Lessons learned
  - Different models complement each other
  - Thinking about the value of intangibles is useful: does our data matter?
  - Efficiency analysis alone does not suggest strategic changes
- Useful methodological extensions
  - Inter-departmental comparisons supported by pairwise dominance relations
  - Ranges of rankings show sensitivities in the relative efficiencies of departments
- Possible extensions
  - Analyses that account for intermediate inputs/outputs
  - Explicit linkages to resource allocation through goal setting
  - Interactive decision support tools with Internet-based user interfaces