
1 Data Envelopment Analysis: Introduction and an Application to University Administration By Emmanuel Thanassoulis, Professor in Management Sciences, Aston University, Birmingham, UK. e.thanassoulis@aston.ac.uk

2 Outline
Introduction to DEA: a graphical example
DEA models: mathematical generalisation
An Application of DEA to University Administration

3 DEA: a method for comparing homogeneous operating units (such as schools, hospitals, police forces, individuals, etc.) on efficiency.
DMU: Decision Making Unit, the operating unit being compared.
Inputs: the resources the operating units typically use.
Outputs: the useful outcomes the operating units typically produce or otherwise enable (e.g. goods or services such as education or health outcomes).
DEA makes it possible to compare the operating units on the levels of outputs they secure relative to their input levels.

4 Graphical Example on DEA
What is the efficiency of hospital H01 in terms of scope to reduce doctor and nurse numbers?

Observed data:

DMU   Doctors   Nurses   Outpatients
H01   30        72       1200
H02   10        50       1000
H03   35        20       1250
H04   33        44       1100
H05   52        91       1300
H06   24        40       800

Normalise inputs per 1000 outpatients:

DMU   Doctors   Nurses   Outpatients (000)
H01   25        60       1
H02   10        50       1
H03   28        16       1
H04   30        40       1
H05   40        70       1
H06   30        50       1
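The normalisation step can be reproduced directly; a minimal sketch in Python, using the observed data from the slide's table:

```python
# Observed data per hospital: (doctors, nurses, outpatients).
raw = {
    "H01": (30, 72, 1200), "H02": (10, 50, 1000), "H03": (35, 20, 1250),
    "H04": (33, 44, 1100), "H05": (52, 91, 1300), "H06": (24, 40, 800),
}

# Scale each hospital's inputs to a common output level of 1000 outpatients.
normalised = {h: (d * 1000 / y, n * 1000 / y, 1000) for h, (d, n, y) in raw.items()}
# e.g. normalised["H01"] == (25.0, 60.0, 1000)
```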

5 Deploy the Interpolation Principle to Create a Set of Feasible 'Hospitals'
We assume that interpolation of hospitals yields input-output levels which are feasible in principle.
Example: take 0.5 of the input-output levels of hospital H02 and 0.5 of the input-output levels of hospital H03 and combine:
0.5 H02 + 0.5 H03 = (19 doctors, 33 nurses, 1000 outpatients).
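The interpolation can be checked in a couple of lines of Python:

```python
# Normalised input-output levels of H02 and H03 (per 1000 outpatients).
h02 = {"doctors": 10, "nurses": 50, "outpatients": 1000}
h03 = {"doctors": 28, "nurses": 16, "outpatients": 1000}

# A 50/50 convex combination of the two hospitals.
composite = {k: 0.5 * h02[k] + 0.5 * h03[k] for k in h02}
# composite: 19 doctors, 33 nurses, 1000 outpatients
```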

6 Feasible units (hospitals)

DMU   Doctors   Nurses   Outpatients
H01   25        60       1000
H02   10        50       1000
H03   28        16       1000
H04   30        40       1000
H05   40        70       1000
H06   30        50       1000

7 Feasible units (hospitals)
The interpolation assumption means that 19 doctors and 33 nurses, though observed at no hospital, are capable of handling 1000 outpatients.
Further, any 'hospital' is also feasible if its input-output levels satisfy:
number of doctors ≥ 19, number of nurses ≥ 33 and outpatients ≤ 1000.

8 A linear programming approach
Using the normalised data (inputs per 1000 outpatients), the efficiency of H01 is obtained from:

Min θ
subject to:
doctors:     25λ1 + 10λ2 + 28λ3 + 30λ4 + 40λ5 + 30λ6 ≤ 25θ
nurses:      60λ1 + 50λ2 + 16λ3 + 40λ4 + 70λ5 + 50λ6 ≤ 60θ
outpatients: 1000λ1 + 1000λ2 + 1000λ3 + 1000λ4 + 1000λ5 + 1000λ6 ≥ 1000
λ1, λ2, λ3, λ4, λ5, λ6 ≥ 0.

At the optimum θ ≈ 0.64: H02 and H03 are efficient peers to H01, and the efficient doctor-nurse levels for H01 are (16.1, 38.5).
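The LP above can be solved numerically; a minimal sketch using scipy.optimize.linprog (assuming SciPy is available; the variable ordering [θ, λ1, ..., λ6] is our own choice, not part of the original formulation):

```python
import numpy as np
from scipy.optimize import linprog

# Inputs per 1000 outpatients, in DMU order H01..H06 (from the slide's table).
doctors = np.array([25, 10, 28, 30, 40, 30], dtype=float)
nurses = np.array([60, 50, 16, 40, 70, 50], dtype=float)
outpatients = np.full(6, 1000.0)

# Assess H01 (index 0). Decision variables: [theta, lam1, ..., lam6].
c = np.concatenate(([1.0], np.zeros(6)))        # minimise theta
A_ub = np.vstack([
    np.concatenate(([-doctors[0]], doctors)),   # sum(lam * doctors) <= 25 * theta
    np.concatenate(([-nurses[0]], nurses)),     # sum(lam * nurses)  <= 60 * theta
    np.concatenate(([0.0], -outpatients)),      # sum(lam * outpat)  >= 1000
])
b_ub = np.array([0.0, 0.0, -1000.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 7)
theta = res.x[0]   # about 0.64, giving targets 25*theta ~ 16.1 doctors, 60*theta ~ 38.5 nurses
peers = [f"H0{j + 1}" for j in range(6) if res.x[j + 1] > 1e-6]   # efficient peers of H01
```

The peers list recovers H02 and H03, matching the slide's result.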

9 SAMPLE AREAS WHERE DEA HAS BEEN USED
Banking
Education
Regulation of 'natural' monopolies (e.g. water distribution, electricity, gas)
Tax collection
Courts
Sales forces (e.g. insurance sales teams)

10 TYPES OF ANALYSIS
Relative efficiency measurement;
Identification of efficient exemplar units;
Target setting;
Decomposing efficiency in layered data (pupil, school, type of school, etc.);
Decomposing productivity changes over time into those attributable to the unit and those attributable to technology shift (Malmquist indices);
Identifying increasing/decreasing returns to scale;
Measuring scale elasticity;
Identifying most productive scale size.

11 ASSESSING EFFICIENCY AND PRODUCTIVITY IN UK UNIVERSITY ADMINISTRATION 1999/00 TO 2004/05. Prof. Emmanuel Thanassoulis e.thanassoulis@aston.ac.uk

12 Aim of this project To identify units with cost-efficient good quality administrative services and disseminate good practice; To assess changes in the productivity of UK university administration over time (1999/00 to 2004/05).

13 Motivation for the Study Expenditure on Administration and Central Services is about 20% of total HEI expenditure in the UK. Hence expenditure on administration is substantial and competes with direct expenditure on academic activities.

14 Questions to be addressed: Are efficiency savings possible? Is there a trade-off between expenditure and quality of service in administration? Has there been any change over time in productivity of administrative services in UK Higher Education Institutions (HEIs)?

15 The Assessment Method We opted for Data Envelopment Analysis (DEA). This is a linear programming based benchmarking method. It identifies cost-efficient benchmark units and then assesses the scope for savings at the rest of the units relative to the benchmarks.

16 DEA is illustrated in the graph below. An 'efficient boundary' is created from benchmark units (CAS12 and CAS2). The virtual benchmark VBM6 offers the top levels of research grants and students while keeping to the mix of research grants to students of CAS6. The distance from CAS6 to VBM6 reflects the scope for savings at CAS6.
[Graph not reproduced in this transcript.]

17 Unit of Assessment
The unit of assessment includes core CAS and administration devolved away from the centre; it excludes Estates, Libraries, Residences and catering.

18 The Unit of Assessment
Initial consultations with senior administrators and academics suggested the following conceptual framework for the unit of assessment:
Support for research
Support for non-admin staff
Support for students

19 Input-Output Set
Assessments were carried out by setting the admin costs against the measures of admin volume shown (at university level).

ADMIN COST MEASURES (INPUTS):
Admin staff costs
OPEX on administration

ACTIVITY VOLUME MEASURES (OUTPUTS):
Students. Measure used: total income from students
Staff. Measure used: non-administrative staff cost
Research grants and other services rendered. Measure used: income from these sources
Research volume and quality. Measure used: QR component of HEFCE income

20 Definition of Admin Staff Costs
Descriptive definition:
Admin staff costs = total non-academic staff cost in academic departments
+ non-academic staff cost in administration and services (central, general education and facilities)
+ non-academic staff cost in research grants and contracts
+ non-academic staff cost in other services rendered.

21 Definition of OPEX
OPEX: operating expenditure excluding staff costs and depreciation.
Descriptive definition:
Total admin OPEX = OPEX in academic departments apportioned(1) to non-academic staff costs
+ OPEX in administration and services (central, general education and facilities)
+ OPEX in research grants and contracts
+ OPEX in other services rendered.
(1) OPEX in academic departments was apportioned between academic and non-academic purposes in line with the percentages of staff costs corresponding to academic and non-academic staff.

22 Definition of Activity Volume Measures (Outputs)
Total income from students: the sum of the block grant for teaching received from HEFCE plus income from non-HEFCE-funded students.
Research grants and other services rendered: sum of special initiatives, total research grants and contracts, total other services rendered, and IPR.
Non-administrative staff costs.
Quality-related (QR) research income: the RAE-based component of research income. (This output has been restricted so that its unit resource cannot be higher than that of any one of the remaining three outputs.)

23 Model Specification
VRS model, input-oriented;
Outliers have been re-scaled to be on the boundary formed by the non-outlier units;
A balanced panel between the first (1999/00) and the last (2004/05) year is reported upon here, so 100 universities are included in the assessment.

24 Findings at University Level (illustration)

Efficiency scores at a sample admin unit:

Model   1999/00   2000/01   2001/02   2002/03   2003/04   2004/05
Staff   57.57     51.53     49.99     43.06     41.14     45.72
OPEX    72.44     71.20     69.73     72.59     74.57     75.10
Joint   77.81     65.40     61.69     69.65     65.31     65.60

The efficiency here is a percentage: the observed admin staff costs could be reduced each year to the percentage indicated (e.g. to 57.57% of their actual level in 1999/00). So relative to benchmark institutions for student income, non-admin staff expenditure and research income, administrative staff costs could be about half. The scope for efficiency savings on operating expenditure is smaller than for administrative staff, at about 25% - 30%. If we assume staff costs and OPEX can substitute for each other, then the scope for efficiency savings is 30% or so on each of administrative staff and OPEX.
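The reading of these percentages can be illustrated with a short calculation (the 57.57% score is from the table above; the cost figure is hypothetical):

```python
eff_1999 = 57.57 / 100           # staff-model efficiency score for 1999/00
observed_staff_cost = 10.0       # hypothetical admin staff cost (e.g. in GBP millions)

target_cost = observed_staff_cost * eff_1999   # cost if the unit matched its benchmarks
saving = observed_staff_cost - target_cost     # scope for efficiency savings
# target_cost = 5.757, saving = 4.243 (about 42% of the observed cost)
```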

25 Comparing the sample university with its benchmark units on administrative staff costs
Data are indexed so that the sample university of the previous slide = 1 on each variable.

Peer                  Admin Staff Cost   OPEX   Student Income   Non-Admin Staff   Research Grants   QR
2004/05 Benchmark A   0.47               0.67   1.57             1.19              0.42              0.11
2004/05 Benchmark B   0.92               2.46   1.39             2.76              4.08              3.23
1999/00 Benchmark B   1.00               2.51   1.46             2.35              3.99              3.28

26 Sector Findings on Efficiency
There are some 20 units with good benchmark performance and no scope for further efficiency savings. About half of these benchmark units are consistently cost-efficient in all six years of our analysis; these units could provide examples of good operating practice in administration. On the other hand, some of the units with large scope for efficiency savings remain so in all six years, suggesting persistent issues of expenditure control. Visits to specific institutions did not support the notion that factors outside the model, such as quality of service, could explain differences in efficiency. When we take administrative staff and OPEX as substitutable, we find the scope for efficiency savings is about 15% per annum in each.

27 Productivity is defined as the ratio of output to resource used.

28 Productivity change = Efficiency catch-up × Boundary shift
The boundary-shift component is computed as a geometric mean, i.e. {(frontier shift measured at the first period's input-output levels) × (frontier shift measured at the second period's input-output levels)}^0.5.
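This Malmquist-style decomposition can be sketched numerically. The efficiency scores below are hypothetical (the study's actual data are not reproduced here); the formula is the standard catch-up times geometric-mean frontier shift:

```python
import math

# Hypothetical efficiency scores of one HEI: eff[(data_period, frontier_period)]
# is the unit's efficiency when its period-'data' input-output levels are
# measured against the period-'frontier' boundary.
eff = {(1, 1): 0.80, (1, 2): 0.72, (2, 1): 0.90, (2, 2): 0.85}

catch_up = eff[(2, 2)] / eff[(1, 1)]   # movement relative to the own-period boundary
boundary_shift = math.sqrt(
    (eff[(2, 1)] / eff[(2, 2)]) * (eff[(1, 1)] / eff[(1, 2)])
)                                      # geometric mean of the frontier shift
productivity_change = catch_up * boundary_shift   # > 1 means a productivity gain
```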

29 Bottom two HEIs on productivity
Data for 2004/05 at constant prices, normalised so that 1999/00 = 1.

Prod Index   Admin Staff Cost   OPEX   Student Income   Non-Admin Staff   Research Grants   QR
0.70         1.93               0.98   1.23             0.90              1.23              0.81
0.74         1.41               1.42   1.04             0.98              1.05              0.80

In both universities the normalised data show a substantial rise in admin staff costs against a modest rise, or even a drop, in activities.

30 Top three HEIs on productivity
Data for 2004/05 at constant prices, normalised so that 1999/00 = 1.

Prod Index   Admin Staff Cost   OPEX   Student Income   Non-Admin Staff   Research Grants   QR
1.59         0.82               0.62   1.07             1.30              1.59              1.14
1.35         0.89               0.92   1.26             1.28              0.90              0.41
1.34         0.98               0.94   1.23             1.30              1.96              1.12

In these three universities, which gain the most in admin productivity, admin costs fall over time while outputs rise, with only two exceptions on research income.

31 Productivity Change 1999/00 to 2004/05 – Staff Costs Model
[Chart: distribution of productivity change indices across HEIs, improving vs deteriorating.]
NB: an index of 1 means stable productivity between 1999/00 and 2004/05. An index below 1 means a productivity drop: e.g. a value of 0.9 means the same workload cost about 11% more in 2004/05 than it did in 1999/00. An index above 1 means a productivity improvement.

32 Productivity Change Decomposition – Staff Model
[Charts: Efficiency catch-up, Boundary shift, Scale efficiency catch-up.]
HEIs tend to catch up with the boundary over time.
Benchmark performance tends to deteriorate over time.
HEIs with negative effects due to scale exceed those with positive effects.

33 Productivity Change 1999/00 to 2004/05 – Staff and OPEX Combined
[Chart: distribution of productivity change indices across HEIs, improving vs deteriorating.]
We have a virtually symmetrical distribution, with half the units improving and the rest deteriorating in productivity.

34 Productivity Change Decomposition – Joint Model
[Charts: Efficiency catch-up, Boundary shift, Scale efficiency catch-up.]
There is clear evidence that the boundary is improving: best performance is more productive in 2004/05 than in 1999/00. Most units are falling behind the boundary. The catch-up and boundary-shift results are the reverse of those when staff costs were the sole input, suggesting perhaps that non-staff expenses were managed better over time than staff costs. Scale efficiency deteriorates over time.

35 Summary of Findings on Productivity at Sector Level
Between 1999/00 and 2004/05 we find a divergent picture between administrative staff cost on the one hand and OPEX, or the two inputs taken jointly, on the other.

With administrative staff taken as the sole input there is on average a drop in productivity: for given levels of the proxy output variables, staff cost in 1999/00 is about 95% of what it is in 2004/05, at constant prices. Looking deeper, units generally do keep up with the benchmark units, but it is the benchmark units themselves that perform less productively, by almost 7%, in 2004/05 compared to 1999/00. The situation is not helped by a slight loss of productivity through scale sizes becoming less productive, by about 1.5% over the same period.

In contrast, with OPEX as the sole input, or with administrative staff and OPEX as joint resources, productivity is more or less stable between 1999/00 and 2004/05. There is a slight loss of about 2%, but given the noise in the data this is not significant.

36 What differs significantly between administrative staff on the one hand and OPEX, or the two joint inputs, on the other is that benchmark performance improves on average by about 8% between 1999/00 and 2004/05. Unfortunately, non-benchmark units cannot quite keep up with this higher productivity. There is also again a slight deterioration in scale-size efficiency, and the net effect is that, despite the improved benchmark productivity, productivity on average is stable to slightly down between 1999/00 and 2004/05.

As with cost-efficient practices, here too our analysis has identified a small set of units which register significant productivity gains, and others which register significant productivity losses, between 1999/00 and 2004/05. An investigation of the causes in each case would be instrumental in offering advice to other units on how to gain productivity over time and avoid losses.

37 Thank you…

