
Data Envelopment Analysis: Introduction and an Application to University Administration By Emmanuel Thanassoulis, Professor in Management Sciences Aston University, Birmingham, UK

Outline
- Introduction to DEA: a graphical example
- DEA models: mathematical generalisation
- An application of DEA to university administration

DMUs, Inputs and Outputs
DEA: a method for comparing homogeneous operating units, such as schools, hospitals, police forces or individuals, on efficiency.
- DMU: Decision Making Unit.
- Inputs: the resources the operating units typically use.
- Outputs: the useful outcomes the operating units produce or otherwise enable (e.g. goods or services such as education or health outcomes).
DEA makes it possible to compare the operating units on the levels of outputs they secure relative to their input levels.

Graphical Example on DEA
What is the efficiency of hospital H01 in terms of scope to reduce doctor and nurse numbers?
[Table: doctors, nurses and outpatients (000) for hospitals H01 to H06, shown both as raw levels and with inputs normalised per 1000 outpatients.]

Deploy the Interpolation Principle to Create a Set of Feasible 'Hospitals'
We assume that interpolation of hospitals yields input-output levels which are feasible in principle.
Example: take 0.5 of the input-output levels of hospital H02 and 0.5 of the input-output levels of hospital H03 and combine them: 0.5 H02 + 0.5 H03 = a 'virtual' hospital.
[Table: the input-output levels of H01 to H06 and of the combined 0.5 H02 + 0.5 H03 unit.]

Feasible units (hospitals)
[Table: doctor, nurse and outpatient levels of the feasible units constructed by interpolating H01 to H06.]

Feasible units (hospitals)
The interpolation assumption means that 19 doctors and 33 nurses, though observed at no hospital, are capable of handling 1000 outpatients.
Further, any hospital is also feasible if its input-output levels satisfy: number of doctors ≥ 19, number of nurses ≥ 33 and outpatients ≤ 1000.
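To make the interpolation concrete, here is a minimal Python sketch of the convex combination behind the point just described. The individual normalised levels assumed for H02 and H03 are hypothetical placeholders (the slide's table is not reproduced in this transcript); they are chosen only so that the 50/50 mix reproduces the 19 doctors, 33 nurses and 1000 outpatients mentioned above.

```python
# Hypothetical illustration of the interpolation (convexity) assumption in DEA:
# a 50/50 mix of two observed hospitals is treated as a feasible 'virtual' hospital.
import numpy as np

h02 = np.array([18.0, 30.0, 1000.0])  # doctors, nurses, outpatients per 1000 outpatients (hypothetical)
h03 = np.array([20.0, 36.0, 1000.0])  # doctors, nurses, outpatients per 1000 outpatients (hypothetical)

virtual = 0.5 * h02 + 0.5 * h03       # component-wise convex combination
print(virtual)                        # [19. 33. 1000.] -> deemed feasible in principle
```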

A linear programming approach
[Table: normalised doctor, nurse and outpatient levels for hospitals H01 to H06.]
Min θ
such that:
doctors: λ1·D1 + λ2·D2 + … + λ6·D6 ≤ 25 θ
nurses: λ1·N1 + λ2·N2 + … + λ6·N6 ≤ 60 θ
outpatients: λ1·P1 + λ2·P2 + … + λ6·P6 ≥ 1000
λ1, λ2, λ3, λ4, λ5, λ6 ≥ 0,
where Dj, Nj and Pj are the doctor, nurse and outpatient levels of hospital j.
Solving the model: H02 and H03 are efficient peers to H01. Efficient doctor–nurse levels for H01 are (16.1, 38.5).
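The LP above can be solved with any linear programming routine. Below is a minimal, illustrative Python sketch of the input-oriented, constant-returns envelopment model using scipy. Only hospital H01's levels (25 doctors, 60 nurses, 1000 outpatients) come from the slide; the other five hospitals' figures are hypothetical placeholders, so the printed efficiency and peers are for illustration only.

```python
# Minimal, illustrative input-oriented CRS DEA envelopment model solved with scipy.
# Only H01's levels (25 doctors, 60 nurses, 1000 outpatients) come from the slide;
# the other five hospitals' figures below are HYPOTHETICAL placeholders.
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, k):
    """Efficiency of unit k: min theta s.t. sum_j lam_j*x_ij <= theta*x_ik (inputs),
    sum_j lam_j*y_rj >= y_rk (outputs), lam_j >= 0.
    X is (n_units, n_inputs), Y is (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # decision vector: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])  # input rows:  sum_j lam_j*x_ij - theta*x_ik <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])   # output rows: -sum_j lam_j*y_rj <= -y_rk
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0], res.x[1:]                    # (theta, lambdas identifying the peers)

# Columns of X: doctors, nurses per 1000 outpatients; Y: outpatients (1000 after normalisation)
X = np.array([[25, 60], [18, 30], [20, 36], [30, 50], [22, 55], [28, 45]], dtype=float)
Y = np.array([[1000]] * 6, dtype=float)
theta, lam = dea_input_efficiency(X, Y, 0)
print(f"Efficiency of H01: {theta:.3f}; peer weights (lambdas): {np.round(lam, 3)}")
```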

SAMPLE AREAS WHERE DEA HAS BEEN USED
- Banking
- Education
- Regulation of 'natural' monopolies (e.g. water distribution, electricity, gas)
- Tax collection
- Courts
- Sales forces (e.g. insurance sales teams)

TYPES OF ANALYSIS
- Relative efficiency measures;
- Identification of efficient exemplar units;
- Target setting;
- Decomposing efficiency in layered data (pupil, school, type of school etc.);
- Decomposing productivity changes over time into those attributable to the unit and those to technology shift (Malmquist indices);
- Identifying increasing/decreasing returns to scale;
- Measuring scale elasticity;
- Identifying most productive scale size.

ASSESSING EFFICIENCY AND PRODUCTIVITY IN UK UNIVERSITY ADMINISTRATION 1999/00 TO 2004/05. Prof. Emmanuel Thanassoulis

Aim of this project
- To identify units with cost-efficient, good-quality administrative services and disseminate good practice;
- To assess changes in the productivity of UK university administration over time (1999/00 to 2004/05).

Motivation for the Study
Expenditure on administration and central services is about 20% of total HEI expenditure in the UK. Hence expenditure on administration is substantial and competes with direct expenditure on academic activities.

Questions to be addressed:
- Are efficiency savings possible?
- Is there a trade-off between expenditure and quality of service in administration?
- Has there been any change over time in the productivity of administrative services in UK Higher Education Institutions (HEIs)?

The Assessment Method We opted for Data Envelopment Analysis (DEA). This is a linear programming based benchmarking method. It identifies cost-efficient benchmark units and then assesses the scope for savings at the rest of the units relative to the benchmarks.

DEA is illustrated in the graph below. An 'efficient boundary' is created from benchmark units (CAS12 and CAS2). The virtual benchmark VBM6 offers the top levels of research grants and students while keeping to the mix of research grants to students of CAS6. The distance from CAS6 to VBM6 reflects the scope for savings at CAS6.
[Graph: efficient boundary formed by CAS12 and CAS2, with CAS6 projected onto the boundary at its virtual benchmark VBM6.]

Unit of Assessment
The unit of assessment includes core CAS and administration devolved away from the centre. It excludes Estates, Libraries, Residences and catering.

The Unit of Assessment
Initial consultations with senior administrators and academics suggested the following conceptual framework for the unit of assessment:
- Support for research
- Support for non-admin staff
- Support for students

Input–Output Set
Assessments were carried out by setting the admin costs against the measures of admin volume shown below (at university level).
Admin cost measures (inputs):
- Admin staff costs
- OPEX on administration
Activity volume measures (outputs):
- Students (measure used: total income from students)
- Staff (measure used: non-administrative staff cost)
- Research grants and other services rendered (measure used: income from these sources)
- Research volume and quality (measure used: QR component of HEFCE income)

Definition of Admin Staff costs Descriptive definition: Admin Staff costs = Total non-academic staff cost in academic departments + non-academic staff cost in administration and services (central, general education and facilities) + non-academic staff cost in Research grants and contracts + non-academic staff cost in Other Services rendered.

Definition of OPEX: operating expenditure excluding staff costs and depreciation.
Descriptive definition: Total admin OPEX = OPEX in academic departments 'apportioned'¹ to non-academic staff costs + OPEX in administration and services (central, general education and facilities) + OPEX in research grants and contracts + OPEX in other services rendered.
¹ OPEX in academic departments was apportioned between academic and non-academic purposes in line with the percentages of staff costs corresponding to academic and non-academic staff.
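As an illustration of the apportionment rule in footnote 1, a short Python sketch; all figures used here are hypothetical.

```python
# Hedged sketch of the OPEX apportionment rule described above: departmental OPEX is
# split between academic and non-academic purposes in proportion to the corresponding
# staff-cost shares. All figures are HYPOTHETICAL illustrations.
def apportion_admin_opex(dept_opex, academic_staff_cost, non_academic_staff_cost):
    """Return the share of departmental OPEX attributed to non-academic (admin) purposes."""
    total_staff_cost = academic_staff_cost + non_academic_staff_cost
    non_academic_share = non_academic_staff_cost / total_staff_cost
    return dept_opex * non_academic_share

# Example: a department with 2.0m OPEX, 6.0m academic and 2.0m non-academic staff cost
print(apportion_admin_opex(2_000_000, 6_000_000, 2_000_000))  # -> 500000.0
```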

Definition of Activity Volume Measures (Outputs)
- Total income from students: the sum of the block grant for teaching received from HEFCE plus income from non-HEFCE funded students.
- Research grants and other services rendered: the sum of special initiatives; total research grants and contracts; total other services rendered; and IPR.
- Non-administrative staff costs.
- Quality-related research income: the RAE-based component of research income. (This output has been restricted so that its unit resource cannot be higher than that of any one of the remaining three outputs.)
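One way to formalise the restriction on the QR output, assuming it is imposed as a weight restriction on the multiplier side of the DEA model (where the weight u_r can be read as the admin resource imputed per £ of output r), is:

```latex
u_{QR} \;\le\; u_{Students}, \qquad
u_{QR} \;\le\; u_{Staff}, \qquad
u_{QR} \;\le\; u_{Research\ grants\ and\ other\ services}
```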

Model Specification
- VRS model, input oriented;
- Outliers have been re-scaled to be on the boundary formed by the non-outlier units;
- A balanced panel between the first (1999/00) and the last (2004/05) year is reported upon here;
- 100 universities are therefore included in the assessment reported here.
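For reference, a minimal statement of the input-oriented variable-returns-to-scale (VRS) envelopment model, assuming the standard Banker–Charnes–Cooper formulation; the convexity constraint on the λ's is what distinguishes it from the constant-returns model sketched in the hospital example:

```latex
\min_{\theta,\,\lambda}\ \theta
\quad \text{s.t.} \quad
\sum_{j=1}^{n}\lambda_j x_{ij} \le \theta\, x_{i0}\ \ (i=1,\dots,m), \quad
\sum_{j=1}^{n}\lambda_j y_{rj} \ge y_{r0}\ \ (r=1,\dots,s), \quad
\sum_{j=1}^{n}\lambda_j = 1, \quad \lambda_j \ge 0 .
```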

Findings at University Level – illustration
[Table: efficiency scores at a sample admin unit under the Staff, OPEX and Joint models, 1999/00 to 2004/05.]
The efficiency here is a percentage: it means the observed admin staff costs can be reduced each year to the percentage indicated in the table. So, relative to benchmark institutions for student income, non-admin staff expenditure and research income, administrative staff costs could be about half. The scope for efficiency savings on operating expenditure is less than for administrative staff, at about 25%–30%. If we assume staff and OPEX can substitute for each other, then the scope for efficiency savings is 30% or so on each of administrative staff and OPEX.

Comparing the sample university with its benchmark units on Administrative Staff Costs
Data are indexed so that the sample university of the previous slide = 1 on each variable.
[Table: indexed admin staff cost, OPEX, student income, non-admin staff, research grants and QR for Benchmark A and Benchmark B in 2004/05 and for Benchmark B in 1999/00.]

Sector Findings on Efficiency
There are some 20 units with good benchmark performance and no scope for further efficiency savings. About half of these benchmark units are found to be consistently cost-efficient in all six years of our analysis. These units could serve as examples of good operating practices in administration.
On the other hand, some of the units with large scope for efficiency savings are so in all six years, suggesting persistent issues of expenditure control. Visits to specific institutions did not support the notion that factors outside the model, such as quality of service, could explain differences in efficiency.
When we take administrative staff and OPEX as substitutable, we find that the scope for efficiency savings is about 15% per annum in each.

Productivity is defined as the ratio of output to resource used.

Productivity change = Efficiency catch-up × Boundary shift.
The efficiency catch-up term is the ratio of a unit's efficiency in the later period to its efficiency in the earlier period, and the boundary shift term is measured as a geometric mean (hence the power 0.5) of the shift in the efficient boundary evaluated at the data of the two periods (the Malmquist index decomposition).
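As an illustrative sketch (not the authors' exact computation), one common way to compute this decomposition from DEA efficiency scores is the Färe et al. geometric-mean form; the score values below are hypothetical.

```python
# Hedged sketch of the catch-up / boundary-shift (Malmquist) decomposition.
# Each score is an input-oriented DEA efficiency of one unit's data evaluated
# against the frontier of a given year; the numbers below are HYPOTHETICAL.
from math import sqrt

def malmquist(eff_t_on_t, eff_t1_on_t1, eff_t1_on_t, eff_t_on_t1):
    """Arguments are eff_<data period>_on_<frontier period>, values in (0, 1]."""
    catch_up = eff_t1_on_t1 / eff_t_on_t
    boundary_shift = sqrt((eff_t1_on_t / eff_t1_on_t1) * (eff_t_on_t / eff_t_on_t1))
    return catch_up * boundary_shift, catch_up, boundary_shift

# Example: the unit moves closer to the frontier while the frontier itself improves
prod, cu, bs = malmquist(0.80, 0.85, 0.90, 0.78)
print(round(prod, 3), round(cu, 3), round(bs, 3))  # an index above 1 means a productivity gain
```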

Bottom two HEIs on productivity
Data for 2004/05, constant prices, normalised so that 1999/00 = 1.
[Table: productivity index, admin staff cost, OPEX, student income, non-admin staff, research grants and QR for the two HEIs.]
In both universities the normalised data show a substantial rise in admin staff costs against a modest rise, or even a drop, in activities.

Top three HEIs on productivity
Data for 2004/05, constant prices, normalised so that 1999/00 = 1.
[Table: productivity index, admin staff cost, OPEX, student income, non-admin staff, research grants and QR for the three HEIs.]
In these three universities, which gain the most in admin productivity, admin costs fall over time while outputs rise, with only two exceptions on research income.

Productivity Change 1999/00 to 2004/05 – Staff Costs Model
[Histogram: distribution of productivity indices, split into improving and deteriorating units.]
NB: An index of 1 means stable productivity between 1999/00 and 2004/05. An index below 1 means a productivity drop; e.g. a value of 0.9 means the same workload costs about 11% more in 2004/05 than it did in 1999/00. An index above 1 means a productivity improvement.

Productivity Change Decomposition – Staff Model
[Charts: Efficiency catch-up; Boundary shift; Scale efficiency catch-up.]
- HEIs tend to catch up with the boundary over time.
- Benchmark performance tends to deteriorate over time.
- HEIs with negative effects due to scale exceed those with positive effects.

Productivity Change 1999/00 to 2004/05 – Staff and OPEX Combined
[Histogram: distribution of productivity indices, split into improving and deteriorating units.]
We have a virtually symmetrical distribution, with half the units improving and the rest deteriorating in productivity.

Productivity Change Decomposition – Joint Model
[Charts: Efficiency catch-up; Boundary shift; Scale efficiency catch-up.]
- There is clear evidence that the boundary is improving, in that best performance is more productive in 2004/05 than in 1999/00.
- Most units are falling behind the boundary.
- The catch-up and boundary shift results are the reverse of those when staff costs were the sole input. This suggests perhaps that non-staff expenses were managed better over time than staff costs.
- Scale efficiency deteriorates over time.

Summary of Findings on Productivity at Sector Level
Between 1999/00 and 2004/05 we find a divergent picture between administrative staff cost on the one hand and OPEX, or the two inputs taken jointly, on the other.
In the case of administrative staff taken as the sole input, we find on average a drop in productivity: for given levels of the proxy output variables, staff cost in 1999/00 is about 95% of what it is in 2004/05, at constant prices. Looking deeper, it is found that units generally do keep up with the benchmark units, but it is the benchmark units themselves that perform less productively, by almost 7% in 2004/05 compared to 1999/00. The situation is not helped by a slight loss of productivity through scale sizes becoming less productive, by about 1.5% in 2004/05 compared to 1999/00.
In contrast, when we look at OPEX as a sole input, or indeed at administrative staff and OPEX as joint resources, we find that productivity is more or less stable between 1999/00 and 2004/05. There is a slight loss of about 2%, but given the noise in the data this is not significant.

What is significantly different between administrative staff on the one hand and OPEX, or the two joint inputs, on the other is that benchmark performance improves on average by about 8% between 1999/00 and 2004/05. Unfortunately, non-benchmark units cannot quite keep up with this higher productivity. There is also again a slight deterioration in scale-size efficiency, and the net effect is that, despite the improved benchmark productivity, productivity on average is stable to slightly down between 1999/00 and 2004/05.
As with cost-efficient practices, here too our analysis has identified a small set of units which register significant productivity gains and others which register significant productivity losses between 1999/00 and 2004/05. An investigation of the causes in each case would be instrumental in offering advice to other units on how to gain in productivity over time and avoid losses.

Thank you…