Performance-based funding of public research in tertiary education institutions: Country experiences

Presentation to the Norwegian Fagerberg Committee, 5 November 2010
Contact: Sarah Box, Directorate for Science, Technology and Industry, sarah.box@oecd.org
Higher education R&D: a small but important share of national R&D

[Chart: Higher education expenditure on R&D (HERD), 1998 and 2008]
Source: OECD Main Science and Technology Indicators (May 2010)
Institutional funding remains a key funding mechanism

[Chart: Government-funded R&D in higher education, by type of funding, 2008]
Source: OECD, Working Party of National Experts on Science and Technology (NESTI) project on public R&D funding, 2009.
Note: This is an experimental indicator; international comparability is currently limited.
This raises policy questions about the design of funding

Interest in ways of directing institutional funding: performance-based approaches
The OECD RIHR project covered:
– Models of performance-based research funding systems (PRFSs)
– Performance indicators used in PRFSs
– Impacts of PRFSs
– A country survey on funding models
A compilation of country experiences – debates and gaps in our knowledge
Project findings (1): Models

– 13 systems of ex post research output evaluation identified
– Motivated by resource concentration, publication and the pursuit of excellence
– Methods align with the level of focus, e.g. peer review for individuals
– System costs rarely discussed
– Even limited funding can have strong effects
Challenges and tensions

– No consensus on best practice, but peer judgment plus indicators at department level is emerging as the state of the art
– Still work to do on assessing the humanities and broader research impacts
– Tension: complexity vs practicality
– Internal university systems matter
– Do marginal gains fall over time?
– Alternatives? Rankings? Centres of excellence?
– Performance-based funding can't do everything – use a suite of funding tools
Project findings (2): Indicators

– The concept of performance remains ambiguous
– First-order indicators (inputs, processes, structures, outputs, effects)
– Second-order indicators (indices)
– Third-order indicators (peer review)
– To date, mainly peer review or monitoring of inputs/outputs, but increasing use of monitoring effects
– Need to understand the performance paradox and how indicators deteriorate
Project findings (3): Impacts

– Relative scarcity of evidence-based analysis – a challenge to distinguish reality from perception, and evidence from anecdote
– Impact is multi-faceted – aspects include funding, HR, productivity, quality…
– Universities aim to maximise funding
– Research is part of a wider system – performance assessment has broader impacts
– Need for more bibliometric analysis and cross-country studies
Australian university responses to government funding initiatives

[Chart]
Source: Butler (in OECD 2010).
Note: Q1–Q4 indicate the quartile of journals, as measured by citation impact; Q1 is the highest quartile, Q4 the lowest.
Project findings (4): Survey

– Features and impacts of performance schemes – 12 + 1 survey responses
– Most schemes have operated since 2000, with annual funding, covering all research types and fields, and using similar indicators (but in different combinations and weightings)
– Differences in budget impact, and in the role of institutions in scheme development and administration
– Positive effects on outputs and research management – but attribution?
– Negative effects, e.g. a narrowed research focus
Policy issues and debates

– Can there be multiple goals for schemes?
– Which indicators have the biggest impact on incentives?
– How to build a system of complementary performance indicators while keeping costs down?
– How do different funding streams work together to affect incentives?
– How can government co-ordination mechanisms be arranged to avoid friction between funding streams?
Policy issues and debates, cont.

– What level of budget impact is required? Absolute funding levels vs marginal effects?
– Fixed funding pool vs rewarding all improvements?
– How do internal university management systems influence the impacts of performance systems?
– When do the marginal returns of performance-based funding fall to zero?
Any lessons?

– A combination of strategic funding (forward-looking), performance-based funding (backward-looking) and stable funding may work best
– Be clear on what performance is desired and what will be funded
– Performance funding can support quality in smaller, local fields of research
– Need to analyse institutional changes under performance funding