The Evaluation of Publicly Funded Research
Berlin, 26/27 September 2005

Evaluation for a changing research base
Paul Hubbard, Head of Research Policy, HEFCE, UK

Evaluation for a changing research base

Background: some assumptions
Some models of funding and evaluation
Change drivers: the research base
Change drivers: the policy environment
Challenges to the evaluator
A remark about tools and indicators

Background: some assumptions

Governments fund academic research for its benefits to economic strength and social cohesion
Demand for funding will outpace budgets, forcing choices about what to support and increasing pressure to show what the funding buys
Research will increasingly be internationally competitive

Three models of funding and evaluation (1)

We commission R&D projects to find out things that we want to know. The findings will be a basis for further work or applications.
Funding at project and programme level with research approaches defined at the outset – evaluation can be built in.
Assumes the next programme can learn from the last one.

Three models (2)

We fund excellent research (originality, significance and rigour) where we find it, based on whole-sector peer review
High autonomy, freedom to fail, room for innovation and dynamism
Long evaluation-feedback times; hard to prove specific benefits have been gained

Three models (3)

We know what we want from the whole R&D base and will judge it on how well it delivers
Tends to mean we identify measurable desired outcomes and impacts
May prefer relevance over excellence and privilege user requirements
New figures annually, so quick feedback is possible – but may not see the whole picture

Change drivers: the research base

Increasing diversity of subject focus – new disciplines, interdisciplinary studies
Organisation – research collaboration across institutional boundaries and structured collaborative units

Change drivers: the research base (2)

New forms of dissemination:
– IT-enabled publication and citation analyses
– Benefits of shared datasets

Change drivers: the policy environment

Increased emphasis on showing:
– What public funding buys, including economic and social impact
– How funded research meets specific policy aims – strategically important knowledge
– How funded research meets the needs of other major stakeholders (industry, health services)
– The international standing of the national research effort

Challenges to the evaluator

Carrying the community with us:
– From college to consultant
– Lessons from 20 years of the UK Research Assessment Exercise (RAE)

Challenges to the evaluator

Looking to the paymasters:
– Targets and indicators: maintaining a balanced view
– Making the broader case for public investment: new approaches to demonstrating impact

Challenges to the evaluator

What approaches will we require to evaluate innovations in research organisation?
How can we identify and measure innovative capacity?

Challenges to the evaluator

When budgets are tight, how shall we make the case for speculative investment in "blue skies" research?

A remark about tools and indicators

We have at our disposal:
– Project and programme evaluation approaches
– Peer review
– Expert-informed assessment
– Bibliometric/citation indices
– Quantitative output measures
– Numerical and qualitative esteem indicators
– The "balanced scorecard"

A remark about tools and indicators (2)

International comparisons – how much do they really tell us?
Do we need more tools and indicators, and if so, where shall we find them?

Summary

No firm conclusions but some urgent and important questions