Research Support Conference, 21 June 2011

Preparing for the REF: Experience from the HEFCE Impact Pilot
Dr Andrew Walsh, Director, Research & Business Engagement Support Services

Nature of the pilot
Five proposed REF UOAs were chosen in which to pilot the assessment of impact information:
- Clinical Medicine
- Physics
- Earth Systems and Environmental Sciences
- Social Work and Social Policy & Administration
- English Language & Literature
Twenty-nine HEIs from across the UK were selected to take part in the pilot, to provide a spread of at least 10 institutions with differing characteristics submitting to each pilot UOA. Earth Systems and Environmental Sciences and English Language & Literature were piloted in Manchester.

Submission contents
For each UoA:
- overview information relating to the submitted unit as a whole
- a number of (template-structured) case studies illustrating specific impacts (one for every ten 2008 Category A staff)
- Impacts must have occurred between January 2005 and December 2009
- The underpinning research could date back to 1993 and should be of at least 2* quality
- Impact is defined broadly as "any identifiable benefit to or positive influence on the economy, society, public policy or services, culture, the environment or quality of life"; it does not include impacts within the academic sphere or the advancement of scientific knowledge

Impact indicators
- Delivering highly skilled people
- Creating new businesses, improving the performance of existing businesses, or commercialising new products or processes
- Attracting R&D investment from global business
- Better informed public policy-making or improved public services
- Improved patient care or health outcomes
- Progress towards sustainable development, including environmental sustainability
- Cultural enrichment, including improved public engagement with science and research
- Improved social welfare, social cohesion or national security
- Other quality of life benefits

Impact quality levels
Criteria = 'range' and 'significance' of impacts
- Four star / Exceptional: Ground-breaking or transformative impacts of major value or significance, with wide-ranging relevance, have been demonstrated
- Three star / Excellent: Highly significant or innovative (but not quite ground-breaking) impacts relevant to several situations have been demonstrated
- Two star / Very good: Substantial impacts of more than incremental significance, or incremental improvements that are wide-ranging, have been demonstrated
- One star / Good: Impacts in the form of incremental improvements or process innovation of modest range have been demonstrated
- Unclassified: The impacts are of little or no significance or reach; or the underpinning research was not of high quality; or research-based activity within the submitted unit did not make a significant contribution to the impact

Attribution
Specific research-based activity within the institution must have made a significant contribution to achieving the impact. Either:
- Through research (fixed): underpinning research that was undertaken by staff within the institution. This could be any type of research that meets the Frascati principles, including basic research, practice-based research, and applied or translational research.
or
- Through researchers (portable): other research-based activity undertaken by research-active staff at the institution which drew or built substantially on their own research – e.g. the contribution of an individual as an expert advisor on a committee, where this was based to a significant degree on their personal research record.

[Diagram slides illustrating attribution scenarios:
- an impact influenced by Research 1 (HEI X), Research 2 (HEI Y) and Research 3 (HEI Z);
- the same impact where the contributing research is an HEI Y + HEI Z collaboration;
- an impact drawing on research at HEIs X, Y and Z, but exploited by HEI Z;
- a portable researcher: Prof A's underpinning research spans HEI X (1990-1995) and HEI Y (1995-2000, and later 2005-2009);
- a single piece of research at HEI X (1990-1995) leading to several impacts over 1995-2009;
- 'basic' research in Physics at HEI X exploited through 'applied' research in Engineering at the same HEI;
- Prof A acting as an expert advisor, where a public policy change draws on a general knowledge pool and, to varying degrees, on Prof A's own research at HEI X (or at HEIs X, Y and Z).]

Developing submissions – challenges?
- Relatively low level of appreciation of what counts as impact for the REF
- HEFCE advice is ambiguous in some areas; it is not always easy to distinguish fully between inputs (which "are not the focus of assessment") and interim or full outcomes (which are)
- Relevant indicators of impact range and significance are not readily available, sufficient or widely understood in the institution – a 'standing start'
- Difficulty and burden of contacting past researchers and end users in order to identify and evidence impact
- Difficult to determine what some stakeholders do with data provided by the institution, or what access we might have to impact indicators

Developing submissions – challenges (2)?
- Tendency, in case studies, for attribution of impacts to be stressed more than their range and significance
- Desire to avoid 'mere' knowledge transfer can lead to an overly inhibited account of the contribution of our research to major impacts
- Uncertainty about the appropriate scope for case studies, and about whether these need to be representative of UoA activity as a whole
- Tendency to focus on recent research activity by current members of staff with strategic academic potential. Are we missing profitable case studies relating to research areas that may now be dormant?

Earth Systems & Environmental Sciences – pilot impact profile (% of case studies at each quality level)*

                 4*    3*    2*    1*    U
UOA average      18    28    24    15    15
Institution A     0    50     0     0    50
Institution B    50     0     0     0    50
Institution C    25     0    25    50     0
Institution D    35    25    25    15     0
UoM              25    25    50     0     0
Institution F    25    25    25    25     0
Institution G     0     0    25    25    50
Institution H     0    50    25     0    25
Institution I     0    35    30     0    35
Institution J     0   100     0     0     0

* Impact statements not scored

English Language & Literature – pilot impact profile (% of case studies at each quality level)

                 4*    3*    2*    1*    U
UOA average      19    30    30    19     2
Institution A    25     0    50    25     0
UoM               0     0    20    80     0
Institution C    20    40    40     0     0
Institution D     0     0   100     0     0
Institution E    10    20    50    20     0
Institution F     0    20    80     0     0
Institution G    40    60     0     0     0
Institution H    25    50    25     0     0
Institution I    35    50    15     0     0
Institution J    40     0    60     0     0
Institution K     0    60     0    40     0
Institution L    20    30    20    15    15
Institution M     0     0     0   100     0
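
To make the two profiles above easier to compare at a glance, the sketch below computes a simple weighted "grade point average" for a quality profile. This is purely illustrative and not part of the pilot or REF methodology; the profile figures are taken from the tables above, and the 4-to-0 weighting is an assumption made for this example.

```python
# Minimal sketch: summarise an impact quality profile (percentages at 4*, 3*, 2*, 1*, U)
# as a weighted grade point average. Illustrative only; not part of the HEFCE pilot method.

def impact_gpa(profile):
    """profile: percentages at [4*, 3*, 2*, 1*, Unclassified], summing to 100."""
    weights = [4, 3, 2, 1, 0]
    return sum(w * p for w, p in zip(weights, profile)) / 100

# Figures taken from the tables above.
earth_systems_uoa_average = [18, 28, 24, 15, 15]
uom_earth_systems         = [25, 25, 50, 0, 0]
english_uoa_average       = [19, 30, 30, 19, 2]
uom_english               = [0, 0, 20, 80, 0]

print(impact_gpa(earth_systems_uoa_average))  # 2.19
print(impact_gpa(uom_earth_systems))          # 2.75
print(impact_gpa(english_uoa_average))        # 2.45
print(impact_gpa(uom_english))                # 1.2
```

On this rough measure, UoM sits above the UOA average in Earth Systems and Environmental Sciences and well below it in English Language & Literature, which matches the narrative in the slides that follow.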

What HEFCE and the panels learnt from the pilot
- Panels were broadly content that impact can be assessed in the REF by mixed academic/user panels
- The case study approach is the best mechanism for this assessment
- 'Reach' and 'significance' are appropriate broad criteria
- A 15-year time-lag between research and impact is reasonable in most cases
- Unconvinced of the need for a separate impact statement (elements should be incorporated into the environment ('RA5') statement) – *rejected*
- Recommend that the weighting for impact assessment should be lower than the proposed 25% for the first REF (but note CSR politics – likely to be 20%; see the illustrative calculation after this list)
- The highest scoring case studies provided "a coherent narrative with evidence of specific benefits"
- Many of the poorest case studies took the form: "Professor X is very well-known, here is a list of his/her media appearances and publications"
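
As a rough illustration of what a 20% impact weighting would mean in practice, the sketch below combines sub-profiles for outputs, impact and environment into an overall profile. The 65/20/15 split is the weighting eventually adopted for REF 2014 and is not stated in these slides; the sub-profiles themselves are hypothetical numbers chosen only for illustration.

```python
# Illustrative sketch only: how a 20% impact weighting would combine with the other
# REF elements. The 65/20/15 (outputs/impact/environment) split is an assumption
# taken from the weightings later used in REF 2014, not from these slides.

def overall_profile(outputs, impact, environment,
                    weights=(0.65, 0.20, 0.15)):
    """Each argument is a percentage profile [4*, 3*, 2*, 1*, U];
    returns the weighted overall profile."""
    w_out, w_imp, w_env = weights
    return [round(w_out * o + w_imp * i + w_env * e, 1)
            for o, i, e in zip(outputs, impact, environment)]

# Hypothetical sub-profiles for a single submission (illustrative numbers only).
outputs     = [20, 40, 30, 10, 0]
impact      = [25, 25, 50, 0, 0]
environment = [50, 50, 0, 0, 0]
print(overall_profile(outputs, impact, environment))
# [25.5, 38.5, 29.5, 6.5, 0.0]
```

Even at 20%, a weak impact sub-profile pulls the overall profile down appreciably, which is why the variability seen in the pilot tables matters.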

What have we learnt from the pilot?
- The variability of outcomes and the likelihood of a significant weighting indicate that this is a very significant challenge to REF performance
- Reasonable performance in Earth Systems and Environmental Sciences, including a 4* case study, provides a useful exemplar for cognate disciplines
- Disappointing performance in English Language and Literature indicates that efforts to argue for dissemination per se as impact are unlikely to be successful. We must:
  - "make a case for the benefits arising from public engagement activity… [T]his must go beyond showing how the research was disseminated"
  - show what "distinctive contribution the department's research made to the public engagement activity" and that the engagement went beyond "business as usual" activities (such as public lectures)
- HEFCE have published a selection of top-rated case studies from each panel

How might we move forward?
- Improve data on impact collected from the Research Profiling Exercise
- Begin developing a long-list of candidate case studies and collect evidence, e.g.:
  - collection of impact statements and reports for all grant-funded research
  - information on commercialisation impacts (e.g. spinouts – involving UMIP)
  - information on business and other relationships where research has not been commercialised by the University but where impacts are significant (involving external and business relations, regional affairs, media relations, etc.)
  - an accurate picture of postgraduate alumni and research staff destinations, especially those entering areas of national strategic importance (involving careers and employability)
- Consider the use of consultants to help quantify the commercial impact of our work, or the involvement of science journalists and other professionals to help case studies communicate impact or to review the past 20 years of research for eligible impacts
- UoM process for reviewing draft case studies in Summer 2011
- Further development in the light of panel criteria (released for consultation in July 2011)