Wednesday 27th June 2018 12pm - 1.30pm

Presentation transcript:

REF Impact Case Studies: a session for researchers and research leaders
Dr Tracey Wond
Venue: Kedleston Road Sports Centre, Field Meeting Room

In this session, we will begin to explore how impact can be understood, managed and measured in the REF case study context. Drawing upon prior case studies, impact evaluation approaches and participants' prior experiences, we will examine the nature of impact and approaches to capturing it. The session should appeal to any researchers looking to develop their own research impact, as well as research leaders developing (and supporting others to develop) REF case studies. Refreshments will be provided. Limited to 20 places.

Dr Tracey Wond (t.wond@derby.ac.uk)
Joint UoA 17 lead, REF2021; UoA 19 lead, REF2014

Today:
PART 1 - Understand REF impact requirements
PART 2 - Explore impact
PART 3 - Consider how we might evaluate and evidence REF impact, learning from prior case studies (REF2014)

Introducing REF
REF = Research Excellence Framework.
- Subjects are divided into Units of Assessment (UoAs).
- UoAs are clustered into panels, and each panel issues its own guidance on what research excellence assessment looks like (including impact). Sub-panels representing UoAs work into these main panels.
- REF supports institutions to draw down QR (quality-related) funding and provides accountability.
- Impact was introduced in REF2014 to assess the wider impact of research. Impact is presented through case studies, which are then expertly reviewed. The case study is not about showcasing all of the research in a unit.

Introducing REF Impact
- Impact's weighting has increased since REF2014, reflecting the growing government priority for research that makes a difference, e.g. the UKRI ambition for 'economic impact' and 'social impact'. Steven Hill, Head of Research Policy, HEFCE: 'Delivering benefits to society must remain at the heart of the research endeavour'.
- In REF2014, panel guidance further outlined and exemplified impact. For REF2021, an update on impact is expected in summer 2018.
- Impact is eligible in the institution(s) in which the underpinning research was conducted.
- Impacts on teaching count both within and beyond the submitting institution.
- Impact must be underpinned by at least 2-star ('excellent') research produced between 01/01/2000 and 31/12/2020.
- Impacts must have occurred between 01/08/2013 and 31/07/2020.
- Impact is assessed by reach and significance (further update expected).
- A REF2014 case study may be continued or developed (but must show additionality).

Notes: In REF2014 the weightings were Impact 20%, Outputs 65%, Environment 15%. UK Research and Innovation (UKRI) explicitly prioritises research as supporting economic and social impact. The REF2021 methodology is still emerging; updates and clarifications have been released gradually, and we will know more about impact in summer 2018. See the Initial Decisions document: http://www.ref.ac.uk/publications/2017/initialdecisionsontheresearchexcellenceframework2021.html

A key recommendation of the Stern review was to ensure the REF could better capture the multiple and diverse pathways and mechanisms through which impact arises from a body of work, and through which real benefits to the UK and wider world are delivered. This aim was widely supported by respondents to the consultation, and the funding bodies will seek to implement it as follows. They will work with the panels to provide additional guidance on the criteria for impact of 'reach and significance', and on impact arising from public engagement. The guidance on submitting impacts on teaching will be widened to include impacts within, as well as beyond, the submitting institution, with appropriate guidance on demonstrating evidence against the criteria for this type of impact. There was also a decision to 'harmonise' the definition of impact with that of the research councils.

Defining REF Impact
In REF2014, impact was defined as: "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia". The panel criteria articulate this further.

'Underpinning Research'
Extracts from REF2014; the REF2021 guidance is not available yet.

Nature of Impact
Some UoAs got this wrong in REF2014: doing things (activities) is not the same as having impact, and REF panel chairs want to see the distinction between activities and impact clearly drawn. Outputs usually come after activities; in some submissions it appears that units tried to present research outputs (i.e. the underpinning research itself) as impact. Some sub-panel chairs reported that they did not see impact in some case studies, and this can also be observed in the past case studies that are publicly available. The number of tweets a post gains, for example, indicates activity rather than impact.

The W.K. Kellogg Foundation Logic Model is commonly recommended for understanding logic modelling and distinguishing inputs, activities, outputs, outcomes and impact: https://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide

Diagram source: Collecting Research Impact Evidence: Best Practice Guidance for the Research Community, https://www.research-strategy.admin.cam.ac.uk/files/collecting_research_impact_evidence_best_practice_guidance.pdf

Logic Modelling
A logic model plots how your activity is meant to work. Logic models are popular in evaluation practice; within evaluation we explore them a lot, and tend to start with them. A logic model can look similar to the impact arrow on the previous slide, and with the addition of context and rationale it can be built into a theory of how impact should happen. If your case study only covers the early stages (activities and outputs), it may be worth reviewing.
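The stages of a logic model can be sketched as a simple data structure. This is a minimal illustrative sketch, not part of any REF or Kellogg Foundation guidance; the stage names follow the common inputs-activities-outputs-outcomes-impact chain, and the example entries are hypothetical.

```python
# A hypothetical logic model for an impact case study.
# Stage names follow the common evaluation chain; entries are illustrative.
logic_model = {
    "inputs":     ["research team time", "project funding"],
    "activities": ["workshops with SMEs", "policy briefings"],
    "outputs":    ["training sessions delivered", "briefing documents published"],
    "outcomes":   ["changed business-support practice in partner organisations"],
    "impact":     ["measurable improvement for the targeted regional businesses"],
}

STAGES = ["inputs", "activities", "outputs", "outcomes", "impact"]

def furthest_stage(model):
    """Return the last stage for which the case study has any content."""
    completed = [s for s in STAGES if model.get(s)]
    return completed[-1] if completed else None

# A case study whose chain stops at "outputs" describes activity, not impact,
# and (per the slide above) may be worth reviewing before submission.
print(furthest_stage(logic_model))  # -> impact
```

The check mirrors the point of the slide: if the furthest populated stage is `outputs` or earlier, the narrative is describing activity rather than impact.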

Attribution – Making the Link
There is growing demand for quantitative data. Make the link between the underpinning research, the impact, and the impact evidence; be clear about how the impact occurred; and use specific evidence that clearly supports the impact and the narrative.

CMO – A Realist Evaluation Approach

Context – What is the context in which the impact occurs, and what considerations does this give rise to?
- The setting (e.g. entrepreneurs, SMEs, regional business support provision, policy).
- Other features of the context of the impact: reach, novelty, type of organisation, location.
- The intention for change/impact, which will shape the evaluative research questions, e.g.: to what extent did team Y's work on entrepreneurial learning influence the way in which business support was delivered (in the regions targeted, and beyond)?

Mechanism – How was the change effected, and using what mechanisms?
- The change might be planned or unplanned; if planned, it may be worth revisiting the 'theory of change'.
- Mechanisms might include outputs such as: satisfaction with a new approach; the number attending a session; the number of retweets a post gained.

Outcomes – What impact was created, and how can it be evidenced?
- REF usually provides guidance on what impact may look like.
- Evidence that the research led to the change, where possible with consideration for causality, linking the research intervention with the impact.

Flexibility is acknowledged in the REF documentation.
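A Context-Mechanism-Outcome triple for a single impact claim can be recorded as a small structured record. This is an illustrative sketch only; the field names and the example values are hypothetical, not drawn from REF guidance.

```python
# Hypothetical record for one Context-Mechanism-Outcome (CMO) impact claim.
from dataclasses import dataclass, field

@dataclass
class CMORecord:
    context: str                # the setting in which the impact occurs
    mechanism: list             # how the change was effected
    outcome: str                # the impact claimed
    evidence: list = field(default_factory=list)  # sources linking research to outcome

claim = CMORecord(
    context="Regional business-support provision for SMEs",
    mechanism=["training adopted by support agencies",
               "toolkit embedded in agency practice"],
    outcome="Delivery of business support changed in the targeted regions",
    evidence=["testimonial from agency director",
              "agency adoption figures"],
)

def needs_review(record):
    """Flag claims with no evidence: an outcome asserted without evidence
    cannot demonstrate the research-impact link."""
    return not record.evidence

print(needs_review(claim))  # -> False
```

Keeping the evidence list attached to each claim makes it easy to spot outcomes that are asserted but not yet evidenced, which echoes the attribution point on the previous slide.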

Impact Evidence – Types
Flexibility is acknowledged in the REF documentation. See handout: 'Guidance on the types of evidence that could be collected'. Source: Vertigo Ventures and Digital Science (2016), in Collecting Research Impact Evidence: Best Practice Guidance for the Research Community, https://www.research-strategy.admin.cam.ac.uk/files/collecting_research_impact_evidence_best_practice_guidance.pdf

Impact Evidence – REF2014 Analysis
Recognise variations across subject areas/panels. Analysis by Vertigo Ventures and Digital Science (2016), in Collecting Research Impact Evidence: Best Practice Guidance for the Research Community, https://www.research-strategy.admin.cam.ac.uk/files/collecting_research_impact_evidence_best_practice_guidance.pdf

Tips for Impact
See handout: 'Research Impact Process' (http://www.bath.ac.uk/ris/impact/toolkit/)

Further Resources
University of East Anglia, Impact Case Study Packs: https://portal.uea.ac.uk/documents/6207125/#gsc.tab=0&gsc.q=impact%20case%20study%20pack&gsc.sort=
University of Bath, Impact toolkit: http://www.bath.ac.uk/ris/impact/toolkit/