So, did it work? Jenni Carr, Academic Development Officer, HEA. 1 April 2014

Presentation transcript:

So, did it work? Jenni Carr, Academic Development Officer, HEA. 1 April 2014

Evaluation and/or impact assessment?
Evaluation concentrates on processes and outcomes. It seeks to find out what is happening or has happened as a result of the activity, what is (or is not) working and what did (or did not) work. An evaluation typically considers:
- the objectives (the intended achievements) of the activity being undertaken;
- the inputs (resources) into the activity, including money and people's time;
- the outputs (activities and products) of the activity;
- the outcomes of the activity (what happens as a result of the activity).
It is important to distinguish between evaluation and impact assessment. Assessing impact requires a focus on 'measuring' change, so when planning activities you need a model that is designed to do this.
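As a purely illustrative sketch, an evaluation plan can be recorded against those four headings. The activity and every entry below are hypothetical, invented only to make the distinction concrete.

```python
# Hypothetical evaluation plan for an outreach workshop, recorded against
# the four headings above: objectives, inputs, outputs, outcomes.
evaluation_plan = {
    "objectives": ["Raise pupils' awareness of social science research"],
    "inputs": ["2 days of staff time", "workshop materials budget"],
    "outputs": ["4 workshops delivered", "120 pupils attended"],
    "outcomes": ["Pupils report greater interest in social science"],
}

# Evaluation asks how the outputs and outcomes relate to the objectives;
# impact assessment additionally asks how the change described in the
# outcomes will be measured.
for heading, entries in evaluation_plan.items():
    print(f"{heading}:")
    for entry in entries:
        print(f"  - {entry}")
```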

Impact is about change
The theory of change model asks, in order:
- What do we want to achieve?
- What needs to change in order to achieve this?
- What activity is needed in order to bring about those changes?
Sometimes the planning process, especially when it is tied to bids for funding, leads to a focus on designing activity first. This model is designed to make planning focus on impact first: it always starts with 'What do we want to achieve?'. It is useful both at the beginning of a project and for reviewing progress in the light of interim evaluations.

Participants in change
(In)visible witnesses: children and young people; educators; funding bodies; programme-makers.
HEA Social Sciences strategic priorities: project teams; the wider higher education community; lecturers; deans; PVCs.
First you need to identify who the participants in change will be. They will participate at different levels. For example, in the HEA Social Sciences strategic priorities work we aim to change the practice of project teams and lecturers, but to increase the understanding of Deans and PVCs. So how do we plan activities for impact at these different levels?

Generic learning outcomes (GLO)
All impact/change can be related to the generic learning outcomes (GLO) model. This version is particularly useful because it was developed as part of the work of the Museums, Libraries and Archives Council (MLA) and is relevant to the SUPI projects. The URL below takes you to an interactive version of the model, with a more detailed description of each category. Further useful resources on the website include guidance on recording and analysing both qualitative and quantitative data, activities for practising your GLO coding, and further advice on measuring outcomes.
http://www.inspiringlearningforall.gov.uk/toolstemplates/genericlearning/
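To show what GLO coding can feed into, here is a minimal sketch that tallies hand-coded free-text responses against the five GLO categories. The category names follow the MLA model; the responses and the helper function are hypothetical, invented for illustration.

```python
from collections import Counter

# The five categories of the MLA generic learning outcomes model.
GLO_CATEGORIES = [
    "Knowledge and understanding",
    "Skills",
    "Attitudes and values",
    "Enjoyment, inspiration and creativity",
    "Activity, behaviour and progression",
]

# Hypothetical free-text responses, each already coded by hand with
# one or more GLO categories.
coded_responses = [
    {"id": 1, "codes": ["Skills", "Enjoyment, inspiration and creativity"]},
    {"id": 2, "codes": ["Knowledge and understanding"]},
    {"id": 3, "codes": ["Attitudes and values", "Skills"]},
]

def tally_glo_codes(responses):
    """Count how often each GLO category was applied across responses."""
    counts = Counter()
    for response in responses:
        counts.update(response["codes"])
    # Report every category, including any that were never applied.
    return {category: counts.get(category, 0) for category in GLO_CATEGORIES}

for category, n in tally_glo_codes(coded_responses).items():
    print(f"{category}: {n}")
```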

Some thoughts on methods
- The value of quantitative data, but remember to measure change
- Social networking
- Games, including competitions
- Simulations
- 'Making is doing' creative activities

If you are going to use surveys or questionnaires to generate quantitative data, remember to design one section of questions that focuses on measuring change (a sketch of this follows below). This is generally more effective if you combine it with opportunities for free-text answers, so you also need to think about how you will handle the qualitative data. Some level of co-ordination across the different projects that are part of the initiative could be useful: sharing questionnaires that can then be adapted if needed not only allows for some comparison, but also helps with workload. There is also a question bank available on the MLA website: http://www.inspiringlearningforall.gov.uk/resources/research.html

There are numerous tools for capturing data from social networking. See the LSE Impact blog, http://blogs.lse.ac.uk/impactofsocialsciences/, which includes an excellent 'basics' guide for Twitter.

Game-based learning: remember that for maximum impact participants need to have some idea of what the game is designed to achieve, otherwise it is 'just a game', and you need to think about how you will measure the outcomes/changes. Alex Moseley (University of Leicester) has a useful blog and has always been helpful in responding to queries: http://moerg.wordpress.com/contact/

Young people in particular seem to enjoy simulation activities. If the scenarios and activity sheets are well written, these activities give them a high degree of autonomy. Combine them with a reflective activity to help 'focus down' on outcomes.

David Gauntlett's Art Lab website has some useful ideas (http://www.artlab.org.uk/), as do the MLA case studies at http://www.inspiringlearningforall.gov.uk/successstories/. The download section, http://www.inspiringlearningforall.gov.uk/resources/research.html, has good resources for measuring outcomes, particularly useful for interpreting visual images.
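As a minimal sketch of measuring change from survey data: assume the same 1-5 Likert-scale question was asked before and after the activity, and match each participant's two answers. The participant IDs and scores below are hypothetical.

```python
# Hypothetical matched pre/post responses to the same 1-5 Likert question.
pre_scores = {"p01": 2, "p02": 3, "p03": 2, "p04": 4}   # before the activity
post_scores = {"p01": 4, "p02": 3, "p03": 5, "p04": 4}  # after the activity

# Only participants who answered both waves can show measured change.
matched = sorted(set(pre_scores) & set(post_scores))
changes = [post_scores[p] - pre_scores[p] for p in matched]

mean_change = sum(changes) / len(changes)
improved = sum(1 for c in changes if c > 0)

print(f"Matched participants: {len(matched)}")
print(f"Mean change in score: {mean_change:+.2f}")
print(f"Improved: {improved} of {len(matched)}")
```

The point is the design rather than the arithmetic: without a pre-activity baseline (or a retrospective 'before and after' question), a single post-activity survey can describe satisfaction but cannot measure change.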