Impact assessment framework

Impact assessment framework
Natasha Innocent, Senior Policy Adviser: Learning and Skills, MLA

Background: how libraries support learning
- Strong roots in communities
- Expertise in working in partnership with a range of other organisations
- Critical role in digital participation
- Support for the wider benefits of learning
- Provide choice and flexibility
- Strong links to the Ageing Society agenda

What does the framework do?
- Helps libraries describe the impact of their learning activities on individuals and communities
- Provides a common language for talking to policy makers and funders about the role of libraries in supporting learning outcomes
- Offers a methodology to support libraries to plan, evaluate and develop learning activity

Inspiring Learning
- This framework is based on Inspiring Learning, commissioned by MLA and launched in 2003
- Inspiring Learning is based on a broad and inclusive definition of learning
- It launched 5 generic learning outcomes, joined more recently by 3 generic social outcomes
- www.inspiringlearningforall.gov.uk

Generic learning outcomes
- Knowledge and understanding
- Increase in skills
- Attitudes and values
- Enjoyment, inspiration, creativity
- Activity, behaviour and progression
Learning activities in libraries include any combination of the above.

The evaluation cycle:
1. Develop your story
2. Select SMART indicators
3. Decide on data collection methods
4. Analyse your results
5. Present and use your findings

Develop your story
- NEEDS: why are you developing this learning activity, and who is it for?
- INPUTS AND ACTIVITIES: what resources and activities do you need to develop to deliver this service?
- OUTPUTS AND OUTCOMES: what do you expect to achieve?

Selecting SMART indicators
- Indicators are used to measure, simplify and articulate your story
- Indicators can be descriptive, performance-based or efficiency-based
- Any good indicator should be SMART: simple to implement, measurable, action-focused, relevant and time-bound
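One way to make the SMART criteria concrete is to record each indicator as a structured record so that every criterion is stated explicitly. This is an illustrative sketch only; the class and field names are hypothetical, not part of the framework.

```python
from dataclasses import dataclass

# Hypothetical record type: each field forces one SMART criterion
# to be written down when an indicator is defined.
@dataclass
class Indicator:
    description: str  # what is measured (simple to implement)
    measure: str      # how it is measured (measurable)
    action: str       # the activity it relates to (action-focused)
    outcome: str      # the outcome it evidences (relevant)
    deadline: str     # when it will be assessed (time-bound)

# Example indicator for an ICT skills programme (made-up details).
ict_skills = Indicator(
    description="Participants reporting improved ICT skills",
    measure="Post-session evaluation form, question 3",
    action="Weekly computer taster sessions",
    outcome="Increase in skills (generic learning outcome)",
    deadline="End of programme review",
)
print(ict_skills.description)
```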

ENTITLE output indicators
- Number of people who have participated in library activities designed to improve their ICT skills
- Number of people who have participated in library activities designed to increase their interest in reading and/or improve their reading abilities

ENTITLE outcome indicators
- These are specific changes in the attitudes, behaviours, knowledge, skills and enjoyment of users as a result of library services
- ENTITLE has agreed that the key generic outcomes relating to library ICT and reading activity are knowledge and understanding, skills, and attitudes

How to collect your data
- Review the data you already collect to see if it already provides evidence of generic learning outcome indicators
- This will help you decide whether to adapt a data collection method you already use or develop a new method of data collection

Data you may already collect

Data source           | Collected when | Does it give evidence of learning?
Comments book         | Daily          | Yes
Annual library survey | Yearly         | Maybe
Evaluation forms      | Post project   |
Letters from users    |                |

Quantitative research
- Requires a large number of respondents to ensure any sample you take is representative and averages are meaningful
- Usually collected by questionnaires with multiple-choice questions, or by structured interviews with a series of closed or multiple-choice questions
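As a minimal sketch of what quantitative analysis of multiple-choice answers can look like, the snippet below tallies responses to a single closed question and reports each answer as a count and percentage. The question and responses are invented for illustration.

```python
from collections import Counter

# Made-up responses to a closed question such as
# "Did the session improve your ICT skills?"
responses = ["yes", "yes", "no", "yes", "not sure", "yes", "no", "yes"]

counts = Counter(responses)
total = len(responses)

# Report each answer with its count and share of all respondents.
for answer, n in counts.most_common():
    print(f"{answer}: {n} of {total} ({100 * n / total:.1f}%)")
```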

Qualitative research
- Useful for understanding something in depth
- Involves talking at greater length to a smaller group of people
- Qualitative information is gathered by face-to-face or telephone interviews, focus group discussions, comments cards or books, and open-ended questions on questionnaires
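Qualitative material is usually condensed by coding: a researcher assigns themes to each free-text comment, and the themes are then counted. A hedged sketch with invented comments and theme labels:

```python
from collections import Counter

# Made-up comments-book entries, each already coded with one or
# more themes by a researcher (the coding itself is a human step).
coded_comments = [
    {"comment": "I can now email my grandchildren",
     "themes": ["skills", "enjoyment"]},
    {"comment": "The tutor was very patient",
     "themes": ["attitudes"]},
    {"comment": "I want to join the reading group next",
     "themes": ["progression", "enjoyment"]},
]

# Tally how often each theme appears across all comments.
theme_counts = Counter(t for c in coded_comments for t in c["themes"])
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n}")
```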

Ethics
It is important that library staff consider ethical or legal issues when undertaking research. Laws vary from country to country, but there are shared principles on:
- Consent
- Interviewing children
- Confidentiality
- Data protection

Analyse your results
- Analysis of the data you collect will reveal whether the 'story' you mapped out at the beginning actually happened; in other words, you can test your hypothesis
- The approach to analysis will vary according to the type of tool you used and the type of data you collected, whether quantitative or qualitative

Things to think about when analysing data
- In interpreting results you need to decide whether a figure such as 64% is a high or low result
- Consider the meaning and significance of the results; you may need to group the data into categories depending on who the programme is targeted at
- Template 2 provides examples of how to link outputs and outcomes to an indicator, data collection methods and results
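Grouping results by target audience, as suggested above, can be sketched as follows: a headline figure is broken down per group so you can judge whether it is high or low for the people the programme was aimed at. All groups and data are invented for illustration.

```python
# Made-up survey records: each respondent belongs to a target group
# and either did or did not report the intended outcome.
records = [
    {"group": "over-60s", "outcome_achieved": True},
    {"group": "over-60s", "outcome_achieved": True},
    {"group": "over-60s", "outcome_achieved": False},
    {"group": "jobseekers", "outcome_achieved": True},
    {"group": "jobseekers", "outcome_achieved": False},
]

# Accumulate (achieved, total) per group; bools add as 0/1.
by_group = {}
for r in records:
    ok, total = by_group.get(r["group"], (0, 0))
    by_group[r["group"]] = (ok + r["outcome_achieved"], total + 1)

for group, (ok, total) in by_group.items():
    print(f"{group}: {ok}/{total} ({100 * ok / total:.0f}%)")
```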

Present and use your findings
- Highlight your key messages in an executive summary
- Describe the purpose of the learning activity that you have evaluated
- Describe your chosen methodology
- Think about how you could present your findings most effectively
- Relate your conclusions to your original story

Going further
Make your evidence more robust by:
- Examining the long-term impact of the activity: re-survey people to check whether there has been any further progress
- Sharing your findings with other libraries locally and nationally to establish a benchmark
- Comparing the results for users and non-users and identifying any significant differences

Advocacy
- Make the most of the evidence you collect
- Advocacy is speaking out to win influence and to gain support, recognition, partners and funding
- Combine your outcomes, research, policy and statistical evidence to create a powerful argument for the role your library plays in supporting lifelong learning

For further information contact:
Natasha Innocent, Senior Policy Officer: Learning and Skills, MLA Council
natasha.innocent@mla.gov.uk