Introduction to Evaluation

Introduction to Evaluation
West Metro Neighbourhood Houses
Centre for Evaluation and Research
December 2018

Acknowledgement of Country
The CER team are honoured to acknowledge the rich culture, traditions and elders of Victoria's Aboriginal community. We acknowledge the Traditional Owners and custodians of the land on which we gather today. Embracing the spirit of reconciliation, we recognise and value the ongoing and enriching contribution of Aboriginal communities to Victorian life.

Introductions
Who are we? Gina Mancuso, Jane Howard and Ellie McDonald, the Centre for Evaluation and Research, Victorian Department of Health and Human Services.
Training outline:
An overview of how to develop a program logic
An overview of how to develop evaluation questions
Evaluation next steps – including data sources and reporting
The program logic model links together what you want to know about your program, the evaluation questions and indicators you will use to answer those questions, and the data sources for those indicators. Working through your program logic model helps to clarify what you want to know about your program and how you might measure it, so that you measure what is appropriate and relevant. Logic models help to make explicit the assumptions we hold about the program, the people involved and how it will operate, and they can help you decide what focus your evaluation questions need to have. If you need to show that what you are doing is effective, or how you can improve your program, you might develop questions that focus on how your program was implemented. Alternatively, if you need to demonstrate to your funders that your program is achieving its stated outcomes or aims, then your evaluation questions will focus on the outcomes or expected benefits of your program (i.e. changes in behaviour, increased skills, changed attitudes or improved conditions).

Components of a program logic model
Inputs – resources necessary to implement activities and generate outputs: money, staff, volunteers, facilities, equipment, supplies.
Activities – linked to outputs (reframe each output as an active task): services, training, education (e.g. provide job training, educate the public about healthy eating or exercise, create mentoring partnerships for youth).
Outputs – direct products of program activities, i.e. units of service such as hours of service provided or products: number of participants, number of education/training sessions held, number and types of educational or training materials distributed, number of clients linked with mentors.
Outcomes – short to medium term changes (e.g. changes in knowledge, attitudes and skills; changes in actions, behaviours or policies; changes in environmental, social or economic conditions).
Assumptions – factors you take for granted in expecting the chain of events to take place: why we think activity X will lead to output Y and/or outcome Z. Are there other explanations for why activity X might lead to outcome Z? Will activity X always lead to outcome Z, or only under some circumstances and with some target groups?
External factors – factors that may affect the outputs and outcomes (positively or negatively) and over which you have little control: funding, the policy landscape, changes at local or state government, demographic makeup, family circumstances, values, the political environment, participants' backgrounds and experiences, media, policies and priorities.
Remember the logic model is just that, a model. It is not intended to portray a program in minute detail; rather, it is a way of understanding how a program might achieve its desired outcomes. A logic model needs to be clear, understandable and suited to its user and use. There is no single best model – we learn through trial and error and practice, and program logic models may change as a program evolves.
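One way to keep these components together during planning is a simple structured template. The sketch below is purely illustrative and is not part of the training materials; the field names and the example values (loosely based on the HIV training example used later in the session) are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramLogic:
    """Minimal logic model template (illustrative sketch only)."""
    issue: str
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)
    impact: str = ""
    assumptions: list[str] = field(default_factory=list)
    external_factors: list[str] = field(default_factory=list)

# Hypothetical example for a voluntary counselling and testing (VCT) training program.
vct_logic = ProgramLogic(
    issue="High HIV transmission rates in the target community",
    inputs=["funding", "trainers", "training venue", "training materials"],
    activities=["deliver VCT training sessions"],
    outputs=["number of training sessions held", "number of participants"],
    outcomes=["increased knowledge of transmission risks", "increased testing"],
    impact="Reduced HIV transmission rates",
    assumptions=["participants who attend will apply what they learn"],
    external_factors=["local health policy", "changes in funding"],
)
```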

The If and Then of logic models
Certain resources are needed to implement your program. If you have access to them, then you can use them to accomplish your planned activities. If you accomplish your planned activities, then you will hopefully deliver the amount of product and/or service that you intended. If you accomplish your planned activities to the extent you intended, then your participants will benefit in certain ways. If these benefits to participants are achieved, then certain changes in organisations, communities or systems might be expected.
This is the foundation of logic models and is our attempt to articulate a causal association between what we do and our expected outcomes. Outcomes can be split into short and long term: if you have certain resources, then you will be able to provide activities and produce services or products for targeted individuals or groups. If you reach those individuals or groups, then they will benefit in certain specific ways in the short term. If the short-term benefits are achieved to the extent expected, then the medium-term benefits can be accomplished. If the medium-term benefits for participants, organisations or decision-makers are achieved to the extent expected, then you would expect the longer-term improvements and final impact.
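As a small illustrative sketch (the wording of each stage is paraphrased from the chain above; the script itself is an assumption, not part of the workshop materials), the if-then chain can be written out mechanically from an ordered list of stages, which makes it easy to check that no link is missing:

```python
# Stages of the logic model chain, ordered from resources through to systems change.
stages = [
    "you have access to the necessary resources",
    "you accomplish your planned activities",
    "you deliver the intended amount of product and/or service",
    "participants benefit in the ways intended",
    "changes occur in organisations, communities or systems",
]

# Print each link in the chain as an explicit if-then statement,
# which is how the causal association is articulated.
for current, following in zip(stages, stages[1:]):
    print(f"If {current}, then {following}.")
```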

Developing your logic model: start with your outcome/impact and work back from there
Issue/situation: describe the problem your program is trying to address.
Impact: specify what you want your program to achieve as an indicator of your program's success (its impact, or long-term outcome).
Outcomes: what short to medium term outcomes need to happen if the impact is to be achieved?
Outputs: what outputs are needed to achieve these outcomes? You will need to identify what activities will actually be delivered, who will be engaged (i.e. who is your target group) and what kind of response or interaction will be needed from participants if the program is to be successful (an example output for an education program might be: X number of target participants completed the course).
Inputs: be as specific as possible about what elements will make up the program, the resources needed (i.e. people, partner organisations, skills, equipment, technology) and the activities.
Does it all make sense? Does the logic you have outlined make sense? Are your causal links plausible?

Activity: develop your own program logic
Identify a program you would like to evaluate. Begin filling out the columns in your program logic template, using any program documentation you've brought along. We will come around and answer any questions you have.
As on the previous slide, start with your outcome and work back from there:
Issue/situation: describe the problem your program is trying to address.
Impact: specify what you want your program to achieve as an indicator of your program's success (its impact, or long-term outcome).
Outcomes: what short to medium term outcomes need to happen if the impact is to be achieved?
Outputs: what outputs are needed to achieve these outcomes? Identify what activities will actually be delivered, who will be engaged (i.e. who is your target group) and what kind of response or interaction will be needed from participants if the program is to be successful (an example output for an education program might be: X number of target participants completed the course).
Inputs: be as specific as possible about what elements will make up the program, the resources needed (i.e. people, partner organisations, skills, equipment, technology) and the activities.
Does it all make sense? Does the logic you have outlined make sense? Are your causal links plausible?

Definitions
Remember: a logic model needs to be clear, understandable and suited to its user and use.
Inputs – resources necessary to implement activities and generate outputs. Examples: money, staff, volunteers, facilities, equipment, supplies.
Activities – linked to outputs. Examples: services, training, education.
Outputs – direct products of program activities. Examples: units of service (e.g. hours of service provided or products), number of participants, number of education/training sessions held, number and types of educational or training materials distributed, number of clients linked with mentors. These can become your indicators.
Outcomes – short to medium term: changes in knowledge, attitudes and skills. Longer term: changes in actions and behaviours.
Assumptions – underlying theories and beliefs about the program and its context. Why do we think activity X will lead to output Y and/or outcome Z? Are there other explanations for why activity X might lead to outcome Z? Will activity X always lead to outcome Z, or only under some circumstances and with some target groups?
External factors – factors that may affect the program and over which you have little control: cultural norms, the political climate, funding, and the backgrounds and experiences of participants.

Tea break

Developing evaluation questions
Evaluation questions should draw out the information that needs to be collected to effectively measure a program's contribution to change.
What is an evaluation question? Evaluation questions should focus on what you want to learn from a particular program. For example, if you want to understand how effective a program was, you could ask 'to what extent did the program's inputs, activities and outputs lead to the desired outcomes?' Evaluation questions act as an overarching guide for the evaluation and help you determine an appropriate methodology and data collection methods. It is important not to ask too many questions, as this can create a lot of work and lose the focus of your evaluation.
How do we develop evaluation questions?
Identify what you want to know
Identify the type of evaluation (formative/process/impact)
Identify how the questions fit in terms of your program logic model
Consult with program stakeholders
Prioritise the most important questions (limit the number of questions)

Key ways to break down evaluation questions
When developing evaluation questions, it is useful to break them down into five categories:
Appropriateness – the extent to which a program is appropriate for its specific target population and context
Effectiveness – the extent to which a program has been effective
Efficiency – the extent to which the program has been cost effective
Impact – the extent to which a program is on track to achieve its goal, or has achieved its goal
Sustainability – the extent to which there is funding or policy support for a program beyond the trial period
It can be hard to know exactly what to ask, even if you are clear on what you want to know. These five categories are a helpful guide for covering all the relevant areas. Typically, formative or process evaluations focus on appropriateness, effectiveness and efficiency questions. Outcome evaluations can focus on all five categories if appropriate, but don't need to. (Read out examples from the example sheet.)
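As a purely illustrative way of organising draft questions by category (the example questions below are assumptions based on the HIV training example used later in the session, not the workshop's example sheet):

```python
# Hypothetical draft questions grouped by the five categories above.
draft_questions = {
    "appropriateness": ["How well did the VCT training suit the needs of the target community?"],
    "effectiveness": ["To what extent did the program's inputs, activities and outputs lead to the desired outcomes?"],
    "efficiency": ["Was the training delivered within budget relative to the number of participants reached?"],
    "impact": ["To what extent did VCT training decrease HIV transmission rates?"],
    "sustainability": ["Is there funding or policy support to continue the training beyond the trial period?"],
}

# Print the draft question set, one category at a time.
for category, questions in draft_questions.items():
    print(category.title())
    for question in questions:
        print(" -", question)
```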

Let's develop some evaluation questions
To practise this process, we've provided an example of a program logic on your tables. Does anyone want to have a go at coming up with an evaluation question for this program? For example, if we want to understand the impact of this program, we could ask 'to what extent did VCT training decrease HIV transmission rates?'

Activity: develop your evaluation questions
Use the 'Evaluation checklist' to start drafting your evaluation questions. Draw on the handout with example questions for inspiration. We will float around to support the development of your evaluation questions.
Start by thinking about what you want to know at the end of your evaluation and what your stakeholders would like to know. What will you need to report on to your stakeholders? We will come around to provide guidance and answer any questions you have.

Overview of data sources
What types of data will help you answer your evaluation question and provide evidence about whether your indicator(s) have been met? Think about a range of data sources:
Monitoring data, including attendance rates and demographic information
Quantitative data from surveys
Qualitative data from interviews and focus groups
Documentary data from annual reports, case notes, student progress reports, and student work/output
On your evaluation checklists you will see columns to continue planning your evaluation. These include identifying your relevant stakeholders, identifying indicators (i.e. how you will know if your program is working) and identifying data sources. Indicators sound tricky, but they are really quite simple: indicators are measurements or values that tell you whether something is occurring. Use the outputs from your program logic as a guide for indicators. For example, an indicator could be that 60 people participated in HIV training – this indicates that the training was well attended and suggests that more people are better educated, which the program hopes will decrease HIV transmission rates. Data sources are a good next step. Looking at each evaluation question, how will you best answer it, and with what data source? Keep in mind what data is already available from the program and what data collection and data sources are feasible within the timeframes of the evaluation. Look at your list of key stakeholders when thinking about data sources, and think about the range of information and opinions that would be relevant to explore and document.
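As a purely illustrative sketch (the file name, column name and target value are assumptions, not from the workshop materials), monitoring data such as an attendance register could be turned into a simple participation indicator like this:

```python
import csv

def count_unique_participants(register_path: str) -> int:
    """Count distinct participants recorded in an attendance register CSV.

    Assumes the register has a 'participant_id' column; each row is one
    attendance record (a participant at a session).
    """
    participants = set()
    with open(register_path, newline="") as f:
        for row in csv.DictReader(f):
            participants.add(row["participant_id"])
    return len(participants)

if __name__ == "__main__":
    # Hypothetical indicator check against an assumed target of 60 participants.
    reached = count_unique_participants("attendance_register.csv")
    target = 60
    print(f"Participants reached: {reached} (target: {target})")
    print("Indicator met" if reached >= target else "Indicator not yet met")
```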

Next steps
Now that you have started developing your program logic and evaluation questions, the evaluation report template will support the continuation of your evaluation planning. The template provides guidance and prompts for areas including:
Methodology
Data collection and analysis
Stakeholder engagement
Governance
Ethical considerations

Thank you
Any questions?