Introduction to Outcomes Planning & Evaluation: Contribution Analysis


Introduction to Outcomes Planning & Evaluation: Contribution Analysis. H. McIntosh & S. Wilson, 20/10/15. Adapted in part from material prepared by John Connolly, UWS.

Agenda
10:00 Welcome & introductions
10:10 Overview of contribution analysis
10:40 Quiz
10:55 Participative demonstration: evaluating the impact of applying knowledge to practice, the Clinical Enquiry & Response Service (CLEAR)
11:15 Coffee
11:30 Group work
12:30 Overview of performance reporting
13:00 Lunch
13:45 Performance reporting
15:40 Next steps
16:00 Close

Objectives
By the end of the session you will be able to:
- List the key features of contribution analysis
- Appreciate how an outcomes planning approach facilitates evaluation for learning and continuous improvement as well as accountability
- Explain the outcomes chain to a colleague
- Outline a basic evaluation framework (outcomes chain and monitoring & evaluation plan) for a project relevant to your own area of work

Attribution and contribution
Questions about cause and effect are critical to evaluating impact.
- Attribution analysis asks: did the intervention cause the observed results?
- Contribution analysis asks: has the intervention contributed to the observed results? Has it made a difference?
Outcomes Planning & Evaluation (OPE) is the process of undertaking contribution analysis.

What is OPE?
A pragmatic approach to planning and evaluating the impact of programmes that operate in complex settings. OPE:
- Focuses on outcomes: does our programme make a difference?
- Is participative: it involves stakeholders
- Uses a theory of change to map out the expected pathways to the intended outcomes within a time-based framework
- Uses evidence to identify effective pathways
- Facilitates evaluation for learning and continuous improvement, not just accountability
- Supports performance reporting focussed on outcomes: the difference the programme is making

Theory of change
The theory of how and why an intervention works. It:
- Explains how a programme, as designed, will theoretically result in the intended outcomes
- Shows the plausible links between the programme's activities and subsequent outcomes
- Identifies underlying assumptions
- Takes account of the context in which interventions work: how external factors may affect results

OPE process
[Diagram: the OPE process cycle. Current situation -> outcome planning (theory of change) -> outcome plan -> monitoring & evaluation (gather evidence & develop the performance story) -> performance story -> performance reporting & sharing learning -> gather more evidence.]

Tools for OPE
- Outcomes chain: for outcome planning and the theory of change
- Monitoring & evaluation plan: developed from the outcomes chain; measuring success by identifying indicators
- Outcomes-focussed performance reporting template

The difference you make
[Diagram: the outcomes chain, overlaid with spheres of influence. Inputs (resources you need) -> Activities (what you do) -> Outputs (what you produce) -> Reach (who you reach) -> Short-term outcomes (changes in knowledge, skills, awareness) -> Medium-term outcomes (changes in behaviour, practice) -> Long-term outcomes (changes in health outcomes, the expected impacts). Inputs, activities and outputs sit within direct control; reach within direct influence; outcomes within indirect influence. External influences: organisational resources, skills and systems; existing practices and capacity; political, technological, socio-economic, environmental and other factors.]
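As a supplementary illustration, the outcomes chain can also be captured as a simple data structure. The sketch below is a minimal Python example, populated with the clinical enquiry service (CLEAR) chain used later in the session; the class and field names are illustrative only, not part of the OPE toolkit.

```python
from dataclasses import dataclass

# Minimal sketch (illustrative names only): the levels of an outcomes
# chain, moving from what you control towards what you can only influence.
@dataclass
class OutcomesChain:
    inputs: list[str]        # resources you need (direct control)
    activities: list[str]    # what you do (direct control)
    outputs: list[str]       # what you produce (direct control)
    reach: list[str]         # who you reach (direct influence)
    short_term: list[str]    # changes in knowledge, skills, awareness
    medium_term: list[str]   # changes in behaviour, practice
    long_term: list[str]     # changes in health outcomes (indirect influence)

# Populated with the CLEAR example from this session.
clear = OutcomesChain(
    inputs=["Team of information professionals and researchers"],
    activities=["Develop service: processes, train staff, develop website"],
    outputs=["Web-based evidence summaries answering clinical enquiries"],
    reach=["NHS staff in primary care or remote and rural areas"],
    short_term=["Useful, relevant service that saves practitioner time"],
    medium_term=["Evidence-informed clinical decisions"],
    long_term=["Improved patient care"],
)
```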

Outcome types
- Short term: usually associated with changes in knowledge, skills, attitudes, awareness. Examples: changed attitudes to alcohol and drinking; librarians acquire skills in evidence synthesis to support Practice Based Small Group Learning (PBSGL).
- Medium term: usually associated with changes in behaviour and practice at an individual or organisational level. Examples: reduced alcohol-related violence and abuse; librarians provide an evidence search & synthesis service to support PBSGL.
- Long term: the aspirational outcome, the overarching goal. Examples: safer and happier families and communities; more widespread application of evidence-based practice.

Assumptions, risks and competing explanations
- What assumptions do you make to get from one step in the outcomes chain to the next?
- What are the risks that getting from one step to the next won't happen as you imagine? Are there barriers you might anticipate? How might you mitigate these risks?
- What are the other influencing factors? Other programmes, policies and services; social and political factors; enablers as well as barriers.
- What alternative explanations could there be for how the intervention works?

Partner contributions to shared outcomes
[Diagram: an outcomes chain (inputs, activities/outputs, reach, short-term, medium-term and long-term outcomes, national/local outcomes) showing how several partners' contributions converge on shared changes in culture, practice & behaviours, and environments.]

Cross-sector partner contributions: alcohol
[Diagram: an outcomes chain combining the contributions of several partners.
Inputs: Scottish Government and NHS; voluntary organisations; police; local authorities; SG, UK governments and the EU; the alcohol industry.
Activities & outputs: screening & brief advice and media campaigns (preventive services, sensible drinking messages); detox and intensive support (addiction services); enforcement of drink-driving laws (random breath testing); enforcement of planning controls & licensing laws (enforcement actions); industry regulation (taxation, displays, promotions, advertising).
Reach: general public (targeted); hazardous and harmful drinkers; adults with alcohol problems; drivers; the licensed trade; the alcohol industry.
Short-term outcomes: understanding of risks and attitudes to drinking; service uptake & engagement; increased sobriety & stability; increased detection rate; compliance with laws; increased price and reduced incentives.
Medium-term outcomes. Behaviour: reduced alcohol consumption levels; less drunkenness; less drink-driving. Environments: physical, reduced exposure to alcohol-related hazards; economic, reduced availability and affordability of alcohol; social, drunkenness less attractive and sensible drinking the norm.
Long-term outcomes: improved mental wellbeing; reduced inequalities in healthy life expectancy; reduced inequalities in alcohol-related deaths and hospital admissions.]

Definitions recap
- Outputs are things produced as a result of programme activities.
- Outcomes are the benefits or results of a programme: the difference the programme makes; the changes that occur in the short, medium and long term as a result of the programme's activities and outputs.
- Outputs are not outcomes.

OPE process: monitoring & evaluation
[Diagram: the OPE process cycle as before, with the monitoring & evaluation stage highlighted: gather evidence & develop the performance story.]

Monitoring & evaluation plan
Developed from the outcomes chain:
- Define indicators to monitor progress and measure success
- Identify how and from where you will gather the evidence
- Agree who has responsibility for data gathering and storage
- Establish the analysis plan: how, when, who?
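A minimal sketch of what such a plan might look like as structured data: one row per indicator, each with a data source and an owner. The indicator wording is drawn from the CLEAR example later in this session; the data sources and responsibilities shown are invented placeholders.

```python
# Minimal sketch of an M&E plan derived from the outcomes chain.
# Data sources and owners below are hypothetical placeholders.
me_plan = {
    "short_term_outcomes": [
        {
            "indicator": "Staff state the service is useful and saves time",
            "data_source": "Post-enquiry feedback survey",   # assumption
            "responsibility": "Service manager",             # assumption
        },
    ],
    "medium_term_outcomes": [
        {
            "indicator": "Staff indicate their practice was informed "
                         "by the information provided",
            "data_source": "Follow-up survey of repeat users",  # assumption
            "responsibility": "Evaluation lead",                # assumption
        },
    ],
}

# Print the plan as a simple progress-measurement listing.
for level, rows in me_plan.items():
    for row in rows:
        print(f"{level}: {row['indicator']} "
              f"<- {row['data_source']} ({row['responsibility']})")
```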

Indicators
- Outcome indicators: outcomes are the benefits or results of a programme, the changes that occur in the short, medium and long term. An indicator is the measure of the extent to which the outcome is being achieved.
- Reach: measures of who was engaged or involved.
- Reaction: measures of people's reaction to the engagement.
- Resource use, activities undertaken, outputs produced: measures of the extent to which the programme was delivered as intended.

Defining indicators
What evidence do you need to be able to report meaningfully on the impact of your programme?
- Involve key stakeholders
- Consider what data are already collected
Indicators need to be:
- Measurable: recorded and analysed quantitatively or qualitatively
- Precise: understood in the same way by all stakeholders
- Sensitive and consistent
Consider what is ideal versus what is feasible. Do I have to evidence everything? Do I need balancing measures? Is hard data better than soft data? How many indicators do I need? Can I use proxy indicators?

Example: K2A support for PBSGL
Outcomes:
- Development of PBSGL modules routinely involves support from knowledge services
- The relationship between librarians and practitioners is strengthened
Indicators:
- Increase in the proportion of modules produced that draw on knowledge services' evidence search & synthesis service
- Change in librarians' perception of their working relationship with practitioners as a result of their involvement in module production
- Qualitative evidence that practitioners value collaboration with knowledge services
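The first indicator above is a simple proportion, so progress can be reported as the change in that proportion between reporting periods. The sketch below works through the arithmetic; the module counts are invented for illustration only.

```python
# Worked sketch of the first K2A indicator: the proportion of PBSGL
# modules drawing on the evidence search & synthesis service, compared
# across two reporting periods. All figures below are invented.

def proportion(drawing_on_service: int, total_modules: int) -> float:
    """Share of modules produced that used knowledge services' support."""
    return drawing_on_service / total_modules

baseline = proportion(4, 20)   # hypothetical: 4 of 20 modules in period 1
current = proportion(9, 18)    # hypothetical: 9 of 18 modules in period 2

change = current - baseline
print(f"Baseline: {baseline:.0%}, current: {current:.0%}, change: {change:+.0%}")
# -> Baseline: 20%, current: 50%, change: +30%
```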

Monitoring & evaluation plan template
For each level of the outcomes chain (long-term outcomes, medium-term outcomes, short-term outcomes, reach, outputs, activities, inputs), record the progress measurement: indicators, data source and responsibility.

Quiz & participative demonstration

Impact of a clinical enquiry service
For each step in the chain, what will success look like?
Team of information professionals and researchers -> Develop service: processes, train staff, develop website -> Web-based evidence summaries answering clinical enquiries -> NHS staff in primary care or remote and rural areas -> Useful, relevant, quality service which saves practitioner time -> Improved understanding of diagnosis, prognosis & therapies -> Evidence-informed clinical decisions -> Improved patient care

Assumptions, risks, indicators
Assumptions:
- Practitioners value evidence-based ways of working
- Practitioners will apply new learning to patient care
Risks:
- Inaccurate information reported in summaries
- External factors override the ability to apply new knowledge
Indicators:
- Number of requests received; number of requests by professional group or health board
- Number of visits to the website (reuse); number of repeat users
- Staff state they would recommend the service and that it saves time
- Staff indicate they received new information or resources
- Staff indicate their practice was informed by the information provided

OPE process: performance reporting
[Diagram: the OPE process cycle as before, with the performance reporting & sharing learning stage highlighted.]

Outcomes-focussed performance reporting
A performance report focussed on what difference you made. A template helps set out your report:
- A headline about your work
- What you were trying to do and why
- Your activities/outputs
- Your results
- Other factors/challenges
- Learning
The template is adaptable to brief and longer formats for different audiences. How often should we prepare a performance report?

Summary
- OPE is a pragmatic, participative and outcomes-focussed approach to planning and evaluating the impact of programmes that operate in complex settings
- It enables credible assessment of the contribution a programme makes to observed results
- It facilitates evaluation for learning and continuous improvement as well as accountability

The difference you make (handout only)
[Diagram: the outcomes chain. Inputs (resources you need) -> Activities (what you do) -> Outputs (what you produce) -> Reach (who you reach) -> Short-term outcomes (changes in knowledge, skills, awareness) -> Medium-term outcomes (changes in behaviour, practice) -> Long-term outcomes (changes in health outcomes, the expected impacts).]

Spheres of influence (handout only)
[Diagram: the outcomes chain overlaid with spheres of influence. Inputs, activities and outputs sit within direct control; reach within direct influence; short-, medium- and long-term outcomes within indirect influence. External influences: organisational resources, skills and systems; existing practices and capacity; political, technological, socio-economic, environmental and other factors.]