Setting the Stage: Workshop Framing and Crosscutting Issues
Simon Hearn, ODI
Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine
7-8 January, London, UK
What do we mean by complex interventions?
The nature of the intervention:
1. Focus of objectives
2. Governance
3. Consistency of implementation
How it works:
4. Necessariness
5. Sufficiency
6. Change trajectory
Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
What are the challenges of evaluating complex interventions?
Describing what is being implemented
Getting data about impacts
Attributing impacts to a particular programme
Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
Why a framework is needed
Source: Wikipedia, “Evaluation Methods”
Why a framework is needed
Image: Simon Kneebone. http://simonkneebone.com/
The Rainbow Framework
DEFINE what is to be evaluated
Why do we need to start with a clear definition?
Photo: Hobbies on a Budget / Flickr
1. Develop initial description
2. Develop program theory or logic model
3. Identify potential unintended results
Options for representing logic models:
Pipeline / results chain
Logical framework
Outcomes hierarchy / theory of change
Realist matrix
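One of the options above, the pipeline / results chain, can be sketched as an ordered data structure. A minimal illustration in Python follows; the stage names trace the common inputs-activities-outputs-outcomes-impacts chain, and the example entries are invented:

```python
from collections import OrderedDict

# A pipeline / results chain is an ordered mapping from stage to entries.
# The example entries below are hypothetical, for illustration only.
results_chain = OrderedDict([
    ("inputs",     ["funding", "staff"]),
    ("activities", ["train health workers"]),
    ("outputs",    ["workers trained"]),
    ("outcomes",   ["improved care practices"]),
    ("impacts",    ["reduced child mortality"]),
])

def render_chain(chain):
    """Render the chain as a single 'stage: entries' arrow diagram."""
    return " -> ".join(f"{stage}: {', '.join(items)}" for stage, items in chain.items())

print(render_chain(results_chain))
```

The other options (logical framework, outcomes hierarchy, realist matrix) add further dimensions, such as assumptions or context-mechanism-outcome configurations, that a single ordered chain cannot capture.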
FRAME what is to be evaluated
Frame Decision → Make Decision
Frame Evaluation → Design Evaluation
Source: Hobbies on a Budget / Flickr
1. Identify primary intended users
2. Decide purpose(s)
3. Specify key evaluation questions
4. Determine what ‘success’ looks like
DESCRIBE what happened
1. Sample
2. Use measures, indicators or metrics
3. Collect and/or retrieve data
4. Manage data
5. Combine qualitative and quantitative data
6. Analyze data
7. Visualize data
Combine qualitative and quantitative data
Purposes: enrich, examine, explain, triangulate
Timing: parallel, sequential
Design: component, integrated
UNDERSTAND CAUSES of outcomes and impacts
Outcomes Impacts
As a profession, we often either oversimplify causation or we overcomplicate it!
“In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.” Comment in an expert discussion on The Guardian online, May 2013
1. Check that the results support causal attribution
2. Compare results to the counterfactual
3. Investigate possible alternative explanations
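Step 2 above, comparing results to the counterfactual, reduces in the simplest comparison-group design to a difference in group means. A minimal sketch with invented outcome scores (a real analysis would also test significance and carry out step 3's search for alternative explanations):

```python
import statistics

# Hypothetical outcome scores after the program.
treatment = [72, 68, 75, 80, 71, 77]   # participants
comparison = [65, 63, 70, 66, 68, 64]  # similar non-participants
                                        # (the counterfactual estimate)

# Estimated effect = mean outcome with the program minus
# mean outcome without it.
effect = statistics.mean(treatment) - statistics.mean(comparison)
print(f"estimated effect: {effect:.1f} points")

# Step 3: before attributing this difference to the program, check
# alternative explanations (selection bias, secular trends, other programs).
```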
SYNTHESIZE data from one or more evaluations
Was it good?
Did it work?
Was it effective?
For whom did it work?
In what ways did it work?
Was it value for money?
Was it cost-effective?
Did it succeed in terms of the Triple Bottom Line?
How do we synthesize diverse evidence about performance?
[Table: overall synthesis ratings (GOOD, ??, BAD) for three scenarios: all intended impacts achieved; some intended impacts achieved; no negative impacts]
1. Synthesize data from a single evaluation
2. Synthesize data across evaluations
3. Generalize findings
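Synthesizing data from a single evaluation (step 1) is often made explicit with a weighted rubric. A minimal sketch in which the criteria, weights, ratings, and cut-points are all hypothetical:

```python
# Hypothetical rubric: each criterion gets a 1-5 rating and a weight.
ratings = {"effectiveness": 4, "value for money": 3, "equity": 2}
weights = {"effectiveness": 0.5, "value for money": 0.3, "equity": 0.2}

# Weighted average of the criterion ratings.
overall = sum(ratings[c] * weights[c] for c in ratings)

def verdict(score):
    """Map a weighted score onto an overall judgement (invented cut-points)."""
    if score >= 4.0:
        return "good"
    if score >= 2.5:
        return "adequate"
    return "poor"

print(f"overall: {overall:.1f} -> {verdict(overall)}")
```

Making the weights and cut-points explicit is the point of the exercise: the slide's GOOD / ?? / BAD example shows how the same evidence can yield different overall verdicts when the synthesis rule is left implicit.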
REPORT and SUPPORT USE of findings
“I can honestly say that not a day goes by when we don’t use those evaluations in one way or another.”
1. Identify reporting requirements
2. Develop reporting media
3. Ensure accessibility
4. Develop recommendations
5. Support use
MANAGE your evaluation
1. Understand and engage with stakeholders
2. Establish decision making processes
3. Decide who will conduct the evaluation
4. Determine and secure resources
5. Define ethical and quality evaluation standards
6. Document management processes and agreements
7. Develop evaluation plan or framework
8. Review evaluation
9. Develop evaluation capacity
Making decisions: look at the type of questions
Descriptive questions (DESCRIBE) – Was the policy implemented as planned?
Causal questions (UNDERSTAND CAUSES) – Did the policy change contribute to improved health outcomes?
Synthesis questions (SYNTHESIZE) – Was the policy overall a success?
Action questions (REPORT & SUPPORT USE) – What should we do?
Making decisions: compare pros and cons
Making decisions: create an evaluation matrix
[Table: rows are key evaluation questions, columns are data sources (Participant Questionnaire, Key Informant Interviews, Project Records, Observation of program implementation); checkmarks show which sources address each question]
KEQ1: What was the quality of implementation?
KEQ2: To what extent were the program objectives met?
KEQ3: What other impacts did the program have?
KEQ4: How could the program be improved?
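An evaluation matrix like the one above can also be kept as a simple mapping from key evaluation questions to data sources, which makes coverage gaps easy to tally. The question-to-source pairings below are illustrative, not the slide's actual checkmark placement:

```python
# Illustrative matrix: which data sources address which key evaluation
# question. Pairings are hypothetical, for demonstration only.
matrix = {
    "KEQ1 quality of implementation": {"questionnaire", "interviews", "records", "observation"},
    "KEQ2 objectives met":            {"questionnaire", "interviews", "records"},
    "KEQ3 other impacts":             {"interviews", "observation"},
    "KEQ4 improvements":              {"questionnaire", "interviews", "observation"},
}

def coverage(matrix):
    """Count how many questions each data source addresses."""
    counts = {}
    for sources in matrix.values():
        for s in sources:
            counts[s] = counts.get(s, 0) + 1
    return counts

for source, n in sorted(coverage(matrix).items()):
    print(f"{source:13s} covers {n} question(s)")
```

A question with few (or no) sources signals a design gap; a source that covers nothing can be dropped from the data-collection plan.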
www.betterevaluation.org
Examples · Descriptions · Tools · Guides · Comments · R & D · Documenting · Sharing · Events
Founding Partners Financial Supporters
For more information : www.betterevaluation.org/start_here