Four key tasks in impact assessment of complex interventions
Background to a proposed research and development project
26 September 2008, Bioversity, Rome, Italy
Professor Patricia Rogers, Royal Melbourne Institute of Technology, Australia
What is impact?
'…the positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended. These effects can be economic, socio-cultural, institutional, environmental, technological or of other types.' (DAC definition)
Increasing attention to impact assessment in international development
- Center for Global Development – producers of the 'When Will We Ever Learn?' (WWWEL) report, which argued for more use of RCTs (randomised controlled trials)
- NONIE (Network of Networks on Impact Evaluation) – all UN agencies, all multilateral development banks and all international aid agencies of OECD countries, supporting better-quality impact evaluation, including sharing information and producing Guidelines for Impact Evaluation
- 3IE (International Initiative on Impact Evaluation) – new organisation funding and promoting rigorous impact evaluation
- Poverty Action Lab – stated purpose is to advocate for the wider use of RCTs
- European Evaluation Society – formal statement cautioning against the inappropriate use of RCTs
Different types of impact assessment may need different methods
Purpose:
- Knowledge building for replication and upscaling (by others?)
- Knowledge building for learning and improvement
- Accountability – to whom, for what, how?
Timing:
- Ex-ante
- Built into implementation
- Retrospective – soon afterwards, or many years later
Different aspects of an intervention may need different methods
- Simple aspects that can be tightly specified and standardized, and that work the same in all places (metaphors: a recipe; Microsoft Word)
- Complicated aspects that are part of a larger multi-component impact pathway (metaphors: a rocket ship; a jigsaw)
- Complex aspects that are highly adaptive, responsive and emergent (metaphor: raising a child)
Four key tasks in impact assessment
A) Decide which impacts are to be included in the assessment – conceptualise valued impacts
B) Gather evidence of impacts – describe and/or measure actual impacts
C) Analyse causal attribution or contribution
D) Report a synthesis of the impact assessment and support its use
Each of these tasks requires appropriate methods and involves both values and evidence.
A. Decide impacts to include. Need to:
- Include different dimensions – eg not just income but livelihoods
- Include the sustainability of these impacts, including environmental sustainability
- Not only focus on stated objectives – also unintended outcomes (positive and negative)
- Recognise the values of different stakeholders in terms of:
  - desirable and undesirable impacts
  - desirable and undesirable processes to achieve these impacts
  - desirable and undesirable distribution of benefits
- Identify the ways in which these impacts are understood to occur, and what else needs to be included in the analysis
A. Decide impacts to include. Some approaches:
- Program theory (impact pathway) – possibly developing multiple models of the program (eg Soft Systems) and negotiating boundaries (eg Critical Systems Heuristics)
- Participatory approaches to values clarification – eg Most Significant Change
B. Gather evidence of impacts. Need to:
- Balance credibility (especially comprehensiveness) and feasibility (especially timeliness and cost)
- Prioritise which impacts (and other variables) will be studied empirically, and to what extent
- Deal with time lags before impacts are evident
- Avoid accidental or systematic distortion of the level of impacts
B. Gather evidence of impacts. Some approaches:
- Program theory (impact pathway) – identify short-term results that can indicate longer-term impacts
- Participatory approaches – engaging the community in evidence gathering to increase reach and engagement
- Real world evaluation – mixed methods, triangulation, making maximum use of existing data, strategic sampling, rapid data collection methods
C. Analyse causal contribution or attribution. Need to:
- Avoid false negatives (erroneously thinking it doesn't work) and false positives (erroneously thinking it does work)
- Systematically search for disconfirming evidence, and analyse exceptions
- Distinguish between theory failure and implementation failure
- Understand the contribution of context: implementation environment, participant characteristics and other interventions
C. Analyse causal contribution or attribution. Some approaches:
- Addressing through design – eg experimental designs (random assignment) and quasi-experimental designs (construction of a comparison group, eg via propensity scores)
- Addressing through data collection – eg participatory Beneficiary Assessment, expert judgement
- Addressing through iterative analysis and collection – eg Contribution Analysis, Multiple Levels and Lines of Evidence (MLLE), List of Possible Causes (LOPC) and General Elimination Methodology (GEM), systematic qualitative data analysis, realist analysis of testable hypotheses
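The comparison-group idea behind quasi-experimental designs can be illustrated with a minimal, entirely hypothetical simulation (all names and numbers are invented, and one-covariate nearest-neighbour matching stands in for full propensity-score matching): when uptake of an intervention depends on a characteristic that also drives the outcome, a naive treated-vs-untreated comparison is biased, while matching on that characteristic recovers something close to the true effect.

```python
import bisect
import random

random.seed(0)
TRUE_EFFECT = 2.0  # hypothetical impact of the intervention on yield

# Simulate farmers: those with more land are likelier to adopt the
# intervention AND tend to have higher yields (land is a confounder).
units = []
for _ in range(5000):
    land = random.uniform(0, 10)
    adopted = random.random() < land / 10
    outcome = land + (TRUE_EFFECT if adopted else 0.0) + random.gauss(0, 1)
    units.append((land, adopted, outcome))

def mean(xs):
    return sum(xs) / len(xs)

treated = [u for u in units if u[1]]
control = [u for u in units if not u[1]]

# Naive comparison: biased upwards, because adopters had more land anyway.
naive = mean([u[2] for u in treated]) - mean([u[2] for u in control])

# Crude matching: pair each treated unit with the control unit whose land
# size is closest (a one-covariate stand-in for propensity-score matching).
control_sorted = sorted(control)
matched_diffs = []
for land, _, outcome in treated:
    i = bisect.bisect(control_sorted, (land,))
    nearby = control_sorted[max(0, i - 1):i + 1]
    match = min(nearby, key=lambda c: abs(c[0] - land))
    matched_diffs.append(outcome - match[2])
matched = mean(matched_diffs)

print(f"true effect: {TRUE_EFFECT}, naive: {naive:.2f}, matched: {matched:.2f}")
```

In this toy setup the naive estimate lands well above the true effect of 2.0, while the matched estimate lands close to it; real propensity-score methods generalise the same idea to many covariates at once.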
D. Report synthesis and support use. Need to:
- Provide useful information to intended users
- Provide a synthesis that summarises evidence and values
- Balance the overall pattern and the detail
- Assist uptake/translation of evidence
D. Report synthesis and support use. Some approaches:
- Use focus – utilization-focused evaluation: identification and involvement of intended users from the start
- Synthesis – Qualitative Weight and Sum and other techniques to determine overall worth
- Reporting – layered reports (1 page, 5 pages, 25 pages); scenarios showing different outcomes in different contexts; workshopping the report to support knowledge translation
Intervention is both necessary and sufficient to produce the impact ('silver bullet': simple impacts). [Diagram: intervention → impact; no intervention → no impact]
Causal packages ('jigsaw': complicated impacts). [Diagram: intervention + favourable context → impacts]
Intervention is necessary but not sufficient to produce the impact ('jigsaw': complicated impacts). [Diagram: intervention + favourable context → impact; intervention + unfavourable context → no impact]
Intervention is sufficient but not necessary to produce the impact ('parallel': complicated impacts). [Diagram: intervention → impact; alternative activity, with no intervention → impact]
'Life is a path you beat by walking' (complex impacts). [Diagram: an adaptive path from Plan A through intermediate results to Plan B and Plan C, eventually producing the impact]
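The first three causal structures above can be written down as tiny boolean predicates, a deliberately deterministic toy rather than a model of any real program (the complex, path-beating case is precisely the one that resists a fixed formula like this):

```python
# Each function answers: does the impact occur? Inputs are whether the
# intervention happened, whether the context is favourable, and whether
# some alternative activity happened.

def silver_bullet(intervention, favourable, alternative):
    # Necessary and sufficient: the intervention alone settles the outcome.
    return intervention

def jigsaw(intervention, favourable, alternative):
    # Necessary but not sufficient: also needs a favourable context.
    return intervention and favourable

def parallel(intervention, favourable, alternative):
    # Sufficient but not necessary: an alternative activity also works.
    return intervention or alternative

# The jigsaw pattern is why the same intervention can 'fail' in one
# context and succeed in another, with no change to the intervention.
print(jigsaw(True, True, False))   # favourable context: impact
print(jigsaw(True, False, False))  # unfavourable context: no impact
```

Reading the structures this way makes the evaluation consequence explicit: only in the silver-bullet case does observing the intervention alone predict the impact, so an assessment of the other patterns must also gather evidence about context and alternative causes.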
EXEMPLAR 1: POTTED PLANTS
FINDING: If two potted plants are randomly assigned either to a treatment group that receives daily water or to a control group that receives none, and both groups are placed in a dark cupboard, the treatment group does not have better outcomes than the control.
CONCLUSION: Watering plants is ineffective in making them grow.
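The potted-plants finding can be reproduced in a toy simulation (all numbers invented): water and light form a causal package, so a perfectly randomised trial run in the dark correctly estimates a zero effect for that context, yet the conclusion 'watering is ineffective' is a false negative about watering in general.

```python
import random

random.seed(1)

def growth(watered: bool, light: bool) -> float:
    # Hypothetical causal package: water produces growth ONLY with light.
    effect = 5.0 if (watered and light) else 0.0
    return effect + random.gauss(0, 1)  # plus measurement noise

def rct_effect(light: bool, n: int = 2000) -> float:
    # A clean randomised trial: n treated plants vs n controls,
    # all grown in the same (dark or lit) context.
    treated = [growth(True, light) for _ in range(n)]
    control = [growth(False, light) for _ in range(n)]
    return sum(treated) / n - sum(control) / n

print(f"dark cupboard: {rct_effect(light=False):.2f}")   # near 0
print(f"favourable context: {rct_effect(light=True):.2f}")  # near 5
```

The randomisation is flawless in both runs; what differs is the context, which is exactly the point of the 'necessary but not sufficient' pattern: a null result tells you the package did not fire here, not that the intervention can never work.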
EXEMPLAR 2: FLIPCHARTS IN KENYA
FINDING: When classes were randomly assigned either to have the teacher use flipcharts or to a control that received none, and both groups continued to experience the other factors limiting student achievement, the treatment group did not have better outcomes than the control.
CONCLUSION: Flipcharts are ineffective in improving student achievement.
How Exemplar 2 has been presented:
'Good studies distinguish real successes from apparent successes. Poorly done evaluations may mistakenly attribute positive impacts to a program when the positive results are due to something else. For example, retrospective studies in Kenya erroneously attributed improved student test scores to the provision of audiovisual aids. More rigorous random-assignment studies demonstrated little or no effect, signaling policymakers of the need to consider why there was no impact and challenging program designers to reconsider their assumptions (Glewwe and others 2004).' (WWWEL report)
Proposed research and development project
A 3-year project to:
- Trial methods for impact assessment of participatory agricultural research and development projects and programs
- Synthesise learnings from impact assessments of these types of projects and programs
- Support capacity development (resources and training) in impact assessment for these types of projects and programs
References
Glouberman, S. and Zimmerman, B. (2002) Complicated and Complex Systems: What Would Successful Reform of Medicare Look Like? Commission on the Future of Health Care in Canada, Discussion Paper 8. Available at http://www.healthandeverything.org/pubs/Glouberman_E.pdf
Mackie, J. (1974) The Cement of the Universe. Oxford: Oxford University Press.
Mark, M.R. (2001) What Works and How Can We Tell? Evaluation Seminar 2. Victoria Department of Natural Resources and Environment.
Rogers, P.J. (2001) Impact Evaluation Research Report. Department of Natural Resources and Environment, Victoria.
Rogers, P.J. (2008) 'Using programme theory for complicated and complex programmes'. Evaluation: The International Journal of Theory, Research and Practice 14 (1): 29-48.
Rogers, P.J. (2008) 'Impact Evaluation Guidance. Subgroup 2'. Meeting of NONIE (Network of Networks on Impact Evaluation), Washington, DC.
Ross, H.L., Campbell, D.T. and Glass, G.V. (1970) 'Determining the social effects of a legal reform'. In S.S. Nagel (ed.), Law and Social Change (pp. 15-32). Beverly Hills, CA: SAGE.