Presentation transcript:

Four key tasks in impact assessment of complex interventions
Background to a proposed research and development project
Professor Patricia Rogers, Royal Melbourne Institute of Technology, Australia
26 September 2008, Bioversity, Rome, Italy

What is impact?
'…the positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended. These effects can be economic, socio-cultural, institutional, environmental, technological or of other types.' (OECD DAC definition)

Increasing attention to impact assessment in international development
- Center for Global Development: producers of the 'When Will We Ever Learn?' (WWWEL) report, which argued for more use of RCTs (randomised controlled trials)
- NONIE, the Network of Networks on Impact Evaluation: all UN agencies, all multilateral development banks and all international aid agencies of OECD countries, supporting better-quality impact evaluation, including sharing information and producing guidelines for impact evaluation
- 3ie, the International Initiative for Impact Evaluation: new organisation funding and promoting rigorous impact evaluation
- Poverty Action Lab: stated purpose is to advocate for the wider use of RCTs
- European Evaluation Society: formal statement cautioning against the inappropriate use of RCTs

Different types of impact assessment may need different methods
Purpose:
- Knowledge building for replication and upscaling (by others?)
- Knowledge building for learning and improvement
- Accountability – to whom, for what, how?
Timing:
- Ex-ante
- Built into implementation
- Retrospective – soon afterwards, or many years later

Different aspects of an intervention may need different methods
- Simple aspects that can be tightly specified and standardised, and that work the same in all places (metaphors: a recipe; Microsoft Word)
- Complicated aspects that are part of a larger multi-component impact pathway (metaphors: a rocket ship; a jigsaw)
- Complex aspects that are highly adaptive, responsive and emergent (metaphor: raising a child)

Four key tasks in impact assessment
A) Decide the impacts to be included in the assessment – conceptualise valued impacts
B) Gather evidence of impacts – describe and/or measure actual impacts
C) Analyse causal attribution or contribution
D) Report a synthesis of the impact assessment and support its use
Each of these tasks requires appropriate methods and involves both values and evidence.

A. Decide impacts to include. Need to:
- Include different dimensions – eg not just income but livelihoods
- Include the sustainability of these impacts, including environmental sustainability
- Not focus only on stated objectives – also unintended outcomes (positive and negative)
- Recognise the values of different stakeholders in terms of:
  - desirable and undesirable impacts
  - desirable and undesirable processes to achieve these impacts
  - desirable and undesirable distribution of benefits
- Identify the ways in which these impacts are understood to occur, and what else needs to be included in the analysis

A. Decide impacts to include. Some approaches:
- Program theory (impact pathway) – possibly developing multiple models of the program (eg Soft Systems) and negotiating boundaries (eg Critical Systems Heuristics)
- Participatory approaches to values clarification – eg Most Significant Change

B. Gather evidence of impacts. Need to:
- Balance credibility (especially comprehensiveness) and feasibility (especially timeliness and cost)
- Prioritise which impacts (and other variables) will be studied empirically, and to what extent
- Deal with time lags before impacts are evident
- Avoid accidental or systematic distortion of the level of impacts

B. Gather evidence of impacts. Some approaches:
- Program theory (impact pathway) – identify short-term results that can indicate longer-term impacts
- Participatory approaches – engaging the community in evidence gathering to increase reach and engagement
- Real world evaluation – mixed methods, triangulation, making maximum use of existing data, strategic sampling, rapid data collection methods

C. Analyse causal contribution or attribution. Need to:
- Avoid false negatives (erroneously concluding it doesn't work) and false positives (erroneously concluding it does work)
- Systematically search for disconfirming evidence and analyse exceptions
- Distinguish between theory failure and implementation failure
- Understand the contribution of context: implementation environment, participant characteristics and other interventions

C. Analyse causal contribution or attribution. Some approaches:
- Addressing through design – eg experimental designs (random assignment) and quasi-experimental designs (construction of a comparison group, eg via propensity scores; a sketch follows below)
- Addressing through data collection – eg participatory Beneficiary Assessment, expert judgement
- Addressing through iterative analysis and collection – eg Contribution Analysis, Multiple Levels and Lines of Evidence (MLLE), List of Possible Causes (LOPC) and General Elimination Methodology (GEM), systematic qualitative data analysis, realist analysis of testable hypotheses
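As a concrete sketch of the quasi-experimental option above – constructing a comparison group with propensity scores – something like the following could be used. Everything here is invented for illustration (the covariates, the self-selection mechanism and the true effect of 2.0); it is a minimal demonstration, not a recommended analysis pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical observational data: three covariates per household;
# households with high X[:, 0] are more likely to self-select into
# the intervention, which biases a naive comparison of means.
n = 1000
X = rng.normal(size=(n, 3))
treated = (X[:, 0] + rng.normal(size=n) > 0.5).astype(int)
outcome = 2.0 * treated + X[:, 0] + rng.normal(size=n)  # true effect = 2.0

# Step 1: model the probability of treatment given covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the untreated unit with the
# closest propensity score (1-nearest-neighbour, with replacement).
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[t_idx, None] - ps[None, c_idx]).argmin(axis=1)]

# Step 3: the average outcome difference across matched pairs
# estimates the effect on the treated (ATT).
att = (outcome[t_idx] - outcome[matches]).mean()
print(f"Naive difference: {outcome[treated==1].mean() - outcome[treated==0].mean():.2f}")
print(f"Matched estimate: {att:.2f}")  # close to the true effect of 2.0
```

Nearest-neighbour matching is only one way to use the scores; weighting and stratification on the propensity score are common alternatives.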

D. Report synthesis and support use. Need to:
- Provide useful information to intended users
- Provide a synthesis that summarises evidence and values
- Balance the overall pattern and the detail
- Assist uptake/translation of evidence

D. Report synthesis and support use. Some approaches:
- Use focus – utilization-focused evaluation: identification and involvement of intended users from the start
- Synthesis – Qualitative Weight and Sum and other techniques to determine overall worth (a rough sketch follows below)
- Reporting – layered reports (1 page, 5 pages, 25 pages); scenarios showing different outcomes in different contexts; workshopping the report to support knowledge translation
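Qualitative synthesis techniques can be hard to picture. The sketch below is a loose, invented illustration of the flavour of a qualitative weight-and-sum synthesis – grades stay qualitative and are aggregated by rule rather than by arithmetic – and is not the published Qualitative Weight and Sum procedure; all criteria and rules are hypothetical.

```python
# Loose, invented illustration (NOT the published procedure): grades
# stay qualitative and are aggregated by rule, not by arithmetic.
GRADES = ["poor", "adequate", "good", "excellent"]  # ordered low -> high

# Hypothetical criteria: (importance, minimum acceptable grade).
CRITERIA = {
    "income impact":       ("essential", "adequate"),
    "equity of benefits":  ("essential", "adequate"),
    "sustainability":      ("important", "poor"),
    "community ownership": ("important", "poor"),
}

def synthesise(grades: dict) -> str:
    # Rule 1: failing the minimum on an essential criterion is decisive,
    # no matter how well the program does elsewhere.
    for name, (importance, minimum) in CRITERIA.items():
        if importance == "essential" and GRADES.index(grades[name]) < GRADES.index(minimum):
            return f"unacceptable: fails essential criterion '{name}'"
    # Rule 2: otherwise report a profile rather than a single score.
    highs = sum(GRADES.index(g) >= GRADES.index("good") for g in grades.values())
    return f"acceptable: 'good' or better on {highs} of {len(grades)} criteria"

print(synthesise({
    "income impact": "good",
    "equity of benefits": "adequate",
    "sustainability": "excellent",
    "community ownership": "poor",
}))
# -> acceptable: 'good' or better on 2 of 4 criteria
```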

'Silver bullet' simple impacts: the intervention is both necessary and sufficient to produce the impact.
Intervention → impact
No intervention → no impact

Causal packages: 'jigsaw' complicated impacts.
Intervention + favourable context → impacts

'Jigsaw' complicated impacts: the intervention is necessary but not sufficient to produce the impact.
Intervention + favourable context → impact
Intervention + unfavourable context → no impact

'Parallel' complicated impacts: the intervention is sufficient but not necessary to produce the impact.
Intervention → impact
No intervention, alternative activity → impact
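The three patterns above are flattened slide diagrams; as a plain restatement (my own paraphrase, not from the slides), they can be written as boolean relations, which also shows why a simple treatment/control contrast can mislead for the complicated patterns.

```python
# Hypothetical restatement of the three causal patterns as boolean
# relations between intervention (I), favourable context (C),
# an alternative activity (A), and the impact.

def silver_bullet(I: bool) -> bool:
    return I           # I is necessary and sufficient

def jigsaw(I: bool, C: bool) -> bool:
    return I and C     # I is necessary but not sufficient

def parallel(I: bool, A: bool) -> bool:
    return I or A      # I is sufficient but not necessary

# A treatment/control contrast estimates impact(I=1) minus impact(I=0).
# Under the jigsaw pattern that contrast is zero wherever the context
# is unfavourable (C=False); under the parallel pattern it is zero
# wherever the alternative activity is present (A=True) - in both
# cases despite a genuine causal role for the intervention.
for C in (False, True):
    print(f"jigsaw, C={C}: contrast = {jigsaw(True, C) - jigsaw(False, C)}")
for A in (False, True):
    print(f"parallel, A={A}: contrast = {parallel(True, A) - parallel(False, A)}")
```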

'Life is a path you beat by walking' complex impacts: Plan A produces intermediate results, which inform Plan B, then Plan C, eventually producing the impact – the pathway emerges through adaptation rather than being fixed in advance.

EXEMPLAR 1: POTTED PLANTS
FINDING: If two potted plants are randomly assigned either to a treatment group that receives daily water or to a control that receives none, and both groups are placed in a dark cupboard, the treatment group does not have better outcomes than the control.
CONCLUSION: Watering plants is ineffective in making them grow.
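To make the logic of Exemplar 1 concrete, a small simulation (entirely hypothetical numbers) shows how a perfectly randomised design still produces a null finding when a necessary contextual factor is absent for everyone – the 'jigsaw' false negative from task C.

```python
# Hypothetical simulation of the potted-plants exemplar: a randomised
# trial returns a null result when a necessary contextual factor
# (light) is absent for every unit.
import numpy as np

rng = np.random.default_rng(1)
n = 200
water = rng.integers(0, 2, size=n)     # random assignment to watering
light = np.zeros(n, dtype=int)         # dark cupboard: no light for anyone

# Growth needs the full causal package: water AND light (jigsaw pattern).
growth = 5.0 * (water & light) + rng.normal(0, 0.5, size=n)
effect = growth[water == 1].mean() - growth[water == 0].mean()
print(f"Estimated effect of watering (in the dark): {effect:.2f}")  # ~0

# The same design with light present recovers the effect.
light = np.ones(n, dtype=int)
growth = 5.0 * (water & light) + rng.normal(0, 0.5, size=n)
effect = growth[water == 1].mean() - growth[water == 0].mean()
print(f"Estimated effect of watering (with light): {effect:.2f}")   # ~5
```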

EXEMPLAR 2: FLIPCHARTS IN KENYA
FINDING: When classes were randomly assigned to have the teacher use flipcharts, or to a control that received none, and both groups continued to experience the other factors limiting student achievement, the treatment group did not have better outcomes than the control.
CONCLUSION: Flip charts are ineffective in improving student achievement.

How Exemplar 2 has been presented:
'Good studies distinguish real successes from apparent successes. Poorly done evaluations may mistakenly attribute positive impacts to a program when the positive results are due to something else. For example, retrospective studies in Kenya erroneously attributed improved student test scores to the provision of audiovisual aids. More rigorous random-assignment studies demonstrated little or no effect, signaling policymakers of the need to consider why there was no impact and challenging program designers to reconsider their assumptions (Glewwe and others 2004).' (WWWEL report)

Proposed research and development project: a 3-year project to
- Trial methods for impact assessment of participatory agricultural research and development projects and programs
- Synthesise learnings from impact assessments of these types of projects and programs
- Support capacity development (resources and training) in impact assessment for these types of projects and programs

References
Glouberman, S. and Zimmerman, B. (2002) Complicated and Complex Systems: What Would Successful Reform of Medicare Look Like? Commission on the Future of Health Care in Canada, Discussion Paper 8.
Mackie, J. (1974) The Cement of the Universe. Oxford: Oxford University Press.
Mark, M.R. What works and how can we tell? Evaluation Seminar 2, Victoria Department of Natural Resources and Environment.
Rogers, P.J. (2008) 'Using programme theory for complicated and complex programmes', Evaluation: The International Journal of Theory, Research and Practice, 14(1).
Rogers, P.J. (2008) 'Impact Evaluation Guidance, Subgroup 2', Meeting of NONIE (Network of Networks on Impact Evaluation), Washington, DC.
Rogers, P.J. (2001) Impact Evaluation Research Report. Department of Natural Resources and Environment, Victoria.
Ross, H.L., Campbell, D.T. and Glass, G.V. (1970) 'Determining the social effects of a legal reform', in S.S. Nagel (ed.), Law and Social Change. Beverly Hills, CA: SAGE.