
1 An introduction to Impact Evaluation (IE) for HIV/AIDS Programs. March 12, 2009, Cape Town. Léandre Bassolé, ACTafrica, The World Bank

2 Background
Initial response to the AIDS epidemic: responding to a crisis.
– Build political commitment
– Establish institutions
– Scale up coverage of HIV prevention, treatment and mitigation services
Now we need to know what really works.

3 Traditional M&E and Impact Evaluation
M&E (monitoring & process evaluation) asks descriptive questions:
▫ Is the program being implemented efficiently?
▫ Is the targeted population being reached?
▫ Are outcomes moving in the right direction?
Impact Evaluation asks causal questions:
▫ What was the effect of the program on outcomes?
▫ How would outcomes change under alternative program designs?
▫ Is the program cost-effective?

4 Why does impact evaluation matter?
To know whether the program had an impact, and the average size of that impact:
– Assess whether policies work
– Assess the net benefits/costs of the program
– Assess the distribution of gains and losses

5 What do we mean by "impact evaluation"?
The IE problem: impact = the difference between the relevant outcome indicator with the program and that without it. However, we can never observe someone in two different states of nature at the same time. While a post-intervention indicator is observed, its value in the absence of the program is not; it is a counterfactual. So all IE deals with overcoming this missing-data problem, which requires counterfactual analysis.
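
In potential-outcomes notation (a standard formalization; the notation below is ours, not the slides'), the missing-data problem can be written out explicitly:

```latex
% Y_i(1): outcome for unit i with the program; Y_i(0): outcome without it.
% The individual impact is never directly observable for any single unit:
\Delta_i \;=\; Y_i(1) - Y_i(0)
% The usual evaluation target is the average effect on the treated:
\mathrm{ATT} \;=\; \mathbb{E}[\,Y_i(1) \mid T_i = 1\,] \;-\; \mathbb{E}[\,Y_i(0) \mid T_i = 1\,]
% The first term is observed in the data; the second is the missing
% counterfactual that a control/comparison group must stand in for.
```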

6 What we need
Given the problem of missing data (each individual has only one existence), we compare two groups. We need a counterfactual: a control/comparison group that tells us what would have happened without the program, so that any change in the participant group can be attributed to the intervention (causality).

7 Common IE practices
1. Before and after
2. Participants vs. non-participants
BUT with either of these it is difficult to assess the TRUE AVERAGE CAUSAL EFFECT. How do we solve the FUNDAMENTAL PROBLEM OF EVALUATION?

8 Comparison group issues
Two central problems:
– Programs are targeted: program areas will differ in observable and unobservable ways precisely because the program intended this
– Individual participation is (usually) voluntary: participants will differ from non-participants in observable and unobservable ways
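
A minimal simulation (hypothetical numbers, not from the presentation) makes both problems, and the weakness of slide 7's naive practices, concrete: the before-and-after comparison absorbs any secular trend, the participant/non-participant comparison absorbs self-selection, and only a randomized comparison group recovers the true effect.

```python
# Sketch: why naive comparisons mislead. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_effect = 5.0

ability = rng.normal(0, 10, n)                    # unobserved trait
participate = ability + rng.normal(0, 10, n) > 0  # voluntary, selective take-up
y_before = 50 + ability + rng.normal(0, 5, n)
trend = 3.0                                       # everyone improves over time
y_after = y_before + trend + true_effect * participate + rng.normal(0, 5, n)

# 1. Before and after, participants only: absorbs the secular trend.
before_after = y_after[participate].mean() - y_before[participate].mean()

# 2. Participants vs. non-participants: absorbs selection on ability.
cross_section = y_after[participate].mean() - y_after[~participate].mean()

# 3. Randomized assignment: the comparison group is a valid counterfactual.
assigned = rng.random(n) < 0.5
y_rand = y_before + trend + true_effect * assigned + rng.normal(0, 5, n)
randomized = y_rand[assigned].mean() - y_rand[~assigned].mean()

print(f"true effect   : {true_effect:.1f}")
print(f"before-after  : {before_after:.1f}")   # ~8: trend mistaken for impact
print(f"cross-section : {cross_section:.1f}")  # ~16: selection inflates it
print(f"randomized    : {randomized:.1f}")     # ~5: unbiased
```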

9 Tools to identify a counterfactual
– Randomized designs
– Quasi-experimental designs:
  – Matching
  – Instrumental variables
  – Regression discontinuity
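
As one illustration from the quasi-experimental branch, here is a minimal nearest-neighbor matching sketch on synthetic data. It leans on the assumption every matching study must defend, namely that selection into the program is driven only by the observed covariate:

```python
# Sketch: nearest-neighbor matching on a single observed covariate x.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
x = rng.normal(0, 1, n)                 # observed covariate driving take-up
treated = x + rng.normal(0, 1, n) > 0
y = 10 + 2 * x + 4 * treated + rng.normal(0, 1, n)  # true effect = 4

controls_x, controls_y = x[~treated], y[~treated]
gaps = []
for xi, yi in zip(x[treated], y[treated]):
    j = np.argmin(np.abs(controls_x - xi))  # closest control on x
    gaps.append(yi - controls_y[j])

print(f"naive difference : {y[treated].mean() - y[~treated].mean():.2f}")  # biased upward
print(f"matched estimate : {np.mean(gaps):.2f}")                           # ~4.0
```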

10 Some general principles to consider when planning an IE
– Government ownership: what matters is institutional buy-in
– Relevance and applicability: asking the right questions
– Flexibility and adaptability
– Horizon matters

11 Summing up: Methods/Practicalities
– Randomization is the "gold standard"
– Be flexible, be creative: use the context
– IE requires good monitoring, and monitoring will help you understand the effect size

12 Summing up: Methods/Practicalities
– Making IE work for you may require a change in the culture of project design and implementation: the aim is to maximize the evidence base upon which policy decisions can be made, improving the chances of success
– Impact evaluation is more than a tool: it is an analytical framework for policy development

13 THANK YOU