1 Causal Inference
Africa Impact Evaluation Program on AIDS (AIM-AIDS)
Cape Town, South Africa, March 8–13, 2009
Nandini Krishnan, Africa Impact Evaluation Initiative, World Bank
March 9, 2009

2 Motivation
Causal inference asks: does a relation from cause to effect exist?
In the health sciences, many of the critical questions are causal in nature. For example:
– What is the efficacy of a given drug in a target population?
– What fraction of HIV infections could have been prevented by a given treatment or policy?

3 Motivation
The most challenging empirical questions for our health programs also involve cause-and-effect relationships:
– Does the decentralization of health facilities improve the quality of service provision?
– Does HIV testing cause behavioral change among high-risk populations?
Policy perspective: what works? What are the benefits per unit cost of alternative interventions?

4 Identifying Causal Impact
Evaluate the impact/effect of a program or intervention on some outcome of interest:
– By how much did X (the intervention) change Y (the outcome)?
Causation is not the same as correlation!
– X and Y being related means only that they move together in some way.
– Example: holding an umbrella and having wet shoes go together. Does the umbrella cause the wet shoes?
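A minimal simulation can make the umbrella example concrete. The sketch below (illustrative numbers of my own, not from the slides) shows umbrella use and wet shoes strongly correlated only because rain causes both; conditioning on the common cause removes the association.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
rain = rng.random(n) < 0.3                   # it rains on 30% of days
umbrella = rain & (rng.random(n) < 0.9)      # umbrellas are carried mostly when it rains
wet_shoes = rain & (rng.random(n) < 0.8)     # rain, not umbrellas, wets shoes

# Naive comparison suggests umbrellas "cause" wet shoes:
print(wet_shoes[umbrella].mean())            # high (~0.8)
print(wet_shoes[~umbrella].mean())           # low  (~0.03)

# Holding the common cause (rain) fixed, the association disappears:
print(wet_shoes[rain & umbrella].mean(), wet_shoes[rain & ~umbrella].mean())  # both ~0.8
```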

5 Evaluation Question
What is the effect of an intervention or treatment P on an outcome Y?
Example: what is the effect of an information campaign (P) on VCT uptake among youth (Y)?
Impact of P = VCT uptake (Y) for a youth participating in the information campaign − VCT uptake (Y) for the same youth, at the same point in time, in the absence of the campaign.
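In the potential-outcomes notation standard in this literature (notation added here; the slide states the same thing in words), write Y_i(1) for youth i's VCT uptake with the campaign and Y_i(0) for the same youth's uptake without it, at the same point in time. The impact for youth i is

$$\tau_i = Y_i(1) - Y_i(0).$$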

6 Attributing Causality: Problems
We observe VCT uptake (Y) for a youth participating in the information campaign.
But we do not observe VCT uptake (Y) for the same youth in the absence of the campaign.
Fundamental problem: we never observe the same individual both with and without the program at the same point in time.
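The fundamental problem can be stated in the same added notation: with T_i = 1 indicating participation, we only ever observe

$$Y_i = T_i \, Y_i(1) + (1 - T_i) \, Y_i(0),$$

so one of the two potential outcomes is always missing for every individual and must somehow be estimated.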

7 Attributing Causality: Solution
Estimate, mimic, or find a good proxy for what would have happened to outcome Y in the absence of program P.
Compare the youth with someone who "looks" exactly like him or her but was not exposed to intervention P at the same point in time.
In other words, we must find a valid counterfactual, or control group.

8 Finding a Valid Counterfactual
Understand the process by which program participation (treatment) is determined: how are benefits assigned? What are the eligibility rules?
The counterfactual must be similar in terms of the likelihood of treatment/program participation.
The treated observations and the counterfactual should have identical characteristics, except for benefiting from the intervention, so that the only reason for different outcomes between treatment and counterfactual is the intervention (P).
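Formally (a standard way to state the slide's requirement, not on the slide itself), a comparison group is a valid counterfactual when its untreated outcomes match what the treated group's outcomes would have been without the program:

$$E[\,Y(0) \mid T = 1\,] = E[\,Y(0) \mid T = 0\,],$$

so the control group's observed mean can stand in for the treated group's missing counterfactual.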

9 "Counterfeit" Counterfactual #1: Before and After
– Compare the same individual before and after treatment.
– Example: a food aid program for subsistence farmers in a drought-prone area. Compare incomes before and after the program.
– Finding: income after food aid is lower than income before food aid. Did the program fail?
– What else is happening over time? Beneficiaries received aid because of their characteristics; we cannot separate the effect of food aid from the shock that created the need for it (a worsening drought).
– In general, the effects of treatment and of time-varying variables on the outcome cannot be separated.
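A stylized sketch of this failure, with hypothetical numbers: the drought worsens between the two survey rounds, so incomes fall even though the aid genuinely helps, and the before/after estimate gets the sign wrong.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
income_before = 100 + rng.normal(0, 10, n)   # farmer incomes in year 0

drought_shock = -30                          # worsening drought: a time-varying shock
true_aid_effect = +15                        # what the program actually adds
income_after = income_before + drought_shock + true_aid_effect + rng.normal(0, 10, n)

# The before/after estimate confounds the shock with the program:
print((income_after - income_before).mean())  # about -15: the aid appears to "fail"
```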

10 "Counterfeit" Counterfactual #2: Participants vs. Non-participants
Compare participants to non-participants at the same point in time.
Non-participants are either those who chose not to enroll in the program, or those who were not offered the program (the ineligible).
Problems:
– We cannot determine why some participate and others do not; pre-existing differences in behavior could affect outcomes.
– We cannot compare the eligible to the ineligible: they differ in characteristics that also affect outcomes.
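This problem can be made explicit with a standard decomposition (added here, in the same notation as above): the naive difference in mean outcomes is the impact on participants plus a selection term,

$$E[Y \mid T=1] - E[Y \mid T=0] = \underbrace{E[\,Y(1) - Y(0) \mid T=1\,]}_{\text{impact on participants}} + \underbrace{E[\,Y(0) \mid T=1\,] - E[\,Y(0) \mid T=0\,]}_{\text{selection bias}},$$

which is unbiased only when the selection term is zero.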

11 Why might participants and non-participants differ? An example
A government program offers VCT: what is the effect of VCT on risk behaviors? Who signs up?
– What if those getting treatment decide to get tested because they know they engage in high-risk behaviors, or think they might be HIV-positive?
Participants then have lower health status and/or engage in riskier behaviors than those who do not sign up: differing (pre-existing) characteristics that also affect the outcomes of interest.
Non-participants are therefore a poor counterfactual for the treatment group.
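A hypothetical simulation of this selection story (numbers invented for illustration): high-risk individuals self-select into VCT, so the naive participant/non-participant comparison can even flip the sign of the true effect.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
high_risk = rng.random(n) < 0.5                            # unobserved riskiness
signs_up = rng.random(n) < np.where(high_risk, 0.8, 0.2)   # riskier people sign up more

true_effect = -0.10                                        # VCT cuts risky behavior by 10 points
baseline = np.where(high_risk, 0.60, 0.20)                 # risky-behavior probability by type
risky = rng.random(n) < baseline + true_effect * signs_up

# Naive participant vs. non-participant comparison:
naive = risky[signs_up].mean() - risky[~signs_up].mean()
print(naive)   # about +0.14: positive, the opposite sign of the true -0.10 effect
```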

12 Possible Solutions…
Guarantee the comparability of treatment and control groups, so that the ONLY remaining difference between them is the intervention.
How?
– Experimental design
– Non-experimental/quasi-experimental design

13 These solutions all involve…
EITHER randomization:
– Give everyone an equal chance of being in the control or treatment group.
– This ensures that participants and non-participants will be similar on most characteristics.
– The only systematic difference is the intervention.
OR transparent and observable (quantifiable) criteria for assignment into the program:
– These allow the program's effect to be separated from the effects of the assignment criteria.
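Continuing the hypothetical VCT simulation from above: if the offer of testing is instead assigned by coin flip, risk types are balanced across arms and the simple difference in means recovers the true impact.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
high_risk = rng.random(n) < 0.5
treated = rng.random(n) < 0.5                 # coin flip: equal chance for everyone

true_effect = -0.10
baseline = np.where(high_risk, 0.60, 0.20)
risky = rng.random(n) < baseline + true_effect * treated

print(high_risk[treated].mean(), high_risk[~treated].mean())  # balanced, ~0.5 in each arm
print(risky[treated].mean() - risky[~treated].mean())         # close to the true -0.10
```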

14 Conclusions
We seek to measure the causal effect of a program on some outcome of interest.
This requires a valid counterfactual: a good control group that would behave the same as the treated group in the absence of the intervention.
Invalid counterfactuals:
– Before and after: confounded by time-varying variables.
– Participants vs. non-participants: confounded by differing characteristics.
Options (tomorrow!):
– Randomize.
– Quasi-experimental methods: require more assumptions.

15 Thank You