Cross-Country Workshop for Impact Evaluations in Agriculture and Community Driven Development, Addis Ababa, April 13-16, 2009. Causal Inference. Nandini Krishnan.

Presentation transcript:

Cross-Country Workshop for Impact Evaluations in Agriculture and Community Driven Development, Addis Ababa, April 13-16, 2009. Causal Inference. Nandini Krishnan, Africa Impact Evaluation Initiative, World Bank. April 13, 2009.

Motivation
Causal inference: does a relation from cause to effect exist? Many of the critical policy questions are causal in nature: identifying and implementing effective interventions to affect outcomes.

Theory → Practice
Thinking through a causal chain:
– Use of improved inputs and technology increases yields. How do we get farmers to use improved seeds and fertilizers and to adopt new technologies? What effect does adoption have on yields in a real setting?
– Accountability improves the delivery of goods and services by local governments. What mechanisms improve accountability? What is the effect of improved accountability in our context?

Attributing causality
Policy perspective:
– What works? Identifying effective interventions.
– What are the benefits per unit cost of alternative interventions? Comparing alternatives.

Causally linking interventions to outcomes: EFFECTIVENESS
Especially challenging: interventions → behavioral outcomes.
– Do subsidies increase input use? Are matching grants effective as a tool for technology adoption?
– If communities must compete for grants from the state government, will this lead to greater community participation in decision-making?

Comparing alternatives: COST-EFFECTIVENESS AND INFORMING SCALE-UP
Comparing interventions in terms of final outcomes:
– Subsidies versus matching grants: which is more effective at increasing crop yields?
– Competitive community grants versus guaranteed budget support: which leads to better targeting and greater responsiveness to the interests of vulnerable groups?

Identifying causal impact
Evaluate the impact/effect of a program or intervention on some outcome of interest: by how much did X (the intervention) change Y (the outcome)?
This is not the same as correlation, where X and Y are merely related and move together in some way. Example: a fertilizer subsidy program is followed by a bad harvest. Does the subsidy lower yields, or was it a drought or a pest attack?

Evaluation Question
What is the effect of an intervention/treatment P on outcome Y?
Example: What is the effect of an input subsidy program (P) on maize yields (Y)?

Evaluation Question
Impact of P = maize yield (Y) for a farmer participating in the input subsidy program, minus maize yield (Y) for the same farmer in the absence of the input subsidy program (at the same point in time).
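In potential-outcomes notation (my notation, not from the slides), this definition can be written as

\[ \text{Impact}_i \;=\; Y_i(1) - Y_i(0), \]

where \(Y_i(1)\) is farmer \(i\)'s maize yield with the subsidy and \(Y_i(0)\) is the yield of the same farmer, at the same point in time, without it.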

Attributing Causality: Problems
We observe maize yields (Y) for the farmer participating in the subsidy program, but we do not observe maize yields (Y) for the same farmer in the absence of the subsidy program.
Fundamental problem: we never observe the same individual both with and without the program at the same point in time.

Attributing Causality: Solution
Estimate, mimic, or find a good proxy for what would have happened to outcome Y in the absence of program P.
Compare the farmer with someone who 'looks' exactly like him or her but was not exposed to the intervention P at the same point in time.
In other words, we must find a valid counterfactual: a comparison or control group, so that we can compare average effects.
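A rough formalization of "comparing average effects" (notation mine, not from the slides): with a valid comparison group, the average impact is estimated as a difference in group means,

\[ \widehat{\text{Impact}} \;=\; \bar{Y}_{\text{treated}} - \bar{Y}_{\text{comparison}}, \]

and this recovers the true average effect only if the comparison group's average outcome equals what the treated group's average outcome would have been without the program.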

Finding a Valid Counterfactual
Understand the process by which program participation (treatment) is determined: How are benefits assigned? What are the eligibility rules?
The counterfactual group must be similar in terms of its likelihood of treatment/program participation.

Finding a Valid Counterfactual
The treated group and the counterfactual group should have identical characteristics on average, except for benefiting from the intervention, so that the only reason for different outcomes between treatment and counterfactual is the intervention.

"Counterfeit" Counterfactual #1: Before and After
– Same group of farmers before and after the treatment/subsidy program.
– Compare yields before and after.
– Finding: yields after the program are lower than yields before the program.
– Did the program fail?

"Counterfeit" Counterfactual #1: Before and After
What else is happening over time?
– Poor and irregular rainfall?
– The effect of the treatment cannot be separated from the effect of time-varying variables on the outcome.
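A minimal simulation sketch (mine, not from the slides; all numbers are illustrative) of why a before/after comparison misleads when something like rainfall changes over the same period:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Baseline yields (tons/ha) for the same group of farmers, before the subsidy.
yield_before = rng.normal(2.0, 0.5, n)

# Hypothetical true effect of the subsidy program on yields.
true_effect = 0.3

# After the program, poor rainfall lowers everyone's yield by 0.8 tons/ha.
drought_shock = -0.8
yield_after = yield_before + true_effect + drought_shock + rng.normal(0.0, 0.2, n)

# The before/after "impact" mixes the program effect with the drought.
before_after_estimate = yield_after.mean() - yield_before.mean()
print(f"True effect:           {true_effect:+.2f}")
print(f"Before/after estimate: {before_after_estimate:+.2f}")  # about -0.5: program looks harmful
```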

"Counterfeit" Counterfactual #2
Compare participants to non-participants at the same time.
– Non-participants may be those who chose not to enroll in the program, such as communities that do not apply for grants (poor capacity, high need, other interventions?).
– Or they may be those who were not offered the program because they are ineligible, such as richer communities.

"Counterfeit" Counterfactual #2
Compare participants to non-participants at the same time. The problem:
– We cannot determine why some participate in the program and others do not; pre-existing differences in behavior could affect outcomes (e.g., comparing more cohesive communities with low-capacity communities).
– We cannot compare the eligible to the ineligible; they differ in characteristics that also affect outcomes (e.g., comparing rich communities to poor communities).

Why might participants and non-participants differ?
The government offers fertilizer input vouchers to farmers through extension workers. What is the effect of this intervention on yields?
Who participates? Which farmers go to the extension workers to get the voucher?

Who participates?
– Farmers who are more enterprising, less risk-averse, and have higher income to contribute their share of the subsidy; they may have higher yields irrespective of fertilizer use.
– Participants have differing (pre-existing) characteristics, relative to non-participating communities and individuals, that also affect the outcomes of interest.
Non-participants are therefore a poor counterfactual for the treatment group.
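A small sketch (again mine, with made-up numbers) of the selection problem described above: an unobserved trait drives both voucher take-up and yields, so the naive participant vs non-participant comparison overstates the voucher's effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# "Enterprise" stands in for unobserved farmer traits that raise yields on their own.
enterprise = rng.normal(0.0, 1.0, n)

# More enterprising farmers are more likely to go collect the voucher.
takes_voucher = rng.random(n) < 1.0 / (1.0 + np.exp(-enterprise))

true_effect = 0.3  # tons/ha added by the voucher itself
yields = 2.0 + 0.5 * enterprise + true_effect * takes_voucher + rng.normal(0.0, 0.3, n)

# Naive comparison of participants vs non-participants overstates the effect,
# because participants would have had higher yields anyway.
naive_estimate = yields[takes_voucher].mean() - yields[~takes_voucher].mean()
print(f"True effect:    {true_effect:+.2f}")
print(f"Naive estimate: {naive_estimate:+.2f}")
```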

Possible Solutions
Guarantee comparability of treatment and counterfactual groups, so that the ONLY remaining difference is the intervention. How?
– Experimental design
– Non-experimental / quasi-experimental design

These solutions all involve…
EITHER randomization:
– Give everyone an equal chance of being in the control or treatment group (a lottery).
– This ensures that participants and non-participants will be similar on most characteristics.
– The only remaining difference is the intervention.
OR transparent and observable (quantifiable) criteria for assignment into the program:
– These allow us to separate the effects of the intervention from "other things".
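Continuing the made-up voucher simulation above, a minimal sketch of why randomization solves the selection problem: because the lottery is unrelated to farmer characteristics, the simple treatment/control comparison recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

enterprise = rng.normal(0.0, 1.0, n)  # same unobserved trait as before
true_effect = 0.3

# Lottery: every farmer has the same 50% chance of getting the voucher,
# regardless of enterprise, so treatment and control are balanced on average.
treated = rng.random(n) < 0.5

yields = 2.0 + 0.5 * enterprise + true_effect * treated + rng.normal(0.0, 0.3, n)

randomized_estimate = yields[treated].mean() - yields[~treated].mean()
print(f"True effect:         {true_effect:+.2f}")
print(f"Randomized estimate: {randomized_estimate:+.2f}")  # close to +0.30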

Conclusions
To identify effective interventions and compare alternatives, we need to be able to attribute causality.
We need a valid counterfactual: a group that would behave the same as the treated group in the absence of the intervention.
Invalid counterfactuals:
– Before and after: confounded by time-varying variables.
– Participants vs. non-participants: confounded by differing characteristics.
Options: the choice of method depends on program design, operational considerations, and the evaluation question.

Thank You