The World Bank Human Development Network Spanish Impact Evaluation Fund

MEASURING IMPACT
Impact Evaluation Methods for Policy Makers
This material constitutes supporting material for the "Impact Evaluation in Practice" book. It is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington, DC. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

1. Causal Inference
Counterfactuals
False counterfactuals:
o Before & After (Pre & Post)
o Enrolled & Not Enrolled (Apples & Oranges)

2. IE Methods Toolbox
o Randomized Assignment
o Randomized Offering/Promotion
o Discontinuity Design
o Difference-in-Differences (Diff-in-Diff)
o Matching (P-Score matching)

Discontinuity Design
Many social programs select beneficiaries using an index or score:
o Anti-poverty programs: targeted to households below a given poverty index / income level
o Pensions: targeted to the population above a certain age
o Education: scholarships targeted to students with high scores on standardized tests
o Agriculture: fertilizer programs targeted to small farms (less than a given number of hectares)

Example: Effect of a fertilizer program on agricultural production
o Goal: improve agricultural production (rice yields) for small farmers
o Method: farms with a score (hectares of land) ≤ 50 are small; farms with a score > 50 are not small
o Intervention: small farmers receive subsidies to purchase fertilizer
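As a quick illustration of how this assignment rule works, here is a minimal sketch (not from the deck): simulated farms at or below the 50-hectare cut-off are flagged as eligible for the subsidy. The farm sizes and variable names are made up.

```python
# Minimal sketch of the assignment rule: farms with <= 50 ha are eligible.
# The farm sizes are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
farm_size_ha = rng.uniform(10, 100, size=1_000)   # hypothetical farm sizes (hectares)

CUTOFF_HA = 50
eligible = farm_size_ha <= CUTOFF_HA              # small farms receive the fertilizer subsidy

print(f"Share of farms eligible for the subsidy: {eligible.mean():.1%}")
```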

Regression Discontinuity Design: Baseline
[Figure: baseline outcomes plotted against the eligibility index, with the eligible and not-eligible groups on either side of the cut-off]

Regression Discontinuity Design: Post-Intervention
[Figure: post-intervention outcomes plotted against the eligibility index; the jump at the cut-off is the estimated IMPACT]

Case 5: Discontinuity Design
We have a continuous eligibility index with a defined cut-off:
o Households with a score ≤ cut-off are eligible
o Households with a score > cut-off are not eligible (or vice versa)
Intuitive explanation of the method:
o Units just above the cut-off point are very similar to units just below it, so they make a good comparison group.
o Compare outcomes Y for units just above and just below the cut-off point.
For a discontinuity design, you need:
1) A continuous eligibility index
2) A clearly defined eligibility cut-off
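To make the "compare just above and just below" idea concrete, here is a small sketch on simulated data for the fertilizer example. The cut-off, outcome, and effect size are assumptions chosen for illustration; the true effect is built into the simulation so the estimate can be checked against it.

```python
# Sketch of a regression discontinuity comparison on simulated data.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
hectares = rng.uniform(10, 100, n)               # continuous eligibility index
eligible = hectares <= 50                        # small farms get the subsidy

true_effect = 0.4                                # extra tonnes/ha from the subsidy (assumed)
# Yields depend smoothly on farm size, plus the treatment effect for eligible farms.
yields = 2.0 + 0.01 * hectares + true_effect * eligible + rng.normal(0, 0.3, n)

bandwidth = 2.0                                  # "just below / just above" window (ha)
just_below = eligible & (hectares > 50 - bandwidth)
just_above = ~eligible & (hectares < 50 + bandwidth)

# Raw difference in means near the cut-off. A local linear regression on each side
# would additionally adjust for the slope in farm size within the window.
rd_estimate = yields[just_below].mean() - yields[just_above].mean()
print(f"Estimated effect near the cut-off: {rd_estimate:.2f} t/ha (simulated true effect: {true_effect})")
```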

Case 5: Discontinuity Design
Eligibility for Progresa is based on a national poverty index.
A household is poor if its score is ≤ 750.
Eligibility for Progresa:
o Eligible = 1 if score ≤ 750
o Eligible = 0 if score > 750
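A minimal sketch of this rule in code: the 750 threshold is from the slide, while the household scores and column names are hypothetical.

```python
# Construct the Progresa eligibility dummy from the poverty score.
# The threshold (750) is from the slide; the scores below are made up.
import pandas as pd

households = pd.DataFrame({"poverty_score": [620, 748, 750, 751, 900]})
households["eligible"] = (households["poverty_score"] <= 750).astype(int)
print(households)
```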

Case 5: Discontinuity Design
[Figure: poverty index (score) vs. consumption at baseline, no treatment; fitted values shown]

Case 5: Discontinuity Design
[Figure: poverty index (score) vs. consumption in the post-intervention period, with treatment; fitted values shown]
Estimated impact on consumption (Y), multivariate linear regression: 30.58**
(**) Significant at the 1% level
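The sketch below shows the kind of regression that produces such an estimate: consumption regressed on the eligibility dummy while controlling linearly for the poverty index. The data are simulated with a built-in effect of 30, so it will not reproduce the 30.58 reported on the slide.

```python
# Sketch of the impact regression: consumption ~ eligibility + poverty index.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
poverty_index = rng.uniform(400, 1100, n)
eligible = (poverty_index <= 750).astype(float)
# Simulated consumption with a built-in treatment effect of 30 (assumption).
consumption = 200.0 - 0.08 * poverty_index + 30.0 * eligible + rng.normal(0, 15, n)

# Design matrix: intercept, eligibility dummy, poverty index (linear control).
X = np.column_stack([np.ones(n), eligible, poverty_index])
coef, *_ = np.linalg.lstsq(X, consumption, rcond=None)
print(f"Estimated impact on consumption (coefficient on the eligibility dummy): {coef[1]:.2f}")
```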

Keep in Mind: Discontinuity Design
o Discontinuity design requires a continuous eligibility criterion with a clear cut-off.
o It gives an unbiased estimate of the treatment effect: observations just across the cut-off are good comparisons.
o There is no need to exclude a group of eligible households or individuals from treatment.
o It can sometimes be used for programs that are already ongoing.

Keep in Mind: Discontinuity Design
Discontinuity design produces a local estimate:
o The effect of the program around the cut-off point/discontinuity.
o This is not always generalizable.
Power:
o You need many observations around the cut-off point.
Avoid mistakes in the statistical model:
o Sometimes what looks like a discontinuity in the graph is something else.
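One common way to guard against spurious discontinuities is to re-estimate the jump at placebo cut-offs where no program rule applies; if those placebo jumps are as large as the one at the real cut-off, the apparent discontinuity may be an artifact of the model rather than program impact. This check is an assumption of standard practice, not taken from the slides, and the sketch below runs it on simulated data.

```python
# Placebo cut-off check on simulated data: only the real cut-off (750) should show a jump.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
score = rng.uniform(0, 1000, n)
# Simulated outcome: smooth in the score, with a genuine jump of 30 at score = 750.
outcome = 100.0 + 0.05 * score + 30.0 * (score <= 750) + rng.normal(0, 10, n)

def jump_at(cutoff, bandwidth=25.0):
    """Difference in mean outcomes just below vs. just above a given cut-off."""
    below = (score <= cutoff) & (score > cutoff - bandwidth)
    above = (score > cutoff) & (score < cutoff + bandwidth)
    return outcome[below].mean() - outcome[above].mean()

print(f"Jump at the real cut-off (750): {jump_at(750):.1f}")
for placebo in (600, 650, 700, 800, 850):
    print(f"Jump at placebo cut-off {placebo}: {jump_at(placebo):.1f}")
```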