www.worldbank.org/hdchiefeconomist The World Bank Human Development Network Spanish Impact Evaluation Fund.

Presentation transcript:

The World Bank Human Development Network Spanish Impact Evaluation Fund

MEASURING IMPACT: Impact Evaluation Methods for Policy Makers
This material is supporting material for the "Impact Evaluation in Practice" book. It is made freely available, but please acknowledge its use as follows: Gertler, P. J.; Martinez, S.; Premand, P.; Rawlings, L. B.; and Vermeersch, C. M. J. (2010). Impact Evaluation in Practice: Ancillary Material. The World Bank, Washington, DC. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

1. Causal Inference
o Counterfactuals
o False counterfactuals: Before & After (Pre & Post); Enrolled & Not Enrolled (Apples & Oranges)
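To make the counterfactual concrete, here is the standard potential-outcomes notation (added for reference; it is not part of the original slides). The impact we want is

    \delta = E[Y(1) \mid D=1] - E[Y(0) \mid D=1],

where D = 1 marks program participants and the second term, what participants' outcomes would have been without the program, is never observed. The false counterfactuals replace it with observable but generally biased quantities. Comparing Enrolled & Not Enrolled gives

    E[Y(1) \mid D=1] - E[Y(0) \mid D=0] = \delta + \underbrace{E[Y(0) \mid D=1] - E[Y(0) \mid D=0]}_{\text{selection bias}},

and Before & After compares E[Y_{t=1} \mid D=1] - E[Y_{t=0} \mid D=1], so any change that would have occurred anyway (a time trend) is attributed to the program.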

2. IE Methods Toolbox
o Randomized Assignment
o Randomized Offering/Promotion
o Discontinuity Design
o Difference-in-Differences (Diff-in-Diff)
o Matching (P-Score Matching)
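As a minimal sketch of the first tool, the code below simulates a randomized assignment and recovers the impact as a difference in means via OLS. The data, the column names, and the true effect of 30 are all invented for the illustration; only numpy, pandas, and statsmodels are assumed to be available.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Invented data: 'y' is an outcome such as consumption, 'treat' is a 0/1
    # indicator of randomized assignment, and the true impact is set to 30.
    rng = np.random.default_rng(0)
    n = 2000
    treat = rng.integers(0, 2, size=n)
    y = 100 + 30 * treat + rng.normal(0, 20, size=n)
    df = pd.DataFrame({"y": y, "treat": treat})

    # With randomized assignment, treated and comparison units are similar on
    # average, so the coefficient on 'treat' estimates the causal impact.
    fit = smf.ols("y ~ treat", data=df).fit(cov_type="HC1")
    print(fit.params["treat"], fit.pvalues["treat"])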

Choosing your IE method(s)
Key information you will need for identifying the right method for your program:
o Prospective or retrospective evaluation?
o Eligibility rules and criteria?
o Roll-out plan (pipeline)?
o Is the number of eligible units larger than the resources available at a given point in time?
  - Poverty targeting?
  - Geographic targeting?
  - Budget and capacity constraints?
  - Excess demand for the program?
  - Etc.

Choosing your IE method(s)
Choose the best possible design given the operational context:
o Best design: the best comparison group you can find, with the least operational risk.
o Have we controlled for everything? (internal validity): a good comparison group.
o Is the result valid for everyone? (external validity): local versus global treatment effect; evaluation results apply to the population we are interested in.

Progresa: Policy Recommendation?
Impact of Progresa on Consumption (Y)
Case 1: Before & After                34.28**
Case 2: Enrolled & Not Enrolled
Case 3: Randomized Assignment         29.75**
Case 4: Randomized Offering           30.40**
Case 5: Discontinuity Design          30.58**
Case 6: Difference-in-Differences     25.53**
Case 7: Matching                       7.06+
Note: Estimated impacts that are statistically significant at the 1% level are labeled with two stars (**); impacts significant at the 10% level are labeled with a plus sign (+).
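For reference, the labeling convention in the note can be written as a tiny helper (hypothetical code, not from the slides); given an estimate and its p-value it appends the corresponding star or plus sign.

    def significance_label(p_value):
        # Labeling convention used in the table above.
        if p_value < 0.01:
            return "**"  # statistically significant at the 1% level
        if p_value < 0.10:
            return "+"   # statistically significant at the 10% level
        return ""

    # Example: an estimate of 29.75 with an (invented) p-value of 0.003 prints "29.75**".
    print(f"{29.75:.2f}{significance_label(0.003)}")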

2. IE Methods Toolbox
o Randomized Assignment
o Randomized Offering/Promotion
o Discontinuity Design
o Difference-in-Differences (Diff-in-Diff)
o Matching (P-Score Matching)
o Combinations of methods
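"Combinations of methods" can be made concrete with a sketch: below, propensity-score matching is combined with a difference-in-differences comparison on simulated data. Every column name and parameter is invented for the illustration; only numpy, pandas, and scikit-learn are assumed.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Invented data: baseline (y0) and follow-up (y1) outcomes, a non-random
    # treatment indicator driven by covariates x1 and x2, and a true impact of 10.
    rng = np.random.default_rng(1)
    n = 3000
    x = rng.normal(size=(n, 2))
    p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
    treat = rng.binomial(1, p_treat)
    y0 = 50 + 5 * x[:, 0] + rng.normal(0, 5, n)
    y1 = y0 + 2 + 10 * treat + rng.normal(0, 5, n)
    df = pd.DataFrame({"x1": x[:, 0], "x2": x[:, 1], "treat": treat, "y0": y0, "y1": y1})

    # Step 1: propensity scores from observed characteristics.
    df["pscore"] = LogisticRegression().fit(df[["x1", "x2"]], df["treat"]).predict_proba(df[["x1", "x2"]])[:, 1]

    # Step 2: match each treated unit to its nearest comparison unit on the p-score.
    treated = df[df.treat == 1]
    controls = df[df.treat == 0].reset_index(drop=True)
    nearest_idx = np.abs(controls["pscore"].to_numpy()[None, :] - treated["pscore"].to_numpy()[:, None]).argmin(axis=1)
    nearest = controls.iloc[nearest_idx]

    # Step 3: difference-in-differences on the matched pairs.
    treated_change = treated["y1"].to_numpy() - treated["y0"].to_numpy()
    matched_change = nearest["y1"].to_numpy() - nearest["y0"].to_numpy()
    print(f"Matched diff-in-diff estimate: {treated_change.mean() - matched_change.mean():.2f}")  # roughly 10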

Choosing your method

Targeted (Eligibility Cut-off):
  Limited Resources (Never Able to Achieve Scale):
    o Phased Implementation Over Time: Randomized Assignment; Discontinuity Design
    o Immediate Implementation: Randomized Assignment; Discontinuity Design
  Fully Resourced (Able to Achieve Scale):
    o Phased Implementation Over Time: Randomized Assignment (roll-out); Discontinuity Design
    o Immediate Implementation: Randomized Promotion; Discontinuity Design

Universal (No Eligibility Cut-off):
  Limited Resources (Never Able to Achieve Scale):
    o Phased Implementation Over Time: Randomized Assignment; Matching with Diff-in-Diff
    o Immediate Implementation: Randomized Assignment; Matching with Diff-in-Diff
  Fully Resourced (Able to Achieve Scale):
    o Phased Implementation Over Time: Randomized Assignment (roll-out); Matching with Diff-in-Diff
    o Immediate Implementation: Randomized Promotion
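The table reads as a lookup from three program characteristics to candidate designs. A hypothetical helper (names and structure are mine, not the slides') makes that logic explicit:

    def candidate_methods(targeted, fully_resourced, phased_rollout):
        """Candidate designs from the decision table above.

        targeted:        True if the program has an eligibility cut-off.
        fully_resourced: True if the program can eventually reach all eligible units.
        phased_rollout:  True if implementation is phased over time.
        """
        if targeted:
            if fully_resourced and not phased_rollout:
                return ["Randomized Promotion", "Discontinuity Design"]
            return ["Randomized Assignment (roll-out)" if fully_resourced else "Randomized Assignment",
                    "Discontinuity Design"]
        # Universal programs (no eligibility cut-off)
        if fully_resourced and not phased_rollout:
            return ["Randomized Promotion"]
        if fully_resourced:
            return ["Randomized Assignment (roll-out)", "Matching with Diff-in-Diff"]
        return ["Randomized Assignment", "Matching with Diff-in-Diff"]

    # Example: a targeted program with limited resources and a phased roll-out.
    print(candidate_methods(targeted=True, fully_resourced=False, phased_rollout=True))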

Remember: The objective of impact evaluation is to estimate the causal effect, or impact, of a program on outcomes of interest.

Remember: To estimate impact, we need to estimate the counterfactual:
o what would have happened in the absence of the program, and
o to do so, we use comparison or control groups.

Remember: We have a toolbox with five methods to identify comparison groups.

Remember: Choose the best evaluation method that is feasible in the program’s operational context.