www.worldbank.org/hdchiefeconomist
The World Bank Human Development Network, Spanish Impact Evaluation Fund.

Presentation transcript:

The World Bank Human Development Network Spanish Impact Evaluation Fund

MEASURING IMPACT
Impact Evaluation Methods for Policy Makers

This material constitutes supporting material for the "Impact Evaluation in Practice" book. This additional material is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington, DC. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

1. Causal Inference
- Counterfactuals
- False counterfactuals:
  - Before & After (Pre & Post)
  - Enrolled & Not Enrolled (Apples & Oranges)

2. IE Methods Toolbox
- Randomized Assignment
- Randomized Offering/Promotion
- Discontinuity Design
- Difference-in-Differences (Diff-in-Diff)
- Matching (P-Score matching)


Difference-in-Differences (Diff-in-Diff)
Y = girls' exam score (percentage of correct answers); P = tutoring.
Impact = (Y_t1 - Y_t0) - (Y_c1 - Y_c0): the before-after change for the enrolled (tutoring) group minus the before-after change for the not-enrolled (no tutoring) group.

Difference-in-Differences (Diff-in-Diff)
Y = girls' exam score (percentage of correct answers); P = tutoring.
Impact = (Y_t1 - Y_c1) - (Y_t0 - Y_c0): the enrolled/not-enrolled gap after the program minus the same gap before the program.
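The two formulas on these slides are algebraically identical: expanding either one gives Y_t1 - Y_t0 - Y_c1 + Y_c0. A minimal numerical sketch (hypothetical random scores, not data from the slides) confirms the equivalence:

```python
import random

random.seed(0)
# Check that the two diff-in-diff formulas agree on arbitrary
# (hypothetical) before/after scores for both groups.
for _ in range(1000):
    y_t0, y_t1, y_c0, y_c1 = (random.random() for _ in range(4))
    diff_of_changes = (y_t1 - y_t0) - (y_c1 - y_c0)  # first formula
    diff_of_gaps = (y_t1 - y_c1) - (y_t0 - y_c0)     # second formula
    assert abs(diff_of_changes - diff_of_gaps) < 1e-12
print("formulas agree")
```

Either arrangement can be computed, so the choice between them is purely expositional.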

Impact = (A-B) - (C-D) = (A-C) - (B-D)
[Figure: exam scores over time for enrolled and not-enrolled groups]
Enrolled: B = 0.60 before (T=0), A = 0.74 after (T=1).
Not enrolled: D = 0.78 before (T=0), C = 0.81 after (T=1).
Impact = (0.74 - 0.60) - (0.81 - 0.78) = 0.11
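Plugging the four points from the chart into both arrangements of the formula reproduces the 0.11 estimate (a minimal arithmetic check using the values on the slide):

```python
# Group-period means from the chart above.
A = 0.74  # enrolled, after (T=1)
B = 0.60  # enrolled, before (T=0)
C = 0.81  # not enrolled, after (T=1)
D = 0.78  # not enrolled, before (T=0)

impact = (A - B) - (C - D)      # difference of time changes
impact_alt = (A - C) - (B - D)  # difference of group gaps
assert round(impact, 2) == round(impact_alt, 2) == 0.11
print(round(impact, 2))  # 0.11
```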

Impact = (A-B) - (C-D) = (A-C) - (B-D)
[Figure: same data points (A = 0.74, B = 0.60, C = 0.81, D = 0.78), but with the enrolled group's counterfactual trend drawn steeper than the comparison group's]
If the enrolled group's outcome would have grown faster than the comparison group's even without the program, then C-D understates the counterfactual change and the true impact is less than 0.11.

Case 6: Difference-in-Differences

                                  Enrolled    Not Enrolled    Difference
Follow-up (T=1) Consumption (Y)
Baseline (T=0) Consumption (Y)
Difference

Estimated impact on Consumption (Y):
  Linear regression: 27.06**
  Multivariate linear regression: 25.53**

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

Progresa Policy Recommendation?

Impact of Progresa on Consumption (Y):
  Case 1: Before & After: 34.28**
  Case 2: Enrolled & Not Enrolled:
  Case 3: Randomized Assignment: 29.75**
  Case 4: Randomized Offering: 30.4**
  Case 5: Discontinuity Design: 30.58**
  Case 6: Difference-in-Differences: 25.53**

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

Keep in Mind: Difference-in-Differences

Difference-in-Differences combines Enrolled & Not Enrolled with Before & After.
- Slope: the comparison group's change over time generates the counterfactual for the change in the treatment group's outcome.
- Fundamental assumption: trends (slopes) are the same in the treatment and comparison groups.
- To test this assumption, at least 3 observations in time are needed: 2 observations before the program and 1 observation after.
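In practice the diff-in-diff estimate is usually obtained as the coefficient on a treatment-by-period interaction in a regression, which is also how covariates are added (as in the multivariate results above). A minimal sketch, using the hypothetical exam-score means from the earlier chart and plain least squares with no controls:

```python
import numpy as np

# One row per group-period cell: treatment indicator, post indicator,
# and the (hypothetical) mean exam score from the earlier chart.
treat = np.array([1.0, 1.0, 0.0, 0.0])
post = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.60, 0.74, 0.78, 0.81])

# y = b0 + b1*treat + b2*post + b3*(treat*post); b3 is the diff-in-diff.
X = np.column_stack([np.ones_like(y), treat, post, treat * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(beta[3], 2))  # 0.11
```

With unit-level panel data the same interaction regression extends naturally, and plotting pre-period means for both groups is the standard visual check of the equal-trends assumption.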