
1 Making Impact Evaluations Happen: World Bank Operational Experience. 6th European Conference on Evaluation of Cohesion Policy, 30 November 2009, Warsaw. Joost de Laat and Kaspar Richter

2 Outline
- Results Agenda & Impact Evaluations
- Impact Evaluations of Active Labor Market Policies

3 World Bank’s Results Agenda
- Results-based approach to ensure that the WB contributes to improved country outcomes
- Demand for evidence of the results of development assistance is increasing: limited resources, donor fatigue, client expectations
- Multiple policy options to address needs, but the rigorous evidence needed to prioritize among them is often lacking
- Among monitoring and evaluation techniques, impact evaluation is an important tool for showing the effect of interventions
- Given the power of this tool, the World Bank is increasing the number of impact evaluations

4 World Bank’s Development Impact Evaluation Initiative (DIME)
Objectives:
- To increase the number of Bank projects with impact evaluation components
- To increase staff capacity to design and carry out such evaluations
- To build a process of systematic learning based on effective development interventions, with lessons learned from completed evaluations
12 Clusters: Conditional Cash Transfers, Early Childhood Development, Education Service Delivery, HIV/AIDS Treatment and Prevention, Local Development, Malaria Control, Pay-for-Performance in Health, Rural Roads, Rural Electrification, Urban Upgrading, ALMP, and Youth Employment

5

6 Impact evaluation differs from M&E
[Results-chain diagram: INPUTS ($$$, activities) → OUTPUTS → OUTCOMES (behavior); monitoring tracks efficiency across inputs and outputs, while impact evaluation assesses effectiveness in terms of outcomes.]

7 Impact Evaluation Informs

8 Impact Evaluation Methods
- Experimental methods / randomization
- Quasi-experimental methods: propensity score matching (PSM), regression discontinuity design (RDD)
- Other econometric methods: before-and-after (reflexive comparisons), difference-in-differences (Diff-in-Diff; see the sketch below), instrumental variables, encouragement design
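As an illustration of one of these methods, here is a minimal difference-in-differences sketch in Python (not from the original slides; the data frame, the column names outcome/treated/post, and the toy numbers are assumptions for illustration):

# Minimal difference-in-differences sketch (illustrative; all names and numbers are hypothetical).
# Assumes data with a binary treatment-group indicator and a before/after period flag.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome": [10, 11, 10, 14, 9, 10, 9, 10],   # toy outcome values, not real evaluation data
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],          # 1 = treatment group, 0 = comparison group
    "post":    [0, 0, 1, 1, 0, 0, 1, 1],          # 0 = before the program, 1 = after the program
})

# Under the parallel-trends assumption, the coefficient on the interaction term
# treated:post is the diff-in-diff estimate of the program impact.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print("Diff-in-Diff estimate:", model.params["treated:post"])

In this toy example the estimate equals (change in the treated group) minus (change in the comparison group), i.e. (12.0 - 10.5) - (9.5 - 9.5) = 1.5.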

9 Global Growth Industry – Ongoing Randomized Impact Evaluations (from the MIT Poverty Action Lab website, 2009)

10 IEs are easier in some sectors than in others

11 Randomization in Infrastructure?
- Very hard to do, mainly due to engineering constraints
- Units of observation are often communities rather than households
- Self-selection: local communities have to be eligible, prepare a project, apply for funds, and commit some of the project value (in kind and cash)
- Some projects are already ongoing, and the government has no capacity to start everywhere at the same time

12 10 Steps to Making Impact Evaluations Relevant for Practitioners
- Make the policy question the starting point
- Take ethical objections and political sensitivities seriously
- Take a comprehensive approach to sources of bias
- Look for spillover effects
- Take a sectoral approach
- Look for impact heterogeneity
- Take scaling up seriously
- Understand what determines the impact
- Don’t reject theory
- Develop within-country capacity

13 Active Labor Market Programs – DIME Examples
Financial support: WB Spanish Impact Evaluation Fund, funded by Spain (€10.4 mn) and the UK (€1 mn)

14 Steps for ALMP Impact Evaluation
- Undertake prior quantitative analysis to identify priority areas: skills are particularly low among which groups (age, ethnic, etc.)? Unemployment is particularly high among which groups?
- Undertake qualitative analysis that may answer the why questions (why is unemployment high among certain groups?). Inquire with government, employers, employees, and the unemployed.
- Design a pilot and evaluate impact before scale-up:
  - Select one group to receive treatment (job training, counseling, etc.)
  - Find a comparison group to obtain the counterfactual: treatment and comparison groups with identical initial characteristics, so that the only difference is the ALMP; hence, differences in the unemployment rate arise only due to the ALMP
- Collect baseline data:
  - can ensure proper targeting of the ALMP
  - allows verification that treatment and comparison groups are statistically identical prior to the intervention
  - enables ex-post evaluation of heterogeneous program effects (i.e., was the job training program more effective among certain subgroups?)
- Implement and monitor outcomes in treatment and comparison groups (see the assignment and balance-check sketch below)
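As an illustration of the assignment and baseline-verification steps above, here is a minimal Python sketch (not from the original slides; the sample size, baseline variables, and use of a two-sample t-test are assumptions for illustration):

# Illustrative sketch: random assignment of eligible applicants and a baseline balance check.
# All variable names and numbers are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

n = 1000  # eligible applicants in the pilot
baseline_age = rng.normal(35, 10, n)          # a baseline characteristic (toy data)
baseline_earnings = rng.normal(1200, 400, n)  # another baseline characteristic (toy data)

# Randomly assign half of the applicants to treatment (e.g. job training), half to comparison.
treated = rng.permutation(np.repeat([1, 0], n // 2))

# Balance check: with successful randomization, baseline means should not differ
# significantly between the treatment and comparison groups.
for name, x in [("age", baseline_age), ("earnings", baseline_earnings)]:
    t_stat, p_value = stats.ttest_ind(x[treated == 1], x[treated == 0])
    print(f"{name}: treatment mean {x[treated == 1].mean():.1f}, "
          f"comparison mean {x[treated == 0].mean():.1f}, p-value {p_value:.2f}")

The same baseline data can later be reused to test for heterogeneous program effects by subgroup.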

15 Evidence from Randomized IEs of ALMPs
- Ibarrarán and Shady (2008): considerable heterogeneity, with no to modest employment impacts overall, but substantial employment effects for some subgroups (e.g. women, adults) and not others.
- European evidence is far more uncertain, in part because of the lack of experimental studies and the wide variation in evaluation methods.
- Given the considerable heterogeneity, it is important to pilot and evaluate.

16 Some Resources on Impact Evaluations
www.worldbank.org/sief
www.worldbank.org/dime
www.worldbank.org/impactevaluation
http://ec.europa.eu/regional_policy/sources/docgener/evaluation/evaluation_en.htm
www.povertyactionlab.org
http://evidencebasedprograms.org/
“Using Randomization in Development Economics Research: A Toolkit” (2006), by E. Duflo, M. Kremer, and R. Glennerster (at www.povertyactionlab.org/research/rand.php)
“Institutionalizing Impact Evaluation Within the Framework of a Monitoring and Evaluation System” (2009), by the World Bank (at www.worldbank.org/ieg/ecd/docs/inst_ie_framework_me.pdf)

17 ADDITIONAL SLIDES

18 What we need for an IE
1. The difference in outcomes with the program versus without the program, for the same unit of analysis (e.g. individual, community, etc.)
2. Problem: individuals have only one existence
3. Hence, we have a problem of a missing counterfactual, a problem of missing data (see the potential-outcomes sketch below)
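To make the missing-counterfactual problem precise, here is a short sketch in standard potential-outcomes notation (the notation is not from the original slides):

For unit $i$, let $Y_i(1)$ be the outcome with the program, $Y_i(0)$ the outcome without it, and $D_i \in \{0,1\}$ the treatment indicator. The impact for unit $i$ is
\[ \tau_i = Y_i(1) - Y_i(0), \]
but we only ever observe
\[ Y_i = D_i\,Y_i(1) + (1 - D_i)\,Y_i(0), \]
so one of the two potential outcomes is always missing. Under randomized assignment, $D_i$ is independent of $(Y_i(1), Y_i(0))$, and the average impact is identified by a simple difference in group means:
\[ \mathbb{E}[\tau_i] = \mathbb{E}[Y_i \mid D_i = 1] - \mathbb{E}[Y_i \mid D_i = 0]. \]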

19 We observe an outcome indicator… [chart: outcome indicator over time, with the intervention marked]

20 …and its value rises after the program. [chart: observed outcome increasing after the intervention]

21 Having the “ideal” counterfactual… [chart: observed outcome together with the counterfactual path without the intervention]

