
1 Introduction to Impact Evaluation The Motivation Emmanuel Skoufias The World Bank PRMPR PREM Learning Week: April 21-22, 2008

2 Outline of presentation
1. Role of IE in the Results Agenda
2. Impact evaluation: Why and When?
3. Evaluation vs. Monitoring
4. Necessary ingredients of a good Impact Evaluation

3 The Role of IE in the Results Agenda
 Demand for evidence of the results of development assistance is increasing.
 Among monitoring and evaluation techniques, impact evaluation provides an important tool to show the effect of interventions.
 Given the power of this tool, the Bank is supporting an increasing number of impact evaluations (Figure 1).

4 [Figure 1]

5 Status of IE within the Bank--1
 Although the number of impact evaluations is growing overall, some Regions and Networks are more active than others.
 Most ongoing impact evaluations are in the social sectors (Figure 2). This reflects not only the support provided by the HD Network, but also the stronger evaluation tradition in these areas and the fact that such projects are more amenable to impact evaluation techniques.

6 WB Lending and IE by Sector

7 Status of IE within the Bank--2
 The regional picture is also skewed. Africa is the leader with 47 ongoing evaluations, followed by SAR (27), LAC (26), and EAP (17). MENA and ECA have 2 each.

8 WB Lending and IE by Region

9 2. Impact Evaluation: Why and When?

10 Impact evaluation
 Ex-ante vs. ex-post
 Impact is the difference between outcomes with the program and without it.
 The goal of impact evaluation is to measure this difference in a way that attributes it to the program, and only the program.
 Challenges in evaluating SDN operations:
   it is difficult to find a comparison group
   quasi-experimental methods are needed
   take advantage of sub-national variation
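
In standard potential-outcomes notation (a reference sketch added here, not part of the original slide), the impact for a unit i and the program-level quantity usually targeted can be written as:

```latex
% Y_i(1): outcome of unit i with the program; Y_i(0): outcome without it
\Delta_i = Y_i(1) - Y_i(0),
\qquad
\mathrm{ATT} = \mathbb{E}\bigl[\, Y_i(1) - Y_i(0) \mid D_i = 1 \,\bigr]
```

where D_i = 1 indicates participation and ATT is the average impact on participants.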

11 Why conduct an Impact Evaluation?
 Knowledge & Learning
   Improve the design and effectiveness of the program
 Economic Reasons
   To make resource allocation decisions: comparing program impacts allows governments to reallocate funds from less to more effective programs, and thus to increase social welfare
 Social Reasons
   Increases transparency & accountability
   Supports public sector reform / innovation
 Political Reasons
   Credibility / a break with "bad" practices of the past

12 When Is It Time to Make Use of Evaluation?--1
 When you want to determine the roles of both design and implementation in project, program, or policy outcomes
 Resource and budget allocations are being made across projects, programs, or policies
 A decision is being made whether or not to expand a pilot
 When regular results measurement suggests actual performance diverges sharply from planned performance

13 When Is It Time to Make Use of Evaluation?--2
 There is a long period with no evidence of improvement in the problem situation
 Similar projects, programs, or policies are reporting divergent outcomes
 There are conflicting political pressures on decision-making in ministries or parliament
 There is public outcry over a governance issue
 To identify issues around an emerging problem, e.g. children dropping out of school

14 Summary
An impact evaluation informs:
 Strategy: Whether we are doing the right things
   Rationale/justification
   Clear theory of change
 Operation: Whether we are doing things right
   Effectiveness in achieving expected outcomes
   Efficiency in optimizing resources
   Client satisfaction
 Learning: Whether there are better ways of doing it
   Alternatives
   Best practices
   Lessons learned

15 3. Evaluation vs. Monitoring

16 Definitions
(Results-Based) Monitoring: a continuous process of collecting and analyzing information to compare how well a project, program, or policy is performing against expected results.
(Results-Based) Evaluation: an assessment of a planned, ongoing, or completed intervention to determine its relevance, efficiency, effectiveness, impact, and sustainability. The intent is to incorporate lessons learned into the decision-making process.

17 Monitoring and Evaluation

18 Evaluation Addresses:
"Why" Questions – What caused the changes we are monitoring?
"How" Questions – What was the sequence or process that led to successful (or unsuccessful) outcomes?
Compliance/Accountability Questions
Process/Implementation Questions – Did the promised activities actually take place, and as they were planned? Was the implementation process followed as anticipated, and with what consequences?

19 Six Types of Evaluation
 Impact Evaluation
 Process Implementation
 Performance Logic Chain
 Meta-Evaluation
 Case Study
 Pre-Implementation Assessment

20 Complementary Roles of Results-Based Monitoring and Evaluation
 Monitoring: Clarifies program objectives | Evaluation: Analyzes why intended results were or were not achieved
 Monitoring: Links activities and their resources to objectives | Evaluation: Assesses specific causal contributions of activities to results (= impact evaluation)
 Monitoring: Translates objectives into performance indicators and sets targets | Evaluation: Examines the implementation process (= operations evaluation)
 Monitoring: Routinely collects data on these indicators and compares actual results with targets | Evaluation: Explores unintended results (= spillover effects)
 Monitoring: Reports progress to managers and alerts them to problems | Evaluation: Provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement

21 Summary--1
 Results-based monitoring and evaluation are generally viewed as distinct but complementary functions
 Each provides a different type of performance information
 Both are needed to better manage policy, program, and project implementation

22 Summary--2
 Implementing results-based monitoring and evaluation systems can strengthen WB and public sector management
 Implementing results-based monitoring and evaluation systems requires commitment by leadership and staff alike

23 4. Necessary ingredients of a good Impact Evaluation: A good counterfactual & robustness checks

24 What we need for an IE
 The difference in outcomes with the program versus without the program – for the same unit of analysis (e.g. individual, community, etc.)
 Problem: individuals only have one existence
 Hence, we have a problem of a missing counterfactual, a problem of missing data
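
The missing-data problem can be made explicit (a sketch in standard potential-outcomes notation, added here rather than taken from the slide): each unit reveals only one of its two potential outcomes, so the counterfactual term for participants is never observed and has to be estimated.

```latex
Y_i = D_i\, Y_i(1) + (1 - D_i)\, Y_i(0)
\quad\Longrightarrow\quad
\mathrm{ATT}
= \underbrace{\mathbb{E}[\, Y_i \mid D_i = 1 \,]}_{\text{observed}}
- \underbrace{\mathbb{E}[\, Y_i(0) \mid D_i = 1 \,]}_{\text{missing counterfactual}}
```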

25 Thinking about the counterfactual
 Why not compare individuals before and after (the reflexive)?
 The rest of the world moves on, and you are not sure what was caused by the program and what by the rest of the world.
 We need a control/comparison group that will allow us to attribute any change in the "treatment" group to the program (causality).
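
Written out (an illustrative sketch, not part of the slide), the before/after change for participants mixes the program effect with everything else that changed over the same period:

```latex
\underbrace{\mathbb{E}[\, \Delta Y \mid D = 1 \,]}_{\text{reflexive (before/after)}}
= \mathrm{ATT}
+ \underbrace{\mathbb{E}[\, \Delta Y(0) \mid D = 1 \,]}_{\text{change due to everything else}}
```

Subtracting the change observed in a comparison group removes the second term whenever the two groups would have followed the same path without the program (the parallel-trends assumption behind difference-in-differences).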

26 We observe an outcome indicator… [chart: outcome indicator over time; intervention marked]

27 …and its value rises after the program. [chart]

28 Having the "ideal" counterfactual… [chart]

29 …allows us to estimate the true impact. [chart]

30 Comparison Group Issues
 Two central problems:
 Programs are targeted
   Program areas will differ in observable and unobservable ways precisely because the program intended this
 Individual participation is (usually) voluntary
   Participants will differ from non-participants in observable and unobservable ways (selection on observables such as age and education, and on unobservables such as ability, motivation, and drive)
 Hence, a comparison of participants with an arbitrary group of non-participants can lead to heavily biased results
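
The resulting bias has a standard decomposition (added here for reference, in the same notation as above):

```latex
\underbrace{\mathbb{E}[\, Y \mid D = 1 \,] - \mathbb{E}[\, Y \mid D = 0 \,]}_{\text{naive comparison}}
= \mathrm{ATT}
+ \underbrace{\mathbb{E}[\, Y(0) \mid D = 1 \,] - \mathbb{E}[\, Y(0) \mid D = 0 \,]}_{\text{selection bias}}
```

The selection-bias term is non-zero whenever participants would have fared differently from non-participants even without the program, which is exactly what targeting and voluntary participation produce.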

31 Impact Evaluation methods
Differ in how they construct the counterfactual:
 Experimental methods / Randomization
 Quasi-experimental methods
   Propensity score matching (PSM)
   Regression discontinuity design (RDD)
 Other econometric methods
   Before and After (reflexive comparisons)
   Difference in Difference (Dif-in-Dif)
   Instrumental variables
   Encouragement design
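
As a small illustration of one of the quasi-experimental methods listed above, here is a minimal difference-in-differences sketch in Python. The data are invented toy numbers (group sizes, outcome levels, the common trend, and the true impact of +5 are all assumptions made for illustration), not figures from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # households per group (toy value)

# Hypothetical outcome (e.g. consumption) before and after the program.
# Treated areas start from a different level (targeting), everyone experiences
# a common time trend, and the true program impact is +5.
trend, true_impact = 3.0, 5.0
treat_before = 20 + rng.normal(0, 2, n)
treat_after = treat_before + trend + true_impact + rng.normal(0, 2, n)
control_before = 25 + rng.normal(0, 2, n)
control_after = control_before + trend + rng.normal(0, 2, n)

# Reflexive (before/after) estimate: confounds the impact with the time trend.
reflexive = treat_after.mean() - treat_before.mean()

# Difference-in-differences: the comparison group nets out the common trend,
# under the parallel-trends assumption.
did = (treat_after.mean() - treat_before.mean()) - (
    control_after.mean() - control_before.mean()
)

print(f"Reflexive estimate:    {reflexive:.2f}  (true impact = {true_impact})")
print(f"Diff-in-diff estimate: {did:.2f}")
```

Under randomization, a simple comparison of post-program means between treatment and control would already be unbiased; the quasi-experimental methods above exist for the common case where randomized assignment is not feasible.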

32 Thank you

