1 Impact Evaluation for Real Time Decision Making. Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank

2 FIRST: What is a results chain?
- Inputs: financial and human resources
- Activities: actions to convert inputs into outputs
- Outputs: tangible products, information campaigns, trainings, studies
- Intermediate outcomes: use of outputs by the intended population (take-up, use)
- Final outcomes: the objective of the program (growth, social cohesion, employment status)
Implementation (SUPPLY SIDE) covers inputs through outputs; results (DEMAND RESPONSE) cover the outcomes.

3 Example of a results chain
- Inputs: financial and human resources to build local institutions
- Activities: develop communication, set up grant mechanisms, provide technical assistance to communities
- Outputs: communication package, grant offerings, technical assistance visits
- Intermediate outcomes: grants awarded and disbursed, projects implemented, community rallies
- Final outcomes: reduced conflict, improved inclusion of ex-combatants, expanded employment
Monitoring checks whether the program is implemented as planned; evaluation checks whether the program is effective.

4 The results chain is a developmental hypothesis that helps you define:
- what you are doing and for what purpose
- what needs to be monitored
- what needs to be evaluated

5 What is monitoring?
- Monitoring tracks indicators over time (in the treatment group)
- It is descriptive before-after analysis
- It tells us whether things are moving in the right direction

6 What is impact evaluation?
- Impact evaluation tracks outcomes over time in the treatment group relative to a control group
- It measures the effect of an intervention on outcomes relative to a counterfactual: what would have happened without it?
- It identifies the causal output-outcome link, separately from the effect of other time-varying factors
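As an illustration of what measuring an effect "relative to a control group" can look like in practice, here is a minimal sketch in Python. The outcome variable, group sizes, and all values are invented for this example; it is not data from any actual evaluation.

```python
from statistics import mean

# Hypothetical endline outcomes (e.g., a reintegration score) for units that
# were randomly assigned to a treatment or a control group. Values are invented.
treatment_outcomes = [62, 71, 68, 75, 66, 70]
control_outcomes = [58, 64, 61, 67, 60, 63]

# With random assignment, the control group stands in for the counterfactual:
# what the treatment group would have looked like without the intervention.
impact_estimate = mean(treatment_outcomes) - mean(control_outcomes)
print(f"Estimated average impact: {impact_estimate:.1f} points")
```

A real evaluation would add standard errors and account for the sampling design, but the core comparison is this treatment-control difference rather than a before-after change.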

7
- Use monitoring to track implementation efficiency (the input-output link)
- Use impact evaluation to measure effectiveness (the output-outcome link)
[Diagram: monitor efficiency from inputs to outputs (supply side); evaluate effectiveness from outputs to outcomes (demand response).]

8 Pick the right method to answer your questions
Descriptive analysis: monitoring (and process evaluation)
- Is the program being implemented efficiently?
- Is the program targeting the right population?
- Are outcomes moving in the right direction?
Causal analysis: impact evaluation
- What was the effect of the program on outcomes?
- How do alternative implementation modalities compare?
- Is the program cost-effective?

9 Discuss among yourselves (5m)
- What happens if you use monitoring to evaluate impact?

10 Discuss among yourselves (5m)
- What happens if you use monitoring to evaluate impact?
- You get the wrong answer… 100% of the time
[Diagram: the outcome rises from A at t0 to B at t1 after the intervention; the full before-after change overstates the impact C, which excludes the part of the change that would have happened anyway.]
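To put hypothetical numbers on the diagram (all values below are invented for illustration), this sketch contrasts the before-after change with the impact measured against a counterfactual:

```python
# Illustrative numbers only (a hypothetical outcome, e.g., an employment rate in %).
outcome_before = 40.0          # treatment group at t0 (point A)
outcome_after = 55.0           # treatment group at t1 (point B)
counterfactual_after = 50.0    # where the group would have been anyway at t1

# Monitoring / before-after analysis attributes the whole change to the program.
before_after_change = outcome_after - outcome_before       # 15.0 points

# Impact evaluation nets out the change that would have happened anyway (C).
impact = outcome_after - counterfactual_after               # 5.0 points

print(f"Before-after change: {before_after_change:.1f}")
print(f"Impact relative to the counterfactual: {impact:.1f}")
```

If other time-varying factors push outcomes up, the before-after change reads as a larger effect than the program actually had; if they push outcomes down, monitoring can even report "no change" for a program that worked.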

11 NEXT: Do we know ex ante…
On Community Driven Development, what
- information will get communities to respond?
- facilitation will result in high-quality proposals?
- rules will increase inclusion in the decision-making process?
- monitoring mechanisms and co-payments will improve local projects and their use of funds?
On Disarmament, Demobilization and Reintegration,
- are community-based or targeted approaches more effective?
- should we try to delink combatants from units or build on unit cohesion?
- is including or excluding command structures more effective?

12
- We turn to our best judgment for guidance and pick an information campaign, a package of services, a structure of incentives.
- Is there any other campaign, package, or incentive structure that would do better?

13 The decision process is complex
- A few big decisions are taken during design, but many more are taken during roll-out and implementation.

14 Pick up the ball: What is a results tree?
- A results tree is a representation of the set of results chains that are considered viable during program design or program restructuring.
- It is a set of competing policy and operational alternatives for reaching a specific objective.

15 Results tree: Reintegration
- Public communication campaign to reintegrate ex-combatants
  - Use combatant unit cohesion structures: include vs. exclude command structures
  - Delink combatants and dismantle unit structure: supply-based vs. demand-based training
- Community-based campaign
  - Use combatant unit cohesion structures: include vs. exclude command structures
  - Delink combatants and dismantle unit structure: supply-based vs. demand-based training

16
- Establish which decisions will be taken upfront and which will be tested during roll-out
- Experimentally test critical nodes: measure the impact of one option relative to another, or to no intervention (see the sketch after this list)
- Pick the better option and discard the worse one during implementation
- You cannot learn everything at once
- Select carefully what you want to test, involving all relevant partners
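As a sketch of what "experimentally test critical nodes" could mean operationally, the snippet below randomly assigns communities across two competing options and a no-intervention arm. The community names, arm labels, and group sizes are assumptions made up for this illustration, not the program's actual design.

```python
import random

# Hypothetical list of eligible communities and the arms to be compared.
communities = [f"community_{i:03d}" for i in range(1, 121)]
arms = ["option_a", "option_b", "no_intervention"]

random.seed(42)  # fixed seed so the assignment can be reproduced and audited
random.shuffle(communities)

# Deal the shuffled communities into equal-sized groups, one per arm.
assignment = {arm: communities[i::len(arms)] for i, arm in enumerate(arms)}

for arm, group in assignment.items():
    print(f"{arm}: {len(group)} communities, e.g. {group[:2]}")
```

Comparing outcomes across the arms then shows which option at that node performs better, so the worse one can be discarded during implementation.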

17 [The same results tree as slide 15, annotated with the objective at each stage: maximize take-up, maximize use of services, and maximize benefits from services, leading to reintegration.]

18 How IE can support you

19 Discuss among yourselves (5m)
- How often do you make changes to your program on a feeling that something is not working right?
- How useful would it be to know for sure which ways are best?

20 Why evaluate?
- Improve the quality of programs
- Test alternatives and inform design in real time
- Increase program effectiveness
- Answer the “so what” questions
- Build government institutions for evidence-based policy-making
- Plan for implementation of options, not solutions
- Find out which alternatives work best
- Adopt a better way of doing business and taking decisions

21 The market for evidence
- PM/Presidency: communicates to constituencies (campaign promises, accountability); needs evidence on the effects of government programs
- Treasury/Finance: allocates the budget; needs evidence on the cost-effectiveness of different programs
- Line ministries: deliver programs and negotiate budget (service delivery); need evidence on the cost-effectiveness of alternatives and the effects of sector programs

22 Shifting the program paradigm
From:
- A program is a set of activities designed to deliver expected results
- The program will either deliver or not
To:
- A program is a menu of alternatives with a learning strategy to find out which work best
- Programs change over time to deliver more results

23
From retrospective, external, independent evaluation:
- Top down
- Determines whether the program worked or not
To prospective, internal, operationally driven (and externally validated) impact evaluation:
- Sets the program learning agenda bottom up
- Considers plausible implementation alternatives
- Tests scientifically and adopts the best option
- Provides just-in-time advice to improve the effectiveness of the program over time

24 Retrospective (designed and evaluated ex post) vs. prospective (designed ex ante and evaluated ex post)
Retrospective impact evaluation:
- Collecting data after the event, you don't know how participants and non-participants compared before the program started
- You have to try to disentangle, after the event, why the project was implemented where and when it was
Prospective impact evaluation:
- Design the evaluation to answer the question you need to answer
- Collect the data you will need later
- Ensure analytical validity

25
- An evaluative process provides useful (actionable) information at each step of the impact evaluation

26 Discuss among yourselves (5m)
- What are some of the entry points in policy-making cycles?
- What are some of the answers you would like to have?

27 Ethical considerations
- It is not ethical to deny the benefits of something that is available and that we know works (e.g., HIV medicine proven to prolong life)
- It is ethical to test an intervention before scale-up if we don't know whether it works or whether it has unforeseen consequences (e.g., food aid may destroy local markets and create perverse incentives)
- Most of the time we use opportunities created by roll-out and budget constraints to evaluate, so as to minimize ethical concerns
- We can always, and should, test alternatives to maximize our results

28 Questions? Comments?

