IMPACT EVALUATION WORKSHOP, ISTANBUL, TURKEY, MAY 11-14, 2015
IMPACT EVALUATION FOR REAL-TIME POLICY MAKING
Arianna Legovini, Development Impact Evaluation
Istanbul, May 11, 2015
WHAT IS THE VALUE OF A POLICY? “measuring”
HOW TO GET THE MOST OUT OF A POLICY? “reaching”
What does DIME do? Transform development policy:
1. Run experiments to inform decisions
2. Build agencies’ capacity to do this systematically
3. Draw lessons and share them face-to-face with global audiences
175 IEs in 47 countries: distribution of DIME’s IEs by GP and CCSA (number of IEs)
What is the point? From dishing out half-cooked advice to transferring the tools to generate your own evidence and guide your own decisions.
How? Focus on the causal links:
Inputs: financial and human resources
Activities: actions to convert inputs into outputs
Outputs: tangible products
Intermediate outcomes: effective access, use, compliance, service quality
Final outcomes: growth, lives saved, productivity
WHAT (SUPPLY SIDE), HOW TO (DEMAND RESPONSE) & RESULTS
Policy objective: invest in what gets us results. Invest in the cause to get an effect.
Evaluation objective: identify cause and effect to enable good policy decisions.
WHY IS UNDERSTANDING CAUSALITY SO IMPORTANT? “When I use my umbrella, my shoes get wet. I think I will stop using umbrellas.” It is easy to confuse correlation with causation.
Do training programs increase employment?
A. Yes
B. No
C. Don’t know
Invest in the cause to get results. Mentorship interacts with the quality of the mentor to change attitudes and values, so that youth act on their life-skills training and change their behavior (Bushway and Reuter, 2002, 2007). Charismatic mentor + behavioral therapy shift attitudes and values; more motivated youth plus vocational training lead to EMPLOYMENT.
Monitoring: trends and correlations, not causality. DO WE NEED MONITORING? YES! Monitoring tracks indicators over time, but only among participants. It is descriptive before-after analysis. It tells us whether things are moving in the right direction, but it does not tell us why things happen (causality).
Evaluation: narrow down causality by identifying a counterfactual and comparing WHAT HAPPENED with WHAT WOULD HAVE HAPPENED.
What is counterfactual analysis?
Compare the same individual, with and without the intervention, at the same point in time: missing data.
Compare statistically identical groups of individuals, with and without the intervention, at the same point in time: comparable data.
Counterfactual criteria: treated and control groups have identical average initial characteristics (observed and unobserved), so that the only difference between them is the treatment. Therefore the only reason for the difference in outcomes is the treatment.
USE SCIENTIFIC EXPERIMENTS: randomly assign treatment and control groups and compare outcomes over time.
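As a minimal sketch of what random assignment looks like in practice (hypothetical firm IDs; a real evaluation would randomize within a defined sampling frame, often with stratification):

```python
import random

def random_assignment(unit_ids, seed=2015):
    """Shuffle units and split them into equal treatment and control groups."""
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    shuffled = list(unit_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# 100 hypothetical firms
treatment, control = random_assignment(range(100))
```

Because assignment is random, the two groups are statistically identical on average, so a later difference in outcomes can be attributed to the program.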
Public credit guarantee for banks to lend to SMEs. Guarantees begin in 2014; outcomes of SMEs participating in the program:

                         2013     2016     Change
% SMEs exporting         15.2%    18.5%    3.3 pp
Exports (US$ million)    5.2      6.8      1.6

Question: how much of the INCREASE IN EXPORTS was caused by the GUARANTEES?
What percentage of the increase in EXPORTS was caused by the GUARANTEES?
A. 3.3%
B. 18.5%
C. None of the above
Public credit guarantees begin in 2014, but what if during these years… exchange rates change? The local economy grows? A global economic crisis hits? Credit registries improve? Any other variable influencing SMEs changes? SME exports may change regardless!
INCREASE IN EXPORTS CAUSED BY: intervention + exchange rates + local economy + global economic crisis + credit registries + other factors.
What we need to know is: what would have happened to the firms that participated in the intervention if they had not participated? This is the counterfactual. The evaluation problem stems from the fact that we cannot observe the same firms in the “treated” (i.e., receiving the credit guarantee) and “untreated” states at the same time.
Selecting a proper control group is not trivial. A valid control group is a group of firms with characteristics similar to those of the participating (treated) firms, but which do not receive the intervention. Non-participants are not a “good” counterfactual because they are likely to be different (less willing to change, less informed, less connected), and this may bias the impact estimates.
Experimental approach: the effect of the intervention is given by the average difference in outcomes between the treated and control groups.
What we need to know is: what would have happened to the firms that participated in the intervention if they had not participated? (the counterfactual)
With intervention (2013-2016): increase in SMEs exporting 3.3%; increase in exports US$1.6 million.
Without intervention (2013-2016): increase in SMEs exporting 2.8%; increase in exports US$1.2 million.
Impact evaluation identifies a proper counterfactual group to compare with the group of SMEs that were affected by the policy:

                         Treatment 2013-2016   Control 2013-2016   Effect of program
% SMEs exporting         3.3%                  2.8%                0.5%
Exports (US$ million)    1.6                   1.2                 0.4
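The arithmetic behind this comparison can be sketched directly from the table: the program effect is the treated group’s 2013-2016 change minus the control (counterfactual) group’s change.

```python
# 2013-2016 changes, taken from the slide's table
treated = {"pct_exporting": 3.3, "exports_usd_m": 1.6}   # firms with intervention
control = {"pct_exporting": 2.8, "exports_usd_m": 1.2}   # firms without intervention

# Program effect = treated change minus control (counterfactual) change
effect = {k: round(treated[k] - control[k], 1) for k in treated}
print(effect)  # {'pct_exporting': 0.5, 'exports_usd_m': 0.4}
```

Only 0.5 of the 3.3 percentage-point rise in exporting SMEs is attributable to the guarantees; the rest would have happened anyway.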
So now you have evaluated the program… but what about getting better results? A common problem with SME programs is low take-up.
RETURNS TO INVESTMENTS ARE NOT FIXED. USE EVIDENCE TO INFORM POLICY DESIGN AND EXPERIMENTS TO GUIDE POLICY IMPLEMENTATION, TO INCREASE IMPACT.
DIME works on two broad questions: “what” and “why”.
WHAT: at first, the most relevant for policymakers and stakeholders. Measures the effectiveness of the program as is (accountability). Does not answer the how/why question.
WHY/HOW: theory-based problem-solving. What mechanism explains success or failure and increases impact? Targets knowledge generation to improve policy effectiveness (local effect) and to generalize and adapt to other contexts (global effect).
Take the example of matching grants: 9 evaluations failed because of low SME take-up! Should we conclude that matching grants are not effective? Two possible problems:
– We have not addressed design issues: take-up (awareness, knowledge, technical assistance)
– Our assumptions do not hold: it’s not just about money; SMEs don’t know what they need, and/or there are no local markets to deliver them what they need
What if what is needed is managerial skills? Field experiment with Indian textile firms (Bloom et al., 2013): free consulting on modern management practices for a randomly chosen set of treatment plants; 5 months (treatment group) vs. 1 month (control group) of consulting services.
Impact: 17% increase in productivity and USD 300,000 in additional profits per year.
Why don’t these firms improve themselves? They didn’t know that they didn’t know.
IMPACT EVALUATION: not just to measure impact, but to achieve IMPACT.
We evaluate the impact of packages of interventions (“what”) and experiment with mechanisms (“why/how”) to make them work better. (Chart: evaluation questions in IEs with DIME involvement.)
How to:
Behavioral biases – limited attention, self-control, cognitive capacity, understanding, asymmetric valuation of gains and losses
Delivery mechanisms – targeting rules; centralized vs. decentralized public service delivery; collective vs. individual or peer-to-peer; private vs. public sector delivery; paid vs. voluntary; use of media and technology
Accountability mechanisms – top-down accountability (audits, inspections, supervision, performance assessment and feedback, laws and regulations); demand-side accountability (information, report cards, user participation and monitoring)
Incentives – supply- or demand-side conditional monetary and non-monetary incentives
Constraints to capital formation and productivity – credit, financial, cash-flow, risk management; financial literacy and life skills; information, inputs; institutional, legal, tax burden, corruption, property, public goods; managerial, skills, technology, and transaction costs
Get the delivery right: a $350 million subsidy for malaria treatment. Sealing packages saved 16,600 under-five and 2,200 adult lives in 5 years.
Problem: poor public-sector distribution of medicines.
Solution: random assignment of health districts to 3 different distribution systems; one system reduced stockouts by 35%, the best by 83%.
Source: Vledder, M., Friedman, J., Yadav, P., and Sjoblom, M.
How much did it cost to save each life?
A. 120 dollars
B. 1,200 dollars
C. 12,000 dollars
D. 120,000 dollars
Price of a CURE: $6
Is the benefit worth the cost? Zambia’s best supply chain (the IE solution):
– 83% reduction in stockouts; 21-25% reduction in malaria mortality
– Pilot: Benefit ($7.4M) / Cost ($3.8M) = 2
– Scaled up: Benefit ($60.7M) / Cost ($12.4M) = 4.9
IE PAYS FOR ITSELF: A PUBLIC GOOD WORTH FINANCING
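The benefit-cost ratios on this slide come from straightforward division (figures from the slide; the pilot ratio of roughly 1.9 is shown rounded up to 2):

```python
# Zambia supply-chain IE, US$ millions (figures from the slide)
pilot_benefit, pilot_cost = 7.4, 3.8
scaled_benefit, scaled_cost = 60.7, 12.4

pilot_ratio = round(pilot_benefit / pilot_cost, 1)     # ~1.9, shown as 2 on the slide
scaled_ratio = round(scaled_benefit / scaled_cost, 1)  # 4.9
print(pilot_ratio, scaled_ratio)  # 1.9 4.9
```

Note the scale-up effect: benefits grow roughly eightfold while costs grow about threefold, so the ratio more than doubles.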
Get people to do it: financial literacy and behavior formation. Pilot with 20,000 Brazilian high-school students (Bruhn et al., 2013): three semesters of high-quality financial literacy and behavior formation incorporated into the high-school curriculum; life-relevant, interactive and actionable material (72 case studies), teacher guidelines, online material; parent workshop in financial literacy.
Impact: important increase in financial knowledge; changes in attitudes and behaviors; a 1.4 pp increase in savings (a large and economically significant effect in a country with a low savings rate); triple the impact for those with the parent workshop.
DIME in action. IE product: IE design, IE implementation, IE dissemination. Systematic use of evidence: inform policy design; guide mid-course corrections; inform adoption and scale-up. Capacity building: train and apply; learn by doing; apply knowledge.
Building institutions through iterative learning: policymakers who know how to use IE tools to inform their policy choices; researchers who put their skills to serve policy objectives; operations that adapt implementation modalities; sustained collaborations for operational research to guide policy making.
Policy-relevant questions; rigorous design; full implementation support; high-quality data; causal analysis; actionable answers; policy adoption.
Clearer roadmap: what, to whom, when, where.
More efficient implementation
Better capacity: research team; full-time field coordinator; training; more and better data.
Better decisions
Thank You!
facebook.com/ieKnow  #impacteval
blogs.worldbank.org/impactevaluations
microdata.worldbank.org/index.php/catalog/impact_evaluation
http://dime.worldbank.org