1
Impact Evaluation for Real Time Decision Making
Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank
PREM Week, April 2009
2
- Can leadership and motivation alone break the poverty trap? – Rwanda rapid results initiative
- Can participation change local norms and institutions? – Sierra Leone institutional reform & capacity building (GoBifo)
- What type of information can enhance local accountability? – Uganda schools' budget
- What is the best way to select local projects? – Indonesia, direct voting versus representatives' decisions
- Do audits increase accountability in public administration? – Indonesia, Brazil
3
These are difficult questions… We turn to our best judgment for guidance and pick an incentive, subsidy level, a voting scheme, a package of services… Is there any other incentive, subsidy, scheme or package that will do better?
4
A few big decisions are taken during design, but many more decisions are taken during roll-out and implementation.
5
Decision tree – Road Authority procurement of rural roads O&M. Each roll-out decision branches over the previous one:
- Contract size: small contracts vs. large contracts
- Contract type: fixed-cost vs. adjustable
- Engineering costs: concealed vs. advertised
- Renegotiation: allowed vs. denied
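To make the size of this design space concrete, here is a minimal sketch (the labels are hypothetical, not from the deck) that enumerates every combination of the four contract-design choices in the tree:

```python
from itertools import product

# The four roll-out decisions from the procurement decision tree.
contract_size = ["small contracts", "large contracts"]
contract_type = ["fixed-cost", "adjustable"]
engineering_costs = ["concealed", "advertised"]
renegotiation = ["allowed", "denied"]

# Each combination is one possible program design; 2 x 2 x 2 x 2 = 16 designs.
designs = list(product(contract_size, contract_type, engineering_costs, renegotiation))

for design in designs:
    print(design)
print(f"{len(designs)} possible designs")  # 16
```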
6
- Establish which decisions will be taken upfront and which will be tested during roll-out
- Scientifically test critical nodes: measure the impact of one option relative to another or to no intervention
- Pick better and discard worse during implementation
- Cannot learn everything at once
- Select carefully what you want to test by involving all relevant partners
7
Application of the scientific method to understand and measure behavioral response:
- Hypothesis: if we subdivide contracts, then more companies will bid for the contract
- Testing: randomly assign which bids will be subdivided into smaller contracts; compare the number of bidders and the bidding price relative to engineering costs
- Observations: smaller contract size increases the number of bidders; bidding prices fall; costs of administering contracts rise
- Conclusion: smaller contract size increases competition in the bidding procedure
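As an illustration of how such a test could be analyzed, here is a minimal sketch (all lot names and outcome numbers are hypothetical, simulated for the example) comparing the average number of bidders under randomly assigned subdivided vs. large contracts:

```python
import random
import statistics

# Hypothetical procurement lots; in practice these come from the Road Authority's pipeline.
lots = [f"lot_{i}" for i in range(200)]

# Randomly assign each lot to "subdivided" (treatment) or "large" (comparison).
random.seed(42)
assignment = {lot: random.choice(["subdivided", "large"]) for lot in lots}

# Hypothetical outcome: number of bidders observed for each lot after tendering.
# (Simulated here; a real evaluation would use administrative bid records.)
bidders = {lot: random.randint(4, 9) if assignment[lot] == "subdivided" else random.randint(2, 6)
           for lot in lots}

treated = [bidders[l] for l in lots if assignment[l] == "subdivided"]
control = [bidders[l] for l in lots if assignment[l] == "large"]

effect = statistics.mean(treated) - statistics.mean(control)
print(f"Average bidders (subdivided): {statistics.mean(treated):.2f}")
print(f"Average bidders (large):      {statistics.mean(control):.2f}")
print(f"Estimated effect of subdividing contracts: {effect:.2f} additional bidders")
```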
8
Decision tree (as above) – Road Authority procurement of rural roads O&M: contract size, contract type, disclosure of engineering costs, renegotiation. Evidence on two of these choices:
- In Oklahoma, publication of engineering costs lowered procurement costs by 4.6%
- In Andhra Pradesh, e-procurement increased bidder participation by 25%
9
What is Impact Evaluation?
Counterfactual analysis isolates the causal effect of an intervention on an outcome:
- Effect of contract size on number of bidders
- Effect of renegotiation option on bidding price
- Effect of information on price
Ideally we would compare the same individual with and without the option, information, etc. at the same point in time to measure the effect. This is impossible. Impact evaluation therefore uses large numbers (individuals, communities) to estimate the effect.
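The deck does not write this out, but in standard potential-outcomes notation (an addition here, not from the slides) the problem and the large-numbers workaround can be sketched as:

```latex
% For unit $i$, let $Y_i(1)$ be the outcome with the intervention and $Y_i(0)$ without.
% The individual effect $Y_i(1) - Y_i(0)$ is never observed, because only one of the two
% outcomes is realized for a given unit at a given time.
%
% With a valid comparison group, the average treatment effect is estimated instead:
\[
\text{ATE} \;=\; \mathbb{E}\big[\,Y_i(1) - Y_i(0)\,\big]
\;\approx\;
\underbrace{\bar{Y}_{\text{treated}}}_{\text{mean outcome, treatment group}}
\;-\;
\underbrace{\bar{Y}_{\text{comparison}}}_{\text{mean outcome, comparison group}}
\]
```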
10
How is this done?
- Select one group to receive the treatment (contract alternative, information, …)
- Find a comparison group to serve as counterfactual (other alternative, no information, …)
- Use these counterfactual criteria:
  - Treated and comparison groups have identical initial average characteristics (observed and unobserved)
  - The only difference is the treatment
  - Therefore the only reason for the difference in outcomes is the treatment (or differential treatment)
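One practical way to check the first criterion is to compare baseline averages across the two groups. A minimal sketch, with assumed data (the variable names and numbers are illustrative, not from the deck):

```python
import statistics

# Hypothetical baseline characteristic for treated and comparison communities,
# e.g. average household income (USD/month) recorded before the intervention.
baseline_income = {
    "treated":    [112, 98, 105, 120, 101, 95, 118, 107],
    "comparison": [110, 99, 103, 122, 100, 97, 115, 109],
}

for group, values in baseline_income.items():
    print(f"{group:>10}: mean = {statistics.mean(values):.1f}, sd = {statistics.stdev(values):.1f}")

# With successful randomization the baseline means should be statistically indistinguishable,
# so any later difference in outcomes can be attributed to the treatment.
```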
11
How is monitoring different from impact evaluation? The monitoring story:
- In 1996, transfers to schools were 20% of the budget allocation
- The government started publishing school allocations in newspapers
- By 2001, transfers had jumped to 80% of the budget allocation, a gain of 60 percentage points (80 − 20)
[Chart: transfers as % of budget allocation, before (1996: 20) vs. after (2001: 80); change attributed to information = 60]
12
How is monitoring different from impact evaluation? The impact evaluation story:
- In 1996, a low-awareness year, transfers to schools were 20% of the budget allocation
- After the 1996 PETS (Public Expenditure Tracking Survey) results were published, there was much public discussion and budgets were published in local newspapers
- By 2001, transfers to schools close to newspaper outlets had increased to 80% of the budget allocation
- Transfers to schools far from newspaper outlets increased to 36%
- Newspaper information increased transfers by 44 percentage points (80 − 36); the rest of the change reflects other factors affecting all schools
[Chart: before (1996: 20) vs. after (2001): schools near newspaper outlets at 80, schools far from outlets at 36; impact of information = 44]
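The contrast between the two estimates is just arithmetic on the numbers above:

```latex
% Monitoring (before-after) estimate: attributes the whole change to the information campaign.
\[
\Delta_{\text{before-after}} = 80\% - 20\% = 60 \text{ percentage points}
\]
% Impact evaluation estimate: compares schools near newspaper outlets with schools far from
% them, so changes common to both groups (other factors) are differenced out.
\[
\Delta_{\text{impact}} = 80\% - 36\% = 44 \text{ percentage points}
\]
```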
13
How to use Monitoring & Impact Evaluation?
- Use monitoring to track implementation efficiency (input → output)
- Use impact evaluation to measure effectiveness (output → outcome)
[Diagram: results chain INPUTS ($$$) → OUTPUTS → OUTCOMES (behavior); MONITOR EFFICIENCY over inputs–outputs, EVALUATE EFFECTIVENESS over outputs–outcomes]
14
Descriptive analysis – Monitoring and process evaluation:
- Is the program being implemented efficiently?
- Is the program targeting the right population?
- Are outcomes moving in the right direction?
Causal analysis – Impact evaluation:
- What was the effect of the program on outcomes?
- How would outcomes change under alternative program designs?
- Is the program cost-effective?
15
- Are transfers to localities being delivered as planned? – M&E
- Does information reduce capture? – IE
- What are the trends in public procurement? – M&E
- Do contractual incentives increase timeliness in delivery of services? – IE
16
Why evaluate: babies & bath water
Uganda Community-Based Nutrition – a failed project:
- In year 3, communities stopped receiving funds
- Parliament reacted negatively
- The intervention was stopped
…but…
- There were strong impact evaluation results in years 1–2: children in the treatment group scored half a standard deviation better than children in the control group
- Recently, the Presidency asked to take a second look at the evaluation: saving the baby?
17
- Improve quality of programs: separate institutional performance from quality of intervention; test alternatives and inform design in real time
- Increase program effectiveness: answer the "so what" questions
- Build government institutions for evidence-based policy-making: plan for implementation of options, not solutions; find out what alternatives work best; adopt a better way of doing business and taking decisions
18
Effects of government programs – what each actor needs to know:
- PM/Presidency: communicate to constituencies (campaign promises) – accountability
- Treasury/Finance: allocate budget – cost-effectiveness of different programs
- Line ministries: deliver programs and negotiate budget (service delivery) – cost-effectiveness of alternatives and effect of sector programs
19
Shifting Program Paradigm
From: a program is a set of activities designed to deliver expected results; the program will either deliver or not.
To: a program is a menu of alternatives with a learning strategy to find out which work best; change the program over time to deliver more results.
20
From: retrospective, external, independent evaluation
- Top down
- Determines whether the program worked or not
To: prospective, internal, operationally driven impact evaluation (externally validated)
- Sets the program learning agenda bottom up
- Considers plausible implementation alternatives
- Tests them scientifically and adopts the best
- Provides just-in-time advice to improve the effectiveness of the program over time
21
- Question the design choices of the program: institutional arrangements, delivery mechanisms, packages, pricing/incentive schemes
- Use random trials to test alternatives (see the sketch below)
- Focus on short-term outcomes: take-up rates, use, adoption
- Follow-up data collection and analysis 3, 6, and 12 months after exposure
- Measure the impact of alternative treatments on short-term outcomes and identify the "best"
- Change the program to adopt the best alternative
- Start over
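As an illustration of the "test alternatives, pick the best" step, here is a minimal sketch (arm names and take-up numbers are hypothetical) that compares short-term take-up rates across randomly assigned program variants:

```python
import statistics

# Hypothetical take-up rates (share of eligible households enrolling within 3 months)
# observed in communities randomly assigned to three program variants.
arms = {
    "flat_subsidy":     [0.42, 0.39, 0.45, 0.41, 0.44],
    "matched_subsidy":  [0.55, 0.58, 0.52, 0.60, 0.57],
    "information_only": [0.31, 0.28, 0.35, 0.30, 0.33],
}

# Average take-up per arm; with randomized assignment, differences reflect the design choice.
means = {arm: statistics.mean(rates) for arm, rates in arms.items()}
for arm, mean in sorted(means.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{arm:>16}: average take-up = {mean:.2f}")

best = max(means, key=means.get)
print(f"Adopt '{best}' in the next implementation round, then test again.")
```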
22
- How much does the program deliver? Is it cost-effective?
- Use the most rigorous evaluation method possible
- Focus on higher-level outcomes: educational achievement, health status, income
- Measure the impact of the operation on its stated objectives and on a metric of common outcomes
- One-, two-, and three-year horizon
- Compare with results from other programs (see the sketch below)
- Inform the budget process and allocations
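To illustrate how results from different programs might be compared for budget decisions, here is a minimal sketch (program names, impacts, and costs are hypothetical) computing cost per unit of outcome gained:

```python
# Hypothetical impact evaluation results: impact on test scores (standard deviations)
# and cost per beneficiary (USD) for three programs pursuing the same objective.
programs = {
    "community_nutrition": {"impact_sd": 0.50, "cost_usd": 40.0},
    "school_grants":       {"impact_sd": 0.20, "cost_usd": 25.0},
    "teacher_incentives":  {"impact_sd": 0.15, "cost_usd": 8.0},
}

# Cost-effectiveness: dollars spent per 0.1 standard deviation of improvement.
for name, p in programs.items():
    cost_per_01sd = p["cost_usd"] / (p["impact_sd"] / 0.1)
    print(f"{name:>20}: ${cost_per_01sd:.2f} per 0.1 SD gained")
```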
23
- This is a technical assistance product to change the way decisions are taken
- It is about building a relationship
- Adds results-based decision tools to complement existing sector skills
- The relationship delivers not one but a series of analytical products
- Must provide useful (actionable) information at each step of the impact evaluation
24
Objective
- Build know-how in implementing agencies and work with Bank operations to generate operational knowledge
Programmatic Activities
- Annual workshops for skill development
- Community of practice for South-to-South learning
- Technical advisory group to assure quality of analytical work
In-Country Activities
- Technical support and field coordination through the project cycle
26
Thank you
Email: alegovini@worldbank.org
Web: www.worldbank.org/dime