
Impact Evaluation for Real Time Decision Making
PREM Week, April 2009
Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank

 Can leadership and motivation alone break the poverty trap? – Rwanda rapid results initiative
 Can participation change local norms and institutions? – Sierra Leone institutional reform & capacity building (GoBifo)
 What type of information can enhance local accountability? – Uganda schools’ budget
 What is the best way to select local projects? – Indonesia direct voting versus representatives’ decisions
 Do audits increase accountability in public administration? – Indonesia, Brazil

 These are difficult questions…
 We turn to our best judgment for guidance and pick an incentive, a subsidy level, a voting scheme, a package of services…
 Is there any other incentive, subsidy, scheme or package that would do better?

 A few big decisions are taken during design, but many more are taken during roll-out and implementation

[Decision tree: Road Authority procurement of rural roads O&M]
 Contract size: small contracts vs. large contracts
 Contract type: fixed-cost vs. adjustable
 Engineering costs: conceal vs. advertise
 Renegotiation: allow vs. deny

 Establish which decisions will be taken up front and which will be tested during roll-out
 Scientifically test critical nodes: measure the impact of one option relative to another, or to no intervention
 Pick the better option and discard the worse during implementation
 You cannot learn everything at once
 Select carefully what you want to test by involving all relevant partners

Application of the scientific method to understand and measure behavioral response
 Hypothesis
▪ If we subdivide contracts, more companies will bid for each contract
 Testing
▪ Randomly assign which procurements will be subdivided into smaller contracts, and compare the number of bidders and the bidding price relative to engineering costs
 Observations
▪ Smaller contract size increases the number of bidders
▪ Bidding prices fall
▪ Costs of administering contracts rise
 Conclusion
▪ Smaller contract size increases competition in the bidding process
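A minimal sketch of what this test looks like in practice, assuming hypothetical lot counts and outcome data rather than the actual procurement records:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical set of procurement lots eligible for the pilot
n_lots = 200

# Randomly assign half of the lots to be subdivided into smaller contracts
subdivided = rng.permutation(n_lots) < n_lots // 2

# Hypothetical outcomes: number of bidders per lot (in a real evaluation
# these would come from the procurement records, not simulation)
n_bidders = np.where(subdivided,
                     rng.poisson(6, n_lots),   # subdivided (treated) lots
                     rng.poisson(4, n_lots))   # standard (comparison) lots

# Compare the average number of bidders across the two arms
treated_mean = n_bidders[subdivided].mean()
comparison_mean = n_bidders[~subdivided].mean()
print(f"Subdivided lots: {treated_mean:.2f} bidders on average")
print(f"Standard lots:   {comparison_mean:.2f} bidders on average")
print(f"Estimated effect of subdividing: {treated_mean - comparison_mean:.2f} bidders")
```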

[Decision tree: Road Authority procurement of rural roads O&M (contract size × contract type × engineering-cost disclosure × renegotiation), with evidence on selected nodes]
 In Oklahoma, publication of engineering costs lowered procurement costs by 4.6%
 In Andhra Pradesh, e-procurement increased bidder participation by 25%

What is Impact Evaluation?
Counterfactual analysis isolates the causal effect of an intervention on an outcome
 Effect of contract size on the number of bidders
 Effect of the renegotiation option on the bidding price
 Effect of information on price
 Ideally, we would compare the same individual with and without the option, information, etc. at the same point in time to measure the effect
 This is impossible
 Impact evaluation therefore uses large numbers of units (individuals, communities) to estimate the effect
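In standard potential-outcomes notation (a conventional formulation, not spelled out on the slide), the effect for unit $i$ is $Y_i(1) - Y_i(0)$, but at most one of the two is ever observed; with random assignment over many units, the average effect is estimated from group means:

$$\widehat{\text{ATE}} \;=\; \bar{Y}_{\text{treated}} - \bar{Y}_{\text{comparison}} \;\approx\; E[Y(1)] - E[Y(0)]$$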

How is this done?
 Select one group to receive the treatment (contract alternative, information…)
 Find a comparison group to serve as the counterfactual (other alternative, no information…)
 Use these counterfactual criteria:
▪ Treated and comparison groups have identical average initial characteristics (observed and unobserved)
▪ The only difference between them is the treatment
▪ Therefore any difference in outcomes is due to the treatment (or the differential treatment)
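A minimal sketch of how the first criterion can be checked on baseline data, assuming hypothetical community-level variables and random assignment (nothing here comes from an actual survey):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=0)

# Hypothetical baseline survey of 200 communities, half randomly treated
baseline = pd.DataFrame({
    "treated": rng.permutation([1] * 100 + [0] * 100),
    "avg_income": rng.normal(500, 80, 200),
    "school_enrollment": rng.uniform(0.4, 0.9, 200),
    "distance_to_road_km": rng.exponential(5.0, 200),
})

# Balance check: with random assignment, average baseline characteristics
# should be statistically indistinguishable across the two groups
balance = baseline.groupby("treated").mean().T
balance["difference"] = balance[1] - balance[0]
print(balance)
```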

How is monitoring different from impact evaluation?
The monitoring story
 In 1996, transfers to schools were 20% of the budget allocation
 Government started publishing school allocations in newspapers
 By 2001, transfers had jumped to 80% of the budget allocation, a gain of 60 percentage points (80 - 20)
[Chart: before (20%) vs. after (80%) the information campaign, change = 60 percentage points]

How is monitoring different from impact evaluation?
The impact evaluation story
 In 1996, a low-awareness year, transfers to schools were 20% of the budget allocation
 After the 1996 PETS (Public Expenditure Tracking Survey) results were published, there was much public discussion and budgets were published in local newspapers
 By 2001, transfers to schools close to newspaper outlets had increased to 80%
 Transfers to schools far from newspaper outlets increased to only 36%
 Newspaper information increased transfers by 44 percentage points (80 - 36)
[Chart: before/after transfers for schools near vs. far from newspaper outlets; information and other factors account for the rise to 36%, impact of information = 44 percentage points]
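Written as a worked calculation with the numbers on the slides (the difference-in-differences framing is implied rather than stated explicitly):

$$\text{naive before-after change} = 80 - 20 = 60 \text{ percentage points}$$
$$\text{impact of newspaper information} = 80 - 36 = 44 \text{ percentage points}$$

The rise to 36% in schools far from newspaper outlets reflects other factors at work over the same period; subtracting it isolates the effect of the information itself.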

How to use monitoring & impact evaluation?
 Use monitoring to track implementation efficiency (input → output)
 Use impact evaluation to measure effectiveness (output → outcome)
[Results chain: $$$ → INPUTS → OUTPUTS → OUTCOMES, with behavior driving outcomes; monitor efficiency from inputs to outputs, evaluate effectiveness from outputs to outcomes]

 Descriptive analysis: monitoring and process evaluation
▪ Is the program being implemented efficiently?
▪ Is the program targeting the right population?
▪ Are outcomes moving in the right direction?
 Causal analysis: impact evaluation
▪ What was the effect of the program on outcomes?
▪ How would outcomes change under alternative program designs?
▪ Is the program cost-effective?

 Are transfers to localities being delivered as planned? → M&E
 Does information reduce capture? → IE
 What are the trends in public procurement? → M&E
 Do contractual incentives increase timeliness in the delivery of services? → IE

Why evaluate: babies & bath water
Uganda Community-Based Nutrition
 A failed project
▪ In year 3, communities stopped receiving funds
▪ Parliament reacted negatively
▪ The intervention was stopped
 …but…
▪ Strong impact evaluation results in years 1-2
▪ Children in the treatment group scored half a standard deviation better than children in the control group
 Recently, the Presidency asked to take a second look at the evaluation: saving the baby?

 Improve the quality of programs
▪ Separate institutional performance from the quality of the intervention
▪ Test alternatives and inform design in real time
▪ Increase program effectiveness
▪ Answer the “so what” questions
 Build government institutions for evidence-based policy-making
▪ Plan for implementation of options, not solutions
▪ Find out which alternatives work best
▪ Adopt a better way of doing business and making decisions

[Diagram: how different levels of government use evidence on the effects of government programs]
 PM/Presidency: communicate to constituencies (campaign promises, accountability)
 Treasury/Finance: allocate budget (cost-effectiveness of different programs)
 Line ministries: deliver programs and negotiate budget (cost-effectiveness of alternatives and effect of sector programs)

Shifting the program paradigm
From:
 A program is a set of activities designed to deliver expected results
 The program will either deliver or not
To:
 A program is a menu of alternatives with a learning strategy to find out which work best
 Programs change over time to deliver more results

 From retrospective, external, independent evaluation
▪ Top down
▪ Determines whether the program worked or not
 To prospective, internal, operationally driven impact evaluation (externally validated)
▪ Sets the program learning agenda bottom up
▪ Considers plausible implementation alternatives
▪ Tests scientifically and adopts the best option
▪ Provides just-in-time advice to improve the effectiveness of the program over time

 Question the design choices of the program
▪ Institutional arrangements, delivery mechanisms, packages, pricing/incentive schemes
 Use randomized trials to test alternatives
 Focus on short-term outcomes
▪ Take-up rates, use, adoption
 Follow-up data collection and analysis
▪ Months after exposure
 Measure the impact of alternative treatments on short-term outcomes and identify the “best”
 Change the program to adopt the best alternative
 Start over
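A minimal sketch of the multi-arm comparison this cycle relies on, assuming hypothetical incentive-scheme arms and take-up data (arm names, sample sizes and rates are invented for illustration):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=1)

# Hypothetical trial: three alternative incentive schemes, randomly assigned
arms = ["flat_subsidy", "matched_subsidy", "voucher"]
n_per_arm = 300

data = pd.DataFrame({
    "arm": np.repeat(arms, n_per_arm),
    # Short-term outcome: whether the household took up the service (0/1)
    "take_up": np.concatenate(
        [rng.binomial(1, p, n_per_arm) for p in (0.35, 0.50, 0.42)]
    ),
})

# Compare take-up rates across arms and flag the best-performing alternative
summary = data.groupby("arm")["take_up"].agg(["mean", "count"])
summary["std_err"] = np.sqrt(summary["mean"] * (1 - summary["mean"]) / summary["count"])
print(summary.sort_values("mean", ascending=False))
print("Best-performing arm:", summary["mean"].idxmax())
```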

 How much does the program deliver? Is it cost-effective?
 Use the most rigorous evaluation method possible
 Focus on higher-level outcomes
▪ Educational achievement, health status, income
 Measure the impact of the operation on its stated objectives and on a metric of common outcomes
 One-, two-, three-year horizon
 Compare with results from other programs
 Inform the budget process and allocations
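One illustrative way to make the cost-effectiveness comparison concrete (a standard formulation, not given on the slide) is cost per unit of outcome gained:

$$\text{cost per unit of outcome} = \frac{\text{total program cost}}{\text{impact per beneficiary} \times \text{number of beneficiaries}}$$

expressed, for example, as dollars per additional year of schooling, so that differently designed programs can be ranked on a common metric when informing budget allocations.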

 This is a technical assistance product to change the way decisions are taken
 It is about building a relationship
 It adds results-based decision tools to complement existing sector skills
 The relationship delivers not one but a series of analytical products
 It must provide useful (actionable) information at each step of the impact evaluation

 Objective
▪ Build know-how in implementing agencies and work with Bank operations to generate operational knowledge
 Programmatic activities
▪ Annual workshops for skill development
▪ Community of practice for South-to-South learning
▪ Technical advisory group to assure the quality of analytical work
 In-country activities
▪ Technical support and field coordination throughout the project cycle

Thank you
Web: www.worldbank.org/dime