Impact Evaluation for Real Time Decision Making


Impact Evaluation for Real Time Decision Making
Arianna Legovini
Head, Development Impact Evaluation Initiative (DIME) & Africa Impact Evaluation Initiative (AIM)
World Bank

Do we know…
– What information and services will improve market conditions for farmers? (India soy, Kenya horticulture)
– What payment system will secure the financial sustainability of irrigation schemes? (Ethiopia irrigation)
– What incentives will foster a sustainable use of land and water resources? (Nigeria Fadama)
– What type of information will enhance local accountability? (Uganda schools' budget)
– What is the best way to select local projects? (Indonesia: direct voting versus representatives' decisions)
– Will local workforce participation improve construction and maintenance of local investments? (Afghanistan road construction)

Of course we know! We are the experts. Yet these are difficult questions… We turn to our best judgment for guidance and pick a subsidy level, a voting scheme, a package of services… Is there any other subsidy, scheme or package that would do better?

The decision process is complex: design, early roll-out, implementation. A few big decisions are taken during design, but many more are taken during roll-out and implementation.

Developing a decision tree for an irrigation scheme…
– Who builds and operates: built and operated by a large private operator / built by a private construction company and operated by a user consortium / built and operated by government
– How users are organized: sub-schemes organized around farmer associations / new user associations established
– How water payments are collected: independently collected / subtracted from crop sales / collected through taxation

How to select between plausible alternatives?
– Establish which decisions will be taken upfront and which will be tested during roll-out
– Scientifically test critical nodes: measure the impact of one option relative to another, or to no intervention
– Pick the better options and discard the worse during implementation
– You cannot learn everything at once: select carefully what you want to test, involving all relevant partners

Walk along the decision tree for your irrigation scheme to get more results:
– Who builds and operates: built and operated by a large private operator / built by a private construction company and operated by a user consortium / built and operated by government
– How users are organized: sub-schemes organized around farmer associations / new user associations established
– How water payments are collected: independently collected / subtracted from crop sales / collected through taxation

Impact evaluation: application of the scientific method to understand and measure human behavior
– Hypothesis: if we subsidize fertilizer, then farmers will use more fertilizer and increase production
– Testing: provide different subsidy levels and fertilizer quantities to different farmers and compare fertilizer use and productivity
– Observations: fertilizer use increases at a decreasing rate with the subsidized price; production increases and then declines with fertilizer quantities
– Conclusion: there is a subsidy level that maximizes use and an optimal quantity that maximizes production
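As a rough illustration of the testing and observation steps, the sketch below compares mean fertilizer use and yield across randomly assigned subsidy levels. The file name and columns (subsidy_pct, fertilizer_kg, yield_kg_ha) are hypothetical, not from the program described above.

```python
# Illustrative sketch only: hypothetical survey file and column names.
import pandas as pd

# One row per farmer: assigned subsidy level, fertilizer used, and yield.
df = pd.read_csv("farmer_survey.csv")

summary = (
    df.groupby("subsidy_pct")[["fertilizer_kg", "yield_kg_ha"]]
      .mean()
      .sort_index()
)
print(summary)
# If use rises at a decreasing rate and yield eventually falls as quantities
# grow, the table points to the subsidy level and quantity worth adopting.
```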

What does impact mean? The word impact is often misused as a synonym for a higher-level outcome. Impact originally means "the effect of something on something else". Here, impact is the effect of the intervention on the outcome: the portion of the observed change in an outcome that is caused by the intervention of interest.

What is impact evaluation? Counterfactual analysis isolates the causal effect of an intervention on an outcome:
– the effect of a subsidy on fertilizer use
– the effect of information on market prices
Ideally we would compare the same individual with and without the subsidy, information, etc. at the same point in time, but this is impossible. Impact evaluation therefore uses large numbers (of farmers, communities) to estimate the effect.

How is this done?
– Select one group to receive the treatment (subsidy, information…)
– Find a comparison group to serve as the counterfactual, using these criteria:
– Treated and comparison groups have identical average initial characteristics (observed and unobserved)
– The only difference between them is the treatment
– Therefore the only possible reason for a difference in outcomes is the treatment
Random assignment is one way to satisfy these criteria, as in the sketch below.
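A minimal sketch, assuming treatment can be randomly assigned, of how these criteria translate into an estimate; all names here are illustrative rather than part of the slide.

```python
import random

def assign(unit_ids, seed=1234):
    """Randomly split eligible units into treatment and comparison groups,
    so the two groups have the same average characteristics in expectation."""
    rng = random.Random(seed)
    shuffled = list(unit_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def estimated_impact(outcome, treated, comparison):
    """Difference in mean outcomes; with random assignment, this difference
    can be attributed to the treatment."""
    mean = lambda ids: sum(outcome[i] for i in ids) / len(ids)
    return mean(treated) - mean(comparison)

# Usage with outcome data keyed by unit id (hypothetical):
# treated, comparison = assign(range(1000))
# effect = estimated_impact(outcomes, treated, comparison)
```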

How is monitoring different from impact evaluation?
– Monitoring is trend analysis: change over time; compare results before and after in the "treated" group
– Impact evaluation measures change over time and relative to a comparison: compare results before and after in the "treated" group, and relative to the "untreated" group
[Figure: outcome Y over time, from t0 (before, point A) to t1 (after, point B); the full before-after change in the treated group versus the portion of that change, measured against the counterfactual B', that is the impact]
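A toy numerical contrast between the two calculations on this slide; the numbers are invented for illustration.

```python
def before_after(treated_before, treated_after):
    """Monitoring: change over time in the treated group only."""
    return treated_after - treated_before

def difference_in_differences(treated_before, treated_after,
                              comparison_before, comparison_after):
    """Impact evaluation: the treated group's change net of the change
    that also occurred in the comparison group."""
    return (treated_after - treated_before) - (comparison_after - comparison_before)

# Invented example: yields rose from 1.0 to 1.4 t/ha in treated villages,
# but also from 1.0 to 1.2 t/ha in comparison villages (e.g. good rains).
print(before_after(1.0, 1.4))                         # 0.4: overstates the impact
print(difference_in_differences(1.0, 1.4, 1.0, 1.2))  # 0.2: the impact
```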

Monitoring & impact evaluation
– Monitoring tracks implementation efficiency (inputs to outputs)
– Impact evaluation measures effectiveness (outputs to outcomes), which runs through behavior
[Diagram: $$$, inputs, outputs, outcomes as a results chain; "monitor efficiency" spans inputs to outputs, "evaluate effectiveness" spans outputs to outcomes]

Question types and methods
Monitoring and process evaluation (descriptive analysis):
– Is the program being implemented efficiently?
– Is the program targeting the right population?
– Are outcomes moving in the right direction?
Impact evaluation (causal analysis):
– What was the effect of the program on outcomes?
– How would outcomes change under alternative program designs?
– Is the program cost-effective?

When would you use M&E and when IE?
– Are grants to communities being delivered as planned? (M&E)
– Does participation reduce elite capture? (IE)
– What are the trends in agricultural productivity? (M&E)
– Does agricultural extension increase technology adoption? (IE)

Separate performance from quality of intervention: babies & bath water
Uganda Community-Based Nutrition, a "failed" project:
– The project ran into financial difficulties
– Parliament reacted negatively
– The intervention was stopped
…but there were strong impact evaluation results:
– Children in the treatment group scored half a standard deviation better than children in the control group
– Recently, the Presidency asked to take a second look at the evaluation: saving the baby?

Why evaluate?
Improve the quality of programs:
– Separate institutional performance from the quality of the intervention
– Test alternatives and inform design in real time
– Increase program effectiveness
– Answer the "so what" questions
Build government institutions for evidence-based policy making:
– Plan for implementation of options, not solutions
– Find out which alternatives work best
– Adopt a better way of doing business and taking decisions

Institutional framework
– PM/Presidency: communicate to constituencies (campaign promises, accountability); needs evidence on the effects of government programs
– Treasury/Finance: allocate the budget; needs evidence on the cost-effectiveness of different programs
– Line ministries: deliver programs and negotiate the budget (service delivery); need evidence on the cost-effectiveness of alternatives and the effects of sector programs

Shifting program paradigm
From: a program is a set of activities designed to deliver expected results; the program will either deliver or not
To: a program is a menu of alternatives with a learning strategy to find out which works best; change the program over time to deliver more results

Shifting evaluation paradigm
From retrospective, external, independent evaluation:
– Top down
– Determines whether the program worked or not
To prospective, internal, operationally driven (and externally validated) impact evaluation:
– Sets the program learning agenda bottom up
– Considers plausible implementation alternatives
– Tests them scientifically and adopts the best
– Gives just-in-time advice to improve the effectiveness of the program over time

Internal and operationally driven impact evaluation
Bottom-up IE requires capacity development in implementing agencies:
– Some formal training
– Mainly application and learning by doing, by being part of the evaluation team
Objective:
– Use impact evaluation as an internal and routine management tool
– Secure policy feedback

Operational questions: managing for results
– Question the design choices of the program: institutional arrangements, delivery mechanisms, packages, pricing/incentive schemes
– Use randomized trials to test alternatives
– Focus on short-term outcomes: take-up rates, use, adoption
– Collect and analyze follow-up data 3, 6, and 12 months after exposure
– Measure the impact of alternative treatments on short-term outcomes and identify the "best" (see the sketch after this list)
– Change the program to adopt the best alternative
– Start over
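A hedged sketch of the "measure, identify the best, adopt" step referenced in the list above; the follow-up file, the arm column, and the adopted indicator are assumptions for illustration.

```python
import pandas as pd

# One row per farmer from the 3-6-12 month follow-up survey (hypothetical):
#   arm      - which design alternative the farmer was randomly offered
#   adopted  - 1 if the farmer took up the technology, else 0
df = pd.read_csv("followup_survey.csv")

take_up = df.groupby("arm")["adopted"].mean().sort_values(ascending=False)
print(take_up)  # arms ranked by take-up rate; the leader is the candidate to scale
```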

Policy questions: accountability
– How much does the program deliver? Is it cost-effective?
– Use the most rigorous method of evaluation possible
– Focus on higher-level outcomes: educational achievement, health status, income
– Measure the impact of the operation on its stated objectives and on a metric of common outcomes
– One-, two-, or three-year horizon
– Compare with results from other programs
– Inform the budget process and allocations

Is this a one-shot analytical product? No: it is a technical assistance product to change the way decisions are taken
– It is about building a relationship
– It adds results-based decision tools to complement existing sector skills
– The relationship delivers not one but a series of analytical products
– It must provide useful (actionable) information at each step of the impact evaluation