Development Impact Evaluation in Finance and Private Sector

Impact Evaluation for Real Time Decision-Making in Finance and Private Sector
Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank

Do we know…
- How much entry and employment do regulatory reforms and one-stop shops generate?
- What incentives can convince local politicians to implement the reforms?
- What information and services improve business organization and lower production costs?
- What networking and market-access structures lower firm death rates?
- What type of trade facilitation will increase exports and export diversification?

Trial and error
We turn to our best judgment for guidance and pick a package of services, access-to-credit mechanisms, registration rules, trade facilitation incentives…
Is there any other package, mechanism, rule or incentive that would do better?

The decision process is complex
Design → Early roll-out → Implementation
A few big decisions are taken during design, but many more are taken during roll-out and implementation.

Developing a decision tree for an SME support program…
SME services branch into:
- Promotion: public information campaign vs. door-to-door promotion
- Service mix: non-financial services vs. non-financial + financial services
- Training delivery: center-based training vs. on-site training vs. vouchers + information

How to select among plausible alternatives?
- Establish which decisions will be taken upfront and which will be tested during roll-out
- Experimentally test critical nodes: measure the impact of one option relative to another, or to no intervention
- Pick the better option and discard the worse during implementation
- You cannot learn everything at once: select carefully what you want to test by involving all relevant partners

Walk along the decision tree of your SME program to get more results
At each node (promotion, service mix, training delivery), compare options on take-up, use of services, and benefits from services, and keep the option that maximizes them.

How we support you

What is Impact Evaluation?
Impact evaluation measures the effect of an intervention on outcomes of interest relative to a counterfactual (what would have happened in the absence of the intervention).
It identifies the causal effect of the intervention on an outcome, separately from the effect of other time-varying conditions.

What is counterfactual analysis?
Ideally, we would compare the same individual with and without training, information, etc. at the same point in time to measure the effect.
This is impossible.
Impact evaluation therefore uses large numbers of entrepreneurs or municipalities to estimate the average effect.
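In standard potential-outcomes notation (a textbook formulation, not taken from the slides), the problem and its resolution look like this:

    % The individual-level effect is never observable, because each unit i
    % is seen either with the program, Y_i(1), or without it, Y_i(0):
    \Delta_i = Y_i(1) - Y_i(0)
    % Evaluation therefore targets the average effect across many units:
    \mathrm{ATE} = \mathbb{E}[Y(1)] - \mathbb{E}[Y(0)]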

What is a good counterfactual?
- The treated and control groups have identical observed and unobserved characteristics
- The only reason for the difference in outcomes is the intervention
How?
- Assign the intervention to some, but not other, eligible populations, on a random basis or on the basis of clear and measurable criteria
- Obtain a treatment group and a control group
- Measure and compare outcomes in those groups over time

Ethical considerations
- It is not ethical to deny benefits that are available and known to work (e.g., an HIV medicine proven to prolong life)
- It is ethical to test interventions before scale-up if we do not know whether they work or whether they have unforeseen consequences (e.g., food aid may impair local markets and create perverse incentives)
- Most of the time, we use opportunities created by roll-out and budget constraints to evaluate, so as to minimize ethical concerns

Methods (covered later and tomorrow)
Experimental (random assignment):
- Each unit has an equal chance of being in the treatment or the comparison group
- By design, treatment and comparison groups have the same characteristics (observed and unobserved), on average
- Simple analysis (a comparison of means) yields unbiased impact estimates; see the sketch below
Non-experimental (regression discontinuity, IV and encouragement designs, difference-in-differences):
- Require more assumptions, or may only estimate local treatment effects
- May suffer from unobserved-variable bias
- Use more than one method to check the robustness of results
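A minimal sketch of the experimental comparison of means, in Python. The data are simulated; the population size, outcome distribution, and true effect of 2.0 are illustrative assumptions, not figures from the presentation.

    import random
    import statistics

    random.seed(42)

    # Hypothetical eligible population of 1,000 entrepreneurs
    ids = list(range(1000))
    random.shuffle(ids)
    treatment = set(ids[:500])  # each unit has an equal chance of treatment

    def outcome(unit, treated):
        # Placeholder outcome (e.g., monthly profits), with an assumed
        # true program effect of 2.0
        return random.gauss(10.0, 3.0) + (2.0 if treated else 0.0)

    treated_y = [outcome(u, True) for u in ids if u in treatment]
    control_y = [outcome(u, False) for u in ids if u not in treatment]

    # With random assignment, a simple difference in means is an unbiased
    # estimate of the average treatment effect.
    impact = statistics.mean(treated_y) - statistics.mean(control_y)
    print(f"Estimated impact: {impact:.2f}")  # close to the assumed 2.0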

How is monitoring different from impact evaluation?
Monitoring is trend analysis: change over time, comparing results before (A, at t0) and after (B, at t1) in the "treated" group.
Impact evaluation measures change over time and relative to a comparison: results before and after in the "treated" group, relative to the "untreated" group. The impact is the gap at t1 between the observed outcome B and the counterfactual B' (where the treated group would have ended up without the intervention); a worked example follows below.
[Figure: outcome Y against time, showing points A, B and B' between t0 and t1, with the intervention at t0.]
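The two calculations side by side, with made-up group means (none of these numbers are from the presentation):

    # Monitoring: before-after change in the treated group only (A to B)
    treated_before, treated_after = 100.0, 130.0
    # Comparison ("untreated") group pins down the counterfactual trend
    control_before, control_after = 100.0, 115.0

    before_after = treated_after - treated_before            # 30.0
    # Impact evaluation (difference-in-differences): subtract the trend
    # the treated group would have followed anyway (A to B')
    counterfactual_trend = control_after - control_before    # 15.0
    impact = before_after - counterfactual_trend             # 15.0

    print(f"Before-after change: {before_after:.1f}")
    print(f"Difference-in-differences impact: {impact:.1f}")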

Monitoring & Impact Evaluation
- Use monitoring to track implementation efficiency (inputs → outputs)
- Use impact evaluation to measure effectiveness (outputs → outcomes)
[Diagram: $$$ → INPUTS → OUTPUTS → OUTCOMES; behavior drives the output-to-outcome link; monitor efficiency upstream, evaluate effectiveness downstream.]

Question types & methods
Monitoring and process evaluation (descriptive analysis):
- Is the program being implemented efficiently?
- Is the program targeting the right population?
- Are outcomes moving in the right direction?
Impact evaluation (causal analysis):
- What was the effect of the program on outcomes?
- How would outcomes change under alternative program designs?
- Is the program cost-effective?

Why Evaluate?
Improve the quality of programs:
- Separate institutional performance from the quality of the intervention
- Test alternatives and inform design in real time
- Increase program effectiveness
- Answer the "so what" questions
Build government institutions for evidence-based policy-making:
- Plan for the implementation of options, not solutions
- Find out which alternatives work best
- Adopt better ways of doing business and taking decisions

Institutional framework
- PM/Presidency: communicates to constituencies; campaign promises and accountability call for evidence on the effects of government programs
- Treasury/Finance: allocates the budget; needs evidence on the cost-effectiveness of different programs
- Line ministries: deliver programs and negotiate the budget; service delivery calls for evidence on the cost-effectiveness of alternatives and the effects of sector programs

Shifting Program Paradigm
From: a program is a set of activities designed to deliver expected results; the program will either deliver or not.
To: a program is a menu of alternatives with a learning strategy to find out which work best; programs change over time to deliver more results.

Shifting Evaluation Paradigm
From retrospective, external, independent evaluation:
- Top-down
- Determines whether the program worked or not
To prospective, internal, operationally driven impact evaluation (externally validated):
- Sets the program learning agenda bottom-up
- Considers plausible implementation alternatives
- Tests them scientifically and adopts the best
- Gives just-in-time advice to improve the effectiveness of the program over time

Retrospective (designed & evaluated ex-post) vs. prospective (designed ex-ante and evaluated ex-post)
Retrospective impact evaluation:
- Collecting data after the event, you do not know how participants and non-participants compared before the program started
- You have to try to disentangle why the project was implemented where and when it was, after the event
Prospective evaluation:
- Design the evaluation to answer the question you need to answer
- Collect the data you will need

Is this a one-shot analytical product? No: it must provide useful (actionable) information at each step of the impact evaluation.

Thank you
Financial support from [funder logo not captured in the transcript] is gratefully acknowledged.