Development Impact Evaluation in Finance and Private Sector

Workshop: Development Impact Evaluation in Finance and Private Sector
Dakar, February 2010, with generous support from the Gender Action Plan
Steps in Implementing an Impact Evaluation
Arianna Legovini, Head, Development Impact Evaluation Initiative, The World Bank

Steps
- Build capacity
- Set learning agenda
- Design impact evaluation
- Plan for IE implementation
- Conduct baseline
- Roll out intervention
- Collect follow-up data
- Analyze data
- Feed results into policy

Step 1: Build capacity for IE
- Objectives:
  - Become informed consumers of impact evaluation
  - Set the learning agenda
  - Use IE as an internal management tool to improve the program over time
- How:
  - Training
  - Learning by doing

Step 2: Set the learning agenda
- Objective:
  - Get answers to relevant policy and operational questions
- How:
  - Dialectic discussion involving key policy makers and program managers
  - Technical facilitation to structure the framework of analysis
  - Focus on a few critical policy (what) and operational (how-to) questions
  - Discuss the agenda with the authorizing environment and constituencies

Cont. Step 2: Questions
- Operational questions: design choices of the program
  - Institutional arrangements, delivery mechanisms, packages, pricing/incentives
  - Management purpose
  - Use randomized trials to test alternatives
  - Measure effects on short-term outcomes (months): take-up rates, use, adoption
  - Scale up the better implementation modalities
- Policy questions: effectiveness of the program
  - Accountability purpose
  - Use random assignment or the next-best method
  - Measure effects over the medium to long term
  - Scale up/down, negotiate the budget, inform
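Comparing design alternatives on short-term outcomes can be sketched as take-up rates per experimental arm (a minimal illustration; the arm labels and data below are hypothetical, not from any actual program):

```python
def take_up_rate(adopted):
    """Share of assigned units that actually took up the intervention."""
    return sum(adopted) / len(adopted)

# Hypothetical short-term take-up under two delivery mechanisms:
arm_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # e.g. door-to-door delivery
arm_b = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]  # e.g. central pick-up point
print(take_up_rate(arm_a), take_up_rate(arm_b))  # 0.7 0.3
```

With randomized assignment of delivery mechanisms, the gap in take-up rates is itself an unbiased estimate of the operational effect, which is why such short-horizon trials can feed back into program management within months.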

Step 3: Design the IE
- Exploit opportunities:
  - Will roll-out take time?
  - Is the allocated budget insufficient to cover everyone?
  - Are there quantitative eligibility rules?
  - If the program has universal access, does it have imperfect take-up?
- Set the scale:
  - Pilot to try out an intervention
  - Large scale with representative sample: more costly, externally valid
  - Large scale with purposeful sample: less costly, indicative
  - Do a power calculation to determine the minimum sample size
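The power calculation can be sketched with the standard normal approximation for a two-arm comparison of means (a simplification: real designs often need to account for clustering, attrition, or unequal arms):

```python
from math import ceil
from statistics import NormalDist

def min_sample_per_arm(effect_size, alpha=0.05, power=0.80):
    """Minimum units per arm to detect a given effect size (in standard
    deviation units) in a two-sided, two-sample comparison of means with
    equal arms, using the normal approximation."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # critical value for the two-sided test
    z_power = z(power)          # quantile giving the desired power
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# Detecting a 0.2 SD effect at 5% significance and 80% power:
print(min_sample_per_arm(0.2))  # 393 units per arm
```

Note how the required sample grows with the inverse square of the effect size: halving the detectable effect roughly quadruples the sample, which is why pilots aimed at small effects are rarely informative.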

Cont. Step 3
- Select the "best" method for each of your questions:
  - Feasible
  - Requires the fewest assumptions
- Ethics:
  - Do not deny access to something for which there is irrefutable evidence of benefit
  - Test interventions before scale-up when you have no solid evidence

Step 4: Planning implementation
- Budget cost items:
  - Staff time (PROJECT FUNDS) and training (DIME)
  - Analytical services and field coordination (DIME)
  - Data collection (PROJECT FUNDS)
  - Discussions and dissemination (shared)
- Timeline:
  - Use it to organize activities and responsibilities, and work backwards to know when to start
- Team:
  - Government (program manager, economist/statistician)
  - WB project team (task manager or substitute)
  - Research team (lead researcher, co-researchers, field coordinator)
  - Data collection agency

Step 5: Assignment to treatment and control
- The smallest unit of assignment is the unit of intervention:
  - Credit: individual or group
  - SME services: enterprise
  - Municipal registration system: municipality
- Create a listing of treatment units assigned to the intervention and control units that are not
- Explain the assignment to responsible parties to avoid contamination
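Creating such a listing can be sketched as a reproducible random split (a minimal version assuming a simple unclustered design with a 50/50 split; the enterprise IDs are hypothetical):

```python
import random

def assign_units(units, seed=2010):
    """Shuffle the listing of intervention units reproducibly (fixed seed)
    and split it into equal treatment and control groups."""
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical enterprise IDs for an SME services intervention:
enterprises = [f"sme_{i:03d}" for i in range(100)]
treatment, control = assign_units(enterprises)
print(len(treatment), len(control))  # 50 50
```

Fixing the seed matters operationally: the same listing can be regenerated and shared with responsible parties, which helps document the assignment and guard against contamination.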

Step 6: Baseline data
- Quality assurance: the IE team (not the data collection agency) should
  - Design the questionnaire and sample
  - Define terms of reference for the data collection agency
  - Train enumerators
  - Conduct a pilot
  - Supervise data collection
- Do not collect data before your design is ready and agreed

Cont. Step 6: Baseline data
- Contract a data collection agency:
  - Bureau of Statistics: integrate with existing data
  - Ministry concerned: Ministry of Agriculture/Water Resources/Rural Development
  - Private agency
- Analyze baseline data and feed back into program and evaluation design if needed
- Check for balance between treatment and control groups: do they have similar average characteristics?
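The balance check can be sketched as a normalized difference in group means for each baseline characteristic (one common rule of thumb treats values under roughly 0.25 as acceptable, though thresholds vary; the firm-size numbers below are hypothetical):

```python
from statistics import mean, stdev

def normalized_difference(treat, ctrl):
    """Difference in group means of a baseline characteristic, scaled by
    the pooled standard deviation; values near zero indicate balance."""
    pooled_sd = ((stdev(treat) ** 2 + stdev(ctrl) ** 2) / 2) ** 0.5
    return (mean(treat) - mean(ctrl)) / pooled_sd

# Illustrative baseline firm sizes (hypothetical numbers):
treat_size = [10, 12, 11, 13, 9]
ctrl_size = [11, 10, 12, 9, 13]
print(round(normalized_difference(treat_size, ctrl_size), 3))  # 0.0
```

Repeating this for every key baseline variable gives a simple balance table; large imbalances at baseline suggest a problem with the assignment or the listing rather than with the outcome analysis.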

Step 7: Roll out the intervention
- Conduct intensive monitoring of the roll-out to ensure the evaluation is not compromised:
  - What if both treatment and control receive the intervention?
  - What if all of the control group receive some other intervention?

Step 8: Follow-up data
- Collect follow-up data with the same sample and questionnaire as the baseline
- Collect at appropriate intervals

Step 9: Estimate program effects
- Randomization: compare average outcomes for the treatment and control groups
- Other methods: use relevant econometric analysis, test assumptions, check robustness
- Are the effects statistically significant?
  - A basic statistical test tells whether differences are due to the program or to noisy data
- Are they significant in real terms?
  - If a program is costly and its effects are small, it may not be worthwhile
- Are they sustainable?
  - Is the trajectory of results sustained?
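Under randomization, the basic comparison can be sketched as a difference in means with a large-sample test (a normal approximation; real analyses would typically use regression with baseline controls and, where relevant, clustered standard errors; the outcome data below are synthetic):

```python
from math import erf, sqrt
from statistics import mean, variance

def estimate_effect(treat, ctrl):
    """Difference in mean outcomes and a two-sided p-value from a
    large-sample z-test (unequal variances allowed)."""
    effect = mean(treat) - mean(ctrl)
    se = sqrt(variance(treat) / len(treat) + variance(ctrl) / len(ctrl))
    z = effect / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return effect, p_value

# Synthetic outcomes: the treatment arm is shifted up by one unit on average.
ctrl_y = [i % 10 for i in range(200)]
treat_y = [i % 10 + 1 for i in range(200)]
effect, p = estimate_effect(treat_y, ctrl_y)
print(round(effect, 2), p < 0.05)  # 1.0 True
```

The p-value answers only the "statistically significant" question; judging whether a one-unit effect is significant in real terms still requires comparing it against program costs and the policy alternative.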

Step 10: Discuss, disseminate and feed back into policy
- Are you thinking about this only now?
- Discuss the policy implications of the results:
  - What actions should be taken?
  - How should they be presented to higher-ups to justify changes/budget/scale-up?
- Talk to policy makers and disseminate to a wider audience:
  - If no one knows about it, it won't make a difference
  - Make sure the information gets into the right policy discussions
- Channels: real-time discussions, workshops, reports, policy briefs

Final step: Iterate
- What do you need to learn next?