How to evaluate the ultimate impact of value chain interventions? A mixed-methods design for attributing changes in farmers' income to indirect interventions: the case of maize in Bangladesh

Presentation transcript:

How to evaluate the ultimate impact of value chain interventions? A mixed-methods design for attributing changes in farmers' income to indirect interventions. The case of maize in Bangladesh. Gideon Kruseman and Shovan Chakraborty, 25 March 2013

Overview
• Case
• Evaluation questions
• Design (conference theme)
• Impact logics
• Mixed methods design
• Results
• Conclusions
• Communication and results (conference theme)
• Lessons learned

Case description
• 2004: Katalyst introduced maize through
  ● retailer training
  ● extension officer training
  ● contract farming
• 2011: an impact evaluation indicated a strong income impact, but the method lacked before-after and with-without comparisons
• 2012: a new impact evaluation based on rigorous scientific methods

Questions
1. What ultimate impact can be expected 8 years later?
2. How can that impact be measured?
  ● No baseline is possible in a value chain intervention (it is uncertain ex ante who will benefit)
  ● A comparison group is difficult to determine because of notorious spill-over effects
3. What is the impact?
  ● Outreach
  ● Impact
  ● Attribution
Q3 is challenging -> mixed methods
Q1 and Q2 are uncommon -> multi-disciplinary team with knowledge of value chains

Design of the IE
What expectations underpinned the design choices?
1. There is impact that is attributable to the intervention.
2. Measurement of impact is limited by the lack of a baseline, the absence of clean control groups and many external factors.
3. Attribution is strongest low in the impact chain (with retailers, officials and companies, not with farmers).
4. If the intervention is unique, attribution is easier to establish.

Impact logics and the IE (Q1)

Design (Q2)
• Approach
  ● For each question, test the assumptions of the impact logic at every step
• Core methods
  ● Large-n farmer survey data
  ● Small-n in-depth interviews with farmers, contractors and retailers, both treated and non-treated
  ● Maize sector study
• Analysis
  ● Production cost-revenue analysis (see the sketch below)
  ● Factor analysis
  ● Qualitative analysis of the in-depth interviews
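The production cost-revenue step can be illustrated with a minimal sketch. This is not the study's code; the column names, cohorts and values are hypothetical and only show the kind of per-farmer margin and yield comparison a large-n survey supports.

```python
# Hedged sketch of a production cost-revenue analysis on a farmer survey.
# Column names, cohorts and values are hypothetical, not the study's data.
import pandas as pd

# One row per surveyed farmer-plot.
survey = pd.DataFrame({
    "cohort":         ["contract", "contract", "copy", "benchmark"],
    "area_dec":       [30.0, 45.0, 20.0, 25.0],      # plot size (decimals)
    "yield_maund":    [90.0, 140.0, 55.0, 50.0],     # maize harvested (maund)
    "price_bdt":      [900.0, 900.0, 880.0, 870.0],  # farm-gate price (BDT/maund)
    "input_cost_bdt": [18000.0, 26000.0, 11000.0, 12000.0],
})

# Gross revenue, net margin and yield per decimal for each plot.
survey["revenue_bdt"] = survey["yield_maund"] * survey["price_bdt"]
survey["net_margin_bdt"] = survey["revenue_bdt"] - survey["input_cost_bdt"]
survey["yield_per_dec"] = survey["yield_maund"] / survey["area_dec"]

# Cohort comparison: do contract and copying farmers out-perform the benchmark?
print(survey.groupby("cohort")[["yield_per_dec", "net_margin_bdt"]].mean())
```

Grouping by cohort in this way mirrors the quantitative cohort comparison described on the mixed-methods slide below.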

Assumptions in impact logic (Q1&2)

Mixed methods (Q3)
• What outreach?
  ● In-depth interviews with contractors
  ● Validation with in-depth farmer interviews
• What income effects?
  ● Large-n farmer survey for production cost analysis and a quantitative cohort comparison
  ● Validation with in-depth farmer interviews
• To what extent can impact be attributed?
  ● Qualitative analysis of all in-depth interviews
  ● Factor analysis of the large-n survey (see the sketch below)
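The factor-analysis step can be sketched as follows. This is not the study's code: the information-source variables, the ratings and the number of factors are assumptions, chosen only to show how factor loadings can reveal which knowledge sources cluster together in a large-n survey.

```python
# Hedged sketch of a factor analysis of information sources in a farmer survey.
# All variables and the (random) ratings are purely illustrative.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300  # illustrative number of respondents

# Hypothetical 1-5 ratings: how much each source informed maize cultivation.
sources = pd.DataFrame({
    "contractor": rng.integers(1, 6, n),
    "retailer":   rng.integers(1, 6, n),
    "extension":  rng.integers(1, 6, n),
    "neighbour":  rng.integers(1, 6, n),
    "radio_tv":   rng.integers(1, 6, n),
})

X = StandardScaler().fit_transform(sources)
fa = FactorAnalysis(n_components=2, random_state=0).fit(X)

# Loadings: sources with high weights on the same factor move together,
# which is what makes singling out one source's contribution difficult.
loadings = pd.DataFrame(fa.components_.T, index=sources.columns,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))
```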

Results: outreach
[Table: farmers affected by contract farming — direct and indirect outreach by area (char new, char old, mainland new, mainland old) and in total; the figures are not preserved in the transcript]

Results: income effect
Relevant assumption to be tested: farmers have higher yields than the benchmark.

Columns: char new | char old | mainland new | mainland old | copy
Yield (maund/dec): values not preserved in the transcript
Total revenue impact, not considering land size change (BDT): 17,410 | 42,980 | 1,938 | 3,879 | 7,947
Total revenue impact, including land size change (BDT): 38,523 | 80,757 | 4,742 | 10,399 | 16,902
Total income increase, 1 year after intervention (BDT): 33,399,450 | 177,908,515 | 4,580,293 | 18,230,162 | 395,988,876
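The slides do not show how the per-farmer revenue impacts were computed. A plausible decomposition, offered only as an assumption and not as the study's actual formula, separates the pure yield effect from the effect of any change in maize area:

```latex
% Hedged sketch; the symbols are assumptions, not the study's notation.
\Delta R_{\text{yield}} = \bigl(y_{\text{maize}} - y_{\text{bench}}\bigr)\, p \, A_0 ,
\qquad
\Delta R_{\text{total}} = y_{\text{maize}}\, p \, A_1 - y_{\text{bench}}\, p \, A_0
```

where y is yield (maund/dec), p the farm-gate maize price (BDT/maund), A_0 the original cultivated area and A_1 the area after any land-size change; summing the per-farmer impacts over the farmers reached in each cohort gives totals of the kind reported in the last row of the table.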

Results: contribution of Katalyst
• Contract farming started as a result of Katalyst interventions.
  ● The only contractors are those involved in the intervention
  ● These contractors have a growing number of contract farmers
• Conclusion: true
• Knowledge passed on through Katalyst training of contractors is crucial for contractors.
  ● Knowledge comes from many sources, including Katalyst
• Conclusion: true / false

Results: contribution of Katalyst
• Knowledge passed on from contractors to farmers is crucial for farmers.
  ● Knowledge comes from many sources, including contractors
  ● Contractors are the main source of information
• Conclusion: partly true
• Service provision by contractors is important to contract farmers.
• Conclusion: true
• There is a spill-over effect of maize cultivation from contract farmers to neighbouring farmers.
• Conclusion: true

Results: contribution of Katalyst
• Knowledge passed on through Katalyst training of retailers is crucial for retailers.
  ● Knowledge comes from many sources, including Katalyst
• Conclusion: true / false
• Knowledge passed on from retailers to farmers is crucial for farmers.
  ● Knowledge comes from many sources, including retailers
• Conclusion: true / false

Conclusions
• There is an impact of contract farming, especially because of the service provision specifically tied to this production form.
• The impact of contract farming is 100% attributable to Katalyst.
• Knowledge on maize cultivation comes from many sources; knowledge is vitally important, but its effect cannot be attributed to any single information source.
• There is a contribution to the knowledge base through Katalyst interventions.

Communication and use of the IE
Communication is focussed on:
• underpinning and justifying the estimated impact
• identifying to which interventions this impact can be attributed
  ● some interventions (contract farming) have attributable impact
  ● some interventions (retailer training) do not
Results are used for:
• fine-tuning current interventions
• designing improved monitoring of interventions and of immediate and intermediate results, so that future impact evaluations can be more robust (counterfactual analysis; see the sketch below)
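For the improved monitoring and future counterfactual analysis mentioned above, a before-after, with-without comparison is the natural template. The sketch below is only an illustration with hypothetical data; it is not part of the Katalyst evaluation.

```python
# Hedged sketch of a before-after, with-without (difference-in-differences)
# comparison that baseline monitoring data would make possible.
# The panel data below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.DataFrame({
    "farmer_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "treated":   [1, 1, 1, 1, 0, 0, 0, 0],   # reached by the intervention
    "post":      [0, 1, 0, 1, 0, 1, 0, 1],   # observed after the intervention
    "income":    [20_000, 36_000, 22_000, 40_000, 21_000, 25_000, 19_000, 24_000],
})

# Under the parallel-trends assumption, the treated:post coefficient estimates
# the income change attributable to the intervention.
did = smf.ols("income ~ treated * post", data=panel).fit()
print(did.params["treated:post"])
```

With a baseline collected at the start of an intervention, this kind of estimate supplies exactly the before-after and with-without comparison that the 2011 evaluation lacked.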

Lessons learned
• Importance of an impact logic framework
• Importance of defining hypotheses based on the impact logic
• Importance of a design based on those hypotheses, and of applying mixed methods to overcome threats to the validity of conclusions
• Separation of impact and attribution/contribution
• Need to inform donors about the costs and benefits of IE

END Thank you for your attention.