Presentation transcript:

Evaluation of Value Chain Interventions: the LEI approach in practice
Giel Ton, Agricultural Economics Research Institute - LEI Wageningen UR
The Hague, 3 November 2011

Introduction
Each researcher at LEI has their own expertise and methodological wish-list, so there are multiple ways to do evaluations:
- Models/scenarios
- Econometrics
- Case studies
- Stakeholder processes
Steps taken in the Impact Evaluation Theme-group at LEI:
- We adopted theory-based evaluations
- We improve our research designs through a peer-to-peer process of quality checks and mixed-method design
- We generated a track record of credible and rigorous methods (LEI's 'selling point')

Improving rigour

Core steps
Making value chain impacts researchable:
- Define what is considered to be the 'intervention'
- Define the relevant 'outcome indicators' (what is the 'intervention logic'?)
- Choose a core design (considering the 'counterfactual')
- Find additional methods to decrease validity threats
- Anticipate possible implementation failures of the core method

Making impacts of interventions researchable

Checking for validity threats
We propose checking the design of the core research method against the most obvious threats to validity, approaching the issue from four different angles:
a) Statistical conclusion validity: when using statistics, do it properly
b) Internal validity: resolve the issue of causality/attribution
c) Construct validity: are the concepts used properly defined and operationalized?
d) External validity: under what conditions/settings does the conclusion/recommendation apply?
Source: Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston, MA: Houghton Mifflin.
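One concrete instance of check (a) is verifying, before fieldwork, that the planned sample is large enough to detect the effect the intervention logic predicts. A minimal sketch in Python; the effect size and thresholds are illustrative assumptions, not numbers from the LEI studies:

```python
# Power check for a two-group comparison: how many respondents per group
# are needed to detect an assumed effect? Illustrative numbers only.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
n_per_group = power.solve_power(effect_size=0.3,  # assumed small-to-medium effect
                                alpha=0.05, power=0.8)
print(f"Required respondents per group: {n_per_group:.0f}")
```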

Income impacts of micro-irrigation technology

Intervention Logic

Method
Intervention: micro-irrigation technology for horticulture
Core method: 'pipeline design' with a retrospective baseline
- Comparing income streams between yearly customer cohorts
- Asking in each interview about the respondent's agricultural system 'before' and 'after' adoption
Added mixed methods:
- On non-monetary outcomes ('wellbeing'): livelihood impact case studies
- On context: sector studies on dynamics in markets and the institutional environment
- On methodological assumptions:
  - Recall bias: repeating measurements in the same households with different recall periods
  - Selection bias: applying a matching procedure to reduce context differences between respondents in the cohorts that are compared
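To make the cohort comparison and the matching step concrete, a minimal sketch under assumed data; the file name, column names and cohort years are hypothetical, not LEI's actual code:

```python
# Minimal sketch of the 'pipeline design': early adopters of micro-irrigation
# are compared with recent customers still 'in the pipeline', after matching
# on pre-adoption characteristics to reduce context differences.
# File name and columns are hypothetical, not from the LEI study.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("drip_irrigation_customers.csv")
# Assumed columns: cohort_year, income, landholding_ha, household_size

early = df[df["cohort_year"] <= 2008]  # long-exposure cohort
late = df[df["cohort_year"] >= 2010]   # recent customers: 'pipeline' comparison

covariates = ["landholding_ha", "household_size"]
nn = NearestNeighbors(n_neighbors=1).fit(late[covariates].to_numpy())
_, idx = nn.kneighbors(early[covariates].to_numpy())

# Contrast each long-exposure farmer with the most similar recent customer:
# a rough estimate of the income effect of longer exposure to the technology.
matched_income = late["income"].to_numpy()[idx.ravel()]
effect = (early["income"].to_numpy() - matched_income).mean()
print(f"Matched cohort contrast on income: {effect:.1f}")
```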

Impacts of certification schemes

Intervention Logic

Method
Intervention: training on Good Agricultural Practices coupled with niche market access
Core method: difference-in-differences on farmers' knowledge and practices
Added mixed methods:
- On differences in context:
  - Qualitative case studies on differences between tea factories (e.g. history of training, additional stimuli to lead farmers)
  - Inclusion of questions to check differences in households' 'access conditions' for some 'necessary' equipment/resources
- On wider impacts:
  - Qualitative studies on how certification influenced/contributed to sector-wide policies (e.g. 'child labour', 'traceability systems', 'internal control systems')
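The difference-in-differences estimate contrasts the change among trained farmers with the change among comparison farmers: (Y_trained,after - Y_trained,before) - (Y_comparison,after - Y_comparison,before). A minimal Python sketch with a hypothetical data file and column names, not the actual LEI analysis:

```python
# Minimal difference-in-differences sketch for the training effect on a
# knowledge score. Data file and column names are hypothetical:
#   knowledge - outcome indicator (e.g. GAP knowledge score)
#   trained   - 1 if the farmer received the GAP training, else 0
#   post      - 1 for the follow-up survey round, 0 for the baseline
#   factory   - tea factory identifier, used to cluster standard errors
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tea_farmers_panel.csv")

# The coefficient on trained:post is the difference-in-differences estimate.
model = smf.ols("knowledge ~ trained + post + trained:post", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["factory"]})
print(result.summary())
```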

Impacts of innovation grants on collective marketing groups

Intervention Logic

Method
Intervention: support to value addition through collective processing
Core method: difference-in-differences and time series, largely through qualitative interviews with a random sample of enterprises to explore patterns ("for whom does it work, under what conditions?")
Additional mixed methods:
- To enhance learning (in a network/platform of organisations): use the data to compile training material on illustrative learning experiences (resolving tensions in collective action)
- To understand context: household survey on key variables in the geographical areas where these enterprises function (rich/poor, trust levels, support context)
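Once the qualitative interviews are coded, one simple way to explore "for whom does it work, under what conditions" is to cross-tabulate coded outcomes against enterprise characteristics. A minimal sketch; the file and column names are hypothetical assumptions:

```python
# Minimal sketch: cross-tabulate coded interview outcomes against enterprise
# characteristics to surface patterns across the random sample of grantees.
# File name and columns are hypothetical, not from the LEI study.
import pandas as pd

cases = pd.read_csv("grantee_interviews_coded.csv")
# Assumed columns: outcome ('improved'/'no change'/'worse'),
#                  members, prior_processing_experience (True/False)

cases["size_class"] = pd.cut(cases["members"], bins=[0, 25, 100, 10_000],
                             labels=["small", "medium", "large"])
print(pd.crosstab([cases["size_class"], cases["prior_processing_experience"]],
                  cases["outcome"], normalize="index"))
```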

Challenges
- INTERVENTION LOGICS: a time-consuming process to align different stakeholders around the logic, questions and indicators
- OUTCOME INDICATORS: move away from 'nitty-gritty' immediate outcome indicators and 'far-away' ultimate outcomes (MDGs); we need simpler intermediate outcome indicators that allow benchmarking, that can be (partly) attributed to the intervention, and that are still 'researchable'
- RIGOUR IN RESEARCH METHODS: budget, time and political constraints are inherent to contracted research