Evaluating agricultural value chain programs: How we mix our methods

Presentation transcript:

Evaluating agricultural value chain programs: How we mix our methods
Marieke de Ruyter de Wildt
AEA Conference, San Antonio, 10-13 November 2010

From Agricultural Value Chains to Systems
Chain actors: inputs → farmers → brokers → processors → traders → retailers → consumers
Supporting services: inputs, land, energy, price information, R&D
Regulatory environment: standards, regulation, sector policies, international prices, trade agreements, tariff policies, speculation

Change in approach, change in impact patterns
Before: direct delivery to chain actors
Now: developing market systems around chain actors
(Original slide: a graph of outreach over time, extending beyond the end of the program)

Guideline 1: Critical Ingredients
Any evaluation should at least have:
- A logic model (beliefs, activities and results)
- Methods that can face scrutiny
- Insights that allow replication

Guideline 2: Test Against Validity Threats
Explore the robustness of these ingredients from different angles:
- Construct validity: are the concepts properly defined and operationalized?
- Internal validity: resolve issues of causality/attribution
- Statistical conclusion validity: when using statistics, do it properly
- External validity: under what conditions do the conclusions apply?
Source: Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and Quasi-Experimental Designs for Generalized Causal Inference.

Combining Ingredients and Validity Threats

               Construct   Internal   Statistical   External
Logic model        X           X
Method             X                       X
Replication        X                                    X

(Each ingredient is mapped to the validity threats it primarily addresses, as detailed on the following slides.)

1. Focus: define the logic model
On what basis do we expect success?
- What 'level' of definition suits our evaluative question?
- What are the critical assumptions, obviously about impact but also about the assumed causalities (the arrows)?
- How do we test the logic model and the counterfactuals (critics)?
Reduces threats to construct and internal validity.

2. Method: mix methods to anticipate validity threats
1. Negotiate a core methodology that fits the main evaluative questions and 'real-world constraints': how can we anticipate implementation issues?
2. Add methods for the assumptions in:
   A. The program theory: are the concepts precise enough to measure?
   B. The methodology: is the timing right? Do we have enough indicators? Is the control group biased or sufficiently clean (spillover)? (See the balance-check sketch below.)
Reduces threats to construct and statistical conclusion validity.
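
The spillover question in point 2B can be made concrete with a simple baseline balance check. The Python sketch below is purely illustrative and not part of the original slides; the file name, the treated flag and the covariate names are assumptions.

```python
# Hypothetical baseline balance check: compare covariates between the
# treatment and control groups to judge whether the control group is biased.
import pandas as pd
from scipy import stats

baseline = pd.read_csv("baseline_survey.csv")  # assumed data file
covariates = ["farm_size_ha", "baseline_income", "years_experience"]  # assumed columns

for var in covariates:
    treat = baseline.loc[baseline["treated"] == 1, var]
    control = baseline.loc[baseline["treated"] == 0, var]
    # Standardized mean difference plus a two-sample Welch t-test per covariate
    smd = (treat.mean() - control.mean()) / baseline[var].std()
    t_stat, p_value = stats.ttest_ind(treat, control, equal_var=False)
    print(f"{var}: SMD = {smd:.2f}, p = {p_value:.3f}")
```

Large standardized differences at baseline would signal the need for matching or for adjusting the core method.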

3. Replication: explore the conditions that make it work
- Reflect on the common elements across pilots: have we defined the 'generalisation domain'?
- Focus on mechanisms in context: what works for whom under what conditions?
- Do we have methods that allow more general conclusions?
Reduces threats to construct and external validity.

Example: training coffee farmers in Vietnam

1. Logic model
Training farmers → more knowledge of good practices → better agricultural practices → more income, better quality and less damage (PPP)

Critical assumptions and their methodological implications:
- Training is fairly homogeneous → realist case comparison of access criteria, modules and delivery
- More knowledge leads to better practices → realist case comparison of the application mechanisms of knowledge
- More training = better → compare intensive with less intensive training (factor or cluster analysis; see the sketch below)
Reduces threats to construct and internal validity.
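
A minimal sketch of the 'more training = better' comparison, assuming a hypothetical farmer survey with columns hours_trained, modules_completed and adopted_practices (names invented for illustration): farmers are clustered by training exposure and practice adoption is compared across the resulting groups.

```python
# Hypothetical cluster analysis of training intensity: group farmers by how
# much training they received, then compare practice adoption across groups.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

farmers = pd.read_csv("farmer_survey.csv")  # assumed data file

# Standardize the exposure indicators and split farmers into two clusters
exposure = StandardScaler().fit_transform(farmers[["hours_trained", "modules_completed"]])
farmers["training_cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(exposure)

# Compare adoption of good agricultural practices between the clusters
print(farmers.groupby("training_cluster")["adopted_practices"].mean())
```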

2. Method
Core method: difference-in-differences to scan for results in PPP (see the sketch below).
Added mixed methods:
- For key assumptions in the program theory: realist case studies to scan for unexpected outcomes (e.g. an increase in self-esteem), considering mediating and moderating variables (thanks to Kathleen, Research Works)
- For methodological assumptions: a nested survey (power analysis) and pilots (data availability)
Reduces threats to construct and statistical conclusion validity.
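
A minimal sketch of the difference-in-differences core method and the power calculation behind the nested survey, assuming a hypothetical two-round panel with columns income, treated, post and village (names invented for illustration); the power calculation ignores the design effect of clustering.

```python
# Hypothetical difference-in-differences estimate of the training impact,
# plus a simple power calculation for the nested survey.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.power import TTestIndPower

panel = pd.read_csv("coffee_panel.csv")  # assumed baseline + follow-up rounds

# The treated:post interaction is the impact estimate; standard errors are
# clustered at the village level
did = smf.ols("income ~ treated + post + treated:post", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["village"]}
)
print(did.summary())

# Sample size per arm needed to detect a standardized effect of 0.2
# with 80% power at the 5% significance level
n_per_arm = TTestIndPower().solve_power(effect_size=0.2, power=0.8, alpha=0.05)
print(round(n_per_arm))
```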

3. Replication
Reduces threats to construct and external validity.

Conclusions
- One-method research may be good for publication in top journals, but it rarely generates convincing evidence for the agents involved.
- Evaluation design needs to be theory-based (clarify the evaluative questions), use mixed methods (minimize validity threats) and address policy relevance (make sense of diversity).
- Considering validity threats up front helps to find a more robust mix of methods.

Marieke.ruyter@wur.nl