Systematic analysis and synthesis in qualitative evaluation Case study evaluation of the Oxfam Novib programme in Burundi (2005-2009) Ferko Bodnar CDI.


Systematic analysis and synthesis in qualitative evaluation
Case study evaluation of the Oxfam Novib programme in Burundi (2005-2009)
Ferko Bodnar, CDI
Conference on Impact Evaluation, Wageningen, March 2013

Presentation outline
1. Purpose of the evaluation
2. Oxfam Novib programme under evaluation
3. Design (1): evaluation method
4. Analysis and synthesis of different opinions
5. Expectations underpinning design
6. Expectations underpinning communication
7. Factors affecting use of evaluation results

Purpose of the evaluation
Accountability: Dutch Ministry, other donors, Dutch public.
Learning:
- input for strategic decisions in the current and upcoming Oxfam Novib (ON) programme in Burundi;
- input for ON thematic policies and the ON strategic programme;
- input for Oxfam International;
- support organisational learning by local partner organisations and other stakeholders.

Oxfam Novib programme under evaluation
Oxfam Novib (ON) worked on 5 themes:
1) Sustainable livelihood: food security, income, employment, markets;
2) Social services: health, education;
3) Security: emergency aid, conflict prevention;
4) Social and political participation;
5) Gender and diversity.
31 projects, 12 partner NGOs, 5.7m euros.
Post-conflict context; weak social cohesion.
Focus on programme outcomes and impact, not on the outputs of individual projects.


Design (1): evaluation method overview
- Reconstruct intervention logic with ON staff in the Netherlands
- Identify judgement criteria with partner organisations in Burundi
- Interviews with resource persons: partner organisations, other organisations and government
- Focus group discussions with beneficiaries and non-targeted households
- Survey in targeted and non-targeted communities (300 households)
- Restitution / synthesis meeting
- Analysis and synthesis table for different opinions

Design (2): reconstruction of the intervention logic per theme
- Impact on beneficiaries
- Outcome for beneficiaries (behaviour)
- Outcome for other organisations
- Outcome for partner organisations (PO)
- (Outputs PO)
- (Activities PO)

Analysis and synthesis of different opinions (1)
Sources:
- S: survey
- B: discussion with beneficiaries
- N: discussion with non-beneficiaries, as reference
- P: partner organisations, implementing the ON programme
- O: other organisations (NGOs)
- G: government
Interpretation of an opinion:
+ confirms the expected effect of the ON programme
□ does not confirm the expected effect, or change not attributed to ON
- confirms a change contrary to the expected effect

Analysis and synthesis of different opinions (2)
Schematic layout of the synthesis table (one column per source S, B, N, P, O, G):

Evaluation question 1
  Judgement criterion 1.1
    Opinion 1
    Opinion 2: -
  Judgement criterion 1.2
    Opinion 3: -
    Opinion 4: +, -
  Draft conclusion, answering the evaluation question (indicating the sources: S, B, N, P, O or G)
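The tally behind such a synthesis table can be sketched as a small script. This is an illustrative sketch only: the criterion names and coded opinions below are made-up sample data, not findings from the evaluation.

```python
# Illustrative sketch of the opinion-synthesis tally (sample data only).
# Codes follow the scheme on the previous slide:
#   '+'  confirms the expected effect of the ON programme
#   '□'  does not confirm the effect, or change not attributed to ON
#   '-'  confirms a change contrary to the expected effect
# Sources S, B, N, P, O, G as defined on the previous slide.
from collections import Counter

# Each opinion is (judgement criterion, source, code). Hypothetical entries.
opinions = [
    ("1.1 meals/day", "S", "□"),
    ("1.1 meals/day", "B", "+"),
    ("1.1 meals/day", "P", "+"),
    ("1.2 months sufficient food", "S", "-"),
    ("1.2 months sufficient food", "B", "+"),
]

def synthesis_table(opinions):
    """Group coded opinions per judgement criterion, keeping track of
    which sources gave which code, so a draft conclusion can cite them."""
    table = {}
    for criterion, source, code in opinions:
        row = table.setdefault(criterion, {"codes": Counter(), "by_source": {}})
        row["codes"][code] += 1
        row["by_source"].setdefault(code, []).append(source)
    return table

table = synthesis_table(opinions)
for criterion, row in table.items():
    print(criterion, dict(row["codes"]), row["by_source"])
```

Keeping the per-code source lists (rather than only the counts) is what allows a draft conclusion to cite its evidence, mirroring the "(S, B)" style citations used in the example slide.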

Example: 1. Impact on food security?

Judgement criterion: % of households (HH) eating 2 meals per day (sources S, B, N, P, O, G):
- Number of meals has not changed: □ □ (□) (□)
- Better-off, active HH increased, but poor HH reduced the number of meals per day: -
- Careful: HH tend to underestimate, hoping for WFP assistance: □
- HH eat better now than 5 years ago: + + (+), due to:
  - irrigation of swamp rice fields: +
  - peace and collaboration

Judgement criterion: number of months per year with sufficient food increased (sources S, B, N, P, O, G):
- Number of months with sufficient food decreased, especially among vulnerable groups: - (-)
- Number of months with sufficient food increased, due to disease-resistant cassava cuttings: +

Draft conclusion: irrigation in swamp rice has increased production of actively involved participants (B), but this has not improved nutrition for the majority of surveyed households in targeted areas (S). The positive effect of irrigation is annulled by the general trend of declining food production (S, B). [incomplete]


Expectations underpinning design
1. Individual project objectives fit in the programme objectives
   - outcome and impact: +
   - project outputs: -
2. Inclusion of independent sources strengthens the analysis of the ON contribution
   - valuable information on context, unintended effects and contribution: +
   - relatively few targeted beneficiaries: -
3. Systematic presentation of different opinions reduces evaluator bias
   - balancing findings, reducing bias: +
   - (time consuming: -)

Expectations underpinning communication
1. Transparent presentation and synthesis make the conclusions more acceptable
   - POs who felt their results were underappreciated could add clarifications and opinions without dominating the conclusion.
   - POs appreciated the overview and the focus on higher-level outcome and impact, and recommend this approach for planning.

Factors affecting use of evaluation results
Design ++:
- Concern about attribution / contribution: including independent sources → stronger recommendations.
- Concern about acceptable conclusions: transparent analysis → more consensus.
- ON concern about the IOB evaluation assessment: rigorous evaluation questions, criteria and triangulation → more reliable conclusions → ON adaptation (see the ON management response).
Communication +/-:
- Restitution workshop: consensus → acceptable conclusions.
- The ON management response adopted the recommendations.
- Follow-up in planning the ON programme: limited?