The Emergency Capacity Building (ECB) Project (OXFAM-led) and University of East Anglia (UEA) Contributions to Change: A Guide to Evaluating Change after Rapid Onset Emergencies

An Applied Research Partnership Oxfam and UEA are equal partners, with a Steering Committee drawn from ECB members. The collaboration aims to research, pilot and publish a guide to support NGOs in measuring the contribution to change resulting from their interventions in rapid onset emergencies. It builds on the ECB Good Enough Guide to Impact Measurement and Accountability (which did not provide much guidance on impact!).

Why talk about measuring contribution to change? The project needed a way to talk about impact that was simple and realistic, because impact is difficult to measure in emergency settings. There is demand for NGOs and UN agencies to demonstrate impact, but few examples of impact evaluations of humanitarian responses to rapid-onset emergencies. ECB partners said they needed clearer guidance to help them measure impact quantitatively and qualitatively. Rather than get stuck in arguments about what counts as rigorous impact evaluation, the project decided to focus on developing and testing a methodology to help document evidence of contributions to change.

Approach Aims to help agencies evaluate their contribution to change (positive or negative). Focused on changes to the lives of affected populations (household and community level). Influenced by the livelihoods approach and informed by the work of others (ALNAP, INTRAC, Tufts etc.). Recognises that external aid is only one of the dynamics after a disaster, and not always the most significant.

The Methodology Designed to be robust enough to collect credible evidence of a contribution to change, but simple enough for field staff to use (with supervision). Draws on a range of well-known qualitative and quantitative tools to measure change against a baseline: household survey; focus group and/or group interviews and key informant interviews; PRA techniques. After the first pilot it was decided to focus on the retrospective method only to establish a baseline. A work in progress… not yet in a position to make recommendations.
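The guide presents these tools as field procedures rather than code, but the cross-checking logic behind combining them can be illustrated programmatically. The sketch below is a minimal Python illustration, not material from the guide: it assumes a hypothetical set of change indicators scored by direction of change from each method and flags indicators where the household survey and the qualitative methods disagree, so that field teams know where to probe further.

```python
# Illustrative cross-method check: compare the direction of change reported
# by different methods for the same indicators and flag disagreements.
# Indicator names and scores are hypothetical, not taken from the ECB/UEA guide.

findings = {
    # indicator: {method: direction of change (+1 improved, 0 no change, -1 worsened)}
    "household_income":   {"household_survey": -1, "focus_groups": -1, "key_informants": -1},
    "access_to_water":    {"household_survey": +1, "focus_groups": 0,  "key_informants": +1},
    "livestock_holdings": {"household_survey": -1, "focus_groups": +1, "key_informants": -1},
}

def cross_check(findings):
    """Split indicators into those where methods agree and those needing follow-up."""
    agreed, to_review = {}, {}
    for indicator, by_method in findings.items():
        directions = set(by_method.values())
        if len(directions) == 1:
            agreed[indicator] = directions.pop()
        else:
            to_review[indicator] = by_method  # follow up in the field
    return agreed, to_review

agreed, to_review = cross_check(findings)
print("Consistent across methods:", agreed)
print("Needs follow-up:", to_review)
```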

Retrospective Methodology If no baseline has already been established, this methodology enables the collection of data to best measure changes for households. Assumes that evaluation fieldwork will take place within a limited number of months after the disaster has occurred (is 15 months too long?). Data collection includes information on livelihoods and household assets, to reconstruct a baseline of the situation before the disaster and to show how the situation has changed since. Findings are triangulated to increase reliability.
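As a rough illustration of the retrospective logic (a sketch, not code from the guide), the example below assumes a household survey that asks respondents to recall key asset and income values from just before the disaster alongside their current values; the column names and the secondary-source figure used for triangulation are hypothetical.

```python
import pandas as pd

# Hypothetical recall data: one row per household, with values recalled for
# just before the disaster ("pre_") and reported for the present ("now_").
survey = pd.DataFrame({
    "household_id":   [1, 2, 3, 4],
    "pre_livestock":  [5, 2, 0, 8],      # recalled pre-disaster livestock count
    "now_livestock":  [2, 2, 1, 3],
    "pre_income_usd": [60, 35, 20, 90],  # recalled pre-disaster monthly income
    "now_income_usd": [40, 30, 25, 45],
})

# Reconstruct the baseline from recall and compute change per household.
for var in ["livestock", "income_usd"]:
    survey[f"change_{var}"] = survey[f"now_{var}"] - survey[f"pre_{var}"]

# Simple triangulation check: compare the recalled baseline mean with an
# independent secondary source (e.g. a published pre-disaster district figure).
SECONDARY_SOURCE_MEAN_INCOME = 55  # hypothetical published figure
recalled_mean = survey["pre_income_usd"].mean()
discrepancy = abs(recalled_mean - SECONDARY_SOURCE_MEAN_INCOME) / SECONDARY_SOURCE_MEAN_INCOME

print(survey[["household_id", "change_livestock", "change_income_usd"]])
print(f"Recalled baseline income vs secondary source: {discrepancy:.0%} difference")
```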

Why a Household Approach? Impacts are recognised by affected populations when they result in changes in the daily activities and livelihoods of households. As livelihoods differ within communities, impacts need to be studied at the household level to see how livelihoods have changed.

Contents of the Guide Guidance on how to sample data and train staff in data collection; household and community surveys; qualitative and quantitative studies; how to process and analyse findings; single-agency and multi-agency evaluation contexts.
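The sampling guidance itself is in the guide; as one hedged example of the kind of calculation involved (a standard proportion-based formula, not necessarily the one the guide prescribes), the sketch below estimates how many households to survey for a given margin of error, with a finite population correction for smaller communities. The defaults are conventional planning assumptions, not figures from the ECB/UEA work.

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Households to survey for a proportion estimate at ~95% confidence.

    Standard formula n0 = z^2 * p * (1 - p) / e^2, adjusted with a finite
    population correction. Defaults (5% margin, p = 0.5) are conventional
    planning assumptions, not figures from the ECB/UEA guide.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    n = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n)

# e.g. a community of 1,200 affected households
print(sample_size(1200))  # -> roughly 292 households
```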

Field Testing Three pilot studies have been completed. Selection criteria: a partner on the ground to help with logistics, access and data collection; a significant event in the previous 6-15 months; a significant response to the emergency. Pilots: Bihar, India, November 2011 (floods in July/August 2011); Guatemala, March 2012 (hurricane in October 2011); Sri Lanka, May 2012 (floods in December 2010 to February 2011). Data were analysed in the field by partner organisations and then re-analysed at UEA. Once the guide has been drafted, a final field test will be undertaken.

Kosi River, Madhubani District, Bihar, 2011

Advantages of Contribution to Change? Hopefully useful in contexts where rigorous impact evaluation is not feasible, affordable or ethical. The household and livelihoods focus may avoid the silo effect of sectors or clusters by making changes to the lives of affected populations central to the evaluation of NGO contributions. Provides techniques to establish a retrospective baseline, since the reality of most emergencies is that baseline information from before the event is not easily available. Assumes that humanitarian assistance provided by external actors is only one part of the change story, so it also captures communities' own coping mechanisms. But the ultimate test will be whether others find it useful.

Challenges There is no such thing as a typical rapid onset emergency, which has made selecting pilot studies very difficult. The balance between simplicity and rigour is hard to get right, and the balance between flexibility and structured guidance is tricky: the guide needs to be flexible enough to accommodate the unique features of each site (e.g. event characteristics, gender dynamics, livelihood systems). The logic, rationale and utility of the approach must be clear to multiple agencies and users with varying mandates and levels of expertise. The project also needs to clarify how the approach complements and can be integrated with existing initiatives (e.g. Sphere), and to address questions of scale.

For further information contact: Dr Vivien Walden: Dr Roger Few: Daniel McAvoy: Dr Marcela Tarazona: