Humanitarian aid evaluation at Médecins Sans Frontières

Humanitarian aid evaluation at Médecins Sans Frontières Sabine Kampmüller, MIH MSF Vienna Evaluation Unit http://evaluation.msf.at

Dimensions of evaluation: a tool to describe the effectiveness of a programme, measure, etc.
The goal of a logic model is to give an explicit description of the intended link between the inputs (what we invest), the performances/outputs (what we do), the reactions of the target groups and the consequences for the affected people.
Such a description is helpful because many assumptions concerning the efficiency of performance remain implicit.
External influences (positive and negative) can be taken into consideration.
A logic model allows the evaluator to identify goals and indicators to measure the effects of a programme.
Source: SDC (2002)
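To make the chain concrete, here is a minimal sketch of how such a logic model could be represented as a data structure, assuming Python; the class names, fields and example indicators (LogicModel, LogicLevel, the post-disaster surgical example) are illustrative assumptions, not taken from the SDC source or from MSF practice.

```python
from dataclasses import dataclass, field

@dataclass
class LogicLevel:
    """One level of the results chain with its measurement indicators."""
    description: str
    indicators: list[str] = field(default_factory=list)

@dataclass
class LogicModel:
    inputs: LogicLevel        # what we invest
    outputs: LogicLevel       # what we do / deliver
    outcomes: LogicLevel      # reactions of the target groups
    impact: LogicLevel        # consequences for the affected people
    external_influences: list[str] = field(default_factory=list)  # positive and negative

    def chain(self) -> str:
        """Return the intended results chain as a readable string."""
        levels = (self.inputs, self.outputs, self.outcomes, self.impact)
        return " -> ".join(level.description for level in levels)

# Hypothetical example: a post-disaster surgical programme
model = LogicModel(
    inputs=LogicLevel("Surgical teams, supplies, funding", ["staff deployed", "budget spent"]),
    outputs=LogicLevel("Surgical interventions performed", ["number of major interventions"]),
    outcomes=LogicLevel("Injured patients treated in time", ["share of patients operated on promptly"]),
    impact=LogicLevel("Reduced excess mortality and disability", ["case fatality rate"]),
    external_influences=["security situation", "coverage by other actors"],
)
print(model.chain())
```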

Evaluation post-disaster: Haiti earthquake response
Dramatic and chaotic situation
Massive mobilisation of aid, media, military
Health sector response largely uncoordinated during the first weeks
Needs overwhelming: dead, injured and traumatised; one and a half million homeless
Extreme conditions in IDP camps and streets
Vulnerability to diseases, outbreaks, malnutrition and violence
Evaluation conducted 5 – 10/2010

Evaluation process
First time (!) an all-section review
6 specific reviews: global/operational, medical/surgical, logistics/supply, Comms, FR, HR
Quantitative & qualitative methods: data, field visits, interviews, qualitative research with beneficiaries, web survey of staff
Limitations: incomplete and non-uniform data; recall problems
ToR for the specific reviews prepared by the respective platforms

First line emergency and outpatient services: Consultations 123,108; Dressings 34,044; Antenatal care consultations 8,353; Victims of sexual violence 38 (?); Total 165,543
Surgery: Major surgical interventions 6,259; Other surgical interventions 5,903; Total 12,162
Inpatient care: Admissions for surgery 1,243; Maternity 3,425; Medical 1,982; Paediatrics 1,132; Total 7,782
Postoperative care 2,604
Physiotherapy sessions 10,241
Mental health: Individual consultations 14,765; Group sessions 4,310
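As a quick arithmetic check, the stated totals above can be reproduced from the individual line items. The short sketch below (Python, with the figures copied from this slide) sums each group and compares it to the stated total; the variable names are purely illustrative.

```python
# Figures copied from the slide above; the script only verifies the stated subtotals.
first_line = {
    "Consultations": 123_108,
    "Dressings": 34_044,
    "Antenatal care consultations": 8_353,
    "Victims of sexual violence": 38,
}
surgery = {
    "Major surgical interventions": 6_259,
    "Other surgical interventions": 5_903,
}
inpatient = {
    "Admissions for surgery": 1_243,
    "Maternity": 3_425,
    "Medical": 1_982,
    "Paediatrics": 1_132,
}

stated_totals = {"first line": 165_543, "surgery": 12_162, "inpatient": 7_782}
groups = {"first line": first_line, "surgery": surgery, "inpatient": inpatient}

for name, group in groups.items():
    computed = sum(group.values())
    assert computed == stated_totals[name], f"{name}: {computed:,} != {stated_totals[name]:,}"
    print(f"{name}: {computed:,} matches the stated total")
```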

Patients’ / community’s perception
Very positive perception of MSF’s interventions and services
Fear/experience of stigmatisation, increase of violence in the camps
Lack of space for occupational and educational activities
More attention needed to socio-cultural and spiritual needs
Other actors

Haiti evaluation findings:
MSF was one of the biggest emergency health actors
Operational choices in line with emergency needs; less coherent over time
Common data collection difficult to impossible
International platforms / working groups are underused / undervalued
Successful advocacy on some issues; complicated decision-making prevented more, even though all the surgical referents were in Haiti
Assessment done at the beginning; little or nothing afterwards, apart from OCP and OCG later looking at population needs

Main recommendations:
Organize inter-section capacity for assessment and monitoring of evolving needs and assistance
Define a mass-casualty strategy
Revise emergency supply strategies
Ensure uniform data collection
Strengthen technical working groups
Focus on organisational learning
Engage with national and international actors

Thematic Evaluation: Response to displacement
Evaluation conducted 2009/2010

Evaluation process
Comparison of 6 case studies (urban/rural, low/middle income, etc.)
Quantitative & qualitative methods: literature research, document/tools & data review, field visits, interviews
Limitations: availability of data and key people; poor documentation

Particularity of open settings
Lack of clear boundaries: geographical spread, invisibility, needs difficult to identify and measure, protection issues
Displaced people settle in an environment with available resources and an existing health system
Better survival capacities, but deterioration likely
Protracted, chronic or intermittent character: mortality near-normal (pre-emergency) levels, might rise slowly over time
Invisibility due to legal status or mixing with the host population, as a result of the progressive breakdown of health infrastructure and exhausted coping mechanisms

Findings on assessment
Complexity of open settings requires more attention to assessment
Critical aspects of information missing
Quantitative information difficult to obtain
Health system issues and access barriers little addressed
Concerns about the use of surveys
Views of the displaced and the host population often omitted
Capacities and vulnerabilities not assessed
Same reflexes as in camp-like situations: looking for quantitative data; mortality measured in 3 of the countries; surveillance only in Cameroon, and not very effective; no validity of qualitative methods without triangulation

Recommendations on assessment
Develop innovative assessment approaches for inaccessible areas (“distance assessment”)
Promote systematic use of qualitative methods
Adopt a concept of “continual” assessment
Develop a framework to assess vulnerabilities, capacities and coping
Provide better support and guidance: assessment toolbox, experts, training

Findings on intervention
Engagement with the health system a main challenge
Outreach workers invaluable; set-ups improvable
Non-medical assistance is marginal
Overambitious coverage targets
Strategy adapted to the level of emergency (top ten priorities to reduce excess mortality), parallel health system

Recommendations on intervention
Need for new intervention frameworks
Adopt existing models
Generalize the practice of covering both displaced and host populations where appropriate
Define the criteria / scope of “light support”
Develop community-based strategies

Evaluation criteria (donors)
Appropriateness
(Connectedness)
(Coherence)
Timeliness
Coordination
Coverage
Relevance
Effectiveness
Efficiency
Impact
Sustainability
Adapted from: Hallam, A., Good Practice Review (ODI), 1998

Ways to ensure UTILISATION
Get the original purpose very clear
Ensure ownership and a participatory process
Get key stakeholders behind it
Share findings in the field
Allow debate with the evaluators
Ensure good readability of the report
Disseminate information widely / at all levels
Promote the credibility of the evaluators
Use in training
Formulate recommendations clearly and realistically

10 Steps in the process
Defining purpose and scope
Writing the terms of reference
Analysing stakeholders
Choosing the methodology
Deciding the budget for the evaluation
Selecting the evaluation team
Preparing the field
Research phase
Reporting and dissemination
Management follow-up