Presentation transcript:

MONITORING & EVALUATING PROJECTS ACROSS MULTIPLE COUNTRIES
Selvaggio, MP; Mangxaba, JW; Tsigoida, M
Khulisa Management Services, P.O. Box 923, Parklands 2121, South Africa

Introduction
Bilateral and multilateral donors often fund a single project effort in multiple countries. Such projects have similar goals and objectives that relate to the overall multi-country programme, but each country's sub-project is generally encouraged to contextualise its implementation to the local situation. However, a common M&E framework works best to "tell the story" of the overall project effort, and efforts to contextualise the M&E framework to each country's situation can lead to a wide variety of problems with data aggregation and reporting.

Khulisa's Experience with Multi-Country Monitoring & Evaluation
Khulisa has been involved in M&E activities for numerous projects implemented in multiple countries, including:

RECLISA – Reduction of Exploitative Child Labour in Southern Africa. This project funds work in Botswana, Lesotho, Namibia, South Africa, and Swaziland. Khulisa's role is to establish the project monitoring system for reporting on core indicators, as well as country-specific indicators at output level. Khulisa also verifies the data reported from all countries and evaluates the source data to ensure that it is accurate and valid.

APPLE – AIDS Prevention, Positive Living, and Empowerment Project. This project was implemented in Malawi and Mozambique. Khulisa's role was to monitor project Logframe indicators at activity, output, and outcome levels.

Research and a thematic study on Governance, Management and Accountability at Secondary School Level in Africa. Three countries (South Africa, Zambia, and Senegal) were chosen as case studies, with the aim of identifying the best and most promising practices for governance, management and accountability in these countries.
Assessment of SADC Higher Learning Institutions to be Centres of Specialisation in Education Policy Development. Over a period of two years, three centres were to be established and 60 ministry staff members from across the SADC region trained in Education Policy Development. Khulisa's role was to assess the applicant SADC institutions through questionnaires and site visits.

Issues, Concerns, and Lessons Learned

EVALUATIONS AND RESEARCH:
The criteria and/or parameters of source data held by government institutions (health and education) can differ from country to country, which can cause difficulties in data collection. For example, the criteria for enrolment in ART programmes, or the definitions of child labour, are not the same in each country.
Some countries require ethics clearance for research and evaluation, including project-specific surveys. It is important to understand when this is required so that adequate planning can be undertaken.
To the extent possible, the same data collection tools (or the same items within the tools) should be used in all countries and at all sites, so that data can easily be aggregated at higher levels to tell the story of the overall project.

MONITORING SYSTEMS:
Governments' own health and education information systems should be utilised and built upon for programme monitoring; the project should not attempt to replace the government's data collection system.
M&E systems, Logframes, and their indicators must be defined broadly enough to allow output and outcome measures to be readily aggregated. Indicator definitions must be specific, yet applicable to all governments and country situations.
Multi-country programmes must have common indicators that are used in all countries, so that aggregation can occur to "tell the story" of the overall project.
These common indicators must be clearly defined and vetted with implementation partners in all countries to ensure that they are measurable everywhere.
Sometimes indicator data is combined with other data that differs slightly in definition. When this is required, the unit of measure must be the same. For example, the detailed definition of "No. of children-at-risk enrolled" can differ from country to country (one country might emphasise educational enrolment, another PSS services), but the unit of measure (i.e. "child enrolled") is the same in every country.
M&E systems, Logframes, and indicators must be revisited regularly throughout project implementation to correct any deficiencies in the framework itself or in the definitions of common indicators.

DATA FLOW / TECHNOLOGY ISSUES IN THE MONITORING SYSTEM:
Generally, we have found it best to design the data management system as a combination of paper-based and computer-based steps.
For collection of data at source, manual (paper-based) record keeping can be used at sites; it is generally not realistic to expect sites to have computer infrastructure or capacity.
Monthly summary forms of project outputs can be compiled at each site and sent to the project office on paper or electronically.
Aggregation of the monthly summary data from each site should be done electronically (e.g. in Microsoft Access or Excel spreadsheets).
Even if monthly reports are not required by the donor, best practice is for each country project office to receive monthly summary reports from sites; these can then be aggregated into quarterly or semi-annual reports to the donor.

ANALYSIS:
In analysing data from such projects, the main goal is to aggregate outcome- and output-level data to "tell the story" of the overall multi-country project. In aggregating indicator data, avoid indicators expressed as percentages, which are prone to errors and miscalculations.
Rather, keep indicators as counts so that they can be more readily aggregated across countries. When percentage values must be used, the aggregated multi-country result can only be computed from the raw numerator and denominator values of each country programme.

Accurately Measuring Progress
Parktown North, Johannesburg, South Africa 2193
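The aggregation rule above can be illustrated with a short worked example. This is a minimal sketch with hypothetical country figures (the indicator names, countries, and numbers are illustrative, not project data): count indicators are summed directly, while a percentage indicator is recomputed from the pooled raw numerators and denominators, since averaging per-country percentages ignores each programme's size.

```python
# Hypothetical monthly country summaries: a count indicator plus the raw
# numerator/denominator behind a percentage indicator (e.g. retention on ART).
country_reports = {
    "Botswana":  {"children_enrolled": 120, "retained": 40,  "on_art": 50},
    "Lesotho":   {"children_enrolled": 80,  "retained": 90,  "on_art": 100},
    "Swaziland": {"children_enrolled": 200, "retained": 140, "on_art": 200},
}

# Count indicators aggregate by simple summation across countries.
total_enrolled = sum(r["children_enrolled"] for r in country_reports.values())

# Percentage indicators: pool the raw values first, then divide once.
total_retained = sum(r["retained"] for r in country_reports.values())
total_on_art = sum(r["on_art"] for r in country_reports.values())
pooled_pct = 100.0 * total_retained / total_on_art

# Averaging the per-country percentages gives a different (wrong) answer,
# because it weights a 50-person programme the same as a 200-person one.
naive_pct = sum(100.0 * r["retained"] / r["on_art"]
                for r in country_reports.values()) / len(country_reports)

print(total_enrolled)        # 400
print(round(pooled_pct, 1))  # 270/350 -> 77.1
print(round(naive_pct, 1))   # (80 + 90 + 70) / 3 -> 80.0
```

The gap between 77.1% and 80.0% is exactly the kind of miscalculation the poster warns about: the pooled figure is the true multi-country result, and it can only be obtained if each country reports its raw numerator and denominator rather than a pre-computed percentage.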