Implementing Sustainable Monitoring and Evaluation Systems
Marelize Gorgens, The World Bank


Training Session Outcomes
▫Participants should have developed sector outcomes that are linked to the NDP outcomes
▫Participants should have developed indicators for their sector outcomes
▫Participants should be able to assess, design and revise their sector's M&E system to measure the achievement of their sector outcomes
▫Participants should be able to design or revise their sector's M&E plan based on assessment results
▫Participants should be able to develop a prioritised work plan for their M&E system, and cost it


Types of Data Sources
▫One-off data sources
▫Periodic data sources
▫Routine data sources


INDICATOR TYPE / DATA COLLECTION TIME FRAME / TYPES OF DATA SOURCES

Input/Process (collected continuously):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about training materials developed for schools

Output (collected quarterly, semi-annually, or annually):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about teacher absenteeism or the number of visits by agriculture extension officers
▫Periodic data sources such as exit interview surveys

Outcome (every 1 to 3 years):
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

Impact (every 2 to 5 years):
▫Periodic data sources such as surveillance
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

Classification of Routine Data
BASED ON TYPE OF DATA
▫'Census'-type data about the need for (or demand for) services
▫Routine program monitoring data (or data about the 'supply' of services)
▫Routine financial monitoring data
BASED ON GEOGRAPHIC LOCATION WHERE DATA ARE GENERATED
▫E.g. in the community, at health facilities
BASED ON ORGANISATION GENERATING THE DATA
▫Civil society
▫Private sector
▫Public sector

Classification of Routine Data (example)
Location where data are generated (the same as the location where the service is provided):
▫Government schools (schools built by the Government)
▫Community schools (local schools built by the community)
Type of organization generating the data:
▫Schools operated by the ministry of education
▫Schools operated by non-governmental organizations
In each combination of location and organization, the same routine data are generated:
▫Registers with the number of pupils enrolled in each class
▫Program monitoring data about teacher and pupil attendance
▫Financial monitoring data about the school's expenditure

Six routine data management processes
1. Data sourcing
2. Data collection
3. Data collation
4. Data analysis
5. Data reporting
6. Data use

Results for This Component to be Successful
Long-term result:
▫Timely and high-quality routine data are used for routinely assessing program implementation and for taking decisions to improve programs.
Short- and medium-term results:
▫Routine monitoring forms, data flow and manual.
▫Defined data management processes for routine data (sourcing, collection, collation, analysis, reporting, and use).
▫Routine procedures for data transfer from sub-national to national levels.

Routine Monitoring System for Program Coverage vs. Sample Survey for Program Coverage

Routine monitoring system:
▫Collects 'census-type' data about all who need services (individuals, groups or organizations that will be targeted with services)
▫Collects routine monitoring data about programs implemented, from organizations that implement programs
▫Collects data from all organizations that implement programs
▫Depends on implementers of programs to report data
▫Provides feedback to implementers of programs
▫Covers all geographic areas where programs are implemented
▫Organizations collect data daily and summarize quarterly for report-back (i.e. routinely)
▫Not expensive to implement
▫Targets the implementers of programs and makes them responsible for data collection

Sample survey:
▫Selects a representative sample of beneficiaries
▫Collects data from beneficiaries of programs (i.e. individuals, groups or organizations that receive program services/benefits)
▫Selects respondents on a sample basis
▫Actively collects data from respondents
▫Provides no feedback to individual survey respondents
▫Only covers areas selected in the sample; if the sample is not representative of the country, the survey results are equally unrepresentative
▫Survey respondents are selected only when a survey is being undertaken, and surveys are only undertaken periodically, typically every two years
▫Expensive to implement
▫Does not directly target the implementers of programs or involve them in data collection

Using surveys vs. using routine data for determining service coverage
Indicator: Percentage of OVCs that have received support (a numerator divided by a denominator)
There are two ways to calculate this percentage.

METHOD 1: Using routine data
Indicator: Percentage of OVCs that have received support
Numerator: Number of OVCs that received support
Denominator: Number of OVCs in the country

STEP 1. DENOMINATOR: Count the number of OVCs in the country
STEP 2. NUMERATOR: Determine the number that have received support by asking each implementer of OVC interventions (report from implementer 1, report from implementer 2, and so on)
STEP 3. PERCENTAGE: Calculate the percentage
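The three steps can be sketched in a few lines of code. This is a minimal illustration only; the register count and the two implementer reports are hypothetical figures, not numbers from the deck.

```python
# Hypothetical figures: a national OVC register plus reports from
# two implementers of OVC interventions.
ovc_on_register = 200                                 # Step 1: denominator
implementer_reports = {"implementer_1": 45,
                       "implementer_2": 35}

ovc_supported = sum(implementer_reports.values())     # Step 2: numerator
coverage_pct = 100 * ovc_supported / ovc_on_register  # Step 3: percentage
print(f"Coverage: {coverage_pct:.0f}%")               # prints "Coverage: 40%"
```

Note that the numerator simply sums whatever the implementers report, which is why routine data quality (no double-counting, complete reporting) matters so much for this method.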


METHOD 2: Using a survey
STEP 1. Select a sample size that is representative
STEP 2. Select a sample of OVCs and survey them
STEP 3. Calculate the percentage
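Method 2 can be sketched the same way, with the difference that only a sample is observed, so the estimate carries sampling error. The sample size and responses below are hypothetical; the confidence interval uses the standard normal approximation for a proportion.

```python
import math

# Hypothetical survey: 50 OVCs sampled, 18 report having received support.
sample_size = 50
supported_in_sample = 18

p = supported_in_sample / sample_size                  # estimated coverage
margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)   # 95% CI half-width
print(f"Estimated coverage: {100 * p:.0f}% (+/- {100 * margin:.0f} points)")
```

The wide margin at small sample sizes is one reason surveys are run with carefully calculated, representative samples (Step 1 above).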

1. Collect only the data you need

2. Standardise data definitions

3. Use routine monitoring data to assess 'if we implemented what we said we would'
1. A register of orphans is kept at each district council office, and updated.
2. All district councils report the number of orphans in their district to the Ministry of Social Welfare at national level, on a given date.
3. The Ministry of Social Welfare determines that there are 200 orphans in total, and sets as a target that the 40% of orphans most in need should be reached (80 orphans). This is the national target.

Number of orphans on register as of 1 January 2009
District 1: 60
District 2: 40
District 3: 80
District 4: 20
TOTAL: 200
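Aggregating the district registers and deriving the national target (the 40% of orphans most in need) is a one-line calculation each, sketched here with the register counts from the table:

```python
# District register counts as of 1 January 2009 (from the table above).
district_registers = {"District 1": 60, "District 2": 40,
                      "District 3": 80, "District 4": 20}

total_orphans = sum(district_registers.values())   # national register: 200
national_target = round(0.40 * total_orphans)      # 40% most in need: 80
print(total_orphans, national_target)              # prints "200 80"
```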

4. Now organizations start planning how many orphans they will reach in which district
The national target, Organization 1's target, and Organization 2's target are each planned per district (District 1 to District 4) and in total.

5. Each organization develops a work plan and a budget to show how they will reach the targeted number of orphans

ORGANIZATION 1 (target and budget per quarter)
Activity: Train orphan caregivers
Activity: Procure materials for support
▫Q1: 40 uniforms; 40 books (budget 5 000)
▫Q2: weekly food parcels for 20 households (budget 20 000)
▫Q3: weekly food parcels for 30 households (budget 35 000)
▫Q4: weekly food parcels for 40 households (budget 50 000)
Activity: Support orphans (cumulative target)
TOTAL

5. Each organization develops a work plan and a budget to show how they will reach the targeted number of orphans

ORGANISATION 2 (target and budget per quarter)
Activity: Procure materials for support
▫Q1: weekly food parcels for 40 households (budget 50 000)
▫Q2: weekly food parcels for 40 households (budget 50 000)
▫Q3: weekly food parcels for 40 households (budget 50 000)
▫Q4: weekly food parcels for 40 households (budget 50 000)
Activity: Support orphans
TOTAL

6. Every quarter, organizations report on how many orphans they have actually reached

ORGANISATION 1 (reached and spent per quarter)
Activity: Train orphan caregivers
Activity: Procure materials for support
▫Q1: 20 uniforms; 40 books (spent 2 400)
▫Q2: weekly food parcels for 25 households (spent 17 000)
▫Q3: weekly food parcels for 30 households (spent 32 000)
▫Q4: weekly food parcels for 44 households (spent 53 100)
Activity: Support orphans
TOTAL

6. Every quarter, organizations report on how many orphans they have actually reached

ORGANISATION 2 (reached and spent per quarter)
Activity: Procure materials for support
▫Q1: weekly food parcels for 20 households (spent 25 000)
▫Q2: weekly food parcels for 35 households (spent 24 100)
▫Q3: weekly food parcels for 40 households (spent 56 000)
▫Q4: weekly food parcels for 32 households (spent 42 050)
Activity: Support orphans
TOTAL

7. National coverage (percent of targeted number) is determined every quarter, and implementation progress is known

Orphans reached (planned vs. actual) and funding (budgeted vs. spent) are summed across Quarters 1 to 4:
Coverage: 86% of target reached
Expenditure: 80% of funding spent
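The step-7 calculation is a pair of ratios over the summed quarterly reports. The quarterly figures below are hypothetical (the deck's own quarterly numbers are not preserved in the transcript); only the mechanics are illustrated.

```python
# Hypothetical quarterly reports: (planned, reached, budgeted, spent).
quarters = [
    (20, 18, 50_000, 42_000),
    (20, 16, 50_000, 38_000),
    (20, 17, 50_000, 40_000),
    (20, 18, 50_000, 40_000),
]

planned  = sum(q[0] for q in quarters)
reached  = sum(q[1] for q in quarters)
budgeted = sum(q[2] for q in quarters)
spent    = sum(q[3] for q in quarters)

print(f"Coverage: {100 * reached / planned:.0f}% of target reached")
print(f"Expenditure: {100 * spent / budgeted:.0f}% of funding spent")
```

Tracking both ratios together is the point of the slide: coverage shows whether implementation is on track, and expenditure shows whether the money is moving at the same pace.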

Types of Data Sources
▫One-off data sources
▫Periodic data sources
▫Routine data sources

INDICATOR TYPE / DATA COLLECTION TIME FRAME / TYPES OF DATA SOURCES

Input/Process (collected continuously):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about training materials developed for schools

Output (collected quarterly, semi-annually, or annually):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about teacher absenteeism or the number of visits by agriculture extension officers
▫Periodic data sources such as exit interview surveys

Outcome (every 1 to 3 years):
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

Impact (every 2 to 5 years):
▫Periodic data sources such as surveillance
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

What is a survey? A method of collecting information from respondents, who can be either a sample of the population or selected, targeted organizations (or facilities). This may involve gathering information either at a point in time (a cross-sectional survey) or tracking a group of people over a period of time (a longitudinal survey, or panel data). Information generated through surveys can include factual information, levels of knowledge, attitudes, personality types, beliefs and preferences. For many national-level surveys, standard protocols have been developed to ensure that the survey is conducted in the same way every time: this enables trend analysis and comparisons over time.

Biases in Surveys
Biases are 'features of studies which make a particular result more likely - like a football pitch which slopes from one end to the other'.
Three main types of bias:
▫Information bias
Recall bias
Observer bias
Social desirability bias
▫Non-response bias
▫Selection bias

Sampling in surveys
Sampling = selecting samples for the survey
Sampling unit = what you will select
Sampling frame = a list of potential sampling units from which to choose the sample
Sampling universe = all the sampling units
Sampling methodology = how you go about sampling
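The terms above can be made concrete with a short sketch. The register IDs below are hypothetical; the frame is the list of sampling units, and `random.sample` draws a simple random sample without replacement.

```python
import random

# Hypothetical sampling frame: a list of 200 OVC register IDs.
# Each ID is a sampling unit; the full list is the sampling universe
# (assuming, for the illustration, that the frame is complete).
sampling_frame = [f"OVC-{i:03d}" for i in range(1, 201)]

random.seed(1)                                # reproducible illustration
sample = random.sample(sampling_frame, k=50)  # simple random sampling
print(len(sample))                            # prints 50
```

If the frame is incomplete (some units missing from the list), no sampling methodology can fix that: the missing units have zero chance of selection, which is one source of selection bias.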

Results to be achieved with this Component
Long-term result:
▫Surveys that answer relevant objectives and are unbiased, accurate, generalizable, ethical and economical are undertaken (or existing survey results are used), as required by the program's data needs.
Short- and medium-term results:
▫Inventory of relevant surveys already conducted.
▫Specified schedule for future surveys (to be conducted by the organization, or from which the organization should draw its own data).
▫Protocols for all surveys based on international or national standards (where these exist).

Should a survey be done?

Ethics and surveys
Respondents need to have:
▫The choice to participate or not to participate in the research (the ability to 'opt out' entirely from the survey);
▫An understanding of why the research is being carried out, the possible positive outcomes associated with the research, and the possible negative outcomes associated with the research;
▫A clear understanding of the possibility that there will be no individual impact of the research;
▫The knowledge that they are free to stop the survey at any point (the ability to 'opt out' after beginning to respond);
▫The knowledge that they are free to refuse to answer any questions they do not want to answer (the ability to 'opt out' of particular questions);
▫The reassurance that their answers are strictly confidential, will be aggregated with no name identifiers, and will not be attributed to any individual.

Types of Data Sources
▫One-off data sources
▫Periodic data sources
▫Routine data sources

INDICATOR TYPE / DATA COLLECTION TIME FRAME / TYPES OF DATA SOURCES

Input/Process (collected continuously):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about training materials developed for schools

Output (collected quarterly, semi-annually, or annually):
▫Routine data sources such as statistics about education or other government services
▫Routine data sources such as routine monitoring data about teacher absenteeism or the number of visits by agriculture extension officers
▫Periodic data sources such as exit interview surveys

Outcome (every 1 to 3 years):
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

Impact (every 2 to 5 years):
▫Periodic data sources such as surveillance
▫Periodic data sources such as population-based surveys
▫One-off data sources such as special studies (research or evaluation)

Results to be achieved with this Component
Long-term result:
▫Evaluation and research results are used to inform policy, programming and intervention selection.
Short- and medium-term results:
▫Inventory of completed and ongoing program evaluations and research studies.
▫Inventory of evaluation and research capacity, including major research institutions and their areas of work.

Results to be achieved with this Component
Short- and medium-term results (cont'd):
▫Program evaluation and research agenda.
▫Ethical approval procedures and standards in place.
▫Guidelines on evaluation and research standards and methods.
▫Participation in a conference or forum to disseminate and discuss evaluation and research findings.
▫Evidence of use of evaluation/research findings (e.g. research results referenced in planning documents).

Key concepts
Evaluation:
▫The systematic and objective assessment of an ongoing or completed project, program or policy, its design, implementation and results.
Types of evaluations:
▫Formative Evaluation
▫Process Evaluation
▫Outcome Evaluation
▫Economic Evaluation
▫Impact Evaluation

What kind of evaluation is this?
▫Conducting a situation assessment of the CO2 emissions by factories in Turkey to understand the current emission levels
▫Comparing which HIV interventions will avert the most new HIV infections
▫Understanding whether maternal health programmes have resulted in lower maternal and infant mortality and morbidity
▫Assessing the driving practices of motorists in Turkey and how these have changed over time
▫Assessing the implementation of public servant training programmes to understand how implementation took place

Key concepts
Types of research (different ways of classification):
▫By purpose:
Basic science research
Applied research
▫By type of data:
Quantitative
Qualitative
▫By the way in which the research will be used:
Policy analysis
Services research
Operations research

Key concepts
Difference between research and evaluation:
▫The key distinguishing factor is the primary intent of the activity:
Research: generation of generalisable knowledge
Evaluation: assessment of something specific
From research and evaluation to learning

Key concepts
Ethics in research:
▫Is it true?
▫Is it fair?
▫Is it wise?

Results to be achieved with this Component
Long-term result:
▫Data quality (data are valid, reliable, comprehensive, and timely) and the thoroughness of all six data management processes are externally verified on a periodic basis, and actions are implemented to address obstacles to the production of high-quality data.
Short- and medium-term results:
▫Guidelines for supportive supervision are developed.
▫Data auditing protocols are followed.
▫Supportive supervision visits, including data assessments and feedback, take place.
▫Data audit visits take place periodically.
▫Supervision reports and data audit reports are produced.

Key concepts
Data quality assurance is a set of internal and external mechanisms and processes to ensure that data meet the six dimensions of quality.
Data quality assurance processes include:
▫planning for quality,
▫controlling quality, and
▫implementing remedial actions to improve quality.
Supportive supervision and data auditing are among the DQA processes (controlling for quality).

Key concepts
▫M&E supervision vs. implementation supervision
▫Supportive supervision and data auditing are integral parts of a routine monitoring system
▫Adequate funding and skilled human resources are needed
▫Supervision should be supportive
▫Supervision and data auditing take place at all levels where data are managed


Key concepts
Guidelines and protocols are needed:
▫Guidelines for supportive supervision
▫Data audit protocols
Data auditing involves:
▫Verifying that appropriate data management systems are in place
▫Verifying the quality of reported data
There are five types of verifications during data auditing.

Verification Types
Verification no. 1: Observation
If practically possible, observe the connection between the delivery of services/commodities and the completion of the source document that records that service delivery.
Verification no. 2: Documentation Review
Review the availability and completeness of all source documents (records) for the selected reporting period.
Verification no. 3: Trace and Verification
Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported number; (3) identify reasons for any differences.
Verification no. 4: Cross-checks
Cross-check the verified report totals with other data sources (e.g. inventory records, laboratory reports), if these are available.
Verification no. 5: Spot checks
Perform spot checks to verify the actual delivery of services or commodities to the target populations, if practically feasible.
Source: the Global Fund
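Verification no. 3 (trace and verification) is, at its core, a recount-and-compare exercise, sketched below. The field names and figures are hypothetical; in practice the recount is done from the site's paper or electronic source documents.

```python
# Hypothetical monthly source documents at one site, plus the total
# the site reported upward for the quarter.
source_documents = [
    {"month": "Jan", "orphans_supported": 22},
    {"month": "Feb", "orphans_supported": 25},
    {"month": "Mar", "orphans_supported": 21},
]
site_reported_total = 70

# (1) Recount from source documents; (2) compare to the reported figure;
# (3) flag any difference so reasons can be identified with site staff.
recounted = sum(d["orphans_supported"] for d in source_documents)
if recounted != site_reported_total:
    print(f"Reported {site_reported_total}, recounted {recounted}: "
          "identify reasons for the difference")
```

A discrepancy is not automatically an error: the audit step that follows is identifying the reason (a missing register page, a transcription mistake, double-counting), which is why the verification is done on site with the people who manage the data.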