Programme Monitoring: Data Collection Methods that Enhance Reliability
J. Welty Mangxaba 1 and M.P. Selvaggio 1
1 Khulisa Management Services, P.O. Box 923, Parklands 2121, South Africa

Introduction
Governments and donors supporting social development programmes rely on implementing partners to produce high-quality data for reporting on programmatic performance. Data derived from monitoring help managers and funders judge the efficiency and effectiveness of programme efforts. However, if the data are not of high quality, the quality of the programme efforts themselves is open to question. RELIABILITY, the consistency of data management processes, has been shown to be one of the most important aspects of overall data quality. This is particularly true of programmes that operate in multiple sites. Based on Khulisa's experience in assessing data quality in South African Government programmes and among PEPFAR/South Africa partners, inconsistent data collection processes between sites represent one of the largest risks to data quality. Khulisa's work in data quality is funded by the President's Emergency Plan for AIDS Relief (PEPFAR) through a contract with USAID/South Africa.

Tips for Improving Data Reliability
- Comprehensive data management procedures and guidelines are given to all personnel who handle data (across sites).
- Data management guidelines include Indicator Protocol Reference Sheets (IPRS) to fully define and operationalise the indicators being measured.
- An audit trail exists for all data management steps, so that the organisation itself can retrace them.
- Documents and tools are version-controlled, with dates provided.
- All decisions made with regard to the data management system are documented and incorporated into the IPRS and the data management procedures and guidelines.
- The same data collection tools are used at all programme sites.
- Ideally, the same tool is used for data collection, collation, and reporting (thus eliminating transcription steps).
- All data collectors are trained prior to data collection; ideally, all are trained by the same person to ensure consistency in instructions.
- Data are double-entered into the database, with the system designed to catch any discrepancies.
- Quality control reviews are carried out for any data transcription or manipulation steps.
- Secondary and tertiary data are checked for validity and reliability.

Common Pitfalls Affecting Data Reliability
- No documented standard operating procedures, resulting in inconsistencies in data handling methodology.
- Inter-tool reliability is not ensured when different tools are used at different sites.
- Different manipulation, analysis, or aggregation techniques are used from one period to the next.
- No standard collation tool is used.
- No audit trail is kept of manual aggregations.
- No version control of documents, which can cause confusion if obsolete documents are inadvertently used.
- An Indicator Protocol Reference Sheet contains multiple indicators, which can cause confusion about the specific process for each indicator.
- Weak or non-existent performance review process for data management personnel.

Accurately Measuring Progress
th Avenue, Parktown North, Johannesburg 2193, South Africa
Tel: Fax: Web:
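The double-entry tip above can be sketched in code. This is a minimal illustration, not Khulisa's actual system: the record structure and field names (`record_id`, `clients_served`) are hypothetical, standing in for whatever indicators a programme captures.

```python
# Minimal sketch of double data entry with discrepancy catching:
# two clerks enter the same paper forms independently, and the
# system flags every field where the two entries disagree.

def find_discrepancies(entry_a, entry_b, key="record_id"):
    """Compare two independent entries of the same records; return a
    list of (record_id, field, value_a, value_b) mismatches."""
    by_id_b = {rec[key]: rec for rec in entry_b}
    discrepancies = []
    for rec_a in entry_a:
        rec_b = by_id_b.get(rec_a[key])
        if rec_b is None:
            # Record captured in the first pass but missing from the second.
            discrepancies.append((rec_a[key], "<missing in second entry>", None, None))
            continue
        for field, value_a in rec_a.items():
            if rec_b.get(field) != value_a:
                discrepancies.append((rec_a[key], field, value_a, rec_b.get(field)))
    return discrepancies

# Illustrative data: the second clerk transposed 17 as 71 on record 2.
first_pass = [{"record_id": 1, "clients_served": 34},
              {"record_id": 2, "clients_served": 17}]
second_pass = [{"record_id": 1, "clients_served": 34},
               {"record_id": 2, "clients_served": 71}]

print(find_discrepancies(first_pass, second_pass))
# → [(2, 'clients_served', 17, 71)]
```

Only the flagged records need to be resolved against the original paper forms, which is far cheaper than re-checking every entry.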