Data Quality Management

Learning Objectives  At the end of this session, participants will be able to:  Summarize basic terminology regarding data quality management  List and describe the five threats to data quality  Identify possible threats to data quality in an information system  Generate a plan to manage identified threats to data quality

Why M&E is important  M&E promotes organizational learning and encourages adaptive management

Content of Workshop Session  What is Data Quality?  The criteria for data quality  Constructing a data quality plan  Data quality auditing

What is Data Quality?  A trade-off (a "chess game") between cost and quality  Criterion-based evaluation of data  A criterion-based system of data management

Dimensions of Data Quality  Validity  Reliability  Timeliness  Precision  Integrity

Definition of Validity  A characteristic of measurement in which a tool actually measures what the researcher intended to measure  Have we actually measured what we intended?

Threats to Validity  Definitional issues  Proxy measures  Inclusions / Exclusions  Data sources

Validity: Questions to ask yourself…  Is there a relationship between the activity or program and what you are measuring?  What is the data transcription process? Is there potential for error?  Are steps being taken to limit transcription error (e.g., double keying of data for large surveys, built-in validation checks, random checks)?
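
The double-keying check mentioned above can be automated. Below is a minimal sketch in Python, assuming two independently keyed CSV files that share a record_id column (the file and column names are hypothetical); every field-level disagreement is flagged for reconciliation against the paper source.

```python
import csv

def load_entries(path):
    """Load one keying pass of a CSV into a dict of record_id -> row."""
    with open(path, newline="") as f:
        return {row["record_id"]: row for row in csv.DictReader(f)}

def compare_double_entry(first_pass, second_pass):
    """Yield (record_id, field, value_a, value_b) for every mismatch."""
    for record_id, row_a in first_pass.items():
        row_b = second_pass.get(record_id)
        if row_b is None:
            yield (record_id, "<missing in second pass>", None, None)
            continue
        for field, value_a in row_a.items():
            if value_a.strip() != row_b.get(field, "").strip():
                yield (record_id, field, value_a, row_b.get(field))

# Hypothetical file names; each mismatch is checked against the paper form.
mismatches = compare_double_entry(load_entries("entry_pass1.csv"),
                                  load_entries("entry_pass2.csv"))
for record_id, field, a, b in mismatches:
    print(f"record {record_id}: field '{field}' keyed as {a!r} vs {b!r}")
```

Because the two passes are compared field by field, single-keystroke transcription errors surface immediately instead of propagating into analysis.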

Definition of Reliability  ‘A characteristic of measurement concerned with consistency’  Can we consistently measure what we intended?

Threats to Reliability I  Time  People  Place

Threats to Reliability II  Collection methodologies  Collection instruments  Personnel issues  Analysis and manipulation methodologies

Reliability: Questions to ask yourself…  Is the same instrument used from year to year, site to site?  Is the same data collection process used from year to year, site to site?  Are there procedures in place to ensure that data are free of significant error and that bias is not introduced (e.g., instructions, indicator information sheets, training, etc.)?

Definition of Timeliness  The relationship between the timing of collection, collation, and reporting and the relevance of the data for decision-making processes  Do the data still have relevance and value when reported?

Threats to Timeliness  Collection frequencies  Reporting frequencies  Time dependency

Timeliness: Questions to ask yourself…  Are data available on a frequent enough basis to inform program management decisions?  Is a regularized schedule of data collection in place to meet program management needs?  Are data from within the reporting period of interest (i.e. are the data from a point in time after the intervention has begun)?  Are the data reported as soon as possible after collection?
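
These timeliness questions lend themselves to mechanical checks. A minimal sketch, assuming each record carries a collection date and a reporting date (the field names, period, and lag threshold are illustrative assumptions, not from the presentation):

```python
from datetime import date

# Hypothetical reporting period and acceptable reporting lag.
PERIOD_START = date(2024, 1, 1)
PERIOD_END = date(2024, 3, 31)
MAX_LAG_DAYS = 30

def timeliness_flags(records):
    """Flag records collected outside the period or reported too late."""
    for rec in records:
        collected, reported = rec["collected_on"], rec["reported_on"]
        if not (PERIOD_START <= collected <= PERIOD_END):
            yield rec["id"], "collected outside reporting period"
        if (reported - collected).days > MAX_LAG_DAYS:
            yield rec["id"], f"reporting lag exceeds {MAX_LAG_DAYS} days"

records = [
    {"id": "A1", "collected_on": date(2024, 2, 10), "reported_on": date(2024, 2, 20)},
    {"id": "A2", "collected_on": date(2023, 12, 5), "reported_on": date(2024, 1, 15)},
]
for rec_id, issue in timeliness_flags(records):
    print(rec_id, issue)   # A2 trips both checks
```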

Definition of Precision  Accuracy (measure of bias)  Precision (measure of error)  Is the margin of error in the data less than the expected change the project was designed to effect?

Threats to Precision  Source error / bias  Instrumentation error  Transcription error  Manipulation error

Precision: Questions to ask yourself…  Is the margin of error less than the expected change being measured?  Are the margins of error acceptable for program decision making?  Have issues around precision been reported?  Would an increase in the degree of accuracy be more costly than the increased value of the information?
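
The first question can be made concrete with the standard margin-of-error formula for a proportion, MOE = z * sqrt(p(1-p)/n). A short sketch under the assumption of a simple random sample at a 95% confidence level (the figures are purely illustrative):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a proportion
    under simple random sampling: z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative figures: baseline coverage of 40%, a sample of 400,
# and a project designed to effect a 10-percentage-point change.
moe = margin_of_error(p=0.40, n=400)   # ~0.048, i.e. about ±4.8 points
expected_change = 0.10

print(f"margin of error: ±{moe:.3f}")
print("precise enough to detect the change:", moe < expected_change)
```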

Good Data are Valid, Reliable and Precise  [Figure: three target diagrams comparing shot patterns]  Accurate/Valid, Reliable, Precise  Reliable and Precise, but not Accurate/Valid  Neither Accurate/Valid, nor Reliable, nor Precise

Definition of Integrity  A measure of the 'truthfulness' of the data  Are the data free from 'untruth' introduced by either human or technical means, whether willfully or unconsciously?

Threats to Integrity I  Temptation  Technology  Time

Threats to Integrity II  Corruption, intentional or unintentional  Personal manipulations  Technological failures  Lack of audit verification and validation
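
One generic technical safeguard against the last two threats is an append-only audit log in which each entry is chained to its predecessor by a hash, so silent edits become detectable on verification. This is a minimal illustration of the idea, not a method prescribed by the presentation:

```python
import hashlib
import json

def append_entry(log, record):
    """Append a record to the audit log, chaining it to the previous hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log):
    """Recompute every hash; any tampered entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"site": "Clinic A", "indicator": "ART", "value": 120})
append_entry(log, {"site": "Clinic B", "indicator": "ART", "value": 95})
print(verify_chain(log))            # True
log[0]["record"]["value"] = 999     # simulate a silent, unlogged edit
print(verify_chain(log))            # False: tampering is detected
```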

The Data Quality Plan  Operational Plan for managing data quality  Indicator Information Sheets  Includes a Data Quality Risk Analysis  Includes an audit trail reference

Framework for Data Quality Assessments  [Diagram: relationships within a data system]  Data Management System: data management processes/procedures (Collection, Collation, Analysis, Reporting, Usage)  Data Quality System: data quality processes/procedures (Validity, Reliability, Integrity, Precision, Timeliness), assessed against risk at the source  Auditable System: a paper trail that allows verification and validation of the entire DMS and the data produced within it

Data Quality Issues at SOURCE  The potential risk of poor data quality increases with secondary and tertiary data sources  Examples:  Validity: data could be incomplete (incomplete doctors' notes, illegible notes in patient files)  Reliability: inconsistent recording of information by different staff because of differing skill levels

To Ensure Data Quality at SOURCE  Design instruments carefully and correctly  Include data providers (community stakeholders) and data processors in decisions to establish what is feasible to collect, to review the process, and to draft instruments  Develop and document instructions for the data collectors, on the collection forms, and for computer procedures

To Ensure Data Quality at SOURCE  Ensure all personnel are trained in their assigned tasks; use one trainer if possible  Develop an appropriate sample

Data Quality Issues at COLLECTION  Incomplete entries in spreadsheets  Incorrect data transcriptions  Data entered in wrong fields in a database  Inconsistent entries of data by different data capturers
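
Several of these issues can be caught by simple automated checks at or shortly after entry. A sketch, assuming rows arrive as dictionaries with hypothetical field names and a known list of valid site codes; it flags incomplete entries, non-numeric values in a numeric field, and inconsistent site codes:

```python
REQUIRED_FIELDS = ["site", "date", "patients_seen"]   # hypothetical schema
VALID_SITES = {"Clinic A", "Clinic B", "Clinic C"}    # canonical site codes

def check_row(row_num, row):
    """Return a list of data-quality issues found in one entered row."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not str(row.get(field, "")).strip():
            issues.append(f"row {row_num}: '{field}' is incomplete")
    if row.get("site") and row["site"] not in VALID_SITES:
        issues.append(f"row {row_num}: inconsistent site code {row['site']!r}")
    try:
        if int(row["patients_seen"]) < 0:
            issues.append(f"row {row_num}: negative count")
    except (KeyError, ValueError):
        issues.append(f"row {row_num}: 'patients_seen' is not a number")
    return issues

rows = [
    {"site": "Clinic A", "date": "2024-02-01", "patients_seen": "34"},
    {"site": "clinic a", "date": "", "patients_seen": "thirty"},
]
for i, row in enumerate(rows, start=1):
    for issue in check_row(i, row):
        print(issue)   # the second row trips all three checks
```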

To ensure data quality during COLLECTION  Develop specific instructions for data collection  Routinely check whether the instructions are being followed  Identify what to do if someone wants to change the data collection process or if problems arise during data collection (a change management process)  Check whether people follow the change management process  Ensure all data collection, entry, and analysis materials are available (pens, paper, forms, computers)

To ensure data quality during COLLECTION  Train data collectors in how to collect information  Develop SOPs for managing the collected data (e.g., moving data from one point to the next)  Develop SOPs for revising the collection tool  Communicate the process and SOPs  Conduct on-site reviews during the process

To ensure data quality during COLLATION  Develop checklists and sign-offs for key steps  Conduct reviews during the entry process  Create an electronic or manual format that includes data verification by a second individual who is not entering the data

To ensure data quality during COLLATION  Randomly sample data and verify them  Ensure problems are reported, documented, corrected, communicated, and tracked back to the source of the problem
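
The random-sampling step can be scripted so the draw is reproducible and the observed error rate is documented. A sketch with illustrative data; the fixed seed is an assumption that makes the sample itself auditable:

```python
import random

def draw_verification_sample(record_ids, fraction=0.05, seed=2024):
    """Draw a reproducible random sample of records for re-verification
    against the source documents."""
    rng = random.Random(seed)                        # fixed seed: auditable draw
    k = max(1, round(len(record_ids) * fraction))
    return sorted(rng.sample(record_ids, k))

def error_rate(sampled_ids, database, source):
    """Sampled records whose database value differs from the source value."""
    mismatches = [i for i in sampled_ids if database[i] != source[i]]
    return mismatches, len(mismatches) / len(sampled_ids)

database = {i: i * 2 for i in range(100)}   # illustrative entered values
source = dict(database)
source[7] = 999                             # one simulated transcription error

sample = draw_verification_sample(list(database), fraction=0.10)
mismatches, rate = error_rate(sample, database, source)
print(f"verified {len(sample)} records, {len(mismatches)} mismatches ({rate:.1%})")
```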

To ensure data quality during ANALYSIS  Ensure analysis techniques meet the requirements for proper use  Disclose all conditions/assumptions affecting interpretation of the data  Have experts review reports for reasonableness of analysis

To ensure data quality during REPORTING  Synthesize results for the appropriate audience  Maintain integrity in reporting: don't leave out key information  Have multiple reviewers within the organization, prior to dissemination!  Protect confidentiality in reports and communication tools  Review data and provide feedback with those who have a stake in the results

To ensure data quality during USAGE  Understand your data!  Use your data!

Minimizing Data Quality Risks  Technology: ensure that data analysis/statistical software is up to date; streamline instruments and data collection methods  Competence of personnel: ensure that staff are well-versed in all stages of the data management process (data collection, entry, assessment, risk analysis, etc.) and proficient with data software  Documentation and audit trails  Outsourcing

Data Quality Audits  Verification  Validation  Self-assessment  Internal audit  External audit

DQA Process (Plan-Do-Check-Act)  Plan: data quality training; data quality plans; construct the audit plan  Do: self-evaluation; data input; run error logs; report generation  Check: review self-evaluations; audit input from partners; review error logs; audit data in the database; audit the output reports; submit the audit report  Act: close non-compliances; correct data practices; clean the database  (The auditor is responsible for the audit-related activities highlighted in the original diagram.)

M&E Work Plan tasks  Identify the risks associated with your current data management practices and assign a risk value to them  Identify the contingency plans needed to improve the data quality practice  Complete a Data Quality Plan for one of the Indicators you will be reporting against.
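
For the first task, one common convention (an assumption here, not prescribed by the workshop) is to score each risk as likelihood times impact, so that contingency planning can be prioritized. A minimal sketch with illustrative risks:

```python
# Hypothetical risk register: each risk scored 1-5 for likelihood and impact.
risks = [
    {"risk": "Transcription errors at data entry", "likelihood": 4, "impact": 3},
    {"risk": "Late reporting from remote sites",   "likelihood": 3, "impact": 4},
    {"risk": "Database corruption, no backups",    "likelihood": 1, "impact": 5},
]

for r in risks:
    r["score"] = r["likelihood"] * r["impact"]   # simple likelihood x impact

# The highest-scoring risks get contingency plans first.
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f"{r['score']:>2}  {r['risk']}")
```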

Acknowledgements  This presentation was the result of on-going collaborations between:  USG – The President’s Emergency Plan for AIDS Relief in South Africa  USAID  MEASURE Evaluation  Khulisa Management Services

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) through Cooperative Agreement GPO-A and is implemented by the Carolina Population Center at the University of North Carolina in partnership with Futures Group, John Snow, Inc., Macro International, and Tulane University. Visit us online at