Data Quality Management

Learning Objectives
At the end of this session, participants will be able to:
- Summarize basic terminology regarding data quality management
- List and describe the five threats to quality data
- Identify possible threats to data quality in an information system
- Generate a plan to manage identified threats to data quality

Why M&E is important
M&E promotes organizational learning and encourages adaptive management.

Content of Workshop Session
- What is data quality?
- The criteria for data quality
- Constructing a data quality plan
- Data quality auditing

What is Data Quality?
- A chess game of cost versus quality
- A criterion-based evaluation of data
- A criterion-based system of data management

Dimensions of Data Quality
- Validity
- Reliability
- Timeliness
- Precision
- Integrity

Definition of Validity
A characteristic of measurement in which a tool actually measures what the researcher intended to measure.
Have we actually measured what we intended?

Threats to Validity
- Definitional issues
- Proxy measures
- Inclusions / exclusions
- Data sources

Validity: Questions to ask yourself…
- Is there a relationship between the activity or program and what you are measuring?
- What is the data transcription process? Is there potential for error?
- Are steps being taken to limit transcription error (e.g., double keying of data for large surveys, built-in validation checks, random checks)? A sketch of a double-keying check follows.

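As an illustration of the double-keying idea mentioned above, here is a minimal sketch (with hypothetical record and field names) of comparing two independently keyed copies of the same data and flagging disagreements for a check against the source documents:

```python
# Minimal double-keying check: compare two independently keyed copies
# of the same records and report every field where they disagree.
def double_key_check(first_pass, second_pass, key_field="record_id"):
    """Return (record_id, field, value_1, value_2) tuples for disagreements."""
    second_by_id = {rec[key_field]: rec for rec in second_pass}
    mismatches = []
    for rec in first_pass:
        other = second_by_id.get(rec[key_field])
        if other is None:
            mismatches.append((rec[key_field], "<record missing>", None, None))
            continue
        for field, value in rec.items():
            if other.get(field) != value:
                mismatches.append((rec[key_field], field, value, other.get(field)))
    return mismatches

# The second keyer mistyped the age on record 2: flag it for a source check.
keyed_once  = [{"record_id": 1, "age": 34}, {"record_id": 2, "age": 41}]
keyed_twice = [{"record_id": 1, "age": 34}, {"record_id": 2, "age": 14}]
print(double_key_check(keyed_once, keyed_twice))  # [(2, 'age', 41, 14)]
```
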
Definition of Reliability
A characteristic of measurement concerned with consistency.
Can we consistently measure what we intended?

Threats to Reliability I
- Time
- People
- Place

Threats to Reliability II
- Collection methodologies
- Collection instruments
- Personnel issues
- Analysis and manipulation methodologies

Reliability: Questions to ask yourself…
- Is the same instrument used from year to year, site to site?
- Is the same data collection process used from year to year, site to site?
- Are there procedures in place to ensure that data are free of significant error and that bias is not introduced (e.g., instructions, indicator information sheets, training)?

Definition of Timeliness
The relationship between when data are collected, collated, and reported and how relevant they remain for decision-making processes.
Do the data still have relevance and value when reported?

Threats to Timeliness
- Collection frequencies
- Reporting frequencies
- Time dependency

Timeliness: Questions to ask yourself…
- Are data available on a frequent enough basis to inform program management decisions?
- Is a regularized schedule of data collection in place to meet program management needs?
- Are data from within the reporting period of interest (i.e., are the data from a point in time after the intervention has begun)?
- Are the data reported as soon as possible after collection? A sketch of a simple lag check follows.

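The last two questions can be monitored mechanically. A minimal sketch, with made-up dates, of flagging records collected outside the reporting period and measuring the lag between collection and reporting:

```python
from datetime import date

# Hypothetical reporting period and records.
period_start, period_end = date(2024, 1, 1), date(2024, 3, 31)
records = [
    {"collected": date(2024, 2, 10), "reported": date(2024, 2, 20)},
    {"collected": date(2023, 11, 5), "reported": date(2024, 2, 1)},  # stale data
]

for rec in records:
    in_period = period_start <= rec["collected"] <= period_end
    lag_days = (rec["reported"] - rec["collected"]).days
    print(f"collected {rec['collected']}: reporting lag {lag_days} days, "
          f"within period: {in_period}")
```
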
Definition of Precision
- Accuracy (a measure of bias)
- Precision (a measure of error)
Is the margin of error in the data less than the expected change the project was designed to effect?

Threats to Precision
- Source error / bias
- Instrumentation error
- Transcription error
- Manipulation error

Precision: Questions to ask yourself…
- Is the margin of error less than the expected change being measured? A worked sketch follows.
- Are the margins of error acceptable for program decision making?
- Have issues around precision been reported?
- Would an increase in the degree of accuracy be more costly than the increased value of the information?

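As a worked illustration of the first question, with made-up numbers, here is the standard 95% margin of error for a proportion estimated from a simple random sample, compared against the change a hypothetical program expects to detect:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical program: baseline coverage of 40%, expecting a 10-point change.
expected_change = 0.10
for n in (50, 200, 800):
    moe = margin_of_error(0.40, n)
    verdict = "precise enough" if moe < expected_change else "too imprecise"
    print(f"n={n:4d}: margin of error = ±{moe:.3f} -> {verdict}")
# Quadrupling the sample size halves the margin of error.
```
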
Good Data are Valid, Reliable and Precise
[Figure: three target diagrams contrasting clusters of measurements that are (1) accurate/valid, reliable, and precise; (2) reliable and precise but not accurate/valid; and (3) neither accurate/valid, nor reliable, nor precise.]

Definition of Integrity
A measure of the 'truthfulness' of the data.
Are the data free from 'untruth' introduced by either human or technical means, whether willfully or unconsciously?

Threats to Integrity I
- Temptation
- Technology
- Time

Threats to Integrity II
- Corruption, intentional or unintentional
- Personal manipulations
- Technological failures
- Lack of audit verification and validation

The Data Quality Plan
- An operational plan for managing data quality
- Indicator Information Sheets (a sketch of one possible structure follows)
- Includes a data quality risk analysis
- Includes an audit trail reference

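To make the plan concrete, here is a sketch of the kinds of fields an indicator information sheet with a built-in risk analysis might carry. The structure and field names are illustrative assumptions, not a prescribed template:

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorInformationSheet:
    """Illustrative structure for one indicator in a data quality plan."""
    name: str
    definition: str                  # guards validity: exactly what is measured
    data_source: str
    collection_method: str           # guards reliability: same instrument, same process
    collection_frequency: str        # guards timeliness
    known_risks: dict = field(default_factory=dict)  # threat -> planned mitigation
    audit_trail_ref: str = ""        # where the supporting paper trail lives

# Example entry (all values are made up).
sheet = IndicatorInformationSheet(
    name="Clients tested for HIV",
    definition="Unique clients who received an HIV test and their result",
    data_source="Facility testing register",
    collection_method="Monthly register tally on a standard form",
    collection_frequency="Monthly",
    known_risks={"double counting": "check the register for repeat client IDs"},
    audit_trail_ref="Monthly tally sheets, filed by site",
)
print(sheet.name, "-", sheet.collection_frequency)
```
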
Framework for Data Quality Assessments
[Diagram: relationships within a data system.]
- Data Management System: data management processes / procedures covering collection, collation, analysis, reporting, and usage
- Data Quality System: data quality processes / procedures addressing risk, verification, and source, assessed against validity, reliability, integrity, precision, and timeliness
- Auditable System: a paper trail that allows verification of the entire DMS and the data produced within it

Data Quality Issues at SOURCE
The potential risk of poor data quality increases with secondary and tertiary data sources.
Examples:
- Validity: data could be incomplete (incomplete doctors' notes, illegible notes in patient files)
- Reliability: inconsistent recording of information by different staff because of differing skill levels

To Ensure Data Quality at SOURCE
- Design instruments carefully and correctly
- Include data providers (community stakeholders) and data processors in deciding what is feasible to collect, in reviewing the process, and in drafting instruments
- Develop and document instructions for the data collectors, on the collection forms, and for computer procedures

To Ensure Data Quality at SOURCE
- Ensure all personnel are trained in their assigned tasks; use a single trainer if possible
- Develop an appropriate sample

Data Quality Issues at COLLECTION
- Incomplete entries in spreadsheets
- Incorrect data transcriptions
- Data entered in the wrong fields in a database
- Inconsistent entries of data by different data capturers

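Many of these problems can be caught mechanically at entry time. A minimal sketch, assuming a hypothetical spreadsheet export named collection_sheet.csv with site, visit_date, and clients_served columns:

```python
import csv
from datetime import datetime

REQUIRED_FIELDS = ["site", "visit_date", "clients_served"]

def check_row(row, line_no):
    """Yield human-readable problems found in one spreadsheet row."""
    for field in REQUIRED_FIELDS:
        if not (row.get(field) or "").strip():
            yield f"line {line_no}: '{field}' is blank"
    try:
        datetime.strptime(row.get("visit_date") or "", "%Y-%m-%d")
    except ValueError:
        yield f"line {line_no}: visit_date {row.get('visit_date')!r} is not a valid date"
    if not (row.get("clients_served") or "").isdigit():
        yield f"line {line_no}: clients_served {row.get('clients_served')!r} is not a count"

# Line numbers start at 2 because line 1 of the file holds the column headers.
with open("collection_sheet.csv", newline="") as f:
    for line_no, row in enumerate(csv.DictReader(f), start=2):
        for problem in check_row(row, line_no):
            print(problem)
```
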
To ensure data quality during COLLECTION
- Develop specific instructions for data collection
- Routinely check to see whether instructions are being followed
- Identify what to do if someone wants to change the data collection process, or if problems arise during collection (a change management process)
- Check to see whether people follow the change management process
- Ensure all data collection, entry, and analysis materials are available (pens, paper, forms, computers)

To ensure data quality during COLLECTION
- Train data collectors in how to collect information
- Develop SOPs for managing the collected data (e.g., moving data from one point to the next)
- Develop SOPs for revising the collection tool
- Communicate the process and SOPs
- Conduct on-site reviews during the process

To ensure data quality during COLLATION
- Develop checklists and sign-offs for key steps
- Conduct reviews during the entry process
- Create an electronic or manual format that includes a data verification process by a second individual who is not entering the data

To ensure data quality during COLLATION
- Randomly sample data and verify it against the source (a sketch follows)
- Ensure problems are reported, documented, corrected, communicated, and tracked back to the source of the problem

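One way to operationalize the random-sample check, as a sketch: draw a fixed-size, reproducible random sample of record IDs and list them for manual verification against the source documents. The IDs and sample size here are hypothetical:

```python
import random

def draw_verification_sample(record_ids, sample_size, seed=None):
    """Draw a reproducible random sample of records for source verification."""
    rng = random.Random(seed)  # a fixed seed makes the audit sample reproducible
    return sorted(rng.sample(list(record_ids), min(sample_size, len(record_ids))))

entered_ids = range(1, 501)  # e.g., 500 records in the database extract
to_verify = draw_verification_sample(entered_ids, sample_size=25, seed=2024)
print("Verify these records against the paper forms:", to_verify)
```
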
To ensure data quality during ANALYSIS
- Ensure analysis techniques meet the requirements for proper use
- Disclose all conditions / assumptions affecting the interpretation of the data
- Have experts review reports for reasonableness of analysis

To ensure data quality during REPORTING
- Synthesize results for the appropriate audience
- Maintain integrity in reporting: don't leave out key information
- Have multiple reviewers within the organization, prior to dissemination
- Protect confidentiality in reports / communication tools
- Review data and provide feedback to those who have a stake in the results

To ensure data quality during USAGE
- Understand your data!
- Use your data!

Minimizing Data Quality Risks
- Technology: ensure that data analysis / statistical software is up to date; streamline instruments and data collection methods
- Competence of personnel: ensure that staff are well-versed in all stages of the data management process (data collection, entry, assessment, risk analysis, etc.) and proficient with data software
- Documentation and audit trails
- Outsourcing

Data Quality Audits
- Verification
- Validation
- Self-assessment
- Internal audit
- External audit

DQA Process: Plan, Do, Check, Act
- Plan: data quality training; data quality plans; construct the audit plan
- Do: self-evaluation; data input; run error logs; report generation
- Check: review self-evaluations; audit input from partners; review error logs; audit data in the database; audit the output reports; submit the audit report
- Act: close non-compliances; correct data practices; clean the database
(In the original slide, the areas for which the auditor is responsible were highlighted in yellow.)

M&E Work Plan tasks
- Identify the risks associated with your current data management practices and assign a risk value to them (a scoring sketch follows)
- Identify the contingency plans needed to improve data quality practice
- Complete a Data Quality Plan for one of the indicators you will be reporting against

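For the first task, here is a sketch of one common way to assign risk values: score each identified risk on likelihood and impact and rank by their product. The 1-5 scale and the example risks are assumptions for illustration, not part of the workshop materials:

```python
# Illustrative scoring: risk value = likelihood (1-5) x impact (1-5).
# The risks and scores below are examples only.
risks = [
    # (description,                             likelihood, impact)
    ("Illegible entries in patient registers",  4,          3),
    ("Data entered into wrong database fields", 2,          4),
    ("Reports submitted after the deadline",    3,          2),
]

scored = sorted(
    ((desc, likelihood * impact) for desc, likelihood, impact in risks),
    key=lambda item: item[1],
    reverse=True,
)
for desc, value in scored:
    print(f"risk value {value:2d}: {desc}")
# The highest-value risks get contingency plans first.
```
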
Acknowledgements
This presentation was the result of ongoing collaborations between:
- USG – The President's Emergency Plan for AIDS Relief in South Africa
- USAID
- MEASURE Evaluation
- Khulisa Management Services

MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) through Cooperative Agreement GPO-A-00-03-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina in partnership with Futures Group, John Snow, Inc., Macro International, and Tulane University. Visit us online at http://www.cpc.unc.edu/measure