DATA QUALITY
How closely do the data used reflect the truth about results?
What Is Quality?
Quality is a dynamic concept that continuously changes in response to changing customer requirements.
- Conformance to specifications
- Fitness for use
Purpose of Data Quality Assessment
MANDATORY: Data reported to Washington for Government Performance and Results Act (GPRA) reporting purposes, or reported externally on Agency performance, must have had a data quality assessment at some time within the three years before submission (ADS 203.3.8.3).
- USAID Missions are mandated to conduct data quality assessments more frequently if needed. In any case, managers should be aware of the strengths and weaknesses of all indicators.
- USAID Missions are not required to conduct data quality assessments for data that are not reported to USAID/Washington.
- Managers are not required to conduct data quality assessments on all performance indicators that they use.
Issues
MANAGEMENT: Can you make decisions based on the data? Better-quality data lead to better-informed management and planning.
REPORTING: Are the data believable? Audiences want to know how credible your data are so they can trust your analysis and conclusions.
Quality Issues
Problems can result from:
- Human error
- Machine error
- Process error
Five Standards for Data Quality
- VALIDITY
- RELIABILITY
- PRECISION
- TIMELINESS
- INTEGRITY
Validity
Key question: Do the data clearly and directly measure what we intend? (Do they meet the seven indicator characteristics?)

Issue: Bias
- Result: Modern sanitation practices improved
- Indicator: Number of residents in targeted villages who report using "clean household" practices
- Source: Door-to-door survey conducted three times a year
- Problem: Most of the people in the targeted region work long hours in the fields during the harvest season, so the survey is likely to miss them.

Issue: Directness
- Result: Poverty of vulnerable communities in conflict region reduced
- Indicator: Number of people living in poverty
- Source: Government statistics office
- Problem: The government does not include internally displaced people (IDPs) in the poverty statistics.
Reliability
Key question: If you repeated the same measurement or collection process, would you get the same data?

Issue: Consistency (repeatability)
- Result: Employment opportunities for targeted sectors expanded
- Indicator: Number of people employed by USAID-assisted enterprises
- Source: Structured interviews with USAID-assisted enterprises, as reported by implementing partners AAA, BBB, and CCC
- Problem: The DO Team found that the implementing partners were using different definitions of "employee":
  - AAA: receives wages from the enterprise
  - BBB: receives full-time wages from the enterprise
  - CCC: works at least 25 hours a week
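The reliability problem above can be made concrete with a small sketch: the same staff records yield three different "employment" counts under the three partners' definitions. The records and field names here are invented for illustration.

```python
# Hypothetical staff records for one enterprise (illustrative data only).
staff = [
    {"name": "A", "paid": True,  "full_time": True,  "hours": 40},
    {"name": "B", "paid": True,  "full_time": False, "hours": 20},
    {"name": "C", "paid": True,  "full_time": False, "hours": 30},
    {"name": "D", "paid": False, "full_time": False, "hours": 10},
]

def count_aaa(records):
    # AAA: "employee" means receives wages from the enterprise
    return sum(1 for r in records if r["paid"])

def count_bbb(records):
    # BBB: "employee" means receives full-time wages from the enterprise
    return sum(1 for r in records if r["paid"] and r["full_time"])

def count_ccc(records):
    # CCC: "employee" means works at least 25 hours a week
    return sum(1 for r in records if r["hours"] >= 25)

print(count_aaa(staff), count_bbb(staff), count_ccc(staff))  # 3 1 2
```

The same underlying reality produces three different indicator values, which is exactly why a shared operational definition must be agreed before data collection starts.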
Timeliness
Key question: Are data available in time to inform management decisions?

Issue: Frequency
- Result: Use of modern contraceptives by targeted population increased
- Indicator: Number of married women of reproductive age reporting use of modern contraceptives (CPR)
- Source: DHS survey
- Problem: The DHS survey is conducted only about every five years.

Issue: Currency
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Enrollment analysis report from the Ministry of Education
- Problem: In July 2002 the MOE published the full enrollment analysis for school year August 2000 – June 2001, so the most recent data were already a year old.
Precision
Key question: Are the data precise enough to inform management decisions?

Issue: Enough detail
- Result: CSO representation of citizen interests at national level increased
- Indicator: Average score of USAID-assisted CSOs on the CSO Advocacy Index
- Source: Ratings made by Partner XXX after interviews with each CSO
- Problem: The DO team reported these data, with inconsistent precision, to the Mission Director: 1999 = 2.42; 2000 = 3; 2001 = 3.000

Issue: Margin of error
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Survey conducted by partner; the survey is informal and has a margin of error of +/- 10%
- Problem: The USAID intervention is expected to cause only 5 more students (for every 100) to stay in school longer, which is smaller than the survey's margin of error.
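The margin-of-error example above reduces to a simple comparison: if the expected effect of the intervention is smaller than the survey's margin of error, the survey cannot distinguish the effect from sampling noise. A minimal sketch (the variable names are illustrative):

```python
# Informal partner survey: margin of error of +/- 10 percentage points.
margin_of_error = 10.0
# Intervention expected to keep 5 more students per 100 in school.
expected_effect = 5.0

# The effect is only detectable if it exceeds the survey's margin of error.
detectable = expected_effect > margin_of_error
print(detectable)  # False: this survey is too imprecise for this decision
```

In this case the DO team would need either a more rigorous survey with a smaller margin of error, or a different indicator, before the data could support a management decision.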
Integrity
Key question: Are there mechanisms in place to reduce the possibility that data are manipulated for political or personal gain?

Issue: Intentional manipulation
- Result: Financial sustainability of targeted CSOs improved
- Indicator: Dollars of funding raised from local sources per year
- Source: Structured interviews with targeted CSOs
- Problem: When a DO Team member conducted spot checks with the CSOs, she found that organizations CCC and GGG had counted funds from other donors as part of the "locally raised" funds.
Techniques to Assess Data Quality
WHY? The goal is to ensure the DO team is aware of:
- Data strengths and weaknesses
- The extent to which data can be trusted when making management decisions and reporting
All data reported to Washington must have had a data quality assessment at some time in the three years before submission (ADS 203.3.5.2).
Examples of Problems
- Invalid key fields: data collection forms not standardized
- Location accuracy: different wards
- Logical inconsistencies: jobs completed before they started
- Mandatory fields missing data: sex or age
- Data collector bias: selfish personal interest
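Several of the problems listed above (missing mandatory fields, logical inconsistencies) can be caught with automated checks at data entry. A minimal sketch, assuming an invented record layout rather than any USAID schema:

```python
from datetime import date

# Illustrative assumption: sex and age are the mandatory fields.
MANDATORY = ("sex", "age")

def check_record(rec):
    """Return a list of data quality problems found in one record."""
    problems = []
    # Mandatory fields missing data
    for field in MANDATORY:
        if rec.get(field) in (None, ""):
            problems.append(f"missing mandatory field: {field}")
    # Logical inconsistency: job completed before it started
    start, end = rec.get("job_start"), rec.get("job_end")
    if start and end and end < start:
        problems.append("job completed before it started")
    return problems

rec = {"sex": "", "age": 34,
       "job_start": date(2001, 5, 1), "job_end": date(2001, 3, 1)}
print(check_record(rec))
# ['missing mandatory field: sex', 'job completed before it started']
```

Checks like these catch errors at the source, which is cheaper than discovering them downstream during analysis or reporting.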
Ways of Improving Quality
- Tackle quality at the source, not downstream in the data lifecycle
- Training data collectors is important in getting it right
- Pursue continual improvement using a quality method
HOW? Steps to Conduct an Assessment
1. Review performance data: examine data collection, maintenance, and processing procedures and controls.
2. Verify performance data against the data quality standards: validity, reliability, precision, timeliness, and integrity.
3. If data quality limitations are identified, take action to address them:
   - Triangulate; supplement with data from multiple sources
   - Report the limitations
   - Revise the indicator
4. Document the assessment and its limitations in reports sent to the Mission, which in turn communicates the outcome of the assessment to the IP.
5. Retain supporting documentation in files (e.g., photocopies of documents collected during the exercise).
6. If the data will be included in the Annual Report, disclose the DQA findings in its "data quality limitations" section.