DATA QUALITY
How closely do the data used reflect the truth about results?

2 What is quality?
Quality is a dynamic concept that changes continuously in response to changing customer requirements:
- Conformance to specifications
- Fitness for use

3 Purpose of Data Quality Assessment
MANDATORY: Data reported to Washington for Government Performance and Results Act (GPRA) purposes, or reported externally on Agency performance, must have had a data quality assessment at some time within the three years before submission. (ADS )
USAID Missions are mandated to conduct data quality assessments more frequently if needed; in any case, managers should be aware of the strengths and weaknesses of all indicators.
Missions are not required to conduct data quality assessments for data that are not reported to USAID/Washington, and managers are not required to assess every performance indicator they use.
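The three-year rule above is easy to operationalize. A minimal Python sketch (indicator names and dates are hypothetical) that flags indicators whose last DQA falls outside the window:

```python
# A minimal sketch of the three-year rule: flag indicators whose last
# data quality assessment is more than three years before the planned
# submission date. All names and dates below are hypothetical.
from datetime import date

submission = date(2003, 10, 1)
last_dqa = {
    "Number of people employed": date(2001, 3, 10),
    "Rate of student attrition": date(1999, 6, 1),
}

for indicator, assessed in last_dqa.items():
    # Rule of thumb: assessed within the three years before submission
    due = (submission - assessed).days > 3 * 365
    print(f"{indicator}: last DQA {assessed} -> {'DQA DUE' if due else 'OK'}")
```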

4 Issues
MANAGEMENT: Can you make decisions based on the data? Better-quality data lead to better-informed management and planning.
REPORTING: Are the data believable? Audiences want to know how credible your data are so they can trust your analysis and conclusions.

5 Quality issues
Problems can result from:
- Human error
- Machine error
- Process error

6 Five standards for data quality
- VALIDITY
- RELIABILITY
- PRECISION
- TIMELINESS
- INTEGRITY

7 Validity
Key question: Do the data clearly and directly measure what we intend? (Recall the seven indicator characteristics.)
Issue: Bias
- Result: Modern sanitation practices improved
- Indicator: Number of residents in targeted villages who report using "clean household" practices
- Source: Door-to-door survey conducted three times a year
- Problem: Most people in the targeted region work long hours in the fields during the harvest season, so a door-to-door survey misses them.
Issue: Directness
- Result: Poverty of vulnerable communities in conflict region reduced
- Indicator: Number of people living in poverty
- Source: Government statistics office
- Problem: The government does not include internally displaced persons (IDPs) in its poverty statistics.
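To see why excluding IDPs matters, here is a minimal sketch with hypothetical numbers showing how leaving out a subgroup biases the indicator downward:

```python
# A minimal sketch (all numbers hypothetical): an indicator computed
# only over the officially counted population understates poverty by
# whatever the excluded group (here, IDPs) contributes.
resident_population = 900_000
resident_poor = 180_000        # counted by the statistics office
idp_population = 100_000
idp_poor = 60_000              # excluded from official statistics

official_rate = resident_poor / resident_population
true_rate = (resident_poor + idp_poor) / (resident_population + idp_population)

print(f"Official poverty rate (IDPs excluded): {official_rate:.1%}")  # 20.0%
print(f"Actual poverty rate (IDPs included):  {true_rate:.1%}")       # 24.0%
```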

8 Reliability
Key question: If you repeated the same measurement or collection process, would you get the same data?
Issue: Consistency and repeatability
- Result: Employment opportunities for targeted sectors expanded
- Indicator: Number of people employed by USAID-assisted enterprises
- Source: Structured interviews with USAID-assisted enterprises, as reported by implementing partners AAA, BBB, and CCC
- Problem: The DO Team found that the implementing partners were using different definitions of "employee": AAA counts anyone who receives wages from the enterprise; BBB counts only those receiving full-time wages; CCC counts anyone who works at least 25 hours a week.
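The usual remedy is a single operational definition applied to every partner's raw data. A minimal Python sketch, with hypothetical field names and a hypothetical 20-hour threshold:

```python
# A minimal sketch (hypothetical field names and threshold): applying
# one shared definition of "employee" so counts from partners AAA, BBB,
# and CCC are comparable across partners and reporting periods.

# Raw rows as each partner might report them.
records = [
    {"partner": "AAA", "receives_wages": True,  "hours_per_week": 10},
    {"partner": "BBB", "receives_wages": True,  "hours_per_week": 40},
    {"partner": "CCC", "receives_wages": False, "hours_per_week": 30},
]

def is_employee(row):
    """Single agreed definition used by every partner:
    paid by the enterprise AND works at least 20 hours/week."""
    return row["receives_wages"] and row["hours_per_week"] >= 20

total = sum(is_employee(r) for r in records)
print(f"People employed (shared definition): {total}")
```

With one definition in code, changing the threshold becomes a deliberate, documented decision rather than three partners quietly drifting apart.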

9 Timeliness
Key question: Are data available in time to inform management decisions?
Issue: Frequency
- Result: Use of modern contraceptives by targeted population increased
- Indicator: Number of married women of reproductive age reporting use of modern contraceptives (CPR)
- Source: DHS survey
- Problem: The DHS survey is conducted only about every five years.
Issue: Currency
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Enrollment analysis report from the Ministry of Education
- Problem: In July 2002 the MOE published the full enrollment analysis for school year August 2000 – June 2001, so the newest available data were already more than a year old.
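A staleness check can make the "how current" question routine. A minimal sketch, with hypothetical indicators and maximum acceptable ages:

```python
# A minimal sketch (hypothetical indicators and thresholds): flag
# indicators whose most recent data point is too old to support
# current-year management decisions.
from datetime import date

# Latest available observation and the maximum acceptable age, in days.
indicators = {
    "CPR (DHS survey)":         (date(2000, 6, 30), 2 * 365),
    "Primary school attrition": (date(2001, 6, 30), 1 * 365),
}

today = date(2002, 7, 15)  # review date echoing the slide's example
for name, (latest, max_age_days) in indicators.items():
    age = (today - latest).days
    status = "OK" if age <= max_age_days else "STALE"
    print(f"{name}: {age} days old -> {status}")
```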

10 Precision
Key question: Are the data precise enough to inform management decisions?
Issue: Enough detail
- Result: CSO representation of citizen interests at national level increased
- Indicator: Average score of USAID-assisted CSOs on the CSO Advocacy Index
- Source: Ratings made by Partner XXX after interviews with each CSO
- Problem: The DO team reported the average index scores to the Mission Director: 1999 = …, 2000 = …, 2001 = …
Issue: Margin of error
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Survey conducted by partner; the survey is informal, with a margin of error of +/- 10%
- Problem: The USAID intervention is expected to cause only 5 more students (per 100) to stay in school, a change smaller than the survey's margin of error, so the survey cannot detect it.
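The margin-of-error problem can be checked numerically: if the expected effect is smaller than the survey's margin of error, the survey cannot see it. A minimal sketch using the standard 95% sample-size formula for a proportion (the +/-2.5% target margin is an assumption for illustration):

```python
# A minimal sketch: compare the expected effect to the survey's margin
# of error, then estimate the sample size needed for a tighter margin
# (standard 95% formula for a proportion, worst case p = 0.5).
import math

expected_effect = 0.05   # intervention: 5 more students per 100 stay in school
margin_of_error = 0.10   # informal survey: +/- 10 percentage points

if expected_effect <= margin_of_error:
    print("Survey is too imprecise: the expected change is within the noise.")

def required_n(margin, z=1.96, p=0.5):
    """n >= z^2 * p * (1 - p) / margin^2 for a simple random sample."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# To detect a 5-point change, aim for a margin comfortably below it.
print("Sample size for a +/-2.5% margin:", required_n(0.025))  # 1537
```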

11 Integrity
Key question: Are mechanisms in place to reduce the possibility that data are manipulated for political or personal gain?
Issue: Intentional manipulation
- Result: Financial sustainability of targeted CSOs improved
- Indicator: Dollars of funding raised from local sources per year
- Source: Structured interviews with targeted CSOs
- Problem: When a DO Team member conducted spot checks with the CSOs, she found that organizations CCC and GGG had counted funds from other donors as part of their "locally raised" funds.
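Spot checks like the one in this example can be systematized by comparing reported figures against independently verified ones. A minimal sketch with hypothetical figures:

```python
# A minimal sketch (hypothetical figures): compare reported values
# against independently verified values from spot checks and flag
# discrepancies above a tolerance for follow-up.
reported = {"AAA": 12000, "CCC": 30000, "GGG": 18000}   # from interviews
verified = {"AAA": 11800, "CCC": 21000, "GGG": 9500}    # from spot checks

TOLERANCE = 0.10  # flag differences above 10% of the verified value

for cso, rep in reported.items():
    ver = verified[cso]
    gap = abs(rep - ver) / ver
    if gap > TOLERANCE:
        print(f"{cso}: reported {rep}, verified {ver} ({gap:.0%} gap) -> INVESTIGATE")
```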

12 Techniques to Assess Data Quality
WHY? The goal is to ensure the DO team is aware of:
- Data strengths and weaknesses
- The extent to which the data can be trusted when making management decisions and reporting
All data reported to Washington must have had a data quality assessment at some time in the three years before submission (ADS).

13 Examples of problems
- Invalid key fields: data collection forms not standardized
- Location accuracy: different wards
- Logical inconsistencies: jobs completed before they started
- Mandatory fields missing data: sex or age
- Data collector bias: selfish personal interest
Several of these can be caught with simple automated checks, as sketched below.
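A minimal sketch (hypothetical record layout) of automated checks for three of the problems above: missing mandatory fields, invalid key fields, and date logic:

```python
# A minimal sketch (hypothetical record layout): automated checks for
# missing mandatory fields, invalid key fields, and logical
# inconsistencies such as a job completed before it started.
from datetime import date

MANDATORY = ("id", "sex", "age")

def check_record(rec):
    problems = []
    # Mandatory fields missing data (e.g., sex or age)
    for field in MANDATORY:
        if rec.get(field) in (None, ""):
            problems.append(f"missing mandatory field: {field}")
    # Invalid key field (a non-standardized form may mangle the ID)
    if not str(rec.get("id", "")).startswith("REC-"):
        problems.append("invalid key field: id")
    # Logical inconsistency: job completed before it started
    start, end = rec.get("job_start"), rec.get("job_end")
    if start and end and end < start:
        problems.append("logical inconsistency: job_end before job_start")
    return problems

rec = {"id": "17", "sex": "", "age": 34,
       "job_start": date(2011, 5, 1), "job_end": date(2011, 4, 2)}
print(check_record(rec))  # lists all three problems for this record
```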

14 Ways of improving quality
- Tackle quality at the source, not downstream in the data lifecycle
- Train data collectors; training is important for getting the data right the first time
- Pursue continual improvement using a quality method

15 HOW? Steps to Conduct an Assessment
1. Review performance data: examine data collection, maintenance, and processing procedures and controls.
2. Verify performance data against the data quality standards: validity, reliability, precision, timeliness, integrity.
3. If data quality limitations are identified, take action to address them: triangulate and supplement with data from multiple sources; report the limitations; revise the indicator.
4. Document the assessment and its limitations in reports sent to the Mission, which in turn communicates the outcome of the assessment to the IP.
5. Retain supporting documentation in files (e.g., photocopies of documents collected during the exercise).
6. If the data will be included in the annual report, disclose the DQA findings in the "data quality limitations" section of the Annual Report.
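Steps 4-6 are easier when each assessment produces a consistent artifact. A speculative sketch of one possible structure for recording a DQA, with hypothetical field names and findings:

```python
# A speculative sketch (hypothetical structure): recording a DQA so that
# documenting, retaining, and disclosing limitations (steps 4-6 above)
# work from one consistent artifact.
from dataclasses import dataclass, field
from datetime import date

STANDARDS = ("validity", "reliability", "precision", "timeliness", "integrity")

@dataclass
class DQARecord:
    indicator: str
    assessed_on: date
    findings: dict                       # one entry per standard
    limitations: list = field(default_factory=list)
    actions: list = field(default_factory=list)

    def disclosure(self):
        """Text for the annual report's 'data quality limitations' section."""
        if not self.limitations:
            return f"{self.indicator}: no significant limitations identified."
        return f"{self.indicator}: " + "; ".join(self.limitations)

dqa = DQARecord(
    indicator="Number of people employed by USAID-assisted enterprises",
    assessed_on=date(2002, 7, 15),
    findings={**{s: "met" for s in STANDARDS}, "reliability": "not met"},
    limitations=["partners used inconsistent definitions of 'employee'"],
    actions=["agree on a single definition", "re-verify next quarter"],
)
print(dqa.disclosure())
```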