Data Validation
ETA/ASTD Regional Technical Assistance Forum
November 2, 2011 · San Francisco, CA
Outline
- Data Validation background and requirements
- Components of Data Validation: Report and Data Element
- Federal approach to Monitoring
- What’s New
- Ideas for improving Data Accuracy and Reliability
Background and Requirements
Data Validation
Data validation affects every aspect of performance management: data collection, data processing, and reports and information.
“Performance is your reality. Forget everything else.” - Harold Geneen
Data Validation Initiative
- Purpose of USDOL’s initiative: to support improved management of federal programs and respond to data quality issues cited by oversight agencies
- Most recent guidance for Program Year 2010: Performance Reporting and Data Validation (TEGL 27-10 and Change 1)
- Source documentation requirements: TEGL 27-10, Change 1, Attachment A
Data Validation and Eligibility
States bear the burden of identifying required Program Eligibility documents, or may delegate that responsibility to the Local Area. While not required, many States use the Data Validation Source Documentation Guide as the required document list for Program Eligibility.
Components of Data Validation
What Comprises Data Validation?
- Report Validation (RV): Are the performance calculations reported by the state accurate based on federal reporting specifications?
- Data Element Validation (DEV): Is the data used in the calculations accurate?
Report Validation
Compares the outcomes reported by the State, using either its own system or ETA’s DRVS, to the outcomes computed through DRVS, resulting in an error rate.
Data Element Validation
Using approved source documentation, compares the data in individual records to verify the accuracy of the data elements used in calculating the reports, resulting in an error rate.
Data Element Validation
- Were allowable source documents used to verify each required data element?
- Documentation must either MATCH or SUPPORT each data element.
- Most source documentation is located at the One Stop level; wage records, for example, are stored only at the State level.
Source Documentation
Whether scanned, paper, or a system cross-match, the purpose of source documentation is to provide an auditable trail that documents the participant, the services delivered, and the outcomes received.
MIS (Management Information System)
MIS refers to specific, detailed information stored in the state’s management information system that supports an element. An indicator, such as a checkmark on a computer screen, is not acceptable source documentation in and of itself.
Cross Match
Cross match requires accessing a non-WIA MIS to find detailed supporting evidence for the data element. An indicator, or the mere presence of an SSN or checkmark in a non-WIA database, is not sufficient evidence.
Self-Attestation
Self-attestation is when a participant states their status for a particular data element and then signs and dates a form acknowledging that status.
Case Notes
Case notes refer to either paper or electronic statements by the case manager that identify, at a minimum:
- the participant’s status for a specific data element
- the date on which the information was obtained
- the case manager who obtained the information
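As a rough illustration, the minimum case-note fields listed above could be checked mechanically before a note is accepted. This is a hypothetical sketch, not part of any ETA or state system; the field names are invented for the example.

```python
# Hypothetical completeness check for a case note; the field names below
# are invented for illustration, not taken from ETA guidance or any MIS.
REQUIRED_FIELDS = ("data_element_status", "date_obtained", "case_manager")

def missing_case_note_fields(note: dict) -> list:
    """Return the minimum required fields that are absent or blank."""
    return [f for f in REQUIRED_FIELDS if not str(note.get(f, "")).strip()]

# A note lacking the case manager's name would be flagged as incomplete.
note = {"data_element_status": "Veteran: yes", "date_obtained": "2011-09-14"}
print(missing_case_note_fields(note))  # → ['case_manager']
```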
Monitoring
Review Approach
Where the review takes place:
- Traditional on-site review
- Remote review
- Combination
Focus of the review:
- Program reporting and DV
- DV as a component of a comprehensive review
Review Scope
Programs:
- Workforce Investment Act (WIA)
- Trade (TAA) (currently on hold)
- LX (Wagner-Peyser/VETS)
Validation cycles:
- Most recently completed cycle (PY/FY)
- Additional cycle sometimes included
Components:
1. Report Validation
2. Data Element Validation
Review Components
Data management, and the resulting quality of reported data, are derived from and influenced by the policies, procedures, and protocols used at the state and/or local levels.
- Review of the reporting and validation process (including the MIS)
- Review of policies, procedures, and protocols (including training) and how they are deployed
Training and Monitoring
Data collection and data entry:
- Routine training should be provided on data management guidance.
- All staff involved in the collection or entry of data should be trained in the procedures.
- The data entry process should include steps for verifying entered data against original sources, either on a sample basis or for the entire population of records.
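The sample-based verification step above might be sketched as follows. The 5% rate and the record-ID scheme are assumptions made for the example, not a prescribed methodology; setting the rate to 1.0 gives a full-census check.

```python
import random

def draw_verification_sample(record_ids, rate=0.05, seed=None):
    """Draw a simple random sample of record IDs whose entered data will be
    re-checked against the original source documents (rate=1.0 = full census)."""
    ids = list(record_ids)
    k = max(1, round(len(ids) * rate))  # always check at least one record
    return sorted(random.Random(seed).sample(ids, k))

# E.g., a 5% sample from 500 records yields 25 records to re-verify.
sample = draw_verification_sample(range(1, 501), rate=0.05, seed=2011)
print(len(sample))  # → 25
```

Fixing the seed makes the draw reproducible, which matters if the sample list must later be shown to a monitor.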
Policies/Procedures and Training
Grantees should develop guidance for staff and sub-grantees involved in the collection of data:
- Definitions of data elements
- Sources of information
- Participant record and documentation requirements
- Procedures for collecting, entering, and reporting data, and associated “business rules” that cover timeliness and completeness
- Procedures for entering data into an automated database
- Procedures for correcting data
Common Source Documentation Issues
- Using documents that are not on ETA’s list of acceptable source documents
- Not using the most recent guidance
- Checkbox in MIS (e.g., Date of Birth, Youth Who Needs Additional Assistance)
- Failure to accurately record dates
- Poor case notes
- No participant signature for self-attestation
Non-Compliance with Exit Requirements
- Exit dates not reflective of dates of last service
- Gaps in service spanning years
- Case management used to extend the exit date
- Hard exits utilized:
  - Date of last contact = exit date
  - Date of employment = exit date
  - Services provided within 90 days
- Lack of a common exit date (across core workforce programs)
- Exit dates not consistent with dates in the MIS
Incorrect Capture/Coding of Required Data Elements
Several data elements are routinely cited across reviews:
- Ethnicity and Race
- Veteran Status
- UI Claimant Status
- UC Eligible Status
- School Status at Participation
- Needy Family Status
- Dislocation Date
Example: ethnicity and race combined (in the MIS, on forms, etc.); no response for ethnicity interpreted as a negative response (MIS default).
Record Retention
Two primary issues:
- Participant files missing, cannot be located, or documents missing
- Wage records and validation extract files purged or simply not kept
DEV Error Rates > 5%
- “High” error rates are only noted as an Area of Concern at this time because error rates are still ‘provisional.’
- In some cases, high error rates continue across validation cycles for the same data elements.
- What has the state done to eliminate or minimize errors?
Quality of Participant Files
- Lack of consistent file structure
- Documents missing
- Case notes missing, sparse, illegible, or irrelevant
- MIS contains different information
- One file covering multiple periods of participation for the same individual
What’s New
DV Monitoring Guide
- Still in development as a supplement to the Core Monitoring Guide
- Will be ETA’s official monitoring guide
- Conveys basic requirements but allows for some regional flexibility
Data Validation Requirements
- A workgroup has made recommendations for change; this effort started prior to, but is now even more important with, reduced Statewide Funds.
- Recommendations include reduced sample sizes and areas of scope.
- No official approval at this time; the request was to start by PY2011.
Tools for Improving Data Quality
Report Validation Summary as a Tool for Quality
- Compare ‘Reported’ counts to ‘Validated’ counts.
- Analyze ‘Reported’ counts against the State WIA Annual and LEX 4th Qtr reports.
- Review submitted data by numerators and denominators.
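The ‘Reported’-versus-‘Validated’ comparison can be sketched as a simple diff over the two sets of counts. The measure names and counts below are made up for illustration and do not come from an actual RV summary.

```python
def count_discrepancies(reported: dict, validated: dict) -> dict:
    """Return {measure: (reported, validated)} for every measure where the
    state's reported count differs from the count recomputed in validation."""
    return {m: (reported[m], validated.get(m))
            for m in reported
            if reported[m] != validated.get(m)}

# Illustrative counts only -- any mismatch points to a data quality issue.
reported  = {"Adult EER numerator": 950, "Adult EER denominator": 1200}
validated = {"Adult EER numerator": 948, "Adult EER denominator": 1200}
print(count_discrepancies(reported, validated))  # → {'Adult EER numerator': (950, 948)}
```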
Using the Report Validation Summary as a Tool for Quality
Possible data quality issues:
- Difficulties in submitting WIA Annual Report/LEX 4th Qtr data
- The wrong WIA Annual Report/LEX 4th Qtr file was submitted
- Incorrect data element values in the State system
Data Element Validation Summary
- Reported Data Error Rate: the number of records in error divided by the total number of records for which a particular element was validated.
- Overall Error Rate: the number of records in error divided by the total number of records sampled for a specific funding stream.
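The two error rates defined above can be written out directly. The record layout (per-record sets of validated elements and elements found in error) is an assumption made for this sketch, not the actual DRVS data format.

```python
def reported_data_error_rate(records, element):
    """Records in error on `element` / records where `element` was validated."""
    checked = [r for r in records if element in r["validated"]]
    if not checked:
        return 0.0
    errors = sum(1 for r in checked if element in r["in_error"])
    return errors / len(checked)

def overall_error_rate(records):
    """Records with any element in error / total records sampled (one funding stream)."""
    if not records:
        return 0.0
    return sum(1 for r in records if r["in_error"]) / len(records)

# Four sampled records for one funding stream (illustrative data only).
records = [
    {"validated": {"dob", "veteran"}, "in_error": {"dob"}},
    {"validated": {"dob", "veteran"}, "in_error": set()},
    {"validated": {"dob"},            "in_error": set()},
    {"validated": {"veteran"},        "in_error": set()},
]
print(round(reported_data_error_rate(records, "dob"), 3))  # → 0.333
print(overall_error_rate(records))                         # → 0.25
```

Note the two denominators differ: the element-level rate divides by records where that element was actually validated, while the overall rate divides by the whole sample.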
Wrap Up
Questions? Comments?