Data Validation ETA/ASTD Regional Technical Assistance Forum November 2, 2011 San Francisco, CA.

2 Outline
 Data Validation background and requirements
 Components of Data Validation: Report and Data Element
 Federal approach to Monitoring
 What's New
 Ideas for improving Data Accuracy and Reliability

3 Background and Requirements

4 Data Validation
Data validation affects every aspect of performance management: data collection, data processing, and reports and information.
"Performance is your reality. Forget everything else." - Harold Geneen

5 Data Validation Initiative
Purpose of USDOL's Initiative
 To support improved management of federal programs and respond to data quality issues cited by oversight agencies
Most recent Guidance for Program Year 2010 Performance Reporting and Data Validation
 TEGL and Change 1
Source Documentation Requirements
 TEGL 27-10, Change 1, Attachment A

6 Data Validation and Eligibility
States are responsible for identifying required Program Eligibility documents, or may delegate that responsibility to the Local Area. While not required, many States use the Data Validation Source Documentation Guide as the required document listing for Program Eligibility.

7 Components of Data Validation

8 What Comprises Data Validation?
Report Validation (RV)
 Are the performance calculations reported by the state accurate based on federal reporting specifications?
Data Element Validation (DEV)
 Is the data used in the calculations accurate?

9 Report Validation
Compares the outcomes reported by the State, using its own system or ETA's DRVS, to the outcomes computed through DRVS, resulting in an error rate.
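The RV comparison described above can be sketched in a few lines. The measure names, counts, and tolerance below are purely hypothetical; DRVS is the actual tool that performs this comparison.

```python
# Hypothetical sketch: compare state-reported counts to validated counts
# and flag measures whose discrepancy (error rate) exceeds a tolerance.
reported = {"Entered Employment": 1250, "Retention": 980, "Avg Earnings": 1010}
validated = {"Entered Employment": 1250, "Retention": 995, "Avg Earnings": 1010}

def report_validation(reported, validated, tolerance=0.01):
    """Return measures whose reported count differs from the validated
    count by more than the given tolerance (as a fraction)."""
    discrepancies = {}
    for measure, v_count in validated.items():
        r_count = reported.get(measure, 0)
        error = abs(r_count - v_count) / v_count if v_count else 0.0
        if error > tolerance:
            discrepancies[measure] = round(error, 4)
    return discrepancies

print(report_validation(reported, validated))  # -> {'Retention': 0.0151}
```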

10 Data Element Validation
Using approved source documentation, compares the data in individual records to verify the accuracy of the data elements used in calculating the reports, resulting in an error rate.

11 Data Element Validation
Were allowable source documents used to verify each required data element?
Documentation must either MATCH or SUPPORT each data element.
Most source documentation is located at the One Stop level; Wage Records, for example, are stored only at the State level.
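The match-or-support check can be sketched as follows. The element names and allowable-document sets here are illustrative only, not ETA's official lists:

```python
# Hypothetical sketch of a DEV check: each data element in a participant
# record must be backed by an allowable source document.
ALLOWABLE_SOURCES = {
    "date_of_birth": {"birth certificate", "driver's license", "passport"},
    "veteran_status": {"DD-214", "cross-match"},
    "ssn": {"social security card", "cross-match"},
}

def validate_record(record_docs):
    """record_docs maps each data element to the source document on file
    (or None). Returns the list of elements that fail validation."""
    failures = []
    for element, allowed in ALLOWABLE_SOURCES.items():
        doc = record_docs.get(element)
        if doc not in allowed:
            failures.append(element)
    return failures

record = {"date_of_birth": "driver's license",
          "veteran_status": None,          # no document on file -> fails
          "ssn": "cross-match"}
print(validate_record(record))  # -> ['veteran_status']
```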

12 Source Documentation
Whether scanned, paper, or system cross-match, the purpose of source documentation is to provide an auditable trail that documents the participant, the services delivered, and the outcomes achieved.

13 MIS (Management Information System) MIS refers to specific, detailed information that is stored in the state’s management information system that supports an element.  An indicator, such as a checkmark on a computer screen, is not acceptable source documentation in and of itself.

14 Cross Match
Cross match requires accessing a non-WIA MIS to find detailed supporting evidence for the data element.
 An indicator or the presence of an SSN or checkmark in a non-WIA database is not sufficient evidence.

15 Self-Attestation
Self-attestation occurs when a participant states their status for a particular data element, then signs and dates a form acknowledging that status.

16 Case Notes Case notes refer to either paper or electronic statements by the case manager that identify, at a minimum, the following:  a participant’s status for a specific data element  the date on which the information was obtained  the case manager who obtained the information
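A completeness check on the three minimum case-note fields named above can be sketched as below; the field names are illustrative, not a prescribed schema:

```python
# Hypothetical sketch: a case note is acceptable only if it records the
# participant's status for the element, the date the information was
# obtained, and the case manager who obtained it.
REQUIRED_FIELDS = ("element_status", "date_obtained", "case_manager")

def case_note_is_complete(note):
    """Return True only if every required field is present and non-empty."""
    return all(note.get(field) for field in REQUIRED_FIELDS)

good = {"element_status": "low income: yes",
        "date_obtained": "2011-10-03",
        "case_manager": "J. Smith"}
bad = {"element_status": "low income: yes", "date_obtained": ""}
print(case_note_is_complete(good), case_note_is_complete(bad))  # -> True False
```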

17 Monitoring

18 Review Approach
Where the Review Takes Place
 Traditional On-Site Review
 Remote Review
 Combination
Focus of the Review
 Program Reporting and DV
 DV as a component of a comprehensive review

19 Review Scope
Programs
 Workforce Investment Act (WIA)
 Trade (TAA) (currently on hold)
 LX (Wagner-Peyser/VETS)
Validation Cycles
 Most recently completed cycle (PY/FY)
 Additional cycle sometimes included
Components
1. Report Validation
2. Data Element Validation

20 Review Components
Data management and the resultant quality of reported data are derived from and influenced by the policies, procedures, and protocols utilized at the state and/or local levels.
 Review of Reporting and Validation Process (including MIS)
 Review of Policies, Procedures, Protocols (including training) and how they are deployed

21 Training and Monitoring
Data collection and data entry:
 Routine training should be provided on data management guidance
 All staff involved in the collection or entry of data should be trained in the procedures
 The data entry process should include steps for verifying entered data against original sources, on a sample basis or for the entire population of records
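The sample-based verification step above can be sketched as follows. The record structure and sample size are hypothetical; an actual process would compare entered values against the original source documents:

```python
# Hypothetical sketch: draw a reproducible random sample of entered
# records and flag any whose entered value disagrees with the source.
import random

def draw_verification_sample(record_ids, sample_size, seed=0):
    """Pick records for re-verification against source documents."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return sorted(rng.sample(record_ids, min(sample_size, len(record_ids))))

def verify(entered, source, sample_ids):
    """Return ids whose entered value disagrees with the source value."""
    return [rid for rid in sample_ids if entered[rid] != source[rid]]

entered = {1: "1990-01-01", 2: "1985-07-04", 3: "1972-12-31"}
source  = {1: "1990-01-01", 2: "1985-07-14", 3: "1972-12-31"}
sample = draw_verification_sample(list(entered), sample_size=3)
print(verify(entered, source, sample))  # -> [2]  (data entry typo)
```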

22 Policies/Procedures and Training
Grantees should develop guidance for staff and sub-grantees involved in the collection of data:
 Definitions of data elements
 Sources of information
 Participant record and documentation requirements
 Procedures for collecting, entering, and reporting data, and associated "business rules" that cover timeliness and completeness
 Procedures for entering data into an automated database
 Procedures for correcting data

23 Common Source Documentation Issues
 Using documents that are not on ETA's list of Acceptable Source Documents
 Not using the most recent Guidance
 Checkbox in MIS
 Date of Birth
 Youth Who Needs Additional Assistance
 Failure to accurately record dates
 Poor case notes
 No participant signature for self-attestation

24 Non-Compliance with EXIT Requirements
 Exit dates not reflective of dates of last service
 Gaps in service spanning years
 Case management used to extend the exit date
 Hard exits utilized
  Date of last contact = Exit date
  Date of employment = Exit date
 Services provided within 90 days
 Lack of a common exit date (across core workforce programs)
 Exit dates not consistent with dates in MIS
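The underlying exit rule, that the exit date should be the date of last service and that exit occurs only after 90 days without service, can be sketched as below. The dates are illustrative, and this simplifies actual program exit policy:

```python
# Hypothetical sketch: derive the exit date from the last service date,
# exiting only once 90+ days have passed with no further service.
from datetime import date, timedelta

def compute_exit_date(service_dates, as_of):
    """Return the exit date (date of last service) if 90+ days have
    elapsed with no service as of `as_of`; otherwise None (still
    participating)."""
    last_service = max(service_dates)
    if as_of - last_service >= timedelta(days=90):
        return last_service
    return None

services = [date(2011, 3, 1), date(2011, 5, 20)]
print(compute_exit_date(services, as_of=date(2011, 9, 1)))  # -> 2011-05-20
print(compute_exit_date(services, as_of=date(2011, 7, 1)))  # -> None (within 90 days)
```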

25 Incorrect Capture/Coding of Required Data Elements
Several data elements are routinely cited across reviews:
 Ethnicity and Race
 Veteran Status
 UI Claimant Status
 UC Eligible Status
 School Status at Participation
 Needy Family Status
 Dislocation Date
Examples: ethnicity and race combined (in MIS, on forms, etc.); no response for ethnicity interpreted as a negative response (MIS default)
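The MIS-default pitfall in the last example can be sketched as below; the coding values are illustrative, not official reporting codes:

```python
# Hypothetical sketch: a missing ethnicity response must be carried as
# "did not self-identify", never silently defaulted to "No".
def code_ethnicity(response):
    """Map a form response to a reporting value. None (no response) is
    coded 'DNI' (did not self-identify), not treated as a negative."""
    if response is None:
        return "DNI"
    return "Yes" if response else "No"

print(code_ethnicity(None), code_ethnicity(True), code_ethnicity(False))
```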

26 Record Retention
Two Primary Issues
 Participant files missing, unable to be located, or with documents missing
 Wage records and validation extract files purged or simply not kept

27 DEV Error Rates > 5%
 "High" error rates are only noted as an Area of Concern at this time, because error rates are still 'provisional'
 In some cases, high error rates continue across validation cycles for the same data elements
 What has the state done to eliminate or minimize errors?

28 Quality of Participant Files
 Lack of consistent file structure
 Documents missing
 Case notes missing, skimpy, illegible, or irrelevant
 MIS contains different information
 One file for multiple periods of participation for the same individual

29 What’s New

30 DV Monitoring Guide
 Still in development as a Supplement to the Core Monitoring Guide
 Will be ETA's official Monitoring Guide
 Conveys basic requirements but allows for some regional flexibility

31 Data Validation Requirements
Workgroup made recommendations for change:
 Started prior to, but now even more important with, reduced Statewide Funds
 Recommendations include reduced sample sizes and areas of scope
 No official approval at this time; the request was to start by PY2011

32 Tools for Improving Data Quality

33 Report Validation Summary as a Tool for Quality
 Compare 'Reported' counts to 'Validated' counts
 Analyze 'Reported' counts against State WIA Annual and LEX 4th Qtr reports
 Review the data submitted for numerators and denominators

34 Using the Report Validation Summary as a Tool for Quality
Possible data quality issues:
 Difficulties in submitting WIA Annual Report/LEX 4th Qtr data
 Wrong WIA Annual Report/LEX 4th Qtr file was submitted
 Incorrect data element values in the State system


37 Data Element Validation Summary
Reported Data Error Rate
 # of records in error divided by the total # of records with the particular element validated
Overall Error Rate
 # of records in error divided by the total # of records sampled for a specific funding stream
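The two error rates defined above reduce to simple ratios; the counts in the example are hypothetical:

```python
# Sketch of the two DEV summary error rates defined on this slide.
def reported_data_error_rate(errors_for_element, records_with_element):
    """# of records in error / total # of records with the element validated."""
    return errors_for_element / records_with_element

def overall_error_rate(records_in_error, records_sampled):
    """# of records in error / total # of records sampled for the funding stream."""
    return records_in_error / records_sampled

# e.g. 4 of 50 sampled records had some error; 3 had a bad dislocation
# date out of the 30 records where that element applied
print(round(reported_data_error_rate(3, 30), 3))  # -> 0.1
print(round(overall_error_rate(4, 50), 3))        # -> 0.08
```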


41 Wrap Up
Questions? Comments?