UNITED STATES DEPARTMENT OF LABOR – Employment and Training Administration, Region 3 – Atlanta
Program Performance Reporting: Discretionary Grants

Presentation transcript:

1 UNITED STATES DEPARTMENT OF LABOR
Employment and Training Administration, Region 3 – Atlanta
Program Performance Reporting: Discretionary Grants
Discretionary Grants Training Forum, Atlanta Marriott Buckhead Hotel and Conference Center, April 26-29, 2011

2 Overview
– Accountability in general
– Data collection
– Data processing
– Reports and information
– Common performance measures

3 Establishing A Foundation
DOL Accountability – to the Congress and American taxpayers
– How was the money spent?
  – Who was served?
  – What were the types of services?
  – What were the outcomes?
– Did the programs operate as intended?
– Was the return worth the investment?
Grantee Accountability – to DOL
– Did the grantee comply with the requirements in the Grant Agreement?
– How was the money spent?
– Did the programs attain desired results?
– What factors accounted for success or failure?
[Slide graphic: "Accountability" (account-giving) → Future Funding]

4 Complete And Accurate Reporting
– All grantees are responsible for two types of reporting
  – Program, project or programmatic
  – Fiscal or financial
– Our focus today is on program reporting
  – Who did you serve?
  – What services or products did they get?
  – What were the results?
– All data elements are defined in guidance issued by DOL
– Applies to both 'formula' and 'discretionary' grants

5 Data Quality
"All data should come with a label – WARNING: These data were compiled by busy people with other priorities." – ESP Solutions Group
– Take steps to avoid "garbage in, garbage out" in reporting
  – The grant recipient's data collection, management and reporting system must leave an audit trail
  – Use system edits, range checks and logic checks (see the sketch below)
  – Ensure data are collected and recorded according to set instructions
  – Ensure required computations are completed correctly
– A data management strategy is the key to reporting accurate data
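As an illustration of the kinds of automated edits a grantee's system might run, here is a minimal sketch in Python. The field names and the age range are assumptions for illustration, not a DOL specification.

```python
from datetime import date

def edit_check(record: dict) -> list[str]:
    """Run simple range and logic checks on one participant record.

    Field names (ssn, birth_date, enrollment_date, exit_date) are
    illustrative; substitute the data elements your reporting
    instructions actually require.
    """
    errors = []

    # Range check: age at enrollment should be plausible.
    age = (record["enrollment_date"] - record["birth_date"]).days // 365
    if not 14 <= age <= 100:
        errors.append(f"implausible age at enrollment: {age}")

    # Logic check: exit date cannot precede enrollment date.
    if record.get("exit_date") and record["exit_date"] < record["enrollment_date"]:
        errors.append("exit_date is earlier than enrollment_date")

    # Completeness check: SSN is needed for wage-record matching.
    if not record.get("ssn"):
        errors.append("missing SSN (needed for wage record match)")

    return errors


if __name__ == "__main__":
    sample = {
        "ssn": "",
        "birth_date": date(1990, 5, 1),
        "enrollment_date": date(2011, 3, 15),
        "exit_date": date(2011, 2, 1),
    }
    for problem in edit_check(sample):
        print("EDIT CHECK:", problem)
```

Checks like these are most useful when they run at data entry, so errors are caught while the source documentation is still at hand.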

6 Overview Of Program Reporting
[Slide graphic: Data Collection → Data Processing → Reports & Information]
Grantees should design and implement a comprehensive data management strategy.

7 Federal Requirements
Data collection and reporting requirements come from:
– 29 CFR Parts 95 & 97
– The authorizing Public Law
– 29 CFR Part 37
– The approved reporting instructions
– The Grant Agreement

8 Data Collection Process
You need processes to gather, manage and utilize the data you collect:
– Participant file and documentation requirements
– Procedures for collecting, entering and reporting data, and associated "business rules"
– Procedures for entering data into an automated database
– Procedures for correcting data
– Established data quality standards
[Slide graphic: Source → Data Collection → Data Entry → Quality Checks → Reports]

9 Data Collection
– A process used to gather and track information for the management of the grant or project
– To develop a data collection and tracking mechanism, you must address four fundamental questions:
  – What information needs to be collected, and in what format?
  – When does the information need to be collected?
  – Where does the grantee obtain the information?
  – How does the grantee know the information is accurate or valid?

10 What Information Should You Collect?
– Refer to the reporting requirements for the grant
  – They specify the data to collect and the sources
  – They detail when, how, and where to collect the data for reports
  – They provide instructions on computing outcomes
– Collecting information to optimize performance
  – Involve staff in the process of assigning data collection responsibilities
  – Create and produce summary reports on activities to better manage grant functions
– Remember: you're collecting data (e.g., counts, characteristics) on who you're serving, what they're getting, and with what results

11 Participant Characteristics
Collect information about individual participants upon entrance into the program:
– Social Security Number (for wage record matches)
– Employment status at participation
– Participant contact and emergency contact information (e.g., name, address, telephone)
– Information to assess eligibility, as appropriate
– Demographic and EEO information (e.g., age, sex, ethnicity, race, disability status, veterans' characteristics and status)

12 Participant Characteristics
Additional information grantees may consider:
– Education information (e.g., highest school grade completed)
– Employment status at enrollment and past/current employment information
– Information about supportive service needs and additional reportable characteristics (e.g., offender, runaway, low income, single parent)
[In addition to what's required, you want to collect data that's important to you or that paints a more complete picture of your grant/project.]

13 Who Counts As A Participant?
– Does it mean what we think it means?
– The term is specifically defined in federal policy:
  – A participant is an individual (1) determined eligible to participate in the program and (2) who received a service funded by the program after being determined eligible
  – The definition does not include, for instance, those who receive only an eligibility determination (see the sketch below)
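To make the two-part definition concrete, here is a minimal sketch in Python. The record fields and dates are hypothetical stand-ins for whatever your case-management system stores.

```python
from datetime import date
from typing import Optional

def is_participant(eligibility_date: Optional[date],
                   first_service_date: Optional[date]) -> bool:
    """Apply the two-part federal definition of a participant.

    An individual counts only if (1) they were determined eligible
    and (2) they received a program-funded service after that
    determination. Eligibility determination alone is not enough.
    """
    if eligibility_date is None:
        return False                      # never determined eligible
    if first_service_date is None:
        return False                      # eligibility determination only
    return first_service_date >= eligibility_date


# Determined eligible 3/1, first funded service 3/10 -> counts as a participant.
print(is_participant(date(2011, 3, 1), date(2011, 3, 10)))   # True
# Eligibility determination only -> does not count.
print(is_participant(date(2011, 3, 1), None))                # False
```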

14 Participant Services
Collect information about the services provided to individual participants:
– Participation or service enrollment dates
– Service completion dates
– Types of services in which the participant is enrolled
– Training provider information

15 Participant Outcomes
– Collect information to support performance accountability:
  – Common measures (applies to all DOL-funded programs)
  – Grant-specific participant training outcomes
  – Other information to tell the story of your grant's accomplishments
– This includes performance-related metrics, but these may not paint the whole picture
A sketch of a participant record covering these characteristics, services, and outcomes follows.
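The preceding slides list the characteristics, services, and outcomes a grantee's files need to capture. As a minimal sketch of how such a record might be structured (the field names are illustrative, not an ETA specification):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ServiceRecord:
    """One service or training activity funded by the grant."""
    service_type: str                      # e.g., "occupational training"
    enrollment_date: date
    completion_date: Optional[date] = None
    training_provider: Optional[str] = None

@dataclass
class ParticipantRecord:
    """Characteristics, services, and outcomes for one participant."""
    # Characteristics collected at program entry
    ssn: str                               # needed for wage record matches
    name: str
    contact_info: str
    birth_date: date
    employment_status_at_participation: str
    demographics: dict = field(default_factory=dict)   # age, sex, race, etc.
    highest_grade_completed: Optional[int] = None

    # Services received after eligibility determination
    services: list = field(default_factory=list)       # list of ServiceRecord

    # Outcomes used for common measures and grant-specific reporting
    exit_date: Optional[date] = None
    employed_q1_after_exit: Optional[bool] = None
    employed_q2_q3_after_exit: Optional[bool] = None
    earnings_q2_q3_after_exit: Optional[float] = None
    credential_attained: Optional[bool] = None
```

Whatever system holds the data (spreadsheet, database, proprietary MIS), keeping the record organized along these lines makes the required quarterly aggregations straightforward.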

16 Common Measures
– Policy is located in a Training and Employment Guidance Letter dated 2/17/06
– Two groups of measures:
  – Adult measures (also applied to dislocated worker programs)
    – Entered Employment Rate
    – Employment Retention Rate
    – Average Six-Month Earnings
  – Youth measures
    – Placement in Employment or Education Rate
    – Attainment of a Degree or Certificate Rate
    – Literacy and Numeracy Gains
A sketch of how the adult measures are computed follows.
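As a rough illustration of the adult measures' logic, here is a simplified Python sketch. The record keys are hypothetical, and the exact cohort definitions and exclusions are in the OMB-approved reporting instructions, so treat this as a sketch rather than the official calculation.

```python
def adult_common_measures(exiters: list[dict]) -> dict:
    """Simplified adult common measures over a list of exiter records.

    Each record is a dict with hypothetical keys:
      employed_at_participation (bool), employed_q1 (bool),
      employed_q2 (bool), employed_q3 (bool),
      earnings_q2 (float), earnings_q3 (float).
    """
    # Entered Employment Rate: of those not employed at participation,
    # the share employed in the first quarter after exit.
    eer_cohort = [r for r in exiters if not r["employed_at_participation"]]
    eer = (sum(r["employed_q1"] for r in eer_cohort) / len(eer_cohort)
           if eer_cohort else None)

    # Employment Retention Rate: of those employed in the first quarter
    # after exit, the share also employed in the second and third quarters.
    err_cohort = [r for r in exiters if r["employed_q1"]]
    err = (sum(r["employed_q2"] and r["employed_q3"] for r in err_cohort)
           / len(err_cohort) if err_cohort else None)

    # Average Six-Month Earnings: of those employed in the first, second,
    # and third quarters after exit, average earnings in quarters 2 + 3.
    earn_cohort = [r for r in exiters
                   if r["employed_q1"] and r["employed_q2"] and r["employed_q3"]]
    avg_earnings = (sum(r["earnings_q2"] + r["earnings_q3"] for r in earn_cohort)
                    / len(earn_cohort) if earn_cohort else None)

    return {"entered_employment_rate": eer,
            "employment_retention_rate": err,
            "average_six_month_earnings": avg_earnings}
```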

17 Sources For Data: Source Documentation
– Pay stubs
– Progress reports
– Surveys
– Self-attestation forms
– Copy of diploma
– Training certificates
– Interviews
– Public agency records
– Student ID
– Social Security card
– Driver's license/ID card
– Hospital records
– Intake/eligibility forms
– Attendance sheets
– Sign-in sheets
– School records
– Activity forms
– Assessment results

18 Data Processing
– The means by which grantees input data into an information management system, find and correct errors in the data, and compile and aggregate the data into a user-friendly format
– Grantees must address the following:
  – What kind of information management system will be used to maintain and process the data?
  – Who has responsibility for data entry, compilation and processing?
  – What protocols are in place (e.g., timeliness of entry, process flow, making corrections)?
  – How do we ensure the data are accurate?

19 What System Should We Use?
– How sophisticated does the information management system need to be?
  – Ideally, you should have an information management system that maintains client data and produces reports to assist staff in addressing issues and improving performance
  – Examples include MS Access, MS Excel, or a proprietary system such as Client Tracking System, Explore Options-Take Action, ETO Cloud, Primeworks, or SamSoft WIA Intake software
  – It's also possible to contract with local workforce investment areas to process the data
– What must a grantee's information management system be able to do?
  – At a minimum, capture all required data elements, perform any necessary calculations, and report information to the grantee and stakeholders (see the sketch below)
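Even a lightweight, spreadsheet-level setup can meet that minimum bar. Here is a minimal Python sketch that reads a hypothetical participant CSV and produces a simple management summary; the file name and column names are assumptions, not part of any required format.

```python
import csv
from collections import Counter

def summarize(csv_path: str) -> None:
    """Read a participant CSV and print a simple management summary.

    Assumes columns named 'service_type' and 'exited' ('Y'/'N');
    adjust to the data elements your grant actually tracks.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    total = len(rows)
    exited = sum(1 for r in rows if r["exited"].upper() == "Y")
    by_service = Counter(r["service_type"] for r in rows)

    print(f"Total participants: {total}")
    print(f"Exiters to date:    {exited}")
    print("Enrollment by service type:")
    for service, count in by_service.most_common():
        print(f"  {service}: {count}")

if __name__ == "__main__":
    summarize("participants.csv")   # hypothetical export from your MIS
```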

20 Who Has Responsibility?
– Grantees are responsible for ensuring a system is in place to track participant characteristics, services and outcomes
– You may contract out for services, but the grantee should (at a minimum) provide input on how the data are maintained and processed
  – It is highly recommended that grantees maintain access to data processing and reporting at all times
  – You need to know what is going on with your grant… and you're accountable!

21 What Protocols Are In Place?
– It is highly recommended that grantees have policies and procedures in place to support data collection and data processing
  – Policy can be supported by written procedures (e.g., an "MIS Handbook" or "Data Entry/Processing Procedures for Staff")
  – Ongoing staff training and capacity building are critical
  – All staff, including sub-recipient staff, need to clearly understand their roles in reporting

22 Are The Data Accurate?
– Data need to be accurate, complete and consistent
– Factors affecting data quality:
  – Lack of data collection and data processing policies and procedures
  – Inaccurate and incomplete data
  – Insufficient staff training
  – Differences in definitions
  – Insufficient system controls

23 Reporting
– The means by which data are organized and compiled in a useful manner for management purposes
– Two types of reports:
  1. Internal reports for the management of the grant
     – Management must be able to easily interpret internal reports for use in decision-making
     – Internal reports are typically process-oriented
  2. External reports to ETA and other stakeholders
     – These include the required quarterly reports
     – Grantee program performance reports to ETA are based on OMB-approved reporting instructions

24 External Reports To ETA
Before transmitting your quarterly report:
– Verify that counts and outcomes are computed and reported correctly
  – Example: summing the number of males and females and arriving at a total that doesn't equal the number of participants
– Research any extreme numbers (outliers)
  – Example: calculating an outcome that exceeds 100%
A sketch of such pre-submission checks follows.
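As a rough sketch of pre-submission checks along the lines of the two examples above (the dictionary keys and report layout are hypothetical):

```python
def presubmission_checks(report: dict) -> list[str]:
    """Flag obvious arithmetic problems before a quarterly report goes out.

    `report` is a hypothetical dict of aggregate counts and rates;
    adapt the keys to your actual quarterly report layout.
    """
    findings = []

    # Counts should reconcile: males + females should equal total participants.
    if report["males"] + report["females"] != report["total_participants"]:
        findings.append("male + female counts do not equal total participants")

    # Rates are proportions: anything over 100% (or negative) is an outlier.
    for name in ("entered_employment_rate", "employment_retention_rate"):
        rate = report[name]
        if not 0.0 <= rate <= 1.0:
            findings.append(f"{name} of {rate:.0%} is outside 0-100%")

    return findings


if __name__ == "__main__":
    draft = {"males": 60, "females": 45, "total_participants": 100,
             "entered_employment_rate": 1.25, "employment_retention_rate": 0.82}
    for finding in presubmission_checks(draft):
        print("CHECK:", finding)
```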

25 About Reports
– At the heart of complete and accurate reporting are a sound data collection strategy and an effective information management system that produces useful, credible reports
– Errors made in collecting data will later translate into erroneous results in the reports you submit

26 For More Information… More detailed information on program reporting and recordkeeping for grantees can be found at: