Earmark Grants: Developing Performance Measures


Earmark Grants: Developing Performance Measures
Understanding Grant Proposal Requirements, Identifying Appropriate Performance Measures, and Understanding Reporting Requirements

Session Objectives
1. Review the context for performance accountability
2. Review grant proposal performance requirements
3. Review federal reporting requirements

Federal Requirements
Statutes, policies, and directives requiring grantees to focus on performance outcomes:
- GPRA, WIA & Common Measures
- President's Management Agenda
- 29 CFR Parts 95 & 97
- Congress & Sponsors
www.doleta.gov/performance/guidance/tools_commonmeasures.cfm

Getting Started
Identifying performance measures requires time, input, and a clear understanding of your project's goals. You do not need to start from scratch: your grant proposal already contains a Statement of Need with identified goals and a Statement of Work with specific tasks. Ask yourself:
- What are our major activities for this project, and why are we doing them (pull information from the Statement of Need and Statement of Work)?
- What will be different as a result of your project, and for whom?
- What will the community or other stakeholders say is the value of your project?

Grant Proposal Performance Requirements
Part 5, Project Outcomes, recommends grantees use a table to identify the following information:
- The name of the performance measure
- How the performance measure will be defined:
  - Who/what is included in the numerator
  - Who/what is included in the denominator
  - Any other exclusions for calculating the measure
  - The time period for calculating outcomes
  - The data sources that will be used to collect the information needed to calculate outcomes/results (optional, not required)
- The planned level of outcome, expressed as a numeric level and as a percentage, where appropriate (e.g., your project plans to help 50 of the 100 individuals enrolled in the project find employment, which translates to a 50% Placement Rate)
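Each rate measure in the examples that follow has the same basic shape; the planned level of outcome simply fills the formula in with planned counts. For the placement example just described:

$$\text{Placement Rate} = \frac{\text{participants placed in employment}}{\text{participants enrolled}} = \frac{50}{100} = 50\%$$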

Types of Performance Measures (Type of Outcome Measure by Grant Purpose)

Provide Training:
- Training Completion Rate
- Placement in Employment Rate
- Average Earnings
- Retention in Employment Rate
- Customer Satisfaction Rate

Increase Skills:
- Skill Attainment Rate
- Attainment of Degree or Certificate Rate
- Earnings Change Rate

Research-Oriented or Product Development:
- Qualitative Information
- Research Findings
- Products or Curriculum Developed
- Results Disseminated

Example 1: Enrollment Rate Performance Table

Name of Performance Measure: Measure 1: Enrollment Rate

Measure Definition or Formula:
- The number of participants who enroll in the project divided by the number who are recruited for project participation
- Only participants enrolled in the project are included in the numerator
- The denominator includes individuals who completed an intake/enrollment form (i.e., all registrants)
- (Optional for proposal submission) Data sources include intake forms and enrollment information

Planned Level of Outcome: 50% Enrollment Rate (100 individuals will enroll in the project out of 200 individuals recruited for the project)
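Applying that structure to the planned counts in this table:

$$\text{Enrollment Rate} = \frac{100 \text{ enrolled}}{200 \text{ recruited}} = 50\%$$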

Example 2: Skill Attainment Rate Performance Table

Name of Performance Measure: Measure 2: Skill Attainment Rate

Measure Definition or Formula:
- The number of participants who improve their skill levels by at least one grade level by the end of the grant period divided by the total number of participants enrolled in training
- Only participants enrolled in training are included in the denominator
- Only those participants who improve their reading, writing, or math skills by one level are included in the numerator
- Pre- and post-tests will be used to assess skill levels
- Participants can only be counted once in the numerator regardless of the number of skills achieved
- (Optional for proposal submission) Data sources include training vendor records, training certificates, and pre- and post-test results

Planned Level of Outcome: 90% Skill Attainment Rate (90 out of 100 individuals will improve their skill levels by one level)
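The "counted once" rule matters when a participant gains more than one skill. Below is a minimal sketch of that counting logic; the participant IDs, skill names, and field layout are made up for illustration and are not part of the grant template.

```python
# Hedged sketch: counting skill attainment with the "count each participant once" rule.
# Participant IDs and skill names below are hypothetical.
skill_gains = [
    (101, "reading"), (101, "math"),   # participant 101 improved two skills
    (102, "writing"),
    (103, "math"),
]
enrolled_in_training = {101, 102, 103, 104, 105}  # denominator: everyone enrolled in training

# Numerator: participants with at least one skill gain, counted once each.
attainers = {pid for pid, _skill in skill_gains if pid in enrolled_in_training}

skill_attainment_rate = len(attainers) / len(enrolled_in_training)
print(f"Skill Attainment Rate: {skill_attainment_rate:.0%}")  # 3 of 5 enrollees -> 60%
```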

Example 3: Training Completion Rate Performance Table

Name of Performance Measure: Measure 3: Training Completion Rate

Measure Definition or Formula:
- The number of participants who complete training by the end of the grant period divided by the total number of participants enrolled in the grant
- Only participants enrolled in the grant are included in the denominator
- Only those participants who complete training by the end of the grant period are included in the numerator
- Training completion means the participant completed 24 hours of basic skills training and received a certificate of completion
- (Optional for proposal submission) Data sources include training vendor records and training certificates

Planned Level of Outcome: 80% Training Completion Rate (80 out of 100 individuals will complete training)

Example 4: Placement Rate Performance Table

Name of Performance Measure: Measure 4: Placement Rate

Measure Definition or Formula:
- The number of participants who enter employment by the end of the grant period divided by the total number of participants who are enrolled in the project
- Only participants who are enrolled in the grant are included in the denominator
- Only those participants who are employed by the last day of the grant period are included in the numerator
- Employment is defined as working 20 hours per week at minimum wage
- (Optional for proposal submission) Data sources include training vendor records, training certificates, pay stubs, participant written attestation, or employer verification

Planned Level of Outcome: 56% Placement Rate (56 out of 100 individuals will be employed)
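Worked out with the planned counts from this table (the 56 planned placements also serve as the base for the retention measure in Example 6):

$$\text{Placement Rate} = \frac{56 \text{ employed by the last day of the grant}}{100 \text{ enrolled}} = 56\%$$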

Example 5: Average Earnings Performance Table

Name of Performance Measure: Measure 5: Average Earnings

Measure Definition or Formula:
- Of those participants employed by the last day of the grant period, the total post-program hourly wages divided by the total number of participants employed by the end of the grant period
- Only participants employed by the last day of the grant period are included in the denominator
- The numerator is the cumulative total of all participant hourly wages for those participants employed by the last day of the grant period
- Employment is defined as working 20 hours per week at minimum wage
- (Optional for proposal submission) Data sources include pay stubs, participant written attestation, or employer verification

Planned Level of Outcome: $8.75 Average Hourly Wage
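Unlike the rate measures, this measure averages a dollar amount over the placed participants. Assuming the 56 planned placements from Example 4, the planned level implies total post-program hourly wages of about $490 across those participants:

$$\text{Average Earnings} = \frac{\sum \text{hourly wages of employed participants}}{\text{number employed}} = \frac{\$490}{56} = \$8.75 \text{ per hour}$$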

Example 6: Retention Rate Performance Table

Name of Performance Measure: Measure 6: Retention Rate

Measure Definition or Formula:
- The number of participants who are still employed in the 8th quarter of project operations divided by the total number of participants who are employed by the 6th quarter of project operations
- Only participants who enter employment by the 6th quarter of project operations are included in the denominator
- Only participants who are still employed in the 8th quarter of project operations are included in the numerator
- Employment need not be with the same employer to count as being retained; wages greater than $1 count as employment
- (Optional for proposal submission) Data sources include training vendor records, training certificates, pay stubs, participant written attestation, or employer verification

Planned Level of Outcome: 70% Retention Rate (39 out of 56 individuals will be retained in employment)
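Note that the denominator here is the placed participants, not all enrollees, and the planned figure is rounded:

$$\text{Retention Rate} = \frac{39 \text{ still employed in the 8th quarter}}{56 \text{ employed by the 6th quarter}} \approx 69.6\% \approx 70\%$$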

Example 7: Curriculum Development Rate Performance Table

Name of Performance Measure: Measure 7: Curriculum Development Rate

Measure Definition or Formula:
- The number of training modules developed by the end of the project divided by the total number of modules planned for the project
- Only training modules that have been completed are included in the numerator
- Training module completion means that all the student and trainer materials are finished and ready to be tested
- (Optional for proposal submission) Data sources include training manuals and materials verification

Planned Level of Outcome: 90% Curriculum Development Rate (9 out of 10 training modules will be completed by the end of the grant period)

Examples 8 and 9: Product Development Performance Table

Name of Performance Measure: Measure 8: Develop Website Recruitment Tool
Measure Definition or Formula: A website will be developed that participants can use to locate employment opportunities in the automotive industry in Pennsylvania. In addition, the website will provide employers with the ability to search for possible employment candidates.
Planned Level of Outcome: Recruitment website developed by 12/31/2008

Name of Performance Measure: Measure 9: Competency Report Completed
Measure Definition or Formula: A report outlining the core competencies for the automotive industry will be developed. The report will provide the competency name and a description of the behaviors, skills, or activities associated with each competency.
Planned Level of Outcome: Report addressing the competency model for the automotive industry completed by 7/15/2008

Example 10: Problem Resolution Rate Performance Table

Name of Performance Measure: Measure 10: Problem Resolution Rate

Measure Definition or Formula:
- The project will conduct a community audit in the first quarter of the grant period to identify workforce development areas requiring problem resolution. Once issues have been identified, grant staff will work with the community to remedy at least 50% of the issues identified during the audit.
- Only issues identified during the community audit will be included in the denominator
- Issues must be resolved by the end of the grant period to count in the numerator

Planned Level of Outcome: 60% Problem Resolution Rate (6 out of the 10 problems identified during the community audit will be resolved)

Example 11: Participant Customer Satisfaction Rate Performance Table

Name of Performance Measure: Measure 11: Participant Customer Satisfaction Rate

Measure Definition or Formula:
- The number of participants who state that they are satisfied or extremely satisfied with the services received through the project divided by the total number of participants enrolled in the project
- Only those participants enrolled in the project are included in the denominator
- Only participants who select "satisfied" or "extremely satisfied" with project services are included in the numerator
- Participants will be given a written survey to complete two weeks prior to the end of the grant period
- (Optional for proposal submission) Data sources include customer satisfaction surveys

Planned Level of Outcome: 70% Participant Customer Satisfaction Rate (35 out of 50 participants will state that they are satisfied or extremely satisfied with the services received through the grant)

Performance Management
Data Collection → Data Processing → Reports and Information

"What matters in the end is completion. Performance. Results. Not just making promises, but making good on promises."
- President's Management Agenda

Identifying Grant Responsibilities
The earmark grant proposal template (Part 6 Management and Personnel, Part A. Applicant Organization and Project Administration, Subpart (2) Project Administration, Reporting Responsibilities and Processes) requires grantees to address the following:
- How will data for the project be collected?
- Who has responsibility for data entry, compilation, and processing?
- What management information system will be used to maintain the data?
- How will the data be secured?

Data Collection—Where to Get the Information?
Source documentation includes:
- Social Security card
- Driver's license/ID card
- Hospital records
- Intake/eligibility forms
- Attendance sheets
- Sign-in sheets
- School records
- Activity forms
- Assessment results
- Pay stubs
- Progress reports
- Surveys
- Self-attestation forms
- Copy of diploma
- Training certificates
- Interviews
- Public agency records
- Student ID

Data Processing—Who Has Responsibility?
- The grantee is responsible for ensuring that a system is in place to track participant characteristics, services, and outcomes
- It is highly recommended that grantees maintain access to data processing and reporting at all times: you need to know what is going on with your grant!
- The grantee may contract out for services, but should provide input on how the data is gathered and maintained
- Regardless of who provides the service, your grant proposal should identify who is responsible for ensuring information will be collected and processed sufficiently

Data Processing—What System Should We Use?
How sophisticated or elaborate does the grantee's MIS need to be?
- It varies, but high-performing organizations have an MIS that produces information and reports that help staff address issues and improve performance
- Examples include MS Access, MS Excel, or a proprietary client tracking system
- Grantees can also contract with local workforce investment areas to process and aggregate data
What must the grantee's MIS be able to do?
- At a minimum, capture all required data elements, perform any necessary calculations, and report information to the grantee and its partners
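As a minimal illustration of what "capture data elements and perform calculations" can look like in practice, here is a sketch of participant tracking and measure calculation in Python. The Participant class, its field names, and the sample roster are illustrative assumptions, not part of the grant template or any ETA system; the two measures mirror Examples 4 and 5 above.

```python
from dataclasses import dataclass

@dataclass
class Participant:
    """Minimal participant record; fields are illustrative, not an official data layout."""
    participant_id: int
    enrolled: bool = False
    completed_training: bool = False
    employed_at_grant_end: bool = False
    hourly_wage: float = 0.0  # post-program hourly wage, if employed

def placement_rate(participants):
    """Placement Rate: employed by the last day of the grant / enrolled in the project."""
    enrolled = [p for p in participants if p.enrolled]
    employed = [p for p in enrolled if p.employed_at_grant_end]
    return len(employed) / len(enrolled) if enrolled else 0.0

def average_earnings(participants):
    """Average Earnings: total hourly wages of employed participants / number employed."""
    employed = [p for p in participants if p.employed_at_grant_end]
    return sum(p.hourly_wage for p in employed) / len(employed) if employed else 0.0

# Example usage with made-up records
roster = [
    Participant(1, enrolled=True, completed_training=True,
                employed_at_grant_end=True, hourly_wage=9.00),
    Participant(2, enrolled=True, completed_training=True),
    Participant(3, enrolled=True, employed_at_grant_end=True, hourly_wage=8.50),
    Participant(4, enrolled=True),
]
print(f"Placement Rate: {placement_rate(roster):.0%}")          # 2 of 4 enrollees -> 50%
print(f"Average Earnings: ${average_earnings(roster):.2f}/hr")  # (9.00 + 8.50) / 2 = $8.75
```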

Federal Data Collection Requirements
29 CFR Part 37 contains the nondiscrimination and equal opportunity requirements for your grant:
- Grantees must provide initial and continuing notice that they do not discriminate on the basis of race, color, religion, sex, national origin, age, disability, political affiliation, or citizenship status as a lawfully admitted immigrant authorized to work in the U.S.
- 29 CFR Part 37 requires grantees to collect equal opportunity information for every applicant, registrant, participant, terminee, applicant for employment, and employee: race/ethnicity, sex, age, and, where known, disability status
- An individual has the right to refuse to provide any part or all of the above information
The Jobs for Veterans Act (P.L. 107-288) requires grantees to give priority to veterans who meet the eligibility requirements for the grant:
- Grantees will need to track the number of veterans and veterans' spouses served (see http://www.doleta.gov/programs/VETS for more information)

Federal Reporting Requirements
Reporting requirements are based on the Uniform Administrative Requirements at 29 CFR 95.51 and 97.40. Reports should include:
- A comparison of actual accomplishments with the goals and objectives (i.e., performance measures) for the project
- Reasons why established goals were not met and the corrective action being taken
- Other pertinent information, including technical assistance needs, best practices, or any promising approaches
ETA has a recommended quarterly progress report format for your use, with established due dates:
- The quarterly progress report and quarterly financial report (ETA 9130) are due 45 days after the end of the quarter
- The progress report format is recommended, but use the instructions provided by your Regional Office
- The final evaluation report is due 90 days after the end of the grant period of performance

THANK YOU
Are there any questions?