WIA PERFORMANCE MEASURES AND STANDARDS: The WIASRD, Common Measures and Standards Negotiation Challenges

Christopher T. King
Ray Marshall Center for the Study of Human Resources
University of Texas, Austin

David W. Stevens
The Jacob France Institute
University of Baltimore

April 22, 2003

BRIEFING TOPICS
1. Highlights from PY 2000 program outcome information in the WIASRD files from the seven ADARE Project states, focusing on the quality of the data elements.
2. Negotiated, actual, and actual-minus-negotiated PY 2000 performance data for the seven ADARE Project states.
3. Observations about the proposed common measures.
4. WIA performance standards negotiation challenges and opportunities (including pros and cons of regression modeling).
5. Other challenges that will follow reauthorization.

EMPLOYED IN QUARTER AFTER EXIT
The data element code choices are: yes, no, and not yet available.
Georgia, Illinois and Missouri did not use the "not yet available" code. The four ADARE Project states that used the "not yet available" code used it the following percent of the time:
- Florida: 44 percent
- Maryland: 73 percent
- Texas: 23 percent
- Washington: 50 percent
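A minimal sketch of how such a tally could be produced from WIASRD-style exiter records; the field names and code values below are illustrative assumptions, not the actual WIASRD layout.

```python
# Sketch (not ADARE code): share of "not yet available" codes, by state, for the
# employed-in-quarter-after-exit data element. Record layout is hypothetical.
from collections import Counter, defaultdict

def not_yet_available_share(records):
    """records: iterable of dicts with hypothetical keys
    'state' and 'employed_q1_after_exit' coded 'yes'/'no'/'not_yet_available'."""
    counts = defaultdict(Counter)
    for rec in records:
        counts[rec["state"]][rec["employed_q1_after_exit"]] += 1
    shares = {}
    for state, tally in counts.items():
        total = sum(tally.values())
        shares[state] = tally["not_yet_available"] / total if total else 0.0
    return shares

sample = [
    {"state": "FL", "employed_q1_after_exit": "not_yet_available"},
    {"state": "FL", "employed_q1_after_exit": "yes"},
    {"state": "GA", "employed_q1_after_exit": "no"},
]
print(not_yet_available_share(sample))  # {'FL': 0.5, 'GA': 0.0}
```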

USE AND SOURCE OF SUPPLEMENTAL DATA
The data element code choices are: used case management files and record sharing/matching.
- Florida, Missouri and Washington did not report any use of supplemental data sources.
- Georgia reported only three instances of supplemental data use.
- Texas reported using supplemental data one percent of the time.
- Illinois and Maryland reported using supplemental data three percent of the time.

OCCUPATIONAL CODE OF ANY JOB HELD SINCE EXIT
This information is to be reported if the individual is reported as employed in the quarter after exit. It can be derived from case management files, follow-up services or other sources. It is not necessary to wait until the employed-in-quarter-after-exit information is available.
- Florida, Georgia and Maryland used only the nine-digit DOT code.
- Illinois and Texas used only the five-digit OES code.
- Washington used both the DOT and OES coding taxonomies.
- Missouri used the five-digit or six-digit O*NET98 code.
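The mix of taxonomies above is one reason cross-state comparisons are hard. A hedged sketch of a first-pass check, using only the code lengths cited on the slide (nine-digit DOT, five-digit OES, five- or six-digit O*NET98); a real WIASRD edit would need a full crosswalk.

```python
# Illustrative only: guess which occupational coding taxonomy a reported code
# most likely follows, based solely on digit count.
def guess_taxonomy(code: str) -> str:
    digits = "".join(ch for ch in code if ch.isdigit())
    if len(digits) == 9:
        return "DOT"
    if len(digits) == 5:
        return "OES or O*NET98"   # length alone cannot distinguish these
    if len(digits) == 6:
        return "O*NET98"
    return "unknown"

print(guess_taxonomy("045107010"))  # DOT
print(guess_taxonomy("21114"))      # OES or O*NET98
```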

ENTERED TRAINING-RELATED EMPLOYMENT
Two-thirds of the yes or no entries for this data element were recorded as a yes. The range of affirmative entries was from a low of 29 percent for Maryland to a high of 94 percent for Florida.
- The reported method used by Florida, Maryland, Texas and Washington to determine training-related employment was "other appropriate method."
- The reported method used most often by Georgia, Illinois and Missouri was "a comparison of the occupational codes of the training activity and the job," but each of these three states also used "a comparison of the industry of employment with the occupation of training using an appropriate crosswalk."
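A minimal sketch of the "comparison of occupational codes" method named above: flag a placement as training-related when the job's occupation code shares a leading code group with the occupation trained for. The two-digit grouping rule and the field names are assumptions for illustration, not a state's actual edit logic.

```python
# Illustrative comparison of training and job occupation codes.
def training_related(training_occ_code: str, job_occ_code: str, prefix_len: int = 2) -> bool:
    t = "".join(ch for ch in training_occ_code if ch.isdigit())
    j = "".join(ch for ch in job_occ_code if ch.isdigit())
    return bool(t and j) and t[:prefix_len] == j[:prefix_len]

print(training_related("21114", "21117"))  # True: same leading code group
print(training_related("21114", "49021"))  # False: different groups
```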

ENTERED NONTRADITIONAL EMPLOYMENT
The nontraditional employment designation can be based on either local or national data.
- Six percent of the yes or no entries for this data element were reported as a yes.
- The range of affirmative entries among the seven ADARE Project states was from a low of one percent to a high of fifteen percent.
- Texas did not report yes or no entries for this data element.

TYPE OF RECOGNIZED EDUCATIONAL/OCCUPATIONAL CERTIFICATE, CREDENTIAL, DIPLOMA OR DEGREE ATTAINED
- Seven codes are provided. States and localities have flexibility in choosing the methods used to collect data documenting this data element.
- Each of the seven ADARE Project states reported award of some credentials in each of the six credential-type categories.

PY 2000 CORE MEASURES OF PERFORMANCE: SEVEN ADARE PROJECT STATES
The four Adult and Dislocated Worker performance measures are covered:
- Entered employment rate
- Employment and credential rate
- Retention rate
- Earnings change
Each of the four charts that follow "flies in" PY 2000 negotiated, actual, and actual-minus-negotiated performance measure values for the seven ADARE Project states.
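A minimal sketch, with made-up numbers, of the comparison the next four charts display: actual minus negotiated values for each PY 2000 core measure, by state. The figures below are placeholders, not the ADARE Project results.

```python
# Illustrative actual-minus-negotiated comparison (placeholder values only).
negotiated = {"Entered employment rate": {"MD": 68.0, "TX": 71.0}}
actual     = {"Entered employment rate": {"MD": 72.4, "TX": 69.5}}

for measure, targets in negotiated.items():
    for state, target in targets.items():
        diff = actual[measure][state] - target
        print(f"{measure} / {state}: actual - negotiated = {diff:+.1f} points")
```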

QUESTIONS TO ASK WHEN LOOKING AT THE CHARTS THAT FOLLOW
- Do I know enough about the criteria for specifying each negotiated performance measure value to interpret the observed differences in these negotiated values among the seven ADARE Project states?
- Do I know enough about the data sources that were used to calculate the actual performance measure values to interpret the actual-minus-negotiated differences in these values among the seven ADARE Project states?
- What management and/or policy conclusions can I reach based on my answers to the previous two questions?
- Can I be confident in making incentive awards and imposing sanctions based on actual-minus-negotiated value differences?

Program Year 2000 (July 2000-June 2001): Entered Employment Rate

Program Year 2000 (July 2000-June 2001): Employment And Credential Rate

Program Year 2000 (July 2000-June 2001): Retention Rate

Program Year 2000 (July 2000-June 2001): Earnings Change

REVISITING THE QUESTIONS, HAVING LOOKED AT THE CHARTS
- Do I know enough about the criteria for specifying each negotiated performance measure value to interpret the observed differences in these negotiated values among the seven ADARE Project states?
- Do I know enough about the data sources that were used to calculate the actual performance measure values to interpret the actual-minus-negotiated differences in these values among the seven ADARE Project states?
- What management and/or policy conclusions can I reach based on my answers to the previous two questions?
- Can I be confident in making incentive awards and imposing sanctions based on actual-minus-negotiated value differences?

COMMON MEASURE ISSUES: Performance Measure Quality
ENTERED EMPLOYMENT RATE
Data elements:
- Registration date
- Employed or not employed at registration
- Exit date
- Entered employment by the end of the first quarter after exit
ISSUES
- Staff decision whether and when to register a customer
- Quality of the "employed or not employed at registration" data element
- Unintended consequences of this measure
- Staff decision when to enter, or allow automatic entry of, the exit date
- Use of supplemental data sources
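A minimal sketch of how the entered employment rate could be computed from the data elements listed above. Under the common-measures definition the measure is generally restricted to exiters who were not employed at registration; the field names are illustrative, not the WIASRD layout.

```python
# Illustrative entered employment rate calculation.
def entered_employment_rate(exiters):
    """exiters: iterable of dicts with hypothetical boolean keys
    'employed_at_registration' and 'employed_by_end_of_q1_after_exit'."""
    pool = [e for e in exiters if not e["employed_at_registration"]]
    if not pool:
        return None
    hits = sum(1 for e in pool if e["employed_by_end_of_q1_after_exit"])
    return hits / len(pool)

sample = [
    {"employed_at_registration": False, "employed_by_end_of_q1_after_exit": True},
    {"employed_at_registration": False, "employed_by_end_of_q1_after_exit": False},
    {"employed_at_registration": True,  "employed_by_end_of_q1_after_exit": True},  # excluded
]
print(entered_employment_rate(sample))  # 0.5
```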

COMMON MEASURE ISSUES: Performance Measure Quality
EMPLOYMENT RETENTION RATE
Data elements:
- Employed in first quarter after exit (regardless of employment status at time of registration)
- Employed in second quarter after exit
- Employed in third quarter after exit
ISSUES
- Stakeholder interest in this measure
- Drill-down questions that will be asked
- Use of supplemental data sources
- Timeliness of availability for intended uses
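A sketch of the retention rate implied by the elements above: among exiters employed in the first quarter after exit (regardless of status at registration), the share also employed in both the second and third quarters after exit. Field names are illustrative assumptions.

```python
# Illustrative employment retention rate calculation.
def employment_retention_rate(exiters):
    pool = [e for e in exiters if e["employed_q1_after_exit"]]
    if not pool:
        return None
    retained = sum(1 for e in pool
                   if e["employed_q2_after_exit"] and e["employed_q3_after_exit"])
    return retained / len(pool)
```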

COMMON MEASURE ISSUES: Performance Measure Quality
EARNINGS INCREASE
Data elements:
- Earnings in second quarter prior to registration
- Employed in first quarter after exit
- Earnings in first quarter after exit
- Earnings in third quarter after exit
ISSUES
- Stakeholder interest in this measure
- Drill-down questions that will be asked
- Number of 'pays' in each reference quarter
- Use of supplemental data sources
- Timeliness of availability for intended uses
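One possible formulation of the earnings increase measure, built only from the data elements listed above (the precise formula varied across common-measures proposals): for exiters employed in the first quarter after exit, average post-exit earnings in the first and third quarters after exit compared with earnings in the second quarter prior to registration. Field names and the averaging choice are assumptions.

```python
# Illustrative (assumed) earnings increase calculation.
def average_earnings_increase(exiters):
    pool = [e for e in exiters if e["employed_q1_after_exit"]]
    if not pool:
        return None
    total = sum((e["earnings_q1_after_exit"] + e["earnings_q3_after_exit"]) / 2
                - e["earnings_q2_before_registration"]
                for e in pool)
    return total / len(pool)
```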

COMMON MEASURE ISSUES: Performance Measure Quality
EFFICIENCY
Data elements:
- The dollar amount specification to serve as the numerator
- The number of participants figure to serve as the denominator
ISSUES
- Stakeholder interest in this measure
- Drill-down questions that will be asked
- Quality of data elements
- Unintended consequences
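The efficiency measure described above reduces to a cost-per-participant ratio; the open questions on the slide are about what belongs in each term, not the arithmetic.

```python
# Cost-per-participant ratio implied by the efficiency measure.
def cost_per_participant(total_dollars: float, participants: int) -> float:
    if participants <= 0:
        raise ValueError("participant count must be positive")
    return total_dollars / participants

print(cost_per_participant(1_500_000, 600))  # 2500.0 dollars per participant
```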

COMMON MEASURE ISSUES: Performance Measure Quality
PLACEMENT IN EMPLOYMENT OR EDUCATION
Data elements (a computational sketch follows the issues on the next slide):
- Registration date
- Enrolled in secondary education at registration
- Exit date
- Not enrolled in post-secondary education at registration
- Not employed at registration
- Enrolled in secondary education at exit
- Employed in first quarter after exit
- In military service in first quarter after exit
- Enrolled in post-secondary education in first quarter after exit
- Enrolled in advanced training/occupational skills training in first quarter after exit
CONTINUED...

COMMON MEASURE ISSUES: Performance Measure Quality
PLACEMENT IN EMPLOYMENT OR EDUCATION (CONTINUED)
ISSUES
- Stakeholder interest in this measure
- Drill-down questions that will be asked
- Quality/uniformity of data definitions and sources
- Cost of data collection
- Access to education records
- Timeliness of data availability for intended uses
- Unintended consequence: proliferation of credentials
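A minimal sketch of the placement rate implied by the data elements on the preceding slide: of exiters who were not employed and not enrolled in post-secondary education at registration, the share who are employed, in the military, in post-secondary education, or in advanced training/occupational skills training in the first quarter after exit. Field names are illustrative assumptions.

```python
# Illustrative placement-in-employment-or-education rate calculation.
def placement_rate(exiters):
    pool = [e for e in exiters
            if not e["employed_at_registration"]
            and not e["enrolled_postsecondary_at_registration"]]
    if not pool:
        return None
    placed = sum(1 for e in pool if (
        e["employed_q1_after_exit"]
        or e["military_q1_after_exit"]
        or e["enrolled_postsecondary_q1_after_exit"]
        or e["advanced_training_q1_after_exit"]))
    return placed / len(pool)
```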

COMMON MEASURE ISSUES: Performance Measure Quality
ATTAINMENT OF A DEGREE OR CERTIFICATE
Data elements:
- Registration date
- Enrolled in education
- Exit date
- Attained a diploma, GED, or certificate by the end of the third quarter after exit
ISSUES
- Stakeholder interest in this measure
- Drill-down questions that will be asked
- Access to education records
- Quality/uniformity of data definitions and sources
- Timeliness of data availability
- Unintended consequences
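A sketch of the attainment rate implied by the elements above: of exiters who were enrolled in education, the share attaining a diploma, GED, or certificate by the end of the third quarter after exit. Field names are illustrative assumptions.

```python
# Illustrative degree/certificate attainment rate calculation.
def attainment_rate(exiters):
    pool = [e for e in exiters if e["enrolled_in_education"]]
    if not pool:
        return None
    attained = sum(1 for e in pool if e["credential_by_end_of_q3_after_exit"])
    return attained / len(pool)
```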

COMMON MEASURE ISSUES: Performance Measure Quality
LITERACY OR NUMERACY GAINS
?

COMMON MEASURE ISSUES: Performance Measure Quality
FIVE ISSUES ARE OF PARTICULAR IMPORTANCE AND CONCERN:
- The accuracy and probable unintended consequences associated with the "employed or not employed at registration" data element
- The integrity and value-added of supplemental data use
- Selection of denominator and numerator definitions for the proposed efficiency measure
- The complexity and value-added of the placement in employment or education measure
- Expected unintended consequences associated with the attainment of a degree or certificate measure

PERFORMANCE STANDARD ISSUES
THREE TOPICS ARE OF PARTICULAR IMPORTANCE:
State and Local Workforce Area Benchmarking
- The Census Bureau LEHD Program as a potential source of local demographic and economic activity information for discretionary use in negotiation of state and local performance standards
- Benchmarking of own performance over time
- Benchmarking against other 'similar' states or Local Workforce Areas
- Return to regression modeling? Pros and cons (illustrated in the sketch that follows)
CONTINUED...
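A hedged illustration of the regression-modeling option: fit local performance as a function of local conditions, then use the fitted value as an adjusted benchmark rather than a flat negotiated number. This mirrors the JTPA-era adjustment-model idea in spirit only; the variables and numbers below are made up.

```python
# Illustrative regression-adjusted benchmarks (placeholder data).
import numpy as np

# Rows: hypothetical local areas. Columns: intercept, unemployment rate,
# share of participants with limited work history.
X = np.array([
    [1.0, 4.2, 0.30],
    [1.0, 6.8, 0.55],
    [1.0, 5.1, 0.40],
    [1.0, 7.9, 0.62],
])
y = np.array([74.0, 61.0, 69.0, 58.0])  # entered employment rates (illustrative)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
adjusted_benchmarks = X @ coef            # model-predicted rate for each area
residuals = y - adjusted_benchmarks       # performance relative to local conditions
print(np.round(adjusted_benchmarks, 1), np.round(residuals, 1))
```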

PERFORMANCE STANDARD ISSUES
Challenges Associated with Pursuing Continuous Improvement
- Integrity of state and local management information systems over time
- Continuity of data source availability and content over time
- Expected unintended consequences

PERFORMANCE STANDARD ISSUES
Vulnerability to Unintended State and Local Actions
- Discretionary opportunities to define selection-in criteria, criteria for assignment to service components (including whether and when to use partner services), and timing-of-exit criteria
- Investment in staff development can reduce the frequency of some of the unwanted behaviors that will otherwise follow introduction of the common measures

OTHER CHALLENGES
- Occupations in demand
- Required registration of some customers
- Stakeholder interest in number of customers served