The Role of Performance Management in Workforce Investment Programs
Burt S. Barnow, Institute for Policy Studies, Johns Hopkins University
Prepared for the conference "What Europeans Can Learn from the WIA Experience," Washington, DC, November 7, 2009
Topics Covered
- Background on performance management
- Elements of a performance management system
- Performance management and evaluation
- Performance management in U.S. workforce programs
- Adjustments to performance standards
- Evidence on the effects of performance standards
- Conclusions and lessons
Background: Why Have Performance Management?
- Principal-agent problem: the interests of state and local governments may differ from those of the federal government.
- Local governments may have their own goals, e.g., maximizing the number of participants, appealing to the maximum number of voters, or providing work to local employees.
- The JTPA and WIA systems tried to align the goals of federal, state, and local governments.
- Problem: program goals are not easy to identify, e.g., equity vs. efficiency for the federal government.
Elements of a Performance Management System
- Performance measure(s): the indicator of success used to judge performance. Measures can be:
  - Inputs (number of people served, hours of instruction)
  - Process variables (characteristics of a course, procedures followed)
  - Gross outcomes (program completers, placements, number passing a test)
  - Net outcomes (skills added by the program)
  - Value of outcomes (earnings gains)
- Performance standards and a means to set them:
  - Standards can be absolute or relative.
  - Standards can be set objectively or subjectively.
  - Standards can be the same for all, or they can be adjusted.
- Rewards and sanctions are needed to encourage meeting or exceeding the standards, but they are not always emphasized.
Distinctions between Performance Measurement and Evaluation
- Performance measurement is a management tool to track progress over time.
- Evaluation answers questions about whether the program is meeting expectations:
  - What happened? (process study)
  - What difference did it make? (impact study)
  - Was it worth it? (cost-benefit study)
- Evaluations can be used in management, but on a slower cycle than performance measurement.
- We need both types of activities, and it is important not to confuse them.
The JTPA Performance Management System
- The Secretary of Labor issued the performance measures and set national standards as well as an adjustment procedure.
- The national standards were set so that, based on past experience, 75 percent of the service delivery areas (SDAs) were expected to meet or exceed the standard.
- The adjustments were determined from regression models estimated on management information system data submitted annually by each SDA. Variables with insignificant coefficients or coefficients with an unexpected sign were omitted from the adjustment model.
- The performance measures were gross outcomes.
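The regression-based adjustment described above can be sketched in a few lines. This is a stylized illustration only, not the Department of Labor's actual model: the covariates, the data, and the adjustment rule (shifting the national standard by each area's predicted deviation from the mean outcome) are all assumptions for the sake of the example.

```python
import numpy as np

# Hypothetical data, one row per local area (SDA). Columns are invented
# local conditions: unemployment rate, share of welfare recipients,
# share of participants without a high school diploma.
X = np.array([
    [0.05, 0.20, 0.15],
    [0.09, 0.35, 0.30],
    [0.07, 0.25, 0.20],
    [0.11, 0.40, 0.35],
    [0.06, 0.22, 0.18],
    [0.10, 0.38, 0.32],
])
# Observed entered-employment rates for each area (invented).
y = np.array([0.64, 0.52, 0.60, 0.48, 0.62, 0.50])

# Fit a linear adjustment model (with intercept) by ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

national_standard = 0.59  # the JTPA adult employment-rate standard

# Shift the national standard by each area's predicted deviation from the
# mean outcome: areas facing harder conditions get an easier standard.
adjusted = national_standard + (X1 @ beta - y.mean())
for area, std in enumerate(adjusted, 1):
    print(f"Area {area}: adjusted standard = {std:.3f}")
```

Because the model includes an intercept, the adjusted standards average out to the national standard; the adjustment reallocates difficulty across areas rather than raising or lowering the bar overall.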
JTPA Measures and National Standards
- Adult follow-up employment rate: the proportion of adult respondents employed at least 20 hours per week during the 13th week after termination (59 percent).
- Adult follow-up weekly earnings: average weekly earnings for all adults employed at least 20 hours per week during the 13th week after termination ($281).
- Welfare adult follow-up employment rate: defined in the same manner as the adult follow-up employment rate, but for adult welfare recipients only (50 percent).
- Welfare follow-up weekly earnings: defined in the same manner as adult follow-up weekly earnings ($244).
- Youth entered employment rate: the proportion of youth terminees (other than potential dropouts who remained in school) who entered employment of at least 20 hours per week (41 percent).
- Youth employability enhancement rate: the proportion of youth who obtained one of the employability enhancements at termination (40 percent).
Performance Standards under the Workforce Investment Act
- Standards are now set for states as well as local areas.
- Standards are now "negotiated" at the state level rather than based on a model.
- States determine the standards for local areas.
- WIA now uses "Common Measures" established by OMB for workforce-oriented programs; for adults these are:
  - Entered employment rate: of those not employed at the date of participation, the number of participants who are employed in the first quarter after the exit quarter divided by the number of participants who exited during the quarter.
  - Employment retention rate: of those who are employed in the first quarter after the exit quarter, the number of participants who are employed in both the second and third quarters after the exit quarter divided by the number of participants who exited during the quarter.
  - Average earnings: of those participants who are employed in the first, second, and third quarters after the exit quarter, total earnings in the second quarter plus total earnings in the third quarter divided by the number of participants who exited during the quarter.
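The three adult Common Measures above are straightforward ratios over participant exit records. The sketch below computes them from hypothetical per-participant data; the record field names are invented, and the denominators are taken to be the conditional groups named in each "of those..." clause.

```python
def common_measures(records):
    """Compute the three adult Common Measures from exit records.
    Each record describes one participant who exited during the quarter;
    quarters (q1, q2, q3) are counted after the exit quarter."""
    # Entered employment rate: base is exiters not employed at participation.
    base_eer = [r for r in records if not r["employed_at_participation"]]
    eer = sum(r["employed_q1"] for r in base_eer) / len(base_eer)

    # Employment retention rate: base is exiters employed in the first
    # quarter after exit; numerator requires employment in both Q2 and Q3.
    base_ret = [r for r in records if r["employed_q1"]]
    ret = sum(r["employed_q2"] and r["employed_q3"] for r in base_ret) / len(base_ret)

    # Average earnings: base is exiters employed in Q1, Q2, and Q3;
    # numerator is total Q2 earnings plus total Q3 earnings.
    base_earn = [r for r in records
                 if r["employed_q1"] and r["employed_q2"] and r["employed_q3"]]
    avg = sum(r["earnings_q2"] + r["earnings_q3"] for r in base_earn) / len(base_earn)
    return eer, ret, avg

# Three invented participants for illustration.
records = [
    {"employed_at_participation": False, "employed_q1": True,
     "employed_q2": True, "employed_q3": True,
     "earnings_q2": 6000, "earnings_q3": 6500},
    {"employed_at_participation": False, "employed_q1": False,
     "employed_q2": False, "employed_q3": False,
     "earnings_q2": 0, "earnings_q3": 0},
    {"employed_at_participation": True, "employed_q1": True,
     "employed_q2": True, "employed_q3": False,
     "earnings_q2": 5000, "earnings_q3": 0},
]

eer, ret, avg = common_measures(records)
print(f"Entered employment rate: {eer:.2f}")   # → 0.50
print(f"Employment retention rate: {ret:.2f}")  # → 0.50
print(f"Average earnings: ${avg:,.0f}")         # → $12,500
```

Note that because each measure conditions on a different base group, the three denominators generally differ, which is one reason the measures resist simple aggregation into a single score.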
Why Adjust Performance Standards?
- To better measure the value added by programs rather than gross change
- To level the playing field (horizontal and vertical equity)
- To discourage (but not eliminate) gaming responses by programs with discretion over who is served and how
- To provide more accurate and useful information for program performance management and accountability
Reasons Not to Adjust Performance Standards
- To set more ambitious standards for low achievers
- To promote a single standard of minimum performance (e.g., should we set the same reading and math proficiency standard for disadvantaged children as for relatively more advantaged children?)
- Regression-based adjustments are more difficult for public officials to understand.
- The explanatory power of adjustment models is often low, limiting confidence in the results.
- Rigid application of statistical adjustment models may contribute to unintended consequences and distortions.
Evidence on the Effects of Performance Standards
- The JTPA system led to modest cream skimming; WIA has stronger incentives for cream skimming due to the absence of a regression adjustment model.
- Short-term performance measures based on earnings levels have little relationship to long-term impact measures.
- There is limited empirical evidence on the impact of performance standards on the services provided, but local areas are believed to try to maximize measured performance; one study finds that the effects on efficiency are mixed.
- No evidence has been identified on the effects of performance management on the technical efficiency of service delivery.
- Performance standards lead to real resources being spent on strategic behavior by local programs.
Conclusions and Lessons
- Do not confuse performance measurement with program evaluation.
- There are often good reasons to adjust performance standards to take account of program goals, participant characteristics, and environmental conditions.
- Programs need not have the same performance measures or standards.
- Be cautious in establishing performance measures with large rewards and/or sanctions.
- Efficiency measures are particularly prone to programs avoiding more disadvantaged customers and more costly services.
- Performance management is still in a formative stage; legislation should not be overly prescriptive about measures, standards, and incentives.
- Deliberations on the structure of the performance management system should include input from all relevant stakeholders.
The Origins of Performance Measurement?
"All lovers swear more performance than they are able, and yet reserve an ability that they never perform; vowing more than the perfection of ten, and discharging less than the tenth part of one."
-- William Shakespeare, Troilus and Cressida