GWDC Return on Investment Initiative
26th Annual Conference on Policy Analysis
October 13, 2010
Background
Public and non-profit workforce training programs use different ROI methodologies, making it inappropriate to compare results across programs. In 2009, a law was passed requiring DEED to report on a set of accountability measures, including return on investment (M.S. 116J.997). The GWDC, responsible for advising on performance standards and measures (M.S. 116L.665), convened the ROI Initiative to develop a standard ROI measure.
Project Goal, Scope, and Audience
To create, through consensus, a standard return on investment methodology for workforce and training programs. The focus is on programs administered or funded by DEED, but the goal is a model that will appeal to all workforce programs across the state.
Audience:
– Workforce and training programs
– Legislators
– Taxpayers
ROI Methodology Criteria
– Manageable: easy to administer; minimizes the work and training service providers need to implement it
– Useful: relevant and timely; helps providers improve programs and helps legislators and others understand program impacts
– Understandable: transparent; easily understood by experts, users, and the general public; assumptions, limitations, and rules of use defined
– Credible: trusted; seen as accurate by experts, users, and the general public; relies on verified data as much as practical; ensures process, data, and model integrity
– Adaptable: can be used by different programs and providers
– Sensitive to change: data collected and reported at a frequency that detects significant variations
ROI Initiative Members
Bryan Lindsley (Sponsor), GWDC
Cristine Leavitt (Project Manager), DEED
Nick Maryns (Staff), GWDC
Paul Anton, Anton Economics
Art Berman, Twin Cities Rise!
Mark Brinda, City of Minneapolis
Carol Dombek, DEED
Martha Hegewisch, DEED
Dick Joerger, MnSCU
Randy Johnson, Workforce Development, Inc.
Steve Erbes, DEED
Susan Lindoo, DEED
Anne Olson, Minnesota Workforce Council Association
Brian Paulson, Greater Twin Cities United Way
Devon Meade, Twin Cities United Way
Raymond Robertson, Macalester College
Mary Rothchild, MnSCU
Deb Serum, DEED
Todd Wagner, MN Department of Education
Luke Weisberg, Lukeworks, LLC
ROI Initiative Work Plan
– Phase I: Evaluate existing ROI models and develop a standard model (Jun 2009 – Aug 2010)
– Phase II: Pilot test, evaluate, and refine the model, and prepare an implementation plan (Sep 2010 – Jun 2011)
– Phase III: Make policy and implementation recommendations as part of the January 2011 GWDC Report to the Legislature
DRAFT ROI Framework
Data Framework
[Diagram: data sources feeding the ROI Framework, matched via SSNs or other data-matching methods]
– Data sources for treatment and comparison groups (individual characteristics, wage and employment data, receipt of benefits):
  – Treatment group: WIASRD, Workforce One, Wage Detail, MnSCU data (?)
  – Comparison group: LEPR, UI applicant data
– Data sources for receipt of public benefits (which programs did/does the individual use? what level of benefits is/was received?):
  – DHS data sources (MFIP, SNAP, Medical Assistance / MinnesotaCare)
  – UI program data
  – Dept. of Corrections data
– ROI Framework applies assumptions and statistical methods
– Pass-through programs: most have their own data systems, though capacities vary
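The framework compares outcomes for a treatment group (program participants) against a comparison group. As a minimal illustration of how such an ROI figure might be computed, the sketch below uses a simple net-benefits formula; the initiative's actual assumptions and statistical methods were still under development at the time of this presentation, and all numbers here are hypothetical.

```python
# Hypothetical sketch of a treatment-vs-comparison ROI calculation.
# The formula and all figures are illustrative assumptions, not the
# initiative's adopted methodology.

def roi(treatment_gain, comparison_gain, benefit_savings, program_cost):
    """Net benefit per participant divided by program cost per participant."""
    # Earnings gain attributable to the program: treatment-group gain
    # minus the gain the comparison group achieved without the program.
    impact = treatment_gain - comparison_gain
    net_benefit = impact + benefit_savings - program_cost
    return net_benefit / program_cost

# Example: participants' annual earnings rose $4,000 vs. $1,500 for the
# comparison group; public benefit payments fell by $500 per participant;
# the program cost $2,000 per participant.
print(round(roi(4000, 1500, 500, 2000), 2))  # → 0.5
```

A positive value means the program returned more than it cost under these assumptions; the comparison group is what separates this from a naive before-and-after calculation.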
Measure Benefits
– ROI measure serves as a performance benchmark that highlights best practices
– Service providers gain a standard measure for communicating the value of services to stakeholders
– Legislature and stakeholders gain an additional measure to inform policy and program decisions
– Potential use as an active (not after-the-fact) management tool to improve service value
– Citizens can understand the value received for their tax dollars
– Brings greater accountability to programs that serve the public
Measure Challenges
– Applying one methodology to a wide variety of programs with varied missions and target populations
– Building a methodology that is helpful to both program managers and policy makers
– Balancing the tension between transparency/simplicity and rigor/accuracy of the methodology
– Accounting for participants who are served by multiple programs
– Data and systems do not allow aggregation of individual data
– Concern regarding how the measure will be used
Next Steps
– Fall 2010: Develop communications plan and initiate pilot testing
– January 2011: Include policy recommendations in GWDC Report to the Legislature
– Spring 2011: Public participation meeting
– June 2011: Complete implementation plan
GWDC ROI Initiative
Thank You! Questions?