Promoting Youth Transition: Using Data-driven Practices to Expand the Evidence Base Purvi Sevak and Todd Honeycutt Presentation at NCRE Fall Conference Alexandria, VA October 27, 2018
Funding
Ongoing funding: National Institute on Disability, Independent Living, and Rehabilitation Research's Rehabilitation Research & Training Center on Vocational Rehabilitation (VR) Practices for Youth and Young Adults (90RT5034)
Initial development: Rehabilitation Services Administration (H235LI00004)
Mathematica designed the Coach as part of the Substantial Gainful Activity Project Demonstration, in partnership with the Institute for Community Inclusion at the University of Massachusetts
What We'll Cover
Why is evaluating VR services and programs important?
What are the steps to program evaluation?
What is the VR Program Evaluation Coach?
What is the value of the VR Coach to VR agencies and rehabilitation educators?
What's next?
Why Is Evaluating VR Programs and Services Important?
VR staff face rising pressure to:
Show that their programs and services work
Apply effective, evidence-based practices
But with recent increases in reporting requirements, there is little time left for evaluation
What Are the Steps to Program Evaluation?
Program evaluation is an important part of the performance management cycle
A Framework for Program Evaluation Priorities
To address this challenge, we developed a framework that can help VR agency staff organize competing priorities and assess whether they can use performance management tools to address them. The framework organizes priorities across two dimensions: (1) processes or outcomes (shown as rows), and (2) the unit of analysis (shown as columns).

As shown in the row categories, the first step is to determine whether the question relates to a VR process (for example, considering how to better implement a new motivational interviewing training for counselors) or targets a specific outcome (for example, assessing the impact of a specific program on client employment at closure).

VR research often focuses on longer-term outcomes, such as the rehabilitation rate or the type of closure. However, shorter-term processes can have a significant impact on long-term outcomes, can indicate whether a program is on track to achieve its goals, and are an important topic of research on their own. For example, VR staff may be concerned about initial client engagement—getting new clients to develop and sign an IPE and start services—as a step toward attaining eventual long-term employment for clients, especially high school students who may be engaged with VR for several years or more before they finish school.

The column categories capture the level of analysis of the research question or problem across types of VR stakeholders: VR offices or geographies (for example, counties, school districts, or regions within a state), VR staff (for example, counselors or service providers), or VR clients (for example, transition-aged youth or individuals with a specific disability). Understanding who or what the question or problem is about can help identify the best method to answer the question or fix the problem. For a question about transition-aged youth who receive services in a school setting, the most appropriate level of analysis might be the school district. For a question about the quality of services clients receive through referrals to service providers, the level of analysis is likely the provider.
Analytic Options for VR Staff
The most appropriate tools and techniques to answer a specific research question or address a specific problem depend on a variety of factors, including where the question falls across the two dimensions of processes versus outcomes and VR stakeholders. For example, the best method to analyze regional differences in implementing a new program across VR offices is likely different from the best method to assess employment outcomes for clients with autism, or for clients served by an individual counselor. VR staff must also consider other factors, such as data availability and staff capacity to conduct the analyses. There are many potential analytic methods available, including:

Descriptive analysis uses data to describe processes and outcomes
Predictive analytics build models using historical data to predict future events
Process or formative evaluation examines agencies' internal processes to improve efficiency
Cost-effectiveness studies may calculate the return on investment across different programs
Impact or outcomes evaluation estimates the relationship between programs and outcomes such as skill gains or employment
Rapid-cycle evaluation (RCE) provides feedback in as close to real time as possible
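As a concrete illustration of the simplest option above, a descriptive analysis might tabulate an outcome and a process measure by office with a few lines of code. This is a minimal sketch with made-up records and hypothetical column names, not output from any VR system:

```python
import pandas as pd

# Made-up closure records; a real analysis would pull an extract from
# the agency's case management system.
closures = pd.DataFrame({
    "office": ["A", "A", "B", "B", "B"],
    "employed_at_closure": [1, 0, 1, 1, 0],   # outcome measure
    "days_to_ipe": [30, 45, 20, 25, 60],       # process measure
})

# Descriptive analysis: summarize outcome and process measures by office
summary = closures.groupby("office").agg(
    employment_rate=("employed_at_closure", "mean"),
    avg_days_to_ipe=("days_to_ipe", "mean"),
)
print(summary)
```

Even this simple tabulation maps onto the framework: the unit of analysis is the VR office, and it covers both a process (time to IPE) and an outcome (employment at closure).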
Case Example Using the Program Evaluation Cycle with RCE
Implement a work-based learning experience program at select schools (other schools used for comparison)
Collect quantitative data on service delivery and work experiences
Examine data on previously identified measures to assess implementation success and early outcomes as soon as possible: receipt of key services, time between enrollment and work experience or other key services, completion of work experiences, paid employment after work experiences, case closure with employment
Review results with agency staff, partner organizations, and school officials
Adjust or redesign the program, or expand the program to additional schools
What Is the VR Program Evaluation Coach?
A tool to help you securely and efficiently evaluate changes in an agency's policies, programs, and services
Originally designed to help schools with RCE
Identify effective practices and inform program and policy changes
Accessible for free online at
VR Program Evaluation Coach
How Does the VR Coach Help?
Helps staff conduct rigorous evaluations to assess the impacts of services (could be any service, program, or policy)
Guides users through the program evaluation process:
Developing an evaluation question
Identifying a comparison group
Conducting statistical analysis
Reporting
Includes guides, tips, and examples
Things to Consider: How to Design a Successful Pilot
How do I use data to inform my decisions and adopt effective programs or services? When vocational rehabilitation (VR) agencies roll out a new service, they may make it available to clients statewide. But what if you could first learn how likely a new service was to work at your office, in your region, or in your unique setting before you rolled it out at scale? Here are some things to think about to help you get the data needed to make evidence-based decisions.

A well-designed pilot will provide data to help you determine which effective services to adopt and which ineffective services not to adopt. To maximize the likelihood of making the right decision:
Use randomization when possible
Include more participants
Randomize at the lowest possible level
Take advantage of pre-intervention data
Think carefully about what will define success

Conduct your own VR program evaluation at
© 2018, Mathematica Policy Research, Inc. This document carries a Creative Commons (CC BY) license that permits re-use of content with attribution as follows: Developed by Mathematica Policy Research, Inc., as part of the Rehabilitation Research and Training Center on Vocational Rehabilitation Practices for Youth and Young Adults project, funded by the National Institute on Disability, Independent Living, and Rehabilitation Research.
Things to Consider: Considerations When Choosing a Comparison Group
Every person has "observable" characteristics (which can be easily measured) and "unobservable" characteristics (which cannot be easily measured). Both types of characteristics can affect the outcome you are measuring. If your groups are not similar, then part of the difference you see in outcomes could be caused by the differences in characteristics, not by receipt of the VR service.

In a matched-comparison group design, you want to construct a comparison group that is as similar as possible to your service recipients. This starts with selecting a group that is reasonably comparable. The VR Coach's matching tool will help you further narrow down this group so that it is similar to your service recipients across several characteristics that you identify as important for your evaluation.
Five Steps of the VR Coach
1. Frame the question
2. Create an evaluation plan
3. Prepare your data
4. Analyze your data
5. Summarize and share your findings
1. Frame the Question
What program or service do you want to test, and why?
What is the outcome of the program? Employment, services, skills, other; can be short- or long-term outcomes, or VR processes
Whom does the program target? Clients, providers, VR staff, others
VR Eval Coach – the Basics
2. Create an Evaluation Plan (1)
An evaluation compares clients who received a service to a comparison group of clients who did not
The comparison group is needed to show that it is the service, not other factors, that causes the observed outcomes
The challenge is that the two groups of clients may differ in other important ways
2. Create an Evaluation Plan (2)
The VR Coach supports two approaches to constructing comparison groups:
Random assignment (the gold standard)
Matched comparison group: statistical tools identify subsets of clients who did and did not receive a service but otherwise have similar characteristics
Characteristics can include age, sex, type of disability, school or referral source, or any measure in the case management system
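The matched-comparison idea can be sketched in a few lines. This is not the Coach's actual matching algorithm (which is not described here); it is a minimal nearest-neighbor match on standardized characteristics, without replacement, using hypothetical column names:

```python
import numpy as np
import pandas as pd

def match_comparison_group(df, treat_col, covariates, seed=0):
    """Pair each service recipient with the most similar non-recipient:
    nearest neighbor on standardized covariates, without replacement."""
    X = df[covariates].astype(float)
    std = X.std(ddof=0).replace(0, 1)          # guard against constant columns
    X = (X - X.mean()) / std                   # standardize so no covariate dominates
    treated = list(df.index[df[treat_col] == 1])
    pool = set(df.index[df[treat_col] == 0])
    np.random.default_rng(seed).shuffle(treated)   # avoid order effects
    matches = {}
    for t in treated:
        if not pool:
            break                              # comparison pool exhausted
        candidates = list(pool)
        dists = ((X.loc[candidates] - X.loc[t]) ** 2).sum(axis=1)
        best = dists.idxmin()                  # closest remaining non-recipient
        matches[t] = best
        pool.remove(best)                      # match without replacement
    return matches
```

Each treated client is paired with its closest untreated client on the chosen characteristics; the resulting pairs would then feed the impact analysis.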
3. Prepare Your Data
You can use data from your agency's case management system
Clear instructions guide you through formatting and preparing the data
Upload a Microsoft Excel file with the data to the VR Coach through a secure web interface
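A typical prep step is to validate a case-management extract before saving it to the Excel file that gets uploaded. A hedged sketch, using hypothetical column names and types (the Coach's own instructions define the real required fields and formats):

```python
import pandas as pd

# Hypothetical required layout -- check the Coach's instructions for the
# actual field names and formats.
REQUIRED = {
    "client_id": "int64",
    "received_service": "int64",       # 1 = treatment group, 0 = comparison
    "employed_at_closure": "int64",    # outcome, coded 0/1
    "age": "int64",                    # matching characteristic
}

def prepare_upload(df):
    """Validate and coerce a case-management extract before export."""
    missing = set(REQUIRED) - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    out = df[list(REQUIRED)].copy()
    for col, dtype in REQUIRED.items():
        out[col] = out[col].astype(dtype)   # enforce expected types
    for col in ("received_service", "employed_at_closure"):
        if not out[col].isin([0, 1]).all():
            raise ValueError(f"{col} must be coded 0/1")
    return out

# prepare_upload(extract).to_excel("coach_upload.xlsx", index=False)
```

Catching missing columns and mis-coded flags locally saves a round trip through the secure upload interface.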
VR Coach Data Security
VR Coach uses Amazon Web Services GovCloud
Uploaded data are encrypted and stored in a temporary cache
Cleared within 15 minutes of uploading
Data must be re-uploaded for each analysis
See our privacy and data handling policy page for more information
4. Analyze Your Data
The Coach conducts two sets of statistical analyses to:
Construct a well-matched comparison group
Generate regression-adjusted estimates of program impacts
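Regression adjustment of this kind can be illustrated with ordinary least squares: regress the outcome on an intercept, a 0/1 treatment indicator, and the matching covariates, then read the treatment coefficient as the impact estimate. A minimal sketch, not the Coach's actual estimator (the Coach reports Bayesian probability statements rather than plain OLS output):

```python
import numpy as np

def regression_adjusted_impact(y, treat, X):
    """Estimate a treatment effect via OLS on the model
    y = a + b*treat + X@g + e, returning the coefficient b."""
    design = np.column_stack([np.ones(len(y)), treat, X])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef[1]   # impact estimate, in the outcome's units
```

Adjusting for covariates soaks up residual differences between matched groups, tightening the impact estimate.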
5. Summarize and Share Your Findings
The Coach produces two completed evaluation reports:
An easy-to-share report that summarizes findings in plain language
A technical appendix describing the methods
Reports are populated using the answers users provided in previous steps
Summary of Results
The report begins with a succinct summary of the finding; in this demo, that…
It reminds me of the program evaluation question I drafted in step 2 of the evaluation. It also reminds me of the thresholds I set for concluding that the program was successful. For this demo, I said I would feel comfortable concluding…
Some of you may recognize that the Coach uses a Bayesian approach to framing evaluation findings. This allows us to make probabilistic statements about the program's impacts, such as: there is a 99% probability that the service increased employment by at least 5 percentage points.
Figure A.2
The technical appendix provides more detail on the Bayesian approach and evaluation methods. It also provides a visual for the estimated probability distribution of the impacts of the service. Here the x-axis shows the range of potential impacts based on the data I uploaded, from 0 to 40 percentage points. The area beneath the whole curve is 100 percent. The shaded green area to the right of the threshold bar I set at 5 percentage points equals the probability that the service increased employment by 5 percentage points or more.
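That probability statement can be read directly off the posterior distribution: it is the share of posterior mass above the success threshold. A toy sketch with made-up posterior values (not the Coach's model, which is not specified here):

```python
import numpy as np

# Illustrative only: assume the impact (in percentage points) has an
# approximately normal posterior with these made-up parameters.
posterior_mean, posterior_sd = 18.0, 6.0
threshold = 5.0   # success threshold chosen when framing the question

# Draw from the posterior and compute the share of draws at or above
# the threshold -- the shaded area to the right of the threshold bar.
rng = np.random.default_rng(0)
draws = rng.normal(posterior_mean, posterior_sd, size=100_000)
prob = (draws >= threshold).mean()
print(f"P(impact >= {threshold:.0f} pp) = {prob:.3f}")
```

With these made-up numbers the probability is high, which would support a statement like "the service very likely increased employment by at least 5 percentage points."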
What Kinds of Evaluations Can I Do?
Rapid-cycle evaluation to quickly assess program and service changes:
Small changes in counselor practices
Behavioral "nudges"
Can estimate impacts on outcomes:
Short-term: closure outcomes
Long-term: earnings, cost reimbursement from the Social Security Administration
Can also estimate impacts on VR processes:
Time to individualized plan for employment (IPE)
Service receipt and costs

The Coach can be used to conduct rapid-cycle evaluations. Rapid-cycle evaluations allow agencies to quickly assess changes in programs or services because they rely on existing data and focus on results that can be observed over a short timeframe. These can include small changes in counselor practices or any number of behavioral nudges targeted to clients. As one example of evaluating a very small behavioral nudge: in an effort to increase engagement of job seekers with reemployment services, Mathematica helped the Department of Labor, in partnership with local workforce agencies, design and test a series of personalized outreach messages. A rapid-cycle evaluation showed that these messages led to statistically significant increases in service take-up. In Colorado, Mathematica is partnering with a county workforce center to test whether different postcard or other reminders can improve timely reporting of work hours by TANF recipients.
Case Example 1: Blended Services
Problem: Does dual enrollment in VR and Wagner-Peyser have an effect on timing and success?
Examined multiple years of administrative closure data to consider time to employment (number of days from IPE date to closure date)
Case Example 1: Program Evaluation Question
Does dual enrollment in VR and Wagner-Peyser [service being tested] lead to a decreased time from IPE to employment [outcome] among those dually enrolled in Wagner-Peyser and VR [treatment group] compared with VR consumers with an IPE and an employment date who are not dually enrolled in Wagner-Peyser [comparison group]?
Case Example 1: Comparison Method
Matched comparison group
41 treatment clients matched to 41 clients in the comparison group (of 312 clients)
Matched on age, race, and sex
Case Example 1: Results
The results from the evaluation of dual enrollment in VR and Wagner-Peyser are inconclusive
Case Example 2: High School Summer Program
Problem: What is the effect of an existing high school summer work experience program?
Examined multiple years of administrative data to look at successful engagement (either exited with employment or still receiving services)
Case Example 2: Program Evaluation Question
Do summer youth work experiences [service being tested] lead to an increase in successful engagement [outcome] among high school students with a signed IPE who received a summer youth work experience [treatment group] compared with other high school students who had a signed IPE [comparison group]?
Case Example 2: Comparison Method
Matched comparison group
3,851 treatment clients matched to 3,851 clients in the comparison group (of 8,452 clients)
Matched on age, sex, impairment type, application year, race, referral from school, referral from intellectual and developmental disabilities agency, and Supplemental Security Income receipt
Case Example 2: Comparison Method
Case Example 2: Results
The summer youth work experience had the intended effect on successful engagement
What Is the Value of the VR Coach to VR Agencies?
Helps staff conduct rigorous evaluations to assess the impacts of services (could be any service, program, or policy): process, impact, and RCE analyses
Guides users through the program evaluation process:
Developing an evaluation question
Identifying a comparison group
Conducting statistical analysis
Reporting
Challenges to Using the VR Coach
Identifying a comparison group
Organizing data
Setting aside time to learn how to use the tool
What's Next?
We can provide technical assistance to agency staff using the Coach to test transition services:
Identifying services to evaluate
Preparing data
Making sense of evaluation findings
Lessons for Educators
VR counselors and staff are poised to conduct their own evaluations and put evidence into action
They know which questions are most pressing and actionable, and the gaps in their knowledge
Train counselors to enter the field ready to leverage data and evidence
Our guides and briefs can help
Make use of RSA's TA centers
For More Information Please contact Purvi Sevak and Todd Honeycutt
Make an appointment at
Recorded webinar on the VR Program Evaluation Coach
The RRTC on VR Practices and Youth