The Disability Employment Initiative (DEI): Impact Evaluation Design
October 21, 2015
Sung-Woo Cho, Ph.D.

Presentation transcript:

Introducing Myself
 Associate at Abt Associates
 Project Director of the DEI impact evaluation
 Most of my work is in impact evaluations of programs in community colleges and K-12
 Past work has focused on community college students and their outcomes
 Teach applied statistics at The George Washington University

DEI Round 5 Overview
 Six grantees in six different states
 The intervention consists of services designed to assist people with disabilities who are seeking employment
 Flagged DEI Round 5 participants form the TREATMENT group
 Using a rigorous evaluation design, we want to compare these participants with a comparison group, tracking their outcomes at the same points in time

Design Phase
 Not randomized – we are using two quasi-experimental designs (QEDs) to estimate the impact of the DEI interventions as a whole and the impact of the career pathways component (part of the intervention)
 Outcomes include postsecondary credentials, employment, and wages
 Will incorporate a survey (Abt SRBI) to collect disability type and ADL (activities of daily living) information
 Treatment and comparison groups are defined at the LWIA level; services are provided at AJCs (One-Stops)

DEI Round 5 Interventions
 The actual interventions vary across the six grantees
 However, the career pathways component is consistent across grantees
 Some of the interventions build on previous rounds' interventions
 Examples of interventions for DEI Round 5:
–Wrap-around services (South Dakota)
–Remedial skill development (California)
–Disability resource coordinator (Minnesota)

Quasi-Experimental Design (QED)
 The basic idea is to match a treatment group of customers to a comparison group of similar customers (a sketch of this step follows below)
–Match on their characteristics: gender, age, ethnicity, other demographic and wage information, and disability type
–Pre-enrollment wage information would be especially valuable, since wages vary substantially across customers
–Wages are also an outcome of interest
 In the end, you have a treatment group and a comparison group of customers that look similar to one another on key characteristics – except that only the treatment group received the DEI Round 5 services
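To make the matching idea concrete, here is a minimal sketch of the first step – estimating a propensity score, each customer's probability of being a DEI Round 5 participant given observed characteristics. The DataFrame and column names (dei_r5, age, female, pre_wage) are hypothetical placeholders, not the evaluation's actual data layout:

```python
# A minimal sketch, not the evaluation's actual code. Assumes a pandas
# DataFrame with hypothetical columns: dei_r5 (1 = treatment LWIA
# customer), age, female, and pre_wage (pre-enrollment quarterly wage).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "dei_r5": rng.integers(0, 2, n),
    "age": rng.integers(18, 65, n),
    "female": rng.integers(0, 2, n),
    "pre_wage": rng.normal(2500, 800, n),
})

# Estimate each customer's probability of being a DEI Round 5
# participant, given observed characteristics (the propensity score).
covariates = ["age", "female", "pre_wage"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["dei_r5"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["dei_r5"] == 1]
comparison = df[df["dei_r5"] == 0]
```

Matching on the single propensity score, rather than on every covariate separately, avoids the difficulty of finding exact matches when many characteristics are used.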

Comparison Groups
 Our calls with the grantees indicated that participants in other LWIAs could serve as our comparison group
 All else equal, similar participants in the comparison LWIAs received a different set of services than participants in the treatment LWIAs
 We ruled out trying to create a comparison group directly from community and technical colleges
–The data are difficult to collect directly from these institutions, and the general college population may be hard to match

Disability Type and Matching
 The comparison group would need to be matched on self-reported disability
 The treatment LWIAs will have a participant tracking system (PTS) that will collect information on disability type
 However, the comparison LWIAs will not have a PTS
–Solution: a survey that will collect information on disability type among comparison LWIA participants who disclose a disability

QED and Baseline Equivalence
 Once we have a treatment group and a comparison group that look similar to one another, we want to measure their baseline (that is, pre-intervention) characteristics
–Example: using wage information from before the start of the intervention to measure baseline equivalence (a sketch of this check follows below)
 If you can show that the treatment and comparison groups are very similar at baseline, you can look at the difference in the two groups' outcomes to measure the impact of the intervention
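One common way to quantify baseline equivalence is the standardized mean difference (SMD) on a pre-intervention measure. The sketch below, using placeholder wage data rather than DEI results, computes the SMD on pre-enrollment wages; a common rule of thumb treats differences below roughly 0.25 standard deviations (with statistical adjustment) as acceptable:

```python
# A minimal sketch with placeholder data, not DEI results.
import numpy as np

def std_mean_diff(t, c):
    """Difference in group means divided by the pooled standard deviation."""
    pooled_sd = np.sqrt((np.var(t, ddof=1) + np.var(c, ddof=1)) / 2)
    return (np.mean(t) - np.mean(c)) / pooled_sd

rng = np.random.default_rng(1)
t_wages = rng.normal(2500, 800, 150)   # matched treatment group, pre-enrollment wages
c_wages = rng.normal(2450, 820, 150)   # matched comparison group
print(f"Baseline SMD on pre-enrollment wages: {std_mean_diff(t_wages, c_wages):.3f}")
```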

QED Visual
[Figure: outcome plotted over time for the treatment and comparison groups. Baseline equivalence is established before the start of services; the impact estimate is the gap between the two groups' outcomes after services begin. A sketch of this estimate follows below.]
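As a rough illustration of the impact estimate pictured above: once baseline equivalence is established, the impact can be estimated as the difference in mean outcomes between the matched groups, often with regression adjustment for baseline covariates. All data below are synthetic placeholders, not DEI results:

```python
# A minimal sketch with synthetic placeholder data, not DEI results.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
matched = pd.DataFrame({
    "treat": np.repeat([1, 0], n // 2),     # matched treatment/comparison flag
    "pre_wage": rng.normal(2500, 800, n),   # baseline covariate
})
# Simulate a post-program wage with a built-in treatment effect of 150.
matched["post_wage"] = (3000 + 150 * matched["treat"]
                        + 0.5 * matched["pre_wage"]
                        + rng.normal(0, 500, n))

# Regress the outcome on the treatment flag, adjusting for the baseline
# covariate; the coefficient on `treat` is the impact estimate.
X = sm.add_constant(matched[["treat", "pre_wage"]])
impact = sm.OLS(matched["post_wage"], X).fit()
print(f"Estimated impact on wages: {impact.params['treat']:.1f}")
```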

QED Matching Strategy
[Figure: 10 treatment customers shown alongside 30 comparison customers, before matching.]

Match Based on Characteristics
[Figure: each of the 10 treatment customers is linked to a similar comparison customer based on observed characteristics.]

20 Comparison Customers Are Not Matched (in red)
[Figure: the 10 treatment customers and 30 comparison customers; the 20 unmatched comparison customers are highlighted in red.]

And They Are Left Out of the Sample in a 1:1 Matching Strategy
[Figure: 10 treatment customers matched 1:1 to 10 comparison customers; the unmatched comparison customers are dropped. A sketch of this matching step follows below.]
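Continuing the propensity-score sketch above, here is a minimal illustration of greedy 1:1 nearest-neighbor matching without replacement: each treatment customer takes the closest remaining comparison customer, and comparison customers who are never selected are left out of the analytic sample:

```python
# Continues the propensity-score sketch above (uses `treated` and
# `comparison` with their `pscore` column).
available = comparison.copy()
pairs = []
for t_idx, t_ps in treated["pscore"].items():
    if available.empty:
        break                                # no comparison customers left
    best = (available["pscore"] - t_ps).abs().idxmin()
    pairs.append((t_idx, best))
    available = available.drop(best)         # matching without replacement

matched_comparison = comparison.loc[[c_idx for _, c_idx in pairs]]
# Comparison customers still in `available` were never matched and are
# excluded from the analytic sample.
```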

Important Notes on Matching
 The matching is done at the person (participant) level
–We use information about the LWIAs when we do the matching – but keep in mind that we are not matching LWIAs themselves
–We also account for the geography of the LWIAs
 Each analysis would first be run within each state
–We will eventually pool the six states to determine the overall impact of the Round 5 interventions (a pooling sketch follows below)
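One standard way to combine within-state results into an overall impact is an inverse-variance (precision-weighted) average, as in a fixed-effect meta-analysis. The state-level estimates and standard errors below are placeholders, not findings from the evaluation:

```python
# A minimal sketch; the state-level estimates and standard errors below
# are placeholders, not findings from the DEI evaluation.
import numpy as np

state_impacts = {        # state: (impact estimate, standard error)
    "CA": (0.03, 0.020),
    "MN": (0.05, 0.025),
    "SD": (0.01, 0.030),
    # ...one entry per grantee state
}

est = np.array([b for b, _ in state_impacts.values()])
se = np.array([s for _, s in state_impacts.values()])
w = 1.0 / se**2                          # precision (inverse-variance) weights
pooled = np.sum(w * est) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"Pooled impact: {pooled:.3f} (SE {pooled_se:.3f})")
```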

Measuring Outcomes
 Outcomes will be measured using WIASRD and Wagner-Peyser reported outcomes
 These can include, but are not limited to, the following:
–Employment
–Wages
–Credential completion (self-reported)

Potential Limitations
 Interventions vary in strength and take different approaches to operationalizing career pathways programs; grantees typically focus on adding services or components to existing services
 It may be difficult to detect impacts on outcomes given the nature of the interventions
 Small sample sizes could limit statistical power
–This makes it harder to determine whether impacts are actually present (a power sketch follows below)
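To see why small samples are a concern, the sketch below uses statsmodels to compute the minimum detectable effect size (MDES) at 80% power and a 5% significance level for a two-group comparison; the sample sizes are illustrative only:

```python
# A minimal sketch; sample sizes are illustrative only.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_group in (50, 200, 1000):
    # Solve for the smallest standardized effect detectable with 80%
    # power at a 5% significance level, with equal group sizes.
    mdes = analysis.solve_power(effect_size=None, nobs1=n_per_group,
                                alpha=0.05, power=0.80, ratio=1.0)
    print(f"n per group = {n_per_group:>4}: MDES ~ {mdes:.2f} SD")
```

With only 50 customers per group, for example, the MDES is roughly 0.57 standard deviations – larger than the effects workforce interventions typically produce.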

Why is this Important?
 People with disabilities who are seeking employment are an important group to help
 DOL has demonstrated this with large financial support
–We want to figure out whether the initiatives are making an impact on education and employment outcomes
 Survey information will give us more detail on disability type, which is not collected in enough detail at AJCs
 Through the evaluation, we are setting up data collection systems that will allow easier retrieval of information on this population and the types of services they receive

Concluding Remarks
 A rigorous QED is possible, assuming we have information on an appropriate comparison group of non-Round 5 customers in other LWIAs
 We will use data from WIASRD and Wagner-Peyser, with information on demographic characteristics and outcomes
 We will use a matching strategy to create a comparison group for each treatment group
–A survey will help collect information on comparison group disability type
 The implementation study will be important for understanding how to provide career pathways programs to this population