America’s Promise Evaluation: What is it and what should you expect?
Session Moderator:
- Megan Lizik, Senior Evaluation Specialist, U.S. DOL Chief Evaluation Office, Washington, DC

Presenters:
- Molly Irwin, Chief Evaluation Officer, U.S. DOL Chief Evaluation Office, Washington, DC
- Megan Lizik, Senior Evaluation Specialist, U.S. DOL Chief Evaluation Office, Washington, DC
- Jeanne Bellotti, Evaluation Director, Mathematica Policy Research, Washington, DC
- Sheena McConnell, Principal Investigator, Mathematica Policy Research, Washington, DC
Overview of our presentation
- Importance of building evidence
- Implementation evaluation
  - Key research questions
  - Data collection activities
  - Timeline
- Impact evaluation
Importance of building evidence
Why emphasize evidence?
- Policy approaches and funding decisions are informed by evidence
- Evidence can help practitioners improve programs and better serve their customers
- All levels of government and the private sector are encouraging more evidence-based decision-making
Use of Evaluation in Policy/Program Implementation
[Diagram: a cycle of Plan → Implement → Evaluate → Use Evidence to Improve]
What is known now?
- Growing evidence shows that sector strategies can:
  - Increase employability, employment, earnings, and other outcomes of job seekers
  - Benefit employers through improved worker productivity, job retention, and enhanced profitability
- Career pathways are emerging as a potentially promising way to meet worker and business needs
- Much more could be learned
Where do grantees fit in?
- DOL, Mathematica, and grantees are partners
- We want to learn from and share your experiences, challenges, and successes
- Together, we will work to build evidence about ways to improve workforce programs and the outcomes of the people they serve
Implementation evaluation
Key research questions
- How were regional partnerships developed and maintained?
- What types and combinations of services and approaches were provided? How were they implemented?
- What are the characteristics of those served?
- What is the community context of America’s Promise grantees?
Data collection activities
- Review of grant applications (completed)
- Clarifying telephone calls (completed)
- Grantee survey
- Partner network survey
- Site visits
- Telephone interviews
- Review of grantee quarterly performance reports (QPRs) and narratives
What to expect: grantee survey
- Who: All 23 grantees
- What:
  - Grantee characteristics
  - Program features and services
  - Partner structure and participation
  - Early challenges
  - Early successes
- When: Spring 2018
- How long: 30 minutes
What to expect: partner network survey
- Who: 6 grantees and up to 24 partners from each
- What:
  - Partner characteristics
  - Role in the partnership
  - Perspectives on partnership goals
  - Frequency and type of collaboration
  - Assessment of partnership quality
- When: Spring 2018 and Winter 2020
- How long: 20 minutes
What to expect: site visits
- Who: 12 grantees
- What:
  - Partnership evolution
  - Details of service provision
  - Challenges
  - Successes
  - Participant perspectives
- When: Fall 2019
- How long: 2.5 days
What to expect: telephone interviews
- Who: 11 grantees
- What:
  - Partnership evolution
  - Service provision over time
  - Challenges
  - Successes
- When: Fall 2019
- How long: 2 hours
Timeline for data collection
- Jan ’17: Grants awarded
- Summer ’17: Review of grant applications; clarifying telephone calls with grantees
- Spring ’18: Survey of all 23 grantees; Round 1 partner network survey with 6 grantees
- Fall ’19: Site visits with 12 grantees; telephone interviews with 11 grantees
- Winter ’20: Round 2 partner network survey with 6 grantees
- Dec ’20: Grants end
- Ongoing during grant period: Review of grantee quarterly performance reports and narratives
Impact evaluation
Key research questions
- What effects did America’s Promise have on education, training, and skill development?
- What effect did America’s Promise have on labor market outcomes?
- Did the effect of America’s Promise vary by participant characteristics or program components?
Different approaches to impact evaluation
- The evaluation will compare people who participate in America’s Promise with similar people who do not
- DOL wants to use the most rigorous approach that is both feasible and appropriate
- The goal is to conduct an experiment
- A quasi-experiment will supplement experimental findings
Major considerations for impact design
- Is America’s Promise different from other services in the region? What does “business as usual” look like?
- Is the expected number of people served realistic?
- Can the grantee form a control group?
- Are high-quality administrative data available for a quasi-experimental design?
Data collection activities
- Baseline survey at application
- Periodic, short text surveys
- Follow-up survey
- Local administrative data
- Unemployment insurance wage data
What to expect: feasibility site visits
- Who: 5 grantees
- Why:
  - Understand America’s Promise and alternative services
  - Discuss potential evaluation procedures and needs
  - Identify opportunities for comparison groups
- When: Late 2017
- How long: 1 day
Timeline for impact evaluation
- Late 2017: Impact feasibility visits to 5 grantees
- Early 2018: DOL selects grantees to participate
- Spring 2018: We work with selected sites to develop impact procedures
- Going forward:
  - We train staff on study procedures
  - We provide ongoing support to participating grantees
  - We work with grantees to collect data on participants
Questions?
Contacts:
- Megan Lizik, Evaluation COR, DOL Chief Evaluation Office, lizik.megan@dol.gov
- Jeanne Bellotti, Evaluation Director, Mathematica Policy Research, jbellotti@mathematica-mpr.com
- Sheena McConnell, Principal Investigator, Mathematica Policy Research, smcconnell@mathematica-mpr.com