America’s Promise Evaluation: What is it and what should you expect?

Session Moderator: Megan Lizik, Senior Evaluation Specialist, U.S. DOL Chief Evaluation Office, Washington, DC

Presenters:
Molly Irwin, Chief Evaluation Officer, U.S. DOL Chief Evaluation Office, Washington, DC
Megan Lizik, Senior Evaluation Specialist, U.S. DOL Chief Evaluation Office, Washington, DC
Jeanne Bellotti, Evaluation Director, Mathematica Policy Research, Washington, DC
Sheena McConnell, Principal Investigator, Mathematica Policy Research, Washington, DC

Overview of our presentation: importance of building evidence; implementation evaluation; key research questions; data collection activities; timeline; impact evaluation

Importance of building evidence

Why emphasize evidence? Policy approaches and funding decisions are informed by evidence. Evidence can help practitioners improve programs and better serve their customers. All levels of government and the private sector are encouraging more evidence-based decision-making.

Use of Evaluation in Policy/Program Implementation [cycle diagram]: Plan → Implement → Evaluate → Use Evidence to Improve

What is known now? There is growing evidence that sector strategies can increase employability, employment, earnings, and other outcomes of job seekers, and can benefit employers through improved worker productivity, job retention, and enhanced profitability. Career pathways are emerging as a potentially promising way to meet workers’ and businesses’ needs. Much more could be learned.

Where do grantees fit in? DOL, Mathematica, and grantees are partners. We want to learn from and share your experiences, challenges, and successes. Together, we will work to build evidence about ways to improve workforce programs and the outcomes of the people they serve.

Implementation evaluation

Key research questions: How were regional partnerships developed and maintained? What are the types and combinations of services and approaches provided? How were they implemented? What are the characteristics of those served? What is the community context of America’s Promise grantees?

Data collection activities: review of grant applications (completed); clarifying telephone calls (completed); grantee survey; partner network survey; site visits; telephone interviews; review of grantee quarterly performance reports (QPRs) and narratives

What to expect: grantee survey
Who: all 23 grantees
What: grantee characteristics; program features and services; partner structure and participation; early challenges; early successes
When: Spring 2018
How long: 30 minutes

What to expect: partner network survey
Who: 6 grantees and up to 24 partners from each
What: partner characteristics; role in the partnership; perspectives on partnership goals; frequency and type of collaboration; assessment of partnership quality
When: Spring 2018 and Winter 2020
How long: 20 minutes

What to expect: site visits
Who: 12 grantees
What: partnership evolution; details of service provision; challenges; successes; participant perspectives
When: Fall 2019
How long: 2.5 days

What to expect: telephone interviews
Who: 11 grantees
What: partnership evolution; service provision over time; challenges; successes
When: Fall 2019
How long: 2 hours

Timeline for data collection
Jan ‘17: grants awarded
Ongoing during the grant period: review of grantee quarterly performance reports and narratives
Summer ‘17: review of grant applications; clarifying telephone calls with grantees
Spring ’18: survey of all 23 grantees; Round 1 partner network survey with 6 grantees
Fall ’19: site visits with 12 grantees; telephone interviews with 11 grantees
Winter ’20: Round 2 partner network survey with 6 grantees
Dec ‘20: grants end

Impact evaluation

Key research questions: What effects did America’s Promise have on education, training, and skill development? What effect did America’s Promise have on labor market outcomes? Did the effect of America’s Promise vary by participant characteristics or program components?

Different approaches to impact evaluation: The evaluation will compare people who participate in America’s Promise with similar people who do not. DOL wants to use the most rigorous approach that is both feasible and appropriate. The goal is to conduct an experiment; a quasi-experiment will supplement experimental findings.
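
To make that comparison concrete, the snippet below is a minimal illustrative sketch, not part of the evaluation’s actual analysis plan: it shows how an experimental impact estimate reduces to comparing average outcomes for people randomly assigned to America’s Promise with those of a control group. The earnings figures and the impact_estimate function are hypothetical. A quasi-experimental version would instead construct a matched comparison group of similar people who did not have access to the program.

# Illustrative sketch only, using hypothetical data; this is not the evaluation's estimator.
from statistics import mean, stdev
from math import sqrt

def impact_estimate(treatment_outcomes, control_outcomes):
    # Difference in mean outcomes between groups, with an approximate standard error.
    diff = mean(treatment_outcomes) - mean(control_outcomes)
    se = sqrt(stdev(treatment_outcomes) ** 2 / len(treatment_outcomes)
              + stdev(control_outcomes) ** 2 / len(control_outcomes))
    return diff, se

# Hypothetical quarterly earnings for randomly assigned treatment and control members.
treated = [5200, 4800, 6100, 5500, 4900]
control = [4700, 4500, 5300, 5000, 4600]
impact, se = impact_estimate(treated, control)
print(f"Estimated impact: {impact:.0f} (standard error {se:.0f})")

In the actual study, outcomes would come from the data sources listed later (for example, unemployment insurance wage records), and estimates would typically also adjust for baseline characteristics.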

Major considerations for impact design: Is America’s Promise different from other services in the region? What does “business as usual” look like? Is the expected number of people served realistic? Can the grantee form a control group? Is high-quality administrative data available for a quasi-experimental design?
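
One of these considerations, whether the expected number of people served is realistic, is essentially a statistical power question. The sketch below is a rough illustration under stated assumptions rather than an evaluation calculation: it uses the standard approximation for a minimum detectable effect with a 5% two-sided test and 80% power (a multiplier of about 2.8), equal-sized treatment and control groups, and hypothetical enrollment and earnings numbers.

# Rough minimum detectable effect (MDE) check under simplified, hypothetical assumptions.
from math import sqrt

def minimum_detectable_effect(n_per_group, outcome_std_dev, multiplier=2.8):
    # Approximate smallest true impact the design could reliably detect
    # (5% two-sided test, 80% power, equal group sizes).
    return multiplier * outcome_std_dev * sqrt(2.0 / n_per_group)

# Hypothetical example: 400 participants per group, earnings standard deviation of $3,000.
print(f"Approximate MDE: ${minimum_detectable_effect(400, 3000):,.0f}")

If the resulting MDE is larger than the impact a program could plausibly produce, the planned enrollment may be too small to support an experiment at that site.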

Data collection activities: baseline survey at application; periodic short text surveys; follow-up survey; local administrative data; unemployment insurance wage data

What to expect: feasibility site visits
Who: 5 grantees
Why: understand America’s Promise and alternative services; discuss potential evaluation procedures and needs; identify opportunities for comparison groups
When: late 2017
How long: 1 day

Timeline for impact evaluation
Late 2017: impact feasibility visits to 5 grantees
Early 2018: DOL selects grantees to participate
Spring 2018: we work with selected sites to develop impact procedures
Going forward: we train staff on study procedures, provide ongoing support to participating grantees, and work with grantees to collect data on participants

Questions?

Contacts:
Megan Lizik, Evaluation COR, DOL Chief Evaluation Office, lizik.megan@dol.gov
Jeanne Bellotti, Evaluation Director, Mathematica Policy Research, jbellotti@mathematica-mpr.com
Sheena McConnell, Principal Investigator, Mathematica Policy Research, smcconnell@mathematica-mpr.com