How does evaluation help my state and where do we start?

Presentation transcript:

How does evaluation help my state and where do we start?
RESEA Evaluation Technical Assistance
August 10, 2019

Megan Lizik, Senior Evaluation Specialist and Project Officer for RESEA Evaluation, U.S. DOL, Chief Evaluation Office

Evaluation Technical Assistance (E-TA)
DOL and the E-TA team will develop resources designed to build state capacity to use and develop evidence.
E-TA will include:
Written materials (e.g., overviews of evaluation evidence, "Evaluation Toolkits")
Webinars
State-specific or small group consultations
More customized E-TA as needs are identified

Today's objectives:
Define program evaluation and highlight its benefits for RESEA programs
Review tools that will help you form learning goals and begin thinking about potential evaluation efforts
Demystify key evaluation concepts, including research questions and evaluation design

Presenters:
Lawrence Burns, Reemployment Coordinator, Office of Unemployment Insurance, ETA, U.S. DOL
Phomdaen Souvanna, Senior Analyst, Abt Associates
Siobhan Mills De La Rosa, Associate, Abt Associates

How would you rate your knowledge of evaluations?
I don't know a lot about evaluations – but I'm ready to learn!
I know some general evaluation concepts, but have not planned or conducted an evaluation.
I've planned some evaluation activities, but have never played a role in conducting an evaluation.
I've planned and conducted an evaluation.

What do you see as the greatest potential benefit received from evaluations?
Evaluations help us learn how to potentially improve our programs.
Evaluations allow us to contribute to the field of study and demonstrate our program's effectiveness to the public.
Evaluations fulfill funder and/or stakeholder requirements.
Other (share in chat box!)
I am unsure about the benefits gained from evaluations.

What are your top 3 concerns about conducting evaluations?
Building our capacity to conduct evaluations.
Coming up with research questions about what we want to learn from an evaluation.
Incorporating evaluation procedures like random assignment into existing program operations.
Accessing necessary data and related resources (data systems, technology) to conduct evaluations.
Developing internal and external partnerships to facilitate evaluations.
Understanding, communicating, and using evaluation results.

What is program evaluation?

An evaluation is… "…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development."
~ Michael Quinn Patton, former President, American Evaluation Association (AEA)
Patton, M.Q. (1997). Utilization-Focused Evaluation: The New Century Text. 3rd ed. Thousand Oaks, CA: Sage.

Why conduct evaluations? Evaluations can help you understand how:
Program implementation and services vary within your state across local service delivery areas.
Services, activities, or other program aspects meet UI claimant needs.
Program components may be strengthened or improved.
Programs influence UI claimant outcomes over time.

Evaluation Planning Roadmap
Develop Learning Goals
Conduct Evaluability Assessment
Create a Logic Model
Choose an Evaluation Design
(Evaluability assessments and logic models are the tools that move you from learning goals to specific evaluations.)

Develop Learning Goals

What are Learning Goals?
Areas that your team would like to learn more about; they guide evaluation efforts and begin to inform your research questions.
Examples include:
Is our RESEA program achieving its goals?
What combination of services best improves claimant outcomes?
Do our program's short-term achievements last over time?
Can our profiling model better identify claimants who are most likely to benefit from RESEA? (see the sketch below)
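On that last example: a profiling model is typically a statistical score that ranks claimants by risk, such as the likelihood of exhausting UI benefits. Below is a purely illustrative Python sketch; the claimant characteristics, synthetic data, and logistic-regression form are all assumptions for demonstration, not any state's actual model.

```python
# Illustrative only: a toy profiling score that ranks claimants by
# predicted likelihood of exhausting UI benefits. All fields and data
# are made up for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical claimant characteristics (synthetic data).
X = np.column_stack([
    rng.integers(0, 2, n),         # flag: laid off from a declining industry
    rng.normal(12, 3, n),          # years of education
    rng.uniform(2, 15, n),         # local unemployment rate
])
exhausted = rng.integers(0, 2, n)  # 1 = claimant exhausted benefits

model = LogisticRegression().fit(X, exhausted)
scores = model.predict_proba(X)[:, 1]  # higher score = higher exhaustion risk

# Rank claimants; states often prioritize the highest-risk group for RESEA.
priority_order = np.argsort(-scores)
print(priority_order[:10])
```

An evaluation of the profiling model itself might compare outcomes for claimants just above and below the referral cutoff.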

How do we identify learning goals?
Ask yourself:
What problems are we trying to address?
What do we need to know in order to address the problem?
Engage key stakeholders:
State leadership
Agency staff (program, research, technology)
Local workforce/American Job Center leadership
Frontline staff
Research partners

Evaluability Assessments and Logic Models: Tools to Move from Learning Goals to Specific Evaluations

Evaluability assessments will help you identify:
All program activities and goals (using the logic model as a guide);
The focus of the evaluation, including components to be tested and populations of interest;
Potential strengths and challenges in executing the evaluation; and
Intended uses of the evaluation findings.

Sample Evaluability Assessment: Evaluation Design and Assessment Tool, IMPAQ International, https://evalhub.workforcegps.org/resources/2018/09/07/19/53/Evaluation-Design-Assessment-Tool

How do we use evaluability assessment results?
Refine broad learning goals into specific evaluation research questions, defining:
Components
Population of interest
Outcomes of interest
Data you will use to measure outcomes
Identify and develop solutions for challenges to successful evaluation completion:
Address data availability and quality issues (see the sketch below)
Create evaluation procedures and plan to train staff
Request IT resources as necessary
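Addressing data availability and quality usually starts with simple checks on the administrative extract you plan to analyze. A minimal sketch, assuming a hypothetical file and column names (claimant_id, claim_start, reemployment_date); none of these names come from an actual state system.

```python
# Minimal data-quality checks on a hypothetical claimant extract.
# File name and columns are assumptions for illustration.
import pandas as pd

df = pd.read_csv("claimants.csv", parse_dates=["claim_start", "reemployment_date"])

# 1. Missingness: which fields are too incomplete to support an outcome measure?
print(df.isna().mean().sort_values(ascending=False))

# 2. Duplicates: is there one record per claimant per claim?
print("duplicate claims:", df.duplicated(subset=["claimant_id", "claim_start"]).sum())

# 3. Logical consistency: reemployment should not predate the claim.
bad_dates = df["reemployment_date"] < df["claim_start"]
print("impossible date order:", bad_dates.sum())
```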

A Simplified RESEA Logic Model
Inputs: Staff time; RESEA funds; AJC facilities; Labor market data; OUI guidance; Research; Strategic partnerships
Activities: AJC orientation; LMI; Individual Reemployment Plan; Reemployment services; Eligibility assessment; Adjudication processes; Penalties
Outputs: Improved labor market knowledge; Improved job readiness skills; Improved job searching effectiveness; Identified ineligible claimants; Greater job search effort
Outcomes: Faster return to employment; Improved earnings; Reduced UI duration

What do logic models do?
Build shared understanding of the steps and time needed to achieve desired outcomes
Pinpoint gaps in the relationship between services and outcomes that need to be addressed
Articulate why and how you expect your program or intervention to work
Identify focus for current and future evaluations

RESEA Research Question Examples
Impact:
Does selection for RESEA improve claimants' employment outcomes from what they would have been otherwise?
Would more intensive case management by AJC staff improve claimants' outcomes?
Outcomes:
How soon did claimants become reemployed?
How do FTR rates vary across WDB areas?
Process or Implementation:
What reemployment services and activities do claimants participate in?
What activities take place during meetings between claimants and case managers?

Choosing an Evaluation Design

Evaluation Design Types
Impact Studies (Quasi-Experimental Designs & Randomized Control Trials): Determine the extent to which claimant outcomes differ from what they would have been without the program
Outcomes Studies: Assess the program's progress in achieving its established goals
Process & Implementation Studies: Document program operations and client flow through the program, as implemented

Impact Studies
Benefits:
Explain what effect the program has on claimants' outcomes
Provide strong evidence on the program's impact on outcomes
Challenges:
Require a comparison group and often large sample sizes
Require more sophisticated analytical expertise

Outcomes Studies
Benefits:
Document whether the program is achieving its established outcomes
Identify where promising or concerning outcomes exist
Can typically be performed with existing data (see the sketch below)
Do not require high levels of statistical expertise
Challenges:
Cannot tell you what caused the outcomes that are observed
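Because outcomes studies mostly tabulate existing administrative data, the analysis can be as simple as grouped summary statistics. A sketch answering "how soon did claimants become reemployed, by WDB area?", using the same hypothetical column names as above.

```python
# Outcomes-study style tabulation: weeks to reemployment by WDB area.
# Columns (claim_start, reemployment_date, wdb_area) are hypothetical.
import pandas as pd

df = pd.read_csv("claimants.csv", parse_dates=["claim_start", "reemployment_date"])
df["weeks_to_reemployment"] = (
    (df["reemployment_date"] - df["claim_start"]).dt.days / 7
)

# Median time to reemployment and share reemployed within 26 weeks, per area.
# Claimants with no reemployment date count as not reemployed within 26 weeks.
summary = df.groupby("wdb_area").agg(
    median_weeks=("weeks_to_reemployment", "median"),
    share_within_26wk=("weeks_to_reemployment", lambda w: (w <= 26).mean()),
)
print(summary)
```

A table like this can flag areas with unusually slow reemployment, but on its own it cannot say whether the program caused the differences.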

Implementation Studies
Benefits:
Describe how the program is implemented across offices
Can identify promising practices, areas for improvement, and program elements to evaluate further
Can help explain why something was or was not effective when combined with an impact study
Do not require advanced statistical expertise
Challenges:
Cannot tell you the effectiveness of the program

RESEA Evaluation Example
Research Question: Does offering a 2nd one-on-one RESEA meeting help claimants return to work, and leave UI, more quickly?
Test:
Some RESEA-eligible claimants receive one meeting
A group of otherwise similar claimants receive two
Compare outcomes of the groups (see the sketch below)
Measures:
Employment in the second quarter after the start of the claim
Duration of UI benefits
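A toy version of the analysis behind this test, on simulated data: claimants are randomly assigned to one or two meetings, and the employment measure is compared across the two arms. The effect size and outcome model are invented for illustration; a real impact study would add covariates and design-appropriate standard errors.

```python
# Toy two-arm impact comparison for the example above; all data simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 2_000

two_meetings = rng.integers(0, 2, n)  # random assignment: 1 = second meeting
# Simulated outcome: employed in the 2nd quarter after the claim started,
# with a small assumed boost for the two-meeting group (made-up effect).
employed_q2 = rng.random(n) < (0.55 + 0.04 * two_meetings)

treat = employed_q2[two_meetings == 1]
control = employed_q2[two_meetings == 0]

impact = treat.mean() - control.mean()  # difference in employment rates
t_stat, p_value = stats.ttest_ind(treat, control)
print(f"estimated impact: {impact:.3f} (p = {p_value:.3f})")
```

The same structure applies to the second measure (UI benefit duration): replace the binary outcome with weeks of benefits and compare means.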

Cycles of Learning and Doing

This process is iterative!
What you learn from your first evaluation might lead to changes in the program
Initial results may motivate you to learn more about program impacts for particular sub-groups
If your results are not what you expected, you may want to go back to your logic model and test something new!
Plan Evaluation of Program → Conduct Evaluation → Reflect on & Communicate Evaluation Findings → Refine Program or Evaluation

Communicate Evaluation Findings
Sharing evaluation results is critical. Findings can alert:
Frontline staff and other states to best practices and new approaches to serving claimants
Employers to the quality of the services you provide jobseekers
DOL and other stakeholders that your program is effective in improving claimant outcomes
More on disseminating results to a variety of audiences in future webinars!

Closing Thoughts and Next Steps

Other things to keep in mind…
Current evidence that supports RESEA is based on the REA program
Continuous improvement
Life-cycle of an evaluation
Planning and partnership building

Upcoming webinars:
Which Evaluation Designs Are Right for You? Week of May 20-24, 2019
What Evaluation Details Do I Need for a Plan and How Long Will It Take? Week of June 17-21, 2019
Procuring and Selecting an Independent Evaluator Week of July 15-19, 2019

Evaluation Design Assessment Tool
This tool will help you assess the evaluability of your proposed intervention and highlight key operational considerations to assess when determining evaluation feasibility.
https://evalhub.workforcegps.org/resources/2018/09/07/19/53/Evaluation-Design-Assessment-Tool

Fully Articulating Your Vision: Using Logic Models to Support Innovation
This webinar provides in-depth guidance on how to create logic models for labor programs.
https://evalhub.workforcegps.org/sitecore/content/global/resources/2015/05/07/11/07/Fully_Articulating_Your_Vision_Using_Logic_Models_to_Support_Innovation

Clearinghouse for Labor Evaluation and Research (CLEAR)
This clearinghouse serves as a repository of high-quality labor evaluations and research. You can use the clearinghouse to explore existing evidence on reemployment interventions as well as learn more about CLEAR's standards for high-quality research.
https://clear.dol.gov/
Reemployment Synthesis: https://clear.dol.gov/synthesis-report/reemployment-synthesis
Reemployment Supplement: https://clear.dol.gov/sites/default/files/ResearchSynthesis_Reemploy_Sup.pdf

RESEA E-TA Contacts
Megan Lizik, Senior Evaluation Specialist and Project Officer, U.S. DOL – Chief Evaluation Office, Lizik.Megan@dol.gov
Larry Burns, Reemployment Coordinator, U.S. DOL – Office of Unemployment Insurance, Burns.Lawrence@dol.gov
Phomdaen Souvanna, Senior Analyst, Abt Associates, Phomdaen_Souvanna@abtassoc.com, 617.520.2452
Siobhan Mills De La Rosa, Associate/Scientist, Abt Associates, Siobhan_Mills@abtassoc.com, 301.968.4405
RESEA E-TA Inbox: RESEA@abtassoc.com