How does evaluation help my state and where do we start?
RESEA Evaluation Technical Assistance
August 10, 2019
Megan Lizik Senior Evaluation Specialist and Project Officer for RESEA Evaluation U.S. DOL, Chief Evaluation Office
Evaluation Technical Assistance (E-TA) DOL and the E-TA team will develop resources designed to build state capacity to use and develop evidence E-TA will include: Written materials (e.g., overviews of evaluation evidence, “Evaluation Toolkits,” etc.) Webinars State-specific or small group consultations More customized E-TA as needs are identified
Define program evaluation and highlight its benefits for RESEA programs Review tools that will help you form learning goals and begin thinking about potential evaluation efforts Demystify key evaluation concepts, including research questions and evaluation design
Lawrence Burns, Reemployment Coordinator, Office of Unemployment Insurance, ETA, U.S. DOL
Phomdaen Souvanna, Senior Analyst, Abt Associates
Siobhan Mills De La Rosa, Associate, Abt Associates
How would you rate your knowledge of evaluations? I don’t know a lot about evaluations – but I’m ready to learn! I know some general evaluation concepts, but have not planned or conducted an evaluation. I’ve planned some evaluation activities, but have never played a role in conducting an evaluation. I’ve planned and conducted an evaluation.
What do you see as the greatest potential benefit received from evaluations? Evaluations help us learn how to potentially improve our programs. Evaluations allow us to contribute to the field of study and demonstrate our program’s effectiveness to the public. Evaluations fulfill funders and/or stakeholder requirements. Other (share in chat box!) I am unsure about the benefits gained from evaluations.
What are your top 3 concerns about conducting evaluations? Building our capacity to conduct evaluations. Coming up with research questions about what we want to learn from an evaluation. Incorporating evaluation procedures like random assignment into existing program operations. Accessing necessary data and related resources (data systems, technology) to conduct evaluations. Developing internal and external partnerships to facilitate evaluations. Understanding, communicating, and using evaluation results.
What is program evaluation?
An evaluation is… “…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development.” ~ Michael Quinn Patton Former President, American Evaluation Association (AEA) Patton, MQ (1997). Utilization-focused evaluation: The new century text. 3rd ed. Thousand Oaks, CA: Sage.
Why conduct evaluations? Evaluations can help you understand how: Program implementation and services vary within your state across local service delivery areas. Services, activities or other program aspects meet UI claimant needs. Program components may be strengthened or improved. Programs influence UI claimant outcomes over time.
Evaluation Planning Roadmap Develop Learning Goals Conduct Evaluability Assessment Create a Logic Model Choose an Evaluation Design Tools to Move from Learning Goals to Specific Evaluations
Develop Learning Goals
What are Learning Goals? Areas that your team would like to learn more about and that will guide your evaluation efforts. They begin to inform your research questions. Examples include: Is our RESEA program achieving its goals? What combination of services best improves claimant outcomes? Do our program’s short-term achievements last over time? Can our profiling model better identify claimants who are most likely to benefit from RESEA?
How do we identify learning goals? Ask yourself: What problems are we trying to address? What do we need to know in order to address the problem? Engage key stakeholders State leadership Agency staff (program, research, technology) Local workforce/American Job Center leadership Frontline staff Research partners
Evaluability Assessments and Logic Models Tools to Move from Learning Goals to Specific Evaluations
Evaluability assessments will help you identify All program activities and goals (using the logic model as a guide); The focus of the evaluation, including components to be tested and populations of interest; Potential strengths and challenges in executing the evaluation; and Intended uses of the evaluation findings.
Sample Evaluability Assessment: Evaluation Design and Assessment Tool, IMPAQ International, https://evalhub.workforcegps.org/resources/2018/09/07/19/53/Evaluation-Design-Assessment-Tool
How do we use evaluability assessment results? Refine broad learning goals Create specific evaluation research questions Components Population of interest Outcomes of interest Data you will use to measure outcomes Identify and develop solutions for challenges to successful evaluation completion: Address data availability and quality issues Create evaluation procedures and plan to train staff Request IT resources as necessary
A Simplified RESEA Logic Model
Inputs: Staff time; RESEA funds; AJC facilities; Labor market data; OUI guidance; Research; Strategic partnerships
Activities: AJC orientation; LMI; Individual Reemployment Plan; Reemployment services; Eligibility assessment; Adjudication processes; Penalties
Outputs: Improved labor market knowledge; Improved job readiness skills; Improved job search effectiveness; Identification of ineligible claimants; Greater job search effort
Outcomes: Faster return to employment; Improved earnings; Reduced UI duration
What do logic models do? Build shared understanding of the steps and time needed to achieve desired outcomes Pinpoint gaps in relationship between services and outcomes that need to be addressed Articulate why and how you expect your program or intervention to work Identify focus for current and future evaluations
RESEA Research Question Examples
Impact: Does selection for RESEA improve claimants’ employment outcomes from what they would have been otherwise? Would more intensive case management by AJC staff improve claimants’ outcomes?
Outcomes: How soon did claimants become reemployed? How do FTR rates vary across WDB areas?
Process or Implementation: What reemployment services and activities do claimants participate in? What activities take place during meetings between claimants and case managers?
Choosing an Evaluation Design
Evaluation Design Types Impact Studies (Quasi-Experimental Designs & Randomized Control Trials) Determine extent to which claimant outcomes are different compared to what outcomes would have been without the program Outcomes Studies Assess program’s progress in achieving its established goals Process & Implementation Studies Document program operations and client flow through the program, as implemented
Impact Studies
Benefits: Explain what effect the program has on claimants’ outcomes; provide strong evidence of the program’s impact on outcomes.
Challenges: Require a comparison group and often large sample sizes; require more sophisticated analytical expertise.
Outcomes Studies
Benefits: Document whether the program is achieving its established outcomes; identify where promising or concerning outcomes exist; can typically be performed with existing data; do not require high levels of statistical expertise.
Challenges: Cannot tell you what caused the outcomes that are observed.
Implementation Studies
Benefits: Describe how the program is implemented across offices; can identify promising practices, areas for improvement, and program elements to evaluate further; can help explain why something was or was not effective when combined with an impact study; do not require advanced statistical expertise.
Challenges: Cannot tell you the effectiveness of the program.
RESEA Evaluation Example
Research Question: Does offering a second one-on-one RESEA meeting help claimants return to work, and leave UI, more quickly?
Test: Some RESEA-eligible claimants receive one meeting; a group of otherwise similar claimants receives two. Compare the outcomes of the two groups.
Measures: Employment in the second quarter after the start of the claim; duration of UI benefits.
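The comparison described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration with made-up numbers, not real RESEA data: with random assignment, the difference in the two groups’ mean outcomes estimates the impact of offering the second meeting.

```python
# Hypothetical sketch of an impact estimate: difference in mean outcomes
# between a one-meeting group and a two-meeting group. All values are
# illustrative, not actual RESEA results.

def mean(values):
    return sum(values) / len(values)

# 1 = employed in the second quarter after the claim started, 0 = not
one_meeting_group = [1, 0, 1, 0, 0, 1, 0, 1]   # claimants offered one meeting
two_meeting_group = [1, 1, 1, 0, 1, 1, 0, 1]   # otherwise similar claimants offered two

# Because the groups were formed by random assignment, the difference in
# group means is an unbiased estimate of the second meeting's impact.
impact = mean(two_meeting_group) - mean(one_meeting_group)

print(f"Employment rate, one meeting:  {mean(one_meeting_group):.2f}")
print(f"Employment rate, two meetings: {mean(two_meeting_group):.2f}")
print(f"Estimated impact: {impact:+.2f}")
```

In practice an evaluator would use far larger samples and test whether the difference is statistically significant, but the core logic of an impact study is this simple comparison.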
Cycles of Learning and Doing
This process is iterative! What you learn from your first evaluation might lead to changes in the program. Initial results may motivate you to learn more about program impacts for particular subgroups. If your results are not what you expected, you may want to go back to your logic model and test something new! Plan Evaluation of Program → Conduct Evaluation → Reflect on & Communicate Evaluation Findings → Refine Program or Evaluation
Communicate Evaluation Findings Sharing evaluation results is critical Findings can alert: Frontline staff and other states to best practices and new approaches to serving claimants Employers to the quality of the services you provide jobseekers DOL and other stakeholders that your program is effective in improving claimant outcomes More on disseminating results to a variety of audiences in future webinars!
Closing Thoughts and Next Steps
Other things to keep in mind… Current evidence that supports RESEA is based on the REA program. Continuous improvement. Life-cycle of an evaluation. Planning and partnership building.
Which Evaluation Designs are Right for You? (Week of May 20-24, 2019) What Evaluation Details Do I Need for a Plan and How Long Will It Take? (Week of June 17-21, 2019) Procuring and Selecting an Independent Evaluator (Week of July 15-19, 2019)
Evaluation Design Assessment Tool
This tool will help you assess the evaluability of your proposed intervention and highlight key operational considerations to assess when determining evaluation feasibility.
https://evalhub.workforcegps.org/resources/2018/09/07/19/53/Evaluation-Design-Assessment-Tool
Fully Articulating Your Vision: Using Logic Models to Support Innovation
This webinar provides in-depth guidance on how to create logic models for labor programs.
https://evalhub.workforcegps.org/sitecore/content/global/resources/2015/05/07/11/07/Fully_Articulating_Your_Vision_Using_Logic_Models_to_Support_Innovation
Clearinghouse for Labor Evaluation and Research (CLEAR)
This clearinghouse serves as a repository of high-quality labor evaluations and research. You can use it to explore existing evidence on reemployment interventions as well as learn more about CLEAR’s standards for high-quality research.
https://clear.dol.gov/
Reemployment Synthesis: https://clear.dol.gov/synthesis-report/reemployment-synthesis
Reemployment Supplement: https://clear.dol.gov/sites/default/files/ResearchSynthesis_Reemploy_Sup.pdf
Megan Lizik, Senior Evaluation Specialist and Project Officer, U.S. DOL – Chief Evaluation Office, Lizik.Megan@dol.gov
Larry Burns, Reemployment Coordinator, U.S. DOL – Office of Unemployment Insurance, Burns.Lawrence@dol.gov
Phomdaen Souvanna, Senior Analyst, Abt Associates, Phomdaen_Souvanna@abtassoc.com, 617.520.2452
Siobhan Mills De La Rosa, Associate/Scientist, Abt Associates, Siobhan_Mills@abtassoc.com, 301.968.4405
RESEA E-TA Inbox: RESEA@abtassoc.com