Lessons from the United States: Evaluating Employment Services is Neither Easy Nor Cheap
November 7, 2009
Sheena McConnell, Peter Schochet, Alberto Martini

Presentation transcript:


Evaluation Has Informed U.S. Policy

• Evaluations have led to new programs
  – Worker Profiling and Reemployment Services system
  – Self-Employment Assistance program
• Expanded funding for programs found effective
  – Job Corps
• Reduced funding for programs found ineffective
  – JTPA youth program
• Provided administrators useful information
  – Individual Training Account experiment

Step 1: Develop Research Questions

• What to evaluate?
  – Whole program or component?
• For whom is the program effective?
  – Decide on populations of interest prior to evaluation
• What is the counterfactual?
  – "No service" environment does not exist

Step 2: Choose Evaluation Design

• Consider experiments
  – Provide credible estimates
  – Widespread in the U.S. and developing countries
• Nonexperimental approaches
  – Need good-quality data
  – Careful analysis
  – Not as credible

Political and Program Support Necessary for Experiments

• More politically acceptable if:
  – Pilot of a new program
  – Excess demand for the program's services
  – Treatments differ, no control group
• Cooperation from program staff requires:
  – Discussing the importance of the experiments
  – Reducing burden on the program: small control groups, quick random assignment
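The "small control groups, quick random assignment" point can be illustrated with a minimal sketch (not from the slides; all names are hypothetical): applicants are assigned at intake, with the control share held deliberately low to limit the burden on the program.

```python
import random

def assign(applicant_ids, control_share=0.2, seed=42):
    """Randomly assign applicants at intake. A small control share
    (here 20%) keeps most applicants in the program, easing staff concerns."""
    rng = random.Random(seed)  # fixed seed makes the assignment auditable/reproducible
    assignments = {}
    for applicant_id in applicant_ids:
        group = "control" if rng.random() < control_share else "treatment"
        assignments[applicant_id] = group
    return assignments

groups = assign(range(1000))
n_control = sum(1 for g in groups.values() if g == "control")
print(n_control, "of", len(groups), "applicants assigned to control")
```

Because assignment happens the moment an applicant is entered, it adds essentially no delay to program intake, which is part of what makes staff cooperation attainable.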

Experiments Take More Time and May Cost More

• Experiments often take years
  – Sample intake period
  – Follow-up period
• Costs incurred only by experiments:
  – Site recruitment
  – Staff training
  – Conducting and monitoring random assignment
• But experiments require smaller sample sizes than nonexperimental evaluations
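The sample-size claim can be made concrete with a standard two-arm power calculation (a sketch, not from the slides): the number of observations needed per arm to detect an impact of a given size at conventional significance and power levels.

```python
import math
from statistics import NormalDist

def n_per_arm(effect_size, sd, alpha=0.05, power=0.80):
    """Sample size per arm for a two-arm experiment comparing means:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / effect_size^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    n = 2 * ((z_alpha + z_beta) ** 2) * (sd ** 2) / (effect_size ** 2)
    return math.ceil(n)

# Hypothetical numbers: detect a $500 annual earnings impact
# when earnings have a standard deviation of $5,000.
print(n_per_arm(effect_size=500, sd=5000))  # 1570 per arm
```

Nonexperimental designs typically need larger samples than this benchmark, because matching or statistical adjustment discards observations and adds estimation noise on top of the pure sampling variance assumed here.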

Step 3: Collect Data

• Outcomes
  – Survey versus administrative data
• Baseline
  – Study form, surveys, administrative data
• Service receipt
  – Both treatment and control/comparison groups
• Program implementation
  – Understand what is being evaluated
• Program costs
  – To determine cost-effectiveness

Recommendations

• Collect high-quality data
• Consider conducting experiments
• Invest in rigorous evaluations