Procuring and Selecting an Independent Evaluator

Presentation transcript:

Procuring and Selecting an Independent Evaluator
RESEA Evaluation Technical Assistance
November 1, 2019

Gloria Salas-Kos, Senior Program Analyst and RESEA Evaluation TA Coordinator, Office of Policy Development and Research, ETA, U.S. DOL

Cycles of learning and doing…
Plan Evaluation of Program → Conduct Evaluation → Reflect on & Communicate Evaluation Findings → Refine Program or Evaluation

In Our Last Webinar, We…
- Described key steps in planning your evaluation
- Discussed the length of time each step may take
- Discussed potential hurdles and strategies for addressing them
To find the webinar recording of "What Evaluation Details Do I Need to Plan for and How Long Will the Evaluation Take?" please visit: https://www.workforcegps.org/events/2019/05/30/13/54/What-Evaluation-Details-Do-I-Need-for-a-Plan-and-How-Long-Will-It-Take

We will review:
- Types of potential evaluators;
- Appropriate evaluator qualifications and experience;
- The Request for Proposals (RFP) process;
- Bid assessment and evaluator selection; and
- University partnerships.
Note: This session covers basic evaluator selection and procurement information.

Presenters:
- Siobhan Mills De La Rosa, Associate, Abt Associates
- Hannah Engle, Senior Analyst, Abt Associates
- Lesley Hirsh, Assistant Commissioner, Research & Information, New Jersey Department of Labor & Workforce Development

Has your RESEA staff ever worked with an independent evaluator?
- Yes – we do this all the time!
- Yes – but only occasionally.
- No.
- I'm not sure.

Does your agency work with an outside organization or university partner to evaluate its programs?
- Yes!
- No.
- I'm not sure – how can I find out?

Where can I find expert evaluators?
- Research and evaluation teams in your agency or another state agency
- University research centers
- Research and evaluation firms
- Individual research consultants

What can my internal RESEA team do?

What are the benefits of working with an experienced independent evaluator?
Experienced evaluators do this all the time! They have specialized experience and expertise in:
- Research methods
- Statistical analysis and data collection
- Protection of human subjects
Their findings are independent and transparent:
- No undue influence
- Can potentially be replicated

Tips for Overseeing Evaluation Efforts
- Designate a primary point of contact
- Establish regular communication with your evaluator (regular calls, progress reports)
- Notify your evaluator about changes to program implementation or evaluator procedures

Considerations in Selecting an Evaluator: Evaluator Qualifications and Funding

What qualifications should we look for?
- Experience conducting evaluations of the design type you are planning, and experience with workforce programs
- Lead staff with an advanced (i.e., post-graduate) degree and considerable experience (e.g., 5+ years)
- Junior and mid-level staff to support study activities; they may have less education and/or experience
- Capacity and resources
- Other specialized knowledge

Additional qualifications for impact studies
Random assignment (RA) studies:
- Leading RA studies of similar size
- Incorporating RA into the claimant selection algorithm in existing data systems (see the illustrative sketch below)
- Monitoring RA in complex service delivery environments
Quasi-experimental designs:
- Creating comparison groups using your preferred methods
- Analyzing large datasets created using multiple data sources
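To make the point about incorporating RA into a claimant selection algorithm more concrete, here is a minimal sketch. It is not from the original slides: the function name, ID format, 50/50 split, and seed are hypothetical examples of the kind of reproducible assignment logic an evaluator with RA experience would design and audit, not a prescribed design.

```python
import random

def assign_claimants(claimant_ids, treatment_share=0.5, seed=20191101):
    """Illustrative (hypothetical) random assignment of claimants to a
    treatment (RESEA services) group or a control group."""
    rng = random.Random(seed)  # fixed seed so the assignment can be reproduced and audited
    assignments = {}
    for cid in claimant_ids:
        # Each claimant is assigned independently with probability `treatment_share`
        group = "treatment" if rng.random() < treatment_share else "control"
        assignments[cid] = group
    return assignments

# Example: assign ten hypothetical claimant IDs
if __name__ == "__main__":
    sample_ids = [f"claimant_{i:03d}" for i in range(10)]
    print(assign_claimants(sample_ids))
```

In practice, a step like this would be embedded in the state's existing claimant selection process and monitored over time; the sketch only illustrates the underlying logic.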

Considerations for the Evaluation Budget
- Evaluator qualifications
- Sample size
- Data complexity
- Data collection
- Travel and other costs
- Writing and revisions

RESEA Funds for Evaluation
- States can use up to 10% of their RESEA funds for evaluation
- This may not be sufficient for larger, multi-year impact evaluations; consider:
  - Discrete tasks that match the funding period
  - Optional tasks
  - Pooled evaluations

Evaluator Procurement: Competitive vs. Non-Competitive Processes

Competitive Evaluator Procurement
1. Writing your Request for Proposals (RFP)
2. Advertising your RFP
3. Assessing bids and selecting an evaluator

Non-Competitive Evaluator Procurement
- Existing evaluation partnerships
  - Research organizations
  - State universities
- Internal evaluation unit in the workforce agency or in another agency in the state
Always follow state and local procurement rules when selecting your evaluator!

Competitive Procurement, Step 1: Writing a Request for Proposals

What is a Request for Proposals (RFP)?
- A formal announcement inviting qualified organizations to bid on the evaluation contract
- Should provide guidance on:
  - Scope of work
  - Evaluator qualifications
  - Proposed budget
  - Project and application timelines
  - Legal requirements

The Scope of Work (SOW) Should Describe…
- Your initial thoughts on research questions and evaluation design
- Tasks, activities, and deliverables you expect your evaluator to complete
- The level of effort and timeline associated with each task and deliverable
- Desired evaluator qualifications
The SOW may also be called a Performance Work Statement. We strongly recommend that procurement and legal staff review the SOW and RFP!

Include your evaluation timeline and budget. For more information on evaluation timelines, see "What Evaluation Details Do I Need to Plan for and How Long Will It Take?"

Evaluations Take Time
- Evaluations are often multi-year efforts
- Implementation and outcomes studies may take 1.5+ years
- Impact studies may take 4+ years
Illustrative example of an impact study timeline (Years 1 through 4):
- Evaluation planning, design, and launch
- Assign claimants to an intervention group or control group until the target sample size is reached
- Track claimant outcomes (2-quarter follow-up period after claim approval)
- Collect outcomes data as it becomes available
- Data analysis
- Complete final report

Sample Impact Study Level of Effort (annual estimated FTE by staff title)
- Principal Investigator (.15 FTE): refine research questions (RQs); design the study and impact models; oversee analysis
- Project Director (.25 FTE): project management; oversee all tasks; review materials and data collection instruments
- Mid-level Analyst (.3 FTE): design study procedures; serve as site point of contact; train RESEA staff on procedures; analyze data
- Junior-level Analyst: collect and clean data; support the project as necessary

How can my evaluator demonstrate their qualifications?
- Descriptions of previous evaluation work
- Biographies and CVs of proposed staff
- Descriptions of the company's institutional capacity
- Other information required by your state
Request and check references!

Competitive Procurement, Step 2: Advertising Your RFP

Getting the Word Out
You can advertise through a number of avenues:
- Your state or agency website listing contract opportunities
- Evaluation websites (e.g., www.eval.org, www.appam.org)
- Direct communication with evaluators from your evidence base (letters, emails)
- University websites
- Social media (Facebook, LinkedIn, Twitter, etc.)
- Local, state, and other relevant newsletters or publications
Make sure your RFP, advertisement, and procurement process all align with local, state, and federal rules regarding procurement!

Competitive Procurement, Step 3: Assessing Bidder Responses

Assessing Bidder Applications
- Systematically assess each application, using a rubric or points system:
  - Determine and codify the most important application factors
  - Assign points to each factor
  - Rate applications on these factors
  - Select evaluators in the top range of scores (a simple scoring sketch follows below)
- Cost is important, but it is not the only factor you should consider!
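To illustrate the points-based assessment described above, here is a small sketch that is not from the original slides. The criteria, weights, 0-100 ratings, and bidder names are hypothetical; a state's actual factors and weights would come from its own procurement rules.

```python
# Hypothetical rubric: weights sum to 1.0; cost is one factor but not the only one.
WEIGHTS = {
    "evaluation_experience": 0.35,
    "staff_qualifications": 0.25,
    "technical_approach": 0.25,
    "cost": 0.15,
}

def weighted_score(ratings):
    """Combine 0-100 ratings on each criterion into a single weighted score."""
    return sum(WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

# Hypothetical example ratings for two bidders
bids = {
    "Bidder A": {"evaluation_experience": 90, "staff_qualifications": 85,
                 "technical_approach": 80, "cost": 70},
    "Bidder B": {"evaluation_experience": 70, "staff_qualifications": 75,
                 "technical_approach": 85, "cost": 95},
}

# Rank bidders from highest to lowest total score
for name, ratings in sorted(bids.items(), key=lambda b: weighted_score(b[1]), reverse=True):
    print(f"{name}: {weighted_score(ratings):.1f}")
```

A codified rubric like this keeps the selection systematic and documentable; top-scoring bids can then be checked against references and prior work before an award is made.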

If it sounds too good to be true… it probably is! Ask clarifying questions, check references, and ask for documentation of previous work!

Selecting Your Evaluator
Once you have made a selection, follow state procurement rules for:
- Notifying the selected bidder of the award
- Negotiating any final contract details
- Signing the contract and beginning work
Make it a high priority to keep the process moving; in some states, these final steps can take several months!

University Partners: An Alternative to Third-Party Evaluators

What are university partnerships?
- Established relationships with research universities for the purposes of evaluation
- The relationship is ongoing:
  - University partners develop relationships with state workforce agencies
  - University partners come to learn state data and workforce systems in depth
  - State workforce agencies can update evaluation priorities from year to year as needs change

Case Study: NJ and the Heldrich Center for Workforce Development at Rutgers University
- The NJ Department of Labor and Workforce Development has an evaluation partnership with the John J. Heldrich Center for Workforce Development at Rutgers University
- The Heldrich Center is located in the Edward J. Bloustein School of Planning and Public Policy

Evaluation Partnership Origin
- The John J. Heldrich Center for Workforce Development at the Edward J. Bloustein School of Planning and Public Policy, Rutgers University, developed a baseline data warehouse in the 1990s and early 2000s
- Wage matching was used for the Eligible Training Provider "Consumer Report Card" in 2004

Evaluation Partnership Evolution
- 2012: WDQI Grant Round 2; SLDS Grant
- 2014: WDQI Grant Round 4; mapping the workforce system relative to customers; UI worker profiling and reemployment services (WPRS) initiative, replicated by GAO in 2018; balanced scorecard metrics
- 2015: Evaluated sector-strategy training outcomes; established the New Jersey Education to Earnings Data System (NJEEDS); analyzed post-college earnings by major and distributed results to 39 colleges
- 2016: Interactive dashboards (higher education and State WIA); WIOA Adult and Dislocated Worker evaluation
- 2017: CTE employment and earnings outcomes study; interactive data visualization tools for jobseekers

2018 to Present
- WDQI Round VII: build out infrastructure and add data sources; improve agency and research access to linked data; conduct evaluations (e.g., state-funded apprenticeships)
- Task Order Memorandum of Understanding supported by state dollars: accelerate learning and capacity building around the partnership and expand the portfolio of evaluation activity
- NJ Career Network: developing online tools to aid jobseekers on their journey from career exploration through job placement
- RESEA evaluation planning and execution

Closing Remarks: Lawrence Burns, Reemployment Coordinator, Office of Unemployment Insurance, ETA, U.S. DOL

Using CLEAR – A Demonstration August 19, 2019 at 2pm Eastern

- Gloria Salas-Kos, Evaluation Technical Assistance Coordinator, U.S. DOL – Office of Policy Development and Research, Salas-Kos.Gloria@dol.gov
- Lawrence Burns, Reemployment Coordinator, U.S. DOL – Office of Unemployment Insurance, Burns.Lawrence@dol.gov
- Siobhan Mills De La Rosa, Associate, Abt Associates, Siobhan_Mills@abtassoc.com, 301.968.4405
- Hannah Engle, Senior Analyst, Abt Associates, Hannah_Engle@abtassoc.com, 301.347.5796
- RESEA EvalTA Inbox: RESEA@abtassoc.com