Evaluating the Added Value of Offering Technical Assistance Following a Training Program
Dana Keener, Ph.D., ICF Macro
2009 AEA Annual Meeting
November 12, 2009

Overview
Background/context of the study
What is technical assistance?
Intervention description
Methods
Results
Conclusions

Study Context
CDC's Rape Prevention and Education (RPE) Grant Program in the Division of Violence Prevention
Four regional trainings offered in summer 2006 to sexual violence prevention professionals
RPE wanted to explore the added value of offering TA after a traditional training session

What is Technical Assistance?
What → intentional, individualized, tangible help, aid, or assistance
For whom → provided to an individual, organization, or community
Why → for the purpose of increasing knowledge, skills, and abilities towards the successful achievement of a particular end goal or product

Four Dimensions of Technical Assistance in the Literature
Relationship quality
Individualization
Proactive design
Sufficient dosage

The Intervention
Purpose: to promote the application of effective training practices among sexual violence prevention professionals
Two components:
  A full-day training component
  A subsequent telephone-administered technical assistance component

Desired Outcomes of Intervention
Satisfaction with the training and TA
Motivation to use effective training practices
Knowledge and self-perceived ability to use effective training practices
Application of effective training practices

Training Component
6-hour in-person training
Repeated in 4 locations
Included PowerPoint, stories, worksheets, exercises, and group discussion
Small group sizes:
  Atlanta = 4 participants
  Chicago = 3 participants
  San Diego = 9 participants
  Hartford = 11 participants

Technical Assistance Component
Conducted via scheduled telephone conferences within 6 months of training
Two different intensity levels
Designed to extend and reinforce the training component
No new material presented
Focused on application of training content

Technical Assistance Component

Low-Intensity TA (Chicago and Hartford): one 60-minute group call, scheduled 90 days after training (n = 14)
High-Intensity TA (Atlanta and San Diego): three 60-minute group calls plus one 60-minute individual call, scheduled every 30-45 days after training (n = 13)
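To make the dosage contrast concrete, here is a minimal back-of-the-envelope sketch of the maximum contact time each schedule implies, assuming every offered call ran its full 60 minutes (an illustrative assumption; actual call lengths were not reported here).

```python
# Maximum TA contact time per condition, assuming each offered call
# lasted its full 60 minutes (an illustrative assumption).
low_intensity_minutes = 1 * 60            # one 60-min group call
high_intensity_minutes = 3 * 60 + 1 * 60  # three group calls + one individual call

print(f"Low-intensity TA:  {low_intensity_minutes} min")   # 60 min
print(f"High-intensity TA: {high_intensity_minutes} min")  # 240 min
```

Under that assumption, the high-intensity condition offered roughly four times the TA dosage of the low-intensity condition.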

Research Design and Methods
Longitudinal, quasi-experimental design
Data collected via surveys and observations

Participants
27 participants in total
Self-selected into the training session
Professional, well-educated
Involved in training others as part of their jobs

Group Assignment
Based on training city:
  Chicago and Hartford → low-intensity TA
  Atlanta and San Diego → high-intensity TA
No systematic differences observed between groups at pre-test

Points of Measurement

Timing of Data Collection               | Measure                     | Method
Before training session                 | Pre-Training Survey         | Self-report paper/pencil
During training session                 | Training Observation Tool   | Evaluator-completed paper/pencil
Immediately after the training session  | Post-Training Survey        |
Following each TA call                  | Post-TA Survey              | Web survey
Six months after the training session   | Six-Month Follow-Up Survey  |

Response Rates
Pre-Training → 100%
Post-Training → 100%
6-Month Follow-Up → 96%
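As a quick consistency check, assuming all 27 participants were in the denominator at every wave, the reported 96% corresponds to 26 of 27 respondents (26/27 ≈ 0.963, the only count out of 27 that rounds to 96%); a minimal sketch:

```python
# Response-rate arithmetic, assuming the full sample of 27 as the
# denominator at each wave; 26 at follow-up is inferred from the 96%.
n = 27
waves = [("Pre-Training", 27), ("Post-Training", 27), ("6-Month Follow-Up", 26)]
for label, respondents in waves:
    print(f"{label}: {respondents}/{n} = {respondents / n:.0%}")
```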

Measures
Demographics/Participant Characteristics
Organizational Characteristics
Satisfaction with Training and TA
TA Engagement
Effective Training Practices:
  Attitudes/Motivation
  Self-Perceived Ability (general and specific)
  Knowledge (knowledge score)
  Improvement
  Application

Results

Training Outcomes
Participants were highly satisfied with the training
Participants reported significant increases in:
  Perceived ability to plan, implement, and evaluate training programs
  Knowledge scores
Outcomes did not vary based on the size of the training group
Outcomes did not diminish 6 months after the training, regardless of participation in TA

TA Engagement
Defined as participating in at least 50% of the calls offered (see the sketch below)
15 engaged participants:
  9 from the high-intensity group
  6 from the low-intensity group
Participants from the two smallest trainings were more likely to be engaged in the TA component:
  6 of 7 from the smallest trainings were engaged (86%)
  9 of 20 from the largest trainings were engaged (45%)
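A minimal sketch of how this engagement rule could be operationalized; the participant IDs, attendance counts, and helper name below are hypothetical, not data from the study.

```python
# Engagement rule from the slide above: a participant counts as
# "engaged" if they attended at least 50% of the TA calls offered.
def is_engaged(attended: int, offered: int) -> bool:
    return offered > 0 and attended / offered >= 0.5

# Hypothetical participants (id, calls attended, calls offered);
# high-intensity sites were offered 4 calls, low-intensity sites 1.
participants = [
    ("P01", 3, 4),  # high-intensity, 3 of 4 calls -> engaged
    ("P02", 1, 4),  # high-intensity, 1 of 4 calls -> not engaged
    ("P03", 1, 1),  # low-intensity, attended the single call -> engaged
    ("P04", 0, 1),  # low-intensity, missed the call -> not engaged
]

engaged = [pid for pid, attended, offered in participants
           if is_engaged(attended, offered)]
print(engaged)  # ['P01', 'P03']
```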

Technical Assistance Outcomes
No change in self-perceived ability was observed from post-training to the 6-month follow-up
No differences were found between the high- and low-intensity groups

Technical Assistance Outcomes
Participants in the high-intensity TA group reported more improvement on training tasks than the low-intensity group
Engaged participants reported more improvement on training tasks than unengaged participants at six months
However, the engaged participants already had greater self-perceived abilities before the TA component

Technical Assistance Outcomes
Participants who were engaged in the TA fared better at the six-month follow-up than participants who were not engaged.
BUT, most of the differences between engaged and unengaged participants preceded the TA intervention.
The results say more about who was inclined to participate in TA than about the impact of the TA component on the outcomes we measured.

Conclusions
Full-day, interactive, skills-based training with small groups (fewer than 12 participants) can be very effective!
Brief, one-size-fits-all TA is unlikely to contribute much to outcomes, particularly when it follows training that was already effective.
Don't throw in the towel on TA, though! Tailored TA programs need to be tested under other conditions.
One size DOES NOT FIT ALL!

Recommendations for Future TA
Conduct a brief needs assessment among training participants to identify and select appropriate participants for a follow-up TA program.
Develop individualized TA goals and objectives for each participant.
Develop an individualized TA plan for each participant to meet their goals.
Schedule TA proactively, rather than waiting to be asked for assistance.