Dana Keener, Ph.D., ICF Macro. 2009 AEA Annual Meeting, November 12, 2009.

• Background/context of the study
• What is technical assistance?
• Intervention description
• Methods
• Results
• Conclusions

• CDC’s Rape Prevention and Education (RPE) Grant Program in the Division of Violence Prevention
• Four regional trainings offered in summer 2006 to sexual violence prevention professionals
• RPE wanted to explore the added value of offering technical assistance (TA) after a traditional training session

• What: intentional, individualized, tangible help, aid, or assistance
• For whom: provided to an individual, organization, or community
• Why: to increase knowledge, skills, and abilities toward the successful achievement of a particular end goal or product

• Relationship quality
• Individualization
• Proactive design
• Sufficient dosage

• Purpose: to promote the application of effective training practices among sexual violence prevention professionals
• Two components:
◦ A full-day training component
◦ A subsequent telephone-administered technical assistance component

• Satisfaction with the training and TA
• Motivation to use effective training practices
• Knowledge and self-perceived ability to use effective training practices
• Application of effective training practices

• Six-hour in-person training
• Repeated in four locations
• Included PowerPoint, stories, worksheets, exercises, and group discussion
• Small group sizes:
◦ Atlanta: 4 participants
◦ Chicago: 3 participants
◦ San Diego: 9 participants
◦ Hartford: 11 participants

• Conducted via scheduled telephone conferences within 6 months of training
• Two different intensity levels
• Designed to extend and reinforce the training component
• No new material presented
• Focused on application of training content

Low-Intensity TA (Chicago and Hartford):
• One 60-minute group call
• Scheduled 90 days after training
• n = 14

High-Intensity TA (Atlanta and San Diego):
• Three 60-minute group calls
• One 60-minute individual call
• Scheduled every ___ days after training
• n = 13
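
Read as a dosage comparison, the two conditions differ mainly in the amount of contact offered. A minimal sketch of that arithmetic, assuming the slide's "1-60 min" notation means one 60-minute call (so the high-intensity condition was offered four hours of TA versus one hour in the low-intensity condition):

```python
# Total TA minutes offered per condition, assuming each call lasted
# 60 minutes (a reading of the "1-60 min" notation, not stated outright).
low_intensity_minutes = 1 * 60             # one group call
high_intensity_minutes = 3 * 60 + 1 * 60   # three group calls plus one individual call

print(f"Low-intensity TA offered:  {low_intensity_minutes} min")   # 60
print(f"High-intensity TA offered: {high_intensity_minutes} min")  # 240
```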

• Longitudinal, quasi-experimental design
• Data collected via surveys and observations

• 27 participants in total
• Self-selected into the training session
• Professional, well-educated
• Involved in training others as part of their jobs

• Assignment based on training city:
◦ Chicago and Hartford → low-intensity TA
◦ Atlanta and San Diego → high-intensity TA
• No systematic differences observed between groups at pre-test
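
The slides do not say how baseline equivalence was checked. A minimal sketch of one common approach, an independent-samples t-test on a pre-test score; the data and the choice of test here are assumptions, not the study's actual analysis:

```python
# Hypothetical baseline-equivalence check between the two TA conditions.
# Scores are invented for illustration; the real pre-test measures and
# the test actually used are not named in the slides.
from scipy import stats

low_intensity_pretest = [3.2, 4.1, 3.8, 4.4, 3.5, 4.0, 3.9]
high_intensity_pretest = [3.6, 4.2, 3.7, 4.0, 3.4, 4.3, 3.8]

t, p = stats.ttest_ind(low_intensity_pretest, high_intensity_pretest)
print(f"t = {t:.2f}, p = {p:.3f}")  # a large p is consistent with "no systematic differences"
```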

Data collection schedule (timing, measure, method):
• Before the training session: Pre-Training Survey, self-report paper/pencil
• During the training session: Training Observation Tool, evaluator-completed paper/pencil
• Immediately after the training session: Post-Training Survey, self-report paper/pencil
• Following each TA call: Post-TA Survey, web survey
• Six months after the training session: Six-Month Follow-Up Survey, web survey

Response rates:
• Pre-Training: 100%
• Post-Training: 100%
• Six-Month Follow-Up: 96%

• Demographics/Participant Characteristics
• Organizational Characteristics
• Satisfaction with Training and TA
• TA Engagement
• Effective Training Practices:
◦ Attitudes/Motivation
◦ Self-Perceived Ability (general and specific)
◦ Knowledge (knowledge score)
◦ Improvement
◦ Application

RESULTS

• Participants were highly satisfied with the training
• Participants reported significant increases in:
◦ Perceived ability to plan, implement, and evaluate training programs
◦ Knowledge scores
• Outcomes did not vary with the size of the training group
• Outcomes did not diminish six months after the training, regardless of participation in TA
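
The slides report significant pre-to-post gains without naming the test; for a design where the same participants are measured twice, a paired comparison is the usual choice. A minimal sketch with invented knowledge scores (both the test choice and the data are assumptions):

```python
# Hypothetical paired pre/post comparison of knowledge scores for the
# same participants. Values are illustrative only.
from scipy import stats

pre  = [5, 6, 4, 7, 5, 6, 5, 4, 6, 5]
post = [7, 8, 6, 9, 7, 8, 6, 6, 8, 7]

t, p = stats.ttest_rel(pre, post)
print(f"paired t = {t:.2f}, p = {p:.4f}")  # p < .05 would indicate a significant gain
```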

• Engagement defined as participating in at least 50% of the calls offered
• 15 engaged participants:
◦ 9 from the high-intensity group
◦ 6 from the low-intensity group
• Participants from the two smallest trainings were more likely to engage in the TA component:
◦ 6 of 7 from the two smallest trainings were engaged (86%)
◦ 9 of 20 from the two largest trainings were engaged (45%)
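
The 50%-of-calls-offered rule is simple to make concrete. A sketch, assuming one call was offered in the low-intensity condition and four in the high-intensity condition, as the TA design above implies:

```python
# Classify a participant as "engaged" if they joined at least half of
# the TA calls offered to them (the definition used in the slides).
def is_engaged(calls_attended: int, calls_offered: int) -> bool:
    return calls_attended >= 0.5 * calls_offered

print(is_engaged(1, 1))  # True:  attended the single low-intensity call
print(is_engaged(2, 4))  # True:  attended half of the high-intensity calls
print(is_engaged(1, 4))  # False: attended fewer than half
```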

• No change in self-perceived ability was observed from post-training to the six-month follow-up
• No differences found between the high- and low-intensity groups

• Participants in the high-intensity TA group reported more improvement on training tasks than those in the low-intensity group
• Engaged participants reported more improvement on training tasks than unengaged participants at six months
• However, the engaged participants already had greater self-perceived abilities before the TA component began

• Participants who were engaged in the TA fared better at the six-month follow-up than participants who were not engaged.
• However, most of the differences between engaged and unengaged participants preceded the TA intervention.
• The results say more about who was inclined to participate in TA than about the impact of the TA component on the outcomes we measured.

• Full-day, interactive, skills-based training with small groups (fewer than 12 participants) can be very effective!
• Brief, one-size-fits-all TA is unlikely to contribute much to outcomes, particularly when it follows training that was already effective.
• Don’t throw in the towel on TA, though! Tailored TA programs need to be tested under other conditions. One size does not fit all!

• Conduct a brief needs assessment among training participants to identify and select appropriate participants for a follow-up TA program.
• Develop individualized TA goals and objectives for each participant.
• Develop an individualized TA plan for each participant to meet those goals.
• Schedule TA proactively, rather than waiting to be asked for assistance.