ASSURING THAT TRAINING HAS IMPACT: EVALUATING A LARGE AND COMPLEX TRAINING SYSTEM
Child Welfare Evaluation Summit
Washington, D.C. | August 30, 2011


Presenters: Barrett L. Johnson, LCSW; Cynthia F. Parry, Ph.D.; & Leslie W. Zeitler, LCSW
California Social Work Education Center (CalSWEC), UC Berkeley School of Social Welfare

2 Objectives
At the close of the workshop, participants will:
 Understand the benefits and uses of system-wide training evaluation data.
 Understand the bases for decisions about the training evaluation system’s purpose, design, scope, and standardization.
 Apply concepts from the workshop to planning evaluation systems in their own states.

3 Why evaluate child welfare training?
 Most practice improvement initiatives involve training.
 We spend a great deal of funds on training.
 Very few training programs systematically assess the impact of training on trainees’ knowledge, skills, or ability to transfer those skills to the job.
 “The tail wags the dog”: evaluation forces the entire system to focus on the specific knowledge, skills, and values most essential to effective practice.

4 Key Partners in California’s Child Welfare In-Service Training Evaluation

5 Importance to Practice Community…
 Why are training evaluation efforts important to practitioners and administrators? They:
  Assure compliance with regulations
  Allow participation in curriculum review and revision, and adjustments to content
  Assure that the workforce is prepared
  Give structure for supporting Transfer of Learning
  Make the link to outcomes

6 Key Steps
 Macro Evaluation Team convenes, begins planning.
 PIP mandates development & implementation of Framework for Training Evaluation [1]
 2004 – Framework completed and adopted
 2004 to 2009 – Partners implement Framework
 Late 2008/Early 2009 – Begin strategic planning process for next 2-year period
 Fall 2009 – Implementation of next strategic plan commences
 Fall 2011 – Process begins on next strategic plan

[1] Parry, C., & Berdie, J. (2004). Training evaluation framework report. Berkeley, CA: California Social Work Education Center.

7 Timeline of Activities

8 Framework Decision Points
 What is the purpose of the evaluation?
  Providing feedback
   System or course improvement
   Staff learning/skill mastery
  Accountability
   Documentation of training effectiveness
   Evidence of individual competence
  Supporting planning and decision-making
   Program development
   Individual needs/skill gaps

9 Framework Decision Points (cont.)
 How rigorous will the evaluation need to be to ensure valid decisions?
  Stakes/consequences for participants
 What will be the focus/scope of the evaluation?
  Training system
  Course or series of courses
  Content area
  Specific KSA/learning objective

10 Framework Decision Points (cont.)
 What level(s) of evaluation are desired?
 What level of standardization is desired or required?
 How will results be disseminated, to whom, how often, and for what purposes?
  Who are the audiences for the information?
  What are the best methods for dissemination? Best timetables?
  What are appropriate and inappropriate uses for the data?
 How will confidentiality and/or protection of human subjects be addressed?

11 Framework Decision Points (cont.)
 What resources are available/needed?
  Dollars
  Evaluator skills
  Staff time
  Training related to implementation of the evaluation
  Data tracking, entry, storage, QA

12 Framework Decisions in California
 Purpose/Use = Program/course improvement
 Focus/scope = Content area
 Priority content areas:
  Assessment of Safety, Risk & Protective Capacity
  Engaging Families in Case Planning & Mgmt
  Human Development
  Placement/Permanence
  Child Maltreatment Identification

13 Evaluation Rigor
 No reporting of individual results except in LA
 No personnel consequences attached
 Careful attention to validity of measurement tools
  Multiple levels of item review
  Statistical item analysis
  Analysis of differential item functioning (DIF)
  Rasch modeling used to build an item bank and allow interpretation across multiple years and across test and curriculum versions
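As a rough illustration of the item-banking idea, the PROX normal approximation gives a quick estimate of Rasch item difficulties from a 0/1 response matrix. This is a minimal sketch with made-up data, not CalSWEC’s actual calibration procedure (production Rasch analysis uses dedicated software):

```python
import math

def prox_item_difficulties(responses):
    """Estimate Rasch item difficulties with a simple PROX-style
    normal approximation: difficulty = -logit(proportion correct),
    centered so the mean difficulty is zero."""
    n_items = len(responses[0])
    diffs = []
    for i in range(n_items):
        p = sum(row[i] for row in responses) / len(responses)
        p = min(max(p, 0.01), 0.99)  # clamp to avoid infinite logits
        diffs.append(-math.log(p / (1 - p)))
    mean = sum(diffs) / n_items
    return [d - mean for d in diffs]

# Toy response matrix: rows = trainees, columns = items (1 = correct)
resp = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [1, 1, 0],
]
difficulties = prox_item_difficulties(resp)
print(difficulties)  # harder items get larger (more positive) values
```

Centering the difficulties on zero is what lets items from different test versions be placed on one common scale.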

14 Framework Decisions in CA
 Levels of Evaluation
  Level 1: Tracking attendance (demographics)
  Level 2: Formative evaluation of training courses (course level: curriculum content & methods)
  Level 3: Satisfaction and opinion of the trainees
  Level 4: Trainee knowledge acquisition
  Level 5: Skills acquisition (as demonstrated in class)
  Level 6: Transfer of learning (TOL: use of knowledge and skill on the job)
  Level 7: Agency/client outcomes (degree to which training affects achievement of specific agency goals or client outcomes)
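One way to make the seven levels operational is as a lookup table for tagging evaluation instruments in reports. The level names below paraphrase the slide; the `label` helper is purely hypothetical:

```python
# The seven evaluation levels from the CalSWEC framework, paraphrased
EVAL_LEVELS = {
    1: "Tracking attendance (demographics)",
    2: "Formative evaluation of training courses",
    3: "Trainee satisfaction and opinion",
    4: "Knowledge acquisition",
    5: "Skills acquisition (in class)",
    6: "Transfer of learning (on the job)",
    7: "Agency/client outcomes",
}

def label(instrument, level):
    """Tag an evaluation instrument with its framework level."""
    if level not in EVAL_LEVELS:
        raise ValueError(f"unknown evaluation level: {level}")
    return f"{instrument}: Level {level} ({EVAL_LEVELS[level]})"

print(label("Common Core knowledge post-test", 4))
```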

15 Framework Decisions in CA  Levels of Training Evaluation (cont.)  Key concept: Chain of Evidence Establishes a linkage between training and desired outcomes for the participant, the agency, and the client such that a reasonable person would agree that training played a part in producing the desired outcome.

16 Framework Decisions in CA
Levels of Standardization
 Establishes standard competencies and learning objectives for the whole core
 Establishes six core areas where information is standard (5 are evaluated at the knowledge level)
 Establishes one core area (Child Maltreatment Identification) where delivery and information are standard (evaluated at the skill level)

17 Venn Diagram of Standardization
[Venn diagram summarizing three tiers of standardization:]
 Child Maltreatment ID: standard learning objectives and competencies; standardized information; standardized delivery; embedded skill evaluation
 5 other priority areas: standard learning objectives and competencies; standardized information; knowledge evaluation
 Other content in core: standard learning objectives and competencies

18 Framework Decisions in California
 Dissemination of Results
  Guided by human subjects considerations
  Multiple reports for multiple audiences:
   Statewide organizations
   Regional Training Academies
   Trainers and Administrators
 Resources = intensive, multi-year commitment
  CalSWEC staff time; staff time from state, regional, and private partners; evaluation consultant services; hardware and software.

19 Let’s see how far we’ve come…

20 What do we know now? (Summary of Progress/Results by Level)
 Level 1: Demographic data captured for 5,253 new child welfare social workers and 663 supervisors since formal evaluations began in 2005 (data through 6/30/11).
 Level 2: Collected and analyzed data on training content and delivery, resulting in improvements to the Common Core.
 Level 3: This level of evaluation is completed at the regional level, not on a statewide basis.

21 Summary of Progress/Results, cont’d
 Level 4: Knowledge Tests
  For topics in which knowledge pre- and post-tests were administered (Child & Youth Development; Case Planning & Case Management; and Placement & Permanency):
   Trainees (new CWWs) improved from pre- to post-test at a statistically significant level.
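A pre/post gain of this kind is typically tested with a paired-samples t-test. The scores below are fabricated for illustration and are not the actual Common Core data:

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post knowledge-test scores for ten trainees
pre  = [62, 58, 71, 66, 54, 69, 60, 73, 65, 57]
post = [74, 70, 78, 75, 68, 80, 71, 84, 77, 69]

gains = [b - a for a, b in zip(pre, post)]
g_bar, sd = mean(gains), stdev(gains)
t = g_bar / (sd / math.sqrt(len(gains)))  # paired-samples t statistic
print(f"mean gain = {g_bar:.1f}, t = {t:.2f}, df = {len(gains) - 1}")
```

With real data one would use `scipy.stats.ttest_rel` to get an exact p-value rather than comparing the statistic to a critical value by hand.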

22 Summary of Progress/Results, cont’d
 Level 4, continued:
  IV-E effects: IV-E trainees have scored higher at pre- and posttest.
  Posttest score differences have been statistically significant for all modules over the past 2 fiscal years, and for two of the three modules since January of
  IV-E trainees achieved significant gains from pre- to posttest.

23 Summary of Progress/Results, cont’d
 Level 4, continued:
  For the topic in which a knowledge post-test only has been administered (Critical Thinking in Child Welfare Assessment: Safety, Risk & Protective Capacity):
   Although no formal standard has been established to serve as a yardstick of mastery, the data indicate that trainees leave the classroom with a substantial level of knowledge related to the learning objectives for the course.

24 Summary of Progress/Results, cont’d
 Level 5: For topics in which skill is assessed in the classroom via embedded evaluation pertaining to identification of physical and sexual abuse (Child Maltreatment Identification, Parts 1 and 2):
  At least 87% (and in most years 90% or more) of new CWWs made 3 out of 4 correct decisions when asked to indicate whether or not child maltreatment occurred in a given case scenario.
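Scoring the 3-of-4 criterion is a simple aggregation over per-scenario decisions. This sketch uses invented decision data, not the actual embedded-evaluation results:

```python
# Hypothetical embedded-evaluation results: each trainee judges four
# case scenarios (True = correct maltreatment decision)
decisions = [
    [True, True, True, False],
    [True, True, True, True],
    [True, False, True, True],
    [True, True, False, False],
    [True, True, True, True],
]

passed = sum(1 for row in decisions if sum(row) >= 3)  # 3-of-4 criterion
rate = passed / len(decisions)
print(f"{passed}/{len(decisions)} trainees met the criterion ({rate:.0%})")
```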

25 Summary of Progress/Results, cont’d
 Level 6: Completed regional studies on Transfer and Field Training. (See the 2009 White Paper, noted at the end of this PPT.)
 Level 7: Under the Framework, efforts have focused on rigorously developing the building blocks at the lower levels (as part of developing a chain of evidence). The overarching goal is to link training interventions to outcomes for children and families served by CWS.

26 Sample Report  Review Statewide Trainer/Administrator Report (June 2011)

27 Where Are We Going?

28 Where Are We Going (by Level)?
 Level 1:
  Lineworker Core: Demographic profiles and related analyses of lineworker core test data will continue.
  Supervisor Core: Demographic profiles and related analyses of supervisor core test data will commence.
  Analyses of IV-E trainee test data.

29 Where Are We Going? (cont’d…)
 Level 2:
  Formative evaluations for observers (and separate ones for trainers) will be divided into assessments of content and assessments of delivery.
  Formative evaluation materials will also be developed for a new statewide venture: the e-learning platform.
 Level 3: These efforts will continue solely at the regional and county levels.

30 Where Are We Going? (cont’d…)
 Level 4:
  Continue knowledge tests (multiple-choice test questions, aka “test items”) for the curricula currently evaluated at this level.
  Move toward diagnostic testing:
   Focus on key learning objectives
   Make targeted revisions to training based on evaluation data.

31 Where Are We Going? (cont’d…)
 Level 4 (cont’d):
  Continue analysis of differential item functioning by demographic groups.
  Pilot study to look at the possible effect of stereotype threat on trainee test performance (PCWTA).
  Explore trainer-level differences in test item performance to provide feedback on fidelity of curriculum delivery.
  Compare/monitor differences in performance for Title IV-E vs. non-IV-E students.
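Differential item functioning of the sort described here is often screened with the Mantel-Haenszel common odds ratio, stratifying examinees by total score so that ability differences between groups do not masquerade as DIF. This is a minimal sketch with toy data; it is not CalSWEC’s analysis code, and the function name is invented:

```python
from collections import defaultdict

def mantel_haenszel_or(totals, group, correct):
    """Mantel-Haenszel common odds ratio for one test item, stratifying
    examinees by total test score. A ratio far from 1 flags possible
    differential item functioning (DIF) between the two groups."""
    strata = defaultdict(lambda: [[0, 0], [0, 0]])  # total -> 2x2 table
    for t, g, c in zip(totals, group, correct):
        strata[t][g][0 if c else 1] += 1
    num = den = 0.0
    for (r_ok, r_no), (f_ok, f_no) in strata.values():
        n = r_ok + r_no + f_ok + f_no
        num += r_ok * f_no / n  # reference correct * focal incorrect
        den += r_no * f_ok / n  # reference incorrect * focal correct
    return num / den if den else float("inf")

# Toy data: totals = total test score, group = 0 (reference) / 1 (focal),
# correct = whether each examinee answered the studied item correctly
totals  = [3, 3, 3, 3, 2, 2, 2, 2]
group   = [0, 0, 1, 1, 0, 0, 1, 1]
correct = [1, 1, 1, 0, 1, 0, 1, 0]
print(mantel_haenszel_or(totals, group, correct))  # → 3.0
```

An odds ratio of 3 here means that, after conditioning on total score, reference-group examinees in this toy sample have three times the odds of answering the item correctly.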

32 Where Are We Going? (cont’d…)
 Level 5:
  Continue analysis of differential performance by demographic groups.
  Pilot the embedded evaluation of the SDM™ version of the Critical Thinking in Child Welfare Assessment: Safety, Risk & Protective Capacity curriculum.
  Revise the embedded evaluation for the Casework Supervision module.
  Pilot a neglect scenario as part of an embedded evaluation (PCWTA).

33 Where Are We Going? (cont’d…)
 Level 6: Conduct a feasibility study of Transfer of Learning evaluations as applied at a statewide level, based on findings and lessons learned from initial TOL evaluations.
 Level 7: Continue building the Chain of Evidence to link training to outcomes. Conduct a feasibility study of linking training to outcomes evaluation as applied at a statewide level. (May link this to program evaluation efforts, or to research related to the Statewide Research Agenda for CWS.)

34 Where Are We Going? (cont’d…)
 Other training evaluation projects:
  Attitudes/Values Evaluation re: racial differences in identifying physical abuse. (PCWTA to pilot.)
  Attitudes/Values Evaluation re: impact of attitudes toward sexual abuse disclosures. (Collaboration with UNC School of Medicine.)
  Trainer Evaluation: Identify trainer-related differences in test item difficulty. Develop and obtain feedback on a model of trainer evaluation.
  Quality Assurance: A small group of representatives from around the state will observe one Phase 1 training and one Phase 2 training in each region.

35 Exercise
1. Split into groups (dyads or triads) from different states.
2. Small group process:
   a. Consider the training system in your home state. Briefly discuss each of the six decision points, and answer the following questions:
      1. Where is your state in the process of implementing a child welfare training evaluation system (e.g., planning stages, early implementation, mature system)?
      2. What key decisions have you made with respect to each of the decision points?
      3. What key decisions do you still need to make with respect to these decision points? How would you go about making them?
3. Brief report out: What is the one take-home point from your discussion that you would like to share?

36 We’re going to keep an eye on outcomes for children & families…

37 For More Information…
 Refer to the full text of the white paper entitled Evaluation of the California Common Core for Child Welfare Training: Implementation Status, Results and Future Directions (December 2009), at:
 OR, refer to the summary table (of the white paper) entitled Where We’ve Been and Where We’re Going: Summary Table of Training Evaluation Efforts in California (Dec 2009), also at:
 For more information on the original Framework, see: Parry, C., & Berdie, J. (2004). Training evaluation framework report. Berkeley, CA: California Social Work Education Center.

Barrett Johnson –
Leslie Zeitler –
Cynthia Parry –