Phillips Associates: Learn 11 Surprising Techniques for Obtaining Powerful Data From Level 1 Evaluations

Presentation transcript:

Phillips Associates
Learn 11 Surprising Techniques for Obtaining Powerful Data From Level 1 Evaluations
Presented by: Ken Phillips, Phillips Associates
April 10, 2012

AGENDA
➤ Review the Kirkpatrick/Phillips five-level evaluation model
➤ Examine Level 1 evaluation facts
➤ Analyze the strengths and shortcomings of a sample Level 1 evaluation form
➤ Discover 11 tips for creating Level 1 evaluations that produce valued data

KIRKPATRICK/PHILLIPS EVALUATION MODEL

Level 1: Reaction
  Measurement focus: Participant reaction to a learning program
  Time frame: Conclusion of program

Level 2: Learning
  Measurement focus: Degree to which participants acquired new knowledge, skills, or attitudes
  Time frame: Conclusion of program, or within 6 to 8 weeks after

Level 3: Behavior
  Measurement focus: Degree to which participants applied what was learned back on the job
  Time frame: 2 to 12 months

Level 4: Results
  Measurement focus: Degree to which targeted business outcomes were achieved
  Time frame: 9 to 18 months

Level 5: ROI
  Measurement focus: Degree to which monetary program benefits exceed program costs
  Time frame: 9 to 18 months

LEVEL 1 FACTS*
➤ 92% of organizations evaluate at least some learning programs at Level 1
➤ Organizations that use Level 1 evaluations apply them, on average, to 78% of all learning programs
➤ 36% of organizations view the data collected as having high or very high value

*2009 ASTD research study, The Value of Evaluation: Making Training Evaluations More Effective

WHY THE DISCONNECT?
➤ Level 1 evaluations focus on learning-department data, not business-operation data
➤ Examples: facility, course design, facilitator, food
➤ The data collected is not used

WHAT'S THE SOLUTION?
➤ Create Level 1 evaluations that produce data with high perceived value for both you and your business stakeholders
➤ Aggregate the data and run trend lines or comparisons at the group, project, department, or regional level
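The aggregation step above can be sketched in a few lines of code. The data, field names, and groupings below are hypothetical, and the snippet is only a minimal illustration of averaging ratings by department (a comparison) and by session date (a trend line):

```python
# Minimal sketch: aggregate Level 1 ratings for comparisons and trend lines.
# The response records and field names ("dept", "session", "relevance")
# are invented for illustration.
from collections import defaultdict
from statistics import mean

responses = [
    {"dept": "Sales",   "session": "2016-Q1", "relevance": 6},
    {"dept": "Sales",   "session": "2016-Q2", "relevance": 5},
    {"dept": "Finance", "session": "2016-Q1", "relevance": 4},
    {"dept": "Finance", "session": "2016-Q2", "relevance": 6},
]

def average_by(records, key, measure="relevance"):
    """Mean rating for each value of `key` (e.g. dept, session, region)."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r[measure])
    return {k: mean(v) for k, v in groups.items()}

print(average_by(responses, "dept"))     # comparison across departments
print(average_by(responses, "session"))  # trend across sessions
```

The same grouping function works for any level of aggregation the stakeholders care about; only the `key` changes.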

SAMPLE EVALUATION FORM

11 TIPS FOR CREATING LEVEL 1 EVALUATIONS THAT PRODUCE VALUED DATA

TIP 1 (CONTENT)
Only ask questions that lead to actionable data and focus on important issues

WHAT'S WRONG WITH THESE?
3c. The workshop materials were well designed
5c. The room set-up and size were comfortable

TIP 2 (CONTENT)
Write learner-centered, not trainer-centered, evaluation items*

*Jim Kirkpatrick, "The New World Level 1 Reaction Sheets," unpublished article

EXAMPLES
3c. The workshop materials aided my understanding of the program content
5c. My learning was enhanced by the room set-up and size

TIP 3 (CONTENT)
Where appropriate, match up qualitative questions with quantitative measures

EXAMPLE
In a word, how would you describe this session? ____________________________
Using a number, how would you describe this session?
No Value ...................... Great Value

TIP 4 (CONTENT)
Include at least one item asking participants how relevant the learning event/material was to them and their job

EXAMPLE
This:
How would you rate the overall relevance of this session to you and your job?
Not at all Relevant ...................... Very Relevant
Not this:
3a. Overall I was satisfied with the program

TIP 5 (CONTENT)
Create predictive questions that forecast participant learning, intent to apply what was learned back on the job, and likely impact on business results

EXAMPLE – LEVEL 2 LEARNING
How much did you know about developing individual performance objectives before attending this session?
No Knowledge ...................... Thorough Knowledge
How much do you know about developing individual performance objectives after attending the session?
No Knowledge ...................... Thorough Knowledge

EXAMPLE – LEVEL 3 BEHAVIOR
How likely are you to apply the skills and behaviors learned in this seminar back on the job?
Not at all Likely ...................... Extremely Likely
What obstacles, if any, are likely to prevent you from applying these skills and behaviors back on the job? ____________________________

EXAMPLE – LEVEL 4 RESULTS
How likely are any of the key business metrics tracked by your department to improve as a result of you applying the knowledge and skills you learned in this program?
Not at all Likely ...................... Extremely Likely
How confident are you in your response?
Not at all Confident ...................... Extremely Confident
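One way to put the before/after ratings from the Level 2 predictive question to work is to score each respondent's reported gain. The formula and the 7-point scale below are illustrative assumptions, not Phillips' published method; the idea is simply to express the gain as the share of the headroom the learner had left on the scale:

```python
# Sketch: score a before/after knowledge item from a predictive Level 1
# question. The 1-7 scale and the "share of headroom" formula are
# illustrative assumptions, not a published scoring method.
def knowledge_gain(before, after, scale_max=7):
    """Fraction of the possible improvement the respondent reported."""
    if not (1 <= before <= scale_max and 1 <= after <= scale_max):
        raise ValueError("ratings must fall on the response scale")
    headroom = scale_max - before
    if headroom == 0:
        return 0.0  # already at the top of the scale; no gain possible
    return (after - before) / headroom

print(knowledge_gain(2, 6))  # 0.8: the learner closed 80% of the gap
```

Averaging this score across respondents gives a single forecast-of-learning figure that can be trended from one session to the next.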

WHAT'S WRONG WITH THIS?
SA = Strongly Agree   A = Agree   D = Disagree   SD = Strongly Disagree
SA   A   D   SD

TIP 6 (MEASUREMENT)
When collecting quantitative data using a Likert scale, create a response scale with numbers at regularly spaced intervals and words only at each end*

*Palmer Morrel-Samuels, "Getting the Truth into Workplace Surveys," Harvard Business Review, February 2002

EXAMPLE
This:
Not at all True   1   2   3   4   5   6   7   Completely True
Not this:
Not at all True   Rarely True   Occasionally True   Somewhat True   Mostly True   Frequently True   Completely True

TIP 7 (MEASUREMENT)
Use a response scale with an odd number of points (7-, 9-, and 11-point scales are best)*

*Palmer Morrel-Samuels, "Getting the Truth into Workplace Surveys"

ODD vs. EVEN SCALE

TIP 8 (MEASUREMENT)
Place small numbers at the left (low) end of the scale and large numbers at the right (high) end of the scale

EXAMPLE
This:
Not at all True   1   2   3   4   5   6   7   Completely True
Not this:
Not at all True   7   6   5   4   3   2   1   Completely True

WHAT'S WRONG WITH THESE?
4b. Did the facilitator clearly describe the learning objectives?
4c. Did the facilitator keep the session lively and interesting?

TIP 9 (MEASUREMENT)
Write items either as a continuum or as a statement

EXAMPLES
This:
How effectively did the AV materials used during the session help reinforce your understanding of the program material?
Not Effectively ...................... Very Effectively
Or this:
The AV materials used during the session helped reinforce my understanding of the program material.
Not at all True ...................... Completely True

TIP 10 (ADMINISTRATION)
Use Level 1 evaluations only to improve a learning program, not to prove something

TIP 11 (FORMAT)
Place questions about respondent demographics (e.g., name, title, department) at the end of the evaluation form, make completing them optional, and keep them to a minimum

REFERENCES
Boehle, Sarah, "Remember That Mean, Crabby Teacher in High School?" Training, August 2006.
Morrel-Samuels, Palmer, "Getting the Truth into Workplace Surveys," Harvard Business Review, February 2002.
Phillips, Ken, "Eight Tips on Developing Valid Level 1 Evaluation Forms," Training Today, Fall 2007, pp. 8 & 14.
Phillips, Ken, "Developing Valid Level 2 Evaluations," Training Today, Fall 2009, pp. 6-8.

Ken Phillips
Phillips Associates
N. Wooded Glen Drive
Grayslake, Illinois
(847)