Phillips Associates 1 Business Results Made Visible: Design Proof Positive Level 4 Evaluations Presented by: Ken Phillips Phillips Associates May 24, 2016.


Phillips Associates 1 Business Results Made Visible: Design Proof Positive Level 4 Evaluations Presented by: Ken Phillips Phillips Associates May 24, 2016 Session: TU :00-11:15 AM

Phillips Associates 2 Agenda
1. Examine Level 4 evaluation facts
2. Discover which learning programs are ideally suited for conducting a Level 4 evaluation
3. Examine a two-phase, eight-guideline process for conducting Level 4 evaluations
4. Analyze three methods for connecting learning programs to business results

Phillips Associates 3 Kirkpatrick / Phillips Evaluation Model

Levels of Evaluation | Measurement Focus | Time Frame
Level 1: Reaction | Participant favorable reaction to a learning program | Conclusion of learning program
Level 2: Learning | Degree to which participants acquired new knowledge, skills or attitudes | Conclusion of learning program, or within 6 to 8 weeks after
Level 3: Behavior | Degree to which participants applied back on the job what was learned | 2 to 12 months
Level 4: Results | Degree to which targeted business outcomes were achieved | 9 to 18 months
Level 5: ROI | Degree to which monetary program benefits exceed program costs | 9 to 18 months
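Level 5 in the model above is computed with the standard Phillips ROI Methodology formula: net program benefits expressed as a percentage of program costs. A minimal sketch in Python, using hypothetical benefit and cost figures:

```python
# ROI (Level 5) per the Phillips ROI Methodology:
# net program benefits as a percentage of program costs.
def roi_percent(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs * 100.0

# Hypothetical figures: $120,000 in monetary benefits, $80,000 in fully loaded costs
print(roi_percent(benefits=120_000, costs=80_000))  # 50.0 -> benefits exceed costs by 50%
```

An ROI above 0% means monetary benefits exceeded costs, which is exactly the Level 5 question in the model.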

Phillips Associates 4 Level 4 Evaluation Facts
- 15% of all programs evaluated by organizations are evaluated at Level 4
- 75% of organizations view the data collected as having high or very high value
Source: ASTD Research Study, "The Value of Evaluation: Making Training Evaluations More Effective," 2009

Phillips Associates 5 CEOs Say
The #1 thing they would most like to see from their learning and performance investments is evidence of business impact (Level 4 evaluation)… but only 8% receive this type of information.
Source: ROI Institute Research Study

Phillips Associates 6 Why the Disconnect?
- Level 4 business results are perceived as too difficult to measure
- Belief that the high probability of uncontrollable variables affecting business outcomes makes measuring business results meaningless
- Organizations not currently measuring Levels 2 & 3 can't be expected to measure Level 4

Phillips Associates 7 What’s the Solution? Find ways to provide CEOs with the information they want!

Phillips Associates 8 Data Collection Methods
Source: Donald & James Kirkpatrick, "Evaluating Training Programs: The Four Levels"

Phillips Associates 9 Programs to Evaluate at Level 4
- Ones that are strategically important
- Ones with high development and/or implementation costs
- Ones that a large number of participants will attend
- Ones the business executive stakeholder wants evaluated at Level 4

Phillips Associates 10 Most Common Level 4 Metrics

Metric | Percentage
Customer satisfaction | 39%
Employee satisfaction/engagement | 37%
Proficiency/competency levels | 33%
Productivity indicators | 26%
Turnover/promotions | 25%
Actual business outcomes | 22%

Adapted from ASTD Research Study, "The Value of Evaluation: Making Training Evaluations More Effective," 2009

Phillips Associates 11 Metrics that Correlate Strongest with Evaluation Success

Metric | Percentage
Customer satisfaction | 39%
Employee satisfaction/engagement | 37%
Proficiency/competency levels | 33%
Productivity indicators | 26%
Turnover/promotions | 25%
Actual business outcomes | 22%

Source: ASTD Research Study, 2009

Phillips Associates 12 Conduct Level 4 Evaluations in Two Phases
Phase 1: Identify business results that have a strong relationship with learning program content
Phase 2: Connect the learning program to the business results

Phillips Associates 13 Phase 1: Guiding Principles
1. Plan for Level 4 evaluations from the start of the training project
2. Identify existing Level 4 business metrics – think operational, financial & HR
3. Search first at the stakeholder level, next at the department level and last at the organizational level to identify business metrics
4. Select business metric(s) with a strong relationship to program content & ones you can access

Phillips Associates 14 Instructions
1. Form a group of 3, 4 or 5 persons
2. Identify what business results you might track to conduct a Level 4 evaluation, given the training topic selected by your group
3. Use your own organization as a reference to identify actual business results you might track – think operational, financial & HR data
4. Be prepared to share your ideas with the whole group

Phillips Associates 15 Select One Training Program
- First-level supervisor
- Selling skills
- Performance management
- New hire orientation
- Employee engagement

Phillips Associates 16 Business Metric Examples
Financial
- Sales
- Net income
- Revenue
- Expenses
- Cost of workforce
Operational
- Quality
- Productivity
- Supply chain
- Innovation
- Customer satisfaction

Phillips Associates 17 Business Metric Examples
HR
- Turnover
- Grievances
- Safety incidents/accidents
- Absenteeism
- Employee satisfaction/engagement
- Retention
- Bench strength

Phillips Associates 18
"Level 4 business results are the easiest to measure, but the hardest to correlate with training."
– Jim Kirkpatrick

Phillips Associates 19 Phase 2: Guiding Principles
5. Three ways to connect a learning program to business results: trend-line analysis, expert estimation, control group
6. Use data collection frequency & data analysis level to determine the best approach
7. When using expert estimation, discount results by the potential error of the estimate
8. When conducting a Level 4 evaluation, also conduct a Level 1, 2 & 3 evaluation

Phillips Associates 20 Instructions
1. Form a group of 3, 4 or 5 persons
2. Review the 2 Business Results Data charts
3. Develop a plan for how to connect the leadership program with each of the business results
4. Be prepared to share your ideas with the whole group
Note: All managers & supervisors attended a leadership program during the highlighted period.

Phillips Associates 21 Business Results Data
HR data collected over a 12-month period

Month | Turnover | Grievances
Jan | 17 | 8
Feb | 15 | 7
Mar | 18 | 7
Apr | 16 | 6
May | 17 | 7
June | 15 | 6
July | 12 | 7
Aug | 12 | 6
Sept | 10 | 6
Oct | 7 | 5
Nov | 7 | 5
Dec | 5 | 4
Jan | 5 | 3

Phillips Associates 22 Business Results Data
Percent of favorable responses to annual Employee Engagement Survey
[Chart: Employee Engagement Score by Year]

Phillips Associates 23 Business Results Data
HR data collected over a 12-month period

Month | Turnover | Grievances
Jan | 17 | 8
Feb | 15 | 7
Mar | 18 | 7
Apr | 16 | 6
May | 17 | 7
June | 15 | 6
July | 12 | 7
Aug | 12 | 6
Sept | 10 | 6
Oct | 7 | 5
Nov | 7 | 5
Dec | 5 | 4
Jan | 5 | 3

Phillips Associates 24 Don’t Do This!
- Report the data chronologically and note where the learning program occurred
- Compute the average of the data before and after implementing the learning program

Phillips Associates 25 Do This!
Use trend-line analysis to connect the leadership program to the business results data

Phillips Associates 26 Trend-Line Analysis
Use pre-program performance as a base and extend the trend into the future. Following program implementation, actual performance is compared to the projected trend-line performance.

Phillips Associates 27 Trend-Line Example
[Chart: projected trend line vs. actual post-program performance – compute the difference]

Phillips Associates 28 Trend-Line Analysis
Five conditions must be present to use:
1. At least 6 data points of pre-program performance are available
2. Pre-program data is relatively stable
3. The pre-program trend is expected to continue regardless of whether the learning program is implemented
4. No new variables enter the process after the learning program is implemented
5. Frequent collection points exist for the targeted business results data
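When those conditions hold, the comparison is a few lines of arithmetic. The sketch below uses the turnover figures from the Business Results Data slide; the assumption that the leadership program ran after month 6 is hypothetical, for illustration only:

```python
# Trend-line analysis sketch: fit a least-squares line to pre-program
# performance, project it forward, and compare to actual post-program results.

def fit_trend(ys):
    """Least-squares line through (1, ys[0]), (2, ys[1]), ...; returns (slope, intercept)."""
    n = len(ys)
    xs = range(1, n + 1)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Turnover data from the slides; hypothetical split: program after month 6
pre_program = [17, 15, 18, 16, 17, 15]    # months 1-6
post_program = [12, 12, 10, 7, 7, 5, 5]   # months 7-13 (actuals)

slope, intercept = fit_trend(pre_program)
projected = [intercept + slope * m for m in range(7, 14)]

# Improvement attributed to the program = projected turnover minus actual turnover
improvement = sum(projected) - sum(post_program)
print(f"pre-program trend: {slope:+.2f} per month")
print(f"turnover avoided vs. trend: {improvement:.1f}")
```

The key point the slide makes in code: the baseline is the *projected trend*, not the pre-program average, so any improvement that was already under way is not credited to the program.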

Phillips Associates 29 Business Results Data
Percent of favorable responses to annual Employee Engagement Survey
[Chart: Employee Engagement Score by Year]

Phillips Associates 30 Don’t Do This!
Report comparative data noting previous and current business results data

Phillips Associates 31 Do This!
Use expert estimation or a control group design to connect the leadership program to the business results data

Phillips Associates 32 Expert Estimation
Obtain estimates from people who have first-hand knowledge of, or credible insight into, the cause-and-effect relationship between implementing a learning program and a change in a business metric.
Technique developed by Jack Phillips in 1983.

Phillips Associates 33 Calculating Expert Estimation
Fact: Employee Engagement score improved from 52% to 70% favorable responses
[Table columns: Participant | % Leadership Program Contributed to Improved Results | Confidence Level of Estimate | Adjusted Connection]

Phillips Associates 34 Expert Estimation Formula
Calculation 1: Sum of participant adjusted estimates ÷ Number of participants = Average adjusted percentage
  571 ÷ 10 = 57.1%
Calculation 2: (Post-program business metric value − Pre-program business metric value) × Average adjusted contribution percentage
  (70 − 52) × 57.1% = 10.3
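The two calculations combine into a short sketch. The per-participant figures below are hypothetical, but they are chosen so the aggregate matches the slide: adjusted estimates summing to 571 across 10 participants, and an engagement score moving from 52 to 70:

```python
# Expert-estimation sketch: each expert estimates the program's contribution
# to the improvement and their confidence in that estimate; confidence
# discounts the contribution (guideline 7: discount by potential error).

def attributable_improvement(pre, post, estimates):
    """estimates: list of (contribution_pct, confidence_pct) per expert."""
    adjusted = [c * conf / 100.0 for c, conf in estimates]  # discounted estimates
    avg_adjusted = sum(adjusted) / len(adjusted)            # Calculation 1
    return (post - pre) * avg_adjusted / 100.0              # Calculation 2

# Ten hypothetical experts; adjusted estimates sum to 571 (average 57.1%)
estimates = [(80, 90), (70, 80), (60, 90), (75, 80), (65, 90),
             (50, 80), (85, 70), (90, 60), (70, 90), (60, 90)]

gain = attributable_improvement(pre=52, post=70, estimates=estimates)
print(f"engagement gain attributable to program: {gain:.1f} points")  # 10.3
```

Of the 18-point improvement, only the discounted share (about 10.3 points) is claimed for the program; the rest is conceded to other variables.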

Phillips Associates 35 Expert Estimation
Three conditions must be present to use:
1. The learning program has been conducted
2. Business metrics identified prior to the program and monitored following the program reveal an improvement
3. Experts are able to provide input connecting the learning program and business results – discounted by the potential error of their estimate

Phillips Associates 36 Control Group
An experimental design in which two groups of learners are compared: one group attends a learning program and the other doesn't. Comparisons are made either following the learning program or both before and after the program.

Phillips Associates 37 Post-Test Only Control Group Design
Experimental Group: Learning Program → Business Result
Control Group: (no program) → Business Result
Compute the difference between the two groups' business results.
Note: To obtain valid results, the Experimental and Control Groups must be similar!

Phillips Associates 38 Pre/Post-Test Control Group Design
Experimental Group: Business Result Measure 1 → Learning Program → Business Result Measure 2 (compute difference)
Control Group: Business Result Measure 1 → Business Result Measure 2 (compute difference)
Compare the two differences.
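Both designs reduce to simple comparisons of group means. A sketch with hypothetical per-participant business-result values (monthly sales, say), assuming similar groups as the note requires:

```python
from statistics import mean

def post_test_only(exp_post, ctl_post):
    """Post-test only design: difference of post-program group means."""
    return mean(exp_post) - mean(ctl_post)

def pre_post(exp_pre, exp_post, ctl_pre, ctl_post):
    """Pre/post-test design: each group's pre-to-post change, then the
    difference of those changes (a difference-in-differences)."""
    return (mean(exp_post) - mean(exp_pre)) - (mean(ctl_post) - mean(ctl_pre))

# Hypothetical monthly sales per participant
exp_pre, exp_post = [40, 42, 38, 41], [50, 53, 47, 52]
ctl_pre, ctl_post = [41, 39, 40, 42], [43, 42, 41, 44]

print(post_test_only(exp_post, ctl_post))              # 8.0
print(pre_post(exp_pre, exp_post, ctl_pre, ctl_post))  # 8.25
```

The pre/post variant additionally subtracts whatever change the control group experienced anyway, which is why it survives background trends that would bias the post-test-only comparison.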

Phillips Associates 39 Control Group
Four conditions must be present to use:
1. A natural control group exists – a large number of participants attend the same learning program staggered over time
2. Business results can be broken down to the individual participant level
3. The stakeholder agrees with the method
4. Credibility of results is of the highest importance

Phillips Associates 40 Summary
- Plan for Level 4 evaluations from the start of the training project
- Seek first to identify existing business metrics before creating new ones
- Only select business metrics that have a strong relationship with the program content

Phillips Associates 41 Summary (cont.)
- To connect a learning program to business results, use a control group, trend-line analysis or expert estimation
- Use data collection frequency & data analysis level as a guide in determining the best approach to connect learning to business results

Phillips Associates 42 Keep in Mind:
"Everything that can be counted doesn't necessarily count, and everything that counts cannot necessarily be counted."
– Attributed to Albert Einstein

Phillips Associates 43 Free Articles
- Phillips, Ken, "Eight Tips on Developing Valid Level 1 Evaluation Forms," Training Today, Fall 2007, pp. 8 & 14.
- Phillips, Ken, "Developing Valid Level 2 Evaluations," Training Today, Fall 2009.
- Phillips, Ken, "Capturing Elusive Level 3 Data: The Secrets of Survey Design," unpublished article.
- Phillips, Ken, "Level 1 Evaluations: Do They Have a Role in Organizational Learning Strategy?" unpublished article.
- Phillips, Ken, "Business Results Made Visible: Designing Proof Positive Level 4 Evaluations," unpublished article.

Phillips Associates 44 Contact
Ken Phillips
Phillips Associates
N. Wooded Glen Drive, Grayslake, Illinois
(847)