
Phillips Associates 1 Boost Training Transfer Using Predictive Learning Analytics™ Presented by: Ken Phillips Phillips Associates May 22, 2016 Session: SU 217 1:30-2:30 PM

Phillips Associates 2 Agenda
1. Discover the impact "scrap learning" has on wasted organization resources & lost credibility with business executives
2. Analyze how to build a PLA model that identifies those learners who are most & least likely to apply what they learned
3. Examine the 7 benefits of using PLA in your organization & the 3 phases & 9 steps for getting started

Phillips Associates 3 Scrap learning – What is it?

Phillips Associates 4 Scrap Learning: a term coined by KnowledgeAdvisors, a CEB company, that describes the difference between learning that's delivered and learning that's applied back on the job

Phillips Associates 5 How big is the problem?

Phillips Associates 6 Benchmark Study 1: 45% of learning delivered is scrap learning. Source: "Confronting Scrap Learning," CEB white paper, 2014

Phillips Associates 7 Benchmark Study 2
< 20% applied new skills back on the job
> 15% didn't try to apply new skills back on the job
65% tried applying new skills back on the job but reverted back
Source: Robert Brinkerhoff, 2004

Phillips Associates 8 View from individual organization level

Phillips Associates 9 According to the ATD 2015 "State of the Industry" report:
Average per-employee training expenditure = $1,229
Average number of training hours consumed per employee = 32.4

Phillips Associates 10 Calculating Scrap Learning at the individual organization level
At 45% scrap: $1,229 × 45% = $553 and 32.4 hours × 45% = 14.6 hours wasted per employee
At 80% scrap: $1,229 × 80% = $983 and 32.4 hours × 80% = 25.9 hours wasted per employee
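The slide's arithmetic can be sketched in a few lines of Python. The $1,229 and 32.4-hour figures are the ATD averages cited above; the function name and structure are illustrative, not part of the PLA methodology.

```python
def scrap_learning_waste(spend_per_employee, hours_per_employee, scrap_rate):
    """Estimate training dollars and hours wasted per employee, given the
    fraction of delivered learning that is never applied back on the job."""
    return spend_per_employee * scrap_rate, hours_per_employee * scrap_rate

# ATD 2015 "State of the Industry" averages cited on the slide
SPEND, HOURS = 1229, 32.4

for rate, label in [(0.45, "CEB benchmark"), (0.80, "Brinkerhoff")]:
    dollars, hours = scrap_learning_waste(SPEND, HOURS, rate)
    print(f"{label}: ${dollars:,.0f} and {hours:.1f} hours wasted per employee")
```

Running this with your own organization's per-employee spend and hours gives the same baseline waste estimate for any assumed scrap rate.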

Phillips Associates 11 Houston, we have a problem! Source: James Lovell, Apollo 13 flight

Phillips Associates 12 The solution: Predictive Learning Analytics ™

Phillips Associates 13 Predictive Learning Analytics™ (PLA) is a methodology for peering into the future, at the conclusion of a learning program, and forecasting learner outcomes and actions, with the intent of changing those outcomes and actions for the better. Source: Ken Phillips

Phillips Associates 14 PLA vs. Traditional Measurement & Evaluation

Phillips Associates 15 Kirkpatrick / Phillips Evaluation Model
Level 1 (Reaction): participant's favorable reaction to a learning program; measured at the conclusion of the learning program
Level 2 (Learning): degree to which participants acquired new knowledge, skills or attitudes; measured at the conclusion of the learning program or within 6 to 8 weeks after
Level 3 (Behavior): degree to which participants applied back on the job what was learned; measured 2 to 12 months after
Level 4 (Results): degree to which targeted business outcomes were achieved; measured 9 to 18 months after
Level 5 (ROI): degree to which monetary program benefits exceed program costs; measured 9 to 18 months after

Phillips Associates 16 PLA vs. Traditional Learning M&E
(Chart: value, low to high, plotted against time, past to future, contrasting the two approaches.)
Traditional Learning M&E: focuses on programs or cohorts; describes what has happened
Predictive Learning Analytics: focuses on individual learners; predicts the future likelihood of certain behaviors and actions

Phillips Associates 17 Heart of PLA: an algorithm consisting of 10 factors that are known to contribute to training transfer and that research shows have a strong positive correlation with either Level 2 learning, Level 3 behavior or Level 4 results

Phillips Associates 18 Instructions
1. Form a group of 3, 4 or 5 persons
2. Brainstorm a list of things we know to be "truths" about training transfer
3. Be prepared to share your ideas with the whole group
Example: Training transfer increases when learners have an immediate opportunity to apply what they learned back on the job

Phillips Associates 19 Level 2 Training Transfer Factors
Learners need to:
1. Acquire new information (knowledge, skills or attitudes)
2. See a program as relevant to themselves and their job
3. See a program as an important investment in their career development

Phillips Associates 20 Level 3 Training Transfer Factors
Learners need to:
4. Be personally motivated to apply what was learned
5. Have confidence in their ability to apply what was learned
6. Reflect on key lessons learned & how they can help improve their performance

Phillips Associates 21 Level 3 Training Transfer Factors (cont.)
Learners need to:
7. Be actively engaged by their manager post-program regarding what was learned
8. Be supported by work colleagues, post-program, to apply what was learned
9. Have an immediate opportunity to apply what was learned

Phillips Associates 22 Level 4 Training Transfer Factor
Learners need to:
10. See a likely improvement in a key business metric tracked by their department if the new information learned is applied

Phillips Associates 23 Using the Algorithm

Phillips Associates 24 Using The Algorithm
Step 1: Convert the factors into survey items and include them as part of a Level 1 evaluation, or administer them as a separate survey
Step 2: Collect data from the Calibration Cohort (the first participants to attend the PLA target training program)

Phillips Associates 25 Using The Algorithm
Step 3: Calculate a Learner Application Index™ score for each Calibration Cohort program participant
Step 4: Target participants who are at risk & least likely to apply what they learned for follow-up & reinforcement activities, in order to increase training transfer
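The mechanics of Steps 3 and 4 can be sketched as follows. The actual Learner Application Index™ formula is not published in this deck, so the sketch assumes each of the ten transfer factors is rated on a 1-5 survey scale, that the index is the mean rating rescaled to 0-100, and that 60 is the at-risk cutoff; all three are illustrative assumptions, not the real algorithm.

```python
def learner_application_index(ratings, scale_max=5):
    """Hypothetical LAI: mean of the ten factor ratings, rescaled to 0-100."""
    if len(ratings) != 10:
        raise ValueError("expected one rating per transfer factor (10)")
    return 100 * sum(ratings) / (scale_max * len(ratings))

def flag_at_risk(cohort, cutoff=60):
    """Return learners whose LAI falls below the (hypothetical) cutoff."""
    return [name for name, ratings in cohort.items()
            if learner_application_index(ratings) < cutoff]

# Hypothetical Calibration Cohort: one 1-5 rating per transfer factor
cohort = {
    "learner_a": [5, 4, 4, 5, 4, 3, 4, 4, 5, 4],   # LAI 84
    "learner_b": [3, 2, 3, 2, 3, 2, 2, 3, 2, 3],   # LAI 50
}
print(flag_at_risk(cohort))  # ['learner_b']
```

The flagged learners are the ones Step 4 targets for follow-up and reinforcement.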

Phillips Associates 26 LAI Individual Scores

Phillips Associates 27 LAI Factor Scores

Phillips Associates 28 Additional Calculations

Phillips Associates 29 Additional Calculations: Manager Training Support Index™
1. Sort learners into groups arranged by manager
2. Compute a Manager Training Support Index™ score for each manager & identify managers who do a good & poor job of supporting learning
3. Work with those managers who need help doing a better job
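The grouping described above can be sketched in a few lines. Since the deck does not give the index's formula, this sketch assumes it is simply the average of each manager's direct reports' manager-support survey ratings (the factor-7 and factor-8 items); the data and construction are hypothetical.

```python
from collections import defaultdict

def manager_support_index(records):
    """Average the manager-support ratings of each manager's direct
    reports to produce one score per manager (hypothetical construction)."""
    by_manager = defaultdict(list)
    for manager, rating in records.values():
        by_manager[manager].append(rating)
    return {m: sum(r) / len(r) for m, r in by_manager.items()}

# Hypothetical data: learner -> (manager, manager-support rating)
records = {
    "learner_a": ("mgr_x", 4.5),
    "learner_b": ("mgr_x", 4.0),
    "learner_c": ("mgr_y", 2.0),
}
print(manager_support_index(records))  # {'mgr_x': 4.25, 'mgr_y': 2.0}
```

Managers with the lowest scores are the ones to work with on supporting learning post-program.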

Phillips Associates 30 Manager Training Support Index™ Score

Phillips Associates 31 Additional Calculations: Overall Program Quality
1. Compute an Overall Program Quality score & compare the quality of one program with another
2. Target programs not delivering value for either revision or elimination
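One way to get the single comparison number the slide describes is to average participants' LAI scores per program; the deck does not specify the actual calculation, so this is an illustrative assumption, as are the program names and scores.

```python
def overall_program_quality(lai_scores):
    """Hypothetical single-number summary: the mean Learner Application
    Index™ score across a program's participants."""
    return sum(lai_scores) / len(lai_scores)

# Hypothetical LAI scores for two programs
programs = {
    "negotiation_skills": [84, 72, 65, 90],   # mean 77.75
    "time_management": [50, 48, 62, 55],      # mean 53.75
}
ranked = sorted(programs,
                key=lambda p: overall_program_quality(programs[p]),
                reverse=True)
print(ranked)  # ['negotiation_skills', 'time_management']
```

Programs at the bottom of the ranking become candidates for revision or elimination.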

Phillips Associates 32 Benefits of Using Predictive Learning Analytics

Phillips Associates 33 Benefits Of Using PLA
1. Less money & time wasted on learning that is delivered but not applied back on the job (scrap learning)
2. More effective & efficient use of follow-up activities, by targeting participants who are at risk & least likely to apply what they learned in a program back on the job
3. Increased personal credibility in the eyes of business executive stakeholders

Phillips Associates 34 Benefits Of Using PLA (cont.)
4. Objective way to identify managers who do a poor job of supporting learning so that their approach can be improved
5. Objective way to evaluate the effectiveness of a learning program design

Phillips Associates 35 Benefits Of Using PLA (cont.)
6. Objective way to compare the overall quality of one learning program with another using a single number
7. Enhanced reputation among L&D colleagues

Phillips Associates 36 3 Phases & 9 Steps for Implementing PLA

Phillips Associates 37 PLA Phases & Steps

Phillips Associates 38 Summary
The issue of scrap learning has been around forever. What's different today is that, with Predictive Learning Analytics™, there is now a way to manage it.
Source: Ken Phillips

Phillips Associates 39

Phillips Associates 40 Phillips Associates (847) N. Wooded Glen Drive Grayslake, Illinois Ken Phillips

Phillips Associates 41 Ken Phillips, PhD, CPLP
Ken is founder and CEO of Phillips Associates, a consulting and publishing company with expertise in measurement and evaluation of learning and in performance management. He has more than 30 years of experience designing learning instruments and assessments and has authored more than a dozen published learning instruments. He regularly speaks to Association for Talent Development (ATD) groups, university classes and corporate learning and development departments. Since 2008, he has spoken at the ATD International Conference on topics related to measurement and evaluation of learning, and in 2015 he was invited to present on the same topics at the ATD Middle East North Africa (MENA) Conference and the ATD China Summit. Prior to pursuing a PhD in the combined fields of organization behavior and educational administration at Northwestern University, Ken held management positions with two colleges and two national corporations. He has written articles that have appeared in td magazine, Training Today and HR.com, and is a contributing author to four books in the L&D field. Ken earned the Certified Professional in Learning and Performance (CPLP®) credential from ATD in 2006 as a pilot pioneer and was recertified in 2009, 2012 and again in …
LinkedIn:

Phillips Associates 42 Implementation Guidelines
Phase 1: Setting the stage
1. Build your Predictive Learning Analytics™ algorithm
2. Select a single, high-profile, and costly learning program for your PLA project and initially "stay under the radar"

Phillips Associates 43 Implementation Guidelines
Phase 1: Setting the stage (cont.)
3. Collect data & calculate an individual Learner Application Index™ score for each program participant
4. Calculate the "scrap learning" percentage baseline score associated with the program

Phillips Associates 44 Implementation Guidelines
Phase 2: Implementing the methodology
5. Identify learners at risk & least likely to apply what was learned back on the job & target them for follow-up activities
6. Conduct Level 2 & Level 3 evaluations in order to validate the accuracy of the PLA algorithm
7. Recalculate the scrap learning percentage following implementation of the follow-up activities

Phillips Associates 45 Implementation Guidelines
Phase 3: Sharing your success
8. Report results to business executive stakeholders
9. Enhance the accuracy of the PLA algorithm by including additional data from your company LMS or HRIS, if available

Phillips Associates 46 The goal of training: “Helping learners achieve great results, not providing great training!” Source: Robert Brinkerhoff