Session 412: Why Don't We Weigh Them?
Gloria Gery, Gery Associates
Training and Online Learning, Atlanta, March 2, 2004

A True Story: circa 1978
Director, IT and End User Training at Aetna
Monthly metrics:
- Number of student days
- Cost per student day
- No-shows
- Classroom utilization
- Completions vs. dropouts
- Average satisfaction levels
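Every one of these is an activity metric, computable from attendance records alone. A minimal sketch of the arithmetic in Python; the ledger, course names, and figures are invented for illustration:

```python
# Hypothetical monthly training ledger; all fields and figures are illustrative.
records = [
    {"course": "CICS Basics", "enrolled": 20, "attended": 17, "completed": 15,
     "student_days": 51, "cost": 6375.00},
    {"course": "JCL Intro", "enrolled": 12, "attended": 9, "completed": 9,
     "student_days": 18, "cost": 2700.00},
]

student_days = sum(r["student_days"] for r in records)
total_cost = sum(r["cost"] for r in records)
enrolled = sum(r["enrolled"] for r in records)
attended = sum(r["attended"] for r in records)
completed = sum(r["completed"] for r in records)

print(f"Student days:         {student_days}")
print(f"Cost per student day: ${total_cost / student_days:.2f}")
print(f"No-show rate:         {1 - attended / enrolled:.1%}")
print(f"Completion rate:      {completed / attended:.1%}")
# Note: nothing here says whether anyone performs better afterward.
```

The punchline of the talk is that "cost per pound" would fit this list just as well: none of these numbers touches performance.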

My Boss: Mr. Numbers
- Activity and results plotted to the 4th decimal point
- Graphed and charted
- Every nit discussed
I said: "Why don't we weigh them? Let's add 'cost per pound' to the metrics."

Article
My boss told me not to "be so smart." I am still asking the question.

The Performance/Learning Cycle (diagram)
Activities: Doing, Learning, Referencing, Collaborating
Resources: Content, People, Data
Example supports: Instruction, Demonstrations, Illustrations, Process Support, Wizards, Templates, Variable Manipulators, Task automation tools, Peers, Experts
Courtesy of Ariel Performance Centered Systems, Inc., Cincinnati, OH

What Are the Issues in Evaluation?
Real and perceived relevance:
- To the business
- To the individual
What can really be assessed, for both participants and management:
- Reaction
- Emotion
- Cognitive response
- Attitude
- Behavior

More Issues
What can be measured:
- Content
- Context
- Duration and timing
- Instructor performance
- Attributes and structure of the course
- Activity levels
- Activity appropriateness
- Skills and behaviors
- Knowledge and skill transfer
- Meaningfulness of what is transferred

More Issues
Commitment to really evaluate:
- Low to moderate
- More driven by the training function than by management or participants
Why?
- De facto acceptance of the intervention
- Lack of alternatives
- Collective collusion
- Fear of what will be determined

Traditional Kirkpatrick Model
Evaluation of the course against:
- Standards
- Outcomes
Levels 1-4 run from "happiness" through business impact. It is rare that people go beyond learner satisfaction or "remote" assessments of materiality.
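For reference, the four Kirkpatrick levels in code form; a sketch only, with the parenthetical glosses being the conventional readings of each level:

```python
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's four levels, from 'happiness' through business impact."""
    REACTION = 1  # learner satisfaction ("smile sheets")
    LEARNING = 2  # knowledge/skill gain, typically pre/post tested
    BEHAVIOR = 3  # transfer to on-the-job performance
    RESULTS = 4   # business impact

# Gery's observation: evaluation rarely gets past Level 1.
highest_level_typically_reached = KirkpatrickLevel.REACTION
```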

Consider Another Point of View: An Oblique Angle
Evaluate instructional events against alternative performance development mechanisms.
Stop:
- The self-referencing model
- The "we're no better or worse off than anyone else" perspective
- Wheel spinning

What to Consider
Comparative outcomes on all the dimensions mentioned earlier. Look at the relative effectiveness of each kind of intervention for specific types of content, skills, competencies, and behaviors.
Compare against:
- Job aids
- Coaching
- Online reference
- Integrated performance support
- Performance-centered software
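One way to make the comparison concrete is to score every alternative on the same outcome dimensions. In the sketch below, only the intervention list comes from the slide; the dimensions and the 1-5 scores are placeholders standing in for the comparative data this slide proposes to collect:

```python
# Hypothetical 1-5 scores (higher is better); real values would come from
# the comparative evaluation being proposed here.
alternatives = {
    "classroom course":               {"time_to_performance": 2, "transfer": 2, "cost_efficiency": 1},
    "job aid":                        {"time_to_performance": 4, "transfer": 3, "cost_efficiency": 5},
    "coaching":                       {"time_to_performance": 3, "transfer": 5, "cost_efficiency": 2},
    "online reference":               {"time_to_performance": 3, "transfer": 2, "cost_efficiency": 4},
    "integrated performance support": {"time_to_performance": 5, "transfer": 4, "cost_efficiency": 3},
}

# Rank interventions by total score across the shared dimensions.
for name, scores in sorted(alternatives.items(), key=lambda kv: -sum(kv[1].values())):
    print(f"{name:32s} total = {sum(scores.values())}")
```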

Possible Outcomes
- Relative performance or value is more or less than we thought
- Questions about where money is spent
- Illusions shattered or beliefs reaffirmed
- Nothing at all…

Comparing to Reference
- Current form vs. better form: large, searchable (sometimes) objects vs. small, granular, tagged content that can be assembled ad hoc (see the sketch below)
- The point of view of the reference:
  - Provider perspective
  - Typically not task oriented
  - Technical vs. performance or goal oriented
- Separating accessibility from utility or usefulness
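A sketch of the contrast being drawn: instead of one large searchable document, small tagged chunks assembled on demand for the performer's task. The data model, tag scheme, and sample chunks are assumptions made for illustration:

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """A small, granular unit of content, tagged for retrieval in context."""
    chunk_id: str
    text: str
    tags: frozenset  # e.g., task, role, product

library = [
    Chunk("c1", "How to void a transaction...", frozenset({"task:void", "role:teller"})),
    Chunk("c2", "Daily settlement checklist...", frozenset({"task:settle", "role:teller"})),
    Chunk("c3", "Void limits by authority level...", frozenset({"task:void", "role:supervisor"})),
]

def assemble(required_tags: set) -> list:
    """Pull together just the chunks relevant to the performer's current task."""
    return [c for c in library if required_tags <= c.tags]

print([c.chunk_id for c in assemble({"task:void"})])  # -> ['c1', 'c3']
```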

Comparing to Knowledge and Content Management (however that is defined)
- Much training occurred because performers could not find what they needed: "just in case"
- Structured, granular, tagged content will be the foundation of in-context learning
- Capturing new knowledge assets is different, but essential to increasing the quality and depth of synthesis: more powerful because it's experience-based

Comparing to Tools
Tools offer higher leverage:
- Decrease requirements for knowledge and skill
- Embody complex rules and relationships
- Institutionalize best practice
- Can rapidly integrate and disseminate changed processes, rules, and relationships without changing the performers
- Can (rather, must) include content and knowledge and enable learning
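The leverage argument in miniature: when the tool embodies the rule, the performer never has to learn it, and changing the rule changes every performer's behavior at once, with no retraining. A hypothetical sketch; the policy, roles, and thresholds are invented:

```python
# Hypothetical discount-approval rule embodied in a tool rather than taught
# in a course. Performers call the function; they don't memorize thresholds.
APPROVAL_LIMITS = {"rep": 0.05, "manager": 0.15, "director": 0.30}

def may_approve(role: str, discount: float) -> bool:
    """Encodes the policy; editing APPROVAL_LIMITS rolls out a change
    instantly, without changing the performers."""
    return discount <= APPROVAL_LIMITS.get(role, 0.0)

print(may_approve("rep", 0.04))  # True
print(may_approve("rep", 0.10))  # False -> escalate to a manager
```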

Compare Non-Integrated to Integrated Environments
- Requires "demonstration" or prototype projects, or examples of the integrated or fused design
- May require "makeovers" of existing resources to illustrate
- Evaluation is frequently "face validity"… but it must begin

What Are the New Metrics?
- Time to understanding
- Time to retrieval
- Time to performance (sketched below)
- Pre/post evaluations (if it matters) of learning outcomes
- Pre/post evaluations of performance
- Demonstrations of business impact:
  - Efficiency
  - Effectiveness
  - Value added and strategy
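Several of these reduce to timings and paired pre/post comparisons. A minimal sketch, assuming timestamped task observations for the same performers before and after the intervention; the names and numbers are illustrative:

```python
from statistics import mean

# Hypothetical observations: seconds to complete a task, before and after
# the intervention, for the same three performers.
pre  = {"ann": 540, "bob": 610, "chen": 480}
post = {"ann": 320, "bob": 400, "chen": 350}

time_to_performance = mean(post.values())      # how fast with support in place
gain = mean(pre[p] - post[p] for p in pre)     # paired pre/post change

print(f"Mean time to performance: {time_to_performance:.0f}s")
print(f"Mean paired improvement:  {gain:.0f}s "
      f"({gain / mean(pre.values()):.0%} faster)")
```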

Sample Value Propositions
For people and for organizations:
- Return on invested capital
- Offsets of costs for duplicate, non-integrated resources
- Flexibility in work assignment
- Pushing work to customers
- Time to implementation, change, or integration
- Resources required for "success"
- Quality
- Etc.
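Return on invested capital is the one item on this list with a standard formula. A sketch with placeholder figures; the benefit and cost numbers are invented:

```python
def roi(net_benefit: float, invested_capital: float) -> float:
    """Classic ROI: net program benefit over the capital invested."""
    return net_benefit / invested_capital

# Hypothetical figures: $250k in measured benefit (cost offsets, time to
# implementation, quality gains) against a $100k investment.
program_cost = 100_000
measured_benefit = 250_000
print(f"ROI: {roi(measured_benefit - program_cost, program_cost):.0%}")  # 150%
```

The harder part, and the point of the preceding slides, is producing a defensible `measured_benefit` at all, which is why the comparative metrics come first.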