Evaluating HRD Programs

Effectiveness
- The degree to which a training (or other HRD program) achieves its intended purpose
- Measures are relative to some starting point
- Measures how well the desired goal is achieved

Evaluation

HRD Evaluation
Textbook definition: "The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities."

In Other Words…
Are we training:
- the right people
- the right "stuff"
- the right way
- with the right materials
- at the right time?

Evaluation Needs
- Descriptive and judgmental information needed
  – Objective and subjective data
- Information gathered according to a plan and in a desired format
- Gathered to provide decision-making information

Purposes of Evaluation
- Determine whether the program is meeting the intended objectives
- Identify strengths and weaknesses
- Determine cost-benefit ratio
- Identify who benefited most or least
- Determine future participants
- Provide information for improving HRD programs

Purposes of Evaluation – 2
- Reinforce major points to be made
- Gather marketing information
- Determine if the training program is appropriate
- Establish a management database

Evaluation Bottom Line
- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are the benefits of HRD readily evident to all?

How Often Are HRD Evaluations Conducted?
- Not often enough!
- Frequently, only end-of-course participant reactions are collected
- Transfer to the workplace is evaluated less frequently

Why HRD Evaluations Are Rare
- Reluctance to have HRD programs evaluated
- Evaluation requires expertise and resources
- Factors other than HRD can cause performance improvements, e.g.:
  – Economy
  – Equipment
  – Policies, etc.

Need for HRD Evaluation
- Shows the value of HRD
- Provides metrics for HRD efficiency
- Demonstrates a value-added approach for HRD
- Demonstrates accountability for HRD activities
- Everyone else has it… why not HRD?

Make-or-Buy Evaluation
- "I bought it, therefore it is good."
- "Since it's good, I don't need to post-test."
- Who says it's:
  – Appropriate?
  – Effective?
  – Timely?
  – Transferable to the workplace?

Evolution of Evaluation Efforts
1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training

Models and Frameworks of Evaluation
Table 7-1 lists six frameworks for evaluation. The most popular is that of D. Kirkpatrick:
- Reaction
- Learning
- Job Behavior
- Results

Kirkpatrick’s Four Levels
- Reaction
  – Focus on trainees’ reactions
- Learning
  – Did they learn what they were supposed to?
- Job Behavior
  – Was it used on the job?
- Results
  – Did it improve the organization’s effectiveness?

Issues Concerning Kirkpatrick’s Framework
- Most organizations don’t evaluate at all four levels
- Focuses only on post-training
- Doesn’t treat inter-stage improvements
- WHAT ARE YOUR THOUGHTS?

A Suggested Framework – 1
- Reaction
  – Did trainees like the training?
  – Did the training seem useful?
- Learning
  – How much did they learn?
- Behavior
  – What behavior change occurred?

A Suggested Framework – 2
- Results
  – What were the tangible outcomes?
  – What was the return on investment (ROI)?
  – What was the contribution to the organization?

Data Collection for HRD Evaluation
Possible methods:
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/performance tests
- Archival performance information

Interviews
Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained interviewers needed

Questionnaires
Advantages:
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate

Direct Observation
Advantages:
- Nonthreatening
- Excellent way to measure behavior change
Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Trained observers needed

Written Tests
Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias

Simulation/Performance Tests
Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains
Limitations:
- Time consuming
- Simulations often difficult to create
- High costs to develop and use

Archival Performance Data
Advantages:
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects
Limitations:
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes

Choosing Data Collection Methods
- Reliability
  – Consistency of results, and freedom from collection-method bias and error
- Validity
  – Does the device measure what we want to measure?
- Practicality
  – Does it make sense in terms of the resources used to get the data?

Types of Data Used/Needed
- Individual performance
- Systemwide performance
- Economic

Individual Performance Data
- Individual knowledge
- Individual behaviors
Examples:
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes

Systemwide Performance Data
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates

Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations

Use of Self-Report Data
- Most common method
- Pre-training and post-training data
- Problems:
  – Mono-method bias: desire to be consistent between tests
  – Socially desirable responses
  – Response shift bias: trainees adjust expectations to training

Research Design
Specifies in advance:
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed

Research Design Issues
- Pretest and posttest
  – Shows the trainee what training has accomplished
  – Helps eliminate pretest knowledge bias
- Control group
  – Compares the performance of a group with training against the performance of a similar group without training

Recommended Research Design
Pretest and posttest with control group. Whenever possible:
- Randomly assign individuals to the test group and the control group to minimize bias
- Use a “time-series” approach to data collection to verify that performance improvement is due to training
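The effect estimate this design yields can be sketched as a simple gain comparison: the trained group’s pre-to-post gain minus the control group’s gain. The scores below are hypothetical, for illustration only.

```python
# Pretest/posttest scores for a randomly assigned trained group and
# control group (hypothetical test scores, not from the source).
trained_pre, trained_post = [62, 58, 70, 65], [78, 74, 85, 80]
control_pre, control_post = [61, 60, 68, 66], [63, 62, 70, 67]

def mean(xs):
    return sum(xs) / len(xs)

# Subtracting the control group's gain removes improvement caused by
# factors other than training (economy, equipment, policies, etc.).
trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)
effect = trained_gain - control_gain
print(f"Estimated training effect: {effect:.2f} points")  # 13.75 points
```

A time-series design extends this by collecting several measurements before and after training, so a one-off fluctuation is not mistaken for a training effect.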

Ethical Issues Concerning Evaluation Research
- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results

Assessing the Impact of HRD
- Money is the language of business. You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.”

HRD Program Assessment
- HRD programs and training are investments
- Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
- You must prove your worth to the organization
  – Or you’ll have to find another organization…

Evaluation of Training Costs
- Cost-benefit analysis
  – Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis
  – Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Return on Investment
Return on investment = Results / Costs

Calculating Training Return on Investment

Operational      How              Before           After            Differences     Results
Results Area     Measured         Training         Training         (+ or –)        Expressed in $
---------------  ---------------  ---------------  ---------------  --------------  ----------------
Quality of       % rejected       2% rejected      1.5% rejected    .5%             $720 per day
panels                            1,440 panels     1,080 panels     360 panels      $172,800
                                  per day          per day          per day         per year

Housekeeping     Visual           10 defects       2 defects        8 defects       Not measurable
                 inspection       (average)        (average)                        in $
                 using 20-item
                 checklist

Preventable      Number of        24 per year      16 per year      8 per year
accidents        accidents
                 Direct cost      $144,000         $96,000          $48,000         $48,000 per year
                 of each          per year         per year
                 accident

Total savings: $220,800

        Operational Results     $220,800
ROI  =  -------------------  =  --------  =  6.8
          Training Costs         $32,564

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
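The ROI arithmetic in the Robinson & Robinson example above can be sketched as follows; the dollar figures come from the worked example, while the variable names are illustrative.

```python
# Annual dollar savings from the two operational results that were
# measurable in dollars (from the worked example above).
panel_savings_per_year = 172_800    # 0.5% fewer rejected panels
accident_savings_per_year = 48_000  # 8 fewer preventable accidents
training_costs = 32_564

total_savings = panel_savings_per_year + accident_savings_per_year
roi = total_savings / training_costs

print(f"Total savings: ${total_savings:,}")  # Total savings: $220,800
print(f"ROI = {roi:.1f}")                    # ROI = 6.8
```

Note that the housekeeping improvement contributes nothing here because it was not measurable in dollars; ROI figures understate benefits that cannot be monetized.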

Types of Training Costs
- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants

Direct Costs
- Instructor
  – Base pay
  – Fringe benefits
  – Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments

Indirect Costs
- Training management
- Clerical/administrative support
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs

Development Costs
- Fee to purchase the program
- Costs to tailor the program to the organization
- Instructor training costs

Overhead Costs
- General organizational support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM

Compensation for Participants
- Participants’ salary and benefits for time away from the job
- Travel, lodging, and per-diem costs
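The five cost categories above sum to a program’s total training cost. A minimal sketch, with entirely hypothetical category amounts chosen to match the $32,564 used in the ROI example:

```python
# Hypothetical cost breakdown for one training program.
costs = {
    "direct": 12_000,       # instructor pay/benefits, materials, food, travel
    "indirect": 4_500,      # training management, clerical, shipping
    "development": 8_000,   # program purchase and tailoring fees
    "overhead": 3_000,      # facilities, utilities, G&A support
    "compensation": 5_064,  # participants' salary/benefits while away
}

total_cost = sum(costs.values())
print(f"Total training cost: ${total_cost:,}")  # Total training cost: $32,564
```

The total is the denominator of the ROI calculation; omitting a category (overhead and participant compensation are the usual casualties) inflates the apparent return.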

Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs

Utility Analysis
Uses a statistical approach to support claims of training effectiveness:
- N = Number of trainees
- T = Length of time benefits are expected to last
- d_t = True performance difference resulting from training
- SD_y = Dollar value of untrained job performance (in standard deviation units)
- C = Cost of training

U = (N)(T)(d_t)(SD_y) – C

Critical Information for Utility Analysis
- d_t = difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by the trained
- SD_y = standard deviation in dollars, or overall productivity of the organization
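The utility formula U = (N)(T)(d_t)(SD_y) – C is a straight product minus cost; a minimal sketch, with hypothetical input values:

```python
def training_utility(n_trainees, years, d_t, sd_y, cost):
    """Dollar utility of a training program: U = N * T * d_t * SD_y - C."""
    return n_trainees * years * d_t * sd_y - cost

# Hypothetical example: 50 trainees, benefits lasting 2 years, a true
# performance difference of 0.4 SD, one SD of job performance worth
# $10,000 per year, and a total program cost of $150,000.
u = training_utility(n_trainees=50, years=2, d_t=0.4, sd_y=10_000,
                     cost=150_000)
print(f"U = ${u:,.0f}")  # U = $250,000
```

Because d_t and SD_y are estimates, conservative values are the safer choice when presenting utility figures to management, consistent with the advice on credible estimates later in this presentation.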

Ways to Improve HRD Assessment
- Walk the walk, talk the talk: MONEY
- Involve HRD in strategic planning
- Involve management in HRD planning and estimation efforts
  – Gain mutual ownership
- Use credible and conservative estimates
- Share credit for successes and blame for failures

HRD Evaluation Steps
1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.

Summary
- Training results must be measured against costs
- Training must contribute to the “bottom line”
- HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster