Evaluating HRD Programs


Evaluating HRD Programs (Chapter 7, Human Resource Development)

Purpose of HRD Evaluation
• Determine whether the program accomplished its objectives
• Identify strengths and weaknesses of the program
• Conduct a cost-benefit analysis
• Determine who should participate and who benefited most
• Determine whether the program was appropriate
• Establish a database for decision making

Potential Questions to Be Addressed in a Process Analysis (During Training)
• Was there a match between the trainer, the training techniques, and the training/learning objectives?
• Were lecture portions of the training effective? Was involvement encouraged/solicited? Were questions used effectively?
• Did the trainer appropriately conduct the various training methodologies (case study, role play, etc.)? Were they explained well? Did the trainer use the allotted time for activities? Was enough time allotted? Did trainees follow instructions? Was there effective debriefing following the exercises?
• Did the trainer follow the training design and lesson plans? Was enough time given for each of the requirements? Was time allowed for questions?

Kirkpatrick's Levels of Criteria
• Reaction – did trainees like the program?
• Learning – demonstration of learning at the end of the program
• Behavior – actual transfer of the training to the job
• Results – impact on the bottom line, including efficiency, productivity, cost, etc.
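A minimal sketch (the example measures are assumptions, not taken from the chapter) of how the four levels might be mapped to concrete evaluation measures:

```python
# Hypothetical mapping of Kirkpatrick's four levels to example measures.
kirkpatrick_levels = {
    "reaction": ["end-of-session ratings", "open-ended comments"],
    "learning": ["post-test scores", "skill demonstrations"],
    "behavior": ["on-the-job observations", "30/90-day supervisor ratings"],
    "results":  ["productivity figures", "error rates", "cost savings"],
}

for level, measures in kirkpatrick_levels.items():
    print(f"{level:9s}: {', '.join(measures)}")
```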

Reaction to Training – Part 1 of 2
Answer the following questions about the training in Active Listening skills using the scale below:
1 = Strongly disagree  2 = Disagree  3 = Neither agree nor disagree  4 = Agree  5 = Strongly agree
1. The training met the stated objectives. (1 2 3 4 5)
2. The information provided was enough for me to understand the concepts being taught. (1 2 3 4 5)
3. The practice sessions provided were sufficient to give me an idea of how to perform the skill. (1 2 3 4 5)
4. The feedback provided was useful in helping me understand how to improve. (1 2 3 4 5)

Reaction to Training – Part 2 of 2
5. The training session kept my interest throughout. (1 2 3 4 5)
6. The pace of the Active Listening session was:
   1. Way too fast  2. A bit fast  3. Just right  4. A bit slow  5. Way too slow
7. What did you like best about this part of the training?
8. What would you have changed?
Additional comments:
Note: A similar scale would be used for each of the other components of training that were taught.
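Reaction data like the above are typically summarized item by item. A minimal sketch (item names and ratings are made up) of how the Likert responses might be tallied:

```python
# Summarize hypothetical 1-5 reaction ratings for each survey item.
from statistics import mean

responses = {
    "met_objectives":      [4, 5, 3, 4, 4],
    "enough_information":  [3, 4, 4, 5, 3],
    "sufficient_practice": [2, 3, 4, 3, 3],
    "useful_feedback":     [5, 4, 4, 4, 5],
    "kept_interest":       [4, 4, 3, 5, 4],
}

for item, ratings in responses.items():
    print(f"{item}: mean = {mean(ratings):.2f} (n = {len(ratings)})")
```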

Paper-and-Pencil Test for Evaluation of Learning
There is no specific time limit on this test, but most trainees should be able to finish in about one hour. Answers should be written in the booklet provided. Please read each question carefully, as some questions have more than one part.
1. List four types of active listening and provide an example of each.
2. List the steps in the conflict resolution model. After each step, provide a relevant example of a phrase that would represent that step.
3. (Multiple-choice or fill-in-the-blank questions)

Possible Additions to Kirkpatrick's Model
• Expanding reaction measures to include reactions to the training methods
• Splitting reaction measures to separately assess perceptions of enjoyment, usefulness, and difficulty
• Adding a fifth level to capture societal contributions
• Adding a fifth level to capture return on investment

Data Collection Methods
• Interviews
• Questionnaires
• Direct observation
• Tests and simulations
• Archival performance data
Types of data: individual, group, and system-wide

Research Design
• Internal validity – Did a change occur? Was the change a result of the training?
• External validity – Will the change occur in other situations with different trainees?

Threats to Internal Validity
• History – events occurring during the training
• Maturation – natural improvements that come with development
• Testing – effects of the pre-test itself on later measures
• Instrumentation – different measures used at different points in time

Threats to Internal Validity (continued)
• Statistical regression – trainees selected because they measured at the extremes of ability/KSAs tend to regress toward the mean (illustrated below)
• Reactive effects of the research situation – motivation changes simply from being studied (Hawthorne effect)
• Multiple-treatment effects – carryover from previous training
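A minimal simulation (made-up data) of the statistical regression threat: trainees selected for extremely low pretest scores "improve" on the posttest even when no training effect exists, purely because of measurement noise.

```python
# Regression toward the mean: select the bottom 10% on a noisy pretest and
# observe that their posttest mean is higher with no intervention at all.
import numpy as np

rng = np.random.default_rng(1)
true_ability = rng.normal(50, 10, 1000)
pretest  = true_ability + rng.normal(0, 5, 1000)   # noisy measurement #1
posttest = true_ability + rng.normal(0, 5, 1000)   # noisy measurement #2

selected = pretest < np.percentile(pretest, 10)    # "remedial" group
print(f"selected pretest mean:  {pretest[selected].mean():.1f}")
print(f"selected posttest mean: {posttest[selected].mean():.1f}")
```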

Threats to External Validity
• Representativeness of the sample and setting
• Differential selection – the basis for choosing trainees
• Experimental mortality – turnover during the study

Experimental Designs
• Control group – random assignment to treatment and control groups so that trainees have similar characteristics
• Two-group posttest-only design
• Two-group pretest/posttest design (sketched below)
• Four-group design – controls for the effects of the pretest and of prior knowledge
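A minimal sketch (hypothetical scores; assumes SciPy is available) of the two-group pretest/posttest design: randomly assign employees, then compare the gain scores of the trained and control groups.

```python
# Two-group pretest/posttest design with random assignment.
import random
from scipy import stats

random.seed(0)
employees = [f"emp_{i}" for i in range(40)]
random.shuffle(employees)
trained, control = employees[:20], employees[20:]

# Hypothetical pre/post test scores (in practice these come from the tests).
pre  = {e: random.gauss(60, 8) for e in employees}
post = {e: pre[e] + (random.gauss(10, 5) if e in trained else random.gauss(2, 5))
        for e in employees}

gain_trained = [post[e] - pre[e] for e in trained]
gain_control = [post[e] - pre[e] for e in control]

result = stats.ttest_ind(gain_trained, gain_control)
print(f"mean gain trained = {sum(gain_trained)/len(gain_trained):.1f}, "
      f"control = {sum(gain_control)/len(gain_control):.1f}, "
      f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```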

Non-experimental Designs
• Case study – intensive, descriptive study with after-only measures
• One-group pretest/posttest design

Quasi-experimental Designs
• Nonequivalent control group design – analyze the groups for equivalence, or use multiple regression to control for demographic factors (sketched below)
• Time series design – establish a baseline, deliver the training, then take a series of measures to determine whether a change has occurred
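A minimal sketch (simulated data; variable names are assumptions; assumes statsmodels is installed) of the regression approach to a nonequivalent control group design: regress the posttest score on a training indicator while controlling for the pretest score and a demographic factor such as tenure.

```python
# Nonequivalent control group design analyzed with multiple regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80
trained  = rng.integers(0, 2, n)                 # 1 = attended training (not randomized)
pretest  = rng.normal(60, 8, n)
tenure   = rng.normal(5, 2, n)
posttest = 5 * trained + 0.8 * pretest + 0.5 * tenure + rng.normal(0, 4, n)

X = sm.add_constant(np.column_stack([trained, pretest, tenure]))
model = sm.OLS(posttest, X).fit()
print(model.params)   # the coefficient on `trained` estimates the training effect
```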

Ethical Issues Concerning Evaluation Research
• Confidentiality
• Informed consent
• Withholding training from the control group – it can be provided later
• Use of deception
• Pressure to produce positive results

Assessing the Impact of HRD Programs
• Cost-benefit analysis – monetary costs in relation to nonmonetary benefits
• Cost-effectiveness analysis – monetary costs in relation to monetary benefits
• Return on investment = results / costs (worked example below)
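A minimal worked example (all figures are made up) of the cost-effectiveness and ROI arithmetic above:

```python
# Hypothetical cost and benefit figures for a single training program.
training_costs    = 50_000    # development, delivery, and trainee time
monetary_benefits = 120_000   # e.g., estimated value of productivity gains

net_benefit = monetary_benefits - training_costs
roi         = monetary_benefits / training_costs      # results / costs
roi_pct     = net_benefit / training_costs * 100      # common "net" ROI expression

print(f"net benefit = ${net_benefit:,}")
print(f"ROI (results / costs) = {roi:.2f}")
print(f"ROI % (net benefit / costs) = {roi_pct:.0f}%")
```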