Human Resources Training and Individual Development February 11: Training Evaluation.

Objectives
–Explain why evaluation is important
–Identify & choose outcomes to evaluate
–Identify how to measure outcomes
–Discuss validity threats and experimental designs
–Discuss issues to consider in evaluation design to improve the ability to make inferences

What is Training Evaluation? Assessing the effectiveness of the training program in terms of the benefits to the trainees and the company –the process of collecting outcome data to determine whether the training program was effective –from whom, what, when, and how information should be collected

Importance of Evaluation Why evaluate training?

Purpose of Evaluation Summative Evaluation: Collecting data to assess learning & other criteria Formative Evaluation: Collecting data to assess how to make the program better

What Should Be Evaluated?
–Cognitive learning
–Skills learning
–Affect
–‘Objective’ results
–ROI

How Do You Measure Outcomes?
–Cognitive learning
–Skills
–Affect
–Results
–ROI

Criteria for Evaluation Criteria should be based on training objectives –all objectives should be evaluated Criteria should be relevant (uncontaminated, not deficient), reliable, practical, and able to discriminate Criteria should include reactions, learning (verbal, cognitive, attitudes), results & ROI

Outcomes: Relevance, Contamination and Deficiency Criterion relevance Criterion contamination Criterion deficiency

Criterion deficiency, relevance, and contamination [Diagram: two overlapping sets — "Outcomes Identified by Needs Assessment and Included in Training Objectives" and "Outcomes Measured in Evaluation." The overlap represents relevance; outcomes related to training objectives but not measured represent deficiency; measured outcomes unrelated to the objectives represent contamination.]

Outcomes: Reliability, Discrimination and Practicality Reliability Discrimination Practicality

Evaluation Design: Purpose What is the objective of the training program? What do you want to accomplish with the evaluation?

Evaluation Design How do you determine whether the program has worked or not? -Measuring outcomes -Outcome constructs changed as expected -The training program, and not something else, was responsible for the change The broader question is: How can you infer causality?

Causal Inferences Knowledge is most applicable when it can be expressed in terms of cause-and-effect relationships One thing causes another if: –Temporal precedence –Covariation –No alternative explanations Control is the means of ruling out alternative explanations

Experimental Designs Test whether one or more manipulated variables have an effect on specific criteria while controlling for other factors Treatment – the manipulated variable(s) Two features –A. Timing of treatment and measurement ensures temporal precedence –B. Attempt to eliminate alternative explanations If A and B hold, covariation can be interpreted as causation

Experimental Designs: Threats to Validity Threats to validity are factors that lead one to question either: –The believability of the study results (internal validity), or –The extent to which the evaluation results generalize to other groups of trainees and situations (external validity)

The Correlation Coefficient A relationship index What is r for a perfect positive relationship? How about a perfect negative relationship? No relationship? What is the interpretation of the squared correlation coefficient?
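The answers to the slide's questions (r = +1 for a perfect positive relationship, r = −1 for a perfect negative one, r = 0 for none, and r² as the proportion of shared variance) can be checked with a small sketch. The function and data below are invented for illustration:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [10, 20, 30, 40]))  # perfect positive: r ≈ 1.0
print(pearson_r([1, 2, 3, 4], [40, 30, 20, 10]))  # perfect negative: r ≈ -1.0

# The squared coefficient is the proportion of variance the variables share.
r = pearson_r([1, 2, 4, 5], [2, 4, 3, 6])
print(r ** 2)  # ≈ 0.56: the two variables share about 56% of their variance
```

Note that even a large r says nothing about direction of causation, as the job satisfaction and reading examples on the next slide illustrate.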

Correlation and Causality Ex. 1: Job satisfaction – job performance Ex. 2: Reading – IQ

Evaluation Design There is no one “best” design, but some designs are clearly better than others Rely on logic, the purpose of the training, and practical constraints to design an evaluation that allows you to make inferences

Considerations for Evaluation Design Use pre-test & post-test Have a comparison group Random Assignment
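These three considerations can be sketched in a minimal simulation. All numbers here are invented: the assumed 8-point training effect and the 2-point drift that both groups experience (history/maturation) are illustrative assumptions, not data. The point is that random assignment plus a comparison group lets the shared drift be subtracted out:

```python
import random

random.seed(0)

# Hypothetical pretest scores for 20 trainees (invented data).
trainees = [random.gauss(50, 10) for _ in range(20)]

# Random assignment: shuffle, then split into treatment and control groups.
random.shuffle(trainees)
treatment_pre, control_pre = trainees[:10], trainees[10:]

# Post-test: assume training adds ~8 points; both groups also drift upward
# by ~2 points (history/maturation), which the comparison group absorbs.
treatment_post = [s + 8 + random.gauss(2, 3) for s in treatment_pre]
control_post = [s + random.gauss(2, 3) for s in control_pre]

def mean_gain(pre, post):
    """Average pretest-to-posttest change for one group."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# Difference in gains isolates the training effect from the shared drift.
effect = mean_gain(treatment_pre, treatment_post) - mean_gain(control_pre, control_post)
print(f"Estimated training effect: {effect:.1f} points")
```

Without the comparison group, the control group's ~2-point gain would have been wrongly attributed to training; without random assignment, preexisting group differences could masquerade as an effect.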

Evaluation Why are the “best designs” not used?

Self-Talk Training Example Increase confidence of out-of-work managers –subject pool -- managers who have given up job search –half given self-talk training and encouraged to continue job search –results: “self-talk training worked because the treatment group had a higher rate of reemployment” –implications: “self-talk is useful in increasing reemployment of the hard core unemployed” What is wrong with this?

Implications for Evaluation Design Explicitly consider evaluation at all levels –reactions, learning (verbal, skills, attitudes), results, ROI Make links from objectives clear Specify types of outcome measures (include examples) Specify evaluation strategy

Next Time Traditional training methods –Noe Chapter 7 –Broadwell and Dietrich (1996)