Output of the Development Phase

Training Title:
Training Duration:
Trainer:
Seating:
Target Population:

Overall Training Objectives: after having attended the training, the participants will be able to:
1.
2.
3.

Time | Topic | Learning Objective | Learning Points | Method | AV & Other Resources
0900-1000 | Welcome, Ground Rules | | | |

Evaluation Methods: when to be conducted, how to be conducted
Measures to Assist Transfer:
Training Evaluation
Learning Objectives
After attending this session, you will be able to:
- List the views in favour of and against the evaluation of training
- Differentiate between ‘process’ and ‘outcome’ evaluations, and explain the difference
- Describe the purpose of the different levels of outcome evaluation, using appropriate examples
- Explain the concept of the ‘ROI’ of training
- Apply cost-benefit and utility analysis when deciding which measures of effectiveness to put in place
Training Processes Model (diagram): a triggering event leads into five phases, each shown with its own input, process, and output: Needs Analysis, Design, Development, Implementation, and Evaluation. The model also shows process evaluation data and outcome evaluation data flowing between the phases.
ADDIE “Model”: Analysis, Design, Development, Implementation, Evaluation
Design Phase (input-process-output view)
Input: training needs; organizational constraints; learning theory
Process: develop training objectives; determine the factors that facilitate learning and transfer; identify alternative methods of instruction
Output: evaluation objectives
Development and Implementation Phase (input-process-output view)
Input: alternative instructional strategy and methods; factors that facilitate learning and transfer
Process: program development plan
Output: instructional material; instructional equipment; trainee and trainer manuals; facilities; trainer
Evaluation
What Do We Evaluate?
The effectiveness of training, in terms of its process and its outcomes:
- Reaction (entry behaviour, attention, motivation)
- Learning (improvement in KSAs)
- Behaviour (after training, on the job)
- Results (has the performance problem been addressed?)
- ROI: helps establish the credibility of the training function; accountability is increasingly being demanded; supports the transition to a strategic function
Evaluation Phase (input-process-output view)
Input: evaluation objectives; design issues; organizational constraints
Process: evaluation strategy and design
Output: process measures; outcome measures (reaction, learning, behaviour, results)
Process Evaluation
Design vs. implementation: if the training is implemented as designed, the expected outcome should follow.
What if:
- there is a gap between design and implementation?
- implementation follows the design, but the outcome does not follow?
Process evaluation examines the process both before training and during training.
Issues: Before Training
- Were needs diagnosed correctly? What data sources were used?
- Was a knowledge/skill deficiency identified?
- Were trainees assessed to determine their prerequisite KSAs?
- Were needs translated into training objectives? Were all objectives identified?
- Were the objectives written in a clear, appropriate manner?
- Was an evaluation system designed to measure the objectives?
- Was the training program designed to meet all the training objectives?
- Was previous learning that supports or inhibits training identified?
- Were individual differences assessed and factored into the training design?
- Was trainee motivation to learn assessed? What steps were taken to assess it?
- What processes were built in to facilitate recall and transfer?
- Are the training techniques to be used appropriate for each of the learning objectives?
Issues: During Training
- Were the trainer, training techniques, and training/learning objectives well matched?
- Were the lecture portions of the training effective?
- Was involvement encouraged and solicited? Were questions used effectively?
- Did the trainer conduct the various training methodologies (case study, role play, etc.) appropriately?
- Was enough time allotted, and did the trainer use the allotted time for the activities?
- Did trainees follow instructions?
- Was there effective debriefing following the exercises?
- Did the trainer follow the training design and lesson plans?
- Was enough time given for each of the requirements? Was time allowed for questions?
Kirkpatrick’s Model of Evaluation: evaluation at four levels
- Reaction level
- Learning level
- Behaviour level
- Results level
Reaction Level
Reaction measures capture affective reactions and utility judgements. Reaction evaluation does not measure learning; it measures attitudes towards, and perceptions of, the training.
Areas of focus: training relevance, content, materials and exercises, reactions to the trainer, facilities.
Questions may be closed-ended or open-ended.
Questionnaire at the Reaction Level
- Decide what you want to find out (tied to the training objectives)
- Develop a written set of questions covering the areas mentioned earlier
- Develop a scale to quantify the responses (a minimal scoring sketch follows this list)
- Collect demographic and other details (this makes the responses easier to analyse)
- Keep the questionnaire anonymous
- Include open questions
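To illustrate the "develop a scale to quantify" step, here is a minimal scoring sketch in Python. It assumes a five-point rating scale (1 = poor, 5 = excellent); the focus-area names and responses are hypothetical examples, not data from these slides.

```python
# Minimal sketch: summarising reaction-level questionnaire data.
# Assumes a 5-point scale (1 = poor, 5 = excellent); the focus areas and
# responses below are hypothetical examples, not data from the slides.
from statistics import mean

responses = [
    {"relevance": 4, "content": 5, "materials": 3, "trainer": 5, "facilities": 4},
    {"relevance": 5, "content": 4, "materials": 4, "trainer": 4, "facilities": 3},
    {"relevance": 3, "content": 4, "materials": 4, "trainer": 5, "facilities": 4},
]

# Average rating per focus area, so weaker areas stand out at a glance.
for item in responses[0]:
    scores = [r[item] for r in responses]
    print(f"{item:<10} mean = {mean(scores):.2f}  (n = {len(scores)})")
```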
Learning Level
Learning = an enhancement in knowledge, skills, and attitudes (KSAs); learning-level evaluation tries to measure these changes.
Knowledge:
- Declarative: paper-and-pencil tests (see the pre/post sketch after this list)
- Procedural: mental models, sequence tests, relationships
- Strategic: probe questions, what-if and what-next questions
Skills (compilation and automaticity): situation creation, standardisation, inter-rater reliability
Attitudes: attitude scales
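A common way to quantify learning-level change for the knowledge tests mentioned above is a pre-test/post-test comparison. The sketch below is illustrative only; the scores are hypothetical, and a real test would be built from the training objectives.

```python
# Minimal sketch: learning-level gain from a pre-test/post-test knowledge check.
# The percentage scores below are hypothetical.
pre_scores = [45, 60, 52, 70, 38]    # % correct before training
post_scores = [72, 78, 69, 85, 60]   # % correct after training

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"mean pre-test score:  {sum(pre_scores) / len(pre_scores):.1f}%")
print(f"mean post-test score: {sum(post_scores) / len(post_scores):.1f}%")
print(f"mean gain:            {sum(gains) / len(gains):.1f} percentage points")
```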
(Job) Behaviour Level
Often ignored because it is difficult to measure. The measures used during TNA can be reused as a post-training evaluation.
Methods: interviews, questionnaires, direct observation, performance records, 360-degree assessment, scripted situations.
(Organisational) Results Level
Based on the performance problem identified during TNA (number of defects, sales closures, number of employee grievances, etc.).
Pre-test and post-test measurement is possible.
The measures themselves are relatively easy to collect, but it is difficult to establish a cause-effect relationship with the training (a control-group sketch follows below).
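One way to move closer to a cause-effect claim at the results level, not prescribed by these slides but widely used, is to compare the trained group's pre/post change with that of a similar group that was not trained. The figures below are hypothetical.

```python
# Minimal sketch: estimating the training effect with an untrained comparison group
# (a difference-in-differences style calculation). All numbers are hypothetical,
# e.g. monthly defects per 1,000 units produced.
trained_pre, trained_post = 42.0, 28.0
control_pre, control_post = 40.0, 37.0

trained_change = trained_post - trained_pre   # change in the trained group
control_change = control_post - control_pre   # change that happened without training

effect = trained_change - control_change      # portion attributable to the training
print(f"estimated training effect: {effect:+.1f} defects per 1,000 units")
```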
Factors Affecting the Outcomes (diagram)
- Results: the organisation's external environment (economy, regulations, suppliers, etc.) and internal environment (policies, procedures, systems, employee performance, KSAs, needs, etc.)
- Job behaviour: transfer of training, motivational forces in the job setting, opportunity to apply the training on the job, KSAs
- Learning: trainee readiness for the course, trainee motivation to learn, design, materials, contents, trainer’s behaviour
- Reaction: perceived match between trainee expectations and what the training provided
Going Beyond Kirkpatrick: a 5th Level, ROI (Phillips)

Evaluation Level | Measurement Focus
1. Reaction and Planned Action | Measures participants' satisfaction with the programme and captures planned actions
2. Learning | Measures changes in KSAs
3. Application and Implementation | Measures changes in on-the-job behaviour and progress with application
4. Business Impact | Measures changes in business impact measures
5. Return on Investment | Compares programme monetary benefits to programme costs (see the formula sketch below)
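Level 5 compares the programme's monetary benefits with its costs. The benefit-cost ratio and ROI percentage commonly used in the Phillips approach can be written as below; the worked figures in the comments are hypothetical.

```latex
% Benefit-cost ratio and ROI as commonly defined in the Phillips approach
% (\text requires the amsmath package).
\[
\mathrm{BCR} = \frac{\text{programme benefits}}{\text{programme costs}}
\qquad
\mathrm{ROI}\ (\%) = \frac{\text{programme benefits} - \text{programme costs}}
                          {\text{programme costs}} \times 100
\]
% Hypothetical illustration: benefits of 300,000 against costs of 200,000
% give BCR = 1.5 and ROI = (300,000 - 200,000) / 200,000 x 100 = 50%.
```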
Why Training ROI?
- Pressure from clients and senior managers
- Competitive economic pressures that are causing intense scrutiny of all expenditures
- The general trend towards accountability for all staff support groups
- To justify the existence of the training department
- To decide whether to continue or discontinue training programmes
What ROI Intends to Accomplish
- Measures contribution
- Sets priorities
- Focuses on results
- Alters management perceptions of training
Main Challenges
- Isolating the effects of training
- Converting data into monetary values (see the utility-analysis sketch below)
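For the second challenge, one established way to convert performance data into monetary values is utility analysis, which the session objectives mention. The Brogden-Cronbach-Gleser formulation below is an illustrative sketch rather than something specified on these slides.

```latex
% Brogden-Cronbach-Gleser utility model (illustrative addition).
% \Delta U : monetary gain (utility) from the training programme
% T        : expected duration of the training effect, in years
% N        : number of employees trained
% d_t      : effect size of the training (trained vs. untrained difference,
%            in standard-deviation units of job performance)
% SD_y     : standard deviation of job performance in monetary terms
% C        : cost of training per employee
\[
\Delta U = T \times N \times d_t \times SD_y \;-\; N \times C
\]
```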
Designing a Training

Training Title:
Training Duration:
Trainer:
Seating:
Target Population:

Overall Training Objectives: after having attended the training, the participants will be able to:
1.
2.
3.

Time | Topic | Learning Objective | Learning Points | Method | AV & Other Resources

Evaluation Methods: when to be conducted, how to be conducted
Measures to Assist Transfer: