
Part 2

Evaluation: Formative and Summative

Why Evaluate?
- Makes good economic sense
- Allows for accountability
- Answers the increased scrutiny of budgets
- Fulfills pressure to contribute

Why Evaluate?
- Considers peer pressure
- Provides self-satisfaction
- Relies on more information now available
- Supports professionalism
- Meets your need for survival

Results-Based Approach
1. Requires tangible results that can be measured
2. Includes at least one method of evaluation
3. Needs someone to have responsibility for evaluation; all PIT staff should be involved
4. Encourages management to be involved in the intervention
5. Supports a proactive increase in management commitment

Results-Based Approach
6. Needs a systematic measurement and evaluation plan in place
7. Needs participants to understand their role in achieving results
8. Works best when interventions are connected to strategic initiatives

Exercise #1
Briefly identify what you would do as a PIT professional to ensure the results-based approach is utilized in any type of HRD intervention. (Handout 2)

Purposes and Uses of Evaluations
1. Asks: are we accomplishing program objectives?
2. Identifies strengths and weaknesses of interventions (effectiveness and efficiency)
3. Compares costs to the benefits
4. Helps answer who should participate in future interventions
5. Tests the clarity and validity of tests, cases, and exercises used in the intervention

Purposes and Uses of Evaluations
6. Identifies which participants were most successful with the intervention
7. Gathers data for marketing future programs
8. Determines: was the intervention an appropriate solution for the identified gap in performance?
9. Establishes a database for future decision making concerning interventions (becomes best practices)

Exercise #2
List 2-3 reasons why you think these elements are necessary in the overall purpose/use of evaluation for HRD processes. (Handout 3)

Last Week
- Why evaluate
- Results-based approach
- Purposes and uses of evaluation
- Reviewed Kirkpatrick model

This Week
- More on Kirkpatrick model
- Types of data collection
- Types of data
- Evaluation instruments
- Tips for surveys/questionnaires
* Will vary from note-taking H.O.

Warm-Up Exercise

Levels of Evaluation: Donald Kirkpatrick's 4 Levels
I. Reaction: Were the participants pleased with the intervention?
II. Learning: What did the participants learn?
III. Behavior: Did the participants change their behavior based on what was learned?
IV. Results: Did the change in behavior positively affect the organization?

Exercise #3
What are some advantages and limitations of each of the four levels of Kirkpatrick's evaluation model? (Use note-taking handout)

Collect Post-Intervention Data (Handout 3)
1. Surveys
2. Questionnaires
3. On-the-job observation
4. Post-intervention interviews
5. Focus groups
6. Program assignments
7. Action plans
8. Performance contracts
9. Follow-up sessions
10. Performance monitoring

Exercise #4
Using the 10 items in the Post-Intervention Data handout (3): each group should describe (fabricate) a situation where you might gather post-intervention data for a level 2/3/4 evaluation. Be prepared to explain your rationale.

Hard Data
Hard data can be grouped into four categories:
1. Output: of the work unit
2. Quality: how well products or services are produced
3. Cost: improvement in costs
4. Time: savings

Soft Data
- Work habits: absenteeism, tardiness, violations
- Climate: number of grievances, complaints, job satisfaction
- Satisfaction: favorable reactions, employee loyalty, increased confidence

Exercise #5
List some advantages and disadvantages/limitations when collecting hard and soft data. (Use note-taking H.O. p. 3/4)

Evaluation Instruments
- Validity: does the instrument measure what it is supposed to measure?
- Content validity: how well does the instrument measure the content/objectives of the program?
- Construct validity: how well does it measure the construct (an abstract variable such as KSA)?
- Concurrent validity: how well does the instrument measure up against other instruments?
- Predictive validity: how well can it predict future behavior?

Part 3

Last Week
- More on Kirkpatrick model
- Types of data collection
- Types of data

This Week
- Developing evaluation instruments
- The survey process
* New Handouts

Exercise #5
List five things you would do to enhance the chances of getting a good number of returns for surveys/questionnaires.

Survey Process: Tips
- Communicate the purpose in advance
- Include a signed introductory letter
- Explain who will see the data
- Decide whether to use anonymous input

More Tips
- Keep it simple
- Simplify the response process (bubble format, SASE)
- Utilize local support

More Tips
- Consider incentives
- Use follow-up reminders
- Send a copy of the results to the participants

Action Planning
- The most common type of follow-up assignment
- Developed by participants
- Contains detailed steps to accomplish measurable objectives
- Shows what is to be done, by whom, and when
- Must be monitored

Action Plans
- Communicate the action-plan requirement early and explain its value (avoids resistance)
- Describe the action-planning process at the beginning of the program (outline)
- Teach the action-planning process
- Allow time to develop the plan
- Have the facilitator approve the action plans
- Require participants to assign a monetary value to each improvement (helps ROI later)

Action Plans
- Ask participants to isolate the effects of the program
- Ask participants to provide a confidence level for estimates
- Require action plans to be presented to the group by participants (peer review) if possible
- Explain the follow-up mechanism
- Collect action plans
- Summarize the data and calculate ROI
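The isolation and confidence-level steps above are commonly applied as conservative discounts on each participant's monetary estimate before the ROI is calculated. A minimal sketch (the function name and figures are illustrative, not from the deck):

```python
def adjusted_benefit(reported_value, isolation_pct, confidence_pct):
    """Discount a participant's monetary estimate by the share of the
    improvement attributed to the program (isolation) and by the
    participant's stated confidence in the estimate."""
    return reported_value * isolation_pct * confidence_pct

# A participant reports $10,000 in annual improvement, attributes 60% of it
# to the program, and is 80% confident in the estimate.
print(adjusted_benefit(10_000, 0.60, 0.80))  # 4800.0
```

Summing these adjusted values across participants gives a defensibly conservative total benefit for the ROI calculation.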

Converting Data to Monetary Benefits
1. Focus on a unit of measure
2. Determine the value of each unit
3. Calculate the change in performance
4. Determine an annual amount for the change
5. Calculate the total value of the improvement
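The five conversion steps above can be walked through with a small worked example. The unit, value, and performance figures here are hypothetical:

```python
# 1. Focus on a unit of measure
unit = "customer complaint"

# 2. Determine the value of each unit (e.g., average handling cost)
value_per_unit = 150.00

# 3. Calculate the change in performance
complaints_before_per_month = 40
complaints_after_per_month = 28
monthly_change = complaints_before_per_month - complaints_after_per_month

# 4. Determine an annual amount for the change
annual_change = monthly_change * 12

# 5. Calculate the total value of the improvement
total_value = annual_change * value_per_unit
print(total_value)  # 21600.0
```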

Ways to Put a Value on Units
- Cost of quality
- Converting employee time
- Using historical costs
- Using internal and external experts
- External databases
- Estimates from participants
- Estimates from supervisors
- Estimates from senior managers
- Using HRD staff estimates

Credibility
- Source of the data
- Source of the study
- Motives of the evaluators
- Methodology of the study
- Assumptions made in the analysis
- Realism of the outcome data
- Types of data
- Scope of the analysis

Guidelines for the Study
- Use credible and reliable sources for estimates
- Present material in an unbiased, objective way
- Fully explain the methods (step by step)
- Define assumptions and compare them to other studies
- Consider factoring or adjusting output values when they appear unrealistic
- Use hard data whenever possible

Identifying Intangible Measures (not based upon monetary values)
- Employee satisfaction
- Stress reduction
- Employee turnover
- Customer satisfaction and retention
- Team effectiveness

Determining Costs
- Collect costs on every intervention
- Costs will not be precise (hard to be perfect)
- Be practical: work with the accounting department
- Define which costs to collect, the categories, and the sources
- Computerize cost accumulation (track accounts)
- Use cost estimation (formulas, page 227)
- Fully load with all possible costs; be truthful
- Include overhead, benefits, peripheral costs, etc.
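"Fully loaded" costing means summing every cost category, not just direct delivery costs. A sketch of cost accumulation under that rule (the categories and amounts are illustrative, not from the source):

```python
# Fully loaded program cost: include indirect items such as participant
# salaries/benefits while attending, overhead, and the evaluation itself.
costs = {
    "needs_assessment": 3_000,
    "design_and_development": 12_000,
    "materials": 2_500,
    "facilitator": 8_000,
    "facilities": 1_500,
    "participant_time_and_benefits": 15_000,
    "travel": 4_000,
    "overhead": 3_500,
    "evaluation": 2_000,
}

fully_loaded_cost = sum(costs.values())
print(fully_loaded_cost)  # 51500
```

Tracking the figures in named categories like this also makes the later credibility review easier, since every assumption behind the cost total is visible.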

Data Analysis
- Statistics (use a professional)
- Use terms appropriately (e.g., "significant difference")
- Beware of statistical deception (erroneous conclusions)

Return on Investment
- Compares costs to benefits
- Complicated
- Usually annualized
- Business-case specific
- Communicate the formula used

Phillips ROI Framework
I. Reaction and Planned Action: measures participants' reactions and plans to change
II. Learning: measures KSA
III. Job Applications: measures change of behavior on the job and specific use of the training material
IV. Business Results: measures the impact of the program
V. Return on Investment: measures the monetary value of the results and the costs of the program, usually expressed as a percentage

Evaluation as a Customer Satisfaction Tool

Level    Measures               Primary Audience
Level 1  Reaction               Participants
Level 2  Learning               Participants
Level 3  Job Applications       Immediate managers
Level 4  Business Impact        Immediate/senior managers
Level 5  Return on Investment   Senior managers, executives

From Level 4 to Level 5 Requires Three Steps:
1. Level 4 data must be converted to monetary values
2. The cost of the intervention must be tabulated
3. Calculate the formula

ROI Process Model
1. Collect data
2. Isolate the effects of training
3. Convert data to monetary value
4. Tabulate program costs
5. Calculate the return on investment
6. Identify intangible benefits

ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100

Two Methods
1. Cost/Benefit Ratio: an early model that compares the intervention's costs to its benefits in ratio form. For every one dollar invested in the intervention, X dollars in benefits were returned.
2. ROI Formula: uses net program benefits divided by costs, expressed as a percent.

Cost/Benefit Ratio
CBR = Program Benefits / Program Costs

ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100
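The two methods can be put side by side in a short calculation; note that the ROI formula uses *net* benefits (benefits minus costs), while the CBR uses gross benefits. The dollar figures below are illustrative:

```python
def cbr(benefits, costs):
    """Cost/benefit ratio: dollars returned per dollar invested."""
    return benefits / costs

def roi_percent(benefits, costs):
    """ROI (%): net program benefits divided by program costs, times 100."""
    return (benefits - costs) / costs * 100

benefits, costs = 150_000, 100_000
print(cbr(benefits, costs))          # 1.5  -> $1.50 returned per $1 invested
print(roi_percent(benefits, costs))  # 50.0 -> a 50% return
```

The same program therefore reads as a 1.5:1 cost/benefit ratio or a 50% ROI, which is why the deck advises always communicating which formula was used.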

Cautions When Using ROI
- Make sure a needs assessment has been completed
- Include one or more strategies for isolating the effects of training
- Use reliable, credible sources when making estimates
- Be conservative when developing benefits and costs
- Use caution when comparing the ROI of training and development with other financial returns
- Involve management in developing the return
- Approach sensitive and controversial issues carefully
- Do not boast about a high return (internal politics)

Implementation Issues
- Identify an internal champion (cheerleader)
- Develop an implementation leader
- Assign responsibilities so everyone knows their assigned tasks and outcomes
- Set (annual) targets
- Develop a project plan and timetable
- Revise/develop policies and procedures (page 367)
- Assess the climate: gap analysis, SWOT, barriers

Preparing Your Staff
- Involve the staff in the process
- Use evaluation data as a learning tool
- Identify and remove obstacles (complexity, time, motivation, correct use of results)

ROI Administration
Which programs to select?
- Large target audiences
- Important to corporate strategies
- Expensive
- High visibility
- Comprehensive needs assessment

ROI Administration: Reporting Progress
- Status meetings (facilitated by an expert)
- Report progress
- Add evaluation areas
- Establish discussion groups
- Train the Management Tool

Timing of Evaluation
1. During the program
2. Time series (multiple measures)
3. Post-tests (timing)

Questionnaire Content Issues
- Progress with objectives
- Action plan status
- Relevance of intervention
- Use of program materials
- Knowledge/skill application
- Skill frequency
- Changes in the work unit
- Measurable improvements/accomplishments
- Monetary impact
- Confidence level
- Improvement linked with the intervention
- Investment perception
- Linkage with output measures
- Barriers
- Enablers
- Management support
- Other solutions
- Target audience recommendations
- Suggestions for improvement