Building Capability and Expertise with ROI Implementations


ROI Certification: Building Capability and Expertise with ROI Implementations
Jack J. Phillips, Ph.D.
Patti P. Phillips, Ph.D.

Reaction Objectives Provide participants with knowledge and skills that are: Relevant to their job Important to their current job success Immediately applicable New to their understanding of accountability Relevant to their colleagues in similar job situations

Learning Objectives Enable participants to: Describe the five critical components of a successful evaluation practice Describe the five levels of evaluation Describe the six types of data in the chain of impact Describe the ten steps in the ROI Methodology and . . .

Learning Objectives Follow the 12 guiding principles Plan and execute an ROI evaluation project Calculate and explain the difference between the benefit-cost ratio (BCR) and the return on investment (ROI) Communicate the results of an ROI study to a variety of stakeholders Implement the ROI Methodology within their organization

Application Objectives Support participants as they: Build support for the ROI Methodology in their organization Complete their initial ROI evaluation project Plan and implement future ROI projects Revise/update internal evaluation strategy/practice Brief/teach others in the ROI Methodology Change the way they propose, implement, and evaluate programs, processes, and initiatives

Impact Objectives Enable participants to realize positive consequences as a result of applying what they learn such as: Improving program effectiveness Improving program efficiencies Expanding successful programs Redesigning or discontinuing ineffective programs Improving relationships with clients and executives Enhancing the influence of their function within the organization

Setting the Stage The ROI Methodology: Planning the Evaluation, Collecting Data, Isolating the Effects of the Program, Converting Data to Money, Tabulating Costs and Calculating ROI, Reporting Results Forecasting ROI Implementing ROI

Program Success will be Measured by: Ratings achieved on end-of-course evaluation Increase in knowledge gain as reported on end-of-course evaluation Demonstration of knowledge through: Course exercises Case study presentations ROI project plan presentation ROI implementation plan and . . .

Program Success will be Measured by: ROI project completion following ROI Methodology steps and guiding principles Evaluation planning Data collection Data analysis Report submittal Steps toward implementing (beyond ROI project) completed as planned.

Roles of the ROI Implementation Leader Technical Expert Consultant Problem Solver Initiator Designer Developer Coordinator Cheerleader Communicator Process Monitor Planner Analyst Interpreter Teacher

Skill Areas for Certification Planning for ROI calculations Collecting evaluation data Isolating the effects of solutions Converting data to monetary values Monitoring program costs Analyzing data including calculating the ROI Presenting evaluation data Implementing the ROI process Providing internal consulting on ROI Teaching others the ROI process

Certification Projects (item — due date)
Case Study Presentation — during workshop
Implementation Plan for the ROI Process — end of workshop
ROI Project Plan implementation complete — 3-6 months
ROI project complete — 6 months, ideally

Case Study Presentation Team-based assignment Present results to an executive audience (you own the study) Q&A session Critique the case study (you don't own the study)

ROI Project
Based on a planned or anticipated ROI impact study
Individual or team-based
Provide a copy of the Data Collection Plan during the workshop
Provide a copy of the ROI Analysis Plan during the workshop
Ask for input from the group
Ideally, complete within 6 months

Implementation Plan Requirements Specific Motivational Achievable Realistic Time-based Must be within your control!

Global Communications

Paradigm Shift in Programs
Activity-based, characterized by: no business need for the program; no assessment of performance issues; no specific measurable objectives; no effort to prepare program participants to achieve results.
Results-based, characterized by: program linked to specific business needs; assessment of performance effectiveness; specific objectives for application and business impact; results expectations communicated to participants.

Paradigm Shift (continued)
Activity-based, characterized by: no effort to prepare the work environment to support transfer; no efforts to build partnerships with key managers; no measurement of results or cost-benefit analysis; reporting on programs is input-focused.
Results-based, characterized by: environment prepared to support transfer; partnerships established with key managers and clients; measurement of results and cost-benefit analysis (ROI); reporting on programs is output-focused.

Definition of Results-Based Programs Programs are initiated, developed, and delivered with the end in mind. A comprehensive measurement and evaluation system is in place for each program. Impact and ROI evaluations are regularly developed. Program participants understand their responsibility to obtain results with programs. Support groups help to achieve results from training.

How Results-Based Are Your Programs? Take the assessment entitled “How Results-Based Are Your Programs?” When taking this assessment, try to be candid in selecting the appropriate response. Score your assessment using the guidelines provided. Compare your scores with others. What is considered to be an adequate score? What are the potential uses of this survey?

Human Capital Perspectives (traditional view → emerging view)
Expenses are considered costs → expenditures are viewed as a source of value
Function is perceived as a support staff → function is perceived as a strategic partner
HR involved in setting the HR budget → top executives involved in the budget
Metrics focus on cost and activities → metrics focus on results
Metrics created and maintained by HR alone → top executives involved in metrics design and use

Human Capital Perspectives (traditional view → emerging view, continued)
Little effort to understand the ROI in human capital → ROI has become an important tool
Measurement focuses on the data at hand → measurement focuses on the data needed
Measurement is based on what others measure → measurement is based on organization needs
Programs initiated without a business need → programs linked to specific business needs
Reporting is input-focused → reporting is output-focused

Increased Interest in the Value of Human Capital DRIVERS: The increasing cost of human capital Consequences of improper or ineffective HR practices Linkage of human capital to strategic initiatives Increased accountability of all functions Top executive requirement for HR contribution, and human capital ROI

[Word-cloud slide] Themes: ROI, profitability, vital signs, effectiveness, benefits vs. costs, balanced scorecard, program impact, bottom-line contribution, strategic accountability, value-based evaluation, performance standards, economic value added, shareholder value.

Three Journeys The need to change the HR measurement mix Setting the investment level for human capital Valuing human capital Each is explored next…

Apex, Inc.

Measuring the HR Contribution: Status — Comparison of Approaches to Measure the HR Contribution

HR Accountability Progress (1960s-2000)
Early approaches: HR auditing, early HR case studies, feedback surveys, MBO in personnel
Solid value-added approaches: competitive HR benchmarking, HR satisfaction surveys, HR cost monitoring, HR key indicators
Leading-edge approaches: ROI Methodology, HR profit center, balanced scorecard, HR macro studies, human capital measurement

Leading Edge Approaches to Measuring the HR Contribution: Balanced Scorecard, HR Profit Center, Human Capital Measures, HR Macro Studies, ROI Process (most promise as an immediate tool)

Recommendations for Measurement Categories — select an approach in each of these categories: attitudinal data; comparative data; human capital measures; benefit/cost analysis (ROI). Notes:

Common Human Capital Measures Innovation and Creativity Employee Attitudes Workforce Stability Employee Capability Human Capital Investment Leadership Productivity Workforce Profile Job Creation and Recruitment Compensation and Benefits Compliance and Safety Employee Relations

Setting the Investment Level 1 Let Others Do It!

Motivating forces: cost control; lack of infrastructure; instability; access to expertise; short-term focus; survival.
Approaches: hire fully competent employees; use contract employees; outsource major functions.

Setting the Investment Level 2 Invest the Minimum!

Motivating forces: low-cost industry; high labor use; strong competition; employees are dispensable.
Approaches: pay minimum wages; provide few benefits; keep training simple; expect turnover and address it.

Human Resources Development Issues — Training. Focus: job-related skills. Cost per employee: low. Time for payback: short. Risk for payback: low.

Human Resources Development Issues — Education. Focus: preparation for the next job. Cost per employee: moderate. Time for payback: medium. Risk for payback: moderate.

Human Resources Development Issues — Development. Focus: cultural change and continuous learning. Cost per employee: high. Time for payback: long. Risk for payback: high.

Setting the Investment Level 3 Invest with the Rest!

Motivating forces: desire to have best practices; benchmarking is acceptable; benchmarking is used in all parts of the organization; benchmarking can be low cost; benchmarking is low risk.
Approaches: locate existing reports; participate in existing projects; create a custom project; search the literature.

Human Capital Investment Benchmarks
Human resource expenses (HR department costs/budget)
Total investment in human capital (total HR expenses plus all salaries and benefits of non-HR staff)
HR expenses by function
HR expenses by process/programming
Selected HR costs
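As a sketch of how these benchmarks combine, the snippet below (all dollar figures invented for illustration) computes the total investment in human capital and HR's share of it:

```python
# Hypothetical sketch of the human capital investment benchmarks above.
# All figures are invented for illustration only.

def total_hc_investment(hr_expenses, non_hr_salaries_and_benefits):
    """Total investment in human capital: HR expenses plus all
    salaries and benefits of non-HR staff."""
    return hr_expenses + non_hr_salaries_and_benefits

hr_expenses = 2_500_000    # HR department costs/budget (assumed)
non_hr_comp = 47_500_000   # salaries + benefits of non-HR staff (assumed)

total = total_hc_investment(hr_expenses, non_hr_comp)
hr_share = hr_expenses / total  # HR expenses as a share of total investment

print(f"Total HC investment: ${total:,}")  # Total HC investment: $50,000,000
print(f"HR share: {hr_share:.1%}")         # HR share: 5.0%
```

Ratios like `hr_share` are what make the benchmarks comparable across organizations of different sizes.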

Phases of the Benchmarking Process
1. Determining what to benchmark
2. Building the benchmarking team
3. Identifying benchmark partners
4. Collecting benchmarking data
5. Analyzing the data
6. Distributing information to benchmarking partners
7. Initiating improvement from benchmarking

Setting the Investment Level 4 Invest Until it Hurts!

Motivating forces: fad chasing; the happy-employee dilemma; quick fixes; retention concerns; competitive strategy; union demands; "we can afford it!"
Approaches: pay above-market wages; provide above-market employee benefits; implement most new fads/programs; provide all types of employee services.

The Relationship Between Investing and Performance: performance peaks at an optimal investment in human capital and falls off with under-investing or over-investing.

Setting the Investment Level 5 Invest as Long as There is a Payoff!

Motivating forces: need to show HR contribution; increasing cost of human capital; secure funding; business partner role; improve processes.
Approaches: measure success of each HR program; collect up to six types of data; use ROI routinely; involve stakeholders; use the data.

The ROI Methodology
Evaluation Planning: develop objectives of solution(s); develop evaluation plans and baseline data.
Data Collection: collect data during solution implementation (Level 1: Reaction and Planned Actions; Level 2: Learning and Confidence); collect data after implementation (Level 3: Application and Implementation; Level 4: Business Impact).

Data Analysis: isolate the effects; convert data to monetary value; identify intangible measures (Intangible Benefits); tabulate costs of solution; calculate the return on investment (Level 5: ROI).
Reporting: generate impact study.

Methodical Development Training and Learning Organization Development HR Programs Change Initiatives Technology Implementation Quality / Six Sigma Meetings and Events Coaching

Valuing Human Capital: Three Approaches What we know from Logic and Intuition What we know from Macro Level Research What we know from ROI Analysis

1. Logic and Intuition Automation has limitations People are necessary Stock market mystery Accounting dilemma Last source of competitive advantage Superstar phenomenon

Superstar Characteristics People are the difference Good and great Great places to work Most admired companies

2. Macro Level Research HR Effectiveness Index Gallup Studies The Service Profit Chain Watson-Wyatt Studies Deloitte & Touche Studies . . . . and many others

3. ROI Analysis Micro Analysis Tool 5,000 studies per year Over 40 Countries / 25 Languages Variety of Applications ROI Certification ROI Networks ROI Standards ROI Best Practices

Valuing Human Capital The Complete Picture Micro Analysis (ROI Studies) Macro Analysis (Relationships) Logic & Intuition (Intangibles)

Reliance Insurance Company

Matching Evaluation Levels with Objectives Level 1: Reaction Level 2: Learning Level 3: Application Level 4: Business Impact Level 5: Return on Investment

Measurement in Learning and HR (current coverage vs. 5-year goal)
Level 0 — Inputs/Indicators: measures the number of programs, participants, audience, costs, and efficiencies. Coverage now: 100%. Comment: this is being accomplished now.
Level 1 — Reaction and Planned Action: measures reaction to, and satisfaction with, the experience, contents, and value of the program. Comment: need more focus on content and perceived value.

Level 2 — Learning: measures what participants learned in the program (information, knowledge, skills, and contacts — the take-aways from the program). Coverage now: 30-40%. Goal: 80-90%. Comment: must use simple learning measures.
Level 3 — Application: measures progress after the program (the use of information, knowledge, skills, and contacts). Coverage now: 10%. Goal: 30%. Comment: need more follow-up.

Level 4 — Business Impact: measures changes in business impact variables such as output, quality, time, and costs linked to the program. Coverage now: 5%. Goal: 10%. Comment: this is the connection to business impact.
Level 5 — ROI: compares the monetary benefits of the business impact measures to the costs of the program. Coverage now: 1%. Comment: the ultimate level of evaluation.

The Results Reacted very positively to the program and found it to be very relevant to their work; Learned new skills and gained new insights about themselves; Utilized the skills and insights routinely with their teams, although they had some difficulty in a few areas; Improved several important work unit measures, with some measures improving as much as 28%; Achieved an impressive 105% return on investment; and Reported an increase in job satisfaction in the work unit.

Key Issues with This Level of Analysis Objectives? Credibility of data? Source of data? Consistent methodology? Scope? Standards? Use of data? Cost of process? Fear of data?

The ROI Process generates six types of data: reaction to a project or program; learning (skills/knowledge); application/implementation progress; business impact related to the project or program; return on investment; and intangible benefits — and includes a technique to isolate the effects of the program.

ROI by the Numbers Process refined over a 25-year period 5,000 impact studies conducted each year 100 case studies published on ROI 3,000 individuals certified to implement the ROI Methodology 15 ROI books developed to support the process 600 member professional network formed to share information ROI methodology adopted by over 2,000 organizations in manufacturing, service, non-profit, and government settings in over 40 countries

ROI Dilemma: 70-80% of organizations want to use ROI (the wish list), but only 15-20% are currently using it (the use list). Why the gap?

Why Use Impact and ROI Analysis? Reactive Show contributions of selected programs Justify/defend budgets Identify inefficient programs that need to be redesigned or eliminated

Why Use Impact and ROI Analysis? Proactive Align programs to business needs Earn respect of senior management/administrators Improve support for programs Enhance design and implementation processes Identify successful programs that can be implemented in other areas

Applications Learning and Development Career Development Competency Systems Diversity Programs E-Learning Executive Coaching Gainsharing Meetings and Events Leadership Development Organization Development Orientation Systems Recruiting Strategies Safety & Health Programs Self-Directed Teams Skill-Based/Knowledge-Based Compensation Technology Implementation Quality Programs Wellness/Fitness Initiatives

Basic Elements: an evaluation framework; a process model; operating standards and philosophy; case applications and practice; implementation.

Evaluation Framework
Level 1. Reaction & Planned Action: measures participant satisfaction and captures planned actions, if appropriate
Level 2. Learning & Confidence: measures changes in knowledge, skills, and attitudes related to the program
Level 3. Application & Implementation: measures changes in on-the-job behavior or actions
Level 4. Business Impact: measures changes in business impact variables
Level 5. Return on Investment: compares project benefits to the costs

Defining the Return on Investment
Benefit/Cost Ratio (BCR) = Monetary Benefits ÷ Program Costs
ROI (%) = (Net Monetary Benefits ÷ Program Costs) × 100

ROI Example
Costs for project: $80,000. Benefits from project: $240,000.
BCR = $240,000 ÷ $80,000 = 3.0
ROI = ($160,000 ÷ $80,000) × 100 = 200%
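The worked example above can be reproduced with two small functions (a minimal sketch, not part of the certification materials):

```python
def bcr(benefits, costs):
    """Benefit/cost ratio: program benefits divided by program costs."""
    return benefits / costs

def roi_pct(benefits, costs):
    """ROI (%): net program benefits divided by program costs, times 100."""
    return (benefits - costs) / costs * 100

# Figures from the slide
benefits, costs = 240_000, 80_000
print(bcr(benefits, costs))      # 3.0
print(roi_pct(benefits, costs))  # 200.0
```

Note the difference the slide stresses: BCR uses total benefits, while ROI uses net benefits (benefits minus costs), so a BCR of 3.0 corresponds to an ROI of 200%, not 300%.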

ROI Target Options Set the value at the same level as other investments, e.g., 15% Set slightly above other investments, e.g., 25% Set at break-even (0%) Set at client expectations Private-sector organizations usually go with option 2; public-sector organizations usually prefer option 3.

Characteristics of Evaluation Levels
Chain of impact: Satisfaction → Learning → Application → Impact → ROI
Value of information: lowest (Satisfaction) to highest (ROI)
Customer focus: consumer (Satisfaction) to client (ROI)
Frequency of use: frequent to infrequent
Difficulty of assessment: easy to difficult
Consumers: the customers who are actively involved in the process. Client: the customers who fund, support, and approve the project.

Evaluation Framework and Key Questions Levels of Evaluation Key Questions Answered Level 1: Reaction and Planned Action Was the program relevant to participants’ jobs and mission? Was the program important to participants’ job/mission success? Did the program provide new information? Do participants intend to use what they learned? Would participants recommend it to others? Is there room for improvement with facilitation, materials, and the learning environment?

Evaluation Framework and Key Questions Levels of Evaluation Key Questions Answered Level 2: Learning and Confidence Do participants know what they are supposed to do with what they learned? Do participants know how to apply what they learned? Are participants confident to apply what they learned? Did participants gain new knowledge, change their attitude, increase awareness?

Evaluation Framework and Key Questions Levels of Evaluation Key Questions Answered Level 3: Application and Implementation How effectively are participants applying what they learned? How frequently are they applying what they learned? If they are applying what they learned, what is supporting them? If they are not applying what they learned, why not?

Evaluation Framework and Key Questions Levels of Evaluation Key Questions Answered Level 4: Business Impact So what? To what extent does participant application of what they learned improve the measures the program was intended to improve? How did the program impact output, quality, cost, time, customer satisfaction, employee satisfaction, work habits? What were the consequences of participants’ application of knowledge and skills acquired during the program, process, intervention, change? How do we know it was the program that improved these measures?

Evaluation Framework and Key Questions Levels of Evaluation Key Questions Answered Level 5: ROI Do the monetary benefits of the improvement in business impact measures outweigh the cost of the program?

Chain of Impact Reaction & Planned Action Learning & Confidence Application & Implementation Isolate the Effects of the Program Impact ROI Intangible Benefits

Needs Assessment, Program Objectives, and Evaluation
Level 5: Payoff needs → ROI objectives → ROI evaluation
Level 4: Business needs → Impact objectives → Impact evaluation
Level 3: Job performance needs → Application objectives → Application evaluation
Level 2: Skills/knowledge needs → Learning objectives → Learning evaluation
Level 1: Preferences → Reaction objectives → Reaction/satisfaction evaluation

Matching Evaluation Levels with Objectives Reaction Learning Application Impact Return on Investment

The ROI Methodology
Evaluation Planning: develop objectives of solution(s); develop evaluation plans and baseline data.
Data Collection: collect data during solution implementation (Level 1: Reaction and Planned Action; Level 2: Learning and Confidence); collect data after implementation (Level 3: Application and Implementation; Level 4: Business Impact).

Data Analysis: isolate the effects; convert data to monetary value; identify intangible measures (Intangible Benefits); tabulate costs of solution; calculate the return on investment (Level 5: ROI).
Reporting: generate impact study.

Evaluation Planning Develop/Finalize Objectives Reaction Learning Application Impact ROI

Evaluation Planning Data Collection Plan Broad Program Objectives Measures Data Collection Method/Instruments Data Sources Timing Responsibilities

Evaluation Planning ROI Analysis Plan Data Items (Usually Level 4) Methods for Isolating the Effects of the Program/Process Methods of converting Data to Monetary Values Cost Categories Intangible Benefits Communication Targets for Final Report Other Influences/Issues during Application Comments

Evaluation Planning Project Plan Major Milestones Deliverables Timelines Flow

Data Collection During Program (method — applicable levels)
Surveys: Levels 1, 2
Questionnaires: Levels 1, 2
Observation: Levels 1, 2
Interviews: Levels 1, 2
Focus Groups: Levels 1, 2
Tests/Quizzes: Level 2
Demonstrations: Level 2
Simulations: Level 2

Data Collection Post Program (method — applicable levels)
Surveys: Levels 3, 4
Questionnaires: Levels 3, 4
Observations on the job: Level 3
Interviews: Level 3
Focus Groups: Level 3
Action planning/improvement plans: Levels 3, 4
Performance contracting: Levels 3, 4
Performance monitoring: Levels 3, 4

Isolating the Effects of the Program Use of control groups Trend line analysis Forecasting methods Participant’s estimate Management’s estimate of impact (percent) Use of experts/previous studies Calculate/Estimate the impact of other factors Customer input

Isolating the Effects of the Program — best-practice use, listed by credibility (percentages exceed 100% because multiple methods are used):
Comparison group analysis: 35%
Trend/forecasting analysis: 20%
Expert estimation: 50%
Other: 20%
Notes: Which techniques are appropriate in your organization?

Example - Use of Control Groups Customer Service Compensation Six sites chosen for program evaluation Each site had a control group and an experimental group randomly selected Experimental group received new plan - control group did not Observed performance for both groups at the same time

Use of Trend Line Analysis — Shipment Productivity (percent of schedule shipped, Jan-Jan): pre-program average 87.3%; trend projection at evaluation 92.3%; actual average after team implementation 94.4%. [Chart: monthly data with the pre-program trend line projected past program start]

Example of a Participant’s Estimation
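In the ROI Methodology, a participant's estimate is adjusted by the participant's own confidence in it, keeping the attributed figure conservative. A hypothetical sketch (all numbers invented):

```python
# Participant-estimation sketch: claimed contribution x confidence.
# All numbers are hypothetical.

monthly_improvement = 20_000  # observed Level 4 improvement ($/month)
est_contribution = 0.60       # participant: "60% of this was the program"
confidence = 0.80             # participant is 80% confident in that estimate

# Error-adjusted (conservative) amount attributed to the program
adjusted = monthly_improvement * est_contribution * confidence
print(f"${adjusted:,.0f}/month attributed to the program")  # $9,600/month
```

Discounting by confidence means an uncertain 60% claim counts for less than a confident one, which is the conservative stance the guiding principles call for.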

Converting Data to Money Profit/savings from output (standard value) Cost of quality (standard value) Employee time as compensation (standard value) Historical costs/savings from records Expert input External studies Linking with other measures Participant estimation Management Estimation Estimation from staff

Converting Data to Money — credibility vs. resources needed: standard values offer high credibility at low resource cost; records/reports analysis and databases fall in between; expert estimation offers moderate credibility.

Example of Converting Data Using an External Database — cost of one turnover (middle manager): annual salary of $70,000 × turnover cost of 150% of salary = total cost of turnover $105,000. (External data: value obtained from an industry-related study.)

Cost of a Sexual Harassment Complaint — 35 complaints. Actual costs from records (legal fees, settlements, losses, material, direct expenses) plus additional estimated costs from staff (EEO/AA staff time, management time) total $852,000 annually. Cost per complaint = $852,000 ÷ 35 = $24,343.
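The two data-to-money conversions above can be sketched in a few lines (figures taken from the slides):

```python
# Turnover: external-study value expressed as a multiple of annual salary
salary = 70_000
turnover_multiple = 1.50  # turnover costs 150% of salary
cost_per_turnover = salary * turnover_multiple  # 105,000

# Complaints: total annual cost (records + staff estimates) / complaint count
total_annual_cost = 852_000
complaints = 35
cost_per_complaint = total_annual_cost / complaints  # ~24,343

print(f"${cost_per_turnover:,.0f} per turnover")    # $105,000 per turnover
print(f"${cost_per_complaint:,.0f} per complaint")  # $24,343 per complaint
```

Once a unit value like this exists, any Level 4 change (turnovers avoided, complaints reduced) converts directly to money for the ROI calculation.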

Example of Linkage with Other Measures — a compelling place to work → a compelling place to shop → a compelling place to invest. Employee attitudes about the job and the company drive behavior; customer impression of service (helpfulness, merchandise value) drives customer retention and recommendations; these in turn drive return on assets, operating margin, and revenue growth. A 5-unit increase in employee attitude drives a 1.3-unit increase in customer impression, which drives a 0.5-unit increase in revenue growth.

Tabulating Program Costs
Direct: program materials, facilitator costs, facilities, travel.
Indirect: needs assessment, program development, participant time, administrative overhead, evaluation.

Intangible Benefits: stress, teamwork, complaints, customer service, commitment, conflicts, engagement, job satisfaction

ROI Process Flexibility Look Forward Pre-program ROI forecast End-of-program ROI estimation Examine Accomplishments Application data (Level 3) Impact data (Level 4)

Do Not Confuse the CFO
ROI - Return on Investment . . . not Information, Intelligence, Inspiration, or Involvement
ROE - Return on Equity . . . not Expectation
ROA - Return on Assets . . . not Anticipation
ROCE - Return on Capital Employed . . . not Client Expectation

Common Target Audiences
Reason for Communication | Primary Target Audience
Secure approval for program | Client, top executives
Gain support for the program | Immediate managers, team leaders
Build credibility for the training staff | Top executives
Enhance reinforcement of the program | Immediate managers
Enhance results of future programs | Participants
Show complete results of the program | Key client team
Stimulate interest in HR programs | Top executives
Demonstrate accountability for client expenditures | All employees
Market future HR programs | Prospective clients

Select Media
Impact studies: full report, executive summary, general overview, one-page summary
Meetings: executive meetings, manager meetings, staff meetings, panel discussions, best-practice meetings
Internal publications: announcements, bulletins, newsletters, magazines
Progress reports: schedules, preliminary results, memos
Case studies
Program brochures
Scoreboards
Electronic media: e-mail, web sites, video blogs

Implementation Issues Resources (staffing / budget) Leadership (individual, group, cross functional team) Timing (urgency, activities) Communication (various audiences) Commitment (staff, managers, top executives) Notes: ____________________________________

Key Implementation Actions
Determine/establish responsibilities
Develop skills/knowledge with ROI
Develop transition/implementation plan
Conduct ROI studies
Prepare/revise evaluation policy, procedures, and guidelines
Train/brief managers on the ROI process
Communicate progress/results

Retail Merchandise Company

Utility Services Company

Utility Services Company - Business Impact
Measure | Monthly Improvement in Six Months (A) | Percent Contribution from Team Building (B) | Average Confidence Estimate (C) | Adjusted Improvement (A x B x C)
Productivity | 23% | 57% | 86% | 11.3%
Quality | 18% | 38% | 74% | 5%
Efficiencies | 14.5% | 64% | 91% | 8.4%

Utility Services Company
Program costs for 18 participants = $54,300
Annualized first-year benefits:
Productivity $197,000
Quality $121,500
Efficiency $90,000
Total $408,500
ROI = ($408,500 - $54,300) / $54,300 x 100 = 652%
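The Utility Services figures can be reproduced directly; a sketch applying the A x B x C adjustment from the business-impact table and the standard ROI formula:

```python
# Adjusted improvement = raw improvement x program contribution x confidence
def adjusted_improvement(change: float, contribution: float, confidence: float) -> float:
    return change * contribution * confidence

def roi_percent(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs * 100

# Productivity row of the table: 23% x 57% x 86% ~= 11.3%
productivity = adjusted_improvement(0.23, 0.57, 0.86)

# Annualized first-year benefits vs. program costs for 18 participants
benefits = 197_000 + 121_500 + 90_000   # $408,500
roi = roi_percent(benefits, 54_300)     # ~652%
```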

Matching Exercise: The Twelve Guiding Principles of ROI

Regional Public Utility

Level 3 and 4 Objectives Provide:
Direction to designers and developers
Guidance to instructors and facilitators
Goals for participants
Satisfaction for program sponsors
A framework for evaluators
Notes: Explain each of the above:
Developers: ____________________________________
Facilitators: ____________________________________
Participants: ____________________________________
Sponsors: ____________________________________
Evaluators: ____________________________________

Linking Needs Assessment, Program Objectives, and Evaluation
Needs Assessment | Program Objectives | Evaluation
5 Payoffs (potential ROI) | ROI objectives | ROI (5)
4 Business needs | Impact objectives | Impact (4)
3 Job performance needs | Application objectives | Application (3)
2 Skills/knowledge needs | Learning objectives | Learning (2)
1 Preferences | Satisfaction/reaction objectives | Reaction (1)

Linking Needs Assessment with Evaluation (Absenteeism Example)
Needs: (4) an absenteeism problem exists; (3) discussions between team leader and supervisor are not occurring when there is an absence; (2) deficiency in counseling/discussion skills; (1) supervisor prefers to attend training one day per week
Program objectives: (4) weekly absenteeism rate will reduce; (3) counseling discussions conducted in 95% of situations when an unexpected absence occurs; (2) counseling/discussion skills will be acquired/enhanced; (1) program receives a favorable rating of 4 out of 5 on the structure of the program
Evaluation: (4) monitor absenteeism data for six months; (3) follow-up questionnaire to participants to check frequency of discussions at three months; (2) skill practice sessions during the program; (1) reaction questionnaire at the end of the program

Program Business Alignment and Forecasting - The V-Model (ROI Process Model)
Start here (initial analysis): 5 payoff needs, 4 business needs, 3 performance needs, 2 learning needs, 1 preference needs
Objectives: ROI, impact, application, learning, reaction
End here (measurement and evaluation): reaction, learning, application, impact, ROI

Program Alignment V-Model (Needs → Objectives → Evaluation)
Start here (initial analysis):
5 Payoff needs: absenteeism is costing $10,000 monthly → ROI objectives → ROI
4 Business needs: unexpected absenteeism is 9% and increasing; greater than benchmark of 5% → impact objectives → impact
3 Job performance needs: discussions between team member and supervisor are not occurring when there is an unplanned absence → application objectives → application
2 Learning needs: deficiency in counseling/discussion skills → learning objectives → learning
1 Preference needs: one-day counseling skills workshop must provide usable, necessary, and relevant skills; facilitator-led; participants are supervisors → reaction objectives → reaction
End here (measurement and evaluation)


Nissan Motor Manufacturing Company

Wachovia Bank

Metro Hospital

Regional Health Center

Department of Internal Affairs

International Car Rental

Results-Based Approach: Performance Assessment and Analysis Process
Level 5: problem/opportunity present or anticipated
Level 4: identify business needs, gaps, and stakeholders
Level 3: identify job performance needs, gaps, and why; identify solutions; identify transfer strategy options and Level 2/3 support for all stakeholders
Levels 3 and 2: specify skill/knowledge deficiencies of affected population; is training required?
Level 2: design solution and stakeholder components; consider resources/logistics/delivery; develop content/materials
Level 1: identify preferences; develop objectives/evaluation strategy
Implementation: implement pre-activity; conduct/implement solution; implement transfer strategy
Each level includes: data sources, data collection, key questions, key issues

Phillips ROI Methodology (TM)
Collect data after solution implementation (Levels 3 and 4) → isolate the effects of the solution → convert data to monetary value → tabulate costs of solution → calculate the return on investment (Level 5) → identify intangibles (intangible benefits) → develop report and communicate results
Significant influences: policy statement, procedures and guidelines, staff skills, management support, technical support, organizational culture

Key Alignment Questions - Level 5 (Payoffs / ROI)
Needs: Is this a problem worth solving? Is there a potential payoff?
Evaluation: What is the actual ROI? What is the BCR?

Key Alignment Questions - Level 4 (Business / Impact)
Needs: What is the specific measure? What happens if we do nothing?
Evaluation: Which business measure improved? How much is related to the program?

Key Alignment Questions - Level 3 (Job Performance / Application)
Needs: What is occurring or not occurring on the job that influences the business measure?
Evaluation: What has changed? Which skills/knowledge have been applied?

Key Alignment Questions - Level 2 (Skills/Knowledge / Learning)
Needs: What skills or knowledge are needed to support the job performance need?
Evaluation: What did they learn? Who did they meet?

Key Alignment Questions - Level 1 (Preferences / Reaction)
Needs: How should the solution be structured?
Evaluation: What was the reaction to the program? Do we intend to implement the program?

Developing Reaction Objectives
Objective: At the end of the course, participants will perceive program content as relevant to their jobs.
Measure: 80% of participants rate program relevance a 4.5 out of 5 on a Likert scale.

Developing Learning Objectives
Objective: At the end of the course, participants will be able to use Microsoft Word.
Measure: Within a 10-minute period, the participant will demonstrate the following Microsoft Word applications to the facilitator with zero errors:
File: Save As, Save as Web Page
Format: font, paragraph, background, and themes
Insert tables; add columns and rows; delete columns and rows

Developing Application Objectives
Objective: Participants will use effective meeting behaviors.
Measures: Participants will develop a detailed agenda outlining the specific topics to be covered for 100% of meetings; establish meeting ground rules at the beginning of 100% of meetings; follow up on meeting action items within three days following 100% of meetings.

Developing Impact Objectives
Objective: Increase market share.
Measure: Increase market share among young professionals by 10% within nine months of the new ad launch.
Objective: Improve the quality of the X-1350.
Measures: Reduce the number of warranty claims on the X-1350 by 10% within six months after the program; improve overall customer satisfaction with X-1350 quality by 10%, as indicated by a customer satisfaction survey taken six months after the program; achieve top scores on product quality measures included in the industry quality survey.

Developing Level 3 and 4 Objectives

Evaluation Targets (Large Telecommunications Company)

Criteria For Selecting Programs For Level 3 Evaluation Significant gaps in performance suspected Safety and health of employees at risk Learning transfer significantly important to customer service / satisfaction goals Learning transfer significantly important to success of company strategic initiatives Pilot program delivered

Criteria for Selecting Programs for Level 4 & 5 Evaluation The life cycle of the program The linkage of the program to operational goals and issues The importance of the program to strategic objectives The cost of the program Visibility of the program The size of the target audience The investment of time A comprehensive needs assessment is conducted Top executives are interested in the evaluation

Evaluation Planning Meeting Who Should Be Involved? Program owner Program designer Program analyst Program facilitator Business unit partner Subject matter expert Typical participant

Evaluation Planning Meeting Factors For Success Credible sources Access to data Complete coverage Move quickly Consider outputs to be drafts Sponsor sign-off

Evaluation Planning Meeting Agenda Explain purpose Finalize/adjust objectives Complete data collection plan Complete ROI analysis plan step by step Compile ROI project plan step by step

Global Financial Services, Inc.

Data Analysis and Results Results at Level 1 Rating of 4.23 out of 5 achieved on the relevance of ACT! for specific job applications. 92.5% of participants indicated an intention to use ACT! within two weeks of the workshop.

Data Analysis and Results Results at Level 2 83% of the participants scored 75 or better on the ACT! use test. Participants successfully demonstrated an average of 4.2 out of 5 key features of ACT! which are . . .

Key Features of Act! Enter a new contact Create a mail-merge document Create a query Send an e-mail Create a call report

Data Analysis and Results Results at Level 3 Participants indicated that within 10 days, 92% of new customer prospects are entered into the system. Participants report an increase in the number of planned follow-up contacts with customers. An unscheduled audit of daily use resulted in a score of 76 out of a possible 100.

Data Analysis and Results - Results at Level 4
Impact Measure | Average Monthly Change | Contribution of ACT! | Annual Value
Customer complaints | down 24.2 | 43% | $575,660
Customer response time | down 18 minutes per customer | 72% | N/A
Sales to existing customers | up $321,000 | 14% | $539,280
Customer satisfaction | up 26% | 49% | N/A
Total: $1,114,940

Project Costs
Development costs: $10,500
Materials/software: $18,850
Equipment: $6,000
Instructor (including expenses): $7,200
Facilities/food/refreshments (60 @ $58): $3,480
Participant time (lost opportunity, 58 @ $385): $22,330
Coordination/evaluation: $15,600
Total: $83,960

ROI Calculation
ROI (%) = ($1,114,940 - $83,960) / $83,960 x 100 = 1,228%
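The Level 4 table and cost summary tie out; a sketch reproducing the calculation (the annual sales value is the monthly gain annualized and adjusted by the 14% attribution to ACT!):

```python
# ROI for the ACT! implementation, from the Level 4 results and project costs.
annual_sales_value = 321_000 * 12 * 0.14   # monthly sales gain x 12 months x 14% attribution
benefits = 575_660 + annual_sales_value    # complaints value + sales value = $1,114,940
costs = 83_960

roi = (benefits - costs) / costs * 100
print(round(roi))  # -> 1228
```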

Program Profile Title: Interactive Selling Skills Target Group: Sales Associates in Electronics Vendor Produced and Delivered 3 Days - (2 Days Plus 1 Day) Significant Use of Skill Practices 3 Groups Trained (48 Participants from 3 Stores)

ROI Analysis Profile Post Program Data Collection (4) Performance Monitoring 3 months (3) Questionnaire 3 months (3) Program Follow-up Session 3 weeks (last session) Isolating the Effects of Training Control Group Arrangement Participant’s Estimate (For Back-up) Converting Data to Monetary Values Profit Contribution of Increased Output

Level 1 - Selected Data
Success with objectives: 4.3
Relevance of material: 4.4
Usefulness of program: 4.5
Exercises/skill practices: 3.9
Overall instructor rating: 4.1

Level 2 - Selected Data
All participants demonstrated that they could use the skills successfully.

Level 3 - Selected Data (2 Questions out of 20) I utilize the skills taught in the program Frequency of use of skills

Level 4 Data: Average Weekly Sales, Post-Training
Weeks After Training | Trained Groups | Control Groups
1 | $9,723 | $9,698
2 | $9,978 | $9,720
3 | $10,424 | $9,812
13 | $13,690 | $11,572
14 | $11,491 | $9,683
15 | $11,044 | $10,092
Average for weeks 13-15 | $12,075 | $10,449

Annualized Program Benefits
46 participants were still in the job after 3 months.
Average weekly sales per employee: trained groups $12,075; untrained groups $10,449; increase $1,626
Profit contribution (2% of store sales): $32.50
Total weekly improvement (x 46): $1,495
Total annual benefits (x 48 weeks): $71,760
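The benefit chain can be checked step by step; note that the case rounds the 2% profit contribution ($32.52 exactly) down to $32.50 before scaling, which yields the $71,760 total:

```python
weekly_increase = 12_075 - 10_449     # trained minus control weekly sales: $1,626
profit_per_week = 32.50               # case's rounded 2% of $1,626 (exact value: $32.52)
weekly_total = profit_per_week * 46   # 46 participants still in the job
annual_benefits = weekly_total * 48   # 48 working weeks per year
print(annual_benefits)  # -> 71760.0
```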

Cost Summary (48 participants in 3 courses)
Facilitation fees: 3 courses @ $3,750 = $11,250
Program materials: 48 @ $35/participant = $1,680
Meals/refreshments: 3 days @ $28/participant = $4,032
Facilities: 9 days @ $120 = $1,080
Participant salaries plus benefits (35% factor): $12,442
Coordination/evaluation: $2,500
Total costs: $32,984

Level 5 Data
BCR = ______ / ______ = ______
ROI (%) = ______ / ______ x 100 = ______

ROI Example: Retail Merchandise Company
Collecting post-program data: follow-up session, questionnaire, performance monitoring
Isolating the effects of the program: control groups, participants' estimates
Converting data to monetary value: standard values → $71,760
Tabulating program costs: $32,984
Calculating the return on investment: 118%
Identifying intangible benefits
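The Level 5 figures follow directly from the case's benefits and costs; a sketch of the benefit-cost ratio and ROI calculations:

```python
# Retail merchandise case: annual benefits and fully loaded costs.
benefits, costs = 71_760, 32_984

bcr = benefits / costs                  # benefit-cost ratio, ~2.18
roi = (benefits - costs) / costs * 100  # ~118%
```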

The ROI Process Takes A Balanced View by Measuring And Reporting: Reaction to program Learning and attitudes Application on the job Impact in work unit Impact on the customer The financial results Intangible benefits Nature and source of problems and opportunities

The Business Case for EI

Self Test: How Results-Based Are Your Human Resources Programs?

Data Collection Issues Objectives Type of data Instruments Methods Sources of data Timing of collection Responsibilities

Collecting Program Data
Method | Levels
Surveys | 1, 2, 3
Questionnaires | 1, 2, 3, 4
Observation | 2, 3
Interviews with participants | 1, 2, 3
Focus groups | 1, 2, 3
Tests | 2
Action planning | 3, 4
Performance contracting | 3, 4
Performance monitoring | 4

Classic Evaluation Instruments Questionnaires Surveys Tests Interviews Focus Groups Observation Performance Records

Applications of Data Collection Instruments Matching Exercise Focus groups Observation Performance Records Survey Test Questionnaire Interview

Data Collection Exercise Part 1 Program: 3-day Leadership Workshop Audience: 50 Middle Level Managers Level 3 Objectives: Apply 11-step goal setting process with each employee three months after workshop Apply techniques that influence motivational climate within three months Apply techniques that inspire teamwork Apply coaching techniques to enhance employee engagement Level 4 Objective: Improve business measures, important to your work unit

Survey/Questionnaire Design Determine the specific information needed Review information with stakeholders Select the type(s) of questions Keep questions and statements simple Develop the questions Design for easy tabulation and analysis and . . .

Survey/Questionnaire Design Check the reading level Address the anonymity issue Test the questions Review results of the field test Develop the completed questionnaire Develop administrative procedures

Common Mistakes in Survey/Questionnaire Design Vague statements/questions Too many questions Reading level too high Improperly worded questions Confusing instructions Too difficult to analyze

Questionnaire Design Checklist Is the overall length appropriate? Is it a valid instrument? Is it a reliable instrument? Do the questions flow properly? Are the types of questions appropriate for the information desired?

Questionnaire Design Checklist Are the questions designed to take advantage of data comparisons? Is it designed to minimize distortion? Are the questions designed to ease data tabulation and analysis? Have administrative issues been addressed? Is it easy to read?

Questionnaire Design Checklist Are the instructions clear? Have steps been taken to ensure confidentiality? Have provisions been made for demographic data? Is the appearance of the questionnaire adequate? Is a pre-test scheduled?

Selecting Survey Scales Variance – are there enough choices? Discrimination – can you tell the difference between choices? Accuracy – do the scale labels accurately describe the choices? Symmetry – is the scale balanced appropriately?

What Makes an Effective Survey Question? Focus – every question should focus on a single issue or specific topic Brevity – short questions present less opportunity for measurement error Clarity – clear questions are understandable to all respondents

Types of Tests Objective Criterion reference tests Norm referenced Performance tests

Types of Objective Tests True/false Matching items Multiple choice items Fill in the blank items Short answer items Essay items

Steps to Developing Objective Tests Focus on one set of related course objectives at a time Determine behavioral evidence of capability related to these objectives Select a format and an item type that fits the objectives Develop 3 to 5 items for each objective Sequence items in a logical order Prepare test instructions that are simple and easy to understand Pilot test

Structured and Unstructured Interview Design
List basic questions to be asked.
Follow the same principles as survey/questionnaire design.
Allow for probing.
Try out the interview.
Prepare the interviewers.
Provide instructions to the individual being interviewed.
Administer the interviews consistently.

Focus Group Guidelines Select topics, questions, and strategy carefully. Keep the group size small. Ensure that there is a representative sample of the target population. Insist on facilitators having appropriate expertise. Stay on track and on time Allow equal time for all participants Control over-talking and under-talking

Observation Guidelines Observations should be systematic Observers should know how to interpret and record what they see Observer’s influence should be minimized Observers must be carefully selected Observers must be prepared

Observation Methods Behavior Checklist Coded Behavior Record Delayed Report Method Video Recording Audio Monitoring Computer Monitoring (software)

Typical Sources of Performance Data Operating reports Departmental reports Work unit audits Key performance indicators Six Sigma reports Scorecards Dashboards

Monitoring Performance Data Identify appropriate data sources. Collect data related to objectives only. Develop new data as needed. Convert current data to usable items. Develop a collection plan to include Who, What, Where, and When.

Characteristics of Effective Instruments Valid Reliable Simple Economical Easy to administer Easy to analyze data

Factors to Consider When Selecting Data Collection Methods Time required for participants Time required for participant’s supervisor Costs of method Amount of disruption of normal activities Accuracy Utility Culture / Philosophy

Sources of Information for Program Evaluation Participants Supervisors of participants Subordinates of participants Peer group Internal staff External Group Organizational performance records

Factors to Consider When Determining Timing of Follow-Up Availability of data Ideal time for behavior change (level 3) Ideal time for business impact (level 4) Convenience of collection Constraints on collection

Data Collection Exercise Program: 3-Day Leadership Workshop Audience: 50 Middle Level Managers (2 Groups) Follow Up: Anonymous questionnaire in 3 months to collect application and impact data Assignment: 1. What topics should be included in the questionnaire?

Cyber International

Sales Culture at Progress Bank

Developing ROI with Action Planning Communicate the action plan requirement early. Describe the action planning process at the beginning of the intervention. Teach the action planning process. Allow time to develop the plan. Have the facilitator approve the action plan. Require participants to assign a monetary value for each improvement. and . . .

Developing ROI with Action Planning Ask participants to isolate the effects of the program. Ask participants to provide a level of confidence for estimates. If possible, require action plans to be presented to the group. Explain the follow-up mechanism. Collect action plans at the pre-determined follow-up time. Summarize the data and calculate the ROI.

Performance Contract Process Steps
The participant and supervisor mutually agree on a subject for improvement.
A specific, measurable goal (or goals) is set.
The learner participates in the program.
The contract is discussed, and plans are developed to accomplish the goals.
Notes: What key points should be included in a performance contract for a sales manager to attend a 3-day sales management workshop? ____________________________________ ____________________________________ and . . .

Performance Contract Process Steps
After the program, the participant works on the contract against a specific deadline.
The participant reports the results of the effort to his or her supervisor.
The supervisor and participant document the results for the staff.
Notes: What key points should be included in a performance contract for a sales manager to attend a 3-day sales management workshop? ____________________________________ ____________________________________

The Performance Contract Should Be: Written Understandable (by all involved) Challenging (requiring a concentrated effort to achieve) Achievable Largely under the control of the participant Measurable and dated Notes: Can you add other items to the list? ____________________________________ ____________________________________

Response Rate Exercise
Program: 3-Day Leadership Workshop
Audience: 50 Middle-Level Managers (2 Groups)
Follow-up: anonymous 5-page questionnaire in 3 months to collect application and impact data
Assignment:
1. How many responses do you need?
2. How will you ensure you receive the appropriate number of responses?

Follow-Up Questionnaire Checklist Progress with objectives Action plan implementation Relevance of program Perceived value Use of materials Knowledge/skill enhancement Skills used Changes with work Linkage with output measures Notes: ____________________________________

Follow-Up Questionnaire Checklist Other Benefits Barriers Enablers Management support Other solutions Recommendations for target audience Suggestions for improvement Other comments Notes: ____________________________________

Follow-Up Questionnaire Checklist (OPTIONAL) Improvements/accomplishments Improvement linked with program Monetary impact Confidence level Notes: ____________________________________

Impact Questions for Follow-Up Evaluation How did you use the material from this program? What influence did it have in your work? Team? What is the specific measure influenced? Define it. What is the unit value of the measures? (Profit or Cost) What is the basis of this value? How much did the measure change since the program was conducted? and . . .

Impact Questions for Follow-Up Evaluation What is the frequency of the measure (daily, weekly, monthly, etc.)? What is the total annual value of the improvement? What other factors could have caused this total improvement? What percent of the total improvement can be attributed to this program? What is your confidence estimate for the above data? (0% = no confidence; 100% = certainty)

Performance Contract Sample

Option 1, When You Don’t Have a Clue Option 2, When the Measure is in a Defined Set Option 3, When the Measure is Known

Increasing Response Rates Provide advance communication Clearly communicate the reason for the questionnaire Indicate who will see the results Show how the data will be integrated Keep the questionnaire simple and brief Make it easy to respond Use the local manager to help distribute the questionnaires and show support Let the target audience know that they are part of a carefully selected sample Notes: Can you add to this list? ____________________________________ and . . .

Increasing Response Rates Use one or two follow-up reminders Have the introduction letter signed by a top executive Enclose a giveaway item with the questionnaire Provide an incentive for quick response Send a summary of results to target audience Distribute questionnaire to a captive audience Consider an alternative distribution channel Have a third party gather and analyze data. Notes: Can you add to this list? ____________________________________ and . . .

Increasing Response Rates Communicate the time limit Consider paying for the time it takes to complete the questionnaire Review the questionnaire at the end of the formal session Carefully select the survey sample Allow completion of the survey during work hours Add emotional appeal and . . .

Increasing Response Rates Design questionnaire to attract attention, with a professional format Let participants know what actions will be taken with the data Provide options to respond Use a local coordinator to help distribute and collect questionnaires Frame questions so participants can respond appropriately and make the questions relevant

First Bank Is this situation unusual? Please explain. Should the CEO drop the issue? What are some approaches to resolve this dilemma? What would you do?

Isolating the Effects of a Program Matching Exercise Control group Trend line analysis Forecasting Participant’s estimate Use of customer input Expert estimates

Several Factors Contribute to an Improvement After a Program Is Conducted
Total improvement after the program reflects external factors, management attention, incentives, systems/procedures changes, and HR programs; the effect of HR on the improvement must be isolated from the other factors.

Techniques to Isolate the Effects of Programs Use of a control group arrangement Trend line analysis of performance data Use of forecasting methods of performance data Participant’s estimate of impact (percent) Supervisor’s estimate of impact (percent) Management’s estimate of impact (percent) Use of experts/previous studies Calculating/estimating the impact of other factors Use of customer input

Financial Services What are the major problems with the implementation of a control group arrangement illustrated in this case? How can these problems be tackled on a practical basis? Will the same strategy of using control groups work at your organization? Explain.

Use of Control Groups: Customer Service Training
Six sites chosen for program evaluation
Each site had a control group and an experimental group, randomly selected
Experimental group received training; control group did not
Customer service data collected for both groups at the same time

Control Group Design
Control group: M1 → (no program) → M2
Experimental group: M1 → Program → M2

Post-Test Only, Control Group Design
Control group: → Measurement
Experimental group: Program → Measurement

Ideal Experimental Design
Group A: M1 → Program → M2
Group B: M1 → (no program) → M2
Group C: → Program → M3

Control Group Problems It is inappropriate in many settings Selection of groups Contamination of control group Duration / timing Influences are inconsistent Too research-based for some organizations
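Despite those problems, the arithmetic of a control group arrangement is simple: the program effect is the difference between the experimental and control group means over the same period. A sketch using the weeks-13-to-15 weekly sales figures from the retail merchandise case earlier:

```python
def mean(values):
    return sum(values) / len(values)

# Average weekly sales, weeks 13-15 (retail merchandise case)
experimental = [13_690, 11_491, 11_044]  # trained stores
control      = [11_572, 9_683, 10_092]   # untrained stores

effect = mean(experimental) - mean(control)  # improvement attributable to the program
print(round(effect))  # -> 1626
```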

Using Pre-Program Data as a Base - Micro Electronics, Inc. (reject rate by month)
Pre-program average: 1.85%
CPI program conducted
Projected average, using pre-program data as a base: 1.45%
Post-program six-month average: 0.7%

Questions for Discussion Approximately what improvement in reject rate has resulted from the program? How reliable is this process? When can this process be used?

Formal Internal Complaints of Sexual Harassment - Healthcare, Inc.
Complaints plotted over time, with the sexual harassment prevention program conducted mid-series; the chart shows the pre-program average, the projected value from the pre-program trend, and a post-program average that falls below the projection.

Use of Trend Line Analysis: Shipment Productivity (percent of schedule shipped, by month)
Pre-program average: 87.3%
Team training program conducted
Average of trend projected: 92.3%
Actual post-program average: 94.4%

Conditions for Trend Line Analysis Use Pre-program data available Data items are stable Pre-program influences expected to continue No new influences enter the post-program period except for program
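Under those conditions, the projection is a least-squares line fitted to the pre-program months and extended into the post-program period; the program effect is the actual post-program average minus the projected value. A sketch with hypothetical monthly data (the individual monthly figures behind the shipment chart are not recoverable from the slide):

```python
def fit_trend(ys):
    """Ordinary least-squares slope and intercept for y over x = 0..n-1."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
             / sum((x - mean_x) ** 2 for x in range(n)))
    return slope, mean_y - slope * mean_x

pre_program = [85.0, 86.0, 86.5, 87.5, 88.0, 88.8]  # hypothetical % shipped, months 1-6
slope, intercept = fit_trend(pre_program)

projected = intercept + slope * 11  # trend extended to month 12 (x is zero-based)
effect = 94.4 - projected           # actual post-program average minus projection
```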

Woody’s What is the impact of the sales training program on sales? Is this process feasible in your organization? Explain.

Forecasting Example (monthly sales chart)
Forecast model: Y = 140 + 40x
Pre-program level: $1,100
Forecast value, including the effect of advertising: $1,340 - impact of advertising: $240
Actual value after the program: $1,500 - impact of training program: $160
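The chart's arithmetic can be reproduced with the model shown; the month index x = 30 is an assumption about where the post-program point falls on the axis:

```python
def forecast(x):
    # Forecasting model shown on the chart: Y = 140 + 40x
    return 140 + 40 * x

forecast_value = forecast(30)                # $1,340 at the assumed month index
training_impact = 1_500 - forecast_value     # actual minus forecast -> $160
advertising_impact = forecast_value - 1_100  # forecast minus pre-program level -> $240
```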

National Bank

Monthly Increase: 175 New Accounts
Contributing Factors | Average Impact on Results | Average Confidence Level
Sales training program | 32% | 83%
Incentive systems | 41% | 87%
Goal setting/management emphasis | 14% | 62%
Marketing | 11% | 75%
Other | 2% | 91%

Questions for Discussion What is the number of new credit card accounts per month that can be attributed to the sales training program? Is this a realistic process for estimating the impact of the program on the increased sales? How could this process be improved?

Using Estimates to Isolate the Effects of a Program Describe the task and the process. Explain why the information was needed and how it will be used. Ask participants to identify any other factors that may have contributed to the increase. Have participants discuss the linkage between each factor and the specific output measure. and . . .

Using Estimates to Isolate the Effects of a Program Provide participants with any additional information needed Obtain the actual estimate of the contribution of each factor. The total must be 100%. Obtain the confidence level from each employee for the estimate for each factor (100%=certainty; 0%=no confidence). The values are averaged for each factor.
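Applied to the National Bank figures from the earlier slide, the confidence adjustment looks like this (a minimal sketch; the factor names and percentages come from that slide):

```python
# Confidence adjustment applied to the National Bank estimates: each
# factor's average impact is multiplied by the average confidence level
# before any monetary credit is taken.

monthly_increase = 175  # new credit card accounts per month

factors = {
    # factor: (average impact on results, average confidence level)
    "Sales Training Program":           (0.32, 0.83),
    "Incentive Systems":                (0.41, 0.87),
    "Goal Setting/Management Emphasis": (0.14, 0.62),
    "Marketing":                        (0.11, 0.75),
    "Other":                            (0.02, 0.91),
}

for name, (impact, confidence) in factors.items():
    adjusted = monthly_increase * impact * confidence
    print(f"{name}: {adjusted:.1f} accounts/month")
```

The sales training program is thus credited with 175 x 32% x 83%, about 46 new accounts per month; note that the unadjusted impacts are constrained to total 100% across factors.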

The Power of Estimates Research Comparison with other methods Handling objections Management reactions Participant reactions

Key Issues with Estimates Use as a last resort Use most credible source for data Collect data in an unbiased way Adjust for error Report it carefully

Credibility of Data Which of these items have the most credibility? Rank them. Why are these items credible or not credible? List all the factors that influence the credibility of data. Why are we uncomfortable using estimates in our programs?

Credibility of Outcome Data is Influenced by the: Reputation of the source of the data Reputation of the source of the study Motives of the researchers Personal bias of audience Methodology of the study Assumptions made in the analysis Realism of the outcome data Type of data Scope of analysis

Other Isolation Methods Supervisors Managers Experts Previous studies Customers

Use of Participants’ & Managers’ Estimates of Training’s Impact

Factor                                                                    Participants   Managers
ISDN knowledge, skills, or experience graduates had before the training   13%            14%
ISDN knowledge, skills, or experience graduates gained from the training  37%            36%
ISDN knowledge, skills, or experience graduates acquired on their own
after the training                                                        16%            12%
ISDN reference material or job aids unrelated to the training (e.g.,
bulletins, methods & procedures documentation)                             7%             9%
Coaching or feedback from peers                                           18%            18%
Coaching or feedback from graduates’ managers                              2%             5%
Observation of others                                                      7%             6%

National Computer Company (A)

Questions for Discussion Is this an appropriate opportunity for using a control group? Explain. What factors should be considered when selecting the groups? What other options should be explored? When should the attempt to use control groups be abandoned?

National Computer Company (B)

National Computer Company (B): Voluntary Turnover Rate (chart). Monthly rate (36%-42% range) plotted January through October, with the point of program implementation marked.

Questions for Discussion Can a trend line analysis be used? What conditions must be met for this approach to be used? How credible is this approach?

National Computer Company (C)

National Computer Company (C): Voluntary Turnover Rate vs. Unemployment Rate (chart). Turnover (30%-38%) plotted against the unemployment rate (4%-7%), fitted with the line Y = 50 - 3(X).

Questions for Discussion How can this data be used to isolate the effects of the HR program? How much of a reduction in voluntary turnover is attributed to the increase in the unemployment rate? What cautions and concerns should be considered?
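The fitted relationship can be used mechanically as follows; the before/after turnover and unemployment rates here are hypothetical, chosen only to illustrate the isolation step:

```python
# Isolating the HR program's effect with the fitted relationship
# Y = 50 - 3(X): the share of the turnover drop explained by the change
# in unemployment is removed before crediting the program.

def expected_turnover(unemployment_pct):
    return 50 - 3 * unemployment_pct

# Hypothetical before/after values for illustration
unemp_before, unemp_after = 4.0, 6.0
turnover_before, turnover_after = 38.0, 30.0

explained_by_economy = expected_turnover(unemp_before) - expected_turnover(unemp_after)
total_drop = turnover_before - turnover_after
attributed_to_program = total_drop - explained_by_economy

print(f"total drop: {total_drop:.0f} points")
print(f"explained by unemployment: {explained_by_economy:.0f} points")
print(f"left to attribute to the program: {attributed_to_program:.0f} points")
```

In this sketch the two-point rise in unemployment explains six points of the eight-point drop, leaving two points as the upper bound on the program's effect (before any other factors are considered).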

National Computer Company (D)

Contributing Factors    Average Impact on Results   Average Confidence Level
HR Program              30%                         80%
Unemployment rate       50%                         100%
Management Emphasis      5%                         70%
Competition             15%                         90%

Questions for Discussion Who should provide the input on this isolation estimate? How should the data be collected? What makes this process credible? What makes this process not so credible?

Wisdom of Crowds In this case, the average estimate is near perfect Estimates are used everywhere Set up your own experiment Estimates should be adjusted Estimates are okay – defend them; don’t prefer them

Multi National, Inc. (A) Critique the way in which the data was analyzed to develop the final value. What would you have done differently? Do you think that program benefits should be communicated without the cost of the program? Explain. What cautions or concerns should be addressed when communicating impressive results from training programs?

Multi National, Inc. (B) What is the ROI of this program? How does this value compare with the one previously reported? Which value would you use? Is there a way to integrate the two studies? How do you assess the credibility of this process?

Examples of Hard Data Output Costs Time Quality

Characteristics of Hard Data Objectively based Easy to measure and quantify Relatively easy to assign monetary values Common measures of organizational performance Very credible with management

Examples of Soft Data Work Habits Initiative/ Innovation Customer Service Employee Development/ Advancement Work Climate/ Satisfaction

Characteristics of Soft Data Subjectively based in many cases Difficult to measure and quantify, directly Difficult to assign monetary values Less credible as a performance measure Usually behaviorally oriented

Converting Data to Money Matching Exercise Profit/savings from output Cost of quality Employee time as compensation Historical cost/savings from records Expert input External database Linking with other measures End user/performer estimation Management estimation Estimation from HR staff

Five Steps to Convert a Measure to Money Unit of improvement Value of each unit (V) Unit performance change (Δ) Annual performance level change (Δ P) Improvement value (V times Δ P) Notes: ____________________________________

Example Spend about 4 minutes with your team to calculate the annual monetary value of improvement in grievances. Step 1: 1 Grievance Step 2: V = $6,500 Step 3: Δ P = Reduction of 7 grievances per month due to the program Step 4: A Δ P = Step 5: A Δ P x V=
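For reference, the blanks in Steps 4 and 5 work out as follows when the monthly reduction is annualized over 12 months (first-year benefits only):

```python
# The grievance exercise worked through the five steps.
unit_value = 6_500                      # Step 2: V = $6,500 per grievance
monthly_change = 7                      # Step 3: 7 fewer grievances per month
annual_change = monthly_change * 12     # Step 4: annual performance change
improvement_value = annual_change * unit_value   # Step 5: V x annual change

print(annual_change)       # 84 grievances per year
print(improvement_value)   # 546000
```

So the annual monetary value of the improvement is $546,000, before program costs are subtracted.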

Converting Data Converting output to contribution – standard value Converting the cost of quality – standard value Converting employee’s time – standard value Using historical costs Using internal and external experts Using data from external databases Using participants’ estimates Linking with other measures Using supervisors’ and managers’ estimates Using staff estimates

Data Conversion Issues Use the most credible sources If two credible sources are available, use the most conservative option Adjust for the time value of money Know when to stop this process

Standard Values are Everywhere Finance and Accounting Production Operations Engineering IT Marketing and Customer Service Procurement Research and Development HR

Examples of Techniques to Convert Data to Monetary Value

Standard Values (Output to Contribution, Cost of Quality, Employee Time):
Sales - profit margin
Donations - overhead margin
Unproductive man-hours - hourly wage
Repackaging - standard value based on time savings (hourly wage)
OSHA fines - fines associated with the incident
Unit per person per hour - profit of one additional product produced per person per hour at the same cost

Examples of Techniques to Convert Data to Monetary Value

Historical Costs:
Sexual harassment grievances - litigation costs
Food spoilage - cost to replenish food inventory
Turnover of marine engineers - average replacement costs plus separation costs

Internal / External Experts:
Electric utility rate - internal economist
Life - internal risk manager

External Databases:
Turnover, mid-level manager - ERIC
Turnover, restaurant wait staff - Google

Examples of Techniques to Convert Data to Monetary Value

Link with Other Measures:
Employee satisfaction - linked to customer satisfaction, which is linked to profit
Customer complaints regarding baggage mishandling - percent of complaints linked to the percent who will not repurchase a seat on the airline, linked to lost revenue

Estimations (participants, supervisors/managers, staff):
Unexpected absence - supervisor estimate (basis provided) x confidence adjustment
Unwanted network intrusions - participant estimate (basis provided) x confidence adjustment

Cost of a Sexual Harassment Complaint: 35 Complaints

Actual costs from records: legal fees, settlements, losses, material, direct expenses
Additional estimated costs from staff: EEO/AA staff time, management time

Total: $852,000 annually
Cost per complaint = $852,000 / 35 = $24,343

Where to Find Experts The obvious department They send the report It’s in the job title The directory Ask

What Makes an Expert Credible? Experience Neutrality No conflict of interest Credentials Publications Track record

Converting Data Using an External Database: cost of one turnover, middle manager. Annual salary $70,000; cost of turnover 150% of salary; total cost of turnover $105,000.

Finding the Data Search engines Research databases Academic databases Industry / trade databases Government databases Commercial databases Association databases Professional databases

Customer Satisfaction (chart): a positive correlation between customer satisfaction and revenue.

Classic Relationships: attitude and behavior measures (job satisfaction, organizational commitment, engagement, customer satisfaction, conflicts) versus outcome measures (turnover, absenteeism, customer satisfaction, productivity, revenue).

Linkage with Other Measures (chart): a compelling place to work (attitude about the job, attitude about the company, employee behavior, employee retention, service helpfulness) drives a compelling place to shop (merchandise value, customer impression, customer recommendations, customer retention), which drives a compelling place to invest (revenue growth, operating margin, return on assets). A 5-unit increase in employee attitude drives a 1.3-unit increase in customer impression, which drives a 0.5-unit increase in revenue growth.
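Treating the stated unit relationships as proportional ratios (an assumption; the chart only reports the 5 → 1.3 → 0.5 chain), a scaling sketch:

```python
# Treating the chain's unit relationships as proportional ratios
# (an assumption): attitude -> customer impression -> revenue growth.

def revenue_growth_from_attitude(attitude_gain):
    impression_gain = attitude_gain * (1.3 / 5.0)   # 5 units -> 1.3 units
    return impression_gain * (0.5 / 1.3)            # 1.3 units -> 0.5 units

print(f"{revenue_growth_from_attitude(5.0):.2f}")   # the chart's own case: 0.50
print(f"{revenue_growth_from_attitude(2.0):.2f}")   # a hypothetical 2-unit gain
```

The hypothetical 2-unit attitude gain works out to a 0.2-unit increase in revenue growth under this linear reading; whether the relationship is actually linear is a separate question for the analyst.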

Estimating the Value Use the most credible source Check for biases Discuss the value in general terms Provide information to assist in the estimates Collect data in a non-threatening way Adjust for the error

Turnover Cost Summary

Job Type / Category                     Turnover Cost Ranges
Entry level - hourly, non-skilled       30-50%
Service / production workers - hourly   40-70%
Skilled hourly                          75-100%
Clerical / administrative               50-80%
Professional                            75-125%
Technical                               100-150%
Engineers                               200-300%
Specialists                             200-400%
Supervisors / team leaders              100-150%
Middle managers                         125-200%

Turnover Costs Summary Exit cost of previous employee Recruiting cost Employment cost Orientation cost Training cost Wages and salaries while training Lost productivity Quality problems Customer dissatisfaction Loss of expertise/ knowledge Supervisor’s time for turnover Temporary replacement costs

Converting Data: Questions to Ask What is the value of one additional unit of production or service? What is the value of a reduction of one unit of a quality measurement (rejects, waste, errors)? What are the direct cost savings? What is the value of one unit of time improvement? Are cost records available? Is there an internal expert who can estimate the value? and . . .

Converting Data: Questions to Ask Is there an external expert who can estimate the value? Are there any government, industry, or research data available to estimate the value? Are supervisors of program participants capable of estimating the value? Is senior management willing to provide an estimate of the value? Does the staff have the expertise to estimate the value? Notes: An HRD executive is quoted: “The conversion of soft data to a monetary value creates an illusion of a precision that does not exist. As a result, we do not use soft data savings in any of our evaluation projects.” Do you agree? ____________________________________

Short-Term Solutions Defined in terms of the time to complete or implement the program Is appropriate when this time is a month or less Is appropriate when the lag between Levels 3 and 4 is relatively short Reflects most HR solutions

When Estimating Time for Long-Term Solutions Secure input from all key stakeholders (sponsor, champion, implementer, designer, evaluator) Be conservative Have it reviewed by Finance & Accounting Use forecasting

Converting Your Level 4 Measures to Money Isolation Technique(s) Data Conversion Technique(s)

Total Fitness Company Calculate the annual savings from the improvement. Is this a credible process?

Absenteeism linked to program: 7% - 4% = 3%; 3% x 40% = 1.2%
Absence days prevented: 240 days x 120 employees x 1.2% = 346 days
Monetary value: 346 days x $105/day = $36,330, or 346 days x $90/day = $31,140
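The arithmetic above can be verified directly:

```python
# Verifying the absenteeism conversion above.
rate_drop = 0.07 - 0.04                  # 3-point drop in absenteeism
linked_to_program = rate_drop * 0.40     # 40% attributed to the program
workdays, employees = 240, 120

days_prevented = round(workdays * employees * linked_to_program)
print(days_prevented)            # 346
print(days_prevented * 105)      # 36330 at $105/day
print(days_prevented * 90)       # 31140 at $90/day
```

The conservative choice between the two values ($31,140 at $90/day) is the one the guiding principles would favor.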

Data Conversion Test (decision flow): Is there a standard value? Yes: convert the data and add to the numerator. No: Is there a method to get there? No: move to intangible benefits. Yes: Can it be done with minimum resources? No: move to intangible benefits. Yes: Can you convince others it is credible in 2 minutes? No: move to intangible benefits. Yes: convert the data and add to the numerator.
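The decision flow can be sketched as a small function (the four questions come from the flowchart; the function itself is illustrative):

```python
# The data conversion test as a function of its four yes/no questions.
def conversion_decision(has_standard_value, has_method,
                        minimum_resources, credible_in_2_minutes):
    if has_standard_value:
        return "convert and add to numerator"
    if has_method and minimum_resources and credible_in_2_minutes:
        return "convert and add to numerator"
    return "move to intangible benefits"

print(conversion_decision(True, False, False, False))
print(conversion_decision(False, True, True, False))   # fails the 2-minute test
```

A measure that fails any gate is not discarded; it is reported as an intangible benefit rather than entering the ROI numerator.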

Reasons for Developing Cost Data To determine the overall expenditure To determine the relative cost To predict future program costs To calculate benefits versus costs To improve the efficiency To evaluate alternatives To plan and budget To develop a marginal cost pricing system To integrate data into other systems Notes: ____________________________________

Issues About Tracking Costs Monitor costs, even if they are not needed for evaluation Costs will not be precise Use a practical approach Minimize the resources to track costs Estimates are acceptable Use caution when reporting costs Do not report costs of a program without reporting benefits (or at least have a plan)

How Much Should You Spend on HR? Overall Expenditures Total Expenditures Total – Human Capital % of Payroll % of Revenues % of Operating Costs Expenditures per Employee

How Much Should You Spend on HR? Functional Area Needs Assessment Development Delivery/Implementation Operation/Maintenance Evaluation

Questions for Discussion Is there a significant difference between estimated and actual costs? Explain. How did you determine what your targets would be? What should you spend?

Overall Cost Categories Analysis costs Development costs Delivery costs Operating / Maintenance costs Evaluation costs Notes: Which cost categories are appropriate for your organization? ____________________________________ ____________________________________

Tabulating Program Costs Recommended Items Needs assessment (prorated) Development costs (prorated) Program materials Facilitator / coordinator costs Facilities costs Travel / Lodging / Meals Participants’ time (salaries and benefits) Administrative / Overhead costs Operations / Maintenance costs Evaluation costs Notes: Which cost categories are included in your calculations? ____________________________________

Prorating Cost Life cycle approach Initial cost plus annual updates

Example of Prorating Leadership 101 5-year life cycle 200 participants per year $75,000 initial development costs 2 groups of 25 are being evaluated at the ROI level How much development costs should be charged to the ROI project?
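One way the question works out, assuming the development cost is prorated evenly across the life cycle (a sketch, not the only defensible allocation):

```python
# Leadership 101 proration: development cost spread over the life cycle.
development_cost = 75_000
life_cycle_years = 5
participants_per_year = 200
evaluated_participants = 2 * 25          # two groups of 25 in the ROI study

total_participants = life_cycle_years * participants_per_year   # 1,000
cost_per_participant = development_cost / total_participants    # $75
charged_to_study = cost_per_participant * evaluated_participants

print(charged_to_study)   # 3750.0 -> $3,750 of development cost
```

So only $3,750 of the $75,000 development cost is charged to the ROI project; charging the full amount would unfairly penalize the two evaluated groups.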

Overhead Allocation Example Portion of budget not allocated to specific projects $548,061 Total number of days dedicated to specific projects/programs 7,450 Per day overhead allocation $______ What is the total overhead allocation for the program that takes 3 days to complete?
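The blank in the example is a straight division; for reference:

```python
# Overhead allocation by straight division.
unallocated_budget = 548_061
project_days = 7_450
program_days = 3

per_day = unallocated_budget / project_days
print(f"per-day overhead allocation: ${per_day:.2f}")
print(f"overhead for a {program_days}-day program: ${per_day * program_days:.2f}")
```

This works out to roughly $73.57 per day, or about $220.70 of overhead for the three-day program.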

Costs Classification Matrix; Cost Estimating Worksheet

Federal Information Agency (A) What types of data should be collected for application and implementation? What business impact measures should be collected? What is the time frame for data collection? Which cost categories should be utilized in capturing the actual cost of the program? Can the value of this program be forecasted? If so, how?

Federal Information Agency (B) Please calculate the actual cost of the program for 100 participants. Assume a 5% dropout rate each year. Most of these costs are estimated or rounded off. Is this appropriate? Explain. What issues surface when developing cost data? How can they be addressed?

Different Approaches Cost Benefit Analysis Return on Investment Payback Period Discounted Cash Flow Internal Rate of Return Utility Analysis Consequences of not providing learning systems Most Common Notes: ____________________________________

Defining the Benefit Cost Ratio Program Benefits Program Costs Benefit/Cost Ratio = Example Program Benefits = $71,760 Program Costs = $32,984 BCR = 2.1756

Defining the Return on Investment Net Program Benefits Program Costs ROI (%) = X 100 Example Net Program Benefits = $38,776 Program Costs = $32,984 ROI = 117%

Defining the Payback Period Payback Period = (Total Investment / Net Annual Savings) X 12 Example Total Investment = $32,984 Net Annual Savings = $38,776 Payback Period = .85 X 12 = 10.2 months
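The three metrics can be computed together from the deck's example figures; note that the 10.2-month payback matches dividing the investment by the net annual savings ($38,776), and the ROI works out to 117.6% before rounding:

```python
# BCR, ROI, and payback period from the deck's example figures.
benefits = 71_760
costs = 32_984
net_benefits = benefits - costs              # 38,776

bcr = benefits / costs                       # benefit/cost ratio
roi_pct = net_benefits / costs * 100         # ROI as a percentage
payback_months = costs / net_benefits * 12   # using net annual savings

print(f"BCR: {bcr:.2f}")                        # 2.18
print(f"ROI: {roi_pct:.1f}%")                   # 117.6%
print(f"Payback: {payback_months:.1f} months")  # 10.2
```

The BCR and ROI express the same data two ways: a BCR of 2.18 means $2.18 returned per dollar spent, while the 117.6% ROI counts only the net gain over the dollar spent.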

ROI Target Options Set the value as with other investments, e.g. 15% Set slightly above other investments, e.g. 25% Set at break even - 0% Set at client expectations

A Rational Approach to ROI Keep the process simple Use sampling for ROI calculations Always account for the influence of other factors Involve management in the process Educate the management team Communicate results carefully Give credit to participants and managers Plan for ROI calculations Notes: Identify (2) HRD program examples where a cost/benefit calculation would be appropriate to use. 1.________________________________ _________________________________ 2.________________________________ Identify (2) HRD program examples where a cost/benefit calculation would not be appropriate to use. 1.________________________________

The Journey to Increased Accountability (chart): over time, organizations move from normal accountability through Level 1 (Reaction), Level 2 (Learning), Level 3 (Application), Level 4 (Business Impact), and Level 5 (ROI) toward operating as a profit center.

Cautions When Using ROI Take a conservative approach when developing both benefits and costs. Use caution when comparing the ROI in HR with other financial returns. Involve management in developing the methodology. Fully disclose the assumptions and methodology. and . . .

Cautions When Using ROI Approach sensitive and controversial issues with caution. Teach others the methods for calculating the return. Recognize that not everyone will buy into ROI. Do not boast about a high return. Choose the place for the debates. Do not try to calculate the ROI on every program.

Improper Use of ROI ROI – return on information ROI – return on intelligence ROI – return on involvement ROI – return on inspiration ROI – return on implementation ROI – return on initiative

ROI Myths ROI is too complex for most users. ROI is too expensive, consuming too many critical resources. If senior management does not require ROI, there is no need to pursue it. ROI is a passing fad. ROI is too subjective. ROI is for post-analysis only.

The Potential Magnitude of an ROI: up to +1,500% when (1) a need is identified, (2) a performance gap exists or a new requirement is introduced, (3) an effective solution is implemented at a reasonable cost, at the right time, for the right people, (4) the solution is applied and supported in the work setting, and (5) a linkage exists to one or more business measures.

Guiding Principles 1. When a higher-level evaluation is conducted, data must be collected at lower levels. 2. When an evaluation is planned for a higher level, the previous level of evaluation does not have to be comprehensive. 3. When collecting and analyzing data, use only the most credible sources. 4. When analyzing data, choose the most conservative among alternatives. 5. At least one method must be used to isolate the effects of the program. 6. If no improvement data are available, it is assumed that little or no improvement has occurred.

Guiding Principles 7. Estimates of improvement should be adjusted for the potential error of the estimate. 8. Extreme data items and unsupported claims should not be used in ROI calculations. 9. Only the first year of benefits should be used in the ROI analysis of short-term projects. 10. Program costs should be fully loaded for ROI analysis. 11. Intangible measures are defined as measures that are purposely not converted to monetary value. 12. The results from the ROI Methodology must be communicated to all key stakeholders.

Typical Intangible Measures Linked with Programs Job satisfaction Organizational commitment Climate Engagement Employee complaints Recruiting image Brand awareness Stress Leadership effectiveness Resilience Caring Career minded Customer satisfaction Customer complaints Customer response time Teamwork Cooperation Conflict Decisiveness Communication

Identification of Intangible Measures: Timing and Source. (1) Needs assessment, (2) ROI analysis planning, (3) data collection, (4) data analysis.

Issues with Intangibles May be the most important data set Are not converted to money by definition Are usually not subjected to “isolating” Must be systematically addressed Must be reported “credibly”

Reporting Intangibles Usually presented as a table Must indicate how the data were collected Use rules to decide if a measure should be listed Be prepared for further analysis

Communication Challenges Measurement and evaluation are meaningless without communication Communication is necessary for making improvements Communication is a sensitive issue Different audiences need different information

Communication Principles Keep communication timely Target communication to specific audiences Stay unbiased and modest with the message Carefully select communication media Keep communication consistent with past practices Incorporate testimonials from influential individuals Consider your function’s reputation when developing the overall strategy Use language your audience understands

Audience Selection Questions Are they interested in the program? Do they really want to receive the information? Has someone already made a commitment to them regarding communication? Is the timing right for this audience? Are they familiar with the program? How do they prefer to have results communicated? Are they likely to find the results threatening? Which medium will be most convincing to them?

Common Target Audiences Reason for Communication Primary Target Audience Secure approval for program Client, top executives Gain support for the program Immediate managers, team leaders Build credibility for the staff Top executives Enhance reinforcement of the program Immediate managers Enhance results of future programs Participants Show complete results of the program Key client team Stimulate interest in programs Top executives Demonstrate accountability for client expenditures All employees Market future programs Prospective clients

Complete Report General information Methodology for impact study Data analysis Costs Results Barriers and enablers Summary of findings Conclusions and recommendations Exhibits

The Impact Study Serves Several Purposes: As the method of communicating results to those audiences needing detailed information. As a reminder of the resources required to produce major studies. As a historical document of the methodology, instruments, and processes used throughout the impact study. As a teaching and discussion tool for staff development.

Select Media
Impact Studies: full report, executive summary, general overview, one-page summary
Meetings: executive meetings, manager meetings, staff meetings, panel discussions, best-practice meetings
Internal Publications: announcements, bulletins, newsletters, magazines
Progress Reports: schedules, preliminary results, memos
and . . .

Select Media (continued): case studies, program brochures, scoreboards; electronic media: e-mail, web sites, video, blogs

Impact Study Outline
General Information: objectives of the study; background
Methodology for the Impact Study (builds credibility for the process): levels of evaluation; the ROI Process; collecting data; isolating the effects of the program; converting data to monetary values; costs; assumptions (Guiding Principles)

Impact Study Outline
Results (the results, with six types of measures: Levels 1-5 and intangibles): general information; response profile; participant reaction; learning; application of skills/knowledge; barriers; enablers; business impact; general comments; linkage with business measures; costs; ROI calculation; intangible benefits

Impact Study Outline Summary of Findings Conclusions and Recommendations Conclusions Recommendations Exhibits

Communicating with Senior Management Can they take it? Do they believe you?

Purpose of the Meeting Create awareness and understanding of ROI Build support for the ROI methodology Communicate results of study Drive improvement from results Cultivate effective use of the ROI methodology

Meeting Ground Rules Do not distribute the impact study until the end of the meeting Be precise and to the point Avoid jargon and HR speak Spend less time on the lower levels of evaluation data Present the data with a strategy in mind

Presentation Sequence Describe the program and explain why it is being evaluated Present the methodology process Present the reaction and learning data Present the application data List the barriers and enablers to success Address the business impact

Presentation Sequence Show the costs Present the ROI Show the intangibles Review the credibility of the data Summarize the conclusions Present the recommendations

Communication Progression (chart): as an organization moves from its first 2 ROI studies, to 3-5 studies, to 6-plus studies, reporting typically progresses from a detailed study toward an executive summary and then a one-page summary, and from presenting in a meeting toward no meeting.

ROI Impact Study: One-Page Summary Program Title: Preventing Sexual Harassment at Healthcare, Inc. Target Audience: First and Second Level Managers (655) Secondary: All employees through group meetings (6,844) Duration: 1 day, 17 sessions

Brief Reports Executive Summary Slide Overview 1-page Summary (see example) Brochure

Electronic Reporting Website E-mail blogs Video

Mass Publications Announcements Bulletins Newsletters Magazines

Case Study Internal Use Communicate results Teach others Build a history Serve as a template Make an impression

Case Study External Publication Provide recognition to participants Improve image of function Enhance brand of department Enhance image of organization

Micro-Level and Macro-Level Scorecards (chart): several micro-level program scorecards (each reporting Levels 1-4) roll up into one macro-level scorecard covering 0 Indicators, 1 Reaction, 2 Learning, 3 Application, 4 Impact, 5 ROI, and Intangibles.

Building a Macro Scorecard Provides macro-level perspective of success Serves as a brief report versus detailed study Shows connection to business objectives Integrates various types of data Demonstrates alignment between programs, strategic objectives, and operating goals

Seven Categories of Data Indicators Reaction and Planned Action Learning Application Business Impact ROI Intangibles

Potential Reporting 0. Indicators Number of Employees Involved Total Hours of Involvement Hours Per Employee Training investment as a Percent of Payroll Cost Per Participant

Potential Reporting I. Reaction and Planned Action Percent of Programs Evaluated at this Level Ratings on 7 Items vs. Target Percent with Action Plans Percent with ROI Forecast

Potential Reporting II. Learning Percent of Programs Evaluated at This Level Types of Measurements Self Assessment Ratings on 3 Items vs. Targets Pre/Post – Average Differences

Potential Reporting III. Application Percent of Programs Evaluated at This Level Ratings on 3 Items vs. Targets Percent of Action Plans Complete Barriers (List of Top Ten) Enablers (List of Top Ten) Management Support Profile

Potential Reporting IV. Business Impact Percentage of Programs Evaluated at This Level Linkage with Measures (List of Top Ten) Types of Measurement Techniques Types of Methods to Isolate the Effects of Programs Investment Perception

Potential Reporting V. ROI Percent of Programs Evaluated at This Level ROI Summary for Each Study Methods of Converting Data to Monetary Values Fully Loaded Cost Per Participant

Potential Reporting Intangibles List of Intangibles (Top Ten) How Intangibles Were Captured

Use of Evaluation Data (matrix: each use is checked against the appropriate levels of data, 1-5): adjust program design; improve program delivery; influence application and impact; enhance reinforcement; improve management support; improve stakeholder satisfaction; recognize and reward participants; justify or enhance budget; develop norms and standards; reduce costs; market programs; expand implementation to other areas.

Delivering Bad News Never fail to recognize the power to learn and improve with a negative study. Look for red flags along the way. Lower outcome expectations with key stakeholders along the way. Look for data everywhere. Never alter the standards. Remain objective throughout the process. and . . .

Delivering Bad News Prepare the team for the bad news. Consider different scenarios. Find out what went wrong. Adjust the story line to “Now we have data that shows how to make this program more successful.” In an odd sort of way, this becomes a positive spin on less-than-positive data. Drive improvement.

Analyze the Results of Communication Observe reactions Solicit informal feedback Collect formal feedback Monitor blogs Make adjustments

ROI Possibilities Pre-Program ROI forecast End-of-Program ROI forecast with Level 1 Data End-of-Program ROI forecast with Level 2 Data Follow-Up ROI forecast with Level 3 Data Follow-Up ROI evaluation with Level 4 Data

ROI at Different Levels: ROI with a pre-program forecast (data collected before the initiative) is the least credible, least accurate, least expensive, and least difficult to develop; forecasts using Level 1 and Level 2 data are collected during the program; ROI with Level 3 and, especially, Level 4 data is collected after the program and is the most credible, most accurate, most expensive, and most difficult.

Pre-Program Forecast ROI Model: estimate the change in data; isolate the effects of the program; convert data to monetary values; tabulate program costs; calculate the return on investment; identify intangible benefits.

Retail Merchandise Company Questions for Discussion Is a pre-program forecast possible? Which groups should provide input to the forecast?

“Expert” Input for Estimate

Source              Sales Increase Estimate (Δ)   Forecasted ROI
Sales Associates     0%                           -100%
Dept. Managers       5%                            -30%
Store Managers      10%                             33%
Sr. Executive       15%                            110%
Analyst             12%                             95%
Vendor              25%                            350%
Marketing Analyst    4%                            -40%
Finance Staff        2%                            -80%
Benchmarking Data    9%                             22%

Retail Merchandise Company Questions for Discussion Assess the credibility of each “expert” group. Is there any additional information you need? How would you present this to senior management to make a decision to implement the program?

Steps for Pre-Program ROI Forecast Develop Level 3 and 4 objectives, with as many specifics as possible Estimate/Forecast monthly improvement in Level 4 data (ΔP) Convert Level 4 measure to monetary value (V) Develop the estimated annual impact for each measure (ΔPxVx12) Estimate fully-loaded program costs and . . .

Steps for Pre-Program ROI Forecast Calculate the forecasted ROI using the total projected benefits Use sensitivity analysis to develop several potential ROI values with different levels of improvement (ΔP) Identify potential intangible benefits Communicate analysis with caution

Steps to Pre-Program Forecast
Measure: Sales; Profit Margin: 2%

Source        Monthly Change   Value    Annual Change   Cost     ROI
SME           $25,000          $500     $6,000          $5,000   20%
Vendor        $50,000          $1,000   $12,000         $5,000   140%
Participant   $30,000          $600     $7,200          $5,000   44%
Supervisor    $28,000          $560     $6,720          $5,000   34%
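The table rows can be reproduced from the stated inputs (monthly sales change x 2% margin x 12 months, against the $5,000 cost):

```python
# Reproducing the forecast rows: monthly sales change x 2% margin x 12
# months gives the annual benefit; ROI = (benefit - cost) / cost.
margin = 0.02
cost = 5_000

monthly_sales_change = {
    "SME": 25_000, "Vendor": 50_000, "Participant": 30_000, "Supervisor": 28_000,
}

for source, monthly in monthly_sales_change.items():
    annual_benefit = monthly * margin * 12
    roi = (annual_benefit - cost) / cost * 100
    print(f"{source}: annual ${annual_benefit:,.0f}, forecast ROI {roi:.0f}%")
```

Reporting the ROI source by source, rather than averaging, keeps the spread of the estimates visible to the decision makers.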

Sensitivity Analysis (table): expected ROI values of 60%, 90%, 120%, 150%, and 180% across scenarios combining the potential sales increase to existing customers ($25,000, $30,000, $50,000) and the potential monthly complaint reduction (10, 20, 30).

Input to Forecast Previous experience with same or similar programs Supplier/Designer experience in other situations Estimates from supplier/designer Estimates from SMEs Estimates from client/sponsor Estimates from target participants

Forecasting ROI from a Pilot Program Develop Level 3 and 4 objectives Design/Develop pilot program without the bells and whistles (or use a supplier program) Conduct the program with one or more “typical” groups Develop the ROI using the ROI Process model for Level 4 post-program data Make decision to implement based on results

Level 1 Measures Program content Materials Facilitator / coordinator Relevance / importance Perceived value Amount of new information Recommendation to others Planned improvements Opportunity for forecast

Important Questions to Ask on Feedback Questionnaires - Planned Improvements
Please indicate what you will do differently on the job as a result of this program:
1. ________________________________________________________
2. ________________________________________________________
3. ________________________________________________________
As a result of any change in your thinking, new ideas, or planned actions, please estimate (in monetary values) the benefit to your organization (e.g., reduced absenteeism, reduced employee complaints, better teamwork, increased personal effectiveness) over a period of one year. $__________________
What is the basis of this estimate? _______________________________________
What confidence, expressed as a percentage, can you put in your estimate? (0% = No Confidence; 100% = Certainty) ____________________%

ROI with Level 1 Data
At the end of the program, ask participants:
- What knowledge or skills have been improved?
- What actions are planned with the improved knowledge and skills?
- Which measures will be influenced?
- What impact, in monetary units, will this improvement have in the work unit?
- What is the basis for this estimate?
- What level of confidence do you place on this estimate?
Then, compare total “adjusted” benefits with program costs.
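A minimal sketch of the adjustment step, assuming hypothetical participant responses: each monetary estimate is discounted by the participant's stated confidence, unusable answers are discarded, and the adjusted total is compared with program costs.

```python
# Hypothetical responses: (estimate in $, confidence 0.0-1.0);
# None marks an answer with no usable estimate or confidence
responses = [(20_000, 0.9), (9_000, 0.8), (None, None), (10_000, 0.6)]

# Discard unusable responses, then discount each estimate by its confidence
adjusted = sum(est * conf for est, conf in responses if est is not None)

program_cost = 25_000
roi = (adjusted - program_cost) / program_cost * 100
print(round(adjusted), round(roi))
```

The confidence discount is the conservative adjustment that makes Level 1 forecasts defensible: a $20,000 claim at 90% confidence counts as $18,000.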

Sales Increase Estimate

Participant No. | Sales Increase Estimate | Basis                       | Confidence Level
1               | $20,000                 | Sales                       | 90%
2               | $9,000                  | 2 sales per day             | 80%
3               | $50,000                 | Sales increase              | 70%
4               | $10,000                 | 3 sales daily               | 60%
5               | Millions                | 4 sales each day            | 95%
6               | $75,000                 | More sales                  | 100%
7               | $7,500                  | 3 more sales                | -
8               | $25,000                 | 4 sales – 1 sale            | 75%
9               | $15,000                 | One more sale               | 30%
10              | -                       | 2 new sales                 | -
11              | $45,000                 | -                           | -
12              | $40,000                 | 2 sales each day            | -
13              | -                       | No increase                 | -
14              | $150,000                | Many new sales              | -
15              | Unlimited               | Additional sales            | 50%
16              | $37,000                 | More sales and satisfaction | -

Retail Merchandise Company Questions for Discussion What is your strategy for analyzing this data? How reliable is this data? How could you use this data?

Level 2 Evaluation
- Tests
- Skill practices
- Self reports
- Exercises
- Observations during the training program
- Checklists by facilitator
- Team assessments
Opportunity for forecast

ROI with Level 2 Data Develop an end-of-program test that reflects program content Establish a relationship between test data and output performance for participants Predict performance levels of each participant with given test scores Convert performance data to monetary value Compare total predicted value of program with program costs
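Steps two and three assume a known relationship between test scores and output performance. One way to sketch it, with invented historical score/sales pairs, is a least-squares line that predicts a participant's performance from a test score:

```python
# Hypothetical historical pairs: (end-of-program test score, weekly sales in $)
history = [(60, 4_000), (70, 4_800), (80, 5_600), (90, 6_400)]

# Fit a least-squares line by hand: sales = slope * score + intercept
n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in history) / \
        sum((x - mean_x) ** 2 for x, _ in history)
intercept = mean_y - slope * mean_x

def predicted_sales(score):
    """Predict output performance for a participant with a given test score."""
    return slope * score + intercept

print(predicted_sales(75))
```

The predicted performance for each participant would then be converted to monetary value and totaled against program costs, as in the remaining steps.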

Relationship Between Test Scores and Performance

Relationship Between Test Scores and Sales Performance

Retail Merchandise Company Questions for Discussion Calculate the forecasted ROI. How reliable is this estimate of ROI at Level 2? What other issues might need to be considered in this process? Is this information useful? If so, how should the information be used?

Projected Benefit
$9,698 x 0.14 x 0.02 x 48 = $1,303
BCR = $1,303 / $687 = 1.9
ROI = 90%

Retail Merchandise Company Questions for Discussion What is the ROI for this program? How credible is this approach to calculating ROI? Could this same approach be used to forecast the value prior to the implementation of the program?

ROI Calculation
BCR = Benefits / Costs
ROI (%) = (Net Benefits / Costs) x 100

ROI Calculation
BCR = $3,242 / $687 = 4.72
ROI = (($3,242 - $687) / $687) x 100 = 372%
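The two formulas can be written directly as functions; the figures below reproduce this slide's worked example.

```python
def bcr(benefits, costs):
    """Benefit-cost ratio: program benefits divided by fully-loaded costs."""
    return benefits / costs

def roi_pct(benefits, costs):
    """ROI as a percentage: net benefits divided by costs, times 100."""
    return (benefits - costs) / costs * 100

print(round(bcr(3_242, 687), 2))   # 4.72
print(round(roi_pct(3_242, 687)))  # 372
```

Note the relationship between the two measures: ROI (%) = (BCR - 1) x 100, so a BCR of 4.72 always corresponds to an ROI of 372%.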

ROI with Level 3 Data Develop competencies for the target job. Indicate percentage of job success that is covered in the program. Determine monetary value of competencies, using salaries and employee benefits. Compute the worth of pre- and post-program skill levels. Subtract pre-program values from post-program values. Compare the total added benefits with the program costs.
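A hedged sketch of this Level 3 approach, with assumed numbers: the worth of the covered competencies is taken as compensation x share of the job covered x rated skill level, and the program's benefit is the post-minus-pre difference.

```python
def competency_worth(compensation, job_coverage, skill_level):
    """Monetary worth of the competencies covered by the program.

    compensation: annual salary plus employee benefits
    job_coverage: fraction of job success covered by the program (0-1)
    skill_level:  rated competency level (0-1)
    """
    return compensation * job_coverage * skill_level

# Assumed figures: $50,000 compensation, program covers 30% of the job,
# skill rating rises from 0.60 to 0.85
pre = competency_worth(50_000, 0.30, 0.60)
post = competency_worth(50_000, 0.30, 0.85)
added_benefit = post - pre

program_cost = 2_500
roi = (added_benefit - program_cost) / program_cost * 100
print(round(added_benefit), round(roi))
```

In practice the skill ratings would come from pre- and post-program assessments of each participant, summed across the group before comparing with costs.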

Advantages of Forecasting Increases the usefulness of data collection Focuses attention on business outcomes Monitors the path to success Compares forecast to actual results to improve forecasts

Forecasting Realities If you must forecast, forecast frequently Consider forecasting an essential part of the evaluation mix Forecast different types of data Secure input from those who know the process best Long-term forecasts will usually be inaccurate

Forecasting Realities Expect forecasts to be biased Serious forecasting is hard work Review the success of forecasting routinely The assumptions are the most serious error in forecasting Utility is the most important characteristic of forecasting

Barriers to ROI Use and Implementation After mastering the ROI model, it is appropriate to examine implementation in more detail. Please take a few moments to identify the barriers to implementation. List all the “things” that can prevent a successful implementation. Be candid.

Overcoming the Barriers Now, identify the actions needed to minimize, remove, or go around the barriers. List all the “steps” that need to be taken to overcome the barrier.

Implementation Issues Resources (staffing / budget) Leadership (individual, group, cross functional team) Timing (urgency, activities) Communication (various audiences) Commitment (staff, managers, top executives)

Typical Barriers I don’t have time for additional measurement and evaluation. An unsuccessful evaluation will reflect poorly on my performance. A negative ROI will kill my program. My budget will not allow for additional measurement and evaluation. Measurement and evaluation are not part of my job. and . . .

Typical Barriers I didn’t have input on this process. I don’t understand this process. Our managers will not support this process. Data will be misused. The data are too subjective.

Building Blocks to Overcome Resistance Utilizing Shortcuts Monitoring Progress Removing Obstacles Preparing the Management Team Initiating the ROI Projects Tapping into a Network Preparing the Staff Revising Policies and Procedures Establishing Goals and Plans Developing Roles and Responsibilities Assessing the Climate for Measuring ROI

Assessing the Climate for Results Survey staff (team members) Survey staff from management perspective Develop gaps (actual vs. desired) Plan actions

Identifying Champions You cannot do it alone Champions have a passion for accountability Consider a champion from each area Network the champions Recognize the champions

Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
Team formed: Jan
Policy developed: Feb-Apr
Targets set: Jan-Feb
Workshops developed: Mar-Jul
Application Evaluation Project (A): Apr-Sept
Impact Evaluation Project (B): Jun-Jan
Impact Evaluation Project (C): Sept-Mar
and . . .

Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
ROI Project (D): Nov-Aug
Staff trained: Aug-Jan
Vendors trained: Feb-Apr
Managers trained: May-Aug
Support tools developed: Apr-May
Evaluation guidelines developed: Feb-Jun

Responsibilities for Champions Designing data collection instruments Providing assistance for developing an evaluation strategy Analyzing data, including specialized statistical analyses Interpreting results and making specific recommendations and . . .

Responsibilities for Champions Developing an evaluation report or case study to communicate overall results Providing technical support in any phase of measurement and evaluation Assisting in communicating results to key stakeholders

Responsibilities for Team Members Ensure that the needs assessment includes specific business impact measures. Develop application objectives and business impact objectives for each program. Focus the content of the program on the objectives of business performance improvement; ensuring that exercises, case studies, and skill practices relate to the desired objectives. and . . .

Responsibilities for Team Members Keep participants focused on application and impact. Communicate rationale and reasons for evaluation. Assist in follow-up activities to capture business impact data. Provide assistance for data collection, data analysis, and reporting. Design simple instruments and procedures for data collection and analysis. Present evaluation data to a variety of groups.

Getting Team Members Involved Developing plans Establishing responsibilities Designing tools and templates Selecting programs for higher level evaluation Driving changes / improvements

Participant Responsibilities Actively participate Learn what’s needed Apply and implement program Secure results Provide data

Conduct Several Studies Cover a variety of areas Move from simple to complex Mix up Levels 3, 4, and 5 Avoid political issues early in the process

Conduct Workshops and Briefings 1 to 1½-hour briefings 1-day workshops 2-day workshops Special topics

Creating an ROI Network Within the organization Within the local area Within the community

Typical Network Issues Communication methods Membership rules Meeting times Topics / Issues Monitoring / Managing

Typical Network Topics Tool / Template sharing Collaborative projects Research / Benchmarking Sounding board Project critiques Technology review

Key ROI Issues Time Cost Complexity Accuracy Credibility Lack of Skills

Cost-Saving Approaches to ROI Plan for evaluation early in the process Build evaluation into the process Share the responsibilities for evaluation Require participants to conduct major steps Use short-cut methods for major steps Use sampling to select the most appropriate programs for ROI analysis and . . .

Cost-Saving Approaches to ROI Use estimates in the collection and analysis of data Develop internal capability to implement the ROI process Utilize web-based software to reduce time Streamline the reporting process

Tools and Templates Instruments Costs Analysis Reporting

Technology Reaction / Learning surveys Test design Follow-up surveys Statistics packages ROI software Scorecards

Suggested Evaluation Targets

Level                     | Target
Level 1 - Reaction        | 100%
Level 2 - Learning        | 60%
Level 3 - Application     | 30%
Level 4 - Business Impact | 10-20%
Level 5 - ROI             | 5-10%

Worksheet – Project/Program Selection Criteria
List each project/program that fits Level 3 criteria in the left column. Rank each project/program in its category as High Priority (HP), Special Attention (SA), or Business as Usual (BAU).
Categories:
- Compliance Project/Program
- Customer Service Project/Program
- Sales Program
- Call Center or other Customer Transaction Program
- Organization Sponsored Certification Program

Level 3 Priority Ranking High Priority Project/Program clearly must be evaluated at Level 3 in the short term. Special Attention May not be evaluated at Level 3 in the short term, but there are enough issues that an assignment will be made to assess the situation. Business as Usual Continue with current strategy for this program.

Worksheet – Project/Program Selection Criteria
List each project/program you are considering evaluating in the left column. Rank each program as 1, 2, 3, 4, or 5 for each of the ten criteria:
- Life cycle of project/program
- Operational objectives
- Strategic objectives
- Costs
- Audience size
- Visibility
- Investment of time
- Needs assessment conducted
- Management interest
- Quality of data collection processes

Criteria for Selecting Programs for Levels 4 and 5 Evaluation
- Life cycle of the solution
- Linkage of solution to operational goals and issues
- Importance of solution to strategic objectives
- Top executives' interest in the evaluation
- Cost of the solution
- Visibility of the solution
- Size of the target audience
- Investment of time
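Combined with the 1-to-5 worksheet ratings above, program selection can be sketched as a simple total-score ranking; the program names and scores here are hypothetical.

```python
# Hypothetical 1-5 ratings against the selection criteria (life cycle,
# operational linkage, strategic importance, executive interest, cost,
# visibility, audience size, time investment)
ratings = {
    "Sales Program":       [5, 4, 5, 4, 3, 4, 5, 3],
    "Compliance Program":  [3, 2, 2, 2, 4, 2, 4, 2],
    "Call Center Program": [4, 5, 3, 3, 3, 3, 4, 3],
}

# Rank programs by total score, highest first; the top scorers become
# the candidates for Level 4 and 5 evaluation
ranked = sorted(ratings, key=lambda p: sum(ratings[p]), reverse=True)
print(ranked)
```

A weighted sum (weighting, say, executive interest more heavily) is a natural refinement once the organization agrees on which criteria matter most.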

Results-Based Policy Statement Provides focus for the staff Communicates results-based philosophy Sets goals and targets for evaluation Determines basic requirements Serves as a learning tool

Results-Based Policy Key Elements Purpose / Mission / Direction Evaluation targets Evaluation support group functions Responsibility for results Management review of results Follow-up process Communication of results

Evaluation Procedures and Guidelines Show how to utilize tools and techniques Guide the design process Provide consistency in the process Ensure that the appropriate methods are used Keep the process on track Place emphasis on the desired areas

Management Influence Commitment usually refers to the top management group and includes its pledge or promise to allocate resources. Management support refers to the action of the entire management group and reflects the group’s attitude towards the HR process and staff. and . . .

Management Influence Management involvement refers to the extent to which executives and managers are actively engaged in the HR process in addition to participating in the program. Management reinforcement refers to the actions designed to reward and encourage a desired behavior.

Why Managers Don’t Support Your Programs No results Too costly No input No relevance No involvement No time No preparation Lack of knowledge about HR No requirements

Management Action

Action        | Target Group                     | Scope             | Payoff
Commitment    | Top executives                   | All programs      | Very high
Support       | Mid managers, 1st level managers | Several programs  | High
Reinforcement | 1st level managers               | Specific programs | Moderate
Involvement   | All levels                       | -                 | -

The Results Commitment Relationship Top Management Commitment Business Results Successful Programs

CEO Commitment Checklist

Ten Commitments Develop or approve a mission Allocate the necessary funds Allow employees time to participate Become actively involved Support the learning effort

Ten Commitments Position the function Require evaluation Insist on cost effectiveness Set an example Create an atmosphere of open communication

Why Programs Don’t Work Immediate manager does not support the program. The culture in the work group does not support the program. No opportunity to use the program. No time to implement the program. Didn’t learn anything that could be applied to the job. The systems and processes did not support the program.

Why Programs Don’t Work Didn’t have the resources available to use the program. Changed job and the program no longer applies. This is not appropriate in our work unit. Didn’t see a need to use the program. Could not change old habits.

Questions for Discussion When considering the situation, what specifically can be done to enhance the program success? How important is the role of the manager of participants in programs? What implication does this have for your programs?

The Transfer of Success to the Job
A matrix of role-players (Manager, Participant, Facilitator/Organizer) against the timeframe relative to the program (Before, During, After).

Ideal Management Support Gives endorsement and approval for participants to be involved in program. Volunteers personal services or resources to assist in the program’s implementation. Makes a pre-program commitment with the participant concerning expected efforts. Reinforces the behavior change resulting from the program. Conducts a follow-up on program results. Gives positive rewards for participants who experience success with the program.

Ideal Reinforcement Helping the participant diagnose problems to determine if the program is needed Discussing possible alternatives to help the participant apply the skills and implement the program Encouraging the participant to implement the program Serving as a role model for the proper use of the skills Providing positive rewards to the participant when the program is successfully implemented

Levels of Management Support
- Supportive: Strongly and actively supports all of our efforts.
- Responsive: Supports programs, but not as strongly as the supportive manager.
- Non-supportive: Privately voices displeasure with our programs.
- Destructive: Works actively to keep participants from being involved in our programs.

ROI: Tools vs. Relationships Program Developers Program Coordinators Program Facilitators Program Advisors Program Managers Participants Supervisors Managers

Types of Management Involvement As members of advisory committees As members of task forces As subject matter experts As participants As program leaders As evaluators As program sponsors As purchasers of services In a newly-defined role In rotational assignments

Potential Manager Involvement

Steps in the Process                      | Opportunity | Strategy
Conduct Analysis                          | High        | Taskforce
Develop Measurement and Evaluation System | Moderate    | Advisory committee
Establish Program Objectives              | -           | -
Develop Program                           | -           | -
Implement Program                         | -           | Program leader

Potential Manager Involvement

Steps in the Process                | Opportunity | Strategy
Monitor Costs                       | Low         | Expert input
Collect and Analyze Data            | Moderate    | -
Interpret Data and Draw Conclusions | High        | -
Communicate Results                 | -           | Manager as participant

Concerns About HR From a Key Manager Results are not there This is not my responsibility I don’t have time for HR I don’t understand what you do No respect for HR

Managers Workshop Objectives After completing this workshop, each manager should: See the results of HR. Understand his or her responsibility for HR. Identify areas for personal involvement in the HR process. Develop specific behaviors to support and reinforce program objectives. Realize the importance of the HR function in achieving departmental, division, and company goals.

Steps to Develop a Partnership Assess the current status of partnership relationships. Identify key individuals for a partnership relationship. Learn the business. Consider a written plan. Offer assistance to solve problems. Show results of programs. Publicize partners’ accomplishments and successes.

Steps to Develop a Partnership Ask the partner to review needs. Have partner serve on an advisory committee. Shift responsibility to the partner. Invite input from the partner about key plans and programs. Ask the partner to review program objectives, content, and delivery mechanisms. Invite the partner to conduct or coordinate a program or portion of a program. Review progress and re-plan strategy.

Partnering Principles Have patience and persistence throughout the process. Follow win-win opportunities for both parties. Deal with problems and conflicts quickly. Share information regularly and purposefully. Always be honest and display the utmost integrity in all the transactions.

Partnering Principles Keep high standards of professionalism in each interaction. Give credit and recognition to the partner routinely. Take every opportunity to explain, inform, and educate. Involve managers in as many activities as possible.

Annual HR Review Agenda Review of previous year’s HR programs Methods / levels of evaluation Results achieved from programs Significant deviations from expected results Basis for determining HR needs for next year Scheduled programs Proposed methods / levels of evaluation Potential payoffs Problem areas in the HR process Concerns from top management

Action Plan for Improvement Develop a plan of implementation for improving measurement and evaluation in your organization. Consider all of the items included in this and other modules. Identify a particular time frame and key responsibilities.

Worksheet columns: Issue | Actions | Time | Responsibility
Issues:
- Perception of HR
- Needs assessment/analysis
- Objectives
- Reaction measures
- Learning measures
- Application measures
- Impact measures
- ROI measures
- Use of technology
- Communicating results
- Management influence
- Staff development
- Roles / responsibilities

International Car Rental