Ardianto Prabowo, Indira Dwiajeng A., Maria Angela Masruroh, Susan Kuncoro

Measuring Indicators, Satisfaction, and Learning

1. Indicators
2. Satisfaction
3. Learning
4. Application
5. Business Impact
6. Return on Investment
7. Intangible Benefits

Typical indicators include:
- The number and variety of programs
- The number of employees participating in a leadership development program
- Total number of hours of learning activity per employee
- Various enrollment statistics, including demographics of participants, participation rates, completion rates, etc.

- Investment in leadership development programs, reported in a variety of ways (total cost, cost per employee, direct cost per participant, and cost as a percentage of payroll are common)
- Cost recovery, if there is a charge-back system
- The types of delivery mechanisms
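As an illustration, here is a minimal Python sketch of how a few of these indicators might be computed from basic enrollment and cost records. The program records, costs, payroll, and headcount are invented for the example, not taken from the source.

```python
# Hypothetical enrollment records for two leadership programs.
programs = [
    {"name": "Leading Teams", "enrolled": 120, "completed": 104, "hours": 16},
    {"name": "Coaching Basics", "enrolled": 85, "completed": 81, "hours": 8},
]
total_cost = 250_000.0   # assumed total investment in the programs
payroll = 12_000_000.0   # assumed annual payroll
headcount = 1_500        # assumed number of employees

participants = sum(p["enrolled"] for p in programs)
completion_rate = sum(p["completed"] for p in programs) / participants
hours_per_employee = sum(p["enrolled"] * p["hours"] for p in programs) / headcount

print(f"Programs offered:        {len(programs)}")
print(f"Participants:            {participants}")
print(f"Completion rate:         {completion_rate:.1%}")
print(f"Learning hours/employee: {hours_per_employee:.1f}")
print(f"Cost per participant:    ${total_cost / participants:,.0f}")
print(f"Cost as % of payroll:    {total_cost / payroll:.2%}")
```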

Indicators usually show the degree of management’s commitment to leadership development and provide a brief view of the mix of programs offered.

The most widely used source of reaction and satisfaction data is the program participants themselves.

Questionnaires are used to obtain subjective information about participants, as well as to objectively document measurable business results for an ROI analysis.

Surveys represent a specific type of questionnaire with several applications for measuring training success. They are used in situations where only attitudes, beliefs, and opinions are captured, whereas a questionnaire has much more flexibility and captures data ranging from attitudes to specific improvement statistics.

Surveys can have:
- Yes/no responses
- A range of responses (strongly disagree … strongly agree), typically on a five-point scale

Questionnaires can include:
- Open-ended questions
- Checklists
- Two-way questions
- Multiple-choice questions
- Ranking scales
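A minimal sketch of the five-point scale mentioned above, tabulating responses to a single survey item; the scale labels and ratings are illustrative, not from the source.

```python
from collections import Counter

# Five-point agreement scale (assumed labels).
SCALE = {1: "Strongly disagree", 2: "Disagree", 3: "Neutral",
         4: "Agree", 5: "Strongly agree"}

# Hypothetical ratings collected for one survey item.
responses = [5, 4, 4, 3, 5, 2, 4, 5]

counts = Counter(responses)
for value in sorted(SCALE):
    print(f"{SCALE[value]:<18} {counts.get(value, 0)}")
print(f"Mean rating: {sum(responses) / len(responses):.2f}")
```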

Steps in questionnaire design:
- Determine the specific information needed
- Involve management in the process
- Select the type(s) of questions
- Develop the questions
- Check the reading level
- Test the questions
- Address the anonymity issue
- Design for ease of tabulation and analysis
- Develop the completed questionnaire and prepare a data summary
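To illustrate the "design for ease of tabulation and analysis" step, here is a small sketch in which each question carries an identifier and a type so responses can be summarized automatically. The question ids, wording, and answers are hypothetical.

```python
from statistics import mean

# Questions tagged by type so the summary step needs no manual work.
questions = {
    "q1": {"text": "The content was appropriate.", "type": "scale"},
    "q2": {"text": "Would you recommend this program?", "type": "yes_no"},
}
# Hypothetical completed questionnaires.
answers = [
    {"q1": 4, "q2": True},
    {"q1": 5, "q2": True},
    {"q1": 3, "q2": False},
]

for qid, q in questions.items():
    values = [a[qid] for a in answers]
    if q["type"] == "scale":
        summary = f"mean {mean(values):.2f} / 5"
    else:  # yes_no
        summary = f"{sum(values) / len(values):.0%} yes"
    print(f"{qid}: {q['text']} -> {summary}")
```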

The most common types of feedback solicited:
- Progress with objectives. To what degree were the objectives met?
- Program content. Was the content appropriate?
- Instructional materials. Were the materials useful?
- Pre-work materials. Were the pre-work materials necessary? Helpful?
- Assignments. Were the out-of-class assignments helpful?
- Method of delivery. Was the method of delivery appropriate for the objectives?
- Instructor/facilitator. Was the facilitator effective?
- New information. How much new information was included?
- Motivation to learn. Were you motivated to learn this content?
- Relevance. Was the program relevant to your needs?
- Importance. How important is this content to the success of your job?
- Registration/logistics. Were the scheduling and registration efficient?
- Facilities. Did the facilities enhance the learning environment?
- Potential barriers. What potential barriers exist for the application of the material?
- Planned improvements/use of material. How will you apply what you have learned?
- Recommendations for target audiences. What is the appropriate audience for this program?
- Overall evaluation. What is your overall rating of the program?

Uses of reaction and satisfaction data:
- Monitor customer satisfaction.
- Identify strengths and weaknesses of the program.
- Develop norms and standards.
- Evaluate the leadership development staff.
- Evaluate planned improvements.
- Link with follow-up data.
- Market future programs.
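As a sketch of the "develop norms and standards" use, the snippet below compares a new program's mean reaction score against a norm built from past programs. The scores and the 0.3-point tolerance are assumptions for illustration only.

```python
# Mean reaction scores (out of 5) from hypothetical past programs.
historical_scores = [4.1, 3.8, 4.4, 4.0, 4.2]
norm = sum(historical_scores) / len(historical_scores)

new_program_score = 3.6  # assumed score for the program under review
gap = new_program_score - norm
# Assumed standard: flag any program more than 0.3 points below the norm.
status = "meets the norm" if gap >= -0.3 else "flag for review"
print(f"Norm: {norm:.2f}, new program: {new_program_score:.2f} ({status})")
```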

Shortcuts for measuring reaction and satisfaction:
- Use a simple questionnaire.
- Collect data early and react quickly.
- Pay attention to participants.