Developing a Hiring System OK, Enough Assessing: Who Do We Hire??!!

Summary of Performance-Based Hiring
 Understand performance expectations
 List attributes that predict performance
 Match attributes with selection tools
 Choose/develop each tool effectively
 Make performance-based decisions

List of Critical Attributes

Performance Attributes Matrix

Who Do You Hire??

Common Decision-Making Errors
 Switching to non-performance factors
 Succumbing to the "Tyranny of the Best"
 Reverting to "intuition" or "gut feel"

Information Overload!!
 Leads to:
  – Reverting to gut instincts
  – Mental gymnastics

Combining Information to Make Good Decisions
 "Mechanical" methods are superior to "judgment" approaches:
  – Multiple Regression
  – Multiple Cutoff
  – Multiple Hurdle
  – Profile Matching
  – High-Impact Hiring approach

Multiple Regression Approach
 Predicted job performance = a + b₁x₁ + b₂x₂ + b₃x₃
  – x = predictors; b = optimal weights
 Issues:
  – Compensatory: assumes high scores on one predictor compensate for low scores on another
  – Assumes a linear relationship between predictor scores and job performance (i.e., "more is better")
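
A minimal sketch, in Python, of how this compensatory combination might be computed. The intercept, weights, and predictor names below are hypothetical; in practice the weights would come from regressing job performance measures on predictor scores for current employees.

```python
# Minimal sketch of the compensatory multiple regression approach.
# Intercept, weights, and predictor names are hypothetical; real
# weights would be estimated from a validation study.

intercept = 1.5                                                   # a
weights = {"test": 0.04, "interview": 0.30, "work_sample": 0.25}  # b1, b2, b3

def predicted_performance(scores: dict[str, float]) -> float:
    """Predicted job performance = a + sum of b_i * x_i."""
    return intercept + sum(weights[k] * scores[k] for k in weights)

applicant = {"test": 85, "interview": 4.0, "work_sample": 3.5}
print(predicted_performance(applicant))  # a high test score can offset a weak interview
```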

Multiple Cutoff Approach
 Sets minimum scores on each predictor
 Issues:
  – Assumes a non-linear relationship between predictors and job performance
  – Assumes predictors are non-compensatory
  – How do you set the cutoff scores?
  – If an applicant fails the first cutoff, why continue testing?
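
A minimal sketch of the non-compensatory screen; the cutoff values and predictor names are hypothetical:

```python
# Minimal sketch of the multiple cutoff approach: an applicant must
# meet every minimum; a strength cannot offset a weakness.
# Cutoff values and predictor names are hypothetical.

cutoffs = {"test": 70, "interview": 3.0, "work_sample": 2.5}

def passes_all_cutoffs(scores: dict[str, float]) -> bool:
    return all(scores[k] >= cutoffs[k] for k in cutoffs)

applicant = {"test": 92, "interview": 2.8, "work_sample": 4.0}
print(passes_all_cutoffs(applicant))  # False: the interview score is below its cutoff
```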

How Do You Set Cut Scores?
 Expert judgment
 Average scores of current employees
  – Good employees for profile matching
  – Minimally satisfactory employees for cutoff models
 Empirical: linear regression

Multiple Hurdle Model (flow diagram): Test 1 → Test 2 → Interview → Background → Finalist Decision; at each stage a passing applicant moves forward and a failing applicant is rejected.

Multiple Hurdle Model
 Multiple cutoff, but with sequential use of predictors
  – If an applicant passes the first hurdle, they move on to the next
 May reduce costs, but also increases time
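
A minimal sketch of sequential hurdles; the stages, cutoffs, and score keys are hypothetical. The key property is that evaluation stops at the first failed hurdle, so later (often more expensive) assessments never run:

```python
# Minimal sketch of the multiple hurdle model: predictors are applied
# in sequence, and an applicant who fails any hurdle is rejected
# without taking the remaining assessments. All values hypothetical.

hurdles = [  # (stage name, cutoff, applicant score key)
    ("test_1", 70, "test_1"),
    ("test_2", 65, "test_2"),
    ("interview", 3.0, "interview"),
    ("background", 1, "background_ok"),
]

def evaluate(applicant: dict) -> str:
    for stage, cutoff, key in hurdles:
        if applicant[key] < cutoff:
            return f"reject at {stage}"  # later hurdles never run
    return "finalist"

print(evaluate({"test_1": 80, "test_2": 60, "interview": 4.0, "background_ok": 1}))
# -> reject at test_2
```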

Profile Matching Approach
 Emphasizes an "ideal" level of each KSA
  – e.g., too little attention to detail may produce sloppy work; too much may represent compulsiveness
 Issues:
  – Non-compensatory
  – Small errors in the profile can add up to a big mistake in the overall score
 Little evidence that it works better
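
A minimal sketch of a difference-score computation with a hypothetical ideal profile. Absolute differences are one common convention, so scoring above the ideal counts against the match just as scoring below it does:

```python
# Minimal sketch of profile matching: score each applicant by total
# distance from an "ideal" KSA profile; the smallest distance is the
# best match. Profile values are hypothetical, on a 1-5 scale.

ideal = {"attention_to_detail": 4, "assertiveness": 3, "sociability": 4}

def profile_distance(applicant: dict[str, float]) -> float:
    # Absolute differences: exceeding the ideal (e.g., compulsive
    # attention to detail) also worsens the match.
    return sum(abs(ideal[k] - applicant[k]) for k in ideal)

a = {"attention_to_detail": 5, "assertiveness": 3, "sociability": 4}
b = {"attention_to_detail": 4, "assertiveness": 1, "sociability": 3}
print(profile_distance(a), profile_distance(b))  # 1 3 -> a is the closer match
```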

How Do You Compare Finalists?
 Multiple Regression approach:
  – Y (predicted performance) score based on the formula
 Cutoff/Hurdle approach:
  – Eliminate those with scores below cutoffs
  – Then use the regression (or other formula) approach
 Profile Matching:
  – Smallest difference score is best
  – ∑(Ideal − Applicant) across all attributes
 In any case, each finalist has an overall score

Making Finalist Decisions
 Top-Down strategy:
  – Maximizes efficiency, but is also likely to create adverse impact if cognitive ability (CA) tests are used
 Banding strategy:
  – Creates "bands" of scores that are statistically equivalent (based on reliability)
  – Then hire from within bands, either randomly or based on other factors (including diversity)
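
A minimal sketch of reliability-based banding. The band width here is 1.96 times the standard error of the difference between two scores, a common convention; the standard deviation, reliability, and applicant scores are all hypothetical:

```python
# Minimal sketch of score banding: scores within a band of the top
# score are treated as statistically equivalent, given the test's
# reliability. SD, reliability, and scores are hypothetical.
import math

sd = 10.0            # standard deviation of test scores
reliability = 0.80   # reliability estimate for the test

sem = sd * math.sqrt(1 - reliability)  # standard error of measurement
sed = sem * math.sqrt(2)               # standard error of a difference
band_width = 1.96 * sed                # ~95% equivalence band

scores = {"Ana": 92, "Ben": 88, "Caro": 85, "Dev": 74}
top = max(scores.values())
band = [name for name, s in scores.items() if top - s <= band_width]
print(round(band_width, 1), band)  # 12.4 ['Ana', 'Ben', 'Caro']
```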

Applicant Total Scores

Limitations of Traditional Approach
 A "Big Business" model:
  – Large samples that allow use of statistical analysis
  – Resources to use experts for cutoff scores, etc.
  – Assumption that you're hiring lots of people from even larger applicant pools

A More Practical Approach
 Rate each attribute on each tool:
  – Desirable
  – Acceptable
  – Unacceptable
 Develop a composite rating for each attribute:
  – Combining scores from multiple assessors
  – Combining scores across different tools
  – A "judgmental synthesis" of data
 Use composite ratings to make final decisions

Improving Ratings
1. Use an intuitive rating system:
  Unacceptable – did not demonstrate levels of the attribute that would predict acceptable performance
  Acceptable – demonstrated levels that would predict acceptable performance
  Desirable – demonstrated levels that would predict exceptional performance

Categorical Decision Approach
1. Eliminate applicants with unacceptable qualifications
2. Then hire candidates with as many desirable ratings as possible
3. Finally, hire as needed from applicants with "acceptable" ratings
 – Optional: "weight" attributes by importance
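
A minimal sketch of this screen-then-rank logic, with hypothetical applicants and ratings:

```python
# Minimal sketch of the categorical decision approach: screen out
# anyone rated "unacceptable" on any attribute, then rank the rest
# by their count of "desirable" ratings. Data are hypothetical.

ratings = {  # applicant -> rating on each critical attribute
    "Ana":  {"accuracy": "desirable", "teamwork": "acceptable"},
    "Ben":  {"accuracy": "unacceptable", "teamwork": "desirable"},
    "Caro": {"accuracy": "desirable", "teamwork": "desirable"},
}

eligible = {name: attrs for name, attrs in ratings.items()
            if "unacceptable" not in attrs.values()}

ranked = sorted(eligible, reverse=True,
                key=lambda name: sum(r == "desirable"
                                     for r in eligible[name].values()))
print(ranked)  # ['Caro', 'Ana'] -- Ben was screened out at step 1
```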

Sample Decision Table

Using the Decision Table 1: More Positions than Applicants

Using the Decision Table 2: More Applicants than Positions

Numerical Decision Approach
1. Eliminate applicants with unacceptable qualifications
2. Convert ratings to a common scale
 – Obtained score / maximum possible score
3. Weight by importance of attribute and measure to develop a composite score
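
A minimal sketch of steps 2 and 3; the maximum possible scores and importance weights are hypothetical:

```python
# Minimal sketch of the numerical decision approach: normalize each
# measure to a 0-1 scale (obtained / maximum possible), then weight
# by attribute importance. Maxima and weights are hypothetical.

max_possible = {"test": 100, "interview": 5, "work_sample": 5}
importance = {"test": 0.5, "interview": 0.2, "work_sample": 0.3}  # sums to 1

def composite(scores: dict[str, float]) -> float:
    return sum(importance[k] * scores[k] / max_possible[k] for k in scores)

applicant = {"test": 84, "interview": 4.0, "work_sample": 3.5}
print(round(composite(applicant), 2))  # 0.5*0.84 + 0.2*0.80 + 0.3*0.70 = 0.79
```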

Numerical Decision Approach

Summary: Decision-Making
 Focus on critical requirements
 Focus on performance attribute ratings
  – Not overall evaluations of applicant or tool
 Eliminate candidates with unacceptable composite ratings on any critical attribute
 Then choose those who are most qualified:
  – Make offers first to candidates with the highest numbers of desirable ratings