Decision Making with Multiple Predictors


Decision Making with Multiple Predictors

Advantages:
- More comprehensive assessment
- Bias suppression
- Better prediction
- Greater face validity

Challenges:
- Imposing information-processing demands
- Much higher technical proficiency required
- Greater cost
- Time-lapse expansion and applicant attrition during the cycle

Strategies for Selection Decision Making

- Methods of collecting predictor information: tests, interviews, work samples, application blanks, simulations (e.g., leader, group, customer)
- Methods of combining predictor information: adding, weighting, or formula- or judgment-based approaches to estimate a composite score
- Strategies for facilitating decisions: applying a decision scheme that considers the available information to produce a conclusive result

What are the best methods to collect information?

Mechanical: no use of human judgment in collecting applicant information. Example: standardized administration of a job knowledge test, cognitive ability test, or personality inventory.

Judgmental: use of human judgment in collecting applicant information. Example: an unstructured employment interview, observer-rated performance in a simulation, or dinner at Casa Rustica.

How should scores be combined to give applicants an overall score (a composite)?

Mechanical: no use of human judgment in combining applicant information. Example: entering applicant scores on interviews or tests into a pre-defined equation designed to predict job performance.

Judgmental: use of human judgment in combining applicant information. Example: reviewing applicant data and applying clinical human judgment to form an impression of applicant promise.

Sparkl* Housekeeper Selection Procedure and Applicant Score Data

                      Employer   Reading   Agreeableness   HR          Management
Characteristic        Referral   Test      Test            Interview   Interview
Maximum score         15         30        65              25
Regression weight     1          .8        .9              .7          .5
Cutoff score          7          22        50              6           10

Applicant    Selection procedure scores
Amanda       55
Dave         14, 21, 63, 11
Carlo        9, 29, 60, 8, 12
Becky        5, 24
Cindy        23, 13

Regression equation: Y = x1 + .8x2 + .9x3 + .7x4 + .5x5 (intercept omitted). The higher an applicant's score, the better the applicant's standing on the predictor.

Multiple Regression

A compensatory linear-composite method.
- Assumes an additive, linear function relating the predictors to the criterion (though regression can also accommodate non-linear functions)
- Assumes a compensatory relationship between predictors: a high score on one predictor can offset a low score on another

Advantages:
- Easily applied; robust to violations of its assumptions; allows ranking of applicants

Disadvantages:
- The compensatory model may not be appropriate
- Parameter estimates are unstable with small samples
- Requires all applicants to be measured on all predictors
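The compensatory composite can be sketched in a few lines of Python, using the regression weights from the Sparkl* table. Carlo is the only applicant with a complete set of scores in the extracted data, so his row illustrates the calculation:

```python
# Compensatory regression composite from the Sparkl* example.
# Weight order: referral, reading, agreeableness, HR interview, mgmt interview.
WEIGHTS = [1.0, 0.8, 0.9, 0.7, 0.5]

def composite(scores, weights=WEIGHTS):
    """Predicted criterion score Y = sum(b_i * x_i), intercept omitted."""
    if len(scores) != len(weights):
        raise ValueError("applicant must be measured on every predictor")
    return sum(b * x for b, x in zip(weights, scores))

carlo = [9, 29, 60, 8, 12]
print(composite(carlo))  # ≈ 97.8
```

Note how a weak referral score could be offset by strong test scores: the method only looks at the weighted sum, which is exactly why the compensatory model is sometimes inappropriate.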

Applying multiple regression to the Sparkl* data: the five selection procedures, numbered 1 through 5, correspond to x1 through x5 in the regression equation Y = x1 + .8x2 + .9x3 + .7x4 + .5x5.

Multiple Cutoffs

A minimum-threshold method.
- Assumes a non-compensatory, non-linear relationship between predictors and the criterion: deficiency on any one predictor is sufficient to justify elimination

Advantages:
- Intuitive and simple

Disadvantages:
- Identifies only the minimally competent
- Requires all applicants to be measured on all predictors
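The non-compensatory screen amounts to one boolean test per predictor; a minimal sketch using the cutoff scores from the Sparkl* table:

```python
# Cutoff order: referral, reading, agreeableness, HR interview, mgmt interview.
CUTOFFS = [7, 22, 50, 6, 10]

def passes_all_cutoffs(scores, cutoffs=CUTOFFS):
    """Non-compensatory screen: failing any one predictor eliminates the applicant."""
    return all(x >= c for x, c in zip(scores, cutoffs))

print(passes_all_cutoffs([9, 29, 60, 8, 12]))  # True: Carlo clears every cutoff
```

Because the rule is all-or-nothing, an applicant one point under a single cutoff is rejected no matter how strong the other scores are, which is why the method identifies only the minimally competent.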


Multiple Hurdles

A sequential minimum-threshold method.
- Assumes a non-compensatory, non-linear relationship between predictors and the criterion: deficiency on any one predictor is sufficient to justify elimination

Advantages:
- Intuitive, simple, and cost-effective

Disadvantages:
- Identifies only the minimally competent
- Validation difficulties because of missing data on later predictors
- The time lapse from search to hire is long
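A sketch of the sequential logic in Python. The key difference from multiple cutoffs is that predictors are administered one at a time, so a failed hurdle means later (often costlier) procedures are never given; this is also the source of the missing-data problem mentioned above. The example assumes Becky's first extracted score (5) is her employer-referral score:

```python
CUTOFFS = [7, 22, 50, 6, 10]

def hurdle_screen(take_test, cutoffs=CUTOFFS):
    """Sequential screen: administer each predictor in order; stop at the
    first failure. `take_test(i)` returns the applicant's score on predictor i."""
    scores = []
    for i, cut in enumerate(cutoffs):
        score = take_test(i)
        scores.append(score)
        if score < cut:
            return False, scores  # rejected; remaining predictors never administered
    return True, scores

# Assumed: 5 is Becky's referral score, which is below the cutoff of 7,
# so only one score is ever collected for her.
becky = [5, 24]
print(hurdle_screen(lambda i: becky[i]))  # (False, [5])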

Sparkl* Housekeeper Selection Procedure and Application Score Data Characteristic Selection Procedure Employer Referral Reading Test Agreeableness Test HR Interview Management Interview Maximum score 15 30 65 25 Regression Weight 1 .8 .9 .7 .5 Cutoff Score 7 22 50 6 10 Applicant Applicant Selection Procedure Scores Amanda 55 Dave 14 21 63 11 Carlo 9 29 60 8 12 Becky 5 24 Cindy 23 13 Sequence First Second Third Fourth Fifth Regression equation: Y = x1 + .8x2 + .9x3 + .7x4 + .5x5 (intercept omitted). The higher the applicant score, the better on the predictor.

Combination Method Compensatory minimum threshold method Assumes a “hybrid” relationship exists between subsets of predictors and criterion Initial screening on base of minimum score on all predictors subsequent selection based on linear composite score Advantages intuitive, simple, allows ranking of applicants Disadvantages requires all applicants to complete all predictors more costly than multiple hurdle approach

Sparkl* Housekeeper Selection Procedure and Application Score Data Characteristic Selection Procedure Employer Referral Reading Test Agreeableness Test HR Interview Management Interview Maximum score 15 30 65 25 Regression Weight 1 .8 .9 .7 .5 Cutoff Score 7 22 50 6 10 Applicant Applicant Selection Procedure Scores Amanda 55 Dave 14 21 63 11 Carlo 9 29 60 8 12 Becky 5 24 Cindy 23 13 1 2 Regression equation: Y = x1 + .8x2 + .9x3 + .7x4 + .5x5 (intercept omitted). The higher the applicant score, the better on the predictor.

Profile Matching Prototype matching method Assumes non-linear profile among existing employees generalizes to applicants Profile of Ringers significantly different from that of non-Ringers correlation and squared difference (D2) methods Advantages cutoffs set with deviation from ideal; allows ranking based on proximity Disadvantages requires all applicants to complete all predictors assumes one best profile exists; precludes others validity of separate predictors is overlooked
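The squared-difference index is simply the squared Euclidean distance between an applicant's profile and the ideal ("Ringer") profile; smaller D² means a closer match. A minimal sketch — the ideal profile here is a hypothetical illustration, not taken from the slides:

```python
def d_squared(applicant, ideal):
    """Squared-difference (D^2) profile-similarity index: smaller is a closer
    match to the ideal profile."""
    return sum((a - i) ** 2 for a, i in zip(applicant, ideal))

ideal = [10, 25, 58, 8, 12]  # hypothetical Ringer profile for illustration
print(d_squared([9, 29, 60, 8, 12], ideal))  # Carlo: 1 + 16 + 4 + 0 + 0 = 21
```

Because only distance from the single ideal profile matters, D² treats scoring above the ideal the same as scoring below it, which is one way the method overlooks the validity of the separate predictors.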


Criterion-related Evidence for S*EEP Measures

4. Cut scores must be rationally or empirically established.

Test Fairness of S*EEP Reading Test Scores

[Figure: reading test scores predicting ratings, by group — males r = .32, females r = .19, common regression line r = .25]

5. Test scores must demonstrate fairness in application.

Graphing of Utility

[Figure: distribution of criterion scores (Y) split at the cut score — mean Y of those hired vs. mean Y of those not hired, shown against the overall applicant mean on the criterion]

More on Utility Estimation

Other issues:
- Salary information biases judges' estimates
- The 40% rule on capitalization
- The context of the work greatly affects estimates
- Negative-utility scenarios

Utility analysis can be applied to a host of organizational interventions:
- Smoking-cessation programs
- Self-efficacy training programs in organizations

Sadacca & Campbell (1985): utility of military MOSs
- War vs. peace scenario: negative utility of privates in peacetime
- What do dollars mean in war? The units matter.
- Military simulations in the desert compared units of high-IQ (1 SD above normal) vs. normal-IQ soldiers: the high-IQ unit killed twice as many enemy soldiers and destroyed three times as many tanks
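Dollar-valued utility estimates of the kind debated above are conventionally computed with the Brogden-Cronbach-Gleser model: the gain from selection is the number hired times the validity coefficient times the dollar standard deviation of performance (SDy) times the mean standardized predictor score of those hired, minus testing costs. A minimal sketch — every number here (validity, SDy, hiring figures, cost per applicant) is an illustrative assumption, not data from the slides:

```python
def bcg_utility(n_hired, validity, sd_y, mean_z_hired,
                cost_per_applicant, n_applicants):
    """Brogden-Cronbach-Gleser utility estimate:
    delta-U = N_hired * r * SD_y * z-bar_x  -  total assessment cost."""
    gain = n_hired * validity * sd_y * mean_z_hired
    cost = cost_per_applicant * n_applicants
    return gain - cost

# Illustrative assumptions: 240 applicants screened at $50 each, 5 hired,
# validity r = .40, SDy = $8,000, mean standardized score of hires = 1.0.
print(bcg_utility(5, 0.40, 8_000, 1.0, 50, 240))  # 16,000 - 12,000 = 4,000.0
```

Note that the estimate turns negative whenever assessment costs exceed the performance gain — the "negative utility scenarios" listed above, such as privates in peacetime, where SDy in dollars is small or ill-defined.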