Chapter 13: Combining Multiple Assessments

Presentation transcript:

Chapter 13: Combining Multiple Assessments
– Combining Predictors
– Individual Assessments
– Assessment Centers

ISSUES IN COMBINING PREDICTORS
Decision Models
– Additive: well known and useful
– Compensatory: a new concept, compensatory batteries
  » What is "either/or" versus "if/then"?
  » What does the ADA have to do with it?
– Judgmental (for individual assessments) vs. statistical
  » Which is better, judgmental or statistical combination? (from Highhouse's PTC presentation)
(A minimal sketch contrasting these combination rules follows below.)
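The contrast the slide draws between additive/compensatory combination and "either/or" or "if/then" rules is easy to see in code. The sketch below is not from the text: the predictors, weights, and cutoffs are invented for illustration, and a weighted-sum composite stands in for any statistical (as opposed to judgmental) combination.

```python
# Sketch of two ways to combine predictor scores; all names, weights,
# and cutoffs here are hypothetical.

candidates = {
    "A": {"cognitive": 0.9, "interview": 0.4, "work_sample": 0.7},
    "B": {"cognitive": 0.6, "interview": 0.8, "work_sample": 0.8},
    "C": {"cognitive": 0.3, "interview": 0.9, "work_sample": 0.9},
}

# Additive (compensatory) model: a weighted sum, so strength on one
# predictor can offset weakness on another.
weights = {"cognitive": 0.5, "interview": 0.3, "work_sample": 0.2}

def composite(scores):
    return sum(weights[p] * scores[p] for p in weights)

# Noncompensatory ("either/or", "if/then") rule: every predictor must
# clear its own cutoff; nothing compensates for a failed hurdle.
cutoffs = {"cognitive": 0.5, "interview": 0.5, "work_sample": 0.5}

def passes_all_hurdles(scores):
    return all(scores[p] >= cutoffs[p] for p in cutoffs)

# With a large pool, top-down selection on the composite is straightforward.
for name in sorted(candidates, key=lambda c: composite(candidates[c]), reverse=True):
    print(name, round(composite(candidates[name]), 2),
          "passes hurdles" if passes_all_hurdles(candidates[name]) else "fails a hurdle")
```

In this toy data, candidate A tops the compensatory ranking yet fails a hurdle, while B clears every cutoff with a slightly lower composite; which rule is defensible is exactly the kind of question the slide raises.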

Issues in Combining Predictors
Tough Choices
– Large applicant pools and top-down selection: no problem
– What about small pools of candidates for one position?
  » What factors influence the decision?
  » What is the decoy effect (Highhouse, '96)? What is the lesson learned?
  » What do Guion & Highhouse recommend: judgment, or a "most important" criterion?

INDIVIDUAL ASSESSMENT
– Usually for executives or special positions
  » Performance is hard to define, and few people occupy the roles
  » Why do these assessment opportunities attract charlatans?
– Holistic approach (Henry Murray)
– Psychometric emphasis
  » Did you read all seven articles from Campbell et al.?
  » A consultant visited clients to learn the job, organization, and context
  » Two psychologists interviewed and rated candidates without access to data on them
  » Projective tests were analyzed by a clinician blind to other information
  » A test battery was developed to include two personality/interest inventories and ability tests
  » One psychologist, the interviewer, wrote the report
– Two other programs: EXXON and Sears
  » Batteries included critical thinking and personality measures (MRC)
  » Executive success = forcefulness, dominance, assertiveness, confidence
– Although valid, legal concerns from the 1950s and 1960s dampened research

Individual Assessment
Criticisms of Individual Assessment
– Overconfidence in clinical judgments: true or false? (Camerer & Johnson, '91; Highhouse, '02)
– Psychometrics don't apply to this type of assessment
– Assessors wouldn't be in business if they weren't valid

Individual Assessments
Other criticisms:
1. Individual assessment is rarely subjected to validation
2. Conclusions are often unreliable (Ryan & Sackett, '89); see the reliability sketch below
3. Summaries are often influenced by one or two parts, which could be used alone
   » They are judgments, and judgments often focus on negative and early cues
4. Great emphasis is usually placed on personality
   » When cognitive tests are usually more valid
5. Actual interpersonal interaction needs to be assessed
   » And it usually is not with executive/managerial candidates
6. It may be ethically or legally questionable to seek information not explicitly relevant to the job
   » "Mr. Obama, can you tell us a little about your wife, Michelle?"
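Criticism 2 ("conclusions are often unreliable") is, at bottom, a claim about agreement between assessors. The ratings and the use of a simple Pearson correlation below are illustrative assumptions, not the procedure Ryan & Sackett used.

```python
# Rough sketch: quantify inter-assessor agreement with a Pearson correlation.
# The overall ratings below (1-5 scale, eight candidates) are invented.
from statistics import correlation  # available in Python 3.10+

assessor_1 = [4, 3, 5, 2, 4, 3, 5, 1]
assessor_2 = [3, 3, 4, 4, 2, 3, 5, 2]

r = correlation(assessor_1, assessor_2)
print(f"Inter-assessor agreement: r = {r:.2f}")
# A low r means two assessors reviewing the same candidates would reach
# different conclusions, which is the unreliability being criticized.
```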

To Address These Issues
– Combine evidence of relevant traits with construct validity evidence
– Use well-developed predictive hypotheses to dictate and justify the assessment content
– Use more work samples (or in-baskets, SJTs)
– To assess interpersonal behavior:
  » Personnel records / biodata / interview structure
  » Others?

Assessment Centers
Purposes
– Often organizationally specific, to reflect specific values and practices
– For managerial assessment (Thornton & Byham, '82):
  » Early identification of potential
  » Promotion
  » Development
– For personnel decisions
– OAR (overall rating)

Assessment Centers
Assessment Center Components
– Tests and inventories
– Exercises
  » In-basket
  » Leaderless group discussions
– Interviews
Assessors
– Functions of assessors (Zedeck, '86): observer and recorder, role player, predictor
– Assessor qualifications: SMEs, HR staff, psychologists
– Number of assessors: roughly 2 candidates per assessor

Dimensions to Be Assessed
– See Table 13.2, p. 329, for dimensions and definitions
– Should be defined in behavioral terms
Ratings
– Replication: ratings on different predictors for the same dimension should generalize from one exercise to another
  » Would you predict that happens much? (See the sketch after this slide.)
Overall Assessments (OAR)
– Should be a definable attribute
  » Or is it a composite of unrelated but valid predictors?
  » Can you think of an example?
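One way to make "replication" and the OAR concrete is to lay a candidate's ratings out as a dimension-by-exercise grid, look at how much the same dimension's ratings spread across exercises, and then average into dimension scores and an overall rating. The dimensions, exercises, and numbers below are invented; treating the OAR as a simple mean is only one (statistical) option, with assessor consensus being the judgmental alternative.

```python
# Sketch: dimension x exercise ratings for one candidate, a cross-exercise
# replication check, and an OAR formed as a simple composite.
# All dimensions, exercises, and ratings are hypothetical (1-5 scale).

ratings = {
    "planning":        {"in_basket": 4, "lgd": 3, "interview": 4},
    "oral_comm":       {"in_basket": 2, "lgd": 5, "interview": 4},
    "decision_making": {"in_basket": 4, "lgd": 3, "interview": 3},
}

# Replication: does each dimension receive similar ratings across exercises?
for dim, by_exercise in ratings.items():
    values = list(by_exercise.values())
    print(f"{dim}: {values}, cross-exercise spread = {max(values) - min(values)}")

# Dimension scores = mean across exercises; OAR = mean of dimension scores.
dim_scores = {d: sum(v.values()) / len(v) for d, v in ratings.items()}
oar = sum(dim_scores.values()) / len(dim_scores)
print("Dimension scores:", {d: round(s, 2) for d, s in dim_scores.items()})
print("OAR:", round(oar, 2))
```

In this toy grid, oral_comm spreads three scale points across exercises, which is exactly the kind of weak replication the slide asks you to anticipate.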

ASSESSMENT CENTER PROBLEMS AND ISSUES
Construct Validities of Dimension Assessments
– Dimensional consistency (across exercises): rs need not be high, but they should be substantial. Why?
– Results of factor analyses
  » Are the factors (in Table 13.3, p. 333) defined by the exercises or by the dimensions?
  » Is this consistent with Sackett & Dreher's ('82) findings? (See the sketch after this slide.)
– Reasons for inconsistency in dimension ratings
  » Two viewpoints: are the dimensions relatively enduring or situationally specific? Or contingent?
– Solutions? Maybe the dimensions are just a small number of cognitive and personality factors
Criterion-Related Validities (review of meta-analytic studies)
1. Predictive validity is higher with multiple measures
2. Validities are higher when peer evaluations are included
3. Assessor background and training moderate validity
4. Four dimensions account for most of the variance
5. Validities are higher for managerial progress than for future performance
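The exercise-versus-dimension question can be framed as a multitrait-multimethod comparison: the average correlation between ratings of the same dimension in different exercises (convergence) versus the average correlation between ratings of different dimensions within the same exercise (an exercise, or "method," effect). The data set below is invented; with real ratings the same two averages can be computed, and an exercise-dominated pattern is broadly what Sackett & Dreher reported.

```python
# Sketch of a multitrait-multimethod style check on assessment center ratings.
# ratings[(dimension, exercise)] = ratings across six candidates (all invented).
from itertools import combinations
from statistics import correlation  # available in Python 3.10+

ratings = {
    ("planning",  "in_basket"): [4, 3, 5, 2, 4, 3],
    ("planning",  "lgd"):       [3, 4, 4, 3, 5, 2],
    ("oral_comm", "in_basket"): [4, 2, 5, 2, 4, 3],
    ("oral_comm", "lgd"):       [2, 4, 3, 4, 5, 1],
}

def mean_r(pairs):
    rs = [correlation(ratings[a], ratings[b]) for a, b in pairs]
    return sum(rs) / len(rs)

keys = list(ratings)
same_dim_diff_exercise = [(a, b) for a, b in combinations(keys, 2)
                          if a[0] == b[0] and a[1] != b[1]]  # convergent validity
same_exercise_diff_dim = [(a, b) for a, b in combinations(keys, 2)
                          if a[1] == b[1] and a[0] != b[0]]  # exercise effect

print("Same dimension, different exercise:", round(mean_r(same_dim_diff_exercise), 2))
print("Same exercise, different dimension:", round(mean_r(same_exercise_diff_dim), 2))
# If the second average is clearly larger, ratings are organized by exercise
# rather than by dimension -- the construct-validity puzzle the slide raises.
```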

Point of View
– What is the authors' point of view on assessment center validity?
– What do they recommend?