Chapter 13 Individual and Group Assessment

Presentation transcript:

Chapter 13 Individual and Group Assessment
- Complex candidate judgments
- Individual assessments
- Assessment centers
Chapter 13 Combining Multiple Assessments

ISSUES IN COMBINING PREDICTORS
Decision Models
- Additive: well known and useful
- Compensatory: a newer concept, compensatory batteries
  - What is "either/or" (multiple hurdle) vs. "if-then"?
  - What does the ADA have to do with it?
- Judgmental (typical of individual assessments) vs. statistical combination
  - Judgmental vs. statistical: which is better? (cf. Highhouse's PTC presentation)
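The additive/compensatory model and the "either/or" (multiple-hurdle) model can be contrasted in a short sketch. The candidate scores, weights, and cutoffs below are invented purely for illustration:

```python
# Two ways to combine predictor scores into a decision (invented data).
candidates = {
    "A": {"cognitive": 82, "interview": 65},
    "B": {"cognitive": 60, "interview": 90},
    "C": {"cognitive": 45, "interview": 95},
}

def compensatory(scores, weights):
    """Additive model: a high score on one predictor can offset a low one."""
    return sum(weights[k] * v for k, v in scores.items())

def multiple_hurdle(scores, cutoffs):
    """'Either/or' model: every cutoff must be cleared; surplus on one
    predictor cannot compensate for a failure on another."""
    return all(scores[k] >= c for k, c in cutoffs.items())

weights = {"cognitive": 0.6, "interview": 0.4}
cutoffs = {"cognitive": 50, "interview": 60}

composite = {name: compensatory(s, weights) for name, s in candidates.items()}
passed = {name: multiple_hurdle(s, cutoffs) for name, s in candidates.items()}
# Candidate C's strong interview cannot rescue the failed cognitive hurdle,
# even though C's additive composite is respectable.
```

The two rules can disagree: under the additive model C is a viable candidate, while under the hurdle model C is screened out at the cognitive cutoff.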

Issues in Combining Predictors: Tough Choices
- With large applicant pools and top-down selection, there is no problem.
- What about small pools of candidates for a single position?
- What factors influence the decision?

INDIVIDUAL ASSESSMENT
- Usually for executives or special positions: performance is hard to define, and few people occupy the roles.
- Why do these assessment opportunities attract charlatans?
Holistic approach (Henry Murray)
- How good is this approach? (P. Meehl, 1954)
Analytic emphasis (history)
- Approach:
  - A consultant visited clients to learn the job, organization, and context.
  - Two psychologists interviewed and rated candidates without access to data on them.
  - Projective tests were analyzed by a clinician blind to other information.
  - A test battery was developed to include two personality/interest inventories and ability tests.
  - One psychologist, the interviewer, wrote the report.
- Two other programs, EXXON and Sears: batteries included critical thinking and personality measures, with multiple correlations (MRC) of .70-.75! Executive success = forcefulness, dominance, assertiveness, confidence.
- Although valid, legal concerns from the 1950s-60s dampened research.

Individual Assessment: Criticisms
- Overconfidence in clinical judgments: true or false? (Camerer & Johnson, 1991; Highhouse, 2002)
- "Psychometrics don't apply to this type of assessment."
- "Assessors wouldn't be in business if they weren't valid."

Individual Assessments: Other Criticisms
- Individual assessment is rarely subjected to validation.
- Conclusions are often unreliable (Ryan & Sackett, 1989).
- Summaries are often influenced by one or two parts that could be done alone (they are judgments, and judgments often focus on negative and early cues).
- Great emphasis is usually placed on personality, when cognitive tests are usually more valid.
- Actual interpersonal interaction needs to be assessed, but with more than one person evaluating (assessment centers are useful here).
- It may be ethically or legally questionable to seek information not explicitly relevant to the job: "Mr. Obama, can you tell us a little about your wife, Michelle?"

To Address These Issues
- Combine evidence of relevant traits with construct-validity evidence.
- Use well-developed predictive hypotheses to dictate and justify the assessment content.
- Use more work samples (or in-baskets, SJTs) to assess interpersonal behavior.
- Use personnel records, biodata, and interview structure.
- Others?

Assessment Centers: Purposes
- Often organization-specific, reflecting specific values and practices
- For managerial roles (Thornton & Byham, 1982):
  - Early identification of potential
  - Promotion
  - Development
- For personnel decisions: the OAR (overall assessment rating)

Assessment Centers (Organization-Specific)
Purposes
- Promotion (identifying potential managers): succession planning
- Management development
Assessment Center Components (a job analysis is needed)
- Multiattribute, and should be multimethod (more than one method per attribute)
- Tests and inventories
- Exercises (performance tests, work samples)
  - In-basket
  - Leaderless group discussions: do these have problems? Confounds?
- Interviews: should a stress interview be used? When? Give an example.
Assessors
- Functions of assessors (Zedeck, 1986): observer and recorder, role player, predictor
- Assessor qualifications: SMEs, HR, psychologists
- Number of assessors: a ratio of about two candidates per assessor

Dimensions to Be Assessed (see Table 13.2)
- Dimensions are usually not defined, but should be, in behavioral terms (tied to a particular situation).
Ratings
- Replication: ratings on different predictors for the same dimension should generalize from one exercise to another. Would you predict that happens much?
Overall Assessment Ratings (OAR)
- Should the OAR be a definable attribute, or is it a composite of unrelated but valid predictors?
- Is consensus the way to go? Can you think of an example?
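One concrete reading of the OAR-as-composite question: instead of assessor consensus, compute the OAR mechanically from the dimension ratings. A minimal sketch; the dimension names, exercises, and 1-5 ratings below are invented for illustration:

```python
from statistics import mean

# Invented 1-5 ratings of three dimensions, each observed in three exercises.
ratings = {
    "decision_making":    {"in_basket": 4, "lgd": 3, "interview": 4},
    "oral_communication": {"in_basket": 2, "lgd": 4, "interview": 3},
    "planning":           {"in_basket": 5, "lgd": 3, "interview": 4},
}

# Dimension score: average the dimension's ratings across exercises
# (this is where cross-exercise replication matters).
dimension_scores = {dim: mean(by_ex.values()) for dim, by_ex in ratings.items()}

# Mechanical OAR: unweighted mean of the dimension scores. The alternative
# is a consensus discussion among assessors; the mechanical rule has the
# advantage of being explicit and reproducible.
oar = mean(dimension_scores.values())
```

In practice the choice between mechanical and consensus combination echoes the judgmental-versus-statistical debate from earlier in the chapter.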

ASSESSMENT CENTER PROBLEMS AND ISSUES
Construct Validities of Dimension Assessments
- Dimensional consistency (across exercises): the correlations should not be high, but substantial. Why?
- Results of factor analyses: are the factors in Table 13.3 defined by the exercises or by the dimensions? Is this consistent with Sackett & Dreher's (1982) findings?
- Reasons for inconsistency in dimension ratings; two viewpoints: are the dimensions relatively enduring, situationally specific, or contingent?
- Solutions?
  - The OAR
  - Maybe the dimensions are just a small number of cognitive and personality factors
  - A behavioral checklist, perhaps?
Criterion-Related Validities (review of meta-analytic studies)
- Predictive validity is higher with multiple measures.
- Validities are higher when peer evaluations are included.
- Assessor background and training moderate validity.
- Four dimensions account for most of the variance.
- Validities are higher for managerial progress than for future performance.

Point of View What is the authors’ on Assessment Center validity? What do they recommend? --behaviorally based ratings -using checklists Chapter 13 Combining Multiple Assessments