
1 Applied Psychology in Human Resource Management, seventh edition, Cascio & Aguinis. PowerPoint slides developed by Ms. Elizabeth Freeman, University of South Carolina Upstate. Copyright © 2011 Pearson Education, Inc., publishing as Prentice Hall.

2 Chapter 7: Validation and Use of Individual-Differences Measures

3 Isn’t it enough to have a measure that is reliable? No, it must be valid, too.

4 What is validity?

5 What is the relationship between reliability and validity?

6 Remember, at a basic level, reliability is consistency & validity is accuracy.

7 Why does HRM care? Theoretically, a measure can be reliable but have no validity.

8 ?

9 You could reliably measure surgical vocabulary but be unable to accurately measure surgical skills.

10 You could reliably disassemble a firearm but be unable to shoot a target.

11 So, reliability is necessary but not sufficient for HRM decisions. Reliability must be combined with validity.

12 Validity is enhanced by reliability. Reliability places a ceiling on validity.

13 When the reliability coefficient of a criterion is known, the validity coefficient can be statistically adjusted for the criterion's unreliability, a procedure known as "correction for attenuation in the criterion."
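
As a worked illustration (not from the slides; the function name and numbers are hypothetical), a minimal Python sketch of this correction, which divides the observed validity by the square root of the criterion reliability:

    import math

    def correct_for_criterion_attenuation(r_xy, r_yy):
        # Estimated validity if the criterion were perfectly reliable:
        # r_corrected = r_xy / sqrt(r_yy)
        return r_xy / math.sqrt(r_yy)

    # Observed validity .30 with criterion reliability .64:
    print(correct_for_criterion_attenuation(0.30, 0.64))  # about .375

Note that even the corrected value cannot exceed the square root of the predictor's own reliability, which is the "ceiling" idea from the previous slide.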

14 In other words, quality HRM decisions require knowledge of measures’ reliability and validity.

15 Caution with reliability coefficients: choose the reliability statistic carefully; the reliability statistic used will influence the magnitude of the validity coefficient.

16 Validation is a process of evaluating criterion interrelationships, not a single measure.

17 Validation processes address: 1. what is measured, and 2. how well "it" is measured.

18 Validity is a unitary concept: 1. it is not a simple exists-or-not property, 2. the evidence for it varies, 3. it exists as a matter of degree.

19 Of interest & necessity is the accuracy of the inferences made about a person based on the reliability and validity of test measures.

20 To evaluate validity there are 3 primary strategies: content-related evidence, criterion-related evidence, and construct-related evidence.

21 Content-related evidence asks whether the measurement procedure represents a fair sample of the universe of situations.

22 To HRM, content-related evidence asks: does the measurement relate to the content of the job? Jobs involve performance domains, and valid content measures reflect job performance.

23 Content-related measures become more difficult as jobs become more complex and abstract, so it is appropriate to think of content-oriented test development.

24 Content-related examples: Simple content (construct): a typing test for administrative assistants. Complex content (construct): predicting the success of a film based on its screenplay or novel.

25 Content-related legal explanations appear in Guardians Assn. of N.Y. City Police Dept. v. Civil Service Comm. of City of N.Y., 1980.

26 Content-related: the true question is not whether work content (construct) is measured but what class of content (construct) is measured.

27 To determine content-related validity, focus on test construction, not on inferences about test scores: 1. Establish a content evaluation panel, 2. Calculate a content validity index, 3. Calculate a substantive validity index, 4. Conduct a content adequacy procedure, 5. Conduct an analysis of variance (a sketch of step 2 appears below).
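
One common way to quantify step 2 is Lawshe's content validity ratio averaged over items; the slides name no specific formula, so this Python sketch is only an illustration under that assumption, with hypothetical panel counts:

    def content_validity_ratio(n_essential, n_panelists):
        # Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1
        return (n_essential - n_panelists / 2) / (n_panelists / 2)

    # A panel of 10 experts rates each item "essential" or not:
    essential_counts = [9, 7, 10, 4]            # hypothetical counts per item
    cvrs = [content_validity_ratio(n, 10) for n in essential_counts]
    cvi = sum(cvrs) / len(cvrs)                 # scale-level index as mean item CVR
    print(cvrs, round(cvi, 2))                  # [0.8, 0.4, 1.0, -0.2] 0.5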

28 For content-related validity, remember that the primary goal is predicting future performance by describing existing scores.
29 Criterion-related evidence asks whether scores on the measurement procedure are related to a criterion of job performance.

30 Criterion-related evidence tests the hypothesis that test scores are related to criterion performance.

31 Criterion-related: concurrent validity exists when a criterion measure exists at the same time as the predictor measure.

32 Criterion-related: predictive validity exists when a criterion measure becomes available after the point in time when the predictor measure is taken.

33 Criterion-related: concurrent & predictive validity differ by timing and context. Concurrent: can you do the job right now? Predictive: can you do the job in the future?

34 Criterion-related predictive study method (foundation: individual differences): 1. Measure candidates for the job, 2. Select candidates without using the results of the measurement procedure, 3. Obtain measurements of criterion performance at a later date, 4. Assess the strength of the relationship between predictor and criterion (a sketch of step 4 appears below).
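
A minimal sketch of step 4 with simulated data (the variable names and effect size are mine, purely for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 150
    predictor = rng.normal(size=n)                               # test scores at hiring
    criterion = 0.4 * predictor + rng.normal(scale=0.9, size=n)  # later job performance

    r = np.corrcoef(predictor, criterion)[0, 1]                  # validity coefficient
    print(f"observed validity coefficient r = {r:.2f}")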

35 Criterion-related predictive study issues: 1. Sample size, 2. Statistical power (Type II error), 3. Null hypothesis rejection (Type I error), 4. Magnitude of the effect.

36 Criterion-related predictive study issues: 1. Sample size effects: the larger the sample size, the greater the power; smaller samples can be accommodated statistically.

37 Criterion-related predictive study issues: 2. Statistical power (the complement of Type II error): the probability of rejecting the null when the null is false; power can be increased with larger samples and with a larger region for rejecting the null.

38 Criterion-related predictive study issues: 3. Null hypothesis rejection (Type I error, alpha): usually set at .05; alternatively, determine a desired power level and estimate the desired effect size (a small, medium, or large correlation coefficient).

39 Criterion-related predictive study issues: 4. Magnitude of the effect: predetermine a small, medium, or large correlation coefficient; detection is easier to achieve with a large sample.

40 Criterion-related predictive study issues: when power and effect size are specified, they can be used to determine the needed sample size: a larger sample is needed for greater power, a smaller sample is acceptable with larger effects, and with a fixed sample and effect, alphas can be changed (see the sketch below).
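
One standard way to turn a chosen alpha, power, and effect size into a required sample size is the Fisher z approximation for a correlation; the slides do not prescribe a method, so treat this sketch as an assumption:

    import math
    from scipy.stats import norm

    def n_for_correlation(rho, alpha=0.05, power=0.80):
        # Approximate n needed to detect correlation rho (two-tailed test)
        z_alpha = norm.ppf(1 - alpha / 2)
        z_power = norm.ppf(power)
        fisher_z = 0.5 * math.log((1 + rho) / (1 - rho))
        return math.ceil(((z_alpha + z_power) / fisher_z) ** 2 + 3)

    for effect in (0.10, 0.30, 0.50):             # small, medium, large correlations
        print(effect, n_for_correlation(effect))  # roughly 783, 85, 29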

41 Criterion-related predictive study, other issues: acknowledge small samples & continue to collect data over time; consider the time between the initial measure and the criterion performance appraisal; samples must be representative; studies should include actual applicants motivated to do well.

42 Criterion-related concurrent study method: look at predictor data and actual performance concurrently, i.e., at the same time.

43 Criterion-related concurrent study method: 1. Determine the criterion, 2. Collect successful employees' criterion measures, 3. Collect performance appraisals of successful employees, 4. Compare the relationship between the criterion measure and the appraisals, 5. Choose new applicants based on the closeness of their scores to those of successful performers.

44 The criterion-related concurrent study method may be more cost-effective than a traditional predictive validity study, but it does not account for the effects of motivation & job experience.

45 Criterion-related concurrent studies appear useful for cognitive ability test measures, but they are not interchangeable with predictive studies; one must consider the situations surrounding the study and uncontrolled variables.

46 Criterion requirements for predictive & concurrent studies: * sensitivity to random errors, * dependably indicate differences, * freedom from contamination, * performance criteria must be collected independently of predictor criteria.

47 Factors affecting obtained validity coefficients: range enhancement, range restriction, position in the employment process, and the form of the predictor-criterion relationship.

48 Factors affecting obtained validity coefficients, range enhancement: validity will appear falsely high if the validation group is broader than the applicant pool; for example, validating with machinists, mechanics, tool crib attendants, and engineers, but predicting for engineers only.

49 Factors affecting obtained validity coefficients, range restriction: validity appears too low if either the predictor or the criterion is limited in range. Direct range restriction: the measure is used to select prior to validation. Indirect range restriction: experimental predictors are administered but not included in the employment decision.

50 Factors affecting obtained validity coefficients, range restriction: may also occur when predictor selection occurs at hiring and criterion selection occurs while on the job.

51 Factors affecting obtained validity coefficients: to interpret validity given range restrictions, apply the appropriate statistic by determining 1. whether restriction exists for the predictor, the criterion, or a third variable, 2. whether unrestricted variances for the relevant variables exist, 3. whether the third variable is measured or unmeasured (a sketch of the simplest correction appears below).
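
For the simplest case, direct restriction on the predictor with the unrestricted predictor standard deviation known, a classic univariate correction (Thorndike's Case 2) can be sketched as follows; the numbers are hypothetical:

    import math

    def correct_direct_range_restriction(r, sd_unrestricted, sd_restricted):
        # Thorndike Case 2: estimate the validity in the unrestricted group
        u = sd_unrestricted / sd_restricted
        return (u * r) / math.sqrt(1 + r * r * (u * u - 1))

    # Restricted validity .25; applicant-pool SD twice the incumbent SD:
    print(round(correct_direct_range_restriction(0.25, 2.0, 1.0), 2))  # 0.46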

52 Factors affecting obtained validity coefficients: to interpret validity given range restrictions where unrestricted variances are unknown, one may use a multivariate correction formula and/or the RANGEJ computer program.

53 Factors affecting obtained validity coefficients, position in the employment process: variance is greater during selection; variance is restricted during later employment processes.

54 Factors affecting obtained validity coefficients, form of the predictor-criterion relationship: normal distributions are assumed, predictor-criterion relationships are assumed linear, and criterion variance is assumed equal across predictor segments (columns); a simple check is sketched below.
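
A quick, informal way to eyeball the equal-variance ("column") assumption is to split the predictor into segments and compare the criterion's spread within each; this simulated sketch is only illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(size=400)                         # predictor
    y = 0.4 * x + rng.normal(scale=0.9, size=400)    # criterion

    # Split the predictor into quartile "columns" and compare criterion SDs;
    # roughly equal SDs are consistent with the equal-variance assumption.
    cuts = np.quantile(x, [0.25, 0.5, 0.75])
    groups = np.digitize(x, cuts)
    for g in range(4):
        print(g, round(y[groups == g].std(ddof=1), 2))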

55 Summarizing content & criterion-related validity: content: items cover the intended domain; criterion: empirical relationships between predictor & criterion. The next consideration is construct validity.

56 Construct-related evidence focuses on the trait the measure assesses, which allows the interpretation of scores.

57 Construct-related evidence method: 1. State hypotheses, 2. Define traits nomologically, 3. Ask test takers about their strategies, 4. Analyze the internal consistency of items, 5. Consult behavioral domain experts, 6. Correlate procedures, 7. Factor analyze groups of procedures, 8. Use structural equation modeling, 9. Consider scores' discrimination, 10. Demonstrate relationships, 11. Assess convergent & discriminant validity (a sketch of step 11 appears below).
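
As a toy illustration of step 11 (simulated data, names mine): two measures of the same trait should correlate highly (convergent evidence), while measures of different traits should correlate weakly (discriminant evidence):

    import numpy as np

    rng = np.random.default_rng(2)
    trait = rng.normal(size=200)                        # latent trait
    test_a = trait + rng.normal(scale=0.5, size=200)    # measure 1 of the trait
    test_b = trait + rng.normal(scale=0.5, size=200)    # measure 2 of the trait
    other = rng.normal(size=200)                        # measure of a different trait

    print(round(np.corrcoef(test_a, test_b)[0, 1], 2))  # convergent: high
    print(round(np.corrcoef(test_a, other)[0, 1], 2))   # discriminant: near zero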

58 Cross-validation defined: when scores from one sample accurately predict outcomes for other samples of the same population or for the whole population; it is often assumed to exist but needs verification.

59 A cross-validation issue to address is the phenomenon of shrinkage: shrinkage occurs when weighted predictors from one sample are applied to another sample; it is large when the initial sample is small, when test items are not relevant, and when there are many predictors.

60 Cross-validation methods: Empirical: apply a regression model to one sample, then apply the same model to a second sample. Statistical: adjust the multiple correlation coefficient as a function of sample size and the number of predictors (see the sketch below).
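
One widely used "statistical" adjustment is a Wherry-type shrinkage formula for R-squared; the slides name no specific formula, so treat this as an assumption:

    def wherry_adjusted_r2(r2, n, k):
        # Adjust R^2 for sample size n and k predictors:
        # R^2_adj = 1 - (1 - R^2) * (n - 1) / (n - k - 1)
        return 1 - (1 - r2) * (n - 1) / (n - k - 1)

    # R^2 = .30 from n = 60 employees and 5 predictors:
    print(round(wherry_adjusted_r2(0.30, 60, 5), 2))  # about .24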

61 Cross-validation comparisons: Empirical: costly and may not yield better results than statistical. Statistical: should be recalculated annually, so the numbers reflect changes in values and changes to job needs.

62 Gathering validity evidence, synthetic validity: job analysis based; infers validity for jobs by considering job analysis elements as applied to the job position of interest; offers legal acceptance and feasibility.

63 Gathering validity evidence, job analysis tools: the Position Analysis Questionnaire (PAQ) and the General Aptitude Test Battery (GATB).

64 Gathering validity evidence, test transportability: using a test developed elsewhere requires the results of the other study's criterion-related study, results of test fairness, documented job similarities, and documented applicant similarities.

65 Gathering validity evidence, validity generalization: meta-analyses focused on testing the situational specificity hypothesis between variables.

66 Gathering validity evidence, meta-analyses: statistical comparisons of the relationship between two variables across studies and of the variability of this relationship across studies.

67 Gathering validity evidence: small organizations with few positions cannot conduct full validation studies, but they can apply synthetic validity, test transportability, & validity generalization.

68 Gathering validity evidence, validity generalization (VG) study: 1. Obtain the validity coefficient for each study and compute the mean, 2. Calculate the variance, 3. Correct the mean and variance, 4. Compare the corrected standard deviation, 5. For large variations, analyze moderator variables (a bare-bones sketch of steps 1-4 appears below).
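
A bare-bones, Hunter-Schmidt-style sketch of steps 1-4 with hypothetical study results (the sampling-error-variance formula is a common approximation, not taken from the slides):

    import numpy as np

    rs = np.array([0.20, 0.35, 0.15, 0.30, 0.25])   # validity coefficients
    ns = np.array([80, 120, 60, 150, 100])          # study sample sizes

    r_bar = np.average(rs, weights=ns)                           # step 1: mean r
    var_obs = np.average((rs - r_bar) ** 2, weights=ns)          # step 2: variance
    var_err = np.average((1 - r_bar ** 2) ** 2 / (ns - 1), weights=ns)
    var_rho = max(var_obs - var_err, 0.0)                        # step 3: corrected variance
    print(round(r_bar, 3), round(var_rho ** 0.5, 4))             # step 4: corrected SD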

69 VG study improvements: remember that improvements can be made to VG studies, and meta-analytic results need to be considered carefully, as bias may remain.

70 All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Copyright © 2011 Pearson Education, Inc., publishing as Prentice Hall.

