Validity. Face Validity  The extent to which items on a test appear to be meaningful and relevant to the construct being measured.

Similar presentations
Measurement Concepts Operational Definition: is the definition of a variable in terms of the actual procedures used by the researcher to measure and/or.

Cal State Northridge Psy 427 Andrew Ainsworth PhD
1 COMM 301: Empirical Research in Communication Kwan M Lee Lect4_1.
© 2006 The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill Validity and Reliability Chapter Eight.
VALIDITY AND RELIABILITY
Assessment Procedures for Counselors and Helping Professionals, 7e © 2010 Pearson Education, Inc. All rights reserved. Chapter 6 Validity.
General Information --- What is the purpose of the test? For what population is the test designed? Is this population relevant to the people who will take your.
Chapter 4A Validity and Test Development. Basic Concepts of Validity Validity must be built into the test from the outset rather than being limited to.
Other Measurement Validity Types. OverviewOverview l Face validity l Content validity l Criterion-related validity l Predictive validity l Concurrent.
RESEARCH METHODS Lecture 18
Chapter 4 Validity.
Test Validity: What it is, and why we care.
Concept of Measurement
Concept of Reliability and Validity. Learning Objectives  Discuss the fundamentals of measurement  Understand the relationship between Reliability and.
Lecture 7 Psyc 300A. Measurement Operational definitions should accurately reflect underlying variables and constructs When scores are influenced by other.
SELECTION & ASSESSMENT SESSION THREE: MEASURING THE EFFECTIVENESS OF SELECTION METHODS.
Personality, 9e Jerry M. Burger
Validity and Reliability EAF 410 July 9, Validity  Degree to which evidence supports inferences made  Appropriate  Meaningful  Useful.
Chapter 7 Correlational Research Gay, Mills, and Airasian
Chapter 7 Evaluating What a Test Really Measures
Validity Lecture Overview Overview of the concept Different types of validity Threats to validity and strategies for handling them Examples of validity.
Understanding Validity for Teachers
Test Validity S-005. Validity of measurement Reliability refers to consistency –Are we getting something stable over time? –Internally consistent? Validity.
Presented to: Dr. Ava Clare Marie O. Robles Class Schedule: TFr /1:00-2:30 pm Presented by: Ierine Joy L. Caserial Republic of the Philippines MINDANAO.
Measurement in Exercise and Sport Psychology Research EPHE 348.
Reliability and Validity what is measured and how well.
Instrumentation.
Educational Research: Competencies for Analysis and Application, 9 th edition. Gay, Mills, & Airasian © 2009 Pearson Education, Inc. All rights reserved.
MEASUREMENT CHARACTERISTICS Error & Confidence Reliability, Validity, & Usability.
Copyright © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 14 Measurement and Data Quality.
Unanswered Questions in Typical Literature Review 1. Thoroughness – How thorough was the literature search? – Did it include a computer search and a hand.
Principles of Test Construction
MGTO 324 Recruitment and Selections Validity II (Criterion Validity) Kin Fai Ellick Wong Ph.D. Department of Management of Organizations Hong Kong University.
MGTO 231 Human Resources Management Personnel selection II Dr. Kin Fai Ellick WONG.
MGTO 324 Recruitment and Selections Validity I (Construct Validity) Kin Fai Ellick Wong Ph.D. Department of Management of Organizations Hong Kong University.
Chapter Five Measurement Concepts. Terms Reliability True Score Measurement Error.
Tests and Measurements Intersession 2006.
Assessing the Quality of Research
6. Evaluation of measuring tools: validity Psychometrics. 2012/13. Group A (English)
Measurement Models: Exploratory and Confirmatory Factor Analysis James G. Anderson, Ph.D. Purdue University.
Measurement Validity.
Validity Validity: A generic term used to define the degree to which the test measures what it claims to measure.
Research Methodology and Methods of Social Inquiry Nov 8, 2011 Assessing Measurement Reliability & Validity.
Chapter 4 Validity Robert J. Drummond and Karyn Dayle Jones Assessment Procedures for Counselors and Helping Professionals, 6 th edition Copyright ©2006.
Copyright © 2008 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 17 Assessing Measurement Quality in Quantitative Studies.
Validity and Item Analysis Chapter 4. Validity Concerns what the instrument measures and how well it does that task Not something an instrument has or.
Validity and Item Analysis Chapter 4.  Concerns what instrument measures and how well it does so  Not something instrument “has” or “does not have”
Measurement Issues General steps –Determine concept –Decide best way to measure –What indicators are available –Select intermediate, alternate or indirect.
RESEARCH METHODS IN INDUSTRIAL PSYCHOLOGY & ORGANIZATION Session. Course: D Sociology and Industrial Psychology. Year: Sep-2009.
Chapter 6 - Standardized Measurement and Assessment
©2005, Pearson Education/Prentice Hall CHAPTER 6 Nonexperimental Strategies.
Classroom Assessment Chapters 4 and 5 ELED 4050 Summer 2007.
Measurement and Scaling Concepts
Consistency and Meaningfulness Ensuring all efforts have been made to establish the internal validity of an experiment is an important task, but it is.
ESTABLISHING RELIABILITY AND VALIDITY OF RESEARCH TOOLS Prof. HCL Rawat Principal UCON,BFUHS Faridkot.
Validity. How do they look? Math aptitude test Depression scale How many books do you usually read in a month? What is your favorite snack? Do you take.
Principles of Language Assessment
Ch. 5 Measurement Concepts.
Concept of Test Validity
Test Validity.
Evaluation of measuring tools: validity
HRM – UNIT 10 Elspeth Woods 9 May 2013
Human Resource Management By Dr. Debashish Sengupta
Week 3 Class Discussion.
Workshop questionnaire.
PSY 614 Instructor: Emily Bullock Yowell, Ph.D.
5. Reliability and Validity
Reliability and Validity of Measurement
Measurement Concepts and scale evaluation
Presentation transcript:

Validity

Face Validity  The extent to which items on a test appear to be meaningful and relevant to the construct being measured

Content-Related Evidence for Validity  The evidence that the content of a test represents the conceptual domain that the test is designed to cover  The extent to which test items or tasks cover a representative sample of the behavioural or conceptual domain they are supposed to measure

Concepts Related to Content Validity  Construct Underrepresentation: failure to capture important components of the construct  Construct-Irrelevant Variance: scores are influenced by factors irrelevant to the construct

Criterion-Related Evidence for Validity  The evidence that a test score corresponds to an accurate measure of interest, called the “criterion”  The criterion is a direct & independent measure of that which the test is supposed to predict

Validity Coefficient  Correlation between test & criterion  Rarely greater than .60  Considered satisfactory if in the range of .30 to .40

Concerns About Evaluating Validity Coefficients  Changes in cause of relationships  Meaning of criterion  Subject population  Sample size  Confusion of criterion with predictor  Restricted range of predictor & criterion  Evidence for validity generalization  Differential prediction

Construct-Related Evidence for Validity  A CONSTRUCT is a theoretical entity used to explain and organize response consistencies  A process used to establish the meaning of a test through a series of studies  Researcher simultaneously defines a construct & develops a measure of that construct  Then looks at how the measure relates to other measures, behaviours, etc.

Types of Construct-Related Evidence  Convergent evidence: the test correlates highly with other tests or behaviours that supposedly measure the same or a related construct  Discriminant evidence: the test is unique; it measures something different from what other available tests measure

Ways of Establishing Convergent Validity  Correlation with measure designed to measure same or related construct  Age differentiation  Experimental interventions  Known groups
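The known-groups method above can be illustrated with a standardized mean difference: a measure with construct validity should clearly separate groups that theory says differ on the construct. A sketch using Cohen's d with a pooled standard deviation (the scale scores and group labels are hypothetical):

```python
import math
import statistics

def cohens_d(group_a, group_b):
    # Standardized mean difference between two known groups (pooled SD)
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical anxiety-scale scores: clinical sample vs. community sample
clinical = [28, 31, 25, 34, 29, 30]
community = [14, 18, 12, 16, 15, 17]

d = cohens_d(clinical, community)
print(round(d, 2))
```

A large positive d is known-groups evidence for validity; a d near zero would suggest the scale fails to capture the construct.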

Class Exercise  Using the four methods (correlation with other measures, age differentiation, experimental intervention, known groups), think of at least 2 ways to validate each of the following: A measure of reading ability A measure of “business sense” A measure of musical aptitude A measure of “social consciousness” or “social responsibility” A measure of political knowledge

Relationship Between Reliability & Validity  A test should not correlate more highly with any other variable than it correlates with itself  The maximum validity coefficient between 2 variables, r₁₂max = √(r₁₁ · r₂₂), is equal to the square root of the product of their reliabilities
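This upper bound, the square root of the product of the two reliabilities, is easy to compute directly. A minimal sketch (the reliability values are illustrative, not from any particular test):

```python
import math

def max_validity(rel_test, rel_criterion):
    # Upper bound on the validity coefficient: sqrt(r11 * r22)
    return math.sqrt(rel_test * rel_criterion)

# Illustrative reliabilities: test .81, criterion .64
# A test with reliability .81 can correlate at most ~.72
# with a criterion whose reliability is .64
print(round(max_validity(0.81, 0.64), 2))
```

This is why unreliable criterion measures cap the validity coefficients a test can achieve, regardless of how good the test itself is.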