Construct Validity By Michael Kotutwa Johnson Submitted October 23, 2006 AED 615 Professor Franklin

Overview Definitions of Construct Validity Types of Construct Validity How Construct Validity is Used in Research Where to Find an Example of Construct Validity References

Definition of Construct Validity “Construct validity refers to the degree to which inferences can legitimately be made from the operationalizations in your study to the theoretical constructs on which those operationalizations were based… construct validity involves generalizing from your program or measures to the concept of your program or measures. You might think of construct validity as a ‘labeling’ issue.” Source: Construct Validity.

Another Definition of Construct Validity “Construct validity is indeed the unifying concept of validity that integrates criterion and content considerations into a common framework for testing rational hypotheses about theoretically relevant relationships. This construct meaning provides a rational basis both for hypothesizing predictive relationships and for judging content relevance and representativeness.” (Messick, 1980)

Overview Definitions of Construct Validity Types of Construct Validity How Construct Validity is Used in Research Where to Find an Example of Construct Validity References

Types of Construct Validity Translation Validity (Face Validity, Content Validity) Criterion-Related Validity (Predictive Validity, Concurrent Validity, Convergent Validity)

Translation Validity Face Validity- Looking at the operationalization and judging whether, on its face, it seems like a good translation of the construct. Content Validity- Checking the operationalization against the relevant content domain for the construct.
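
A minimal sketch of this kind of content check, assuming a hypothetical content domain and a hypothetical item-to-topic mapping (none of these names come from the presentation):

```python
# Hypothetical sketch: auditing draft test items against the content domain
# they are meant to cover. Topic names and the item-to-topic mapping are
# invented for illustration.

# Topics the construct's content domain is supposed to include
domain_topics = {"budgeting", "saving", "credit", "investing"}

# Topic each draft item actually taps (hypothetical reviewer judgments)
item_topics = {
    "item_1": "budgeting",
    "item_2": "saving",
    "item_3": "saving",
    "item_4": "credit",
}

covered = set(item_topics.values())
missing = domain_topics - covered

print("Topics covered by items:", sorted(covered))
print("Domain topics with no items:", sorted(missing))  # here: ['investing']
```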

Criterion-Related Validity Predictive Validity- Assessment of the operationalization’s ability to predict something it should theoretically be able to predict. Concurrent Validity- Assessment of the operationalization’s ability to distinguish between groups that it should theoretically be able to distinguish between. Convergent Validity- The degree to which the operationalization is similar to (converges on) other operationalizations that it theoretically should be similar to.
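
A minimal sketch of how such correlational evidence might be computed, using simulated scores; the variable names (a new measure, an established measure of the same construct for convergent evidence, a later criterion for predictive evidence) are assumptions for illustration:

```python
# Hypothetical sketch: correlational evidence for convergent and predictive
# validity using simulated scores. All variable names and data are invented.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(seed=0)
n = 100

new_measure = rng.normal(size=n)                                     # operationalization being validated
similar_measure = new_measure + rng.normal(scale=0.5, size=n)        # established measure of the same construct
later_criterion = 0.6 * new_measure + rng.normal(scale=1.0, size=n)  # outcome the construct should predict

r_convergent, p_convergent = pearsonr(new_measure, similar_measure)
r_predictive, p_predictive = pearsonr(new_measure, later_criterion)

print(f"Convergent evidence: r = {r_convergent:.2f} (p = {p_convergent:.3f})")
print(f"Predictive evidence: r = {r_predictive:.2f} (p = {p_predictive:.3f})")
```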

Overview Definitions of Construct Validity Types of Construct Validity How Construct Validity is Used in Research Where to Find an Example of Construct Validity References

How Construct Validity is Used in Research Three-Step Process 1. The variable being measured is clearly defined; 2. Hypotheses, based on the theory underlying the variable, are formed about how people who possess a lot versus a little of the variable will behave in a particular situation; 3. The hypotheses are tested both logically and empirically.

For Example… Suppose a researcher interested in developing a pencil-and-paper test to measure honesty wants to use a construct-validity approach. Then what…

…Application of the Construct-Validity Three-Step Process The researcher first defines “honesty.” Next, he or she formulates a theory about how “honest” people behave as compared to “dishonest” people. Based on the theory, the researcher might hypothesize that individuals who score high on the honesty test will be more likely to attempt to locate the owner of an object they find than individuals who score low on the test. The researcher then administers the honesty test, separates the names of those who score high from those who score low, and gives all of them an opportunity to be honest.

Application Process, cont. If the researcher’s hypothesis is substantiated, more of the high scorers than the low scorers on the honesty test will attempt to contact the owner of the wallet. From this piece of evidence, inferences about the honesty of individuals, based on the scores they receive on the test, gain support. Furthermore, it is a broad array of evidence, rather than any one particular type of evidence, that is desired when establishing construct validity.
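
A minimal sketch of the empirical step in this example, assuming invented counts of high and low scorers who do or do not try to return the found wallet; a chi-square test of independence is one reasonable way to check whether the return rates differ:

```python
# Hypothetical sketch: testing the honesty-test hypothesis empirically.
# The counts of who attempted to return the wallet are invented.
from scipy.stats import chi2_contingency

#                [returned, did not return]
high_scorers = [28, 12]  # 40 people who scored high on the honesty test
low_scorers = [15, 25]   # 40 people who scored low

chi2, p_value, dof, expected = chi2_contingency([high_scorers, low_scorers])

print(f"Return rate, high scorers: {high_scorers[0] / sum(high_scorers):.0%}")
print(f"Return rate, low scorers:  {low_scorers[0] / sum(low_scorers):.0%}")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
# A reliably higher return rate among high scorers would be one piece of
# construct-validity evidence; a broad array of such evidence is what matters.
```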

Overview Definitions of Construct Validity Types of Construct Validity How Construct Validity is Used in Research Where to Find an Example of Construct Validity References

Where to Find an Example of Construct Validity Dyer, J., & Osborne, E. (1996). Effects of teaching approach on problem solving ability of agricultural education students with varying learning styles. Journal of Agricultural Education, 37(4).

References
Fraenkel, J. R., & Wallen, N. E. (2006). How to design and evaluate research in education. New York: McGraw-Hill.
Conrad, J. (Ed.). (1994). Reassessing validity threats in experiments: Focus on construct validity. San Francisco: Jossey-Bass.
Pearlman, K. (1983). Validity generalization applied to the construct validity of a broad-band examination. Washington, D.C.: U.S. Office of Personnel Management.
Messick, S. (1980). The standard problem: Meaning and values in measurement and evaluation. American Psychologist.
Dyer, J., & Osborne, E. (1996). Effects of teaching approach on problem solving ability of agricultural education students with varying learning styles. Journal of Agricultural Education, 37(4).
Construct Validity.