Validity: Introduction

Reliability and Validity
[Figure: 2×2 grid crossing reliability (low vs. high) with validity (low vs. high)]

Types of Error in Repeated Measurements
[Figure: repeated measurements scattered around the true value, illustrating types of measurement error]

The Validation Sequence
"Does it measure what it's supposed to measure?"
1. First, define what you want to measure (the conceptual basis).
2. Select indicators or items that represent that topic (content validity; a sampling issue).
3. Check that the items are clear, comprehensible, and relevant (face validity, or "sensibility").
4. This produces a pool of items ready for the item-analysis stage, which involves administering the test and analyzing the responses.

Validation Sequence (2): Internal Structure
Item analysis refers to a series of checks on the performance of each item. Some fall under the heading of reliability, some under validity. Faulty items are discarded or replaced. Analyses include:
- Item distributions and missing values: an item that does not vary cannot measure anything.
- Correlations among items, perhaps with factor analysis.
- Item response theory (IRT) analyses.
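As a rough sketch of these item-analysis checks, the snippet below uses hypothetical 5-point response data (the respondents, items, and scores are invented for illustration) and pure-Python implementations of the standard formulas: per-item variance, corrected item-total correlation, and Cronbach's alpha for internal consistency.

```python
from statistics import pvariance, mean

# Hypothetical data: 5 respondents x 4 items, each scored 1-5
responses = [
    [4, 5, 4, 2],
    [3, 4, 3, 5],
    [5, 5, 4, 1],
    [2, 2, 1, 4],
    [4, 3, 4, 3],
]

k = len(responses[0])                 # number of items
items = list(zip(*responses))         # one column (tuple) per item
totals = [sum(r) for r in responses]  # total score per respondent

# 1. Item distributions: an item with zero variance measures nothing
item_vars = [pvariance(col) for col in items]

# 2. Corrected item-total correlation: each item vs. the total
#    score with that item removed (avoids inflating the correlation)
def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

item_total = [
    pearson(col, [t - v for t, v in zip(totals, col)])
    for col in items
]

# 3. Cronbach's alpha: internal-consistency reliability of the pool
alpha = k / (k - 1) * (1 - sum(item_vars) / pvariance(totals))
```

Items with near-zero variance or weak item-total correlations are the "faulty items" the slide says get discarded or replaced.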

Validation Sequence (3): External Associations
- Criterion validation, where a "gold standard" exists; sensitivity and specificity are the usual statistics.
- Correlational evidence, leading to construct validation, where there is no single, clear gold standard; correlations are the usual statistic.
- Correlations are often divided into convergent and discriminant coefficients, according to the hypothesized associations.
These analyses tend to use the entire test administered to selected samples; inadequate performance leads back to the basic design.
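For criterion validation, sensitivity and specificity come straight from a 2×2 cross-tabulation of test results against the gold standard. A minimal sketch, using invented (test, gold-standard) pairs where 1 means positive:

```python
# Hypothetical data: (test result, gold-standard diagnosis), 1 = positive
pairs = [(1, 1), (1, 1), (0, 1), (1, 0), (0, 0), (0, 0), (1, 1), (0, 0)]

tp = sum(1 for test, gold in pairs if test == 1 and gold == 1)  # true positives
fn = sum(1 for test, gold in pairs if test == 0 and gold == 1)  # false negatives
fp = sum(1 for test, gold in pairs if test == 1 and gold == 0)  # false positives
tn = sum(1 for test, gold in pairs if test == 0 and gold == 0)  # true negatives

sensitivity = tp / (tp + fn)  # proportion of true cases the test detects
specificity = tn / (tn + fp)  # proportion of non-cases correctly ruled out
```

With these invented pairs, both statistics work out to 0.75. Construct validation, by contrast, would replace this 2×2 table with a matrix of convergent and discriminant correlations.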

Validation Sequence (4): Group Discrimination
Once you have shown that the test correlates with other measures as intended, its actual performance is evaluated by rating groups of respondents. Analyses generally use representative samples.
- "Known groups": can it distinguish the well from the sick? (Similar to criterion validity.)
- Sensitivity to change over time (relevant for an evaluative measure); responsiveness.
- Do scores show ceiling or floor effects?
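Ceiling and floor effects are simple to screen for: count how many respondents score at the extremes of the scale. A sketch with invented scores on a hypothetical 0-10 scale; the 15% cut-off is a common rule of thumb, not a fixed standard.

```python
# Hypothetical scores on a 0-10 scale
scores = [10, 10, 9, 10, 8, 10, 10, 7, 10, 9]
lo, hi = 0, 10  # scale endpoints

floor_pct = scores.count(lo) / len(scores)  # share at the minimum
ceil_pct = scores.count(hi) / len(scores)   # share at the maximum

# Rule of thumb: flag an effect if >15% of respondents sit at an extreme
floor_effect = floor_pct > 0.15
ceiling_effect = ceil_pct > 0.15
```

Here 60% of the invented sample scores at the maximum, so the measure would have no room to register improvement, which is exactly why ceiling effects matter for an evaluative instrument.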

Conclusion
- Validation is rarely complete. Many instruments continue to be checked for validity 20 years after their invention; times change, and dated phrasing makes old items obsolete.
- It is long and expensive. Basic test development and validation may take years: it is not a thesis project.
- Remember: validity is about the interpretation of scores. It is a relative concept: a test is not valid or invalid in itself, but only valid or not for a particular application.
- The Viagra principle: a test intended for one purpose may prove good for an unanticipated application.