Lecture 5 Validity and Reliability


Lecture 5 Validity and Reliability Research Methods and Statistics

Validity and Reliability of Measurements
- Validity (accuracy): the degree to which a data collection instrument accurately measures what it is supposed to measure for a particular study.
- Reliability (consistency): the degree to which a data collection instrument consistently measures what it is supposed to measure for a particular study.

Validity
Validity concerns the appropriateness and meaningfulness of the specific inferences a researcher makes on the basis of the collected data.
Methods of collecting evidence of validity:
- Content validity
- Criterion validity

Content Validity
Content validity is the degree to which items on a data collection instrument measure an intended content area.
How to collect evidence of content validity:
- Review theory and the literature: an extensive review of how the content has been measured in the past is warranted.
- Assemble a "panel of experts" to judge your instrument.

Criterion Validity
Criterion validity is the degree to which data from your instrument are related to data from another instrument (the criterion).
Two types of criterion validity:
- Concurrent validity: the degree to which data from your instrument are related to criterion data collected at the same time.
- Predictive validity: the degree to which data from your instrument predict criterion data collected at a later time.

Criterion Validity
How to collect evidence of criterion validity:
- Determine an appropriate criterion (using theory and the literature) against which you will correlate data from your instrument.
- Measure the relationship between (correlate) your data and the criterion data, and examine the resulting validity coefficient.
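The validity coefficient described above is ordinarily a Pearson correlation between scores from your instrument and scores from the criterion. A minimal sketch in plain Python; the score lists are hypothetical, invented purely for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores: a new instrument vs. an established criterion measure
new_scores = [12, 15, 11, 18, 14, 16, 10, 17]
criterion  = [30, 35, 28, 41, 33, 38, 26, 40]

validity_coefficient = pearson_r(new_scores, criterion)
```

A coefficient near 1 would be read as strong criterion-related evidence; how high is "high enough" depends on the study and the criterion chosen.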

Reliability
Reliability is the degree to which an instrument consistently measures what it is supposed to measure.
Methods for obtaining evidence of reliability:
- Stability: test-retest reliability
- Equivalence: equivalent-forms reliability
- Internal consistency: internal-consistency reliability
Reliability is expressed numerically as a "reliability coefficient" ranging from 0 to 1 (e.g., Cronbach's alpha).

Test-Retest Reliability
Test-retest reliability is the degree to which scores from the same instrument are consistent or stable over time.
The biggest challenge: determining how long the interval between the first and second administration should be.
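Collecting test-retest evidence amounts to administering the same instrument twice and correlating the two sets of scores. A sketch under the assumption of six hypothetical respondents measured two weeks apart (data invented for illustration):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                           * sum((b - my) ** 2 for b in y))

# Hypothetical scores from the same six respondents, two weeks apart
time1 = [20, 25, 19, 30, 22, 27]
time2 = [21, 24, 18, 31, 23, 26]

test_retest_reliability = pearson_r(time1, time2)
```

The choice of a two-week interval here is arbitrary; as the slide notes, picking that interval is the hard part, since too short an interval inflates the coefficient through memory effects and too long an interval lets real change masquerade as unreliability.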

Equivalent Forms Reliability
Equivalent-forms reliability is the degree to which scores are consistent across two different forms of the same instrument.
The biggest challenge: constructing two forms of the same instrument that are essentially equivalent.

Internal Consistency Reliability
Internal-consistency reliability is the degree to which items on an instrument are consistent among themselves and with the instrument as a whole.
Two types of internal-consistency reliability:
- Split-half: divide the instrument into two equivalent halves and correlate the scores from each half.
- Coefficient alpha (Cronbach's alpha): determine how all items of the instrument relate to all other items and to the overall instrument.
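Both internal-consistency approaches can be computed directly from an item-by-respondent score matrix. The sketch below uses hypothetical data; note that the Spearman-Brown step-up applied to the split-half correlation is standard practice (it corrects for the halves being only half the test's length), though the slide does not name it:

```python
def var(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def spearman_brown(half_r):
    """Step a split-half correlation up to full-test length."""
    return 2 * half_r / (1 + half_r)

def cronbach_alpha(items):
    """Coefficient alpha; items is one score list per item, aligned by respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

# Four hypothetical items answered by five respondents
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
    [3, 4, 2, 5, 5],
]

alpha = cronbach_alpha(items)
full_length_r = spearman_brown(0.6)  # e.g., a split-half correlation of .60
```

As a sanity check, perfectly parallel items (identical score lists) yield an alpha of exactly 1, matching the 0-to-1 range of the reliability coefficient described earlier.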