Chapter Thirteen: Measurement. Winston Jackson and Norine Verberg, Methods: Doing Social Research, 4e.



© 2007 Pearson Education Canada

A. Theoretical, Conceptual, and Operational Levels

Measurement is the "process of linking abstract concepts to empirical referents" (Carmines & Zeller). Hence, one moves from the general (theoretical level) to the specific (empirical level): for each concept, an indicator is identified. For example, what is the best way to measure, or indicate, a person's social prestige? The concepts we measure are called variables. See Figure 13.1 (next slide).

Figure 13.1 Levels in Research Design (from page 350)

Figure 13.1 shows movement from the general to the specific, from the theoretical level to the operational level; this is referred to as operationalization. At the theoretical level, concepts (e.g., socioeconomic status, alienation, job satisfaction, conformity, age, gender, poverty, political efficacy) are conceptualized. At the operational level, the researcher must create measures (or indicators) for each concept. Indicators should reflect the variable's conceptual definition.

Assessing Indicators

We assess the link between concepts and indicators by evaluating the validity and reliability of the indicators.
1. Validity: the extent to which a measure reflects a concept, reflecting neither more nor less than what is implied by the conceptual definition.
2. Reliability: the extent to which, on repeated measures, an indicator yields similar readings.

1. Validity (in Quantitative Research)

Illustration: the concept of socioeconomic status.
Conceptual definition: a "hierarchical continuum of respect and prestige".
Operational definition: annual salary.
Assessment: low validity. Salary might not capture prestige; for widows, ministers, or nuns, prestige and respect would be higher than income suggests. The measure should be congruent with the conceptual definition.

Types of Validity

Face validity: on the face of it, the measure appears to tap the concept.
Content validity: the measure reflects the dimension(s) implied by the concept.
Criterion validity, of which there are two types:
Concurrent validity: correlation of one measure with another, established measure.
Predictive validity: the measure predicts accurately.
Construct validity: the measure distinguishes participants who differ on the construct.

Validity in Experimental Design

Internal validity: the extent to which you can demonstrate that the treatment produces changes in the dependent variable.
External validity: the extent to which one can extrapolate from the study to the general population.
In qualitative research, "credibility" is the issue: the degree to which the description "rings true" to the subjects of the study, to other readers, or to other researchers.

2. Reliability (in Quantitative Methods)

Reliability refers to the extent to which, on repeated measures, an indicator yields similar readings.
Assessing the internal reliability of items used to construct an index (an index combines several items into a single score):
Split-half method: randomly split the items in two, construct an index from each half, and check whether the results correlate highly.
Internal consistency: a statistical procedure done in SPSS (described later in the chapter).
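The split-half idea can be sketched in a few lines of Python (a sketch of the general technique, not the textbook's SPSS procedure; function names are mine). The items are randomly split in two, each half is totalled per respondent, and the two half-totals are correlated. The Spearman-Brown step-up formula, a standard correction not mentioned in the slides, adjusts for the fact that each half-index is only half as long as the full index.

```python
import random

def pearson(x, y):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(rows, rng=random):
    """rows: one list of item scores per respondent.
    Randomly split the items in two, build a half-index from each,
    and correlate the two half-index totals across respondents."""
    n_items = len(rows[0])
    items = list(range(n_items))
    rng.shuffle(items)
    half_a, half_b = items[: n_items // 2], items[n_items // 2 :]
    totals_a = [sum(row[i] for i in half_a) for row in rows]
    totals_b = [sum(row[i] for i in half_b) for row in rows]
    r = pearson(totals_a, totals_b)
    # Spearman-Brown correction: estimated reliability of the full index.
    return 2 * r / (1 + r)
```

A perfectly consistent set of items (every respondent scores the same on every item) yields a reliability of 1.0 regardless of how the random split falls.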

B. Measurement Error

Researchers assume that the object being measured has two or more values (i.e., is not a constant) and that it has a "true value": the underlying exact quantity of a variable at any given time. Researchers also assume that measurement errors will always occur, because instruments are imperfect. Measurement error is any deviation from the true value.

Measurement Error (Cont'd)

Measures are made up of the following components:
MEASURE = true value +/- (SE +/- RE)
SE: systematic error, non-random error that systematically over- or under-estimates a value (e.g., systematically assigning the lowest value when a respondent does not answer).
RE: random error, random fluctuation around the true value. Random error is less problematic: it should average out over repeated measures.

1. Tips for Reducing Random and Systematic Error

1. Take the average of several measures.
2. Use several different indicators.
3. Use random sampling procedures.
4. Use sensitive measures.
5. Avoid confusion in the wording of questions or instructions.
6. Error-check data carefully.
7. Reduce subject/experimenter expectations.

C. Levels of Measurement

Levels of measurement were introduced in Chapter 8; this chapter stresses their importance for measuring concepts. The level of measurement influences which statistical procedures one can use. Three levels of measurement:
1. Nominal
2. Ordinal
3. Ratio


D. The Effects of Reduced Levels of Measurement

It is best to achieve the most precise, and highest, level of measurement possible. When lower levels are used, the results underestimate the relative importance of a variable: the greater the reduction in measurement precision, the greater the drop in correlations between variables. Precisely measured variables will therefore appear to be more important than poorly measured ones.
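A quick simulation (Python; the data are simulated for illustration, not drawn from the text) shows the attenuation: collapsing a ratio-level predictor into two crude categories lowers its observed correlation with the outcome, even though the underlying relationship is unchanged.

```python
import random

random.seed(1)

def pearson(x, y):
    """Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# A true linear relationship, with both variables at the ratio level.
x = [random.gauss(0, 1) for _ in range(5_000)]
y = [xi + random.gauss(0, 1) for xi in x]
r_ratio = pearson(x, y)

# The same predictor collapsed into a crude two-category measure:
# the observed correlation drops, understating x's importance.
x_two_cat = [1 if xi > 0 else 0 for xi in x]
r_reduced = pearson(x_two_cat, y)
```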

E. Indexes, Scales, and Special Measurement Procedures

Combining several indicators into one score results in an index or scale. While the terms are used interchangeably, an index refers to the combination of two or more indicators; a scale refers to a more complex combination of indicators in which the pattern of responses is taken into account. Indexes are routinely constructed to reflect complex variables: socioeconomic status, job satisfaction, group dynamics, social attitudes toward an issue.

1. Item Analysis

Items in an index should discriminate well. Example of test item development:
The test is graded, and students are divided into the upper and lower quartiles.
Examine performance on each question.
Select the questions that discriminate best between the two groups.
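The quartile comparison can be expressed as a discrimination index, a common item-analysis statistic: the proportion of top-quartile students who answer an item correctly minus the proportion of bottom-quartile students who do. This is a sketch in Python; the function name is mine, not the textbook's.

```python
def discrimination_index(item_correct, total_scores):
    """item_correct: 1 (correct) or 0 (incorrect) per student, one item.
    total_scores: each student's overall test score.
    Returns p(correct | top quartile) - p(correct | bottom quartile).
    Values near +1 mean the item discriminates well; values near 0
    mean it tells the strong and weak students apart not at all."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    q = max(1, len(order) // 4)
    bottom, top = order[:q], order[-q:]
    p_top = sum(item_correct[i] for i in top) / q
    p_bottom = sum(item_correct[i] for i in bottom) / q
    return p_top - p_bottom
```

An item that only the top scorers get right discriminates perfectly; an item everyone gets right discriminates not at all, however easy or hard it is.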

Table 13.1 Discrimination Ability of 100 Items: Percentage Correct for Each Item, by Quartile (from page 350)

2. Selecting Index Items

1. Review the conceptual definition. Does the concept have ranges or dimensions?
2. Develop measures for each dimension of the concept.
3. Pre-test the index: complete the index yourself, then pre-test it with target-group members.
4. Pilot test the index: use SPSS to assess internal consistency.
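The internal-consistency check the slides assign to SPSS is typically Cronbach's alpha. A hand-rolled Python version (a sketch of the standard formula, not SPSS output) makes the computation visible: alpha = (k/(k-1)) * (1 - sum of item variances / variance of the total score), where k is the number of items.

```python
def cronbach_alpha(rows):
    """rows: one list of item scores per respondent.
    Returns Cronbach's alpha, the usual internal-consistency statistic."""
    k = len(rows[0])                    # number of items in the index

    def var(values):                    # sample variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [var([row[j] for row in rows]) for j in range(k)]
    total_var = var([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When every item moves in lockstep across respondents, alpha reaches its maximum of 1.0; weakly related items pull it toward 0.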

3. The Rationale for Using Several Items in an Index

4. Likert-based Indexes

A. Tips for Constructing a Likert-based Index

The "and" alert: avoid items that mix multiple dimensions.
Place "Strongly Agree" on the right-hand side.
Use a 9-point response format.
Watch for the response-set issue.
Avoid negatives like "not"; simply use negative wording.
Vary the strength of wording to produce variation in responses.
Exercise: write items for a euthanasia index.
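One mechanical consequence of mixing positively and negatively worded items: the negatively worded ones must be reverse-scored before the index is summed, so that a high score always points the same direction. A minimal Python sketch, assuming the 9-point scale the slide suggests (function names and the scale bounds are my assumptions):

```python
def reverse_score(response, lo=1, hi=9):
    """Flip a negatively worded item on a lo..hi Likert scale,
    so 9 becomes 1, 1 becomes 9, and the midpoint stays put."""
    return lo + hi - response

def index_score(responses, reversed_items):
    """Sum the item responses into one index score, reverse-scoring
    the negatively worded items (identified by position)."""
    return sum(
        reverse_score(r) if i in reversed_items else r
        for i, r in enumerate(responses)
    )
```

For example, a respondent answering 9, 1, 8 on a three-item index whose second item is negatively worded scores 9 + 9 + 8 = 26.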


B. Evaluation of Likert-based Indexes

C. Using the Internal Consistency Approach to Selecting Index Items

Semantic Differential Procedures

A variety of anchors are used, and people place themselves or others on a continuum: shy/outgoing; bookworm/social butterfly.

5. Box 13.3

6. Magnitude Estimation Procedures

Subjects use numbers or line lengths to indicate their perceptions. The method is very good for comparisons and yields ratio-level measures: comparing liking of teachers, seriousness of crimes, liking of one community compared to another, and so on.
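A sketch of how magnitude-estimation responses become ratio-level scores (Python; dividing each estimate by the estimate given to the standard stimulus is a common scoring convention, offered here as an illustration rather than the textbook's specific procedure):

```python
def ratio_scores(estimates, standard):
    """estimates: mapping of stimulus -> a respondent's magnitude estimate.
    standard: the key of the standard (anchor) stimulus.
    Dividing by the standard's estimate yields ratio-level scores:
    a 2.0 genuinely means 'twice as much as the standard'."""
    base = estimates[standard]
    return {stimulus: value / base for stimulus, value in estimates.items()}
```

For instance, with crime-seriousness estimates anchored on a hypothetical standard of theft at 50, an assault rated 150 scores 3.0 (three times as serious as the standard).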

Tips for Using Magnitude Estimation Procedures

1. Only use magnitude estimation when a researcher is present to explain the method to respondents.
2. Use magnitude estimation when comparative judgments are sought.
3. As the standard, use a stimulus category somewhere near the middle of the range you intend to use (avoid a standard that is too high or too low).
4. After the standard is established,