1
CMNS 260: Empirical Communication Research Methods
4 - Measurement (Part 1 of 2 slideshows)
Neuman & Robson Chapters 5, 6, & (some of) 12 (p. 332-8)
Measurement is systematic observation that can be replicated. It requires:
a construct (concept)
an operational measure/instrument/tool for making empirical observations
2
Today’s Lecture
Review core notions: concepts, operational measures, empirical research, the language of variables, hypothesis testing, errors in explanation, etc.
Reliability & Validity: their relationship and types
Levels of Measurement
Scales & Indices (if time; material for this is in another slideshow)
3
Recall: The Research Process
Babbie (1995: 101)
4
“Dimensions” of Research
Purpose of Study: Exploratory; Descriptive; Explanatory
Intended Use of Study: Basic; Applied (Action, Impact, Evaluation)
Treatment of Time in Study: Cross-sectional; Longitudinal (Panel, Time series, Cohort analysis); Case Study; Trend study
Unit of Analysis: individual; family; household; artifact (media, technology)
Neuman (2000: 37)
5
The Research Wheel: steps in the research process
Choose Topic → Focus Research Question → Design Study → Collect Data → Analyze Data → Interpret Data → Inform Others (and back to Choose Topic)
Source: Neuman (1995: 12)
6
Developing research topics
7
Concepts
A concept is a symbol (image, word, practice…) plus a definition.
Definitions must be shared to have social meaning.
Concepts with more than one possible value or attribute are sometimes called variables.
8
Concept Clusters
Examples: Peer Group; Role Model; Broadcast Media; Ethnic Identity; Cultural Trauma; Collective Memory; Political Economy
9
Measurement
Measurement is systematic observation that can be replicated (by someone else).
Measures involve: concepts (constructs) and theories; measurement instruments/tools.
We must be able to recognize the concept in our observations (measures). Is the number of library holdings a valid measure of the quality of a university? (Maclean's magazine survey results, 2000)
10
From Concept to Measure
Neuman (2000: 162)
11
Variables
A variable must have more than one possible “value” or “attribute”.
Types:
dependent variable (effect)
independent variable (cause)
intervening variable
control variable
12
Causal Relationships
Causal relationships are proposed for testing (NOT assumed).
Five characteristics of a causal hypothesis:
at least 2 variables
a cause-effect relationship (the cause must come before the effect)
can be expressed as a prediction
logically linked to a research question and a theory
falsifiable
13
Errors in Explanation
14
Propositions
A proposition is a logical statement about a (causal) relationship between two variables, e.g., “Increased television watching leads to more shared family time and better communication between children and their parents.”
15
Types of Hypotheses (“hypotheses” is the plural of “hypothesis”)
Null hypothesis: predicts there is no relationship.
Direct relationship (positive correlation): e.g., more time spent studying leads to higher grades.
Indirect relationship (negative correlation): e.g., more time spent playing video games leads to lower grades (see the sketch below).
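A minimal sketch, not from the slides, of how these two relationship types show up in data, assuming Python with numpy and scipy available; the study-time and gaming numbers are invented for illustration.

```python
# Minimal sketch (invented data): the sign of a correlation coefficient
# distinguishes direct (positive) from indirect (negative) relationships.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 100

hours_studying = rng.uniform(0, 20, n)
grades_study = 50 + 2.0 * hours_studying + rng.normal(0, 5, n)   # rises with study time

hours_gaming = rng.uniform(0, 20, n)
grades_game = 90 - 1.5 * hours_gaming + rng.normal(0, 5, n)      # falls with gaming time

r_direct, _ = pearsonr(hours_studying, grades_study)
r_indirect, _ = pearsonr(hours_gaming, grades_game)
print(f"studying vs grades: r = {r_direct:+.2f}  (direct relationship)")
print(f"gaming vs grades:   r = {r_indirect:+.2f}  (indirect relationship)")
```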
16
Hypothesis Testing
17
Possible Outcomes in Testing Hypotheses (using empirical research)
support (confirm) the hypothesis
reject (not support) the hypothesis
partially confirm, or fail to support
Avoid use of the word PROVE (see the sketch below).
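A minimal Python sketch of this support/fail-to-support decision; the 0.05 significance level and the data are assumptions for illustration, not from the slides.

```python
# Hypothetical decision logic: test the null hypothesis of "no relationship"
# and report support or failure to support, never "proof".
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 0.5 * x + rng.normal(size=50)   # invented data containing a real effect

r, p_value = pearsonr(x, y)
alpha = 0.05                        # conventional significance level (an assumption)

if p_value < alpha:
    print(f"r = {r:.2f}, p = {p_value:.3f}: the data SUPPORT the hypothesis")
else:
    print(f"r = {r:.2f}, p = {p_value:.3f}: the data FAIL TO SUPPORT the hypothesis")
```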
18
Causal Diagrams
Direct relationship (positive correlation): X → Y (as X increases, Y increases)
Indirect relationship (negative correlation): X → Y (as X increases, Y decreases)
19
Spurious Association example
20
Causal Diagram
R → D and R → S, where R = racism against non-whites, D = discrimination against non-whites, S = intelligence test scores. Because R causes both D and S, any observed association between S and D is spurious. A simulation sketch follows.
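A hypothetical simulation of a spurious association, using neutral variable names (z as the common cause, x and y as the two outcomes): x and y correlate strongly, but the correlation vanishes once z is controlled for. The residual-based partial correlation used here is one common way to control for a third variable; it is not a method named on the slide.

```python
# Hypothetical simulation: a common cause z drives both x and y, so x and y
# correlate even though neither causes the other (a spurious association).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 1000
z = rng.normal(size=n)          # common cause (like R in the diagram)
x = z + rng.normal(size=n)      # first outcome (like D)
y = z + rng.normal(size=n)      # second outcome (like S)

r_raw, _ = pearsonr(x, y)

# "Control" for z by correlating the residuals after removing z's influence.
x_resid = x - np.polyval(np.polyfit(z, x, 1), z)
y_resid = y - np.polyval(np.polyfit(z, y, 1), z)
r_partial, _ = pearsonr(x_resid, y_resid)

print(f"raw correlation:    {r_raw:.2f}  (looks like a relationship)")
print(f"controlling for z:  {r_partial:.2f}  (the association was spurious)")
```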
21
Good & Bad Research Questions
22
Abstract to Concrete: From Concept to Measure
23
Reliability & Validity
Reliability: dependability. Is the indicator consistent? Does it give the same result every time?
Validity (measurement validity): how well the conceptual and operational definitions mesh with each other. Does the measurement tool measure what we think it measures?
24
Types of Validity
25
Content Validity
The measure represents all aspects of the conceptual definition of the construct: how adequately a measure covers behavior representative of the universe of behavior the test was designed to sample. Example construct: love.
26
Face & Expert Panel Validity
A judgement by a group or by the scientific community that the indicator measures the construct (conceptual definition).
Examples: socio-economic status (education, income & ?); the digital divide (differences in access to computers, the internet, broadband?…)
(Diagram: the scientific community judges whether the measure fits the construct.)
27
Criterion Validity
The validity of an indicator is verified by comparing it with another measure of the same construct in which the researcher has confidence.
Predictive validity: e.g., comparing an aptitude test with later performance measures.
Concurrent validity: e.g., comparing a new measure with an established one.
28
Construct Validity
A type of measurement validity that uses multiple indicators: the construct is a combination of measures of the same variable.
Convergent validity: positive correlation with related measures.
Discriminant validity: negative (or near-zero) correlation with measures of different variables.
A sketch of both checks follows.
29
Other Dimensions of Validity
Internal validity: no errors of logic internal to the research design.
External validity: results can be generalized.
Statistical validity: was the correct statistical methodology chosen? Were its assumptions fully met?
30
Types of Reliability
Stability: consistent over time.
Representative: consistent across different subgroups of a population.
Equivalence: consistent across multiple indicators.
Intercoder: a type of equivalence reliability (agreement between different coders); see the sketch below.
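A hypothetical sketch of intercoder reliability for two coders categorizing the same ten news stories. The slides do not name a specific statistic; Cohen's kappa is a standard chance-corrected agreement measure, and the codes here are invented.

```python
# Hypothetical sketch: intercoder reliability for two coders, same 10 stories.
from sklearn.metrics import cohen_kappa_score

coder_a = ["politics", "sports", "crime", "politics", "sports",
           "crime", "politics", "sports", "crime", "politics"]
coder_b = ["politics", "sports", "crime", "politics", "crime",
           "crime", "politics", "sports", "sports", "politics"]

# Percent agreement is intuitive but ignores agreement expected by chance.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# Cohen's kappa corrects for chance agreement.
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"percent agreement: {agreement:.2f}")
print(f"Cohen's kappa:     {kappa:.2f}")
```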
31
Improving Reliability
clearly conceptualize constructs
increase the level of measurement
use pretests and pilot studies
use multiple indicators (a sketch of one check follows)
(Diagram: independent construct A with specific indicators a1, a2, a3; dependent construct B with indicators b1, b2; is there an empirical association between the indicator sets?)
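One common check on a multiple-indicator measure is Cronbach's alpha, which is not named on the slide but fits the "use multiple indicators" advice; this sketch implements the textbook formula on invented data.

```python
# Hypothetical sketch: Cronbach's alpha as a check that multiple indicators of
# one construct hang together (higher alpha = more reliable composite).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x indicators matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each indicator
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(4)
trait = rng.normal(size=(300, 1))
items = trait + rng.normal(0, 0.8, size=(300, 4))   # 4 indicators, one construct
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```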
32
Relationship between Measurement Reliability & Validity
Reliability is necessary for validity but does not guarantee it: “necessary but not sufficient.”
A measure can be reliable but invalid (i.e., consistently measuring something other than what you think you are measuring).
Source: Neuman (2000: 171)
33
Quantitative & Qualitative “Trustworthiness”
34
Creating Measures
Measures must have response categories that are:
mutually exclusive: each possible observation fits in only one category
exhaustive: the categories cover all possibilities
Composite measures must also be unidimensional.
A sketch of the two category checks follows.
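A small illustrative sketch of checking both properties for a coding scheme, reusing the news-media response options from the nominal-measurement example later in the deck; the grouping into categories is invented.

```python
# Hypothetical sketch: verify that categories are exhaustive (every response
# fits somewhere) and mutually exclusive (no response fits more than once).
CATEGORIES = {
    "broadcast": {"television", "radio"},
    "print":     {"newspapers", "magazines"},
    "online":    {"internet"},
}

def classify(response: str) -> str:
    matches = [name for name, members in CATEGORIES.items() if response in members]
    if len(matches) == 0:
        raise ValueError(f"{response!r}: categories are not exhaustive")
    if len(matches) > 1:
        raise ValueError(f"{response!r}: categories are not mutually exclusive")
    return matches[0]

for r in ["television", "newspapers", "internet"]:
    print(r, "->", classify(r))
```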
35
Levels of Measurement
Categories (or attributes) must be exhaustive & mutually exclusive.
Relations between levels: a higher level can be collapsed into a lower one, but not vice versa.
36
Nominal Measurement
Different categories (names, labels, images); not ranked; attributes are mutually exclusive and exhaustive.
Example: “What media do you use for finding out about news?” Television / Newspapers / Radio / Magazines / Internet / Other
Babbie (1995: 137)
37
Ordinal Measurement
Different categories (mutually exclusive, exhaustive), rank-ordered: attributes indicate relatively more or less of the variable, but the distance between attributes is imprecise.
Example: “How important are newspapers as your news source?”
38
Interval Measurement
Different categories, ranked in order; can also tell the amount of difference between categories, but there is no true zero point (temperature in degrees Celsius is a standard example).
Babbie (1995: 137)
39
Ratio Measurement
Different categories, ranked in order, with a known amount of difference between categories; it is also possible to state proportions (there is a true zero).
Example: “What was your income in dollars last year?”
40
Examples
41
Continuous & Discrete Variables
Continuous variables: can take an infinite number of values; interval and ratio levels of measurement.
Discrete variables: distinct categories; nominal and ordinal levels of measurement.
The sketch below shows how the four levels map onto data types in practice.
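A hypothetical sketch mapping the four levels onto pandas data types, showing which statistics each level supports; all variable names and values are invented.

```python
# Hypothetical mapping of the four levels of measurement onto pandas types.
import pandas as pd

# Nominal: unordered categories; only counts and the mode are meaningful.
news_source = pd.Series(pd.Categorical(["tv", "radio", "internet", "tv"]))

# Ordinal: ordered categories; ranking is meaningful, distances are not.
importance = pd.Series(pd.Categorical(
    ["low", "high", "medium", "high"],
    categories=["low", "medium", "high"], ordered=True))

# Interval: numeric with equal distances but no true zero (temperature in C).
temperature_c = pd.Series([10.0, 20.0, 30.0])

# Ratio: numeric with a true zero, so proportions ("twice as much") are valid.
income = pd.Series([0.0, 40_000.0, 80_000.0])

print(news_source.mode()[0])              # nominal: mode
print(importance.max())                   # ordinal: ranking -> "high"
print(temperature_c.mean())               # interval: mean is meaningful
print(income.iloc[2] / income.iloc[1])    # ratio: 80k is twice 40k -> 2.0
```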
42
Composite Measures (continued in the second slide series)
Composite measures are instruments that use several questions to measure a given variable (construct). A composite measure can be either unidimensional or multidimensional.
Examples: indices (“indices” is the plural of “index”) and scales. A sketch of a simple additive index follows.
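A minimal sketch of a simple additive index, assuming several Likert-type items (scored 1 to 5) that tap the same construct; the “media trust” items are invented for illustration.

```python
# Hypothetical sketch: several Likert items summed into one composite index.
import pandas as pd

responses = pd.DataFrame({
    "q1_trust_tv":    [4, 2, 5, 1],
    "q2_trust_print": [3, 2, 4, 2],
    "q3_trust_web":   [4, 1, 5, 1],
})

# Unidimensional additive index: assumes all items tap the same construct
# ("media trust" here, an invented example).
responses["media_trust_index"] = responses.sum(axis=1)
print(responses)
```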