Chapter 3: Designing Research Concepts, Hypotheses, and Measurement
Research Design
You must create a research design, and the design must start with a research question. Research questions are composed of concepts.
Stages of Research
1) Developing concepts
2) Operationalization
3) Selection of research method(s)
4) Sampling strategy
5) Data collection 'plan'
6) Analyses
7) Results and writing
You also need to consider budget issues.
Operationalization
In survey research, it is critical to understand how to move from ideas to concepts to variables; this process is operationalization.
Concepts
Concept (p. 35): an idea, a general mental formulation summarizing specific occurrences.
A label we put on a phenomenon, a matter, a "thing" that enables us to link separate observations, make generalizations, and communicate and inherit ideas.
Concepts can be concrete or abstract, tangible or intangible.
Concrete: height, major
Abstract: happiness, love
Transferring Concepts into Something Measurable
Variable: a representation of a concept in its variation of degree, variety, or occurrence; a characteristic of a thing that can assume varying degrees or values.
A characteristic with a fixed meaning is a constant.
Most variables are truly variable, taking on multiple categories or values.
Example: Concept and Variable
Concept: political participation
Variables: whether a person voted; how many times a person has voted; what party a person votes for
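To make the concept-to-variable step concrete, here is a minimal sketch in Python with hypothetical respondent data (the records and party names are invented for illustration), showing how the single concept "political participation" becomes three distinct variables:

```python
# Hypothetical survey records: one concept (political participation),
# three variables that each capture a different aspect of it.
respondents = [
    {"id": 1, "voted": True,  "times_voted": 6, "party": "Party A"},
    {"id": 2, "voted": False, "times_voted": 0, "party": None},
    {"id": 3, "voted": True,  "times_voted": 2, "party": "Party B"},
]

for r in respondents:
    # "voted" is a yes/no variable, "times_voted" counts occurrences,
    # and "party" records a category; all three operationalize the same concept.
    print(r["id"], r["voted"], r["times_voted"], r["party"])
```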
How Can a Concept Be Measured?
Conceptualization: the process of coming to some agreement about the meaning of a concept.
In practice, you often move back and forth between loose ideas of what you are trying to study and a search for the word that best describes it. Sometimes you have to "make up" a name to encompass your concept.
Conceptualization
As you flesh out the pieces or aspects of a concept, you begin to see its dimensions: the terms that define subgroups of the concept.
For each dimension, you must decide on indicators, signs of the presence or absence of that dimension.
Dimensions are usually concepts themselves.
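As a small illustration of dimensions and indicators, here is a hypothetical sketch (the concept, dimensions, and indicator questions are made up for demonstration) representing them as a nested data structure:

```python
# Hypothetical concept broken into dimensions, each with indicator questions.
concept = {
    "name": "prejudice",
    "dimensions": {
        "racial prejudice": [
            "Would you be comfortable with a neighbor of a different race?",
        ],
        "religious prejudice": [
            "Would you vote for a candidate of a different religion?",
        ],
    },
}

for dimension, indicators in concept["dimensions"].items():
    # Each dimension is itself a concept, observed through its indicators.
    print(dimension, "->", indicators)
```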
Operationalizing Choices
You must operationalize: the process of converting concepts into measurable terms.
The process of creating a definition (or definitions) for a concept that can be observed and measured.
The development of specific research procedures that will result in empirical observations, e.g., "SES is defined as a combination of income and education, and I will measure each by…"
The development of questions (or characteristics of data in qualitative work) that will indicate a concept.
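As an illustration of the SES example above, here is a minimal, hypothetical sketch; the income cut points, education coding, and equal weighting are assumptions for demonstration, not the textbook's definition:

```python
# Hypothetical operationalization of SES as a combination of income and education.
def ses_score(income, years_of_education):
    """Return a crude SES indicator built from two measurable components."""
    # Code income into ordered brackets (assumed cut points).
    if income < 25_000:
        income_level = 1
    elif income < 75_000:
        income_level = 2
    else:
        income_level = 3
    # Code education into ordered levels (assumed cut points).
    if years_of_education < 12:
        edu_level = 1
    elif years_of_education < 16:
        edu_level = 2
    else:
        edu_level = 3
    # Combine with equal weights (an assumption, not a rule).
    return income_level + edu_level

print(ses_score(income=40_000, years_of_education=16))  # -> 5
```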
Variable Attribute Choices
Variable attributes need to be exhaustive and mutually exclusive: they must represent the full range of possible variation, and each case must fit into exactly one attribute.
The degree of precision you select depends on your research interest. Is it better to include too much or too little?
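A minimal sketch of exhaustive, mutually exclusive attributes, assuming pandas is available; the age brackets are hypothetical:

```python
import pandas as pd

ages = pd.Series([17, 23, 45, 67, 90])

# Bins cover the full possible range (exhaustive) and do not overlap (exclusive).
# right=False makes the intervals [0, 18), [18, 35), [35, 65), [65, 120).
brackets = pd.cut(
    ages,
    bins=[0, 18, 35, 65, 120],
    labels=["under 18", "18-34", "35-64", "65+"],
    right=False,
)
print(brackets.tolist())  # every age falls into exactly one category
```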
Variables
The dependent variable is the variable that the researcher measures; it is called dependent because it depends upon (is caused by) the independent variable. The independent variable is the one that the researcher manipulates.
Example: if you are studying the effects of a new educational program on student achievement, the program is the independent variable and your measures of achievement are the dependent variables.
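A small, hypothetical sketch of the education-program example (the scores are invented): the program indicator is the independent variable the researcher manipulates, and the achievement score is the dependent variable that is measured.

```python
# Hypothetical data: program membership (independent variable, manipulated)
# and achievement scores (dependent variable, measured).
program_group = [78, 85, 90, 82, 88]   # students in the new program
control_group = [75, 80, 79, 83, 77]   # students not in the program

mean_program = sum(program_group) / len(program_group)
mean_control = sum(control_group) / len(control_group)

# If achievement depends on the program, the group means should differ.
print(mean_program - mean_control)
```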
Variables
Qualitative variable: composed of categories that are not comparable in terms of magnitude.
Quantitative variable: can be ordered with respect to magnitude on some dimension.
Continuous variable: a quantitative variable that can be measured with an arbitrary degree of precision; any two points on its scale have an infinite number of values in between. It is generally measured.
Discrete variable: a quantitative variable whose values can differ only by well-defined steps, with no intermediate values possible. It is generally counted.
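A minimal sketch, using hypothetical values, of how these variable types typically appear in data:

```python
# Qualitative: categories with no magnitude (e.g., major).
major = "Sociology"

# Quantitative and discrete: counted, only whole-number steps are possible.
times_voted = 3

# Quantitative and continuous: measured, arbitrary precision in principle.
height_cm = 172.46

print(type(major), type(times_voted), type(height_cm))
```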
Levels of Measurement
Nominal
Ordinal
Interval
Ratio
Nominal Measures
Only offer a name or a label for a variable.
There is no ranking, and the categories are not numerically related.
Examples: gender; race
Ordinal Measures
Variables with attributes that can be rank ordered.
You can say one response is more or less than another, but the distance between attributes does not have meaning.
Example: lower class, middle class, upper class
Note: scales and indexes are ordinal measures, but conventions of analysis allow us to assume equidistance between attributes (if it makes logical sense), treat them like "interval" measures, and subject them to statistical tests.
Interval Measures
The distance separating attributes has meaning and is standardized (equidistant).
A value of "0" does not mean the variable is not present.
Example: score on an ACT test; a score of 50 vs. 100 does not mean one person is twice as smart as another.
Ratio Measures
Attributes of a variable have a "true zero point" that means something.
Examples: waist measurements and biceps measurements.
A true zero allows one to create meaningful ratios.
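A minimal sketch of the four levels, assuming pandas is available and reusing the slides' own examples (race, social class, ACT scores, waist measurements); the specific values are hypothetical:

```python
import pandas as pd

# Nominal: labels only, no ranking.
race = pd.Categorical(["White", "Black", "Asian"], ordered=False)

# Ordinal: rank order is meaningful, but distances between categories are not.
social_class = pd.Categorical(
    ["lower", "upper", "middle"],
    categories=["lower", "middle", "upper"],
    ordered=True,
)

# Interval: equal distances, but zero does not mean "none of it" (ACT score).
act_scores = pd.Series([21, 28, 35])

# Ratio: a true zero, so ratios are meaningful (waist measurement in inches).
waist_inches = pd.Series([30.0, 36.0])
print(waist_inches[1] / waist_inches[0])  # 36 inches is 1.2 times 30 inches
```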
Hypotheses
Hypothesis (p. 36): an untested statement that specifies a relationship between two or more variables.
Example: "Milk drinkers make better lovers."
Characteristics of a Hypothesis
States a relationship between two or more variables
Is stated affirmatively (not as a question)
Can be tested with empirical evidence
Is most useful when it makes a comparison
States how multiple variables are related
Rests on a theory or underlying logic that makes the relationship make sense
Hypotheses should be clearly stated at the beginning of a study.
You do not have to have a hypothesis to conduct research; general research questions can be enough.
Positive and Negative (Inverse) Relationships
Positive: as values of the independent variable increase, the values of the dependent variable increase.
Negative: as values of the independent variable increase, the values of the dependent variable decrease (or vice versa).
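A small numeric illustration, with made-up values, of what positive and negative relationships look like as correlation coefficients (assuming NumPy):

```python
import numpy as np

study_hours = np.array([1, 2, 3, 4, 5])

exam_score = np.array([55, 62, 70, 78, 85])   # rises as study hours rise
tv_hours   = np.array([30, 25, 20, 15, 10])   # falls as study hours rise

# Positive relationship: correlation coefficient near +1.
print(np.corrcoef(study_hours, exam_score)[0, 1])
# Negative (inverse) relationship: correlation coefficient near -1.
print(np.corrcoef(study_hours, tv_hours)[0, 1])
```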
Two-Directional Hypotheses
A more general expression of a hypothesis, and usually the default in statistical packages.
Suggests that groups are different or that concepts are related, but without specifying the exact direction of the difference.
Example: men and women trust UK security differently.
One-Directional Hypotheses
A more specific expression of a hypothesis that specifies the precise direction of the relationship between the dependent and independent variables.
Example: women have greater trust in UK security than men.
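A minimal sketch of how the two-directional and one-directional versions of the trust example might be tested, assuming SciPy 1.6 or later (which added the `alternative` argument to `ttest_ind`); the trust scores below are hypothetical, for illustration only:

```python
import numpy as np
from scipy import stats

# Hypothetical trust-in-UK-security scores (higher = more trust).
women = np.array([6.1, 5.8, 6.4, 6.0, 6.3])
men   = np.array([5.2, 5.6, 5.4, 5.9, 5.1])

# Two-directional: men and women differ (the usual default in stat packages).
print(stats.ttest_ind(women, men, alternative="two-sided"))

# One-directional: women score higher than men.
print(stats.ttest_ind(women, men, alternative="greater"))
```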
Determining the Quality of Measurement
Measurement quality is about accuracy and consistency: validity is accuracy, and reliability is consistency.
Reliability
Definition: the extent to which the same research technique, applied again to the same object (subject), will give you the same result.
Reliability does not ensure accuracy: a measure can be reliable but inaccurate (invalid) because of bias in the measure or in the data collector/coder.
Validity
Definition: the extent to which our measures reflect what we think or want them to be measuring.
Face Validity
Face validity: the measure seems to be related to what we are interested in finding out, even if it does not fully encompass the concept.
Concept = intellectual capacity
Measure = grades (high face validity)
Measure = number of close friends (low face validity)
Criterion Validity
Criterion validity (predictive validity): the measure is predictive of some external criterion.
Criterion = success in college
Measure = ACT scores (high criterion validity?)
Construct Validity
Construct validity: the measure is logically related to another variable in the way we conceptualized it to be.
Construct = happiness
Measure = financial stability
If the measure is not related to happiness, it has low construct validity.
Content Validity
Content validity: how much a measure covers a range of meanings; did you cover the full range of dimensions related to the concept?
Example: you think you are measuring prejudice, but you only ask questions about race. What about sex, religion, etc.?
Methodological Approaches, Reliability, and Validity
Qualitative research methods lend themselves to higher validity and lower reliability.
Quantitative research methods lend themselves to lower validity and higher reliability.