URBP 204A QUANTITATIVE METHODS I
Social Research – Lecture I
Gregory Newmark
San Jose State University
(This lecture accords with Chapters 1, 2, 3, and 5 of Earl Babbie’s The Practice of Social Research)
How do we know anything?
Observation – our experienced knowledge
– “Ouch, this fire is hot!”
– “Pigeons can fly.”
Agreement – our accepted knowledge
– “Wait 15 minutes after eating before swimming.”
– “Red is a primary color.”
Errors of Casual Observation
Inaccurate observations
– “What color are my pants?”
Overgeneralization
– “All women are left-handed.”
Selective Observation
– “This driver is slow, probably an old person.”
Illogical Reasoning
– “The exception that proves the rule...”
Guarding against those Errors
Inaccurate observations
– Deliberate, structured measurement
Overgeneralization
– Committing to a sufficient sample of observations
– Replicating the experiment
Selective Observation
– Research design structures observations
Illogical Reasoning
– Conscious use of logic
– Peer review
Views on Reality
Pre-Modern – “We see things as they actually are.”
Modern – “We see things subjectively, but there is an objective truth out there.”
Post-Modern – “We see things subjectively, and that is truth.”
Scientific Knowledge
Logic
– Theory
Observation
– Data collection
– Data analysis
What is Research?
“a process through which we attempt to achieve systematically and with the support of data the answers to a question, the resolution of a problem, or a greater understanding of a phenomenon” (Leedy 1997)
A Dialectic of Explanation
Idiographic Explanation
– Explains a single case with excruciating detail
– “Here are the 23 reasons I personally chose SJSU.”
– Specific explanation
Nomothetic Explanation
– Explains a class of cases with economical detail
– “Here are the 4 top reasons students choose SJSU.”
– General explanation
A Dialectic of Theory
Inductive Theory
– Theorizes from specific cases to general pattern
– “San Jose is sunny. Oakland is sunny. LA is sunny. Therefore, cities in California are sunny.”
Deductive Theory
– Theorizes from general pattern to specific cases
– “Cities in California are sunny. San Jose is in California. Therefore, San Jose must be sunny.”
A Dialectic of Data
Qualitative Data
– Non-numeric information
– “The adult Martian is tall!”
Quantitative Data
– Numeric information
– “The adult Martian is 5’8”.”
A Dialectic of Focus
Macrotheory
– Deals with large aggregate entities of society
– “How do economic classes interact?”
Microtheory
– Deals with the intimate level of individuals
– “How do panhandlers and pedestrians interact?”
Theories are logical explanations
Theories prevent our being deceived by flukes
– “In theory, everyone rides the bus, so if the first four passengers are women, I won’t assume that all passengers will be.”
Theories make sense of observed patterns in a way that can suggest other possibilities
– “In theory, if we understand why people take the bus, we can design a policy to support that use.”
Theories shape and direct research efforts
– “In theory, faster travel times encourage transit. Let’s look into that!”
Rational Objectivity Reconsidered
Experience is inescapably unique
– Individual experience of ‘subjectivity’
Humans seek agreement on what’s real
– Social pursuit of ‘objectivity’
Ideas that hold up to inter-subjective scrutiny are considered real
– Problem: Scrutiny can be skewed by culture
– E.g., ignoring the experiences of subaltern groups
Hypothesis
A testable expectation about empirical reality derived from theory
– “If the theory is correct, then x will be observed.”
– Must be disconfirmable
Example:
– Theory: Crime is inversely related to income.
– Hypothesis: A lower income school district will report more crimes per capita than a higher income one.
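The theory-to-hypothesis step above can be made concrete. Below is a minimal sketch in Python of the disconfirmable comparison the hypothesis describes; the district names and the population and crime figures are invented for illustration and are not data from the lecture.

```python
# Hypothetical districts: if crime is inversely related to income, the
# lower-income district should report more crimes per capita.
lower_income_district = {"population": 40_000, "reported_crimes": 1_200}
higher_income_district = {"population": 55_000, "reported_crimes": 990}

def crimes_per_capita(district):
    """Reported crimes divided by population."""
    return district["reported_crimes"] / district["population"]

low_rate = crimes_per_capita(lower_income_district)
high_rate = crimes_per_capita(higher_income_district)
print(f"lower-income district:  {low_rate:.3f} crimes per capita")
print(f"higher-income district: {high_rate:.3f} crimes per capita")
print("consistent with hypothesis" if low_rate > high_rate else "hypothesis disconfirmed")
```

If the lower-income district did not show the higher rate, the observation would count against the theory rather than confirm it, which is what makes the expectation disconfirmable.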
Ethical Issues in Social Research
Voluntary participation
– Deception – at times the research purpose is concealed
No harm to participants
– Anonymity versus confidentiality
– Institutional review boards (IRB)
Honest analysis and reporting
– Use appropriate analytical procedures
– Disclose problems and negative results
Moving from Theory to Research
Conceptualization
– Specification of abstract terms in research
Operationalization
– Development of working (operational) definitions
– Specification of procedures (operations) for measuring a variable
Measurement
– Deliberate empirical observations to describe phenomena in terms of variable attributes
Measurement
Attributes
– Characteristic or quality of something
– “This animal is female.”
Variables
– Logical sets of attributes
– Exhaustive – every observation can be classified
– Mutually exclusive – every observation is classified only once
– “Female is an attribute that composes gender.”
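The two requirements on a variable’s attributes can be checked mechanically. The sketch below is a small Python illustration using hypothetical age brackets as the attributes of an “age group” variable; the bracket labels and cut points are assumptions made for the example.

```python
# Candidate attributes for an "age group" variable (hypothetical brackets).
brackets = {
    "youth": range(0, 18),      # ages 0-17
    "adult": range(18, 65),     # ages 18-64
    "senior": range(65, 120),   # ages 65-119
}
ages = [4, 17, 18, 44, 65, 101]  # observations to classify

def matching_attributes(age):
    """All attribute labels whose bracket contains this age."""
    return [name for name, span in brackets.items() if age in span]

# Exhaustive: every observation can be classified somewhere.
exhaustive = all(len(matching_attributes(a)) >= 1 for a in ages)
# Mutually exclusive: no observation is classified more than once.
mutually_exclusive = all(len(matching_attributes(a)) <= 1 for a in ages)
print(f"exhaustive: {exhaustive}, mutually exclusive: {mutually_exclusive}")
```

Overlapping brackets (say, 0–18 and 18–65) would fail the mutual-exclusivity check, and a missing bracket would fail exhaustiveness.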
Levels of Measurement
Nominal Measures
– Exhaustive and mutually exclusive only
– E.g., birthplace, gender, religious affiliation
Ordinal Measures
– Exhaustive and mutually exclusive
– Capable of being ranked in order
– E.g., social class, conservatism, level of satisfaction
Levels of Measurement
Interval Measures
– Exhaustive and mutually exclusive
– Capable of being ranked in order
– Distance separating attributes has fixed meaning
– E.g., temperature (C or F), credit score, GRE
Ratio Measures
– Exhaustive and mutually exclusive
– Capable of being ranked in order
– Distance separating attributes has fixed meaning
– Based on a true zero point
– E.g., temperature (K), height (ft), income ($)
Levels of Measurement
Certain analytical techniques require certain levels of measurement
– “Calculate the class’s average height (ft).” (meaningful – height is a ratio measure)
– “Calculate the class’s average birthplace.” (not meaningful – birthplace is only nominal)
Measures taken at one level can be recoded into lower levels of measurement
– “Recode heights (ft) into three ordinal categories.” (possible – ratio can be collapsed to ordinal)
– “Recode birthplaces into three ordinal categories.” (not possible – nominal is already the lowest level)
Level of measurement is determined by the analytical uses you have planned
– If you are not sure how you might use the data, aim for the highest level of measurement
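A minimal Python/pandas sketch of the operations quoted above, using a hypothetical class roster (the column names and values are invented for illustration): averaging works for the ratio-level height but not for the nominal-level birthplace, and the ratio measure can be recoded downward into ordinal categories.

```python
import pandas as pd

# Hypothetical class roster: one ratio measure and one nominal measure.
students = pd.DataFrame({
    "height_ft": [5.2, 5.9, 6.1, 5.5, 5.7],                            # ratio
    "birthplace": ["San Jose", "Oakland", "LA", "San Jose", "Fresno"],  # nominal
})

# Ratio level supports an average.
print("mean height (ft):", students["height_ft"].mean())

# Recode the ratio measure downward into three ordinal categories.
students["height_group"] = pd.cut(
    students["height_ft"],
    bins=[0, 5.4, 5.9, 10],
    labels=["short", "medium", "tall"],
)
print(students[["height_ft", "height_group"]])

# Nominal level: a mean is undefined; only counts (or the mode) make sense.
print(students["birthplace"].value_counts())
```

The reverse recoding is not available: there is no defensible ordering of birthplaces, so they cannot be promoted to an ordinal measure.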
Criteria of Measurement Quality
Precision (versus Accuracy)
[Figure: two panels, A and B]
Criteria of Measurement Quality
Precision (versus Accuracy)
Reliability
– Definition: the same technique yields the same result
– Methods: test-retest, split-half, pre-established measures
– Research workers are not always reliable
Validity
– Definition: accurately measures the concept it is intended to measure
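Two of the reliability checks named above can be illustrated numerically. The sketch below uses made-up scores; the respondent counts and item values are assumptions, and the split-half estimate applies the standard Spearman-Brown correction.

```python
import numpy as np

# Test-retest: the same instrument given twice to the same six respondents;
# reliability is the correlation between the two administrations.
time1 = np.array([12, 15, 9, 20, 14, 17])
time2 = np.array([13, 14, 10, 19, 15, 18])
test_retest = np.corrcoef(time1, time2)[0, 1]
print(f"test-retest reliability: {test_retest:.2f}")

# Split-half: a four-item scale answered by five respondents, split into two
# halves; correlate the halves, then apply the Spearman-Brown correction to
# estimate the reliability of the full-length scale.
items = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [1, 2, 1, 1],
    [4, 4, 3, 4],
])
half_a = items[:, [0, 2]].sum(axis=1)   # items 1 and 3
half_b = items[:, [1, 3]].sum(axis=1)   # items 2 and 4
r_halves = np.corrcoef(half_a, half_b)[0, 1]
split_half = 2 * r_halves / (1 + r_halves)
print(f"split-half reliability (Spearman-Brown): {split_half:.2f}")
```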
Types of Validity
Face Validity
– Indicator seems reasonable “on its face”
– Grievances as a measure of worker morale
Criterion-related Validity
– Indicator relates to some external criterion
– SAT scores as a predictor of college GPA
Construct Validity
– Indicator relates to other variables as expected within a system of theoretical relationships
– Comparing marriage satisfaction to marriage fidelity
Content Validity
– Indicator covers the range of meanings associated with a concept
– Planning licensing exam covers all the planning skills
Reliability vs. Validity
More reliable: Quantitative, nomothetic
More valid: Qualitative, idiographic