Figure: the research system, showing Theory, Hypotheses, Data, and Verification, linked by theory building, hypothesis generation, measurement issues, research design, sampling issues, statistical analysis, interpretation, presentation of results, generalization, culmination, and modification.
Research Strategies

"All research strategies are seriously flawed…"

McGrath, J. E., Martin, J., & Kulka, R. A. (1982). Judgment Calls in Research. Beverly Hills, CA: SAGE Publications.
A Three-Horned Dilemma

Different methods have different strengths:
– Rigor
– Relevance
– Generalizability

"Every research strategy either avoids two of the horns by an uneasy compromise but gets impaled, to the hilt, on the third horn; or it grabs the dilemma boldly by one horn, maximizing on it, but at the same time 'sitting down' (with some pain) on the other two horns." (McGrath, 1982: 74)
Dilemmatics: The McGrath Taxonomy

– Dilemmatics is the study of research choices and tradeoffs.
– There is no perfect study; research always involves tradeoffs.
– Research looks at Actors emitting Behaviors in a Context.
– We want:
  – Generalizability across the population of Actors
  – Precise measurement and control of Behavior
  – Realistic Contexts for observing actor behavior
Figure: McGrath's circumplex of eight research strategies arranged in four quadrants. Quadrant I (field strategies, maximizing relevance): field study and field experiment. Quadrant II (experimental strategies, maximizing rigor): laboratory experiment and experimental simulation. Quadrant III (respondent strategies, maximizing generalizability): judgment task and sample survey. Quadrant IV (theoretical strategies): formal theory and computer simulation.
Q I: Field Strategies (Relevance)

Field study
– No deliberate manipulation; everything is measured
– Naturally occurring setting
– Example: survey of QWL in 100 organizations

Field experiment
– Deliberate manipulation of one or more variables
– Naturally occurring setting
– Examples: the Hawthorne studies; Greenberg's work
Q II: Experimental Strategies (Rigor)

Laboratory experiment
– Deliberate manipulation of variables
– Contrived setting
– Examples: effects of communication channels on team performance; effects of feedback and goal setting on individual performance

Experimental simulation
– Deliberate manipulation of variables
– Contrived but realistic setting
– Examples: the Center for Creative Leadership's Looking Glass simulation; the Zimbardo prison experiment
– http://www.youtube.com/watch?v=JxGEmfNl-xM
Q III: Respondent Strategies (Generalizability)

Judgment tasks
– Emphasis on task/judgment selection, often with a limited number of participants
– All variables are measured
– Examples: "policy capturing" studies; the creation of the Competing Values Framework

Sample survey
– Emphasis on sample selection
– All variables are measured
– Example: the National Survey of Organizations
Q IV: Theoretical Strategies (Generalizability)

Formal theory / literature reviews
– No actual research participants
– Summarize the literature to create new models for testing (an inductive process)

Computer simulations
– No actual research participants
– All outcomes are computer-generated
– Examples: the garbage can model of decision making; Monte Carlo studies (a minimal sketch follows below)
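To make the computer-simulation strategy concrete, here is a minimal, hypothetical sketch of a Monte Carlo study in Python: it repeatedly generates data from an assumed model and summarizes how often a standard t-test rejects a true null hypothesis when its equal-variance assumption is violated. The sample sizes, standard deviations, and alpha level are illustrative assumptions, not values from the slides.

```python
# Hypothetical Monte Carlo study: how often does a pooled-variance t-test
# reject a true null hypothesis when the group variances are unequal?
# All parameter values below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_reps = 10_000          # number of simulated "studies"
n1, n2 = 20, 40          # unequal group sizes (assumed)
sd1, sd2 = 3.0, 1.0      # unequal group standard deviations (assumed)
alpha = 0.05

rejections = 0
for _ in range(n_reps):
    g1 = rng.normal(loc=0.0, scale=sd1, size=n1)    # both groups share the
    g2 = rng.normal(loc=0.0, scale=sd2, size=n2)    # same true mean (null is true)
    _, p = stats.ttest_ind(g1, g2, equal_var=True)  # classic pooled-variance t-test
    if p < alpha:
        rejections += 1

# The empirical rate typically departs from the nominal 5% in this setup.
print(f"Empirical Type I error rate: {rejections / n_reps:.3f} (nominal = {alpha})")
```

Because no human participants are involved, the conclusions are only as trustworthy as the assumed data-generating model, which is the realism tradeoff this quadrant accepts in exchange for fully controlled, broadly generalizable analyses.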
Integrative & Hybrid Strategies

Because every methodological choice is flawed or incomplete, you can decrease the effects of the tradeoffs by:
– Using different methods across studies
– Using multiple methods within a single study
– Packaging different studies with different methods together
Alternatively, Four Study Types
– Experimental
– Quasi-experimental
– Correlational or passive observational study (field)
– Single subject (case study)
Important Concepts: The Building Blocks of Research Methods
– Independent variable
– Dependent variable
– Extraneous variable
– Hypothesis
– Experiment
Causal Inference

Conditions needed for causality:
– Covariation of cause and effect
– Temporal precedence (the cause must come before the effect in time)
– Control, to rule out alternative interpretations

True experiments are best suited to inferring causality because they include the greatest degree of control.

Apply the MAXMINCON principle:
– Maximize relevant systematic variance
– Minimize irrelevant systematic variance
– Control extraneous sources of variance
What Is a TRUE Experiment?
– There must be manipulation: manipulation of a cause results in an effect
– There must be random assignment to experimental conditions
– There must be control of extraneous variables (a simulated illustration follows below)
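As a concrete illustration of these requirements, the hypothetical Python sketch below simulates a true experiment: the treatment is manipulated, participants are randomly assigned to conditions, and an extraneous variable (prior ability) is handled by randomization rather than by participants' own choices. All variable names, sample sizes, and effect sizes are invented for illustration.

```python
# Hypothetical simulation of a true experiment: manipulation + random assignment.
# Variable names, sample sizes, and effect sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200

ability = rng.normal(100, 15, size=n)            # extraneous variable (not manipulated)
treatment = rng.permutation([0, 1] * (n // 2))   # random assignment to two conditions

true_effect = 5.0                                # assumed causal effect of the manipulation
performance = 0.4 * ability + true_effect * treatment + rng.normal(0, 5, size=n)

# Random assignment makes the groups comparable on the extraneous variable...
print("Mean ability (control, treatment):",
      ability[treatment == 0].mean().round(1),
      ability[treatment == 1].mean().round(1))

# ...so the simple group difference recovers the manipulated effect.
diff = performance[treatment == 1].mean() - performance[treatment == 0].mean()
print("Observed treatment effect:", round(diff, 2), "(true effect = 5.0)")
```

Contrast this with a field setting in which, say, more able people self-select into the treatment: the same group comparison would then confound ability with the manipulation, which is exactly what random assignment rules out.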
Experiment

Advantages
– High degree of control
– Strong inference of causality
– Measurement of behavior is precise
– Laboratory experiments can often be replicated easily

Disadvantages
– Low realism
– Generally low external validity
– Some phenomena cannot be studied in a laboratory
– Some variables may have a weaker (or stronger) impact in the lab than they would in a natural environment
Quasi-Experiment

Advantages
– High realism
– Greater external validity
– Moderate degree of control
– Moderate to high inference of causality

Disadvantages
– Internal validity may be compromised
– External validity may be compromised
– Measurement may be imprecise
– It may be difficult to get people to agree to participate
– It is often difficult to get access to many field settings
Correlational Field Study

Advantages
– Realistic setting
– Data on a large number of variables can be collected
– The researcher's impact on the study is often lower
– Allows exploration of contextual effects

Disadvantages
– Causality is difficult to assess (see the sketch below)
– Internal validity may be compromised
– External validity may be compromised
– Organizations may not agree to participate
– Measurement of variables is less precise than in the lab
– Low response rates are common
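To illustrate why causality is so hard to assess in correlational field data, the hypothetical sketch below generates data in which an unmeasured third variable drives both the presumed IV and the DV; the two end up substantially correlated even though neither causes the other. The variable names (organization size, formalization, turnover) and coefficients are invented for illustration.

```python
# Hypothetical illustration: a confound produces correlation without causation.
# Variable names and coefficients are invented for illustration only.
import numpy as np

rng = np.random.default_rng(7)
n = 500

org_size = rng.normal(0, 1, size=n)                    # unmeasured third variable
formalization = 0.8 * org_size + rng.normal(0, 1, n)   # "IV", driven by org size
turnover = 0.8 * org_size + rng.normal(0, 1, n)        # "DV", also driven by org size

r = np.corrcoef(formalization, turnover)[0, 1]
print(f"Observed correlation: r = {r:.2f}")  # sizeable, despite no causal link
```

Without manipulation or random assignment, a researcher who measures only the two focal variables cannot tell this pattern apart from a genuine causal effect, which is why correlational field studies trade rigor for realism and breadth.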
Single Subject / Case Study

Advantages
– Good for low base rate occurrences
– Provides a source of rich, descriptive data
– Good for generating new ideas

Disadvantages
– Low internal and external validity
Threats to Validity

– Validity = the confidence we can have that our findings from any study are "true".
– All research has threats to validity – that is, things that reduce the degree to which we can accept a particular finding as "true".
– There are many sources of validity threats, but two common ones are:
  – The research participants
  – The researchers themselves
Threats Due to Participants

Roles participants adopt
– The good subject
– The faithful subject
– The negativistic subject
– The apprehensive subject
– Role multiplicity and conflict

Attributes of participants

Comprehension artifacts – participants misunderstand instructions or measures
Threats Due to Researchers

– Attributes of researchers
– Expectancies
– Designer, observer, and interpreter effects
– Data analyst effects
– Tester effects
– Poor measurement decisions
Week 2 Assignment

– Describe a research topic that you are interested in.
– Write three hypothesis statements about relationships you might expect.
– Identify the IVs and DVs in these relationships.
– If there are mediators or moderators in your hypothesis statements, identify them.
– Identify the type of research strategy you would use to study this research question.
– Use the article you identified last week as your reference.