Introduction to Experimental Design




1 Introduction to Experimental Design
Validity vs. Reliability

2 Why Use an Experiment?
Quantitative questions: not just descriptive
Causal questions: Does the IV cause differences in the DV?
Tighter control over the situation and relevant variables
Rule out alternative explanations for the relationship between variables

3 Variables
A variable is a characteristic that varies! Types:
Independent variable (IV): manipulated by the experimenter; has two or more levels
Dependent variable (DV): measured by the experimenter; affected by the IV
Extraneous variable: related to the variables under study but not of interest to the experiment

4 Definition of Concepts
Hypothesis: A tentative statement, subject to empirical test, about the expected relationship between variables.
Independent variable: The variable that is manipulated in an experiment (at multiple levels). The independent variable is believed to have an impact on the dependent variable.
Dependent variable: The variable measured in a study.

5 Experimental Research Design
Experimental design: Research in which independent variables are manipulated and behavior is measured while all other (extraneous) variables are controlled.
Random sampling: Drawing from the population in a way that gives every member an equal opportunity to be included in one or more conditions of the experiment.
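Random sampling as defined above can be sketched in a few lines of Python; the population of participant IDs and the sample size are hypothetical, chosen only for illustration:

```python
import random

# Hypothetical population of 100 participant IDs (assumption for illustration)
population = list(range(1, 101))

random.seed(42)  # fixed seed so the draw is reproducible
# random.sample draws without replacement: every member of the
# population has an equal chance of being included in the sample
sample = random.sample(population, k=20)
```

Drawing without replacement is what gives each member the "equal opportunity" the definition requires; sampling with replacement (`random.choices`) would allow duplicates.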

6 Experimental Research Design (cont.)
Control Group: A group of subjects in an experiment that does not receive the experimental treatment. The data from the control group are used as a baseline against which data from the experimental group are compared.

7

8 Validity
Validity: the degree to which research conclusions correspond with reality. Types:
Internal: Is the explanatory relationship between IVs and DVs a causal one?
Construct: Do the results support the theory behind the research?
External: Can the findings be generalized to other settings and populations?
Statistical: Are the findings the result of chance processes?

9 Internal Validity
The fundamental "logic" of the experiment: rule out alternative variables as explanations.
Confounding: co-varying variables; can never be completely eliminated.
Internal validity tends to be weak in quasi-experimental designs.

10 Construct Validity
The degree to which the research measures the underlying theory; rule out alternative theories.
Auxiliary hypotheses needed to obtain inferences can often be used to save a theory, creating a need for additional research.

11 External Validity
The degree to which findings generalize.
External validity can be the most important form of validity in organizational and evaluation research.

12 Statistical Validity
Rule out chance and sampling variation:
Use statistical tests
Sample correctly
Meet test assumptions (e.g., independence)
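As a minimal sketch of "use statistical tests," here is Welch's t statistic computed from scratch on hypothetical group scores (the data, and the choice of Welch's test, are illustrative assumptions, not from the slides):

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic: the difference in group means divided by
    the standard error of that difference (unequal variances allowed)."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical DV scores for two experimental conditions
treatment = [12, 14, 11, 15, 13, 16]
control = [10, 9, 11, 10, 8, 12]

t = welch_t(treatment, control)  # large |t| -> difference unlikely to be chance alone
```

In practice the t statistic would be compared against a t distribution (e.g., via `scipy.stats.ttest_ind`) to obtain a p-value; the sketch above only shows where the number comes from.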

13 Threats to Internal Validity
History: Events outside of the study/laboratory
Maturation: Participants grow older, wiser, or more experienced between pre- and post-measurement
Instrumentation: The observed effect is due to changes in the measuring instrument or procedures
Mortality: Dropout of participants from the study
Selection: The nature of the participants in the groups being compared (randomization equalizes groups within "normal" limits)

14 Maximizing Internal Validity
Random assignment
Control of extraneous variables
Elimination of confounds (extraneous variables that covary with the IV)
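Random assignment (as distinct from the random sampling on slide 5) can be sketched by shuffling the recruited participants and splitting the list; the participant labels and group sizes here are hypothetical:

```python
import random

# Hypothetical pool of 20 already-recruited participants
participants = [f"P{i:02d}" for i in range(1, 21)]

random.seed(7)  # fixed seed so the assignment is reproducible
random.shuffle(participants)  # random order breaks any link between
                              # participant characteristics and condition

half = len(participants) // 2
treatment_group = participants[:half]  # receives the experimental treatment
control_group = participants[half:]    # baseline against which data are compared
```

Because assignment is random, extraneous participant variables (age, ability, motivation) are expected to be distributed evenly across groups, which is what rules them out as confounds.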

15 Threats to External Validity
Would the findings hold for:
Other subjects (Ss)?
Other times?
Other settings?

16 Internal vs. External Validity
Generally these are opposing goals.
Internal validity: Does the design lend itself to testing the hypotheses?
External validity: Are the results applicable only in the controlled setting, or can they be generalized to the real world?
More realistic situations tend to have less control: a more variable sample and more variable situations. One compromises the other.

17 Types of Experiments
Experiment: Scientific investigation in which an investigator manipulates and controls one or more independent variables and observes the dependent variable for variation concomitant with the manipulation of the independent variables.
Laboratory experiment: Research investigation in which the investigator creates a situation with exact conditions so as to control some, and manipulate other, variables.
Field experiment: Research study in a realistic situation in which one or more independent variables are manipulated by the experimenter under conditions as carefully controlled as the situation will permit.

18 Lab Experiment Question: Will the response be the same outside the laboratory?

19 Experimental Simulation
retain some realism of content though context is not real Examples: simulated grocery aisles; ad testing facilities

20 Experimental Simulation
Example: How does size of bottle and amount left affect amount used? Treatment: Size and fullness of bottles of blue water changed. Women were given a bowl and asked how to spray in the amount (the dependent variable) they do when cleaning their toilet

21 Summary of Laboratory Experiment
Experiments in which the experimental treatment is introduced in an artificial or laboratory setting.
Laboratory experiments tend to be artificial: a testing effect exists because respondents are aware of being in a test and may not respond naturally, so results may lack external validity.
They are the least costly and allow the experimenter greater control; alternative explanations of results are reduced, increasing internal validity.

22 Field Experiment
Takes place in real settings; control is traded off for realism.
Example: test marketing

23 Field Experiment (Cont.)
Example: PSAs about colon cancer are run in four cities; phone calls are made before and after to see if awareness and doctor visits have increased What could be the problem here?

24 Field Experiment (Cont.)
Often used to fine-tune marketing strategies and to determine sales volume.

25 Summary of Field Experiment
Research study in which one or more independent variables are manipulated by the experimenter under conditions as carefully controlled as the situation will permit.
The experimental treatment or intervention is introduced in a completely natural setting, so responses tend to be natural and external validity tends to be much greater.
Field experiments are difficult to control, so competing explanations for the results exist.

26

27 Threats to Construct Validity
An indefinite number of theories may account for the results.
Loose connection between theory and the experimental tasks or measurements: use good measurements and choose tasks carefully.
Ss' understanding of the tasks and the experiment: instructions; the Hawthorne "good-subject" effect; evaluation apprehension; social desirability.

28 Maximizing Construct Validity
A measurement is the product of: the construct of interest + other constructs + error.
Use well-defined operationalizations, multiple measures, and control of extraneous variables.

29 Experimenter Bias
Conscious fudging of data; unconscious bias.
Remedies: blind and double-blind procedures; standardize the procedure of the experiment.

30 Reliability
(Target-shooting analogy: old rifle = low reliability; new rifle = high reliability; new rifle with sun glare = reliable but not valid.)

31 Reliability (cont.)
The extent to which a measure consistently discriminates among individuals at one time or over the course of time.
Test-retest reliability (stability).
Internal consistency (reliability of components): Cronbach's alpha coefficient (Cronbach, 1951); the higher, the better.
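Cronbach's alpha can be computed directly from its standard definition, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The 3-item, 5-respondent data below are hypothetical, chosen only to make the sketch runnable:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one inner list of scores per scale item,
    all answered by the same respondents in the same order."""
    k = len(items)
    item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical 3-item scale answered by 5 respondents
items = [
    [4, 3, 5, 2, 4],
    [4, 4, 5, 2, 3],
    [3, 4, 4, 1, 4],
]
alpha = cronbach_alpha(items)  # closer to 1 = more internally consistent
```

When items rise and fall together across respondents, the variance of the totals is large relative to the summed item variances, which is what pushes alpha toward 1.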

