EVALUATING YOUR RESEARCH DESIGN
EDRS 5305 EDUCATIONAL RESEARCH & STATISTICS
Campbell and Stanley (1966)
Two general criteria of research designs: internal validity and external validity.
INTERNAL VALIDITY
Definition: the extent to which the changes observed in the DV (dependent variable) are caused by the IV (independent variable).
Internal Validity
Questions of internal validity cannot be answered positively unless the design provides adequate control of extraneous variables. Internal validity is essentially a problem of control: anything that contributes to the control of a research design contributes to its internal validity.
Internal Validity
1. History: specific events or conditions, other than the treatment, may occur between the 1st and 2nd measurements of the participants and produce changes in the DV.
2. Maturation: processes that operate within the participants simply as a function of the passage of time.
Internal Validity
3. Pretesting: exposure to a pretest may affect participants' performance on a 2nd test, regardless of the IV.
4. Measuring instruments: changes in the measuring instruments, in the scorers, or in the observers used may produce changes in the obtained measures.
Internal Validity
5. Statistical regression: if groups are selected on the basis of extreme scores, statistical regression may operate to produce an effect that could be mistakenly interpreted as an experimental effect.
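A minimal simulation makes regression toward the mean concrete. The sketch below (Python with NumPy; the score distributions, sample size, and 10% cutoff are illustrative assumptions, not course material) selects a group with extreme pretest scores and shows their posttest mean drifting back toward the population mean even though no treatment is ever applied.

```python
import numpy as np

rng = np.random.default_rng(42)

# True ability plus independent measurement noise on each testing occasion;
# no treatment effect is applied anywhere in this simulation.
n = 10_000
true_score = rng.normal(100, 10, n)
pretest = true_score + rng.normal(0, 5, n)
posttest = true_score + rng.normal(0, 5, n)

# Select an "extreme" group: the bottom 10% of pretest scores.
cutoff = np.quantile(pretest, 0.10)
extreme = pretest <= cutoff

print(f"Extreme group pretest mean:  {pretest[extreme].mean():.1f}")
print(f"Extreme group posttest mean: {posttest[extreme].mean():.1f}")
# The posttest mean sits closer to 100 than the pretest mean did,
# a gain that could be mistaken for an experimental effect.
```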
Internal Validity
6. Differential selection of participants: important differences may exist between the groups before the IV is applied.
7. Experimental mortality: occurs when there is a differential loss of respondents from the comparison groups.
Internal Validity
8. Selection-maturation interaction: some of these internal validity threats may interact; this frequently arises when volunteers are compared with nonvolunteers.
Internal Validity
9. Implementation: sometimes the way the IV is implemented threatens internal validity (e.g., the experimenter bias effect).
10. Participants' attitudes: the Hawthorne effect (participants improve simply because they receive attention) and the John Henry effect (control participants exert extra effort).
Controlling for Threats to Internal Validity
1. Random assignment.
2. Randomized matching: match participants on as many variables as possible, then randomly assign one member of each pair to the treatment group and the other to the control group. (Both techniques are sketched in code after this list.)
3. Homogeneous selection: select samples that are as similar as possible on some extraneous variable (e.g., IQ, age).
4. Building variables into the design: include the extraneous variable as one of the IVs examined (e.g., gender).
5. Analysis of covariance: statistically remove the portion of performance that is systematically related to an extraneous variable (see the ANCOVA sketch below).
6. Using participants as their own controls: each participant experiences every experimental condition, one at a time.
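Items 1 and 2 are straightforward to sketch in code. The example below (Python standard library only; the participant list and the IQ matching variable are hypothetical) performs simple random assignment and then matched-pair randomization on a single extraneous variable.

```python
import random

random.seed(1)

# Hypothetical participants with one matching variable (e.g., IQ).
participants = [{"id": i, "iq": random.randint(85, 130)} for i in range(20)]

# 1. Simple random assignment: shuffle, then split in half.
shuffled = random.sample(participants, len(participants))
treatment, control = shuffled[:10], shuffled[10:]

# 2. Randomized matching: sort on the extraneous variable, pair adjacent
#    participants, and randomly send one member of each pair to treatment.
matched_treatment, matched_control = [], []
ordered = sorted(participants, key=lambda p: p["iq"])
for a, b in zip(ordered[::2], ordered[1::2]):
    t, c = random.sample([a, b], 2)
    matched_treatment.append(t)
    matched_control.append(c)

print([p["iq"] for p in matched_treatment])
print([p["iq"] for p in matched_control])
```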
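Item 5 can be illustrated the same way. A minimal ANCOVA sketch, assuming simulated data and the statsmodels formula API; the variable names, group effect, and noise levels are made up for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated data: posttest depends on a pretest covariate and on group membership.
n = 100
group = np.repeat(["control", "treatment"], n // 2)
pretest = rng.normal(50, 10, n)
posttest = 0.8 * pretest + np.where(group == "treatment", 5, 0) + rng.normal(0, 5, n)

df = pd.DataFrame({"group": group, "pretest": pretest, "posttest": posttest})

# ANCOVA: the pretest covariate removes variance in the posttest that is
# systematically related to the pretest before the group effect is tested.
model = smf.ols("posttest ~ C(group) + pretest", data=df).fit()
print(model.summary())
```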
External Validity of Research Designs
Refers to the generalizability or representativeness of the findings. The question addressed here is: to what groups, settings, experimental variables, and measurement variables can these findings be generalized?
Types of External Validity
1. Population external validity: identifying the population to which results may be generalized.
2. Ecological external validity: concerned with generalizing experimental effects to other environmental conditions (i.e., settings).
Types of External Validity
3. External validity of operations: concerned with how well the operational definitions and the experimental procedures represent the constructs of interest. Would the same relationships be found if a different researcher used different operations (i.e., measures) to investigate the same question?