Writing A Review
Sources
Preliminary
Primary
Secondary
Steps in Reviewing Educational Literature
Define the problem
Review secondary sources
Select the most appropriate preliminary sources
Translate the problem statement into key words
Search the preliminary sources
Read primary sources
Organize notes
Write the report
Problem Formulation
Research question asked: What evidence should be included in the review?
Primary function in review: Constructing definitions that distinguish relevant from irrelevant studies
Sources of potential invalidity in review conclusions:
–Narrow concepts might make review conclusions less definitive and robust
–Superficial operational detail might obscure interacting variables
Parts of a Review Paper
Abstract
Introduction
Procedure
Review of the Literature (Findings and Discussion)
Conclusion
References
Have I written a good review?
Evaluating the Literature Review
Selection of the Literature
Criticism of the Literature
Summary and Interpretation
Selection of the Literature (McMillan & Schumacher, 2001)
Is the purpose of the review indicated?
Are the parameters of the review reasonable?
Is primary literature emphasized?
Is the literature selected relevant to the problem?
Criticism of the Literature
Is the review organized by topics or ideas?
Is the review organized logically?
Are major studies or theories discussed in detail and minor studies with similar limitations or results discussed as a group?
Is there adequate criticism of the design and methodology of important studies?
Are studies compared and contrasted and conflicting or inconclusive results noted?
Is the relevance of each reference to the problem explicit?
Summary and Interpretation
Does the summary provide an overall interpretation and understanding of our knowledge of the problem?
Do the implications provide theoretical or empirical justification for the specific research questions or hypotheses to follow?
Do the methodological implications provide a rationale for the design to follow?
Now let’s talk design
Sampling Considerations
Inductive method
Sample
–Target Population
–Accessible Population
Sampling
–Probability
–Non-probability
Types of Probability Sampling
Simple random sampling
Stratified sampling
Cluster sampling
Systematic sampling
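To make the distinctions concrete, here is a minimal Python sketch of all four procedures against a hypothetical numbered sampling frame; the frame, strata, and cluster sizes are invented for illustration and are not from the slides.

```python
import random

population = list(range(1000))  # hypothetical sampling frame of 1,000 units
n = 100                         # desired sample size

# Simple random sampling: every unit has an equal chance of selection.
simple = random.sample(population, n)

# Systematic sampling: take every k-th unit after a random start.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified sampling: sample proportionally within predefined strata
# (here two invented strata of 600 and 400 units).
strata = {"A": population[:600], "B": population[600:]}
stratified = []
for units in strata.values():
    share = round(n * len(units) / len(population))
    stratified.extend(random.sample(units, share))

# Cluster sampling: randomly select whole clusters, then keep every
# unit inside the chosen clusters.
clusters = [population[i:i + 50] for i in range(0, len(population), 50)]
cluster_sample = [u for c in random.sample(clusters, 2) for u in c]
```

What all four share, and what the non-probability methods on the next slide lack, is that each unit's chance of selection is known in advance, which is what licenses inference from sample to population.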
Non-probability Sampling
Convenience
Judgmental
Quota
Application Activity
Which sampling procedure was used?
Moving on...
Still moving...
Sample Size
Factors to consider when deciding how large a sample is needed:
Type of research
Research hypotheses
Financial constraints
Importance of results
Number of variables studied
Methods of data collection
Degree of accuracy needed
Size of population
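One common way to put a number on "degree of accuracy needed" against "size of population" is Cochran's sample-size formula for a proportion with a finite population correction. This is a standard textbook formula rather than anything given on the slide, and the figures in the example are hypothetical.

```python
import math

def sample_size_for_proportion(margin_of_error, population_size=None,
                               z=1.96, p=0.5):
    """Cochran's formula: n0 = z^2 * p * (1 - p) / e^2, with an optional
    finite population correction. p = 0.5 is the most conservative guess."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population_size is not None:
        # Finite population correction: smaller populations need fewer cases.
        n0 = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n0)

# A ±5% margin of error at 95% confidence from a population of 2,000:
print(sample_size_for_proportion(0.05, population_size=2000))  # 323
```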
Validity
Definition
Relationship to Reliability
Methods of Estimating Validity
Evaluating the Instruments
Are the instruments valid and reliable for the participants used in the study?
Are the instruments the best ones?
Why did the researcher choose the instruments?
Are the instruments completely described?
Reliability
Definition
Sources of error
–Non-standardization
–Observer
–Temporary characteristics of participants
–Idiosyncrasies of items or components
Methods of Establishing Reliability
Test-Retest (Stability)
Alternate Form (Equivalence)
Internal Consistency
–Split Half
–Kuder-Richardson
–Cronbach’s Alpha
Scorer Reliability
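As one worked example from the internal-consistency family, here is a minimal sketch of Cronbach's alpha, α = k/(k−1) · (1 − Σσ²_item / σ²_total), where k is the number of items. The four-item, five-respondent data set is invented for illustration.

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale; item_scores holds one list of scores
    per item, with respondents in the same order across items."""
    k = len(item_scores)
    sum_item_var = sum(statistics.pvariance(item) for item in item_scores)
    totals = [sum(person) for person in zip(*item_scores)]  # per-respondent totals
    return k / (k - 1) * (1 - sum_item_var / statistics.pvariance(totals))

# Hypothetical 4-item scale answered by 5 respondents (rows = items)
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 1, 4, 4],
    [2, 4, 2, 5, 3],
    [3, 4, 1, 5, 4],
]
print(round(cronbach_alpha(items), 2))  # 0.96 for this invented data
```

The choice of population versus sample variance cancels out of the ratio, so either works as long as it is used consistently in numerator and denominator.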
Factors Influencing Reliability
Group Heterogeneity
Number of Items
Range of Scores
Difficulty of Items
Discrimination of Items
Norm Group Similarity
Enhancing Reliability
Establish standard methods of data collection
–Uniform directions
–Uniform time frame
–Uniform setting & conditions
–Same people administering test/procedure
–Trained study personnel
Reasonable time demands to complete instrument/task
Counterbalancing of instrument/task
Variables
Dependent variable
Independent variable
Confounding variable
Whoops!
Sources of Variability
Systematic
Error
Extraneous
Sampling Error
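The slide title stands alone, so as a concrete anchor: the quantity usually used to size sampling error is the standard error of the mean, SE = s/√n, which shrinks as the sample grows. A minimal sketch with invented test scores:

```python
import math
import statistics

def standard_error_of_mean(sample):
    """SE = s / sqrt(n): the expected spread of sample means around
    the population mean under repeated random sampling."""
    return statistics.stdev(sample) / math.sqrt(len(sample))

# Hypothetical test scores from a random sample of 10 students
scores = [72, 85, 90, 66, 78, 88, 74, 81, 95, 70]
print(f"mean = {statistics.mean(scores):.1f}, "
      f"SE = {standard_error_of_mean(scores):.2f}")
```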
HAD ENOUGH?