Conducting Meta-Analyses Marsha Sargeant, M.S. Design and Statistical Analysis Laboratory, University of Maryland, College Park, Department of Psychology

2 Overview of Presentation 1. What is a meta-analysis and why is it important? 2. Overview of procedures involved in conducting a quantitative meta-analysis 3. Database structure 4. Interpretation of effect sizes

3 Meta-analysis Definition A statistical analysis of the summary findings of many empirical studies It’s quantitative! – Distinct from a meta-review

4 Background Empirical findings grew exponentially in the middle 50 years of the 20th century – They multiplied beyond our ability to comprehend and integrate them – Hence a growing need to review statistically and technically, rather than through narrative

5 Background A review of the practices and methods of research reviewers and synthesizers in the social sciences (Jackson, 1978) found a failure to report methods of reviewing

6 Benefits of Meta-analyses Increased statistical power Identification of sources of variability across studies (e.g., inclusion of moderators) Detection of biases (e.g., Tower of Babel bias) Detection of deficiencies in design, analysis, or interpretation Ioannidis & Lau, 1999

7 Limitations of Meta-analyses Cannot improve the original studies Method is frequently misapplied Can never follow the rules of science – Sources of bias are not controlled Ioannidis & Lau, 1999

8 Rules of the Game It is quantitative There is no arbitrary exclusion of data File drawer effect – Dissertation research is research too! – Unpublished studies Meta-analysis seeks general conclusions – It is contradictory to think that we can only compare studies that are the same (if they were the same you wouldn’t need to compare them!) Glass, 2000

9 Methodological Adequacy of Research Base Findings must be interpreted within the bounds of the methodological quality of the research base synthesized. Studies often cannot simply be grouped into “good” and “bad” studies. Some methodological weaknesses may bias the overall findings, others may merely add “noise” to the distribution. From “Practical Meta-analysis” by D.B. Wilson

10 Confounding of Study Features Important study features are often confounded, obscuring the interpretive meaning of observed differences If the confounding is not severe and you have a sufficient number of studies, you can model “out” the influence of method features to clarify substantive differences From “Practical Meta-analysis” by D.B. Wilson

11 Meta-analysis Overview Descriptives – Effect sizes (e.g., correlation coefficients) – Distribution and central tendency summarized Method section – Databases searched – Journals – What attempts were made to avoid a biased search? – Criteria for inclusion – No-effect studies Rosenthal, 2005

12 Meta-analysis Overview Study quality – Use a weighting system – Use raters and non-dichotomous ratings to reduce bias in the quality weighting – Optimally, raters should be blind to the results of the study – Ratings can be used as an adjustment to the effect size or as a moderator to determine whether quality is related to the obtained effect size Rosenthal, 2005
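To make the two uses of quality ratings concrete, here is a minimal Python sketch (not from the original slides; the effect sizes, sample sizes, and 1-5 quality ratings are invented for illustration). It treats quality once as a moderator in a study-size-weighted regression and once as part of the weights themselves.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative data: one effect size, sample size, and a 1-5 quality rating per study
es      = np.array([0.35, 0.10, 0.52, -0.05, 0.41, 0.28])
n       = np.array([  40,  120,   35,    60,   80,  150])
quality = np.array([   4,    5,    2,     3,    4,    3])

# Quality as a moderator: regress effect size on quality, weighting by study size
X = sm.add_constant(quality)
fit = sm.WLS(es, X, weights=n).fit()
print(fit.params)    # slope shows whether quality is related to the obtained effect size
print(fit.pvalues)

# Quality as an adjustment: fold the rating into the weights instead
quality_weighted_mean = np.average(es, weights=n * quality)
print(round(quality_weighted_mean, 3))
```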

13 Meta-analysis Overview Consider independence of studies – Treat non-independent studies as a single study with different dependent variables Recorded variables – Number, Age, Sex, Education, etc. – Volunteer status – Laboratory or field study? – Randomized? – Method of data collection (e.g., interview vs. questionnaire) – How constructs are operationalized – etc. Rosenthal, 2005

14 Meta-analysis Overview Summarize recorded variables Study characteristics could all be potential moderators of outcome, beyond those with particular meaning for the specific area of research Effect sizes (there are others) – r – Zr (Fisher’s r-to-Z transformation) – d family: Cohen’s d, Hedges’ g, Glass’s delta Rosenthal, 2005
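As a concrete illustration of the d-family and r-based indices listed above, a minimal Python sketch follows (the formulas are the standard textbook definitions; the group means, SDs, and sample sizes are invented).

```python
import numpy as np

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def hedges_g(d, n1, n2):
    """Small-sample bias correction applied to Cohen's d."""
    return d * (1 - 3 / (4 * (n1 + n2) - 9))

def glass_delta(m1, m2, sd_control):
    """Mean difference standardized by the control group's SD only."""
    return (m1 - m2) / sd_control

def fisher_z(r):
    """Fisher's r-to-Z transformation for correlation effect sizes."""
    return 0.5 * np.log((1 + r) / (1 - r))

# Illustrative values: treatment vs. control group from one study
d = cohens_d(m1=105.0, m2=100.0, sd1=15.0, sd2=14.0, n1=40, n2=38)
print(round(d, 3), round(hedges_g(d, 40, 38), 3),
      round(glass_delta(105.0, 100.0, 14.0), 3), round(fisher_z(0.30), 3))
```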

15 Examples of Different Types of Effect Sizes Standardized mean difference – Group contrast research Treatment groups Naturally occurring groups – Inherently continuous construct Odds-ratio – Group contrast research Treatment groups Naturally occurring groups – Inherently dichotomous construct Correlation coefficient – Association between variables research From “Practical Meta-analysis - The Effect Size” by D.B. Wilson
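For the odds-ratio case (a group contrast on an inherently dichotomous construct), a small sketch follows; the 2x2 counts are invented, and the log odds ratio with its approximate variance is what would typically enter the pooling.

```python
import numpy as np

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = treatment events, b = treatment non-events,
    c = control events,   d = control non-events."""
    return (a * d) / (b * c)

def log_or_variance(a, b, c, d):
    """Approximate variance of the log odds ratio (used for inverse-variance weighting)."""
    return 1/a + 1/b + 1/c + 1/d

# Illustrative 2x2 table
a, b, c, d = 30, 70, 15, 85
or_ = odds_ratio(a, b, c, d)
print(round(or_, 2), round(np.log(or_), 2), round(log_or_variance(a, b, c, d), 3))
```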

16 Interpreting Effect Size Results Cohen’s “Rules-of-Thumb” – standardized mean difference effect size small = 0.20 medium = 0.50 large = 0.80 – correlation coefficient small = 0.10 medium = 0.25 large = 0.40 – odds-ratio small = 1.50 medium = 2.50 large = 4.30 From “Practical Meta-analysis” by D.B. Wilson
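A tiny helper that encodes these rules-of-thumb can make quick labeling reproducible; the thresholds are taken directly from the slide, while the function name and boundary handling are my own.

```python
def cohen_label(es, kind="d"):
    """Label an effect size using Cohen's rules-of-thumb
    for d, r, or an odds ratio ("or")."""
    cuts = {"d": (0.20, 0.50, 0.80),
            "r": (0.10, 0.25, 0.40),
            "or": (1.50, 2.50, 4.30)}
    small, medium, large = cuts[kind]
    if es >= large:
        return "large"
    if es >= medium:
        return "medium"
    if es >= small:
        return "small"
    return "below small"

print(cohen_label(0.45, "d"), cohen_label(0.28, "r"), cohen_label(2.6, "or"))
```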

17 Interpreting Effect Size Results Rules-of-Thumb do not take into account the context of the intervention – a “small” effect may be highly meaningful for an intervention that requires few resources and imposes little on the participants – a small effect may be meaningful if the intervention is delivered to an entire population (prevention programs for school children) – small effects may be more meaningful for serious and fairly intractable problems From “Practical Meta-analysis” by D.B. Wilson

18 Meta-analysis Overview Significance levels recorded – Recorded as the one-tailed standard normal deviates associated with the p’s – E.g., p’s of .10, .01, and .001 would be recorded as Z’s of 1.28, 2.33, and 3.09
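A minimal scipy sketch of this conversion (the helper name is mine); it reproduces the Z values quoted above.

```python
from scipy.stats import norm

def p_to_z(p_one_tailed):
    """One-tailed p-value to the corresponding standard normal deviate."""
    return norm.isf(p_one_tailed)

for p in (0.10, 0.01, 0.001):
    print(p, round(p_to_z(p), 2))   # 1.28, 2.33, 3.09
```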

19 Meta-analysis Overview Report central tendency – Unweighted mean effect size – Weighted mean effect size (weighting by size of study; quality or another characteristic of interest can also be used) – Median – Proportion of studies showing effect sizes in the expected direction – Report the number of studies reported on – Optional: total number of participants on which the weighted mean is based – Optional: median number of participants per obtained effect size
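A short Python sketch of these central-tendency summaries (the effect sizes and sample sizes are invented; study size is used as the weight here, but quality or another characteristic could be substituted):

```python
import numpy as np

# Illustrative per-study effect sizes (d) and sample sizes
es = np.array([0.35, 0.10, 0.52, -0.05, 0.41, 0.28])
n  = np.array([  40,  120,   35,    60,   80,  150])

unweighted_mean = es.mean()
weighted_mean   = np.average(es, weights=n)   # weighting by size of study
median_es       = np.median(es)
prop_expected   = np.mean(es > 0)             # proportion in the expected direction

print(round(unweighted_mean, 3), round(weighted_mean, 3),
      round(median_es, 3), round(prop_expected, 2),
      len(es), int(n.sum()), np.median(n))    # k, total N, median n per ES
```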

20 Meta-analysis Overview Report variability – Standard deviation – Maximum and minimum effect size – Effect sizes at the 75th and 25th percentiles – If normally distributed, the standard deviation can be estimated as 0.75(Q3 - Q1)
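And the corresponding variability summaries, reusing the same invented effect sizes; the last line is the normal-distribution approximation mentioned above.

```python
import numpy as np

es = np.array([0.35, 0.10, 0.52, -0.05, 0.41, 0.28])

sd = es.std(ddof=1)
q1, q3 = np.percentile(es, [25, 75])
iqr_based_sd = 0.75 * (q3 - q1)   # rough SD estimate if the ESs are normally distributed

print(round(sd, 3), round(es.min(), 2), round(es.max(), 2),
      round(q1, 3), round(q3, 3), round(iqr_based_sd, 3))
```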

21 Database Structure Database structures – The hierarchical nature of meta-analytic data – The familiar flat data file – The relational data file – Advantages and disadvantages of each – What about the meta-analysis bibliography? From “Practical Meta-analysis – Database Structure” by D.B. Wilson

22 Database Structure Meta-analytic data is inherently hierarchical Any specific analysis can only include one effect size per study (or one effect size per sub-sample within a study) Analyses are almost always of a subset of coded effect sizes. The data structure needs to allow for the selection and creation of those subsets From “Practical Meta-analysis – Database Structure” by D.B. Wilson

23 Example of a Flat Data File Note that there is only one record (row) per study. Multiple ESs are handled by having multiple variables, one for each potential ES. From “Practical Meta-analysis – Database Structure” by D.B. Wilson
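Since the original figure is not reproduced in this transcript, a minimal pandas sketch of such a flat file (column names and values invented) may help:

```python
import pandas as pd

# One row per study; multiple effect sizes handled as separate columns
flat = pd.DataFrame({
    "study_id":    [100, 101, 102],
    "pub_year":    [1998, 2003, 2007],
    "n_total":     [80, 120, 60],
    "es_outcome1": [0.35, 0.10, 0.52],
    "es_outcome2": [0.20, None, 0.44],   # missing when a study did not report it
})
print(flat)
```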

24 Database Structure Advantages and Disadvantages of a Single Flat File Structure Advantages – All data is stored in a single location – Familiar and easy to work with – No manipulation of data files prior to analysis Disadvantages – Only a limited number of ESs can be calculated per study – Any adjustments applied to ESs must be done repeatedly When to use – Interested in a small predetermined set of ESs – Number of coded variables is modest – Comfort level with a multiple data file structure is low From “Practical Meta-analysis – Database Structure” by D.B. Wilson

25 Database Structure Example of a Relational Data Structure (Multiple Related Flat Files) Note that a single record in the study-level data file is “related” to five records in the effect-size-level data file From “Practical Meta-analysis – Database Structure” by D.B. Wilson
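Again in place of the original figure, a small pandas sketch of a relational structure (values invented): one record in a study-level file relates to five records in an effect-size-level file, joined on study_id.

```python
import pandas as pd

# Study-level file: one row per study
study = pd.DataFrame({
    "study_id": [100, 101],
    "pub_year": [1998, 2003],
    "design":   ["randomized", "quasi-experimental"],
})

# Effect-size-level file: many rows per study, linked by study_id
es = pd.DataFrame({
    "study_id": [100, 100, 100, 100, 100, 101],
    "es_id":    [1, 2, 3, 4, 5, 1],
    "d":        [0.35, 0.20, 0.41, 0.28, 0.15, 0.10],
})

# "Relating" the files: merge study descriptors onto each effect size
working = es.merge(study, on="study_id", how="left")
print(working)
```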

26 Database Structure Example of a More Complex Multiple File Data Structure (Study Level Data File, Outcome Level Data File, Effect Size Level Data File) Note that study 100 has 3 records in the outcomes data file and 6 records in the effect size data file, 2 for each outcome measured at different points in time (months) From “Practical Meta-analysis – Database Structure” by D.B. Wilson

27 Database Structure Advantages & Disadvantages of Multiple Flat Files Data Structure Advantages – Can “grow” to any number of ESs – Reduces coding task (faster coding) – Simplifies data cleanup – Smaller data files to manipulate Disadvantages – Complex to implement – Data must be manipulated prior to analysis (creation of “working” analysis files) – Must be able to select a single ES per study for any given analysis When to use – Large number of ESs per study are possible From “Practical Meta-analysis – Database Structure” by D.B. Wilson
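A brief sketch of building a “working” analysis file with a single effect size per study for one analysis (the data and the within-study averaging rule are illustrative; substantive selection rules depend on the research question):

```python
import pandas as pd

# Effect-size-level file (illustrative): many rows per study
es = pd.DataFrame({
    "study_id": [100, 100, 100, 101, 101, 102],
    "outcome":  ["math", "reading", "math", "math", "reading", "math"],
    "d":        [0.35, 0.20, 0.41, 0.10, 0.05, 0.52],
})

# Working file for a math-achievement analysis: subset, then reduce to one ES per study
math_only = es[es["outcome"] == "math"]
one_per_study = math_only.groupby("study_id", as_index=False)["d"].mean()
print(one_per_study)
```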

28 What about Sub-Samples? So far I have assumed that the only ESs that have been coded were based on the full study sample What if you are interested in coding ESs separately for different sub-samples, such as by gender or SES? – Just say “no”! Often there is not enough of such data for meaningful analysis Complicates coding and the data structure – Well, if you must, plan your data structure carefully Include a full sample effect size for each dependent measure of interest Place sub-sample effect sizes in a separate data file From “Practical Meta-analysis – Database Structure” by D.B. Wilson

29 Tips on Coding Paper Coding – include data file variable names on the coding form – placing all data along the left or right margin eases data entry Coding Directly into a Computer Database From “Practical Meta-analysis – Database Structure” by D.B. Wilson

30 Example Screen from a Computerized Database for Direct Coding

31 Coding Directly into a Computer Database Advantages – Avoids the additional step of transferring data from paper to computer – Easy access to data for data cleanup – Database can perform calculations during the coding process (e.g., calculation of effect sizes) – Faster coding Disadvantages – Can be time consuming to set up (the bigger the meta-analysis, the bigger the payoff) – Requires a higher level of computer skill From “Practical Meta-analysis – Database Structure” by D.B. Wilson

32 Final Comments Meta-analysis – is a replicable and defensible method of synthesizing findings across studies – often points out gaps in the research literature, providing a solid foundation for the next generation of research on that topic – illustrates the importance of replication – facilitates generalization of the knowledge gained through individual evaluations From “Practical Meta-analysis” by D.B. Wilson

33 Thank You!