PSY 1950 Meta-analysis December 3, 2008

Definition
“the analysis of analyses... the statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings. It connotes a rigorous alternative to the casual, narrative discussions of research studies which typify our attempts to make sense of the rapidly expanding research literature.” –Glass (1976)
“Mega-silliness” –Eysenck (1977)

History
Pre-history
–Pearson (1904), Fisher (1948), Cochran (1955)
The Great Debate
–1952: Eysenck concluded that psychotherapy was bunk
–20 years of research did not settle the debate
–1978: Glass & Smith statistically aggregated findings from 375 studies, concluding that psychotherapy works
Necessity is the mother of invention
–Psychology abounds!

Rationale
Meta-analyses avoid the limitations of qualitative/narrative/traditional reviews:
–Weak effects overlooked: meta-analyses are more powerful
–Differences between studies over-interpreted: in meta-analysis, heterogeneity is assessed statistically
–Moderating variables overestimated or overlooked: in meta-analysis, moderators are assessed statistically
–Limited, subjective sampling of studies: in meta-analysis, the search is exhaustive and inclusion/exclusion criteria are defined
–Overwhelmed by a large database: meta-analyses can summarize hundreds of effects
–Subjective assessment: in meta-analysis, any subjectivity is discernible

Example: Finding Weak Effects
Cooper, H. M., & Rosenthal, R. (1980). Statistical versus traditional procedures for summarizing research findings. Psychological Bulletin, 87.
–32 grad students, 9 faculty members
–Randomly assigned to a statistical or a traditional review technique
–Given 7 studies that examined sex differences in persistence
–For 2 studies, females were more persistent than males (ps = .005, .001); for the other 5 studies, no significant difference
–Does the evidence presented support the conclusion that females are more persistent?

Actual p = .016 when the seven results are statistically combined
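A combined p-value of this kind can be obtained by pooling the individual study results. Below is a minimal sketch of Stouffer's Z method for combining one-tailed p-values; the seven p-values are hypothetical placeholders, not the actual values from the Cooper & Rosenthal (1980) materials.

```python
# A minimal sketch of Stouffer's Z method for combining one-tailed p-values.
# These seven p-values are hypothetical placeholders, not the actual values
# from the Cooper & Rosenthal (1980) materials.
import numpy as np
from scipy import stats

p_values = np.array([0.005, 0.001, 0.20, 0.35, 0.40, 0.55, 0.60])  # hypothetical

z_scores = stats.norm.isf(p_values)                  # each p converted to a z-score
z_combined = z_scores.sum() / np.sqrt(len(p_values))
p_combined = stats.norm.sf(z_combined)               # one-tailed combined p

print(f"Combined Z = {z_combined:.2f}, combined p = {p_combined:.3f}")
```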

Example: Assessing Moderators Statistically

Criticisms
Weak
–Apples and oranges
–Flat Earth Society
–Garbage in, garbage out
–File-drawer problem
Strong
–Post-hoc

Apples and Oranges
Critique
–Meta-analyses add together apples and oranges
Response
–Glass: “in the study of fruit, nothing else is sensible”
–Analogy with single experiments, which already aggregate over heterogeneous participants and items
–Empirical question, resolved through examination of moderating variables

Flat Earth Society
Critique
–Cronbach: “...some of our colleagues are beginning to sound like a kind of Flat Earth Society. They tell us that the world is essentially simple: most social phenomena are adequately described by linear relations; one-parameter scaling can discover coherent variables independent of culture and population; and inconsistencies among studies of the same kind will vanish if we but amalgamate a sufficient number of studies.... The Flat Earth folk seek to bury any complex hypothesis with an empirical bulldozer.”
Response
–Code and analyze moderating variables

Garbage In, Garbage Out
Critique
–The inclusion of flawed studies “dirties” the database, obscures the truth, and invalidates meta-analytic conclusions
Response
–Glass: “I remain staunchly committed to the idea that meta-analyses must deal with all studies, good, bad and indifferent, and that their results are only properly understood in the context of each other, not after having been censored by some a priori set of prejudices.”
–Empirical question: study quality (or, better yet, related variables) can be coded and analyzed as a moderator

File-drawer Problem
Critique
–The meta-analytic database is a biased sample of studies
–Significant findings are more likely to be published than nonsignificant findings
Response
–Less bias than narrative reviews
–File-drawer analyses (e.g., funnel plots, fail-safe N) can empirically address the presence and influence of missing studies
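One common file-drawer analysis is Rosenthal's fail-safe N: the number of unpublished null results that would have to exist to pull a combined result above p = .05. A minimal sketch, assuming hypothetical per-study z-scores:

```python
# A minimal sketch of Rosenthal's fail-safe N: the number of unpublished
# null-result studies needed to drag a combined result above p = .05.
# The per-study z-scores below are hypothetical.
import numpy as np

z = np.array([2.1, 1.8, 2.5, 0.9, 1.4])   # hypothetical study-level z-scores
k = len(z)
z_crit_sq = 1.645 ** 2                    # squared critical z, one-tailed alpha = .05

fail_safe_n = z.sum() ** 2 / z_crit_sq - k
print(f"Fail-safe N = {fail_safe_n:.1f} hidden null studies")
```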

Post-hoc
Criticism
–By definition, meta-analysis is a post-hoc endeavor, i.e., an observational study
–Moderating variables may be confounded, sometimes extremely so
–Effects may be correlational
Response
–Confounding may be interesting in its own right
–Statistical control
–Hypothesis generation versus hypothesis testing

Steps of a Meta-analysis
Define question
Search literature
Determine inclusion/exclusion criteria
Code moderating variables
Analyze data
This is an iterative process!

Defining the Meta-analytic Question
Interestingness
–Establish the presence of an effect
–Determine the magnitude of an effect
–Resolve differences in the literature
–Test competing theories (e.g., psychotherapy, imagery/V1)

Inclusion/Exclusion Criteria
Theoretical considerations
–Scope/generalizability
–Quality
Practical considerations
–Power
–Missing data
–Time

Example: Inclusion/Exclusion Criteria
Studies were included if they
–had written reports (published or unpublished) in English available by March 1, 2008
–presented original data from between-participants, within-participants (i.e., single-group pretest-posttest, or PP), or mixed-design (i.e., independent-groups pretest-posttest, or IG-PP) experiments or quasi-experiments
–objectively and quantitatively evaluated performance on at least one cognitive task as a function of meditative experience or state
Studies were excluded if they
–used a psychopathologically or neurologically disordered population
–confounded meditation with other mental training (e.g., education), maturation, or practice, and used measures susceptible to such confounding (e.g., academic achievement tests)
–did not report, or contain data allowing estimation of, participants’ age or meditative experience
–did not contain basic methodological information (e.g., type of task administered)

Literature Search
Types of searches
–Keyword
–Ancestor (works cited by a known study)
–Descendant (works citing a known study)
Available resources
–Electronic: e.g., PsycINFO, SCI, Google Scholar
–Physical: conference proceedings, bibliographies, key journals
–Mental: experts in the field

Harvard’s Electronic Resources
SSCI/SCI (Social Sciences/Science Citation Index)
PsycINFO
Google Scholar – Harvard
HOLLIS
Interlibrary Loan
Dissertations (ProQuest)

Coding
What to code
–Anything possibly interesting (e.g., control group/condition, participant variables)
–Anything possibly confounding (e.g., publication year, journal impact factor)
–How effect sizes were calculated
How to code
–Use an explicit coding scheme
–Set the measurement scale
–Use multiple coders
–Calculate inter-coder reliability (see the sketch below)
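For the reliability step, one simple option is Cohen's kappa between two coders on a categorical moderator. A minimal sketch with hypothetical codes (study design coded as RCT vs. quasi-experiment):

```python
# A minimal sketch of inter-coder reliability (Cohen's kappa) for one
# categorical moderator coded by two raters; the codes are hypothetical.
from collections import Counter

coder_a = ["RCT", "quasi", "RCT", "RCT", "quasi", "RCT", "quasi", "RCT"]
coder_b = ["RCT", "quasi", "RCT", "quasi", "quasi", "RCT", "quasi", "RCT"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n   # raw agreement

freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)  # chance agreement

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement = {observed:.2f}, kappa = {kappa:.2f}")
```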

Analysis
Calculate effect sizes
Weight effect sizes
Describe the distribution
Infer
–Univariate analyses
–Multivariate analyses

Calculating Effect Sizes
Only one ES per construct per study
–Balance between dependency and thoroughness
Typically d or r
Can be calculated in many ways (from raw data to graphs)
Effect size calculators are available online
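A minimal sketch of one common route: computing Cohen's d from group summary statistics and converting it to r with the standard approximation. All numbers are hypothetical.

```python
# A minimal sketch: Cohen's d from group summary statistics, plus the
# standard approximate conversion from d to r. All numbers are hypothetical.
import math

m1, s1, n1 = 105.0, 15.0, 40   # hypothetical treatment group mean, SD, n
m2, s2, n2 = 100.0, 14.0, 42   # hypothetical control group mean, SD, n

s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (m1 - m2) / s_pooled

r = d / math.sqrt(d**2 + 4)    # approximation assumes roughly equal group sizes

print(f"d = {d:.2f}, r = {r:.2f}")
```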

Weighting Effect Sizes
Why weight?
–Studies vary greatly in size
–Studies with large n yield more reliable effect sizes than studies with small n
How to weight?
–Simple approach: weight by sample size
–Better approach: weight by precision
What is precision weighting?
–Each effect size has an associated SE
–Hedges showed that the best meta-analytic estimate weights each effect size by the inverse of its sampling variance
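A minimal sketch of inverse-variance (precision) weighting under a fixed-effect model, using the usual large-sample approximation for the sampling variance of d; the effect sizes and sample sizes are hypothetical.

```python
# A minimal sketch of inverse-variance (precision) weighting under a
# fixed-effect model. Effect sizes and sample sizes are hypothetical.
import numpy as np

d  = np.array([0.30, 0.55, 0.12, 0.40])   # per-study Cohen's d
n1 = np.array([25, 60, 15, 100])          # group 1 sample sizes
n2 = np.array([25, 58, 18, 95])           # group 2 sample sizes

v = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))   # approx. sampling variance of d
w = 1 / v                                            # precision weights

d_bar = np.sum(w * d) / np.sum(w)   # weighted mean effect size
se = np.sqrt(1 / np.sum(w))         # standard error of the weighted mean

print(f"Weighted mean d = {d_bar:.2f} (SE = {se:.3f})")
```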

Describing the Distribution
Central tendency
Spread
Shape

Inferential Statistics
Select a model
–Fixed effects
–Random effects
Univariate analyses
–Analogous to one-way ANOVA
–Examine how much variation in effect sizes is explained by one (categorical) variable
Multivariate analyses
–Analogous to multiple regression
–Examine how much variation in effect sizes is explained by a set of (categorical or continuous) variables
–Examine how much unique variation in effect sizes is explained by one (categorical or continuous) variable
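A minimal sketch of the model-selection step: Cochran's Q test for heterogeneity and a DerSimonian-Laird random-effects estimate. The effect sizes and sampling variances are hypothetical, and this is a hand-rolled illustration rather than any particular package's API.

```python
# A minimal sketch of Cochran's Q test for heterogeneity and a
# DerSimonian-Laird random-effects estimate. The effect sizes and sampling
# variances are hypothetical; this is not any particular package's API.
import numpy as np
from scipy import stats

d = np.array([0.30, 0.55, 0.12, 0.40])        # hypothetical per-study effect sizes
v = np.array([0.081, 0.035, 0.123, 0.021])    # hypothetical sampling variances
w = 1 / v
k = len(d)

d_fixed = np.sum(w * d) / np.sum(w)           # fixed-effect estimate
Q = np.sum(w * (d - d_fixed) ** 2)            # heterogeneity statistic
p_Q = stats.chi2.sf(Q, df=k - 1)              # test of homogeneity

C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)            # between-study variance estimate

w_star = 1 / (v + tau2)                       # random-effects weights
d_random = np.sum(w_star * d) / np.sum(w_star)

print(f"Q = {Q:.2f} (p = {p_Q:.3f}), tau^2 = {tau2:.3f}, "
      f"random-effects d = {d_random:.2f}")
```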