Practical Meta-Analysis
David B. Wilson
Evaluators' Institute, July 16-17, 2010


Overview of the Workshop

Topics covered will include:
- Review of the basic methods: problem definition, document retrieval, coding, effect sizes and computation, analysis of effect sizes, publication bias
- Cutting-edge issues
- Interpretation of results
- Evaluating the quality of a meta-analysis

Forest Plot from a Meta-Analysis of Correctional Boot-Camps

[Figure: forest plot not reproduced in the transcript]

The Great Debate

- 1952: Hans J. Eysenck concluded that there were no favorable effects of psychotherapy, starting a raging debate
- Twenty years of evaluation research and hundreds of studies failed to resolve the debate
- 1978: To prove Eysenck wrong, Gene V. Glass statistically aggregated the findings of 375 psychotherapy outcome studies
- Glass (and colleague Smith) concluded that psychotherapy did indeed work
- Glass called his method "meta-analysis"

The Emergence of Meta-analysis

Ideas behind meta-analysis predate Glass' work by several decades:
- Karl Pearson (1904) averaged correlations for studies of the effectiveness of inoculation for typhoid fever
- R. A. Fisher (1944): "When a number of quite independent tests of significance have been made, it sometimes happens that although few or none can be claimed individually as significant, yet the aggregate gives an impression that the probabilities are on the whole lower than would often have been obtained by chance" (p. 99). This is the source of the idea of cumulating probability values
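
Fisher's remark is the germ of what is now called Fisher's combined probability test: under the joint null hypothesis, the statistic X = -2 Σ ln(p_i) over k independent p-values follows a chi-square distribution with 2k degrees of freedom. A minimal sketch in Python (the five p-values below are made up for illustration, not taken from the slides):

```python
import math

def fisher_combined(p_values):
    """Fisher's method: combine k independent p-values into one test.

    X = -2 * sum(ln p_i) follows a chi-square distribution with
    2k degrees of freedom under the joint null hypothesis.
    """
    k = len(p_values)
    x = -2.0 * sum(math.log(p) for p in p_values)
    # The chi-square survival function has a closed form for even df = 2k:
    # P(X >= x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    p_combined = math.exp(-half) * sum(half**i / math.factorial(i)
                                       for i in range(k))
    return x, p_combined

# Five studies, none individually significant at .05,
# yet the aggregate clearly is -- exactly Fisher's point:
stat, p = fisher_combined([0.10, 0.08, 0.12, 0.07, 0.09])
```

Here `stat` is about 24.0 on 10 degrees of freedom, giving a combined p of roughly .008 even though every individual p exceeds .05.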

The Emergence of Meta-analysis (continued)

- W. G. Cochran (1953) discussed a method of averaging means across independent studies
- Cochran laid out much of the statistical foundation that modern meta-analysis is built upon (e.g., inverse-variance weighting and homogeneity testing)
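
Inverse-variance weighting and the homogeneity (Q) test mentioned above can be sketched in a few lines of Python; the three effect sizes and sampling variances below are hypothetical numbers chosen only for illustration:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect and Cochran's Q.

    Each study's effect is weighted by 1/variance, so more precise
    studies count for more. Q tests whether the observed effects are
    more variable than sampling error alone would predict; compare it
    to a chi-square distribution with k - 1 degrees of freedom.
    """
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the weighted mean
    q = sum(w * (e - mean) ** 2 for w, e in zip(weights, effects))
    return mean, se, q

# Three hypothetical studies (effect size, sampling variance):
mean, se, q = fixed_effect_meta([0.30, 0.50, 0.10], [0.04, 0.02, 0.05])
```

With these inputs the weighted mean is about 0.36 with a standard error near 0.10, and Q is about 2.4 on 2 degrees of freedom, so the three effects are consistent with a single underlying effect.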

The Logic of Meta-analysis

Traditional methods of review focus on statistical significance testing, but significance testing is not well suited to this task:
- It is highly dependent on sample size
- A null finding does not carry the same "weight" as a significant finding: a significant effect is a strong conclusion, while a nonsignificant effect is a weak one

Meta-analysis instead focuses on the direction and magnitude of the effects across studies, not statistical significance:
- Isn't this what we are interested in anyway?
- Direction and magnitude are represented by the effect size
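
The sample-size dependence is easy to see numerically: the very same effect size can be "nonsignificant" or "highly significant" purely as a function of n. A rough illustration using the large-sample standard error of a standardized mean difference (the effect size and group sizes are hypothetical):

```python
import math

def z_for_smd(d, n_per_group):
    """Approximate z statistic for a standardized mean difference
    in a two-group design with n subjects per group."""
    se = math.sqrt(2.0 / n_per_group)  # large-sample SE of d (ignoring the small d^2 term)
    return d / se

# Identical effect size, very different "significance":
z_small = z_for_smd(0.3, 20)   # small study: z below the 1.96 cutoff
z_large = z_for_smd(0.3, 200)  # large study: z well above the cutoff
```

The small study (z ≈ 0.95) would be read as a "failure to replicate" under vote-counting logic, while the large study (z ≈ 3.0) would be read as a success, even though both estimated exactly the same effect.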

Illustration

Simulated data from 21 validity studies. Taken from: Schmidt, F. L. (1996). Statistical significance testing and cumulative knowledge in psychology: Implications for training of researchers. Psychological Methods, 1.

Illustration (continued)

[Figure from the Schmidt (1996) simulation not reproduced in the transcript]

When Can You Do Meta-analysis?

Meta-analysis is applicable to collections of research that:
- Are empirical, rather than theoretical
- Produce quantitative results, rather than qualitative findings
- Examine the same constructs and relationships
- Have findings that can be configured in a comparable statistical form (e.g., as effect sizes, correlation coefficients, odds ratios, proportions)
- Are "comparable" given the question at hand

Forms of Research Findings Suitable to Meta-analysis

- Central tendency research: prevalence rates
- Pre-post contrasts: growth rates
- Group contrasts:
  - Experimentally created groups (e.g., comparison of outcomes between treatment and comparison groups)
  - Naturally occurring groups (e.g., comparison of spatial abilities between boys and girls; rates of morbidity among high- and low-risk groups)

Forms of Research Findings Suitable to Meta-analysis (continued)

- Association between variables:
  - Measurement research (e.g., validity generalization)
  - Individual differences research (e.g., correlation between personality constructs)

Effect Size: The Key to Meta-analysis

The effect size makes meta-analysis possible:
- It is the "dependent variable"
- It standardizes findings across studies such that they can be directly compared

Effect Size: The Key to Meta-analysis (continued)

Any standardized index can be an "effect size" (e.g., standardized mean difference, correlation coefficient, odds ratio) as long as it:
- Is comparable across studies (generally requires standardization)
- Represents the magnitude and direction of the relationship of interest
- Is independent of sample size

Different meta-analyses may use different effect size indices.
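
As a concrete sketch, two of the most common indices can be computed directly from the summary statistics a study reports; all numbers below are hypothetical, chosen only for illustration:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference: the group difference divided
    by the pooled within-group standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a, b = events / non-events in the treatment group
    c, d = events / non-events in the comparison group."""
    return (a / b) / (c / d)

d = cohens_d(105.0, 100.0, 10.0, 10.0, 50, 50)  # -> 0.5
ratio = odds_ratio(30, 20, 20, 30)              # -> 2.25
```

Note that both indices satisfy the independence-from-sample-size requirement: doubling n changes the precision of the estimate, not the index itself.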

The Replication Continuum

Pure Replications <---------------------------> Conceptual Replications

You must be able to argue that the collection of studies you are meta-analyzing examines the same relationship. This may be at a broad level of abstraction, such as the relationship between criminal justice interventions and recidivism, or between school-based prevention programs and problem behavior. Alternatively, it may be at a narrow level of abstraction and represent pure replications. The closer your collection of studies is to pure replications, the easier it is to argue comparability.

Which Studies to Include?

It is critical to have explicit inclusion and exclusion criteria (see pages 20-21):
- The broader the research domain, the more detailed the criteria tend to become
- Refine the criteria as you interact with the literature
- Components of detailed criteria: distinguishing features, research respondents, key variables, research methods, cultural and linguistic range, time frame, publication types

Methodological Quality Dilemma

Include or exclude low-quality studies?
- The findings of all studies are potentially in error (methodological quality is a continuum, not a dichotomy)
- Being too restrictive may limit the ability to generalize
- Being too inclusive may weaken the confidence that can be placed in the findings
- Methodological quality is often in the "eye of the beholder"
- You must strike a balance that is appropriate to your research question

Searching Far and Wide

Beware the "we only included published studies because they have been peer-reviewed" argument: significant findings are more likely to be published than nonsignificant findings. It is therefore critical to try to identify and retrieve all studies that meet your eligibility criteria.

Searching Far and Wide (continued)

Potential sources for identification of documents:
- Computerized bibliographic databases
- Internet search engines (e.g., Google)
- Authors working in the research domain (a relevant listserv?)
- Conference programs
- Dissertations
- Review articles
- Hand-searching relevant journals
- Government reports, bibliographies, clearinghouses

A Note About Computerized Bibliographies

- This is a rapidly changing area
- Get to know your local librarian!
- Searching one or two databases is generally inadequate
- Use "wild cards" (e.g., random? will find random, randomization, and randomize)
- Throw a wide net; filter down with a manual reading of the abstracts

Strengths of Meta-analysis

- Imposes a discipline on the process of summing up research findings
- Represents findings in a more differentiated and sophisticated manner than conventional reviews
- Capable of finding relationships across studies that are obscured by other approaches
- Protects against over-interpreting differences across studies
- Can handle a large number of studies (a volume that would overwhelm traditional approaches to review)

Weaknesses of Meta-analysis

- Requires a good deal of effort
- Its mechanical aspects don't lend themselves to capturing more qualitative distinctions between studies
- The "apples and oranges" criticism
- Most meta-analyses include "blemished" studies to one degree or another (e.g., a randomized design with attrition)
- Selection bias poses a continual threat:
  - Negative and null-finding studies that you were unable to find
  - Outcomes for which there were negative or null findings that were not reported
- Analysis of between-study differences is fundamentally correlational