Week Seven

 The systematic and rigorous integration and synthesis of evidence is a cornerstone of EBP
 Impossible to develop “best practice” guidelines, protocols, and procedures without organizing and evaluating research evidence through a systematic review

 Forms of systematic reviews:
  ◦ Narrative, qualitative integration (traditional review of quantitative or qualitative results)
  ◦ Meta-analysis (statistical integration of results)
  ◦ Metasynthesis (theoretical integration and interpretation of qualitative findings)

 Objectivity: statistical integration reduces bias and subjectivity in drawing conclusions when results of different studies are at odds
 Increased power: reduces the risk of Type II error compared with a single study
 Increased precision: yields narrower confidence intervals than single studies

 Research question or hypothesis should be essentially identical across studies.
  ◦ The “fruit” problem: don’t combine apples and oranges!
 There must be a sufficient knowledge base, i.e., enough studies of acceptable quality.
 Results can be varied but not totally at odds.

 Delineate the research question or hypothesis to be tested.
 Identify sampling criteria for studies to be included.
 Develop and implement a search strategy.
 Locate and screen the sample of studies meeting the criteria.

 Appraise the quality of the study evidence.
 Extract and record data from reports.
 Formulate an analytic plan (i.e., make analytic decisions).
 Analyze data according to plan.
 Write the systematic review.

 Identify electronic databases to use.
 Identify additional search strategies (e.g., ancestry approach).
 Decide whether or not to pursue the gray literature (unpublished reports).
 Identify keywords for the search:
  ◦ Think creatively and broadly.

 Meta-analysts must make decisions about handling study quality.
 Approaches:
  ◦ Omit low-quality studies (e.g., in intervention studies, non-RCTs).
  ◦ Give more weight to high-quality studies.
  ◦ Analyze low- and high-quality studies to see if effects differ (sensitivity analyses).

 Evaluations of study quality can use:
  ◦ A scale approach (e.g., use a formal instrument to “score” overall quality)
  ◦ A component approach (code whether certain methodologic features were present or not, e.g., randomization, blinding, low attrition)
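As a rough illustration of the component approach just described (and of how such codes might feed the weighting or sensitivity-analysis options from the previous slide), here is a minimal sketch; the study names, coded features, and high/low cutoff are all hypothetical:

```python
# Hypothetical component-approach coding: 1 = feature present, 0 = absent
quality_codes = {
    "Study A": {"randomization": 1, "blinding": 1, "low_attrition": 1},
    "Study B": {"randomization": 1, "blinding": 0, "low_attrition": 1},
    "Study C": {"randomization": 0, "blinding": 0, "low_attrition": 0},
}

for study, features in quality_codes.items():
    score = sum(features.values())              # simple count of features present
    group = "high" if score >= 2 else "low"     # illustrative cutoff for a sensitivity analysis
    print(f"{study}: {score}/3 features, {group}-quality subgroup")
```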

 Decisions include:
  ◦ What effect size index will be used?
  ◦ How will heterogeneity be assessed?
  ◦ Which analytic model will be used?
  ◦ Will there be subgroup (moderator) analyses?
  ◦ How will quality be addressed?
  ◦ Will publication bias be assessed?

 A central feature of meta-analysis is the calculation of an effect size (ES) index for each study that encapsulates that study’s results.
 The ES indexes from the individual studies are then pooled and averaged, typically weighting each study by its precision or sample size.
 Several different ES indexes can be used.
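In formula form, the pooling is most often an inverse-variance weighted mean (a standard choice; the slide itself only specifies weighting related to sample size):

```latex
\overline{ES} = \frac{\sum_{i=1}^{k} w_i \, ES_i}{\sum_{i=1}^{k} w_i},
\qquad w_i = \frac{1}{v_i}
```

Here ES_i is the effect size from study i, v_i is its sampling variance (which shrinks as sample size grows), and k is the number of studies, so larger and more precise studies pull the average toward their results.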

 Major effect size indexes:
  ◦ d: the standardized difference between two groups (e.g., experimental vs. control) on an outcome for which a mean can be calculated (e.g., BMI)
  ◦ Odds ratio (OR): the relative odds for two groups on a dichotomous outcome (e.g., smoke/not smoke)
  ◦ r: the correlation between two continuous variables (e.g., age and depression)
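A minimal sketch of how d and OR might be computed from reported summary statistics; all numbers are invented for illustration (r needs no computation here, since it is simply the correlation coefficient a study reports):

```python
import math

def cohens_d(mean_e, mean_c, sd_e, sd_c, n_e, n_c):
    """Standardized mean difference between experimental and control groups (pooled SD)."""
    pooled_sd = math.sqrt(((n_e - 1) * sd_e**2 + (n_c - 1) * sd_c**2) / (n_e + n_c - 2))
    return (mean_e - mean_c) / pooled_sd

def odds_ratio(events_e, n_e, events_c, n_c):
    """Relative odds of a dichotomous outcome (e.g., smoking) in two groups."""
    return (events_e / (n_e - events_e)) / (events_c / (n_c - events_c))

# Hypothetical summary statistics from a single study
print(cohens_d(27.1, 29.4, 4.0, 4.2, 50, 50))   # d ≈ -0.56 (lower BMI in the experimental group)
print(odds_ratio(12, 60, 24, 60))               # OR = 0.375 (lower odds of smoking)
```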

 Results (effects) inevitably vary from one study to the next.
 Major question: Is heterogeneity just random fluctuations?
  ◦ If “yes,” then a fixed effects model of analysis can be used.
  ◦ If “no,” then a random effects model should be used.
 Heterogeneity can be formally tested but also can be assessed visually via a forest plot.
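To make the fixed- versus random-effects choice concrete, here is a minimal sketch of inverse-variance pooling with Cochran's Q, I-squared, and the DerSimonian-Laird estimate of between-study variance (tau-squared); the five effect sizes and variances are invented:

```python
def pool_effects(effects, variances):
    """Pool per-study effect sizes: fixed-effect estimate, Q, I^2 (%), tau^2 (DerSimonian-Laird),
    and a random-effects estimate."""
    k = len(effects)
    w = [1 / v for v in variances]                                # fixed-effect (inverse-variance) weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))   # Cochran's Q
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0      # % of variation beyond chance
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                            # between-study variance
    w_re = [1 / (v + tau2) for v in variances]                    # random-effects weights
    random_ = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    return fixed, random_, q, i2, tau2

# Hypothetical d values and sampling variances from five studies
print(pool_effects([0.30, 0.45, 0.10, 0.60, 0.25], [0.04, 0.05, 0.03, 0.08, 0.06]))
```

When tau-squared is essentially zero, the fixed- and random-effects estimates coincide; a sizeable I-squared or tau-squared is the signal that a random effects model (and perhaps the subgroup analyses on the next slide) is warranted.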

 Factors that might explain variation in effects are usually explored via subgroup analysis (moderator analysis).
 Do variations relate to:
  ◦ Participant characteristics (e.g., men vs. women)?
  ◦ Methods (e.g., RCTs vs. quasi-experiments)?
  ◦ Intervention characteristics (e.g., 3-week vs. 6-week intervention)?
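As a sketch of a simple moderator analysis, a pooled (fixed-effect) estimate can be computed separately within each subgroup and the subgroup estimates compared; the studies and the design moderator below are hypothetical:

```python
def weighted_mean(effects, variances):
    """Fixed-effect (inverse-variance) pooled estimate for one subgroup of studies."""
    w = [1 / v for v in variances]
    return sum(wi * y for wi, y in zip(w, effects)) / sum(w)

# Hypothetical studies coded for one moderator: study design
studies = [
    {"d": 0.50, "var": 0.04, "design": "RCT"},
    {"d": 0.42, "var": 0.05, "design": "RCT"},
    {"d": 0.15, "var": 0.03, "design": "quasi-experiment"},
    {"d": 0.22, "var": 0.06, "design": "quasi-experiment"},
]

for design in ("RCT", "quasi-experiment"):
    subset = [s for s in studies if s["design"] == design]
    pooled = weighted_mean([s["d"] for s in subset], [s["var"] for s in subset])
    print(f"{design}: pooled d ≈ {pooled:.2f}")   # compare effects across subgroups
```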

 Nonpublished studies are more likely to have no effects or weak effects than published ones.
  ◦ So…excluding them could result in overestimating effects.
 One approach:
  ◦ Compute a fail-safe number to see how many studies with 0 effect would be needed to change conclusions from significant to nonsignificant.
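One classic version of this idea is Rosenthal's fail-safe N, sketched below under the assumption that a one-tailed z statistic is available for each included study (the z values are invented):

```python
def failsafe_n(z_values, z_alpha=1.645):
    """Rosenthal's fail-safe N: the number of unpublished studies averaging zero effect
    that would be needed to drag the combined one-tailed result back to p = .05."""
    k = len(z_values)
    return (sum(z_values) / z_alpha) ** 2 - k

# Hypothetical per-study z statistics from the studies in a review
print(round(failsafe_n([2.1, 1.8, 2.5, 1.4, 2.9])))   # ≈ 37 "file drawer" studies
```

A fail-safe N that is large relative to the number of included studies is usually read as reassurance that publication bias alone is unlikely to overturn the conclusion.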

 One definition: the bringing together and breaking down of findings, examining them, discovering essential features, and combining phenomena into a transformed whole
 Integrations that are more than the sum of the parts: novel interpretations of integrated findings

 Whether to exclude low-quality studies
 Whether to integrate studies based in multiple qualitative traditions
 Various typologies and approaches; differing terminology

 Similar to meta-analysis in many ways:
  ◦ Formulate the question
  ◦ Decide selection criteria and search strategy
  ◦ Search for and locate studies
  ◦ Extract data for analysis
  ◦ Formulate and implement an analysis approach
  ◦ Integrate, interpret, and write up results

 Noblit and Hare developed an approach known as meta-ethnography:
  ◦ They suggest a 7-phase approach
  ◦ It involves “translating” the findings from qualitative studies into one another
  ◦ An “adequate translation maintains the central metaphors and/or concepts of each account”
  ◦ The final step is synthesizing the translations

 Paterson and colleagues’ approach involves three components:
  ◦ Meta-data analysis (analyzing and integrating the study findings)
  ◦ Meta-method (analyzing the methods and rigor of the studies in the analysis)
  ◦ Meta-theory (analysis of the studies’ theoretical underpinnings)

 Sandelowski and Barroso’s approach distinguishes studies that are summaries (no conceptual reframing) from those that are syntheses (studies involving interpretation and metaphorical reframing).
 Both summaries and syntheses can be used in a meta-summary, which can lay a foundation for a metasynthesis.

 Meta-summaries involve making an inventory of findings and can be aided by computing manifest effect sizes (effect sizes calculated from the manifest content of the studies in the review).
 Two types:
  ◦ Frequency effect size
  ◦ Intensity effect size

 Frequency effect size:
  ◦ Count the total number of findings (specific themes or categories) across all studies in the review.
  ◦ Compute the prevalence of each theme across all reports (e.g., the #1 theme was present in 75% of reports).
 Intensity effect size:
  ◦ For each report, compute how many of the total themes are included (e.g., report 1 had 60% of all themes identified).
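A minimal sketch of these two calculations, using an invented theme-by-report matrix (the themes, reports, and resulting percentages are purely illustrative):

```python
# Hypothetical theme-by-report matrix: 1 = theme appears in that report, 0 = it does not
reports = {
    "Report 1": {"stigma": 1, "fatigue": 1, "family support": 0, "uncertainty": 1},
    "Report 2": {"stigma": 1, "fatigue": 0, "family support": 1, "uncertainty": 0},
    "Report 3": {"stigma": 1, "fatigue": 1, "family support": 1, "uncertainty": 1},
    "Report 4": {"stigma": 0, "fatigue": 1, "family support": 0, "uncertainty": 0},
}
themes = ["stigma", "fatigue", "family support", "uncertainty"]

# Frequency effect size: proportion of reports in which each theme appears
for theme in themes:
    frequency = sum(r[theme] for r in reports.values()) / len(reports)
    print(f"{theme}: frequency ES = {frequency:.0%}")

# Intensity effect size: proportion of all identified themes contained in each report
for name, r in reports.items():
    intensity = sum(r[theme] for theme in themes) / len(themes)
    print(f"{name}: intensity ES = {intensity:.0%}")
```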

 A metasynthesis can build on a meta-summary.
 However, it can only be done with studies that are syntheses (not summaries), because the purpose is to offer novel interpretations of interpretive findings, not just summaries of findings.