Principles of Research Synthesis
Benjamin Djulbegovic, M.D., Ph.D.
H. Lee Moffitt Cancer Center, University of South Florida
San Francisco Radiation Oncology Conference, February 28 to March 2, 2003

I The need for research synthesis

The need for research synthesis
Health care decision makers need to access research evidence to make informed decisions on diagnosis, treatment and health care management for both individual patients and populations. There are few important questions in health care that can be answered by consulting the results of a single empirical study.

II The problems with traditional review articles

The need for research synthesis: importance of review articles
– Review articles in medical journals summarize large amounts of information on a particular topic and are therefore a useful and popular source of information for health care professionals
– Review articles are among the most highly cited types of article, which means that research, practice, and policy decisions are significantly influenced by them

Science of research synthesis: problems with traditional review articles
– Personal views on the available body of evidence
– Selection bias and selective citation
  – Has been pervasive in medicine, economics and the social sciences
  – Can obscure up to 40-60% of an intervention's true effect
  – In 2000, the Nobel Prize in Economic Sciences was awarded to James Heckman of the University of Chicago for his analysis of selection bias, which in turn profoundly affected applied research in economics as well as in other social sciences
– Lack of reproducibility, that is, failure to meet a key scientific criterion

Selective citation bias: the blind men and the elephant

Critique of 53 reviews of chemotherapy for ovarian cancer
– Chemotherapy (CRx) deemed superior by qualitative analysis: 48/53
– Search strategy reported: 3/53
– Inclusion/exclusion criteria reported: 2/53
– Validity assessment: 1/53
– Quantitative assessment: 3/53
Courtesy of Dr. C. Williams

Quality of review articles: of 158 articles, only 2 met all 10 criteria. (Ann Intern Med 1999;131)

Research synthesis: terminology
Systematic review: the application of strategies that limit bias in the assembly, critical appraisal, and synthesis of all relevant studies on a specific topic. Meta-analysis may be, but is not necessarily, used as part of this process.
Meta-analysis: the statistical synthesis of data from separate but similar (i.e. comparable) studies, leading to a quantitative summary of the pooled results.

Key distinctions between narrative and systematic reviews, by core feature

Study question
– Narrative review: often broad in scope
– Systematic review: often a focused clinical question

Data sources and search strategy
– Narrative review: which databases were searched and the search strategy are not typically provided
– Systematic review: comprehensive search of many databases as well as the so-called gray literature; explicit search strategy

Selection of articles for study
– Narrative review: not usually specified; potentially biased
– Systematic review: criterion-based selection, uniformly applied

Article review or appraisal
– Narrative review: variable, depending on who is conducting the review
– Systematic review: rigorous critical appraisal, typically using a data extraction form

Study quality
– Narrative review: if assessed, may not use formal quality assessment
– Systematic review: some assessment of quality is almost always included as part of the data extraction process

Synthesis
– Narrative review: often a qualitative summary
– Systematic review: quantitative summary (meta-analysis) if the data can be appropriately pooled; qualitative otherwise

Inferences
– Narrative review: sometimes evidence-based
– Systematic review: usually evidence-based

Principles of reliable detection of the effects of health care interventions
– Methods to reduce bias
– Methods to reduce statistical imprecision

III Principles of systematic reviews and meta-analysis

Principle #1: the need to consider the totality of evidence
– "The world can only be considered as the totality of facts… for the totality of facts determines what is the case, and also whatever is not the case." L. Wittgenstein ("Tractatus Logico-Philosophicus"), 1921

Principle #2: requirement for reproducibility
A transparent, explicit and systematic approach to identifying and synthesizing evidence:
– Methods of searching for evidence
– Inclusion and exclusion criteria
– Quality assessment

Steps of a systematic review
1. Identify studies: search of personal files, systematic manual searches of key journals, computerized databases, review of reference lists of articles, consultation with experts
2. Review for relevance (relevant vs. not relevant; reject studies that are not relevant)
3. Evaluate methodological quality
4. Extract data
5. Analyze data
6. Draw conclusions

The QUOROM (Quality of Reporting of Meta-analyses) statement

Principles of reliable detection of the effects of health care interventions
– Systematic bias must be smaller than the effect of the intervention we are trying to detect
  – The need for the totality of evidence (published and unpublished)
– Random errors (the play of chance) must be smaller than the effect of the intervention we are trying to detect
  – Uncertainty/imprecision is reduced by pooling all available data
  – Need for a large number of patients/events

Rationale for (quantitative) synthesis of all available evidence
– The rationale for pooling data is clinical, not statistical
– Similar interventions for similar conditions will produce similar effects (i.e. in the same direction)
  – While the effect size may not be the same, it will rarely be in the opposite direction
  – Meta-analysis attempts to show the direction of the effect (i.e. to help establish the generalisability of the effect)
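A minimal sketch of the fixed-effect, inverse-variance pooling that the forest plots below summarize ("RR, 95% CI, fixed"); the trial counts are invented for illustration, not taken from any real study:

```python
import math

# Hypothetical trials: (events_treated, n_treated, events_control, n_control)
trials = [(30, 200, 45, 200), (12, 100, 18, 100), (50, 400, 70, 400)]

log_rrs, weights = [], []
for a, n1, c, n0 in trials:
    rr = (a / n1) / (c / n0)                   # relative risk in this trial
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n0)    # standard error of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)                  # inverse-variance weight

pooled = sum(w * y for w, y in zip(weights, log_rrs)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
lo, hi = math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled)
print(f"Pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Each trial contributes its own within-trial comparison; the pooled estimate is a weighted average of per-trial effects, which mainly serves to establish the direction of the effect with greater precision.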

[Figure A: forest plot of three RCTs (RCT1-RCT3) sampled from the same disease population; relative risk with 95% CI (fixed-effect model); test for heterogeneity: chi-square, df = 2, p = 0.1; test for overall effect: Z statistic; axis runs from "favors new treatment" to "favors control".]

[Figure B: forest plot of three RCTs (RCT1-RCT3) sampled from the same disease population; relative risk with 95% CI (fixed-effect model); test for heterogeneity: chi-square, df = 2, p = 0.02; test for overall effect: Z statistic; axis runs from "favors new treatment" to "favors control".]
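The heterogeneity tests quoted in panels A and B (chi-square, df = 2) are Cochran's Q: the inverse-variance-weighted sum of squared deviations of each trial's effect from the pooled effect, referred to a chi-square distribution with k − 1 degrees of freedom. A sketch with invented per-trial log relative risks (scipy assumed for the p-value):

```python
from scipy.stats import chi2

# Hypothetical per-trial log relative risks and their standard errors
log_rr = [-0.40, -0.35, -0.05]
se = [0.20, 0.25, 0.15]

w = [1 / s**2 for s in se]                                  # inverse-variance weights
pooled = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)   # fixed-effect pooled log RR

q = sum(wi * (y - pooled)**2 for wi, y in zip(w, log_rr))   # Cochran's Q statistic
df = len(log_rr) - 1
print(f"Q = {q:.2f}, df = {df}, p = {chi2.sf(q, df):.3f}")  # small p suggests heterogeneity
```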

Calculate "observed minus expected" for each trial
Example 2×2 table (Treated vs. Control, Dead vs. Alive): Obs = 15, Obs = 10, Exp = …, o − e = −2.5, v = 5.5, odds ratio = 0.64, conf. int. = …, P = 0.29
Courtesy of Dr. K. Wheatley
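A sketch of this single-trial Peto calculation. The 2×2 counts below are hypothetical (they reproduce o − e = −2.5 but not the slide's exact variance or odds ratio, since the slide's table is only partly legible), and the formulas are the standard Peto observed-minus-expected ones.

```python
import math

def peto_trial(dead_t, alive_t, dead_c, alive_c):
    """Peto (O - E), its variance, and the odds ratio for one 2x2 trial table."""
    n_t, n_c = dead_t + alive_t, dead_c + alive_c
    n, deaths = n_t + n_c, dead_t + dead_c
    expected = n_t * deaths / n                                # expected deaths on treatment
    o_minus_e = dead_t - expected
    v = n_t * n_c * deaths * (n - deaths) / (n**2 * (n - 1))   # hypergeometric variance
    or_peto = math.exp(o_minus_e / v)                          # Peto odds ratio
    ci = (math.exp((o_minus_e - 1.96 * math.sqrt(v)) / v),
          math.exp((o_minus_e + 1.96 * math.sqrt(v)) / v))     # approximate 95% CI
    return o_minus_e, v, or_peto, ci

# Hypothetical trial: 15/100 deaths on treatment vs. 20/100 on control
print(peto_trial(15, 85, 20, 80))    # o - e = -2.5, with variance, OR, and 95% CI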

Compare only patients in one trial with patients in the same trial
Statistics: observed − expected, and variance
– Trial 1: (o − e)₁, V₁
– Trial 2: (o − e)₂, V₂
– Trial 3: (o − e)₃, V₃
– All trials: (o − e)ₜ, Vₜ (the totals over trials)
Courtesy of Dr. K. Wheatley

Compare only patients in one trial with patients in the same trial
Statistics (observed − expected and variance) for Trial 1, Trial 2, …, and all trials combined
Odds ratio = …; 95% confidence interval: 0.49 to 0.83; P < 0.001
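Combining trials then simply adds each trial's (o − e) and its variance, never the patients themselves; the pooled odds ratio is the exponential of the summed (o − e) divided by the summed variance. The per-trial statistics below are invented, so the output only loosely resembles the slide's result:

```python
import math

# Hypothetical per-trial Peto statistics: (observed - expected, variance)
per_trial = [(-2.5, 5.5), (-4.0, 9.0), (-1.5, 4.0)]

sum_oe = sum(oe for oe, _ in per_trial)
sum_v = sum(v for _, v in per_trial)

pooled_or = math.exp(sum_oe / sum_v)                         # pooled Peto odds ratio
lo = math.exp((sum_oe - 1.96 * math.sqrt(sum_v)) / sum_v)
hi = math.exp((sum_oe + 1.96 * math.sqrt(sum_v)) / sum_v)
print(f"Pooled OR = {pooled_or:.2f} (95% CI {lo:.2f} to {hi:.2f}), from {len(per_trial)} trials")
```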

Rationale for (quantitative) synthesis of all available evidence
– Reduction of bias: comparison of like with like
  – Use randomized comparisons whenever possible
  – Always compare within the same trial
  – Pooling is done by adding trials (not patients)
– Reduction of imprecision and uncertainty
  – Particularly important when the effects of interventions are of small to moderate size (e.g. RRR = 5-10% or 15-25%)
  – A 20% reduction in a 50% risk of death = avoidance of death in 1 in 10 patients

Effect of random errors
– A function of the size of the trial
– Subgroup analysis

Evidence of no effect, or no evidence of an effect?
– Absence of evidence of benefit is not evidence of absence of benefit
– Truly negative trial (evidence of no effect) vs. false-negative trial (no evidence of an effect)
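A rough illustration of why a single small trial so often yields only "no evidence of an effect": the approximate power of a two-arm trial to detect a 20% relative risk reduction from a 50% baseline risk (0.50 vs. 0.40 mortality), using the normal approximation for two proportions. The sample sizes are arbitrary.

```python
import math
from scipy.stats import norm

def power_two_proportions(p1, p2, n_per_arm, alpha=0.05):
    """Approximate power to detect a difference between two proportions."""
    se = math.sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.sf(z_crit - abs(p1 - p2) / se)   # P(significant result | true effect)

for n in (50, 200, 1000):
    print(f"n = {n} per arm: power ~ {power_two_proportions(0.50, 0.40, n):.2f}")
```

With only 50 patients per arm the approximate power is under 20%, so a non-significant result is far more likely to be a false negative than evidence of no effect; pooling trials supplies the patients and events that individual small trials lack.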

Size of randomized trials in myeloma

Effect of chance: data-dependent subgroup analysis vs. indirect extrapolation of the overall analysis
– Data-dependent subgroup analyses may result in importantly biased conclusions… and should be avoided
– Paradoxically, even effects among specific categories of patients may be best assessed indirectly, by applying the overall treatment effect to the patients in that specific category
  – As long as the effect in the specific subgroup is not qualitatively different from the overall effect

Real trial (ISIS-2): EXAGGERATEDLY POSITIVE mortality effect in a subgroup defined only by astrological "birth sign"
Atenolol effect on day 0-1 mortality in acute myocardial infarction (mortality reduction vs. control; statistical significance, 2P):
– Leo (i.e. born between July 24 & August 23): 71% ± 23, 2P < 0.01
– 11 other birth signs (taken separately): mean 24%, each 2P > 0.1 (NS)
– Any birth sign (appropriate overall analysis): 30% ± 10, 2P < 0.004
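The birth-sign example can be mimicked by simulation: give every simulated patient the same true treatment effect, split them into 12 random "signs", and some subgroups will show exaggerated, null, or even reversed effects purely by chance. A sketch with arbitrary parameters, not the trial's actual data:

```python
import random

random.seed(1)
n_per_arm = 6000
p_control, p_treated = 0.10, 0.08     # same true 20% relative risk reduction for everyone

def simulate_arm(n, p):
    """One arm: list of (random 'birth sign', died?) pairs."""
    return [(random.randrange(12), random.random() < p) for _ in range(n)]

treated, control = simulate_arm(n_per_arm, p_treated), simulate_arm(n_per_arm, p_control)

for sign in range(12):
    dead_t = sum(d for s, d in treated if s == sign)
    n_t = sum(1 for s, _ in treated if s == sign)
    dead_c = sum(d for s, d in control if s == sign)
    n_c = sum(1 for s, _ in control if s == sign)
    rrr = 1 - (dead_t / n_t) / (dead_c / n_c)      # apparent relative risk reduction
    print(f"sign {sign:2d}: apparent RRR = {rrr:+.0%}")
```

The appropriate analysis remains the overall one; apparent subgroup effects as extreme as the "Leo" result arise easily from this kind of random subdivision.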