1 DrPH Seminar Session 1: Use of Systematic Review in Public Health Policy & Getting Started Defining SR Questions
Mei Chung, PhD, MPH
Research Assistant Professor, Nutrition/Infection Unit, Department of Public Health and Community Medicine, Tufts School of Medicine
2 Going Over the Syllabus
Session 1 (9/11, 5:30-8:30pm)
–Homework due by 9/22 (Monday): post your homework on the class website (http://sites.tufts.edu/systematicreviews/home-page/) and comment on others’ postings
Session 2 (9/22, 4-7pm at Sackler 510 Computer Lab)
–Bring your “Building a search strategy” worksheet to the class
–Homework due by 10/8 (Monday)
3 Going Over the Syllabus
Session 3 (10/9, 5:30-8:30pm)
–Homework due by 10/20 (Monday)
Session 4 (10/23, 5:30-8:30pm)
–Post your final presentation slides on the class website by noon on 10/23
4 Outline of Session 1
Why systematic reviews are needed in public health policy and practice
Very quick overview of SR methods
SR versus traditional narrative review
–Current debates on the scientific value of SR
How to formulate a SR research question
How to evaluate a SR (focusing only on SR protocol/methods)
5 Evidence-based X
Evidence-based public health is defined as the development, implementation, and evaluation of effective programs and policies in public health through application of principles of scientific reasoning, including systematic uses of data and information systems, and appropriate use of behavioral science theory and program planning models.
Source: Brownson, Ross C., Elizabeth A. Baker, Terry L. Leet, and Kathleen N. Gillespie, Editors. Evidence-Based Public Health. New York: Oxford University Press, 2003.
6 Evidence-based X
Evidence-based dietetic practice is the use of systematically reviewed scientific evidence in making food and nutrition practice decisions by integrating best available evidence with professional expertise and client values to improve outcomes.
Source: Academy Scope of Dietetics Practice Framework, https://www.andeal.org/evidence-analysis-process-overview
7 Organizations producing systematic reviews or using systematic review to inform evidence-based policy and practice guidelines
http://sites.tufts.edu/systematicreviews/mainpage/
8 Synthesizing Evidence
Narrative Reviews
Systematic Reviews
Meta-Analysis
Decision Analysis
Cost-effectiveness analysis
Clinical practice guidelines
Algorithms
9 What is a Systematic Review? (Sometimes called systematic evidence-based reviews or evidence reviews)
Systematic review – a comprehensive summary of all available evidence that meets predefined eligibility criteria to address a specific clinical question or range of questions
Meta-analysis – commonly included in systematic reviews; a statistical method that quantitatively combines the results from different studies
10 Basic Steps in a SR
Prepare topic
Search for studies
Screen studies
Extract data
Analyze and synthesize data
Apply qualitative and/or quantitative methods
Report findings
11 Ask – Identify – Acquire – Appraise – Synthesize
12 Systematic review process flowchart
13 Systematic review and meta-analysis are retrospective exercises, suffering from all the limitations of an observational design.
14 For all research, and for systematic reviews in particular, a clear research question is needed.
An important clinical/public health question might not be a meaningful research question.
15 Ask – Identify – Acquire – Appraise – Synthesize
16 Example - poorly formulated question: Should dietary supplements be recommended to patients with hypertension?
18 PICO approach to formulating an answerable research question (Counsell, 1997)
P – population
I – intervention (or exposure)
C – comparator
O – outcomes
(D – study design)
You’ll also see PICOS (study design), PICODD (+duration), PICOT (time), and others
19 PI(E)COS approach to formulating an answerable research question
What is the relevant population?
What is the intervention/exposure of interest?
What is the appropriate comparison?
What are the important outcomes of interest?
In what setting would the results be applicable?
You’ll also see PICOT (timing), PICOD (design/duration), and others
20 The PICO method to formulate a research question on interventions
Population: primary prevention; secondary prevention
Interventions / Exposure: fish, fish oil, ALA; dosage; background intake; duration
Comparator: placebo; no control; active comparator
Outcomes: overall mortality; sudden death; revascularization; stroke; blood pressure
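To make the jump from a PICO question to the “Building a search strategy” worksheet more concrete, here is a minimal Python sketch, not part of the original slides: it stores illustrative PICO elements for the omega-3/CVD example and ORs the synonyms within each element before ANDing the elements together. The term lists, the choice to search only on population and intervention, and the build_query helper are assumptions for illustration, not a validated search strategy.

```python
# Illustrative PICO elements for the omega-3/CVD example (hypothetical term lists).
pico = {
    "population": ["cardiovascular disease", "myocardial infarction", "primary prevention"],
    "intervention": ["omega-3 fatty acids", "fish oil", "EPA", "DHA", "alpha-linolenic acid"],
    "comparator": ["placebo"],
    "outcomes": ["mortality", "sudden death", "stroke", "blood pressure"],
}

def build_query(elements, use=("population", "intervention")):
    """OR the synonyms within each element, then AND the elements together.
    Comparators and outcomes are often left out of the search itself and
    applied later at the screening stage, so only P and I are used by default."""
    blocks = []
    for name in use:
        terms = elements[name]
        blocks.append("(" + " OR ".join(f'"{t}"' for t in terms) + ")")
    return " AND ".join(blocks)

print(build_query(pico))
```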
21 Populations
Problems with defining the condition
Varying definitions
–lack of an adequate reference standard (e.g., patients with anemia; patients with metabolic syndrome)
Different levels of rigor
–Loose vs. strict definitions (e.g., “elderly” vs. “men and women who are 60 years old or greater”)
–Applicability/generalizability tradeoffs
22 Example – population of interest
Primary Prevention – patients without prior history of cardiovascular disease
–Country
–Background diet
Secondary Prevention – patients with prior history of cardiovascular disease
23 Example – Intervention / Exposure of Interest
What is an omega-3 fatty acid?
EPA, DHA (fish oil, fish)
–Levels differ by type of fish
–Levels (and/or effect) may differ by preparation (broiled, fried fish sandwiches)
ALA (plant sources: walnut, canola oil, mustard seed, etc.)
24 Example – Outcomes of interest
Hard outcomes (clinical events)
–Overall mortality
–Stroke
–Myocardial infarction
–Sudden death
–Revascularization
Soft [surrogate, intermediate] outcomes (biomarkers, measurements)
–Coronary flow
–Blood pressure
–Lipid levels
Intermediate
–Diagnosis of hypertension
25 Analytic Framework
A series of specific questions can be formulated into a model that analyzes all effects and interactions between an intervention or exposure and outcomes
An analytic framework can be used to clarify and generate questions (topics)
Can highlight what aspects are known and unknown
Can clarify what study designs may be best to address specific questions
26 USPSTF, Nelson et al., Ann Intern Med. 2009;151(10):727-737
27 Omega-3 FAs and CVD
28 Ask – Identify – Acquire – Appraise – Synthesize
(Will be covered in session 2)
29 Goals of Data Extraction & Quality Assessment
Data extraction
–To collect key study characteristics and results from published articles pertaining to the SR research question
–Important to use a standardized form, customized for the SR research question
Quality / risk of bias assessment
–To avoid “Garbage in, garbage out”
–To assess the confidence in the validity of study findings
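As a rough illustration of what a standardized extraction form might capture for the omega-3/CVD example, here is a minimal Python sketch, not from the slides; the field names and the example entry are hypothetical, and a real form would be customized to the SR question and piloted before use.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    study_id: str                       # e.g., first author and year
    design: str                         # RCT, prospective cohort, case-control, ...
    population: str                     # primary vs. secondary prevention, country, background diet
    intervention: str                   # source of omega-3, dose, duration
    comparator: str                     # placebo, no control, active comparator
    outcome: str                        # e.g., overall mortality
    effect_estimate: Optional[float] = None   # e.g., risk ratio
    ci_lower: Optional[float] = None
    ci_upper: Optional[float] = None
    risk_of_bias: str = "unclear"       # rating from a tool such as the Cochrane RoB tool
    notes: str = ""

# One record per study per outcome keeps later tables and analyses reproducible
# without repeated trips back to the full-text articles.
example = ExtractionRecord(
    study_id="Hypothetical Trial 2005",
    design="RCT",
    population="secondary prevention, Europe",
    intervention="fish oil ~1 g/day for 3.5 years",
    comparator="no-supplement control",
    outcome="overall mortality",
)
```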
30 Rationale for Quality Assessment
Quality assessment of all studies included in the SR is important:
–to estimate the extent to which a study’s design and methods prevented systematic errors (biases)
–variation in quality may explain differences in the results of SRs
–necessary even if there is little variability among studies (consistent trash is still trash)
31 Tools for Quality/Risk of Bias Assessment
Many tools, but few “validated” tools:
–Cochrane risk of bias assessments (http://bmg.cochrane.org/assessing-risk-bias-included-studies): RCTs
–The Newcastle-Ottawa Scale (http://www.ohri.ca/programs/clinical_epidemiology/oxford.asp): observational studies
No well-accepted nutrition/public health specific [content specific] quality assessment tools
–Lichtenstein AH, Yetley EA, Lau J. Application of systematic review methodology to the field of nutrition. J Nutr 2008;138:2297-2306
32 Linking Quality Assessment to Analysis
As a threshold for inclusion and exclusion of studies in the review (generally not recommended)
As a possible explanation for differences in results between studies
As a variable in sensitivity analysis (test of robustness) – see the sketch below
As weights in statistical analysis of the results
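The sensitivity-analysis option can be illustrated with a minimal sketch, assuming hypothetical study effects and quality ratings: pool all studies, then re-pool only the high-quality ones and compare. The data and the simple fixed-effect inverse-variance pooling are illustrative only.

```python
import math

# (log risk ratio, standard error, quality rating): hypothetical studies
studies = [
    (-0.22, 0.10, "high"),
    (-0.05, 0.08, "high"),
    (-0.40, 0.15, "low"),
    (-0.35, 0.20, "low"),
]

def pooled_rr(rows):
    """Fixed-effect inverse-variance pooled risk ratio."""
    weights = [1 / se ** 2 for _, se, _ in rows]
    log_rr = sum(w * y for w, (y, _, _) in zip(weights, rows)) / sum(weights)
    return math.exp(log_rr)

print("All studies:       RR =", round(pooled_rr(studies), 2))
print("High quality only: RR =", round(pooled_rr([s for s in studies if s[2] == "high"]), 2))
# If the two pooled estimates differ materially, study quality is a plausible
# explanation for the differences in results across studies.
```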
33 Ask – Identify – Acquire – Appraise – Synthesize
34 Qualitative & Quantitative Syntheses
Qualitative synthesis – required
–Summary tables (many different forms): key study characteristics; summary of study results
–Graphical presentation of study results (a plus)
Quantitative synthesis (a.k.a. meta-analysis) – optional; see the pooling sketch below
–highly dependent on the types of results/data, and on the reporting of the data
–may not be appropriate – to pool or not to pool can be a tricky decision
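As a minimal sketch of the quantitative-synthesis step, assuming hypothetical log risk ratios and standard errors, the code below computes a DerSimonian-Laird random-effects pooled estimate; a real meta-analysis would normally rely on an established package rather than hand-rolled code.

```python
import math

log_rr = [-0.22, -0.05, -0.40, -0.35]   # hypothetical study effects (log risk ratios)
se = [0.10, 0.08, 0.15, 0.20]           # their standard errors

# Fixed-effect (inverse-variance) estimate, used as the starting point
w_fixed = [1 / s ** 2 for s in se]
pooled_fixed = sum(w * y for w, y in zip(w_fixed, log_rr)) / sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q = sum(w * (y - pooled_fixed) ** 2 for w, y in zip(w_fixed, log_rr))
df = len(log_rr) - 1
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooled estimate with a 95% confidence interval
w_random = [1 / (s ** 2 + tau2) for s in se]
pooled = sum(w * y for w, y in zip(w_random, log_rr)) / sum(w_random)
se_pooled = math.sqrt(1 / sum(w_random))
print(f"Random-effects RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se_pooled):.2f} "
      f"to {math.exp(pooled + 1.96 * se_pooled):.2f})")
```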
35 Summary Tables (I)
Combine data from multiple studies to illustrate trends in the data
May be focused on describing study characteristics, results, or both
Can be designed to include characteristics of all included studies
–Examples: funding sources, assessment method, country of study
Can be designed for subsets of included studies
–Examples: summary tables for randomized controlled trials, prevalence studies, harms/side effects, outcomes for specific treatments
36 Summary Tables (II)
Simplified entry (one row) for each study
Table columns may include, for example:
–PICOTS (may be listed in table title or headers)
–Methodological quality
–Applicability
–Study size (weight)
–Magnitude of effect
A single study may be represented in multiple summary tables (e.g., different outcomes)
PICOTS = population, intervention, comparator, outcomes, timing, and setting
37 Example: Summary Table of Study Characteristics
A basic summary table is the “study characteristics” table. The overall summary provides an overview of the state of the available studies in the literature.
Hartmann KE, et al. AHRQ Evidence Report/Technology Assessment No. 187. Available at: http://www.ahrq.gov/downloads/pub/evidence/pdf/bladder/bladder.pdf
38 Example: Summary Table of Study Characteristics (More descriptive, most common)
39 Example: Summary Table for Cohort Studies
Wang C, et al. AHRQ Evidence Report/Technology Assessment No. 94. Available at: http://www.ahrq.gov/downloads/pub/evidence/pdf/o3cardio/o3cardio.pdf
40 Example: Summary tables can be specialized for different types of outcomes
41 Summary Matrix Wang C, et al. AHRQ Evidence Report/Technology Assessment No. 94. Available at: http://www.ahrq.gov/downloads/pub/evidence/pdf/o3cardio/o3cardio.pdf.
42 Example: Graphical presentation of the study results
43 An Example of an Assessment of Strength of Body of Evidence
High – High level of assurance with the validity of the results (based on quality, applicability, effect size, consistency) for the key question; at least 2 high-quality studies with long-term follow-up; no important disagreement across studies
Moderate – Good to moderate level of assurance with the validity of the results; fewer than 2 high-quality studies; little disagreement across studies in the results
Low – Low level of assurance with the validity of results; based on studies of moderate to poor quality or limited applicability
Insufficient – Little data, or disagreement across or within studies
44 Key Messages
Summary tables and graphical presentations provide key information on study characteristics and study findings, in tabular and graphical formats, respectively
Properly constructed summary tables:
–Effectively convey results
–Provide an overview of the literature in a given field
–Enable the reader to grasp results for subsets of the literature
45 Synonyms of Meta-Analysis
Quantitative overview/synthesis
Pooling
–Less precise
–Suggests that data from multiple sources are simply lumped together
Combining
–Preferred by some
–Suggests applying statistical procedures to data
46 Reading a Generic Forest Plot
Reference: Szajewska H. The role of meta-analysis in the evaluation of the effects of early nutrition on mental and motor development in children. Am J Clin Nutr. 2011 Dec;94(6 Suppl):1889S-1895S. Epub 2011 Apr 27. Review.
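For readers who want to see how a generic forest plot is put together, here is a minimal matplotlib sketch, not from the slides; the study labels, estimates, and confidence intervals are made-up numbers used only to show the layout (point estimates, horizontal confidence intervals, and the vertical line of no effect on a log scale).

```python
import matplotlib.pyplot as plt

studies = ["Study A", "Study B", "Study C", "Pooled"]
rr = [0.80, 0.95, 0.67, 0.82]        # made-up risk ratios
ci_low = [0.62, 0.78, 0.45, 0.71]    # made-up 95% CI lower bounds
ci_high = [1.03, 1.16, 0.99, 0.95]   # made-up 95% CI upper bounds

fig, ax = plt.subplots(figsize=(5, 2.5))
y_positions = range(len(studies))[::-1]          # first study at the top
for y, est, lo, hi in zip(y_positions, rr, ci_low, ci_high):
    ax.plot([lo, hi], [y, y], color="black")     # horizontal confidence interval
    ax.plot(est, y, "s", color="black")          # square marker for the point estimate
ax.axvline(1.0, linestyle="--", color="gray")    # vertical line of no effect (RR = 1)
ax.set_yticks(list(y_positions))
ax.set_yticklabels(studies)
ax.set_xscale("log")                             # ratio measures are usually shown on a log scale
ax.set_xlabel("Risk ratio (log scale)")
plt.tight_layout()
plt.show()
```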
47 Heterogeneity of Data ~Diversity~
Clinical: Are studies of similar treatments, populations, settings, design, etc. such that an average effect would be clinically meaningful?
Methodological: Are studies of similar design and conduct such that an average effect would be clinically meaningful?
Statistical: Is the observed variability of effects greater than that expected by chance alone?
Are the characteristics and effects of studies sufficiently similar to estimate an average effect?
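The statistical question, whether the observed variability exceeds what chance alone would produce, is usually summarized with Cochran's Q and I². A minimal sketch, assuming the same kind of hypothetical log risk ratios and standard errors as in the pooling sketch above:

```python
# Hypothetical study effects and standard errors (illustration only)
log_rr = [-0.22, -0.05, -0.40, -0.35]
se = [0.10, 0.08, 0.15, 0.20]

w = [1 / s ** 2 for s in se]
fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)

# Cochran's Q: weighted squared deviations of study effects from the pooled effect
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
df = len(log_rr) - 1
# I^2: the share of total variability attributable to between-study heterogeneity
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"Q = {q:.2f} on {df} df, I^2 = {i2:.0f}%")
# Rough convention (Higgins & Thompson): I^2 near 25% low, 50% moderate, 75% high.
```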
48 Summary
49 RCT Observational Systematic review of RCTs
50 PRISMA Checklist
51 PRISMA Checklist
52 PRISMA Checklist
53 References
IOM (Institute of Medicine). 2011. Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: The National Academies Press.
IOM (Institute of Medicine). 2011. Clinical Practice Guidelines We Can Trust. Washington, DC: The National Academies Press.
Cook DJ, Mulrow CD, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Ann Intern Med. 1997 Mar 1;126(5):376-80.