Presentation transcript:

1 IOM Standards for Systematic Reviews: Finding and Assessing Individual Studies
Eduardo Ortiz, M.D., M.P.H.
National Heart, Lung, and Blood Institute
National Institutes of Health
May 10, 2011

2 Acknowledgements
Colleagues at the Division for the Application of Research Discoveries at NHLBI:
- Denise Simons-Morton, MD, PhD (Division Director)
- Glen Bennett, MS
- Janet de Jesus, MS, RD
- Karen Donato, SM, RD
- Rob Fulwood, PhD, MSPH
- Edward Donnell Ivy, MD
- Chi Onyewu, MD, PhD
- Susan Shero, MS, RN
- Joylene John-Sowah, MD
- Sid Smith, MD
- Zhi-Jie Zheng, MD, PhD

3 NHLBI Cardiovascular Clinical Guidelines
Joint National Committee on Prevention, Detection, Evaluation, & Treatment of High Blood Pressure
- JNC 7: 2003
- JNC 6: 1997
- JNC 5: 1992
- JNC 4: 1988
- JNC 3: 1984
- JNC 2: 1980
- JNC 1: 1976
Detection, Evaluation, & Treatment of High Blood Cholesterol in Adults (ATP - Adult Treatment Panel)
- ATP III Update: 2004
- ATP III: 2002
- ATP II: 1993
- ATP I: 1988
Clinical Guidelines on the Identification, Evaluation, & Treatment of Overweight and Obesity in Adults
- Obesity 1:

4 Other NHLBI Clinical Guidelines
Guidelines for the Diagnosis and Management of Asthma (EPR-3)
- EPR-3: 2007
- EPR-2 Update: 2002
- EPR-2: 1997
- EPR-1: 1991
The Diagnosis, Evaluation, and Management of Von Willebrand Disease: 2006
Management of Sickle Cell Disease

5 Current Guideline Efforts
- High Blood Pressure
- Cholesterol
- Overweight/Obesity
- Adult Cardiovascular Risk Reduction
- Crosscutting Work Groups to support our Adult CVD Guideline efforts: Lifestyle, Risk Assessment, Implementation
- Pediatric Cardiovascular Risk Reduction
- Sickle Cell Disease

6 Guidelines Development Process (flow diagram)
Topic Area Identification → Expert Panel Selection → Analytic Models and Critical Questions → Literature Search → Screening → Data Abstraction → Study Quality Grading → Create Evidence Tables; Grade Body of Evidence → Evidence Statements and Recommendations → External Review with Revisions as Needed → Dissemination, Implementation, Evaluation

7 Evidence-Based Systematic Review Process
Step 1 – Develop analytic framework and critical questions
Step 2 – Establish inclusion & exclusion criteria
Step 3 – Screen titles, abstracts, and full text to identify relevant studies for inclusion
Step 4 – Rate the quality of each individual study
Step 5 – Abstract data for studies rated good or fair (see the sketch after this list)
Step 6 – Create evidence tables
Step 7 – Create summary tables
Step 8 – Create narrative summary
Step 9 – Rate the quality of the overall body of evidence for each critical question
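The ordering of Steps 3–5 matters: only studies that pass screening and receive a good or fair quality rating move on to data abstraction. Below is a minimal Python sketch of that filtering logic; the names (Study, studies_to_abstract) and the example records are purely illustrative, not part of any NHLBI tooling.

```python
from dataclasses import dataclass

@dataclass
class Study:
    """One candidate study identified by the literature search."""
    study_id: str
    included: bool   # Step 3: did it pass screening against the I/E criteria?
    quality: str     # Step 4: "good", "fair", or "poor"

def studies_to_abstract(studies):
    """Step 5: abstract data only for included studies rated good or fair."""
    return [s for s in studies if s.included and s.quality in ("good", "fair")]

# Example: the "poor" study and the excluded study never reach abstraction.
candidates = [
    Study("A", included=True, quality="good"),
    Study("B", included=True, quality="poor"),
    Study("C", included=False, quality="good"),
]
print([s.study_id for s in studies_to_abstract(candidates)])  # ['A']
```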

8 Developing the Recommendations
Step 10 – Review the evidence
Step 11 – Develop evidence statements, including rating the quality of evidence for each statement
Step 12 – Develop recommendations, including grading the recommendations and making sure they are supported by the evidence
Step 13 – GLIA assessment of recommendations
Step 14 – Public comment period, with invitations for review
Step 15 – Review comments and revise recommendations
Step 16 – Final recommendations
Step 17 – Dissemination, implementation, and evaluation

9 Developing the Critical Questions
- Critical questions are developed by the expert panels working collaboratively with the methodology team and NHLBI leads
- PICOTS format (a hypothetical example is sketched after this list)
- Predefined inclusion and exclusion criteria
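As a rough illustration of what a PICOTS-structured critical question with predefined inclusion/exclusion criteria might look like when captured as data, here is a hedged Python sketch; the field names and example values are hypothetical and not drawn from the NHLBI panels.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalQuestion:
    """A critical question expressed in PICOTS format, plus its I/E criteria."""
    population: str    # P
    intervention: str  # I
    comparator: str    # C
    outcomes: list     # O
    timing: str        # T
    setting: str       # S
    inclusion_criteria: list = field(default_factory=list)
    exclusion_criteria: list = field(default_factory=list)

# Hypothetical example, for illustration only.
cq = CriticalQuestion(
    population="adults with elevated blood pressure",
    intervention="antihypertensive drug therapy",
    comparator="placebo or usual care",
    outcomes=["cardiovascular events", "all-cause mortality"],
    timing="follow-up of at least 1 year",
    setting="outpatient care",
    inclusion_criteria=["randomized controlled trial", "adequate sample size"],
    exclusion_criteria=["pediatric populations"],
)
```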

10 Screening
- Each study is screened by two independent reviewers using the I/E criteria
- Review titles and abstracts, followed by full text
- If they disagree, they discuss and try to reach consensus
- If they do not achieve consensus or need additional input, there is 3rd-party adjudication (the decision flow is sketched below)
- Panel members can appeal a decision: it is re-assessed and adjudicated by a 3rd party, but the panel cannot override a decision made by the reviewers
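A minimal sketch of the dual-review decision flow described above, assuming hypothetical helper names (screen_study and the adjudicator callable are illustrative only):

```python
def screen_study(reviewer_a: bool, reviewer_b: bool, adjudicator=None) -> bool:
    """Return the inclusion decision for one study.

    Two independent reviewers apply the I/E criteria; agreement settles the
    study, and any unresolved disagreement goes to third-party adjudication.
    """
    if reviewer_a == reviewer_b:
        return reviewer_a
    # In practice the reviewers first discuss and try to reach consensus;
    # the sketch jumps straight to adjudication to stay short.
    if adjudicator is None:
        raise ValueError("Disagreement requires a third-party adjudicator")
    return adjudicator()

# Example: the reviewers disagree, so the adjudicator's call stands.
decision = screen_study(True, False, adjudicator=lambda: True)
print(decision)  # True
```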

11 Lessons Learned and Challenges
Issues:
- Sometimes you get the prespecified I/E criteria wrong and have to be practical and make post-hoc adjustments
- Despite your best efforts, it is sometimes difficult to figure out whether a study should be included or excluded

12 Rating the Evidence
- The quality of each included study is rated by two trained independent reviewers at the time of data abstraction
- No satisfactory tools were available for assessing study quality, so we developed our own, covering controlled intervention studies, cohort studies, case-control studies, and systematic reviews/meta-analyses (a sketch follows below)
- Despite your best efforts, it is sometimes difficult to determine the quality rating of an individual study
- Panel members can appeal a decision: it is re-assessed and adjudicated by a 3rd party, but the panel cannot override a decision made by the reviewers
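The sketch below shows one way the design-specific quality tools and dual-reviewer ratings could be wired together in code. The four study designs and the good/fair/poor scale come from the talk; the function names, dispatch table, and structure are assumptions made for illustration.

```python
# Hypothetical mapping from study design to the design-specific quality
# checklist a reviewer would apply; checklist contents are not shown here.
QUALITY_TOOLS = {
    "controlled_intervention": "controlled intervention study checklist",
    "cohort": "cohort study checklist",
    "case_control": "case-control study checklist",
    "systematic_review": "systematic review / meta-analysis checklist",
}

def rate_study(design: str, rating_a: str, rating_b: str, adjudicate=None) -> str:
    """Combine two independent reviewers' ratings ('good', 'fair', 'poor') for
    a study of the given design, deferring to adjudication on disagreement."""
    if design not in QUALITY_TOOLS:
        raise ValueError(f"No quality tool defined for design: {design}")
    if rating_a == rating_b:
        return rating_a
    if adjudicate is None:
        raise ValueError("Disagreement requires third-party adjudication")
    return adjudicate(rating_a, rating_b)

# Example: a cohort study where the reviewers disagree.
print(rate_study("cohort", "good", "fair", adjudicate=lambda a, b: "fair"))  # fair
```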

13 Lessons Learned and Challenges
- We used one abstractor and a reviewer to check the abstracted data for accuracy and completeness (a sketch of such a check follows below)
- Consistent with the Buscemi study, we experienced a substantial number of errors
  - Dependent on the individual abstractor and reviewer
  - Takes a lot of time and can create a bottleneck
- Using two independent abstractors would not have been feasible
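One way to picture the single-abstractor-plus-reviewer check is a field-by-field comparison that flags discrepancies for correction. This sketch and its field names are hypothetical, not the actual NHLBI forms.

```python
def find_discrepancies(abstracted: dict, reviewed: dict) -> dict:
    """Compare the abstractor's data against the reviewer's check, field by
    field, and return the fields where the two disagree (possible errors)."""
    fields = set(abstracted) | set(reviewed)
    return {
        f: (abstracted.get(f), reviewed.get(f))
        for f in fields
        if abstracted.get(f) != reviewed.get(f)
    }

# Hypothetical example: the sample size was mis-keyed by the abstractor.
abstracted = {"n_randomized": 412, "follow_up_months": 24}
reviewed = {"n_randomized": 421, "follow_up_months": 24}
print(find_discrepancies(abstracted, reviewed))  # {'n_randomized': (412, 421)}
```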

14 Rating the Evidence
- The overall body of evidence is rated for each critical question using a standardized rating instrument
- We reviewed GRADE, USPSTF, ACC-AHA, and many other systems
- We use a hybrid model similar to USPSTF, with three levels: High, Moderate, Low
- We need a better, standardized, user-friendly approach to rating evidence and grading the strength of recommendations

15 Lessons Learned and Challenges
- For guidelines, you need to answer many questions
- Conducting multiple systematic reviews to answer these questions is very challenging and requires a lot of time, effort, expertise, manpower, and money
- How do we develop high-quality, credible systematic reviews to support our guidelines that can be completed and updated in a timely manner? Is there a sweet spot to aim for between evidence-based rigor and practicality?

16 Lessons Learned and Challenges
- Standard 3.2 – Take action to address potentially biased reporting of research results
  - Grey literature and other sources of unpublished studies
  - Contacting researchers and asking them to clarify study-related information
  - Asking study sponsors to submit unpublished data
  - Conference abstracts
  - Studies not published in English
- It takes a lot of time and effort to search the published literature and conduct all the other steps in the SR process, even without searching all these additional resources
- Some of these issues will hopefully be addressed by investigators and publishers, as it is not realistic to expect most groups developing SRs to be able to do all this

17 Lessons Learned and Challenges
- Updating searches during and after the review process
  - Very important, but once again practical considerations come into play
  - A considerable amount of time can elapse between the initial search and completion of the review
  - If you have to go back and update the material, it is not just the search that has to be repeated but all the other steps in the EB review process, which takes more time and can lead to a vicious cycle
  - How do we deal with this from a practical perspective, yet maintain a reasonable time frame and budget?

18 Lessons Learned and Challenges
- Lots of personnel, time, effort, and cost to screen studies for inclusion/exclusion, assess study quality, adjudicate decisions, abstract data, create evidence tables and summaries, etc.
- If you don't have all the needed expertise in-house, contracting out the work can be challenging: high costs, plus coordination and decision-making across organizations and individuals
- Getting screeners, reviewers, and abstractors who have enough methodological expertise and clinical knowledge to understand important contextual issues and other nuances can be a challenge, especially when conducting multiple SRs
- Variability in the quality of the reviewers and methodologists: reviewers and methodologists differ in their knowledge, perspectives, biases, attention to detail, etc., so the quality of reviews can vary depending on the individuals

19 Final Comments
- We support the IOM recommendations; they will hopefully improve the quality and consistency of systematic reviews
- The report is comprehensive and represents an ideal process, but it would have been helpful to provide more practical or "at a minimum" recommendations to factor in the real-world limitations facing most organizations
- We would like a stronger linkage between the SR and CPG reports, including more practical recommendations to assist those of us who conduct SRs for the purpose of developing guidelines

20 Thank you!
Eduardo Ortiz, M.D., M.P.H.
NHLBI Guideline Information: