Workshop on VHL and HEN, Sao Paulo, 10-11 April 2006
HEN Methodology: Step by step



Reviewing evidence
- Systematic reviews are increasingly common (Cochrane, Campbell Collaboration, DARE, HTA…)
- They are often done by experts
- They can be slow (6 months plus)
- They are expensive: USD 130,000 on average
- They probably need 2+ people per review
- We don't have the time! We don't have the resources!

Types of methods to produce evidence

The starting point for HEN is the policy makers' question
Policy makers' question → synthesis of evidence → 10-page report & 1-page summary

[Process flowchart] Starting from an information need or Member States' evidence needs, a policy question is posed. HEN and the WHO technical units (TUs) refine the question and identify authors (with possible rejection at this stage). The author drafts the report, which passes through initial review (internal/external reviewers and TUs), peer review, draft revision and HEN quality control (each step with possible rejection), before the author prepares the final synthesis, which is copy-edited and disseminated by HEN communication. In parallel, immediate answers and resources are produced by identifying sources, describing and mapping documents/databases, and selecting and editing resources, supported by a HEN consultant and a HEN freelance editor; syntheses are periodically updated by the authors, with questions re-phrased as needed.

Collection of questions
A) Proactive
- Call for topics once a year:
  - Ministries of health (using WHO channels)
  - Technical units of WHO
  - Network members
- Systematic review of the work of HEN members
B) Reactive
- HEN electronic mailbox
- Phone questions from policy makers
- Questions or policy concerns identified by WHO technical units
- Issues raised at regional committees and ministerial conferences

Prioritization of the policy concerns
- Highest priority: questions from ministries of health
- Priority areas of WHO Europe:
  - health systems
  - mental health
  - child health
  - environment
  - HIV/AIDS
  - nutrition
  - non-communicable diseases
  - ageing
  - poverty

Selection criteria
- Availability of evidence (after preliminary literature searches and discussions with technical staff)
- Feasibility (is it practically possible to produce a synthesis report within a reasonable time and budget frame?)
- Relevance for the audience
- Coverage (whether the proposed public health questions are of interest to a number of Member States or only one)
- Timeliness (how long it will take to produce the answer)

Selection of authors
- Proven ability to undertake a systematic review (participation in systematic reviews conducted by the Cochrane Collaboration or HTA agencies)
- Proven international record of publication in the field of public health (international scientific papers indexed in Medline/PubMed or another scientific bibliographic database)
- Proven record of communication with policy makers (indicated by the topics of the expert's publications, conference presentations or teaching areas; checked via Medline and CV)
- Availability to produce the paper within the given timetable

Standard structure for synthesis reports
The report:
- Introduction
- Findings from health-related research
- Other knowledge
- Current debate
- Discussion
- Conclusions
- References
The summary:
- Issue
- Findings
- Policy considerations
- Type of evidence

Quality control
1) The author has conducted a proper, systematic search and is transparent about the search strategy
2) Inclusion/exclusion criteria and the methods of analysis are described properly
3) The list of references is complete, adding those relevant to the paper (and to our Member States)
4) Findings/results are reported critically but objectively
5) The style is suitable for policy makers (not too clinical or technical)
6) The paper and summary follow the standard format
7) Policy considerations are clear, based on the findings, and provide concrete support for decision-making or action

Grading of evidence and strength of policy options
- Strong evidence: consistent findings in two or more scientific studies of high quality
- Moderate evidence: consistent findings in two or more scientific studies of acceptable quality
- Limited evidence: only one study available, or inconsistent findings in several studies
- No evidence: no study of acceptable scientific quality available
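The grading scheme above is a simple decision rule. As an illustrative sketch only (the function name and inputs are assumptions, not an official HEN tool), it can be expressed in code:

```python
def grade_evidence(n_high_quality, n_acceptable, consistent):
    """Illustrative sketch of the HEN grading rule, not an official instrument.

    n_high_quality: number of high-quality studies with relevant findings
    n_acceptable:   number of acceptable-quality studies (including the
                    high-quality ones)
    consistent:     whether the findings agree across the studies
    """
    if n_acceptable == 0:
        # No study of acceptable scientific quality available
        return "no evidence"
    if n_acceptable == 1 or not consistent:
        # Only one study, or inconsistent findings across several studies
        return "limited evidence"
    if n_high_quality >= 2:
        # Consistent findings in two or more high-quality studies
        return "strong evidence"
    # Consistent findings in two or more acceptable-quality studies
    return "moderate evidence"
```

For example, three consistent high-quality studies grade as "strong evidence", while two acceptable but inconsistent studies grade as "limited evidence".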

Methodological appropriateness
Different types of questions are best answered by different types of study (Muir Gray, 1997; example on social interventions in children).

(Randomised) controlled trials
- Was the assignment to the treatment groups random?
- Was relatively complete follow-up achieved?
- Were the outcomes of people who withdrew described and included in the analysis?
- Were the control and intervention groups similar at the start of the study?
- Were the groups treated identically (other than the intervention(s) of interest)?
- How big is the study? How big is the effect?
- Do the numbers add up?
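Appraisal questions like the ones above are naturally handled as a checklist. A minimal sketch (the names and the yes-count scoring are illustrative assumptions, not a validated appraisal instrument):

```python
# Appraisal questions for (randomised) controlled trials, taken from the slide
RCT_CHECKLIST = [
    "Was the assignment to the treatment groups random?",
    "Was relatively complete follow-up achieved?",
    "Were the outcomes of people who withdrew described and included?",
    "Were the control and intervention groups similar at the start?",
    "Were the groups treated identically apart from the intervention?",
]

def appraise(answers):
    """Count 'yes' answers for a study; answers maps question -> bool.

    Returns (number of criteria met, total number of criteria). A missing
    answer is treated as 'no'. Purely illustrative scoring.
    """
    score = sum(answers.get(question, False) for question in RCT_CHECKLIST)
    return score, len(RCT_CHECKLIST)
```

A study answering "yes" to every question would score `(5, 5)`; an empty answer sheet scores `(0, 5)`. The same pattern works for the cohort-study and survey checklists on the following slides.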

Cohort studies (which follow cohorts of people over time)
- Is the sample representative?
- What else happened? (What factors other than the intervention may have affected the outcome, and were the cohorts being compared comparable on these important confounding factors?)
- Are the outcomes meaningful?
- Was there adequate statistical adjustment, or matching, for the effects of these confounding variables?
- How big is the study? How big is the effect?
- Do the numbers add up?

Surveys (cross-sectional: one point in time)
- Is the study based on a representative (random) sample?
- Was follow-up long enough for important events to occur?
- Are the measures meaningful?
- Do we know how people got into the survey?
- Do we know how many were surveyed and how many refused?
- How big is the study? (Big surveys are not necessarily better)
- Who dropped out?
Surveys: gold mines for data-dredgers.

Appraisal of qualitative studies
- How credible are the findings?
- How clear is the basis for evaluation?
- How defensible is the research design?
- How well was the data collection carried out?
- How well has diversity of perspective and content been explored?
- How adequately has the research process been documented?
- Etc.

Assessing the validity of literature reviews
Look for sources of bias. The main ones are:
- A poorly defined question
- A limited search for literature
- Poorly defined inclusion/exclusion criteria
- Lack of assessment of the validity of the included studies
- Lack of investigation of heterogeneity
- Inappropriate pooling of studies

The "systematic-ish" review
- Looks a bit systematic...
- Includes details of some databases that were searched...
- Uses the right jargon...
- Doesn't appraise the included studies...
- The conclusions are not consistent with the results of the studies...

Reading other people's reviews
- Are they answering the right question?
- Are they using the right evidence to answer that question?
- Is it likely that they missed relevant evidence?
- Do you suspect that they are using evidence selectively?
- Have they paid attention to the quality of that evidence (e.g. its methodological soundness and relevance)?
- Who paid for the review? Are these, or other, sources of funding likely to introduce bias?

How to be more evidence-based: presentation is everything
- Be clear about what questions you are answering
- Be clear about what sort of evidence you believe is admissible, and why
- Be clear about what you have included and excluded
- Reference well-conducted research in preference to opinion pieces, editorials, general reviews, and general WHO or World Bank commentaries
- Reference relevantly
- Show them (1) the methods and (2) the evidence

Always check the evidence, whatever the source
"Those who can make you believe absurdities... can make you commit atrocities" (Voltaire)