Paper 1: How to increase value and reduce waste when research priorities are set Iain Chalmers Coordinator, James Lind Initiative Launch of Lancet Series on Waste London, 8 January 2014

Iain Chalmers, health services researcher
Michael B Bracken, epidemiologist
Benjamin Djulbegovic, oncologist, methodologist
Silvio Garattini, clinical pharmacologist
Jonathan Grant, science policy analyst
Metin Gulmezoglu, clinical trialist
David Howells, preclinical animal researcher
John PA Ioannidis, methodologist, bibliometrician
Sandy Oliver, social scientist

Issue 1

Health dividends from basic research
Comroe & Dripps claimed that 62 per cent of all articles judged essential for later clinical advances resulted from basic research. The rigour and objectivity of the C&D analysis were challenged (eg by Richard Smith, 1987). An attempted replication of the C&D analysis found that it was 'not repeatable, reliable or valid', and that only between 2 and 21 per cent of the research underpinning clinical advances could be described as basic (Jonathan Grant et al. 2003).

Recent bibliometric analyses (1)

Recent bibliometric analyses (2)

UK concern about inadequate capacity for testing findings from basic research in applied research
From Chalmers I, Rounding C, Lock K. Descriptive survey of non-commercial randomised trials in the United Kingdom. BMJ 2003;327:

Public/charitable funding of medical research, by investment category, 2004/5 and 2009/10 (UK Clinical Research Collaboration, 2012)
Type of research (categories included), with funding shown for 2004/5 and 2009/10:
- Pure basic (aetiology and underpinning)
- Pure applied (prevention, detection & diagnosis, treatment evaluation, disease management, health services)
- Use-led basic (development of detection, diagnosis and treatment)
[Funding figures from the original table are not reproduced here]

Recommendation 1

Issue 2

Recommendation 2

Low priority questions addressed in research on treatments for osteoarthritis of the knee
Tallon, Chard and Dieppe. Lancet, 2000.

Interventions mentioned in research priorities identified by James Lind Alliance patient-clinician Priority Setting Partnerships, and among registered trials (Chalmers et al. 2013)

Waste resulting from ignoring outcomes of importance to patients
The priority treatment outcome from a survey of patients with rheumatoid arthritis was not pain. It was fatigue.

Issue 3

Recommendation 3

Recommendation 4

STUDIES IN ANIMALS
20 animal studies: "The results of this review did not show convincing evidence to substantiate the decision to perform trials with nimodipine in large numbers of patients." Stroke 2001;32:

STUDIES IN HUMANS
Horn J, Limburg M. Calcium antagonists for acute ischemic stroke. The Cochrane Database of Systematic Reviews, 2000. "46 trials were identified of which 28 were included (7521 patients). No effect of calcium antagonists on poor outcome at the end of follow-up (OR 1.07, 95% CI 0.97 to 1.18), or on death at end of follow-up (OR 1.10, 95% CI 0.98 to 1.24) was found."
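To show concretely what lies behind a pooled odds ratio and confidence interval like those above, here is a minimal fixed-effect (inverse-variance) meta-analysis sketch in Python. The 2x2 trial counts are invented for illustration; they are not the trials in the Horn & Limburg review.

```python
# Minimal fixed-effect meta-analysis of odds ratios (illustrative sketch).
# The trial data below are invented; they are NOT the trials included in
# the Horn & Limburg Cochrane review.
import math

# (events_treatment, n_treatment, events_control, n_control) per trial
trials = [
    (30, 200, 25, 200),
    (45, 300, 50, 310),
    (12, 150, 10, 145),
]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c                 # non-events in each arm
    log_or = math.log((a * d) / (b * c))  # log odds ratio for this trial
    var = 1/a + 1/b + 1/c + 1/d           # approximate variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"Pooled OR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.2f} to {math.exp(pooled + 1.96*se):.2f})")
```

The same weighting logic underlies the Cochrane figures quoted above; real reviews add continuity corrections, heterogeneity assessment and random-effects models on top of it.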

"systematic reviews and meta-analyses [are needed] to evaluate more fully the predictability and transferability of animal models." (2005)

Increase in the proportion of meta-analyses (human and animal) indexed in PubMed (Bracken 2012)
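For readers who want to reproduce this kind of trend themselves, the sketch below counts records tagged with the "Meta-Analysis" publication type in PubMed via the NCBI E-utilities esearch endpoint. The query strategy and year range are illustrative assumptions, not Bracken's (2012) method; adding humans[mh] or animals[mh] filters would roughly separate the human and animal curves.

```python
# Rough sketch: count the yearly share of PubMed records tagged "Meta-Analysis"
# via the NCBI E-utilities esearch endpoint. The query strategy is an
# assumption for illustration, not Bracken's (2012) actual method.
import json
import time
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term):
    """Return the number of PubMed records matching a search term."""
    params = urllib.parse.urlencode({"db": "pubmed", "term": term, "retmode": "json"})
    with urllib.request.urlopen(f"{ESEARCH}?{params}") as resp:
        return int(json.load(resp)["esearchresult"]["count"])

for year in range(2000, 2012):
    total = pubmed_count(f"{year}[pdat]")
    meta = pubmed_count(f"meta-analysis[pt] AND {year}[pdat]")
    print(f"{year}: {meta}/{total} = {100 * meta / total:.2f}% meta-analyses")
    time.sleep(0.4)  # stay under NCBI's rate limit for unauthenticated requests
```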

Some illustrative examples of waste from redundant research

Redundant animal research (Sena et al.)

Redundant clinical research…

…leaving key questions unaddressed

Redundant epidemiological research: cumulative odds ratios for front versus non-front sleeping position in sudden infant deaths versus controls (Gilbert et al.)

Consequences of failure to analyse epidemiological research cumulatively
"Systematic review of preventable risk factors for SIDS from 1970 would have led to earlier recognition of the risks of sleeping on the front and might have prevented over 10 000 infant deaths in the UK and at least 50 000 in Europe, the USA and Australasia."
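The remedy this slide points to, cumulative meta-analysis, can be sketched in a few lines: pool the studies in the order they were published and recompute the summary after each addition, so it becomes visible how early the evidence had effectively settled. The study-level log odds ratios and standard errors below are invented for illustration; they are not the SIDS data analysed by Gilbert et al.

```python
# Cumulative fixed-effect meta-analysis sketch: pool studies in publication
# order and recompute the summary after each one is added.
# Study estimates below are invented (log odds ratio, standard error);
# they are NOT the SIDS data analysed by Gilbert et al.
import math

studies = [  # (year, log_odds_ratio, standard_error)
    (1970, 1.10, 0.55),
    (1974, 0.85, 0.40),
    (1978, 1.30, 0.50),
    (1986, 0.95, 0.30),
    (1990, 1.05, 0.25),
]

weights_sum = 0.0
weighted_sum = 0.0
for year, log_or, se in studies:
    w = 1 / se**2                      # inverse-variance weight
    weights_sum += w
    weighted_sum += w * log_or
    pooled = weighted_sum / weights_sum
    pooled_se = math.sqrt(1 / weights_sum)
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"up to {year}: OR {math.exp(pooled):.2f} "
          f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```

Run on real study-level data, output like this shows the point at which the confidence interval excludes no effect, often years before practice or advice changed.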

What are research regulators, research funders and academia in the UK doing to reduce this sometimes lethal waste?

Research ethics committees/IRBs

Inappropriate continued use of placebo controls in clinical trials assessing the effects on death of antibiotic prophylaxis for colorectal surgery

Department of Health 2001
"It is essential that existing sources of evidence, especially systematic reviews, are considered carefully prior to undertaking research… Research which duplicates other work unnecessarily, or which is not of sufficient quality to contribute something useful to existing knowledge, is in itself unethical."

MS Society supports systematic reviews

The Wellcome Trust

Science is cumulative, and scientists should cumulate scientifically. Scientific cumulation entails using (i) methods to reduce systematic errors (biases) and, where appropriate and possible, (ii) meta-analysis to reduce random errors (the play of chance).
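To make the "reduce random errors" point concrete, here is the standard fixed-effect, inverse-variance argument (a textbook sketch, not a formula from the Lancet paper). Each study estimate has a standard error, and the pooled estimate weights studies by the inverse of their variance:

```latex
\hat{\theta} = \frac{\sum_i w_i \,\hat{\theta}_i}{\sum_i w_i},
\qquad
w_i = \frac{1}{\mathrm{SE}_i^{2}},
\qquad
\mathrm{SE}\!\left(\hat{\theta}\right) = \frac{1}{\sqrt{\sum_i w_i}} \le \min_i \mathrm{SE}_i
```

Because the weights are positive, every additional unbiased study can only tighten the pooled confidence interval, which is exactly what cumulating scientifically buys.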

Patients have suffered and died unnecessarily and resources for research have been wasted because the research community has failed to review existing evidence systematically when planning new research. Why should patients and the public trust us if we and our professional institutions fail to make systematic, efficient use of the results of the research that the public has funded?

Alessandro Liberati