USE OF EVIDENCE IN DECISION MODELS: An appraisal of health technology assessments in the UK
Nicola Cooper
Centre for Biostatistics & Genetic Epidemiology, Department of Health Sciences, University of Leicester, U.K.
http://www.hs.le.ac.uk/personal/njc21/
Acknowledgements to: Doug Coyle, Keith Abrams, Miranda Mugford & Alex Sutton

OUTLINE
- Background to empirical research
- Methods & findings from study
- Conclusions

BACKGROUND
- Decision models are increasingly developed to inform complex clinical and economic decisions (e.g. NICE technology appraisals).
- Decision models provide:
  - an explicit, quantitative and systematic approach to decision making
  - a comparison of at least two alternatives
  - a useful way of synthesising evidence from multiple sources (e.g. effectiveness data from trials, adverse event rates from observational studies, etc.)

BACKGROUND
Decision modelling techniques are commonly used for:
i) extrapolation of primary data beyond the endpoint of a trial,
ii) indirect comparisons when no 'head-to-head' trials exist,
iii) investigation of how the cost-effectiveness of clinical strategies/interventions changes with the values of key parameters,
iv) linking intermediate endpoints to ultimate measures of health gain (e.g. QALYs),
v) incorporation of country-specific data relating to disease history and management.
A minimal illustrative sketch of (i) and (iv) follows.
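To make (i) and (iv) concrete, below is a minimal, purely illustrative Python sketch (not taken from any of the HTAs reviewed here) of a three-state Markov cohort model that extrapolates beyond a trial endpoint and links state occupancy to QALYs and costs. The run_markov helper, all transition probabilities, costs, utilities and the discount rate are invented assumptions.

```python
# Illustrative sketch only: a three-state (stable / progressed / dead)
# Markov cohort model. Every number below is an assumption for illustration.
import numpy as np

def run_markov(p_progress, treat_cost=0.0, p_die_stable=0.02,
               p_die_progressed=0.10, cycles=40, discount=0.035):
    """Discounted lifetime cost and QALYs per patient (illustrative only)."""
    # Annual transition matrix: rows = from-state, columns = to-state.
    P = np.array([
        [1 - p_progress - p_die_stable, p_progress, p_die_stable],
        [0.0, 1 - p_die_progressed, p_die_progressed],
        [0.0, 0.0, 1.0],
    ])
    state_cost = np.array([1000.0 + treat_cost, 3000.0, 0.0])  # cost per cycle
    state_utility = np.array([0.85, 0.50, 0.0])                # QALY weights
    occupancy = np.array([1.0, 0.0, 0.0])   # whole cohort starts 'stable'
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        disc = 1.0 / (1.0 + discount) ** t
        total_cost += disc * occupancy @ state_cost
        total_qaly += disc * occupancy @ state_utility
        occupancy = occupancy @ P           # extrapolate one more cycle
    return total_cost, total_qaly

# Standard care vs a new treatment that slows progression but adds a cost.
cost_std, qaly_std = run_markov(p_progress=0.20)
cost_new, qaly_new = run_markov(p_progress=0.12, treat_cost=4000.0)
icer = (cost_new - cost_std) / (qaly_new - qaly_std)
print(f"Incremental cost per QALY gained: {icer:,.0f}")
```

Varying p_progress or the cost and utility inputs in such a sketch is also how technique (iii), exploring how cost-effectiveness changes with key parameters, would be carried out.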

BACKGROUND
- Decision models contain many unknown parameters, and the evidence may include published data, controlled trial data, observational study data, or expert knowledge: the available evidence needs to be utilised and synthesised.
- Model parameters can include clinical effectiveness, costs, disease progression rates, and utilities.
- Evidence-based models require systematic methods for the identification and synthesis of evidence to estimate model parameters with appropriate levels of uncertainty.
- Selecting only the "best" (most relevant) evidence potentially ignores valuable information from other sources.

ECONOMIC DECISION MODEL
[Diagram: how evidence flows into the decision model]
DATA SOURCES (RCT1, RCT2, RCT3, OBS1, OBS2, ROUTINE, EXPERT)
  -> EVIDENCE SYNTHESIS (meta-analysis, generalised synthesis, opinion pooling, Bayes theorem, in combination)
  -> MODEL INPUTS (clinical effect, adverse events, utility, cost)
  -> DECISION MODEL
A minimal sketch of the synthesis step follows.
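As one concrete example of the "evidence synthesis" step in the diagram, the following sketch pools study-level effect estimates with a DerSimonian-Laird random-effects meta-analysis. It is an assumed, illustrative calculation: the log odds ratios and standard errors are invented, and in practice the synthesis might instead use generalised synthesis, opinion pooling, or a fully Bayesian model as listed above.

```python
# Illustrative DerSimonian-Laird random-effects meta-analysis.
# Study estimates (e.g. from RCTs plus observational studies) are invented.
import numpy as np

log_or = np.array([-0.45, -0.30, -0.60, -0.20])   # study effect estimates
se     = np.array([ 0.20,  0.25,  0.30,  0.15])   # their standard errors

w_fixed = 1.0 / se**2                              # inverse-variance weights
mu_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)

# Between-study heterogeneity (DerSimonian-Laird moment estimator).
Q = np.sum(w_fixed * (log_or - mu_fixed) ** 2)
df = len(log_or) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

w_rand = 1.0 / (se**2 + tau2)                      # random-effects weights
mu_rand = np.sum(w_rand * log_or) / np.sum(w_rand)
se_rand = np.sqrt(1.0 / np.sum(w_rand))

print(f"Pooled log OR = {mu_rand:.3f} (SE {se_rand:.3f}), tau^2 = {tau2:.3f}")
# The pooled estimate and its SE would then feed the 'clinical effect'
# input of the decision model, with its uncertainty carried forward.
```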

MRC FELLOWSHIP
The use of evidence synthesis & uncertainty modelling in economic evidence-based health-related decision models.
Part 1) To review and critique the use of evidence in decision models developed as part of health technology assessments to date.
Part 2) To develop practical solutions for synthesising evidence, with appropriate uncertainty, to inform model inputs: for example, combining evidence reported in different formats (e.g. mean and median) and from different sources (e.g. RCT, cohort, registry, etc.).

NICE GUIDANCE
The NICE methods guidance for health technology appraisal (2004) states that:
- 'all relevant evidence must be identified'
- 'evidence must be identified, quality assessed and, where appropriate, pooled using explicit criteria and justifiable and reproducible methods'
- there must be 'explicit criteria by which studies are included or excluded'

USE OF EVIDENCE IN HTA DECISION MODELS (Cooper et al., in press)
OBJECTIVE: To review the sources and quality of evidence used in the development of economic decision models in health technology assessments in the UK.
METHODOLOGY: The review included all economic decision models developed as part of the NHS Research & Development Health Technology Assessment (HTA) Programme between 1997 and 2003 inclusive. Quality of evidence was assessed using a hierarchy of data sources developed for economic analyses (Coyle & Lee 2002) and good practice guidelines for decision-analytic modelling (Philips et al. 2004).

GOOD PRACTICE CRITERIA FOR DECISION MODELS (Philips et al. 2004)
- Statement of perspective (e.g. healthcare, societal, etc.)
- Description of strategies/comparators
- Diagram of model/disease pathways
- Development of model structure and assumptions discussed
- Table of model input parameters presented
- Source of parameters clearly stated
- Model parameters expressed as distributions
- Discussion of model assumptions
- Sensitivity analysis performed
- Key drivers/influential parameters identified
- Evaluation of internal consistency undertaken
(A minimal sketch of expressing parameters as distributions follows.)
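To illustrate the "model parameters expressed as distributions" and "sensitivity analysis performed" criteria, here is a minimal probabilistic sketch. The Beta/Gamma choices, the toy net-benefit model and the willingness-to-pay threshold are assumptions for illustration only, not from the reviewed HTAs.

```python
# Illustrative probabilistic sensitivity analysis: inputs as distributions,
# propagated through a trivial net-benefit model by Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000

# Probabilities on (0,1) ~ Beta, costs (positive, skewed) ~ Gamma,
# utility gains ~ Beta; in practice each would be fitted to the evidence.
p_response   = rng.beta(45, 55, size=n)                    # ~45 per 100 respond
cost_treat   = rng.gamma(shape=25.0, scale=200.0, size=n)  # mean ~ 5,000
utility_gain = rng.beta(40, 60, size=n)                    # ~0.4 QALY if respond

wtp = 30_000  # assumed willingness-to-pay threshold per QALY
net_benefit = wtp * p_response * utility_gain - cost_treat
print(f"Mean incremental net benefit: {net_benefit.mean():,.0f}")
print(f"Probability cost-effective:   {(net_benefit > 0).mean():.2f}")
```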

HIERARCHY OF DATA SOURCES
A hierarchy of evidence lists potential sources of evidence for each data component of interest:
- Main clinical effectiveness
- Baseline clinical data
- Adverse events and complications
- Resource use
- Costs
- Utilities
Sources are ranked on an increasing scale from 1 to 6, with the most appropriate (best quality) source assigned a rank of 1. (An illustrative sketch of applying the hierarchy to one model follows.)
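An assumed, illustrative sketch of how such a hierarchy might be recorded for a single decision model; the component-to-source assignments and ranks below are invented and do not come from the review.

```python
# Illustrative record of the source used for each data component and its
# rank on a Coyle & Lee style hierarchy (1 = most appropriate, 6 = least).
model_inputs = {
    "main clinical effectiveness": {"source": "meta-analysis of RCTs",       "rank": 1},
    "baseline clinical data":      {"source": "national registry",           "rank": 2},
    "adverse events":              {"source": "single observational study",  "rank": 4},
    "resource use":                {"source": "expert opinion",              "rank": 6},
    "costs":                       {"source": "national unit costs",         "rank": 2},
    "utilities":                   {"source": "published EQ-5D values",      "rank": 3},
}

# Flag the weakest-supported inputs as candidates for further evidence work.
weakest = [name for name, v in model_inputs.items() if v["rank"] >= 4]
print("Components relying on lower-ranked evidence:", weakest)
```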

HIERARCHY OF DATA SOURCES
#Surrogate outcome = an endpoint measured in lieu of some other, so-called 'true', endpoint (including survival at the end of a clinical trial used as a predictor of lifetime survival).

FLOW DIAGRAM
180 HTA reports published 1997-2003
  -> 147 of 180 (82%) considered health economics
  -> 48 of 147 (33%) developed decision models
  -> of the 48: 42 (88%) economic evaluation models and 6 (13%) cost-analysis models
  -> of the 42 economic evaluation models: 26 (62%) decision trees#, 12 (29%) Markov models#, 5 (12%) individual sampling models#; 22 of the 42 were NICE appraisals
#One HTA reported both decision tree & Markov models, one reported both Markov & individual patient models, and one model type was unclear.

GOOD PRACTICE CRITERIA FOR DECISION MODELS (n=42)

RESULTS FROM APPLYING HIERARCHIES OF EVIDENCE (n=42 decision models)

[Chart legend: evidence ranks 1-6, Unclear, N/A; quality graded High / Medium / Low]

CONCLUSIONS
Evidence on the main clinical effect was mostly:
- identified and quality assessed (76%) as part of the companion systematic review for the HTA
- reported in a fairly transparent and reproducible way.
For all other model inputs (i.e. adverse events, baseline clinical data, resource use, and utilities):
- search strategies for identifying relevant evidence were rarely made explicit
- sources of specific evidence were not always reported.

CONCLUSIONS
Concerns about decision models confirmed by this study:
(1) Use of data from diverse sources (e.g. RCTs, observational studies, expert opinion), which may be subject to varying degrees of bias due to confounding variables, patient selection, or methods of analysis.
(2) Lack of transparency regarding the identification of model input data and the key assumptions underlying model structure and evaluation.
(3) Bias introduced by the researcher through the choice of model structure and the selection of parameter values input into the model.

CONCLUSIONS
- Hierarchies of evidence for different data components provide a useful tool for i) assessing the quality of evidence, ii) promoting transparency, and iii) identifying the weakest aspects of the model for future work.
- It is acknowledged that highly ranked evidence may not always be available for certain model parameters, but this needs to be made explicit (e.g. was expert opinion used because no other data were available?).
- The value of evidence input into a decision model, regardless of its position in the hierarchy, depends on its quality and relevance to the question of interest.
- QUANTITY vs. QUALITY (PRECISION vs. BIAS) - a worked illustration of this trade-off follows.
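The quantity-versus-quality point can be illustrated with the standard decomposition MSE = bias^2 + variance: pooling a precise but possibly biased source with an unbiased trial can either reduce or increase the overall error. The variances and bias values in the sketch below are invented.

```python
# Illustrative bias-variance trade-off when pooling two sources.
# RCT: unbiased but small; observational: large (precise) but possibly biased.
rct_var = 0.04    # variance of the RCT estimate (assumed unbiased)
obs_var = 0.005   # variance of the observational estimate

w_rct = (1 / rct_var) / (1 / rct_var + 1 / obs_var)   # precision weight
pooled_var = 1.0 / (1 / rct_var + 1 / obs_var)

for obs_bias in (0.10, 0.30):  # two hypothetical levels of bias
    pooled_bias = (1 - w_rct) * obs_bias
    pooled_mse = pooled_bias ** 2 + pooled_var        # MSE = bias^2 + variance
    print(f"obs bias {obs_bias:.2f}: RCT-only MSE {rct_var:.4f}, "
          f"pooled MSE {pooled_mse:.4f}")
```

With a small bias the pooled estimate has lower MSE than the trial alone; with a larger bias the extra quantity of evidence makes the answer worse, which is exactly the precision-versus-bias tension noted above.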

UNANSWERED QUESTIONS
- How best to identify the relevant evidence?
- How much evidence is sufficient, and when would there be benefit from identifying additional/supplementary evidence (possibly from lower levels of the hierarchy)?
- How to appropriately assess, and where possible adjust for, the quality of different types of evidence? Instruments exist for assessing quality within study designs, but assessment across different study designs is non-trivial (Downs & Black 1998).
- How to appropriately combine/synthesise evidence from different study types? For example: meta-analyse all data assuming equal weight, use observational data as a prior for RCT data (a minimal sketch of this option follows), or fit a hierarchical synthesis model.
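As an assumed, illustrative sketch of the second synthesis option listed above (observational data as a prior for RCT data), the following conjugate normal-normal update on the log odds ratio scale down-weights the observational evidence by a factor alpha; all numbers, including alpha, are invented.

```python
# Illustrative conjugate normal-normal update: observational evidence as a
# (down-weighted) prior for the RCT evidence on the log odds ratio scale.
import numpy as np

obs_mean, obs_se = -0.40, 0.10   # pooled observational log odds ratio
rct_mean, rct_se = -0.20, 0.18   # pooled RCT log odds ratio

alpha = 0.25                     # prior 'weight' (1 = take at face value)
prior_var = obs_se ** 2 / alpha  # inflate the variance to discount bias risk

post_var = 1.0 / (1.0 / prior_var + 1.0 / rct_se ** 2)
post_mean = post_var * (obs_mean / prior_var + rct_mean / rct_se ** 2)

print(f"Posterior log OR: {post_mean:.3f} (SD {np.sqrt(post_var):.3f})")
```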

REFERENCES
Cooper NJ, Coyle D, Abrams KR, Mugford M, Sutton AJ. Use of evidence in decision models: an appraisal of health technology assessments in the UK to date. Journal of Health Services Research and Policy (in press, 2005).
Coyle D, Lee KM. Evidence-based economic evaluation: how the use of different data sources can impact results. In: Donaldson C, Mugford M, Vale L (eds). Evidence-based health economics: from effectiveness to efficiency in systematic review. London: BMJ Publishing Group, 2002: 55-66.
Downs SH, Black N. The feasibility of creating a checklist for the assessment of the methodological quality both of randomised and non-randomised studies of health care interventions. Journal of Epidemiology and Community Health 1998;52:377-84.
Philips Z, Ginnelly L, Sculpher M et al. Review of guidelines for good practice in decision-analytic modelling in health technology assessment. Health Technology Assessment 2004;8(36).
National Institute for Clinical Excellence (NICE). Guide to the methods of technology appraisal. London: National Institute for Clinical Excellence, 2004.
Copy of slides available at: http://www.hs.le.ac.uk/personal/njc21/