Structural uncertainty from an economist’s perspective


Structural uncertainty from an economist’s perspective
Laura Bojke, Centre for Health Economics, University of York

I looked at the issue of structural uncertainty, specifically as it relates to decision analytic models, during my PhD. Here is a very brief overview of some of the thoughts that emerged from it.

Structure of the presentation

- Why structural uncertainty is a problem in decision modelling
- What structural uncertainty is
- Some examples
- Methods available to characterise structural uncertainty
- Outstanding issues/discussion points

I will go through the following points.

Uncertainty in decision analytic models

- Uncertainty is pervasive in any assessment of cost-effectiveness.
- We need to produce accurate estimates of cost-effectiveness and assess whether current evidence is a sufficient basis for an adoption decision.
- Much of the focus on uncertainty in decision analysis has been on parameter uncertainty.
- Other forms of model uncertainty exist, and these have received much less attention in the HTA literature. The issue of structural uncertainty in particular is under-researched.

I guess we all know why uncertainty is important in decision analytic models. Parameter uncertainty in particular is well researched, and the methods to characterise it are widely used. Structural uncertainty is under-researched, so I wanted to look at the issue in more detail for my PhD.
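The standard machinery referred to here (probabilistic sensitivity analysis and expected value of perfect information) can be sketched in a few lines. This is a minimal illustration with entirely hypothetical parameter distributions for a two-strategy comparison, not a model from the review: EVPI is the expected gain in net benefit from resolving all parameter uncertainty before deciding.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000
wtp = 20_000  # assumed willingness-to-pay threshold per QALY

# Hypothetical two-strategy model: draw uncertain parameters and
# compute net monetary benefit (NMB) for each strategy per draw.
eff_old = rng.normal(1.00, 0.10, n_sims)   # QALYs, comparator
eff_new = rng.normal(1.10, 0.10, n_sims)   # QALYs, new treatment
cost_old = rng.normal(8_000, 1_000, n_sims)
cost_new = rng.normal(12_000, 2_000, n_sims)

nmb = np.column_stack([wtp * eff_old - cost_old,
                       wtp * eff_new - cost_new])

# Probability each strategy is cost-effective at this threshold.
p_ce = np.bincount(nmb.argmax(axis=1), minlength=2) / n_sims

# EVPI: expected NMB choosing with perfect information, minus the
# expected NMB of the best decision under current uncertainty.
evpi = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()
print(p_ce, evpi)
```

A positive EVPI at the population level is one way of judging whether current evidence is a sufficient basis for adoption, which is the question posed above.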

What is structural uncertainty?

- Aside from parameter and methodological uncertainties, other sources of uncertainty include the various simplifications and scientific judgements that have to be made when constructing and interpreting a model of any sort.
- These have been classified in a number of different ways but can be referred to collectively as structural uncertainties.
- The term is used to describe uncertainty that does not fit into the other two categories.

Although definitions are of limited use, it is important to know what we mean when we use the term structural uncertainty. It is often used to describe uncertainties that are neither parameter nor methodological: a catch-all group.

Examples from a review of the HTA literature

Inclusion/exclusion of potentially relevant comparators
- The selection of comparators should be informed by current evidence or opinion.
- The choice of comparators is often governed by the scope of the model, and rarely are all possible comparisons made. This is often the case where unlicensed comparators exist.
- Even if the excluded comparators are not cost-effective, excluding them may change EVPI estimates.

Inclusion/exclusion of potentially relevant events
- The process of simplification will inevitably require certain assumptions to be made. These assumptions should be supported by evidence, and choices between alternative assumptions should be justified and made transparent.
- Events thought to be unrelated to treatment can have a noticeable impact on estimates of cost-effectiveness and EVPI.

I conducted a review of HTA models up to 2005 to look at examples of uncertainties that had been called structural or model uncertainties. There were around 12 examples, and they allowed me to classify the structural uncertainties seen in decision analytic models into four types.

Examples from a review of the HTA literature (2)

Statistical models to estimate specific parameters
- Decision models are using increasingly sophisticated statistical techniques to derive estimates of parameters. This increased complexity can introduce statistical uncertainties.
- There may be alternative survival models, each plausible given the data. Which model is best for survival beyond the observed period?

Clinical uncertainty or lack of clinical evidence
- A decision model may be commissioned precisely because there is a lack of clinical evidence to inform a decision.
- Even when RCT evidence is available, there may be an absence of evidence about key parameters such as treatment effect, baseline event rates, clinical pathways, interactions between model parameters, and clinical practice norms.
- Often scenarios are presented based on alternative but extreme assumptions that could be made.

Identifying current approaches to characterise structural uncertainty

- Undertook a review to find methods that explore the types of structural uncertainty apparent in decision analytic models.
- Focus on analytical methods rather than qualitative methods of synthesis.
- Very little has been published in the HTA literature; methods from statistics, mathematics and operational research are relevant.
- Three methods are available.

Now that I had an idea of which types of uncertainties were thought of as structural, I wanted to look at methods to characterise these uncertainties explicitly, focusing on quantitative methods rather than just a narrative synthesis.

Available methods

Scenario analysis
- Alternative assumptions presented as separate scenarios.
- Leaves multiple models to digest.

Model selection
- Rank alternative models according to some measure of predictive performance, goodness of fit or probability of error, and choose the model that maximises that particular criterion.
- In HTA decision modelling it is difficult to define a ‘gold standard’ for outcomes and costs.
- Where there are many competing objectives, it is often not possible to identify one particular parameter whose performance must be maximised by a fitted model.
- Often there is an absence of the data required to assess fit.
- It is not always advantageous to choose the best model: doing so discards the information in the other, alternative models.

Three methods are available.
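The survival-model example from the review slide makes the limits of model selection concrete. Below is a small sketch, with simulated data standing in for a trial: two candidate survival models are ranked by AIC, one common goodness-of-fit criterion, yet within-sample fit cannot arbitrate between their very different extrapolations beyond the observed period.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical uncensored survival times (years) from a trial with
# limited follow-up; the true data-generating process is unknown
# to the analyst (here, a Weibull with increasing hazard).
times = 3.0 * rng.weibull(1.5, 200)

# Fit two candidate survival models, each plausible given the data.
exp_fit = stats.expon.fit(times, floc=0)        # (loc, scale)
wei_fit = stats.weibull_min.fit(times, floc=0)  # (shape, loc, scale)

# Rank the candidates by AIC (lower is better).
aic = {
    "exponential": 2 * 1 - 2 * stats.expon.logpdf(times, *exp_fit).sum(),
    "weibull": 2 * 2 - 2 * stats.weibull_min.logpdf(times, *wei_fit).sum(),
}
best = min(aic, key=aic.get)

# The models agree reasonably well within the observed period, but
# extrapolated far beyond it their survival predictions diverge,
# and goodness of fit to the observed data cannot settle the choice.
t = 10.0  # years, well beyond follow-up
s_exp = stats.expon.sf(t, *exp_fit)
s_wei = stats.weibull_min.sf(t, *wei_fit)
print(best, s_exp, s_wei)
```

Picking the AIC-best model here also discards everything the rejected model says about long-run survival, which is the final bullet's point.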

Available methods (2)

Model averaging
- Build alternative models and average their results, weighted by some measure of their adequacy or credibility.
- Models can be assigned equal weights, or differential weights can be determined using either ranking measures or expert elicitation methods.
- Bayesian methods for model averaging are commonly used in mathematics and statistics. There is the issue of determining the posterior distribution of a parameter given the data, when the data may not be available.
- Non-Bayesian methods can be used. These require a measure of uncertainty that captures both uncertainty between expectations and uncertainty within expectations.
- Model averaging must be undertaken for each realisation of the uncertain parameters so as not to underestimate uncertainty.
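The last bullet is the one that is easiest to get wrong in practice, so here is a minimal sketch with made-up numbers: two model structures share uncertain parameters but differ in one structural assumption, and their incremental net benefits are averaged draw by draw within the simulation, not on summary results.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 5_000
wtp = 20_000  # assumed willingness-to-pay threshold per QALY

# Shared uncertain parameters (hypothetical distributions).
eff = rng.normal(0.10, 0.03, n_sims)           # incremental QALYs
cost = rng.normal(1_500, 300, n_sims)          # incremental cost
late_penalty = rng.normal(0.02, 0.01, n_sims)  # QALY loss from a late event

# Two structures: model A assumes the late event is unrelated to
# treatment and omits it; model B includes it.
inb_model_a = wtp * eff - cost
inb_model_b = wtp * (eff - late_penalty) - cost

# Structural weights (e.g. elicited from experts). Averaging is done
# per realisation of the uncertain parameters, so that structural and
# parameter uncertainty propagate through together.
w = np.array([0.4, 0.6])
inb_avg = w[0] * inb_model_a + w[1] * inb_model_b

print(inb_avg.mean(), (inb_avg > 0).mean())
```

Averaging only the two models' mean results would give the same point estimate here but would misstate the decision uncertainty around it.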

Available methods (3)

Parameterising structural uncertainty
- This approach was not identified in the review.
- Assumptions that distinguish different models or scenarios can often be thought of as either missing parameters or parameters assigned a single, often extreme, value.
- By generalising the model to include additional ‘uncertain’ parameters, the source of structural uncertainty can be represented directly in the analysis.
- This approach is analogous to model averaging on individual or sets of model inputs. It treats structural uncertainty like parameter uncertainty.

This is a possible extension of model averaging, which I have called parameterising structural uncertainty. In many circumstances model averaging and parameterising will give the same cost-effectiveness results, but parameterising allows the value of information on the uncertain parameters to be determined.
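Continuing the hypothetical two-structure example, the same structural question (does a late treatment-related event erode effectiveness?) can instead be written into a single generalised model as an explicit uncertain parameter, with the former model weights playing the role of the probability that the event occurs. This is only an illustrative sketch of the idea, with assumed priors.

```python
import numpy as np

rng = np.random.default_rng(11)
n_sims = 5_000
wtp = 20_000  # assumed willingness-to-pay threshold per QALY

# Shared uncertain parameters (hypothetical distributions).
eff = rng.normal(0.10, 0.03, n_sims)   # incremental QALYs
cost = rng.normal(1_500, 300, n_sims)  # incremental cost

# The structural assumption becomes an uncertain parameter: the late
# event is absent with probability 0.4 (formerly model A's weight),
# otherwise its size is drawn from a hypothetical prior.
event_occurs = rng.random(n_sims) < 0.6
penalty = np.where(event_occurs, rng.normal(0.02, 0.01, n_sims), 0.0)

# One generalised model replaces the two rival structures.
inb = wtp * (eff - penalty) - cost

# Because the structural choice is now just another parameter, standard
# value-of-information methods can ask whether resolving it is worthwhile.
print(inb.mean(), (inb > 0).mean())
```

The expected result matches the per-realisation model average; what is gained is that EVPI can now be partitioned to the structural parameter itself.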

Case studies

See: Characterizing Structural Uncertainty in Decision Analytic Models: A Review and Application of Methods. Laura Bojke, Karl Claxton, Mark Sculpher, Stephen Palmer. Value in Health, 2009 (forthcoming).

Case studies applying these methods appear in the forthcoming Value in Health paper.

Discussion points and further work

- When is structural uncertainty really parameter uncertainty? Is uncertainty that can be parameterised directly in the model structural uncertainty? If not, what is structural uncertainty? Do we really need a definition?
- Do issues of which comparators to include/exclude relate to defining the decision scope?
- Do issues of which events to include/exclude relate to specifying the correct model structure, avoiding over-simplification?

We need a clear statement of the definition of structural uncertainty (or of the lack of need for any definition). Are some of these things not structural uncertainty at all?

Discussion points and further work (2)

- Should the focus be on establishing guidelines on how to structure a model? Is a lot of structural uncertainty just modeller uncertainty?
- Should we average across models? Does this reflect uncertainty about structure?
- How do we determine weights? Expert opinion: which method of elicitation, and which experts?

Can we get rid of a lot of structural uncertainty? There are still issues with the methods. Ultimately, if we average or parameterise, we need weights, which points to expert elicitation. Further work: develop the methodology in decision analytic modelling.