1 Transforming the efficiency of Partial EVSI computation
Alan Brennan, Health Economics and Decision Science (HEDS)
Samer Kharroubi, Centre for Bayesian Statistics in Health Economics (CHEBS)
University of Sheffield, England
IHEA, July 2005

2 Expected Value of Sample Information (EVSI)
EVSI works out the expected impact on decision making if we collect more data. We:
1. Simulate a collected sample dataset
2. Update uncertainty in parameters given the data
3. Choose a (possibly different) decision option given the data
4. Quantify the increase in benefit over the baseline decision
5. Repeat for many sample datasets
6. Calculate the expected increase in benefit

3 EVSI: The Computational Problem
EVSI works out the expected impact on decision making if we collect more data. Conventional computations required:
1. "Outer" Monte Carlo sample
2. Bayesian update – analytic or MCMC
3. "Inner" Monte Carlo sample, e.g. 10,000 times
4. Evaluate each net benefit function each time
5. Repeat for many sample datasets, e.g. 10,000 times
6. Total: e.g. 100,000,000 evaluations of net benefit
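To make the two-level structure concrete, here is a minimal Python sketch of the conventional nested Monte Carlo EVSI computation described above. The toy model, its single parameter theta, the Normal prior, the conjugate update and all numerical values are illustrative assumptions of mine, not the authors' case study, and the loop sizes are smaller than the 10,000 × 10,000 quoted on the slide.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model: two treatments, net benefit depends on one
# uncertain parameter theta with a Normal prior (illustrative only).
prior_mean, prior_sd = 0.5, 0.2      # prior for theta
data_sd, n = 0.3, 50                 # sampling sd and proposed sample size

def net_benefit(theta):
    # net benefit for each decision option (T0, T1)
    return np.array([0.0 * theta, 10.0 * (theta - 0.45)])

def posterior(theta_true):
    # conjugate Normal update of the prior given a simulated sample mean
    x_bar = rng.normal(theta_true, data_sd / np.sqrt(n))
    post_var = 1.0 / (1.0 / prior_sd**2 + n / data_sd**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + n * x_bar / data_sd**2)
    return post_mean, np.sqrt(post_var)

n_outer, n_inner = 1000, 1000

# Baseline: best expected net benefit under current information
theta_prior = rng.normal(prior_mean, prior_sd, 100_000)
baseline = net_benefit(theta_prior).mean(axis=1).max()

payoff = np.empty(n_outer)
for j in range(n_outer):
    theta_true = rng.normal(prior_mean, prior_sd)          # 1. simulate a dataset
    post_mean, post_sd = posterior(theta_true)             # 2. Bayesian update
    theta_post = rng.normal(post_mean, post_sd, n_inner)   # 3. inner MC sample
    payoff[j] = net_benefit(theta_post).mean(axis=1).max() # 4. best expected NB

evsi = payoff.mean() - baseline                            # 5.-6. expectation
print(f"EVSI estimate: {evsi:.4f}")
```

With the slide's figures of 10,000 outer and 10,000 inner samples, this loop would need around 100,000,000 net benefit evaluations, which is what motivates the approximation on the following slides.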

4 Mathematical Notation
θ = uncertain model parameters
t = set of possible treatments (decision options)
NB(d, θ) = net benefit (λ*QALY – Cost) for decision d and parameters θ
θ_i = parameters of interest – possible data collection
X_θi = data collected on the parameters of interest θ_i

EVSI = E_{X_θi} [ max_d E_{θ|X_θi} NB(d, θ) ] – max_d E_θ [ NB(d, θ) ]

i.e. the expectation over sampled datasets of the expected payoff of the best decision given the particular new data X_θi, minus the expected payoff of the best decision given only current information.

5 Laplace approximation
Sweeting and Kharroubi (2003) developed a 2nd order approximation to evaluate the posterior expectation of any real-valued smooth function v(θ) of a vector of d uncertain parameters θ, given newly available data X.
[Formula shown on slide: a 1st order term plus a 2nd order correction term]

6 Eureka
For EVSI, the first term in the formula is E_{X_θi} [ max_d E_{θ|X_θi} NB(d, θ) ]. We can adapt the Laplace approximation to evaluate the EVSI inner expectation E_{θ|X_θi} [ NB(d, θ) ]!
[Formula shown on slide: 1st order and 2nd order terms of the approximation]
Only requires 1 + 3d evaluations of net benefit (Kharroubi and Brennan 2005).
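Putting slides 4 and 6 together, the approximation slots into the EVSI formula as below; the symbol \hat{E}_L for the Laplace-approximated inner expectation is introduced here for clarity and does not appear on the slides.

```latex
% EVSI with the inner posterior expectation replaced by the Laplace
% approximation \hat{E}_L (notation introduced here, not the authors'):
\[
\mathrm{EVSI} \;\approx\;
  E_{X_{\theta_i}}\!\left[\, \max_{d}\; \hat{E}_L\!\left\{ NB(d,\theta) \mid X_{\theta_i} \right\} \right]
  \;-\; \max_{d}\; E_{\theta}\!\left[ NB(d,\theta) \right]
\]
% Each evaluation of \hat{E}_L needs only a small fixed number of net
% benefit evaluations (the slide quotes 1 + 3d), replacing the inner
% Monte Carlo sample of slide 3.
```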

7 Univariate Explanation
θ+ and θ- are 1 standard deviation away from the posterior mode θ̂.

8 Univariate Explanation
α+ and α- are weights, functions of the ratio of the slopes of the log density function at θ+, θ-. If the distribution is symmetric then α+ = α- = ½.
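A minimal numerical illustration of the univariate, symmetric case on slides 7–8, where the posterior expectation of a smooth function is replaced by a two-point weighted sum with α+ = α- = ½. The Normal posterior, the test function v and all values are illustrative assumptions; this shows only the two-point structure described on the slides, not the full Sweeting–Kharroubi formula.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed Normal posterior for a single parameter theta (illustrative only)
post_mode, post_sd = 0.6, 0.15

def v(theta):
    # any smooth real-valued function of theta, e.g. a net benefit
    return 10.0 * (theta - 0.45) ** 2

# Evaluation points one posterior sd either side of the mode (slide 7)
theta_plus, theta_minus = post_mode + post_sd, post_mode - post_sd

# Symmetric posterior => equal weights of one half (slide 8)
alpha_plus = alpha_minus = 0.5
approx = alpha_plus * v(theta_plus) + alpha_minus * v(theta_minus)

# Brute-force Monte Carlo check of the posterior expectation
mc = v(rng.normal(post_mode, post_sd, 1_000_000)).mean()

print(f"two-point approximation: {approx:.4f}")
print(f"Monte Carlo estimate:    {mc:.4f}")
```

For this quadratic v the two evaluations of v reproduce the posterior expectation almost exactly, which is the point of the slides: a handful of well-placed, well-weighted evaluations can stand in for the whole inner Monte Carlo sample.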

9 Multivariate: Requires Matrix Algebra
For each dataset X_θi:
θ_i+, θ_i- are vectors; each is the i-th row of a matrix θ+, θ-
› The first i-1 components are the posterior modes θ̂_1 ... θ̂_{i-1}
› The i-th is θ̂_i ± (k_i)^(-1/2), where k_i is 1 / (the first entry of {J^(i)}^(-1))
› The remaining i+1 to d components are chosen to maximise the posterior density given the first i components
α_i+ and α_i- are vectors of weights, calculated from partial derivatives of the log posterior density function at θ_i+, θ_i-.
Requires numerical optimisation.
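The sketch below shows how the idea scales to d parameters under strong simplifying assumptions of mine: posterior components treated as independent, evaluation points at the mode ± one posterior standard deviation in each coordinate, symmetric weights of ½, and a -(d-1)·v(mode) correction so the weights sum to one. The function name laplace_inner_expectation and this combination rule are a simplified rendering of the weighted-sum structure, not the authors' exact formula, which derives asymmetric weights from derivatives of the log posterior and uses numerical optimisation as described above. It uses 2d + 1 evaluations of v per dataset instead of an inner Monte Carlo sample (the slides quote 1 + 3d for the full second-order version).

```python
import numpy as np

rng = np.random.default_rng(2)

def laplace_inner_expectation(v, post_mode, post_sd):
    """Weighted-sum approximation of the inner expectation E[v(theta) | X].

    Simplified sketch (assumptions noted in the text above): independent
    posterior components, points at the mode +/- one posterior sd in each
    coordinate, symmetric weights of one half, and a -(d-1)*v(mode) term
    so the weights sum to one. Uses 2d + 1 evaluations of v.
    """
    d = len(post_mode)
    total = -(d - 1) * v(post_mode)
    for i in range(d):
        theta_plus, theta_minus = post_mode.copy(), post_mode.copy()
        theta_plus[i] += post_sd[i]
        theta_minus[i] -= post_sd[i]
        total += 0.5 * v(theta_plus) + 0.5 * v(theta_minus)
    return total

# Quick check on an assumed 3-parameter Normal posterior and a
# sum-product style function (echoing the case study net benefit form)
post_mode = np.array([0.7, 0.4, 1.2])
post_sd = np.array([0.10, 0.05, 0.20])

def v(theta):
    # parameters indexed along the last axis, so the same function
    # works for a single parameter vector or a matrix of draws
    return theta[..., 0] * theta[..., 1] * theta[..., 2] - theta[..., 0]

approx = laplace_inner_expectation(v, post_mode, post_sd)
mc = v(rng.normal(post_mode, post_sd, size=(1_000_000, 3))).mean()
print(f"Weighted-sum approximation: {approx:.4f}")
print(f"Monte Carlo estimate:       {mc:.4f}")
```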

10 Case Studies
Case Study 1
› 2 treatments – T1 versus T0
› Uncertainty in 19 independent parameters
› Univariate Normal prior and data
› Net benefit function is sum-product form
› NB1 = (θ5 θ6 θ7 + θ8 θ9 θ10) – (θ1 + θ2 θ3 θ4)
Case Study 2
› Uncertainty in 19 correlated parameters
› Multivariate Normal prior and data
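As a concrete reading of the slide's notation, NB1 translates directly into code. This is a minimal sketch covering only the expression shown; the 1-based indexing convention and the function name nb1 are mine.

```python
import numpy as np

def nb1(theta):
    """Net benefit of T1 in case study 1, the sum-product form on slide 10.

    theta is indexed so that theta[1] corresponds to the slide's theta_1,
    etc. (index 0 is unused); in practice the values would be draws from
    the Normal priors or posteriors of the 19 parameters.
    """
    theta = np.asarray(theta, dtype=float)
    return (theta[5] * theta[6] * theta[7] + theta[8] * theta[9] * theta[10]) \
           - (theta[1] + theta[2] * theta[3] * theta[4])
```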

11 Illustrative Model

12 Case Study 1 Results (5 sets): 1st order Laplace is accurate

13 Case Study 2: 1st order wrong, 2nd order is accurate

14 Accuracy of inner integral approximation
Parameters 6, 15; sample size n=50. Out of 1000 datasets, the resulting decision between the 2 treatments was different in 7, i.e. a 0.7% error.
[Plot comparing Monte Carlo and Laplace estimates of the inner integral]

15 Trade-off in Computation Time

16 Computation Time: What-If Analyses
Efficiency gain due to the Laplace approximation increases rapidly as the model run time for one evaluation of net benefit increases.

17 Limitations
Any type of net benefit function:
› analytic function of model parameters
› result of a probabilistic model, e.g. individual level simulation
Characterisation of uncertainty:
› need a functional form for the probability density function
› smooth and differentiable, i.e. not just a histogram to sample from
› must be able to write down the equations for the posterior density function and its derivative mathematically

18 Conclusions
EVSI calculations using the Laplace approximation are in line with those using the 2-level Monte Carlo method in the case studies so far.
The method is very generalisable once you understand the mathematics and the algorithm.
Computation time reductions depend on the time taken to compute the net benefit functions.

19 Thank you
'Wisest are they who know they do not know'
'Especially if they can calculate whether it's worth finding out'

20 References
Brennan, A., Chilcott, J., Kharroubi, S. A. and O'Hagan, A. (2002). A Two Level Monte Carlo Approach to Calculating Expected Value of Sample Information: How To Value a Research Design. Presented at the 24th Annual Meeting of SMDM, October 23rd, 2002, Washington.
Ades, A. E., Lu, G. and Claxton, K. (2004). Expected value of sample information calculations in medical decision modelling. Medical Decision Making, 24(2).
Sweeting, T. J. and Kharroubi, S. A. (2003). Some new formulae for posterior expectations and Bartlett corrections. Test, 12(2).
Kharroubi, S. A. and Brennan, A. (2005). A Novel Formulation for Approximate Bayesian Computation Based on Signed Roots of Log-Density Ratios. Research Report No. 553/05, Department of Probability and Statistics, University of Sheffield. Submitted to Applied Statistics.