1
Uncertainty and Sensitivity Analysis in Population-Based Disease Simulation Models Behnam Sharif STAR Team July 2010
2
Outline
Uncertainty and Sensitivity analysis: Definitions
Part I - Uncertainty analysis (UA)
– Guidelines
– Necessity in population-based disease models
– Sources of uncertainty in population-based disease models
– Practical framework
– Example: POHEM-OA
3
Uncertainty and sensitivity analysis: Definition
SA (sensitivity analysis) as a what-if scenario: one can change an assumption or set a parameter value to its highest level.
SA as part of the validation of the model: to identify and rank "key" parameters.
– Deterministic models: "key" parameters are those points in the parameter space at which the outcome function has the highest slope (elasticity). OAT (one-factor-at-a-time) paradox: interactions are ignored.
– Stochastic models: "key" parameters are those with the largest contribution to the uncertainty (error) of the outcome of the simulation model [1]. Here, uncertainty (error) is introduced by several modules of the simulation model.
Therefore, UA (uncertainty analysis) is a prerequisite step for performing SA.
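The slides do not give an explicit formula for the deterministic case, so the following is a minimal Python sketch of the one-factor-at-a-time ranking described above: each parameter is nudged on its own while the others stay at their base values, and the elasticity (relative slope) of the outcome is computed. The outcome function and the parameter values are hypothetical placeholders, not part of POHEM.

```python
import numpy as np

def outcome(params):
    # Hypothetical outcome function standing in for one deterministic model run.
    bmi_effect, hazard_ratio, baseline_rate = params
    return baseline_rate * hazard_ratio * (1.0 + 0.05 * bmi_effect)

def oat_elasticity(outcome_fn, base_params, step=0.01):
    """Elasticity of the outcome with respect to each parameter, one factor at a time."""
    base_params = np.asarray(base_params, dtype=float)
    y0 = outcome_fn(base_params)
    elasticities = []
    for i in range(len(base_params)):
        perturbed = base_params.copy()
        perturbed[i] *= (1.0 + step)              # nudge one parameter by 1%
        dy = outcome_fn(perturbed) - y0
        dx = base_params[i] * step
        elasticities.append((dy / dx) * (base_params[i] / y0))
    return elasticities

print(oat_elasticity(outcome, [2.0, 1.8, 0.01]))
```

Note that this is exactly the OAT scheme whose limitation the slide points out: because only one factor moves at a time, interactions between parameters are never exercised.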
4
Uncertainty and sensitivity analysis: Definition (cont'd)
Uncertainty analysis (UA) is a method to quantify the uncertainty of the output by propagating the uncertainty of the inputs (parameters). By sampling from the distributions of the input parameters, UA is typically performed through several runs of the basic (mean-estimate) simulation model to provide uncertainty intervals for the outcome.
Primary goals of UA:
– Numerically estimate the standard error of the mean outcome.
– Estimate the variance of the outcome.
– Estimate any other statistic of the outcome, including the sensitivity indices computed in sensitivity analysis.
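A minimal sketch of what this propagation looks like in practice, assuming a toy one-line model and illustrative input distributions (none of these numbers come from the presentation): inputs are sampled, the model is run once per sample, and the standard error of the mean outcome and a percentile uncertainty interval are read off the set of runs.

```python
import numpy as np

rng = np.random.default_rng(2010)

def run_model(hazard_ratio, baseline_rate):
    # Placeholder for one run of the (mean-estimate) simulation model;
    # in practice this would be a full population-based microsimulation.
    return 1.0 - np.exp(-baseline_rate * hazard_ratio * 20)   # 20-year risk

n_runs = 1000
# Uncertainty of the inputs expressed as distributions (illustrative values only).
hr_samples = rng.lognormal(mean=np.log(1.8), sigma=0.1, size=n_runs)
rate_samples = rng.normal(loc=0.01, scale=0.001, size=n_runs)

outcomes = np.array([run_model(hr, r) for hr, r in zip(hr_samples, rate_samples)])

print("mean outcome:", outcomes.mean())
print("standard error of the mean:", outcomes.std(ddof=1) / np.sqrt(n_runs))
print("95% uncertainty interval:", np.percentile(outcomes, [2.5, 97.5]))
```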
5
Uncertainty and sensitivity analysis: Definition (cont'd)
Fig. 1. Idealized scheme for sensitivity analysis. From (1): Saltelli, A. et al. 2004, Sensitivity Analysis in Practice: A Guide to Assessing Scientific Models, John Wiley and Sons, Chichester, New York.
6
Outline
Uncertainty and Sensitivity analysis: Definitions
Part I - Uncertainty analysis (UA)
– Guidelines
– Necessity in population-based disease models
– Sources of uncertainty in population-based disease models
– Practical framework
– Example: POHEM-OA
7
Part I - Uncertainty analysis
The general approach to describing and estimating uncertainty in quantities of interest is to express them as probability distributions, using a Bayesian interpretation of probability (5: Salomon et al., WHO report on uncertainty, 2001).
As uncertainty analysts, we try to assign a distribution to the input quantities (parameters) and, by means of a model, propagate that uncertainty to the output.
Separate reports will be submitted for uncertainty and sensitivity. The uncertainty paper is more descriptive; the sensitivity paper is more mathematical/statistical. In the first report and in this presentation, we discuss only uncertainty.
8
Uncertainty analysis: Guidelines
1. Decision-analytic models in health/health technology assessment; cost-effectiveness models of interventions.
2. European Commission SEC, Impact Assessment Guidelines, 2005. http://ec.europa.eu/governance/docs/index_en.htm
3. US EPA (Environmental Protection Agency), 2003 - The Council for Regulatory Environmental Modeling. http://cfpub.epa.gov/crem/cremlib.cfm
9
Decision-analytic models: Guidelines
National Institute for Clinical Excellence (NICE): "One aspect of the new guideline is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute." (2)
Guidelines for health technology assessment: "Health economic evaluations have recently built upon more advanced statistical decision-theoretic foundations and nowadays it is officially required that uncertainty about both parameters and observable variables be taken into account thoroughly." (3)
10
Outline
Uncertainty and Sensitivity analysis: Definitions
Part I - Uncertainty analysis (UA)
– Guidelines
Necessity in population-based disease models
– Sources of uncertainty in population-based disease models
– Practical framework
– Example: POHEM-OA
11
Uncertainty analysis: Necessity in population-based disease models
1. Morgan et al. (4): UA (uncertainty analysis) is needed:
– When one is performing an analysis in which uncertain information from different sources must be combined; the precision of each source should help determine its weighting in the combination.
– When a decision must be made about whether to expend resources to acquire additional information; in general, the greater the uncertainty, the greater the expected value of additional information.
2. Uncertainty analysis and sensitivity analysis are discussed as one of the steps in almost all validation guidelines for simulation models [7, Jacek et al., validation draft].
3. Modern epidemiological research requires uncertainty intervals or confidence intervals to be reported for any estimate [5, WHO report on uncertainty]. If simulation models are going to be used as a means of informing health policy decision-makers (as observational studies are), UA is needed.
12
Outline
Uncertainty and Sensitivity analysis: Definitions
Part I - Uncertainty analysis (UA)
– Guidelines
– Necessity in population-based disease models
Sources of uncertainty in population-based disease models
– Practical framework
– Example: POHEM-OA
13
Sources of uncertainty in population-based disease models
1. Monte Carlo error (first-order uncertainty): this uncertainty arises from the variability between individuals in the population. Poulter (6) describes this as the natural heterogeneity of a system (model).
2. Parameter uncertainty (second-order uncertainty): this is sampling error, arising from the finite sample in the dataset. It is due to incomplete knowledge of the true parameter (input) values.
3. Model uncertainty: this represents a lack of knowledge about the way that variables are related to each other, and thus constitutes uncertainty about whether a model approximates a real-world process or relationship (6).
4. Other sources of uncertainty: data sources, indicator variables (definitions), model assumptions and approximations.
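The distinction between the first two sources can be made concrete with a small sketch (the hazard rate, its distribution, and the population size are hypothetical, not POHEM values): the inner simulation reproduces first-order Monte Carlo error between individuals, while the outer loop over sampled parameter values adds second-order parameter uncertainty on top of it.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_population(hazard_rate, n_individuals):
    # First-order (Monte Carlo) error: individuals facing the same hazard rate
    # still differ by chance, so the simulated prevalence varies between runs.
    events = rng.random(n_individuals) < 1.0 - np.exp(-hazard_rate * 10)
    return events.mean()

# Second-order (parameter) uncertainty: the hazard rate itself is only
# estimated from data, so it is drawn from a distribution in an outer loop.
prevalences = []
for _ in range(200):
    sampled_rate = rng.lognormal(mean=np.log(0.02), sigma=0.15)
    prevalences.append(simulate_population(sampled_rate, n_individuals=10_000))

print("95% interval reflecting both sources:",
      np.percentile(prevalences, [2.5, 97.5]))
```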
14
Outline
Uncertainty and Sensitivity analysis: Definitions
Part I - Uncertainty analysis (UA)
– Guidelines
– Necessity in population-based disease models
– Sources of uncertainty in population-based disease models
Practical framework
– Example: POHEM-OA
15
Practical framework for performing UA in population-based disease simulation models
1. Describe all sources of uncertainty that exist in the model and modify the approach to incorporate them. (Example: how to incorporate Monte Carlo error - sub-populations or multiple runs?)
2. Define "what-if" scenarios based on the model and other sources, i.e. combinations of alternative models, data sources and definitions.
3. For each what-if scenario: define the outcomes of interest from the model and, for each outcome, perform the quantitative UA with Monte Carlo analysis.
16
Monte Carlo analysis
Define a list of parameters/variables (risk factors) to be included in the Monte Carlo analysis. We call this the MC-list.
1. Parameter screening: decide which parameters go into the MC-list.
2. Assign distributions to the parameters inside the MC-list.
3. Sampling.
4. Evaluation.
Calibration: if any of the parameters in the MC-list were involved in the calibration of the basic model, we need to automate the calibration for each sample.
Execute the MC analysis. Rank-order the results and estimate uncertainty intervals for the outcome as discussed in Appendix 1 of the report. A sketch of the overall loop follows below.
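As an illustration of how these steps fit together, here is a minimal sketch of the outer MC loop with the calibration step inside it. The MC-list entries, distributions, calibration rule, and model function are all hypothetical stand-ins; the real pipeline calls POHEM-OA and its calibration code.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_mc_list(rng):
    """Draw one value for every parameter kept in the MC-list (placeholder entries)."""
    return {"hr_male_obese": rng.lognormal(np.log(2.0), 0.1),
            "hr_female_obese": rng.lognormal(np.log(2.2), 0.1)}

def calibrate(params):
    """Re-tune the baseline incidence so the sampled parameters still reproduce
    the observed incidence rate (placeholder logic, not the real calibration)."""
    params["baseline_incidence"] = 0.01 / params["hr_male_obese"]
    return params

def run_model(params):
    """Placeholder for one full simulation run returning the outcome of interest."""
    return params["baseline_incidence"] * params["hr_male_obese"] * 20

outcomes = []
for _ in range(500):                      # number of MC runs (see Appendix 1)
    params = calibrate(sample_mc_list(rng))
    outcomes.append(run_model(params))

# Rank-order the results and read off a percentile-based uncertainty interval.
print("95% uncertainty interval:", np.percentile(np.sort(outcomes), [2.5, 97.5]))
```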
17
Monte Carlo analysis (cont'd)
1. Parameter screening:
– Ignore parameters that are estimated from life tables; ignore constants; use expert opinion, one-way SA, or graphical methods to help.
– Start by adding the event parameters most relevant to the outcome (e.g. for prevalence, add hazard ratios first; for HALE, add the QALY parameters, etc.).
– Add the most relevant variables (BMI, cholesterol level).
2. Assign distributions:
– Parameters estimated from data.
– Parameters from the literature.
– Parameters from expert opinion.
– Consider the correlation between parameters.
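For parameters taken from the literature, one common way to assign a distribution is to back out lognormal parameters from a reported hazard ratio and its 95% confidence interval, assuming the interval is symmetric on the log scale. A small sketch with illustrative numbers (not the POHEM-OA estimates):

```python
import numpy as np

def lognormal_from_ci(hr, lower, upper):
    """Log-scale mean and SD for a hazard ratio reported with a 95% CI,
    assuming the CI is symmetric on the log scale."""
    mu = np.log(hr)
    sigma = (np.log(upper) - np.log(lower)) / (2 * 1.96)
    return mu, sigma

# Illustrative published estimate: HR 2.0 (95% CI 1.5 to 2.7).
mu, sigma = lognormal_from_ci(hr=2.0, lower=1.5, upper=2.7)
samples = np.random.default_rng(0).lognormal(mean=mu, sigma=sigma, size=5)
print(mu, sigma, samples)
```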
18
Monte Carlo analysis (cont'd)
3. Sampling: LHS (Latin hypercube sampling) has been discussed as the best sampling approach (among random, orthogonal, etc.) (6). Estimate the number of MC runs based on the sources of uncertainty (Appendix 1 of the report).
4. Evaluation: if the computational time is not manageable, use parallel processing or approximation methods (Appendix 2 of the report).
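A sketch of Latin hypercube sampling for two illustrative parameters, using SciPy's quasi-Monte Carlo module (scipy.stats.qmc, available in SciPy 1.7 and later); the distributions are placeholders, not the model's:

```python
import numpy as np
from scipy.stats import qmc, lognorm, norm

# Latin hypercube sample of two parameters: a lognormal hazard ratio
# and a normal baseline rate (illustrative distributions only).
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n=256)                        # uniform draws in [0, 1)^2

hr = lognorm.ppf(u[:, 0], s=0.1, scale=1.8)      # scale = exp(log-scale mean)
rate = norm.ppf(u[:, 1], loc=0.01, scale=0.001)

# Each marginal is stratified into equal-probability bins, so the parameter
# space is covered with fewer runs than plain random sampling would need.
print(hr.mean(), rate.mean())
```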
19
Example: POHEM-OA
MC-list: one parameter type - hazard ratios of OA incidence, comprising 8 parameters (gender- and BMI-level specific), estimated from the NPHS; lognormal distributions; correlation matrix.
Need to calibrate for each sample: observed incidence rate to baseline incidence rate.
Outcome: prevalence of OA, gender-specific, for the years 2001-2021.
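One way to draw correlated lognormal hazard ratios, as required here, is to sample a multivariate normal on the log scale and exponentiate. The means, standard deviations, and correlation matrix below are illustrative placeholders, not the NPHS estimates used in POHEM-OA:

```python
import numpy as np

rng = np.random.default_rng(2021)

# Illustrative log-scale means for 8 gender- and BMI-level-specific hazard ratios.
log_hr_means = np.log([1.0, 1.4, 2.0, 2.6, 1.0, 1.5, 2.2, 2.9])
log_hr_sds = np.full(8, 0.10)
corr = np.full((8, 8), 0.3) + 0.7 * np.eye(8)    # assumed correlation matrix

cov = np.outer(log_hr_sds, log_hr_sds) * corr
samples = np.exp(rng.multivariate_normal(log_hr_means, cov, size=1000))

print(samples.shape)     # (1000, 8): one correlated set of HR draws per MC run
```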
20
Result
Figure 1. Prevalence of OA in Canada from 2001 to 2021 for males, using POHEM-OA.
21
Result (cont'd)
Figure 2. Prevalence of OA in Canada from 2001 to 2021 for females, using POHEM-OA.
22
Calibration program
To automate the calibration of POHEM-OA, we wrote code in Visual Studio 2008 (VB) with the following features:
– We have control over the parameters (e.g. hazard ratios) from an outside application.
– We can run POHEM in a loop and perform the steps of the calibration automatically.
– We can perform queries on the output database of POHEM and automatically export the results to Excel.
The uncertainty algorithm then calls this calibration code after sampling from the parameters in the MC-list at each iteration.
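The calibration driver itself is VB code and is not shown in these slides; the sketch below only illustrates the same automation pattern in Python. The executable name, command-line flags, and file formats are hypothetical placeholders, since POHEM's real interface is not described here.

```python
import csv
import subprocess

def run_one_iteration(hazard_ratios, param_file="params.csv",
                      model_exe="pohem_oa.exe", output_file="output.csv"):
    # Write the sampled parameter values to a file the model can read
    # (hypothetical format: parameter id, value).
    with open(param_file, "w", newline="") as f:
        csv.writer(f).writerows(enumerate(hazard_ratios))
    # Run the model once with these parameters (hypothetical command line).
    subprocess.run([model_exe, "--params", param_file, "--out", output_file],
                   check=True)
    # Read the outcome back for the uncertainty algorithm.
    with open(output_file) as f:
        return [float(row[1]) for row in csv.reader(f)]
```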
23
Future work
Add BMI parameter uncertainty to the example (use of bootstrap data).
Update the example results using the calibration code.
Edit the report and shorten it for the manuscript.
Sensitivity analysis report.
24
References
(1) Saltelli, A. et al. 2004. Sensitivity Analysis in Practice: A Guide to Assessing Scientific Models. John Wiley and Sons, Chichester, New York.
(2) Claxton, K., Sculpher, M., McCabe, C., Briggs, A.H., Akehurst, R., Buxton, M., Brazier, J. and O'Hagan, T. (2005). Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra. Health Economics 14(4): 339-347.
(3) Philips, Z. et al. Review of guidelines for good practice in decision-analytic modelling in health technology assessment. Health Technology Assessment 2004; Vol. 8: No. 36.
(4) Morgan, M. Granger and Henrion, Max. Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis. Cambridge University Press, 1990.
(5) Salomon, J.A., Mathers, Murray, Ferguson (2001). Methods for life expectancy and healthy life expectancy uncertainty analysis. Global Programme on Evidence for Health Policy, Working Paper No. 10, WHO, 2001. Available online: http://www.who.int/healthinfo/paper10.pdf; accessed October 2009.
(6) Poulter, Susan R. Monte Carlo Simulation in Environmental Risk Assessment - Science, Policy and Legal Issues. Risk: Health, Safety & Environment, Winter 1998.