Quantitative methods to manage uncertainty in science
Quantitative methods to manage uncertainty in science by Andrea Saltelli, Stefano Tarantola and Michela Saisana, Joint Research Centre of the European Communities in Ispra (I), Andrea.Saltelli@jrc.it Mini-symposium “The management of uncertainty in risk science and policy”, World Congress on Risk Brussels, 22-25 June 2003. http://www.jrc.cec.eu.int/uasa

Models mimic systems: Rosen’s formalisation of the modelling process.

Models mimic systems (Rosen). “World” (the natural system) and “Model” (the formal system) are each internally entailed, driven by their own causal structure. Nothing entails “World” and “Model” with one another; the association is therefore the result of craftsmanship. And this does not apply to natural systems only: give ten engineers the blueprint of the same plant and they will return ten model-based risk assessments for that plant.

Models mimic systems (Rosen). It helps the craftsman if the uncertainty in the information provided by the model (the substance used for the decoding exercise) is carefully apportioned to the uncertainty associated with the encoding process.

Models map assumptions onto inferences ... but often too narrowly. <<[…] most simulation models will be complex, with many parameters, state-variables and non-linear relations. Under the best circumstances, such models have many degrees of freedom and, with judicious fiddling, can be made to produce virtually any desired behaviour, often with both plausible structure and parameter values.>> (Hornberger and Spear, 1981). <<Cynics say that models can be made to conclude anything provided that suitable assumptions are fed into them.>> (The Economist, 1998). See also Konikow and Bredehoeft, 1992, and Oreskes et al., 1994.

Use of models in the scientific discourse. But models are used nonetheless … and a legitimate question is the following: “If we had mapped the space of uncertain assumptions honestly and judiciously, would the space of inference still be of use1?” 1Read: do we still have a peak around some useful inference (e.g. YES or NO, safe or unsafe, hypothesis accepted or rejected, policy effective or ineffective, etc.), or do we have as many YESes as NOs?

Models maps assumptions onto inferences … <<I have proposed a form of organised sensitivity analysis that I call “global sensitivity analysis” in which a neighborhood of alternative assumptions is selected and the corresponding interval of inferences is identified. Conclusions are judged to be sturdy only if the neighborhood of assumptions is wide enough to be credible and the corresponding interval of inferences is narrow enough to be useful.>> Leamer, “Sensitivity Analysis would help”, 1990 http://www.jrc.cec.eu.int/uasa

Models map assumptions onto inferences … Leamer’s view of global Sensitivity Analysis (SA). [Diagram: the space of plausible models, the space of estimated parameters and other assumptions are mapped onto the space of simulation inferences.]

Models map assumptions onto inferences … (parametric bootstrap version of UA/SA). [Diagram: input data → (estimation) → estimated parameters → (parametric bootstrap: we sample from the posterior parameter probability) → model → uncertainty and sensitivity analysis → inference.]
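As an illustration, here is a minimal sketch of the parametric-bootstrap scheme of this slide in Python. The linear model, the synthetic data and the prediction point are all hypothetical, not taken from the slides; the point is only the mechanics: estimate the parameters, sample them from their estimated distribution, and propagate each draw through the model to obtain a distribution for the inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model y = a + b*x (made up for illustration).
def model(x, a, b):
    return a + b * x

# Synthetic "input data", generated with a = 1, b = 2.
x_obs = np.linspace(0.0, 1.0, 20)
y_obs = model(x_obs, 1.0, 2.0) + rng.normal(0.0, 0.1, size=x_obs.size)

# Estimation: least squares gives point estimates and their covariance.
X = np.column_stack([np.ones_like(x_obs), x_obs])
theta_hat, res, _, _ = np.linalg.lstsq(X, y_obs, rcond=None)
sigma2 = res[0] / (x_obs.size - 2)
cov = sigma2 * np.linalg.inv(X.T @ X)

# Parametric bootstrap: sample the parameters from their estimated
# distribution and propagate every draw through the model.
draws = rng.multivariate_normal(theta_hat, cov, size=2000)
inference = np.array([model(0.5, a, b) for a, b in draws])

print("mean:", inference.mean(), "sd:", inference.std())
```

The spread of `inference` is the uncertainty in the model answer attributable to parameter estimation; a sensitivity analysis would then apportion it among the parameters.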

Bootstrapping-of-the-modelling-process version of UA/SA, after Chatfield, 1995. [Diagram: loop on bootstrap replicas of the input data → (model identification) → model → (estimation) → estimated parameters → (bootstrap of the modelling process) → inference.]
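A sketch of Chatfield's scheme under made-up assumptions: the data, the two candidate polynomial models and the AIC-like selection rule are all hypothetical. What matters is that model identification is repeated inside the bootstrap loop, so model uncertainty, not just parameter uncertainty, propagates into the inference.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data; the "true" process is unknown to the analyst.
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.2, size=x.size)

def identify_and_fit(xb, yb):
    """Model identification + estimation on one bootstrap replica:
    choose the polynomial degree (1 or 2) with the lower AIC-like
    score, and return the fitted coefficients."""
    best = None
    for deg in (1, 2):
        coef = np.polyfit(xb, yb, deg)
        rss = float(np.sum((np.polyval(coef, xb) - yb) ** 2))
        score = xb.size * np.log(rss / xb.size) + 2 * (deg + 1)
        if best is None or score < best[0]:
            best = (score, coef)
    return best[1]

# Loop on bootstrap replicas of the input data: the selected model
# can change from replica to replica.
inference = []
for _ in range(500):
    idx = rng.integers(0, x.size, size=x.size)
    coef = identify_and_fit(x[idx], y[idx])
    inference.append(np.polyval(coef, 0.5))  # prediction at x = 0.5
inference = np.asarray(inference)

print("mean:", inference.mean(), "sd:", inference.std())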

Bayesian uncertainty and sensitivity analysis (Draper 1995, Planas and Depoutot 2000). [Diagram: data, a prior of the model(s) and a prior of the parameters are combined by sampling into a posterior of the model(s) and a posterior of the parameters, which yield the inference.]

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. The space of the model-induced choices (the inference) swells and shrinks as we swell and shrink the space of the input assumptions. How many of the assumptions are relevant at all to the choice? And for those that are relevant, how do they act on the outcome: singly, or in more or less complex combinations?

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. If I desire a given degree of robustness in the choice, which factors/assumptions should be tested more rigorously? (Look at how much “fixing” any given factor/assumption can potentially reduce the variance of the output.) Can I confidently “fix” a subset of the input factors/assumptions? This is the Beck and Ravetz “relevance” issue. How do I find these factors?

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. The “reduced” variance is the conditional variance $V_{X_{\sim i}}(Y \mid X_i = x_i^*)$ obtained by fixing factor $X_i$ at a given value $x_i^*$. Its expectation over $X_i$, $E_{X_i}\left[V_{X_{\sim i}}(Y \mid X_i)\right]$, is small if the factor is important.

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. By the variance decomposition $V(Y) = V_{X_i}\left(E_{X_{\sim i}}(Y \mid X_i)\right) + E_{X_i}\left(V_{X_{\sim i}}(Y \mid X_i)\right)$, the first term is big if the factor is important and the second is small if the factor is important. Normalised by $V(Y)$, the first term gives the first-order effect $S_i = V_{X_i}\left(E_{X_{\sim i}}(Y \mid X_i)\right)/V(Y)$.

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. Also used is the total effect term $S_{Ti} = E_{X_{\sim i}}\left(V_{X_i}(Y \mid X_{\sim i})\right)/V(Y)$: this is the expected fractional value of the variance that would be left if all factors but $X_i$ were fixed. The use of different sensitivity measures should be seen as the answer to a rigorous question concerning the relative importance of the input factors.
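The first-order and total effect terms can be estimated by Monte Carlo. The sketch below uses a common pick-freeze design on a made-up additive test function (for which, analytically, $S_1 \approx 0.99$, $S_2 \approx 0.01$, and $S_{Ti} = S_i$ because there are no interactions); the function and sample size are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up additive test function: Y = X1 + 0.1*X2, with Xi ~ U(0,1).
def f(X):
    return X[:, 0] + 0.1 * X[:, 1]

N, k = 10000, 2
A = rng.uniform(size=(N, k))   # two independent sample matrices
B = rng.uniform(size=(N, k))
fA, fB = f(A), f(B)
V = np.var(np.concatenate([fA, fB]))

S, ST = [], []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                             # "pick-freeze" column swap
    fABi = f(ABi)
    S.append(np.mean(fB * (fABi - fA)) / V)         # first-order effect estimate
    ST.append(0.5 * np.mean((fA - fABi) ** 2) / V)  # total effect (Jansen form)
    print(f"X{i+1}: S = {S[-1]:.3f}, ST = {ST[-1]:.3f}")
```

The cost is $N(k+2)$ model runs for $k$ factors; on non-additive models $S_{Ti} > S_i$, and the gap measures the interactions involving $X_i$.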

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. One can thus relate the total effect term to a question about the possibility of fixing factors (the Factor Fixing setting), while the first-order effect frames the Factors’ Prioritisation setting.

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. Other settings (questions) can easily be imagined. Settings to frame the uncertainty and sensitivity analyses are crucial: the alternative would be different SA methods suggesting different relative importances for the factors. Settings should be audited! That is, let us agree on what “importance” means before we engage in the analysis.

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. Is the model-induced choice weak (non-robust) because there is an insufficient number of observations, or because the experts cannot agree on an accepted theory?

Useful inference versus falsification of the analysis. Example: imagine the inference is Y = log(PI1/PI2), the logarithm of the ratio between the two pressure-on-decision indices (Tarantola et al., 2000). [Figure: histogram of the frequency of occurrence of Y, with a region where incineration is preferred and a region where landfill is preferred.]
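Whether the inference is still useful can be checked directly on the Monte Carlo sample of Y: if most of the mass falls in one region the analysis supports a choice, while a near 50/50 split falsifies it. A toy sketch, with an entirely made-up sample and the assumption (for this sketch only) that Y > 0 means incineration is preferred:

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up Monte Carlo sample of Y = log(PI1/PI2); in the real exercise
# Y would come from the uncertainty analysis of the two indices.
Y = rng.normal(0.8, 0.5, size=5000)

frac_incineration = np.mean(Y > 0)  # assumed: Y > 0 -> incineration preferred
frac_landfill = np.mean(Y < 0)      # assumed: Y < 0 -> landfill preferred
print(f"incineration preferred in {frac_incineration:.0%} of runs")
```

A fraction near 100% (or near 0%) is a peaked, useful inference; a fraction near 50% would send the analyst back to the assumptions.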

Useful inference versus falsification of the analysis.

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. What happens if I address the space of the policy options?

Gauging the leverage of the policy options’ latitude.

Conclusions. The output from global uncertainty and sensitivity analyses can feed back into the extended peer review process via, e.g.:
- refocusing of the critical issues/factors,
- re-assignment of weights for multiple criteria, or
- inference falsification and identification of policy relevance/irrelevance.
Note: the EC Guidelines for Extended Impact Assessment include explicit and detailed indications for global SA!

References
Rosen, R., 1991, Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life, Columbia University Press.
Hornberger, G.M., and Spear, R.C., 1981, An approach to the preliminary analysis of environmental systems, Journal of Environmental Management, 12, 7-18.
Konikow, L.F., and Bredehoeft, J.D., 1992, Groundwater models cannot be validated, Advances in Water Resources, 15(1), 75-83.
Oreskes, N., Shrader-Frechette, K., and Belitz, K., 1994, Verification, validation, and confirmation of numerical models in the Earth sciences, Science, 263, 641-646.
Leamer, E.E., 1990, “Sensitivity Analysis would help”, in Modelling Economic Series, ed. C.W.J. Granger, Clarendon Press, Oxford.
Chatfield, C., 1995, Model uncertainty, data mining and statistical inference, Journal of the Royal Statistical Society A, 158(3), 419-466.

Further reading on SA
Papers: Saltelli et al., Statistical Science, 2000; Saltelli and Tarantola, JASA, 2002.
Book: Saltelli et al., Eds., Sensitivity Analysis, John Wiley & Sons, Probability and Statistics series, 2000.
Book: a primer (Sensitivity Analysis in Practice) will appear by end 2003, with Wiley.
A forum: http://sensitivity-analysis.jrc.cec.eu.int/
Presentations of the mini-symposium on www.nusap.net