QUANTITATIVE METHODS TO MANAGE UNCERTAINTY IN SCIENCE by Andrea Saltelli, Silvio Funtowicz, Stefano Tarantola, Joint Research Centre of the European Communities in Ispra (I). MUST: Managing Uncertainty in science for suSTainability: future research challenges for Europe. FP6 launch event in Brussels, November 11, 2002

Uncertainty is not an accident of the scientific method, but its substance. Peter Høeg, a Danish novelist, writes in Borderliners (Høeg, 1995): "That is what we meant by science. That both question and answer are tied up with uncertainty, and that they are painful. But that there is no way around them. And that you hide nothing; instead, everything is brought out into the open".

Models mimic systems: Rosen’s formalisation of the modelling process

“World” (the natural system) and “Model” (the formal system) are each internally entailed, driven by a causal structure. Nothing entails “World” and “Model” with one another; the association is hence the result of craftsmanship. And this does not apply to natural systems only: give 10 engineers the blueprint of the same plant and they will return you 10 model-based risk assessments for that same plant. Models mimic systems (Rosen)

It helps the craftsman if the uncertainty in the information provided by the model (the substance used in the decoding exercise) is carefully apportioned to the uncertainty associated with the encoding process. Models mimic systems (Rosen)

Hornberger, G.M., and R.C. Spear (1981) An approach to the preliminary analysis of environmental systems, Journal of Environmental Management, 12. The Economist. Models map assumptions onto inferences... but often too narrowly

Yet models are used... and a legitimate question is the following: “If we had mapped the space of uncertain assumptions honestly and judiciously, would the space of inference still be of use?” (1) Read: do we still have a peak around some useful inference (e.g. YES or NO, safe or unsafe, hypothesis accepted or rejected, policy effective or ineffective, etc.), or do we have as many YES as NO? Use of models in the scientific discourse

Edward E. Leamer, “Sensitivity Analysis would help”, in Modelling Economic Series, edited by C.W.J. Granger, 1990, Clarendon Press, Oxford. Models map assumptions onto inferences …

[Figure: Leamer’s view of global Sensitivity Analysis (SA). The space of plausible models and the space of estimated parameters, together with other assumptions, feed the simulation, which yields the inference; this copes with equifinality.] Models map assumptions onto inferences …

[Figure: parametric bootstrap version of UA/SA. Input data -> estimation of parameters -> model -> inference; in the parametric bootstrap we sample from the posterior parameter probability.] Models map assumptions onto inferences … Uncertainty and sensitivity analysis
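The parametric-bootstrap scheme can be sketched numerically; the toy model and the Gaussian parameter posterior below are purely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical model: the inference is Y = a*x + b at a fixed input x
def model(a, b, x=2.0):
    return a * x + b

# Parametric bootstrap: draw the parameters from their (assumed Gaussian)
# posterior and propagate every draw through the model
a_draws = rng.normal(loc=1.0, scale=0.2, size=10_000)
b_draws = rng.normal(loc=0.5, scale=0.1, size=10_000)
inference = model(a_draws, b_draws)

# The spread of the inference is the uncertainty inherited from estimation
print(inference.mean(), inference.std())
```

The distribution of `inference` is what the decision-maker sees: its mean sits near the point estimate, while its spread quantifies how much the estimation step alone moves the answer.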

[Figure: bootstrapping-of-the-modelling-process version of UA/SA, after Chatfield, 1995. Loop on bootstrap replicas of the input data -> model identification -> estimation of parameters -> model -> inference.]
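Chatfield's bootstrap of the modelling process can be sketched in the same spirit; the data are hypothetical, and the model-identification step is fixed to a straight line for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a noisy straight line (true model: y = 2x + 1)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=50)

# Loop on bootstrap replicas of the input data: resample the data,
# re-estimate the model, and propagate it to the inference (y at x = 2).
# Chatfield's full scheme would also repeat model identification on each
# replica; here the identified model class is held fixed.
preds = []
for _ in range(1000):
    idx = rng.integers(0, x.size, size=x.size)        # bootstrap replica
    slope, intercept = np.polyfit(x[idx], y[idx], 1)  # re-estimation
    preds.append(slope * 2.0 + intercept)             # inference
preds = np.asarray(preds)
print(preds.mean(), preds.std())
```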

[Figure: Bayesian uncertainty and sensitivity analysis (Draper 1995, Planas and Depoutot 2000). The data, the prior of the model(s) and the prior of the parameters yield, by sampling, the posterior of the model(s) and the posterior of the parameters, and hence the inference.]
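A drastically simplified numerical sketch of the Bayesian scheme, with the priors over parameters collapsed to fixed values so that only the model prior is updated (data and candidate models hypothetical):

```python
import numpy as np

# Hypothetical observations (fixed here for reproducibility)
data = np.array([0.9, 1.2, 0.8, 1.1, 1.0, 0.7, 1.3, 1.0])

# Two candidate models, each a Gaussian with known sigma = 1 and a fixed
# mean (a stand-in for the prior of model(s)): M1 centred at 0, M2 at 1
means = np.array([0.0, 1.0])

# Log-likelihood of the data under each candidate model
log_lik = np.array([np.sum(-0.5 * (data - m) ** 2) for m in means])

# Posterior model probabilities from equal prior probabilities (Bayes' rule)
post = np.exp(log_lik - log_lik.max())
post /= post.sum()
print(post)   # posterior weight shifts toward the model the data favour
```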

The space of the model-induced choices (the inference) swells and shrinks as we swell and shrink the space of the input assumptions. How many of the assumptions are relevant at all for the choice? And for those that are relevant, how do they act on the outcome: singly, or in more or less complex combinations? (ANOVA-type analysis) Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis

If I desire a given degree of robustness in the choice, which factors/assumptions should be tested more rigorously? (Look at how much “fixing” any given factor/assumption can potentially reduce the variance of the output.) E.g. can I confidently “fix” a subset of the input factors/assumptions? This is the Beck and Ravetz “relevance” issue. How do I find these factors/assumptions? (Total sensitivity indices.) Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. [Formula slide: fixing a factor X_i gives a reduced variance V(Y|X_i = x_i*); its expected value satisfies E(V(Y|X_i)) = V(Y) - V(E(Y|X_i)). The first-order effect is S_i = V(E(Y|X_i)) / V(Y); the total effect is ST_i = 1 - V(E(Y|X_~i)) / V(Y).]
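The first-order and total effects can be estimated by plain Monte Carlo; a minimal sketch with a toy additive model whose answers are known analytically (S1 = ST1 = 0.2), using common A/B-matrix (pick-freeze) estimators:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

def model(x1, x2):
    # Toy additive model with known answers: S1 = ST1 = 0.2, S2 = ST2 = 0.8
    return x1 + 2.0 * x2

# Two independent input samples (the "A" and "B" matrices of the scheme)
a1, a2 = rng.normal(size=N), rng.normal(size=N)
b1, b2 = rng.normal(size=N), rng.normal(size=N)

y_a = model(a1, a2)      # model runs on A
y_b = model(b1, b2)      # model runs on B
y_ab1 = model(b1, a2)    # runs on A with the X1 column taken from B
var_y = y_a.var()

# First-order effect of X1 (Saltelli-type estimator)
s1 = np.mean(y_b * (y_ab1 - y_a)) / var_y

# Total effect of X1 (Jansen-type estimator)
st1 = np.mean((y_a - y_ab1) ** 2) / (2.0 * var_y)

print(s1, st1)   # both close to 0.2 for this model
```

A factor with a negligible total effect can be confidently fixed without reducing the variance of the output, which is exactly the "relevance" question posed above.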

Environmental Sustainability Index. From “Green and growing”, The Economist, Jan 25th 2001. Produced on behalf of the World Economic Forum (WEF), and presented to the annual Davos summit that year. Robustness...

Mathis Wackernagel, intellectual father of the “Ecological Footprint” and thus an authoritative source in the Sustainable Development expert community, concludes a reasoned critique of the study presented at Davos by noting: Robustness...

"Overall, the report would gain from a more extensive peer review and a sensitivity analysis. The lacking sensitivity analysis undermines the confidence in the results since small changes in the index architecture or the weighting could dramatically alter the ranking of the nations.” Robustness - Wackernagel’s critique

Monte Carlo analysis: countries’ scores for the Technology Development Index (UN), a composite indicator, modified so as to include variability in the weights (e.g. as when using budget allocation or the Analytic Hierarchy Process). Robustness of composite indicator - a worked example

[Figure: Monte Carlo of the score of country A minus the score of country B. Country A is generally better off, mostly due to the weight factors a and b.] Robustness of composite indicator - a worked example
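A sketch of this kind of weight-perturbation Monte Carlo, with hypothetical sub-indicator scores for the two countries and three weights in place of the real index architecture:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Hypothetical sub-indicator scores for two countries (three indicators)
country_a = np.array([0.8, 0.5, 0.9])
country_b = np.array([0.6, 0.7, 0.8])

# Monte Carlo over the weights: draw each weight uniformly, renormalise
w = rng.uniform(0.0, 1.0, size=(N, 3))
w /= w.sum(axis=1, keepdims=True)

# score(A) - score(B) for every sampled weighting
diff = w @ (country_a - country_b)

# Fraction of weightings under which country A comes out on top:
# the ranking is robust only if this is close to 0 or 1
print((diff > 0).mean())
```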

[Figure: scatter-plot of the score of country A minus the score of country B (colours) as a function of the two most important weights, those of the 1st and 2nd indicators (red = negative, blue = zero, green = positive).] Robustness of composite indicator - a worked example

How can I identify model structures in the simultaneous presence of several uncertainty sources? Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis

[Figure, with references: Campolongo et al., 1999, JAC; Saltelli et al., 1995, JAC.]

Is the model-induced choice weak (non-robust) because there is an insufficient number of observations, or because the experts cannot agree on an accepted theory? Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis

Useful inference versus falsification of the analysis. Example: imagine the inference is Y, the logarithm of the ratio between two pressure-on-decision indices (Tarantola et al. 2000): Y = Log(PI 1 / PI 2). [Figure: histogram of the frequency of occurrence of Y, with the regions where incineration or landfill is preferred on either side of zero.]
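A sketch of such an inference, assuming hypothetical lognormal distributions for the two pressure indices:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50_000

# Hypothetical uncertain pressure-on-decision indices
# (lognormal, so they stay positive)
pi1 = rng.lognormal(mean=0.1, sigma=0.3, size=N)
pi2 = rng.lognormal(mean=0.0, sigma=0.3, size=N)

y = np.log(pi1 / pi2)   # Y = Log(PI 1 / PI 2)

# If the histogram of Y peaks clearly on one side of zero, the analysis
# still supports a choice; mass split evenly across zero falsifies it
print((y > 0).mean())
```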

Useful inference versus falsification of the analysis

What happens if I address the space of the policy options? Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis

Gauging the leverage (latitude) of the policy options

Use of models in the scientific discourse … and role of uncertainty - sensitivity analysis. A broader context of knowledge production and its quality as input to the policy process. Merging formal with informal tools. Ongoing work at the JRC with Angela and Silvio.

© ULYSSES: De Marchi et al., 1998.

Conclusions The output from global uncertainty analysis / sensitivity analysis can feed back into the extended peer review process via e.g.:
- refocusing of the critical issues,
- re-assignment of weights for multiple criteria,
- falsification of the inference (or otherwise),
- identification of policy relevance/irrelevance.

Further reading on SA Saltelli et al., Computer Physics Communications, 2002. Saltelli et al., Statistical Science, 2000. Saltelli et al., JASA, 2000. Saltelli et al. (Eds.), Sensitivity Analysis, John Wiley & Sons, Probability and Statistics series (buy it!).