Uncertainty session "Uncertainties in measuring and modelling carbon and greenhouse gas balances in Europe" (JUTF activity), Martin Wattenbach.


CarboEurope standard protocol - aims Treatment, quantification and integration of uncertainties in CarboEurope-IP: there is a strong need for common definitions of terms such as ‘error’, ‘uncertainty’, ‘bias’, ‘systematic’ and ‘random’. Moreover, a harmonised protocol for how uncertainties are to be treated is needed. Standards that we can build on do exist.

Hydrology Community The HarmoniRiB project (project leader: Jens Christian Refsgaard), Uncertainty in Water Resources Management. Paper: Refsgaard, J.C., van der Sluijs, J.P., Hojberg, A.L. and Vanrolleghem, P.A., 2007. Uncertainty in the environmental modelling process - A framework and guidance. Environmental Modelling & Software, 22(11): 1543-1556. Equifinality, data assimilation, and uncertainty estimation in mechanistic modelling of complex environmental systems using the GLUE methodology, by Keith Beven: Beven, K. and Binley, A., 1992. The future of distributed models: model calibration and uncertainty prediction. Hydrological Processes, 6(3): 279-298. Beven, K. and Freer, J., 2001. Equifinality, data assimilation, and uncertainty estimation in mechanistic modelling of complex environmental systems using the GLUE methodology. Journal of Hydrology, 249(1-4): 11-29.

HarmoniRiB The EU’s Water Framework Directive (WFD) is an outcome of EU environmental policy, one of whose basic principles is "to contribute to the pursuit of the objectives of preserving, protecting and improving the quality of the environment in prudent and rational use of natural resources, and to be based on the precautionary principle". As the precautionary principle aims to protect humans and the environment against uncertain risks by means of pre-damage control (anticipatory measures), it cannot be implemented without incorporating uncertainty assessments into the decision-making process.

HarmoniRiB Uncertainty sources
Data
- physical, chemical, biological, etc.
- scale problems (temporal and spatial)
Model
- parameter values
- numerical solution (approximations)
- bugs in model code
- model structure (process equations)
Context - framing of the problem
- multiple framings (ambiguity) among decision makers and stakeholders, including differences in objectives
- external factors not accounted for in the study
- legislation, regulatory conditions, etc.

Equifinality The equifinality concept: "It may be endemic to mechanistic modelling of complex environmental systems that there are many different model structures and many different parameter sets within a chosen model structure that may be behavioural or acceptable in reproducing the observed behaviour of that system." The generalised likelihood uncertainty estimation (GLUE) methodology for model identification is used to treat equifinality. Prediction within this methodology is a process of ensemble forecasting using a sample of parameter sets from the behavioural model space, with each sample weighted according to its likelihood measure to estimate prediction quantiles.
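A minimal sketch of the GLUE idea in Python: sample parameter sets from a prior, keep only the behavioural ones according to an informal likelihood measure, and form weighted prediction quantiles. The toy linear model, the acceptability threshold and the prior ranges are illustrative assumptions, not part of the protocol.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(theta, t):
    # Toy linear model standing in for a process-based simulator.
    return theta[0] + theta[1] * t

# Synthetic observations at three times (invented for illustration).
t_obs = np.array([10.0, 20.0, 30.0])
y_obs = np.array([6.0, 9.0, 11.0])

# 1. Sample parameter sets from uniform priors.
n = 5000
thetas = np.column_stack([rng.uniform(0, 10, n), rng.uniform(0, 1, n)])

# 2. Informal likelihood measure (inverse error sum of squares here),
#    set to zero outside the limits of acceptability (SSE < 5).
sse = np.array([np.sum((model(th, t_obs) - y_obs) ** 2) for th in thetas])
likelihood = np.where(sse < 5.0, 1.0 / np.maximum(sse, 1e-12), 0.0)

behavioural = likelihood > 0
weights = likelihood[behavioural] / likelihood[behavioural].sum()

# 3. Ensemble forecast at a new time, with weighted prediction quantiles.
preds = np.array([model(th, 40.0) for th in thetas[behavioural]])
order = np.argsort(preds)
cdf = np.cumsum(weights[order])
q05 = preds[order][np.searchsorted(cdf, 0.05)]
q95 = preds[order][np.searchsorted(cdf, 0.95)]
print(q05, q95)
```

The key GLUE design choice is that the likelihood measure and the behavioural threshold are subjective: different defensible choices give different prediction bounds.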

Nitro-Europe Definitions For the purposes of this protocol we use the following definitions:
Input: All the information needed to run a model that is not incorporated in the model itself. Input is of three types: (1) initial constants (= values of state variables at the start of the simulation), (2) model parameters, (3) environmental constants and variables.
Model: A computer program that transforms input into output.
Output: Model results for given input.
UA: Uncertainty analysis, i.e. attribution of overall output uncertainty (whose magnitude is determined in the process we call UQ) to the different input uncertainties. UA is specific to a given model and does not address possible errors in coding or model structure.
Uncertainty: Incomplete knowledge. Uncertainty is of three types: (1) input uncertainty, (2) uncertainty about model structure, (3) output uncertainty.
UQ: Uncertainty quantification, i.e. quantification of the model output uncertainty caused by uncertainty in the inputs. The degree to which each input, through the uncertainty associated with it, is responsible for output uncertainty is determined in the process we call UA. UQ does not quantify output uncertainty associated with uncertainty about coding or model structure.

NEU protocol
In NEU, all modellers are committed to:
- Carry out UQ of their model and report: (1) the chosen UQ-method, (2) output uncertainty. The UQ-method is chosen by the modeller but can be one of the recommended methods described in the following sections. UQ is carried out at least twice: near the beginning (year 1 for plot-scale modellers, year 2 for regional-scale modellers) and at the end of the NEU project, but it is recommended to do this whenever significant new data have become available and whenever the model has changed.
In NEU, all modellers are recommended to:
- Carry out UA for their model and report: (1) the chosen UA-method, (2) the inputs that account for most of the output uncertainty. It is recommended to carry out UA after each UQ and to use the results both to improve the modelling and to guide the collection of data.
- Carry out a model comparison, possibly in collaboration with other groups, and report: (1) the method used for model comparison, (2) the relative probabilities of different models, model versions and/or model algorithms (i.e. submodels for specific processes or groups of processes) being correct. Systematic comparison of completely different models is recommended near the end of the project; systematic comparison of different model versions is recommended to be repeated throughout the period of model development in the first years of NEU.

Nitro-Europe - UQ: Monte Carlo uncertainty analysis; Bayesian approach
The uncertainty analysis will be carried out using the Monte Carlo method, which works as follows. Repeat many times (say between 200 and 1000 times):
- generate a realisation (a random sample) from the joint probability distribution function (pdf) of the uncertain model inputs and parameters;
- run the model for this sample of inputs and parameters and store the model output.
Then compute and report statistics of the sample of model outputs (such as the mean, standard deviation, percentiles, and the proportion of the sample above a critical threshold).
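The Monte Carlo recipe above can be sketched in a few lines of Python. The two-input model, the distributions and the threshold are invented for illustration; they stand in for a real flux simulator and its measured input uncertainties.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(temp, rate):
    # Hypothetical stand-in for a process-based carbon-flux model.
    return rate * np.exp(0.1 * temp)

# Joint pdf of the uncertain inputs: independent normals (an assumption).
n_runs = 1000                            # "say between 200 and 1000 times"
temp = rng.normal(10.0, 2.0, n_runs)     # e.g. temperature: mean 10, sd 2
rate = rng.normal(5.0, 0.5, n_runs)      # e.g. base rate: mean 5, sd 0.5

# Run the model for each realisation and store the output.
outputs = model(temp, rate)

# Report statistics of the sample of model outputs.
print("mean:", outputs.mean())
print("sd:  ", outputs.std(ddof=1))
print("95% interval:", np.percentile(outputs, [2.5, 97.5]))
print("P(output > 20):", (outputs > 20).mean())
```

The same loop serves both UQ (the spread of `outputs`) and, by varying one input at a time or regressing outputs on inputs, a first step towards UA.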

Bayesian approach (slide diagram): prior pdf → model runs → Bayesian calibration against data → posterior pdf.

Bayesian approach
P(θ|D) = P(θ) P(D|θ) / P(D)
where P(θ) is the prior pdf for the parameters, P(D|θ) is the likelihood of the data, P(θ|D) is the posterior pdf for the parameters, and P(D) is a scaling constant ( = ∫ P(θ) P(D|θ) dθ ).

Sample of 10^4-10^5 parameter vectors from P(θ|D)
Metropolis MCMC in MATLAB:

chainLength = 100000 ;
% observations: columns are [time, measured value, standard deviation]
data = [10, 6.09, 1.83 ;
        20, 8.81, 2.64 ;
        30, 10.66, 3.27 ] ;
pMinima = [0, 0] ; pMaxima = [10, 1] ;                % uniform prior bounds
pValues = [5, 0.5] ;                                  % initial parameter vector
vcovProposal = diag( (0.1*(pMaxima-pMinima)).^2 ) ;   % proposal covariance
pChain = zeros( chainLength, length(pValues) ) ;
pChain(1,:) = pValues ;
logPrior0 = sum( log( unifpdf( pValues, pMinima, pMaxima ) ) ) ;
% linear model y(t) = p1 + p2*t at the initial parameters
for t = 1:30, y(t) = pValues(1) + pValues(2) * t ; end ;
for i = 1:length(data)
    logLi(i) = -0.5 * ((y(data(i,1))-data(i,2))/data(i,3))^2 ...
               - 0.5 * log(2*pi) - log(data(i,3)) ;
end
logL0 = sum( logLi ) ;
for c = 2 : chainLength
    candidatepValues = mvnrnd( pValues, vcovProposal ) ;
    Prior1 = prod( unifpdf( candidatepValues, pMinima, pMaxima ) ) ;
    if (Prior1 > 0)
        % likelihood of the candidate parameter vector
        for t = 1:30, y(t) = candidatepValues(1) + candidatepValues(2) * t ; end ;
        for i = 1:length(data)
            logLi(i) = -0.5 * ((y(data(i,1))-data(i,2))/data(i,3))^2 ...
                       - 0.5 * log(2*pi) - log(data(i,3)) ;
        end
        logL1 = sum( logLi ) ;
        % Metropolis acceptance test
        logalpha = (log(Prior1)+logL1) - (logPrior0+logL0) ;
        if ( log(rand) < logalpha )
            pValues = candidatepValues ;
            logPrior0 = log(Prior1) ; logL0 = logL1 ;
        end
    end
    pChain(c,:) = pValues ;
end
disp(mean(pChain)) ; disp(cov(pChain)) ;

MCMC trace plots (figure on slide)

Nitro-Europe: open questions
1. Is the model clearly defined: is it clear what is part of the model and what is not, and are all model inputs, parameters and outputs clearly defined?
2. Which of the model inputs and parameters may be treated as certain (‘known’) and which must be treated as uncertain (‘(partially) unknown’)?
3. How can the joint pdf of the uncertain model parameters be obtained?
4. How can the joint pdf of the uncertain model inputs be obtained?
5. How can samples from the joint pdf be generated?
6. How can uncertainties concerning model structure be incorporated?
7. How can the contribution of individual uncertain inputs and uncertain model parameters to the overall model output uncertainty be assessed?
8. How should the technical implementation of the method be organised, and how should the methodology be implemented (automated) and made efficient?
Gerard Heuvelink
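Questions 3-5 concern constructing and sampling a joint pdf of uncertain parameters. One common sketch, assuming the parameters can be described as correlated Gaussians (the means and the covariance matrix below are invented for illustration), samples either directly or via the Cholesky factor of the covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed joint pdf: two correlated Gaussian parameters
# (e.g. an intercept and a slope with negative correlation).
mean = np.array([3.0, 0.25])
cov = np.array([[0.50, -0.05],
                [-0.05, 0.01]])

# Direct multivariate sampling ...
sample = rng.multivariate_normal(mean, cov, size=2000)

# ... or equivalently via the Cholesky factor, useful when only an
# independent standard-normal generator is available.
L = np.linalg.cholesky(cov)          # lower-triangular, L @ L.T == cov
z = rng.standard_normal((2000, 2))
sample2 = mean + z @ L.T

# The empirical covariance should recover the assumed one.
print(np.cov(sample2.T))
```

For non-Gaussian or expensive cases, stratified schemes such as Latin hypercube sampling are often substituted for plain random sampling, but the joint-pdf specification step is the same.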

CarboEurope - protocol There is already a wide range of literature and position papers dealing with uncertainty in either measurements or modelling. For CarboEurope we conclude that a hierarchical approach is needed, starting from a common definition of uncertainty for both areas and then propagating uncertainties from measurement to modelling.

CarboEurope – modelling uncertainty Monte Carlo methods could provide a useful standard for uncertainty estimation. How do we treat the model?
- Black box: considering only input-data uncertainty
- Open box: considering input and parameter uncertainty
Model comparison: Monte Carlo? Bayesian?
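A Bayesian model comparison can be sketched by estimating each model's marginal likelihood P(D|M) as the average likelihood over prior samples and converting to posterior model probabilities. This crude Monte Carlo estimator is shown only to illustrate the idea; it reuses the toy data and uniform priors from the MCMC example, and the constant "model B" is an invented competitor:

```python
import numpy as np

rng = np.random.default_rng(7)

t = np.array([10.0, 20.0, 30.0])
y = np.array([6.09, 8.81, 10.66])      # data from the MCMC example
sd = np.array([1.83, 2.64, 3.27])

def log_lik(pred):
    # Gaussian log-likelihood of the three observations.
    return np.sum(-0.5 * ((pred - y) / sd) ** 2
                  - 0.5 * np.log(2 * np.pi) - np.log(sd))

n = 10000
# Model A: linear y = a + b*t, uniform priors a ~ U(0,10), b ~ U(0,1).
a = rng.uniform(0, 10, n)
b = rng.uniform(0, 1, n)
likA = np.exp([log_lik(ai + bi * t) for ai, bi in zip(a, b)])

# Model B: constant y = c, uniform prior c ~ U(0,10).
c = rng.uniform(0, 10, n)
likB = np.exp([log_lik(np.full(3, ci)) for ci in c])

# Marginal likelihoods P(D|M) ~ mean likelihood over the prior.
mlA, mlB = likA.mean(), likB.mean()

# Equal prior model probabilities: posterior odds equal the Bayes factor.
pA = mlA / (mlA + mlB)
print("P(linear model | D) ~", pA)
```

Note that the marginal likelihood automatically penalises the extra parameter of the linear model, so with these diffuse priors and large measurement errors the "better-fitting" model is not guaranteed to win: fit and prior spread both matter.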

CarboEurope – uncertainty in measurements: open questions Only flux data have reliable uncertainty ranges; soil data and management data have no uncertainty estimates. However, in bottom-up modelling they represent a very important source of variance in the output. We need uncertainty ranges to get any meaningful result. Could we use other data sources, e.g. regional statistics, soil maps, etc.?

Speakers: Keith Paustian, Dario Papale, Christine Moureaux, Christian Beer

Bayesian "Probability theory is the logic of science." "All statements are conditional." "Models cannot be usefully evaluated without comparison to other models." Marcel van Oijen

Topics
- temporal uncertainties in measured data, e.g. gap-filling procedures, u* correction, filtering, etc.
- spatial uncertainties, e.g. representativeness of sample points, interpolation and extrapolation methods, geostatistics
- uncertainties in temporal and spatial scaling approaches
- uncertainty in model applications at the site scale
- uncertainty in model-based up-scaling approaches
- ways to reduce uncertainty in models, e.g. Bayesian approaches, Pareto and others
- ways to identify and reduce uncertainties in measurements