Bayesian analysis of a conceptual transpiration model with a comparison of canopy conductance sub-models
Sudeep Samanta, Department of Forest Ecology and Management, University of Wisconsin - Madison



Motivation
- Mechanistic or process-based conceptual models provide a basis for extrapolation and for scientific understanding.
- Using them requires model testing, parameter estimation, and model discrimination through comparison.
- Problem: most such models are deterministic.

Objectives
- Estimate model parameter and prediction uncertainties using a simple probabilistic error term.
- Develop a methodology for comparing models that accounts for both model fit and complexity.

Bayesian approach
- Probability as the mathematical expression of degree of belief.
- Direct quantification of uncertainty.
- Allows prior knowledge to be incorporated.
- Practical advantages: conceptually simple and computationally feasible.

Bayesian approach
- The unknown parameters β and σ² are treated as random quantities.
- Prior distributions describe knowledge about the parameters before the experiment.
- Posterior distributions express the information gained from the data.
- Bayes' Rule:
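The equation on this slide is an image that did not survive the transcript; for the parameters named above, Bayes' rule can be written as:

```latex
p(\beta, \sigma^2 \mid y)
  = \frac{p(y \mid \beta, \sigma^2)\, p(\beta, \sigma^2)}{p(y)}
  \;\propto\; p(y \mid \beta, \sigma^2)\, p(\beta, \sigma^2).
```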

Uncertainty analysis of a mechanistic transpiration model

Transpiration Data
- Half-hourly measurements made at the ChEAS site, WI, from 05/05/2001 to 09/11/2001:
- Average transpiration from eight sugar maple trees,
- Measurements within canopy at Hay Creek, WI,
- Measurements above canopy at Willow Creek, WI.

Transpiration Model
- Canopy conductance,
- Penman-Monteith equation,
- Probabilistic model.
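The three equations on this slide are images that did not survive the transcript. A standard form of the Penman-Monteith equation with canopy conductance g_c, together with an additive Gaussian error term of the kind described later in the talk, would be (a reconstruction under those assumptions, not the slide's exact notation):

```latex
E \;=\; \frac{\Delta R_n + \rho_a c_p D\, g_a}
             {\lambda \left[\Delta + \gamma \left(1 + g_a / g_c\right)\right]},
\qquad
y_i \;=\; E(x_i; \beta) + \varepsilon_i,
\quad \varepsilon_i \sim N(0, \sigma^2),
```

where Δ is the slope of the saturation vapor pressure curve, R_n net radiation, ρ_a air density, c_p the specific heat of air, D vapor pressure deficit, g_a aerodynamic conductance, γ the psychrometric constant, and λ the latent heat of vaporization.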

Prior and posterior distributions
- Noninformative prior distribution of parameters: β uniformly distributed within specified limits, σ² uniform on log(σ), giving the joint prior density.
- The joint posterior distribution is used for MCMC.
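With β uniform within limits and the prior uniform on log(σ), the joint posterior is proportional to the Gaussian likelihood times 1/σ². A minimal random-walk Metropolis sketch of how such a posterior can be sampled follows; the deterministic model f, the data, and all tuning constants are illustrative stand-ins, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the deterministic transpiration model f(x; beta);
# a hypothetical saturating response, NOT the Penman-Monteith model itself.
def f(x, beta):
    return beta[0] * x / (beta[1] + x)

# Synthetic data for the sketch (not the ChEAS measurements).
x = np.linspace(0.1, 3.0, 50)
y = f(x, [2.0, 0.5]) + rng.normal(0.0, 0.1, x.size)

BETA_LO, BETA_HI = np.array([0.0, 0.0]), np.array([10.0, 10.0])

def log_post(beta, log_sigma):
    # Uniform prior on beta within limits; prior uniform on log(sigma),
    # i.e. p(sigma^2) proportional to 1/sigma^2.
    if np.any(beta < BETA_LO) or np.any(beta > BETA_HI):
        return -np.inf
    sigma2 = np.exp(2.0 * log_sigma)
    resid = y - f(x, beta)
    return -0.5 * x.size * np.log(sigma2) - 0.5 * (resid @ resid) / sigma2

def metropolis(n_iter=20000, step=0.05):
    theta = np.array([1.0, 1.0, np.log(0.2)])  # [beta0, beta1, log_sigma]
    lp = log_post(theta[:2], theta[2])
    chain = np.empty((n_iter, 3))
    for i in range(n_iter):
        prop = theta + step * rng.normal(size=3)   # symmetric proposal
        lp_prop = log_post(prop[:2], prop[2])
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = metropolis()
post = chain[5000:]              # discard burn-in
beta_hat = post[:, :2].mean(axis=0)   # posterior mean estimates
```

Marginal posterior histograms and intervals of the kind shown on the following slides can then be read off the retained samples.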

Histograms of marginal posterior distributions
- Symmetric and well defined.

Residuals
- Residuals are not obviously biased or heteroscedastic.
- Small systematic errors might be present.

Posterior interval for transpiration rate
- Overall, the posterior density regions are consistent with the observations.
- The relationship between observations and predictions is inconsistent from day to day.

Conclusions drawn from uncertainty analysis
- It is possible to obtain an estimate of error variability.
- The Bayesian approach is useful for estimating parameter values and the uncertainties associated with the parameters and predictions.
- These uncertainties may be considerable and should be taken into account when drawing inferences from data.
- The systematic errors suggest considering other transpiration models; collecting other kinds of data might also improve the modeling.

Comparison of canopy conductance sub-models

Model comparison metric
- Deviance Information Criterion (DIC): a penalized criterion that combines Bayesian measures of model complexity and fit.
- Explicitly accounts for prior and posterior distributions.
- Can be used for comparing models of arbitrary structure.
- Can be used in situations where all possible models cannot be specified ahead of time. [Spiegelhalter et al., 2002]

Measure of fit in DIC
- "Bayesian deviance", in this case:
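The deviance expression on this slide is an image that did not survive the transcript; for the Gaussian error model used here it would take the standard form (a reconstruction, not the slide's exact notation):

```latex
D(\beta, \sigma^2) \;=\; -2 \log p(y \mid \beta, \sigma^2)
\;=\; n \log\!\left(2\pi\sigma^2\right)
   + \frac{1}{\sigma^2} \sum_{i=1}^{n} \bigl(y_i - f(x_i; \beta)\bigr)^2 .
```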

Measure of complexity in DIC
- Effective number of parameters, p_D: the posterior mean of the deviance minus the deviance at the posterior estimates of the parameters.
- The measures of fit and complexity are combined to define the model comparison metric, DIC.
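In the notation of Spiegelhalter et al. (2002), p_D = D_bar - D_hat (posterior mean deviance minus deviance at the posterior means) and DIC = D_bar + p_D. Given posterior samples, both are direct averages; a sketch of the computation, using synthetic data and synthetic posterior draws for a simple Gaussian-mean model rather than the study's chains:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data and synthetic posterior draws for the sketch.
n = 100
y = rng.normal(1.0, 0.5, n)
mu_samp = rng.normal(y.mean(), 0.5 / np.sqrt(n), 4000)  # draws of the mean
sig2_samp = np.full(4000, 0.25)                         # variance held fixed

def deviance(mu, sig2):
    # D(theta) = -2 log p(y | theta) for an iid Gaussian likelihood
    return n * np.log(2 * np.pi * sig2) + np.sum((y - mu) ** 2) / sig2

D_samp = np.array([deviance(m, s) for m, s in zip(mu_samp, sig2_samp)])
D_bar = D_samp.mean()                                # posterior mean deviance
D_hat = deviance(mu_samp.mean(), sig2_samp.mean())   # deviance at post. means
p_D = D_bar - D_hat                                  # effective no. of params
DIC = D_bar + p_D
```

With one free parameter (the mean), p_D comes out close to 1, which is the sanity check usually applied to this calculation.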

Canopy conductance sub-model
- Canopy conductance, where:
- g_Smax, highest conductance per unit leaf area (mol m^-2 s^-1),
- L_max, maximum leaf area index,
- D, vapor pressure deficit within canopy (kPa),
- Q_p, average PAR photon flux density (µmol m^-2 s^-1),
- T_c, canopy air temperature (˚C),
- Ψ_s, soil water potential (MPa),
- lf_doy, fitted transpiration (mm s^-1).
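The sub-model equation itself is an image that did not survive the transcript; a Jarvis-type multiplicative form consistent with the variables listed would be (an assumption, not the slide's exact equation):

```latex
g_c \;=\; g_{S\max}\, L\, f_1(D)\, f_2(Q_p)\, f_3(T_c)\, f_4(\Psi_s),
```

where L is leaf area index (bounded by L_max) and each constraint function f_k takes values in [0, 1], reducing conductance below its maximum under limiting conditions.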

Constraint functions
- Estimated parameters: g_Smax, δ, δ_h, A, Ψ_0, T_o, T_lo, T_hi, and lf_scl.
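The constraint functions themselves are images that did not survive the transcript. The sketch below shows commonly used Jarvis-type forms whose parameters broadly match the names listed above (δ, A, Ψ_0, T_o, T_lo, T_hi); the exact functional forms and values used in the study are not reproduced here, so everything in this block is an illustrative assumption:

```python
import numpy as np

def f_vpd(D, delta=0.6):
    # Exponential decline of conductance with vapor pressure deficit (kPa);
    # delta is a hypothetical sensitivity parameter.
    return np.exp(-delta * D)

def f_light(Qp, A=100.0):
    # Saturating response to PAR photon flux density (umol m^-2 s^-1);
    # A is a hypothetical half-saturation constant.
    return Qp / (Qp + A)

def f_temp(T, T_lo=5.0, T_o=25.0, T_hi=45.0):
    # Classic Jarvis temperature response: 0 outside [T_lo, T_hi], 1 at T_o.
    if T <= T_lo or T >= T_hi:
        return 0.0
    b = (T_hi - T_o) / (T_o - T_lo)
    return ((T - T_lo) / (T_o - T_lo)) * ((T_hi - T) / (T_hi - T_o)) ** b

def f_soil(psi, psi_0=-1.5):
    # Linear ramp from 1 at saturation to 0 at soil water potential psi_0 (MPa).
    return float(np.clip(1.0 - psi / psi_0, 0.0, 1.0))
```

Each function maps its driver to [0, 1], so their product scales g_Smax downward, which is what makes the skewed posteriors for Ψ_0, T_o, T_lo, and T_hi on the next slide plausible: near the data's climate range these parameters constrain conductance only weakly.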

Marginal posterior distributions for MC7
- The distributions for Ψ_0, T_o, T_lo, and T_hi are skewed, with very wide 95% posterior intervals.

Results of comparison using DIC
- Increased model complexity did not always improve DIC.

Conclusions drawn from model comparison
- DIC is useful as a metric to identify an appropriate conceptual model for a given set of data, and it helps in the development of more refined models.
- Some of the parameters in an overly complex model may be poorly identified given the information available in the data.
- Future data collection efforts may be directed by the data requirements of proposed models and by the need for better parameter identification.

Future research
- Usability and effectiveness of this methodology with other models and data.
- Use of informative priors.
- Use of other error models.
- Use of the methodology for models with more than one observed output, or with spatially distributed output.