An Introduction to Uncertainty Quantification. By Addison Euhus, with guidance from Edward Phillips

Book and References: Uncertainty Quantification: Theory, Implementation, and Applications, by Ralph C. Smith. Example source/data from mc/

What is Uncertainty Quantification? UQ is a way of determining likely outcomes when specific factors are unknown. Three main sources: parameter, structural, and experimental uncertainty. Algae example: even if we knew the exact concentration of microorganisms in a pond and the water and temperature conditions, small details (e.g., rock positioning, irregular pond shape) would still cause uncertainty

The Algae Example Consider a pond with phytoplankton (algae) A, zooplankton Z, and the nutrient phosphorus P

The Algae Example This can be modeled by a simple predator-prey model
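The exact equations are given in Smith's text; as a rough sketch, a predator-prey system of this kind might be written as follows. The rate expressions and the specific parameter values below are illustrative placeholders, not the book's exact model; only the state variables (A, Z, P) and parameter names (mu, rho's, alpha, k, theta, Q, T, P_in) are taken from the slides.

```python
import numpy as np
from scipy.integrate import solve_ivp

def algae_rhs(t, y, mu, rho_a, rho_z, alpha, k, theta, Q, T, P_in):
    """Illustrative predator-prey dynamics for algae A, zooplankton Z,
    and phosphorus P (placeholder equations, not Smith's exact model)."""
    A, Z, P = y
    # Temperature- and nutrient-limited algal growth rate
    growth = mu * theta**(T - 20.0) * P / (k + P)
    dA = growth * A - rho_a * A - alpha * A * Z - Q * A  # growth, death, grazing, outflow
    dZ = alpha * A * Z - rho_z * Z                       # zooplankton grow by grazing, die at rho_z
    dP = Q * (P_in - P) - growth * A + rho_a * A         # inflow/outflow, uptake, recycling
    return [dA, dZ, dP]

# Hypothetical parameter values, chosen only to produce a well-behaved trajectory
params = dict(mu=0.5, rho_a=0.1, rho_z=0.1, alpha=0.02,
              k=10.0, theta=1.08, Q=0.05, T=20.0, P_in=2.0)
sol = solve_ivp(algae_rhs, (0.0, 120.0), [1.0, 0.1, 5.0],
                args=tuple(params.values()))
A, Z, P = sol.y
```

Simulating the model forward like this is the "forward problem"; the MCMC machinery below addresses the inverse problem of inferring the uncertain parameters from observed concentrations.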

Observations and Parameters Concentrations of A, Z, and P can be measured, as well as the outflow Q, temperature T, and phosphorus inflow P_in. However, the remaining values cannot be measured as easily: the growth rate mu, the rho's, alpha, k, and theta. Because of uncertainty, these are hard to determine using standard methods

Observed Algae Data

Statistical Approach: MCMC The Markov Chain Monte Carlo (MCMC) technique: "specify parameter values that explore the geometry of the distribution." It constructs Markov chains whose stationary distribution is the posterior density, then evaluates realizations of the chain, which sample the posterior and yield a density for the parameters based on the observed values
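As a minimal illustration of the idea, here is a plain random-walk Metropolis sampler (not the DRAM variant used later); the Gaussian target is a stand-in for the algae posterior.

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.5, rng=None):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio).  The chain's stationary
    distribution is the posterior density."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    accepted = 0
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance rule
            theta, lp = prop, lp_prop
            accepted += 1
        chain[i] = theta
    return chain, accepted / n_iter

# Toy stand-in posterior: standard normal in 2D, started away from the mode
chain, acc = metropolis(lambda th: -0.5 * th @ th, [3.0, -3.0], rng=1)
```

After discarding an initial burn-in segment, the remaining samples approximate draws from the target, so histograms of the chain give the parameter densities the slide describes.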

DRAM Algorithm Delayed Rejection Adaptive Metropolis (DRAM) is based on multiple iterations and variance/covariance calculations. It updates the parameter value if it satisfies specific probabilistic conditions and continues to iterate on the initial chain: "Monte Carlo" on the Markov chains

Running the Algorithm Using MATLAB code, the Monte Carlo method runs on the constructed Markov chains (with proposal covariance matrix V). After a certain number of iterations, the chain plots show whether or not the chain has converged to values for the parameters. Once enough iterations have been run, the chain can be examined and the parameter estimates used to predict behavior of the model
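Beyond eyeballing the chain plots, a crude numerical convergence check is to compare batch means across segments of the chain: roughly equal means suggest it has settled. This is an illustrative check only, not the toolbox's built-in diagnostics and not a substitute for formal tests such as Gelman-Rubin.

```python
import numpy as np

def split_means(chain, n_splits=4):
    """Mean of each chain segment, per parameter.  Large disagreement
    between segments indicates the chain has not yet converged."""
    batches = np.array_split(np.asarray(chain), n_splits)
    return np.array([b.mean(axis=0) for b in batches])

# Synthetic stand-in for a converged two-parameter chain
rng = np.random.default_rng(0)
fake_chain = rng.standard_normal((4000, 2))
means = split_means(fake_chain)
spread = means.max(axis=0) - means.min(axis=0)  # small spread -> stable chain
```

In practice one would run this on the post-burn-in portion of the real chain and also track the acceptance rate, which for random-walk samplers is typically tuned to roughly 20-40%.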

Small Iterations

Large Iterations

Results from the Algae Model