Bayesian calibration of earth systems models


Lev Tarasov¹, Radford Neal², and W. R. Peltier²
(1) Memorial University of Newfoundland, (2) University of Toronto, Canada

(1) Context: noisy data and a model with significant computational cost
Data: relative sea level (RSL), geodetic (surface uplift), ice-margin chronology, paleo-lake levels (strandlines), ...
Model: MUN/UofT Glacial Systems Model (GSM): 3D thermo-mechanically coupled ice-sheet model, viscoelastic bedrock response, surface drainage solver, ...
32 ensemble parameters, a non-linear system, and a large, heterogeneous, noisy constraint data set.
Figure 1. RSL site weights and example data

(2) Calibration procedure
Sample from the posterior probability distribution of the ensemble parameters, given their fits to the observational data, using Markov chain Monte Carlo (MCMC) methods.

(3) Calibration validation: neural networks
The networks generally captured most of the model response.
The RSL networks had the weakest fits, owing to their large regional coverage and the associated complexity of the response.
Nevertheless, the overall misfit prediction was reasonably accurate when the RSL network was not overloaded.
Figure 3a. Model results versus neural network predictions
Figure 3b. Negative scaled log-likelihood: model versus network

(4) Calibration performance: MCMC
MCMC chains sometimes get stuck around local minima.
Overall, MCMC sampling produced a much higher density of better-fitting models than an ensemble drawn from a Latin hypercube sample of the prior distribution ("random" in the figure below).
Figure 4. Full misfit metric values versus model runs

(5) Some lessons
Start with the kitchen sink, then shrink the parameter set (using automatic relevance determination).
Disaggregate poorly performing neural networks.
Start with a reduced constraint set and run multiple chains (10+). Consider filtering the constraint set.
Open issues: priors, error models for the constraint data, aggregated metrics, and extra constraints (physicality and model stability).

DATA + MODEL + CALIBRATION = MEANINGFUL PROBABILITY DISTRIBUTION FOR MODEL PREDICTIONS

References
Neal, R. M. (2003), Slice sampling, Ann. Statist., 31, 705-767.
Tarasov, L., and W. R. Peltier (2005), Arctic freshwater forcing of the Younger Dryas cold reversal, Nature, 435, 662-665.
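To give a concrete picture of the neural-network step summarized in section (3), the sketch below trains a small network to predict the model misfit from the scaled ensemble parameters, using a Latin hypercube design of training runs. This is a minimal sketch under simplifying assumptions: gsm_misfit is a hypothetical placeholder for running the GSM and scoring it against the constraint data, the design size of 500 runs is arbitrary, and the poster's own setup differs in detail (e.g., separate networks for different constraint types, such as the RSL networks).

```python
import numpy as np
from scipy.stats import qmc
from sklearn.neural_network import MLPRegressor

# Hypothetical placeholder for running the GSM at a scaled parameter vector
# and scoring it against the constraint data; each real evaluation is costly.
def gsm_misfit(theta):
    return float(np.sum((theta - 0.5) ** 2))

# Latin hypercube design over the 32 scaled ensemble parameters (unit cube;
# in practice each dimension is rescaled to its prior range).
design = qmc.LatinHypercube(d=32, seed=0)
X_train = design.random(n=500)
y_train = np.array([gsm_misfit(x) for x in X_train])

# Fit a small multilayer perceptron as a fast predictor of the misfit.
network = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                       random_state=0).fit(X_train, y_train)

# Validate against held-out runs (cf. Figures 3a/3b: model versus network).
X_test = design.random(n=50)
y_test = np.array([gsm_misfit(x) for x in X_test])
rmse = np.sqrt(np.mean((network.predict(X_test) - y_test) ** 2))
print(f"held-out misfit RMSE: {rmse:.3f}")
```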
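Under the same caveats, a second sketch illustrates the sampling step from section (2): MCMC over the 32 scaled parameters with a flat prior and a placeholder neg_log_likelihood. For simplicity it uses random-walk Metropolis, whereas the calibration cited above used slice sampling (Neal, 2003), and in practice the likelihood would come from the fit of the GSM (or its neural-network predictor) to the constraint data.

```python
import numpy as np

# Placeholder for the negative scaled log-likelihood of the constraint data;
# in a real calibration this would be the (emulated) GSM misfit.
def neg_log_likelihood(theta):
    return 50.0 * float(np.sum((theta - 0.5) ** 2))

def log_prior(theta):
    # Flat prior over the scaled parameter box [0, 1]^32.
    return 0.0 if np.all((theta >= 0.0) & (theta <= 1.0)) else -np.inf

def log_posterior(theta):
    lp = log_prior(theta)
    return lp - neg_log_likelihood(theta) if np.isfinite(lp) else -np.inf

def metropolis(theta0, n_steps=20000, step=0.05, seed=0):
    """Random-walk Metropolis sampling of the parameter posterior."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.size)
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:  # Metropolis accept/reject
            theta, logp = prop, logp_prop
        chain[i] = theta
    return chain

chain = metropolis(np.full(32, 0.5))
print("posterior means (first 5 parameters):", chain.mean(axis=0)[:5])
```

As noted in section (5), such sampling would normally be repeated over multiple chains (10+) and checked for chains stuck near local minima before the posterior samples are used for prediction.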