Bayesian parameter estimation in cosmology with Population Monte Carlo
Darell Moodley (UKZN)
Supervisor: Prof. K. Moodley (UKZN)
SKA Postgraduate Conference, 29 Nov 2010

- Estimate cosmological parameters for specified models efficiently.
- Quantitatively discriminate one model from another in light of the data (model testing).
- Other relevant applications of parameter estimation include optimising experimental configurations, e.g. MeerKAT antenna parameters.

- Bayes' theorem provides an expression for the posterior probability, which contains all the uncertainty regarding the parameters of interest.
- The posterior is difficult to evaluate directly because of the normalising constant.
- Solution: use simulation to draw samples from this distribution.
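For reference, this is the usual statement of Bayes' theorem with its normalising constant (the Bayesian evidence), written with generic symbols d for the data and θ for the parameters (notation assumed here rather than taken from the slides):

\[
p(\theta \mid d) = \frac{\mathcal{L}(d \mid \theta)\,\pi(\theta)}{p(d)},
\qquad
p(d) = \int \mathcal{L}(d \mid \theta)\,\pi(\theta)\, d\theta .
\]

The integral in p(d) runs over the full parameter space, which is why the posterior is hard to evaluate directly and why sampling methods are used instead.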

Population Monte Carlo (PMC)
- An adaptive version of importance sampling.
- Constructs a sequence of samples that provide progressively improved estimates of the parameters.
- Based on the fundamental importance-sampling identity: for a target density f and importance function q,
  E_f[h(x)] = ∫ h(x) (f(x)/q(x)) q(x) dx,
  so if x_1, ..., x_N are drawn from q, we estimate E_f[h(x)] by (1/N) Σ_n h(x_n) f(x_n)/q(x_n).
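As a concrete illustration of the identity, the following sketch estimates a mean under a target density f using draws from a different proposal q; the densities and numbers are illustrative only, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Hypothetical 1-D example: target f = N(2, 1), importance function q = N(0, 3^2).
N = 100_000
x = rng.normal(0.0, 3.0, size=N)                     # x_n drawn from q
w = norm_pdf(x, 2.0, 1.0) / norm_pdf(x, 0.0, 3.0)    # importance weights f(x_n)/q(x_n)

# Estimate E_f[h(x)] with h(x) = x; the true value is 2.
print(np.mean(w * x))
```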

Methodology
- Draw samples from the importance function q(x) = Σ_{d=1}^{D} α_d φ_d(x), where the α_d are D component weights that give the proportion of the sample taken from each mixture density φ_d, with parameters θ_d.
- Allocate importance weights to the samples, w_n = f(x_n)/q(x_n), where f is the target distribution (a minimal code sketch of this step follows below).
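A minimal sketch of this sampling-and-weighting step for a one-dimensional Gaussian mixture importance function; `log_target`, the component values and the sample size are illustrative assumptions, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Illustrative unnormalised target density (a standard normal here).
    return -0.5 * x**2

# Mixture importance function q(x) = sum_d alpha_d * N(x; mu_d, sigma_d^2).
alphas = np.array([0.3, 0.4, 0.3])   # component weights, summing to 1
mus    = np.array([-2.0, 0.0, 2.0])
sigmas = np.array([1.0, 1.0, 1.0])

N = 10_000
comp = rng.choice(len(alphas), size=N, p=alphas)   # choose a component for each draw
x = rng.normal(mus[comp], sigmas[comp])            # sample from the chosen component

# Evaluate q(x_n) as the full mixture density.
q = np.sum(
    alphas * np.exp(-0.5 * ((x[:, None] - mus) / sigmas) ** 2) / (sigmas * np.sqrt(2.0 * np.pi)),
    axis=1,
)

# Importance weights f(x_n)/q(x_n), then normalised weights.
w = np.exp(log_target(x)) / q
w_bar = w / np.sum(w)
```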

Updating rule
At each iteration the component weights, means and covariances of the mixture are updated using the current weighted sample, so that the importance function moves towards the target distribution (see the sketch below).
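One common way to perform this adaptation is to refit the mixture to the importance-weighted sample. The sketch below follows the spirit of the Rao-Blackwellised update of Cappé et al. (2008) in one dimension; it is an assumption-laden illustration, since the slides do not show the exact formulae used:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def update_mixture(x, w_bar, alphas, mus, sigmas):
    """One adaptation step: refit the Gaussian components to the weighted sample.

    x      : samples drawn from the current mixture q
    w_bar  : normalised importance weights (summing to 1)
    Returns the updated (alphas, mus, sigmas).
    """
    # Responsibility of component d for sample n: rho_{n,d} = alpha_d phi_d(x_n) / q(x_n).
    phi = alphas * normal_pdf(x[:, None], mus, sigmas)
    rho = phi / phi.sum(axis=1, keepdims=True)

    # Importance-weighted moment updates for weights, means and variances.
    new_alphas = np.sum(w_bar[:, None] * rho, axis=0)
    new_mus = np.sum(w_bar[:, None] * rho * x[:, None], axis=0) / new_alphas
    new_sigmas = np.sqrt(
        np.sum(w_bar[:, None] * rho * (x[:, None] - new_mus) ** 2, axis=0) / new_alphas
    )
    return new_alphas, new_mus, new_sigmas
```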

Mixture densities with iteration: the sum of the mixture densities iteratively approaches the target distribution.

Convergence of the importance function: convergence is reached when the importance function adequately resembles the target distribution.

- Both methods generate samples that are representative of complex distributions.
- MCMC draws from a proposal distribution, while PMC draws from an importance function that can be chosen to be a mixture of densities.
- PMC can reduce computational time: since the samples are not correlated, there is no 'burn-in' period.
- Like MCMC, PMC can be parallelised, which keeps it computationally feasible.
- Each iteration produces an independent sample, so the run can be stopped at any time.

Illustrative example (Wraith et al. 2009): a banana-shaped distribution that we wish to simulate draws from; the figure shows the updated importance function after 11 iterations.
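For context, the banana-shaped target used in such demonstrations is typically a 'twisted' Gaussian; a minimal two-dimensional log-density sketch, with illustrative values for the bananicity b and scale sigma1 (not taken from the talk):

```python
import numpy as np

def log_banana(x1, x2, b=0.03, sigma1=10.0):
    """Log-density (up to a constant) of a 2-D twisted, banana-shaped Gaussian.

    Built from N(0, diag(sigma1^2, 1)) by shifting the second coordinate
    by b * (x1^2 - sigma1^2); the transformation has unit Jacobian.
    """
    y2 = x2 + b * (x1**2 - sigma1**2)
    return -0.5 * (x1**2 / sigma1**2 + y2**2)
```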

- Optimisation: search regions of high likelihood to determine the optimal parameters, and determine maximum-likelihood estimates.
- Model selection: the Bayesian evidence can be computed from existing chains, and hence the Bayes factor for different models; the evidence is immediately accessible from the sample used for parameter estimation.

Model testing (Kilbinger et al. 2010)
- PMC can be used to determine the Bayes factor, which is used to discriminate between models.
- Test extensions of the standard model with dark-energy and curvature scenarios.

- Do a systematic study of the PMC method.
- Examine how the efficiency of the algorithm depends on its tuning parameters.
- Make a quantitative comparison between the efficiency of PMC and MCMC.
- Estimate cosmological parameters using current data, and discriminate between cosmological models.

References

THE END THANK YOU

Adaptive importance sampling
- Use the Kullback-Leibler distance as the measure of agreement between the importance function and the target.
- Incorporate mixture densities, q(x) = Σ_{d=1}^{D} α_d φ_d(x), with D component weights α_d such that α_d ≥ 0 and Σ_d α_d = 1.

Estimator for the evidence
Using importance sampling, the evidence E = ∫ L(θ) π(θ) dθ is estimated by
Ê = (1/N) Σ_{n=1}^{N} w_n,  where  w_n = L(θ_n) π(θ_n) / q(θ_n)
are the importance weights for importance distribution q. The variance of the estimator is σ² = Var_q(w) / N. We want to choose the optimal q such that σ is minimised (a code sketch follows below).
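A minimal sketch of this evidence estimate from an existing weighted sample; `log_like`, `log_prior` and `log_q` are assumed user-supplied vectorised functions (illustrative names):

```python
import numpy as np

def evidence_estimate(theta, log_like, log_prior, log_q):
    """Importance-sampling estimate of the evidence from samples theta drawn from q."""
    log_w = log_like(theta) + log_prior(theta) - log_q(theta)   # log importance weights
    # E_hat = mean(exp(log_w)), computed in a numerically stable way.
    m = np.max(log_w)
    return np.exp(m) * np.mean(np.exp(log_w - m))
```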

Diagnostics
We want the importance function to be as close as possible to the target, i.e. to minimise the Kullback-Leibler divergence between them, so the normalised perplexity,
p = exp(H_N) / N,  with  H_N = −Σ_{n=1}^{N} w̄_n ln w̄_n,
is used as an estimate of the agreement, where w̄_n are the normalised importance weights; values of p close to 1 indicate a good fit.
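A short sketch of the perplexity computation from log importance weights (generic, assuming only NumPy):

```python
import numpy as np

def normalised_perplexity(log_w):
    """Normalised perplexity exp(H_N)/N of a set of log importance weights.

    H_N is the Shannon entropy of the normalised weights; values near 1
    indicate that the importance function matches the target well.
    """
    log_w = log_w - np.max(log_w)                      # stabilise before exponentiating
    w_bar = np.exp(log_w) / np.sum(np.exp(log_w))
    H = -np.sum(w_bar * np.log(w_bar + 1e-300))        # Shannon entropy of the weights
    return np.exp(H) / len(log_w)
```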

Application to cosmology
- Compare the flat ΛCDM (cosmological-constant) model to dark-energy models.
- A flat (Ω_k = 0) and a curved (Ω_k ≠ 0) version is assumed for each dark-energy model.

Priors for dark energy and curvature models

Specifying the PMC parameters
For the dark-energy models:
- T = 10 iterations, which can be increased if the perplexity is still low.
- N = 7 500 samples per iteration.
- D = 10 mixture components.
- N/D should not be chosen too small, to ensure a numerically stable update of each component.
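Collected into a small configuration sketch (the field names are illustrative assumptions, not from any particular PMC code):

```python
# Illustrative PMC settings for the dark-energy runs described above.
pmc_settings = {
    "n_iterations": 10,    # T: increase if the perplexity is still low
    "n_samples": 7500,     # N: importance samples per iteration
    "n_components": 10,    # D: mixture components
}

# Keep N/D large enough for a numerically stable update of each component.
samples_per_component = pmc_settings["n_samples"] / pmc_settings["n_components"]
```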

Results: the standard ΛCDM model is favoured.

Testing stability: repeat the PMC runs 25 times.

Primordial fluctuation models
The dark matter density fluctuations are described by the primordial power spectrum, including tensor modes; the spectral parameters are then re-expressed in terms of the slow-roll parameters.
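For reference, the standard first-order slow-roll relations that such a reparametrisation typically rests on (generic textbook results, not taken from the slides):

\[
n_s - 1 \simeq 2\eta - 6\epsilon, \qquad n_t \simeq -2\epsilon, \qquad r \simeq 16\epsilon,
\]

where ε and η are the slow-roll parameters, n_s and n_t the scalar and tensor spectral indices, and r the tensor-to-scalar ratio.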

Assumptions and results: priors and the corresponding results.

Constraining parameters and model discrimination (Kilbinger et al. 2010)
- PMC can be used to determine the Bayes factor, which is used to discriminate between models.
- PMC is also used to constrain the dark-energy equation of state using various data.

Jeffreys’ scale