Slide 1: Monte Carlo Methods
T-61.182 Special Course in Information Science II
Tomas Ukkonen


Slide 2: The problem
1. Generate samples {x^(r)} from a given probability distribution P(x).
2. Estimate expectations Φ = ∫ φ(x) P(x) dx.
The second problem can be solved using random samples from P(x), via the estimator Φ̂ = (1/R) Σ_r φ(x^(r)).
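A minimal sketch of this estimator in Python (the standard-normal target and the test function φ(x) = x² are illustrative choices, not from the slides):

```python
import random

def mc_estimate(phi, sampler, n=100_000):
    # Plain Monte Carlo: Phi_hat = (1/n) * sum phi(x_i), with x_i drawn from P
    return sum(phi(sampler()) for _ in range(n)) / n

random.seed(0)
# E[x^2] under a standard normal equals its variance, i.e. 1.
est = mc_estimate(lambda x: x * x, lambda: random.gauss(0.0, 1.0))
```

The estimator is unbiased and its error shrinks as 1/sqrt(n), independent of the dimension of x, which is the basic appeal of Monte Carlo.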

Slide 3: Why is sampling hard?
- Densities may be unnormalised: it is hard to know how probable a given point is when the rest of the function is unknown.
- The curse of dimensionality.

Slide 4: Brute-force method
- Why not just compute the expected value directly?
- The problem grows exponentially as a function of the dimension d: the number of states to check grows exponentially.
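The exponential growth is easy to make concrete: a grid with k points per axis in d dimensions needs k^d evaluations (k = 50 below is an arbitrary illustrative choice):

```python
# Brute-force grid evaluation needs k**d points for k points per axis
# in d dimensions; k = 50 is an arbitrary illustrative resolution.
k = 50
counts = {d: k ** d for d in (1, 2, 5, 10)}
# Already at d = 10 this is ~9.8e16 evaluations, hopelessly many.
print(counts[10])
```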

Slide 5: Brute-force method, cont.
- Going through most of the states is likely to be unnecessary.
- High-dimensional, low-entropy densities are often concentrated in small regions of the space.

Slide 6: Uniform sampling
- For low-dimensional problems: just sample uniformly and weight each sample with the (possibly unnormalised) target density P*(x).
- The number of samples required for reliable estimates still grows exponentially with dimension.
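A sketch of uniform sampling with weighting, for a one-dimensional toy target (the unnormalised Gaussian and the interval [-6, 6] are illustrative assumptions):

```python
import random, math

def uniform_weighted(phi, p_unnorm, lo, hi, n=200_000):
    # Draw x uniformly on [lo, hi]; weight each sample by P*(x).
    # The unknown normalisation constant cancels in the ratio.
    num = den = 0.0
    for _ in range(n):
        x = random.uniform(lo, hi)
        w = p_unnorm(x)
        num += w * phi(x)
        den += w
    return num / den

random.seed(1)
# Target: unnormalised N(0, 1); estimate E[x^2] = 1.
p = lambda x: math.exp(-0.5 * x * x)
est = uniform_weighted(lambda x: x * x, p, -6.0, 6.0)
```

In one dimension this works fine; in high dimensions almost all uniform samples land where P* is negligible, which is the exponential blow-up the slide refers to.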

Slide 7: Importance sampling
- Idea: approximate the complicated distribution P with a simpler proposal distribution Q that we can sample from.
- Only works well when the correct shape of the distribution is roughly known.
- Does not scale to high dimensions, even when the approximation is almost right.
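A self-normalised importance-sampling sketch (the N(0, 1) target and the wider N(0, 2) proposal are illustrative choices):

```python
import random, math

def importance_sample(phi, p_unnorm, q_sample, q_pdf, n=200_000):
    # Draw from the simple proposal Q, weight by w = P*(x) / Q(x);
    # self-normalising, so P* need not be normalised.
    num = den = 0.0
    for _ in range(n):
        x = q_sample()
        w = p_unnorm(x) / q_pdf(x)
        num += w * phi(x)
        den += w
    return num / den

random.seed(2)
# Target: unnormalised N(0, 1); proposal Q: a wider N(0, 2).
p = lambda x: math.exp(-0.5 * x * x)
q_pdf = lambda x: math.exp(-x * x / 8.0) / (2.0 * math.sqrt(2.0 * math.pi))
est = importance_sample(lambda x: x * x, p,
                        lambda: random.gauss(0.0, 2.0), q_pdf)
```

If Q is much narrower than P, a few samples get enormous weights and the estimator's variance explodes; in high dimensions this mismatch is almost unavoidable, which is the scaling failure the slide mentions.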

Slide 8: Rejection sampling
- An alternative approximation-based sampling method.
- Sample (x, u) uniformly under the envelope cQ(x) (that is, x ~ Q and u uniform on [0, cQ(x)]), and reject samples where u > P*(x).
- Does not scale to high dimensions.
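A rejection-sampling sketch (the N(0, 1) target, N(0, 2) proposal, and constant c = 8 are illustrative assumptions; c is chosen generously so that cQ(x) ≥ P*(x) everywhere):

```python
import random, math

def rejection_sample(p_unnorm, q_sample, q_pdf, c, n=50_000):
    # Propose x ~ Q, draw u uniform on [0, c*Q(x)], accept if u <= P*(x).
    # Accepted points are exact draws from the normalised P.
    out = []
    while len(out) < n:
        x = q_sample()
        u = random.uniform(0.0, c * q_pdf(x))
        if u <= p_unnorm(x):
            out.append(x)
    return out

random.seed(3)
p = lambda x: math.exp(-0.5 * x * x)                      # unnormalised N(0, 1)
q_pdf = lambda x: math.exp(-x * x / 8.0) / (2.0 * math.sqrt(2.0 * math.pi))
c = 8.0  # must satisfy c * Q(x) >= P*(x) for all x (here max P*/Q ~ 5.01)
xs = rejection_sample(p, lambda: random.gauss(0.0, 2.0), q_pdf, c)
mean = sum(xs) / len(xs)
m2 = sum(x * x for x in xs) / len(xs)  # second moment, should be near 1
```

The acceptance rate is 1/c times the ratio of normalisation constants; in high dimensions a valid c typically grows exponentially, so almost everything is rejected.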

Slide 9: The Metropolis-Hastings method
- The previous approaches do not scale to high dimensions.
- In the Metropolis algorithm the sampling distribution depends on the samples drawn so far: proposals are made from the chain's current state.

Slide 10: Metropolis-Hastings, cont.
- A new state is drawn from a proposal distribution and accepted with a probability that guarantees convergence to the target density.
- The method does not depend directly on the dimensionality of the problem, but the samples are correlated, and random-walk exploration of the target is slow.
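A random-walk Metropolis sketch for a one-dimensional target (the N(0, 1) target, the Gaussian proposal, and all tuning constants are illustrative assumptions; working in log space avoids underflow):

```python
import random, math

def metropolis(log_p, x0, step, n, burn=1_000):
    # Random-walk Metropolis: propose x' = x + N(0, step^2) and accept
    # with probability min(1, P*(x') / P*(x)); P* may be unnormalised.
    x, samples = x0, []
    for i in range(n + burn):
        xp = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_p(xp) - log_p(x):
            x = xp
        if i >= burn:
            samples.append(x)  # note: kept even on rejection (correlated)
    return samples

random.seed(4)
# Target known only up to a constant: log P*(x) = -x^2 / 2.
xs = metropolis(lambda x: -0.5 * x * x, x0=5.0, step=1.0, n=100_000)
mean = sum(xs) / len(xs)
```

Because consecutive samples are correlated, the effective sample size is much smaller than the raw count, which is the random-walk slowness the slide refers to.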

Slide 11: Gibbs sampling
- A special case of the Metropolis method in which only a single dimension is updated per iteration.
- Useful when only the conditional densities are known.
- One-dimensional distributions are easier to work with.
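A Gibbs-sampling sketch for a zero-mean bivariate normal with correlation ρ (an illustrative target chosen because both full conditionals are one-dimensional Gaussians in closed form):

```python
import random, math

def gibbs_bivariate_normal(rho, n=50_000, burn=1_000):
    # Alternately resample each coordinate from its full conditional:
    #   x | y ~ N(rho * y, 1 - rho^2),   y | x ~ N(rho * x, 1 - rho^2)
    x = y = 0.0
    s = math.sqrt(1.0 - rho * rho)
    samples = []
    for i in range(n + burn):
        x = random.gauss(rho * y, s)
        y = random.gauss(rho * x, s)
        if i >= burn:
            samples.append((x, y))
    return samples

random.seed(5)
pairs = gibbs_bivariate_normal(rho=0.8)
corr = sum(a * b for a, b in pairs) / len(pairs)  # sample E[xy], ~ rho
```

Every proposal is accepted (the acceptance probability of this Metropolis special case is 1), but strongly correlated coordinates make the chain mix slowly.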

Slide 12: Gibbs sampling, cont.

Slide 13: Slice sampling
- A newer method that combines ideas from rejection, Gibbs, and Metropolis sampling.
- Still a random-walk method, but with a self-tuning step length.
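A one-dimensional slice-sampling sketch with the stepping-out and shrinkage procedures (the N(0, 1) target and the initial width w are illustrative assumptions):

```python
import random, math

def slice_sample(log_p, x0, w, n, burn=1_000):
    # Slice sampling: draw a height u uniformly under the density, then
    # sample x uniformly from the horizontal slice {x : P*(x) > u}. The
    # slice is located by stepping out and then shrinking the interval,
    # which is what makes the step length self-tuning.
    x, samples = x0, []
    for i in range(n + burn):
        log_u = log_p(x) + math.log(random.random())  # height under P*(x)
        lo = x - w * random.random()                  # randomly positioned
        hi = lo + w                                   # initial interval
        while log_p(lo) > log_u:                      # step out to the left
            lo -= w
        while log_p(hi) > log_u:                      # step out to the right
            hi += w
        while True:                                   # shrink until accepted
            xp = random.uniform(lo, hi)
            if log_p(xp) > log_u:
                x = xp
                break
            if xp < x:
                lo = xp
            else:
                hi = xp
        if i >= burn:
            samples.append(x)
    return samples

random.seed(6)
xs = slice_sample(lambda x: -0.5 * x * x, x0=3.0, w=2.0, n=50_000)
mean = sum(xs) / len(xs)
```

Unlike Metropolis, a badly chosen w only costs extra density evaluations per iteration; it does not change the stationary distribution.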

Slide 14: Slice sampling, cont.
- A faster integer-based variant of the algorithm has also been developed.

Slide 15: Slice sampling, cont.

Slide 16: Slice sampling, cont.

Slide 17: Practical issues
- It is hard to know for certain when a Monte Carlo simulation has converged.
- Calculating the normalisation constant is difficult.
- Allocation of computational resources: one long simulation or several shorter ones?
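One common (heuristic, not conclusive) convergence check that exploits several shorter chains is the Gelman-Rubin statistic, which compares between-chain and within-chain variance; a sketch, with illustrative fake chains of independent N(0, 1) draws standing in for real MCMC output:

```python
import random, math

def gelman_rubin(chains):
    # R-hat: sqrt of (pooled variance estimate / within-chain variance).
    # Values near 1 are consistent with convergence; values well above 1
    # mean the chains have not yet mixed into the same distribution.
    m, n = len(chains), len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    b = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)   # between
    w = sum(sum((x - mu) ** 2 for x in c) / (n - 1)            # within
            for c, mu in zip(chains, means)) / m
    var_hat = (n - 1) / n * w + b / n
    return math.sqrt(var_hat / w)

random.seed(7)
# Four chains that all sample the same target should give R-hat ~ 1.
chains = [[random.gauss(0.0, 1.0) for _ in range(5_000)] for _ in range(4)]
rhat = gelman_rubin(chains)
```

R-hat near 1 does not prove convergence (all chains could be stuck in the same mode), which is why the slide says convergence can never be known for certain.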

Slide 18: Practical issues II
- Big models: with the Metropolis method and Gibbs sampling, update variables in batches.
- How many samples? It depends on how much accuracy is needed; typically a modest number of independent samples is enough.

Slide 19: Exercises & references
- Exercises NN.N
- David J.C. MacKay: Information Theory, Inference, and Learning Algorithms, 2003.