Monte Carlo Methods
T-61.182 Special Course in Information Science II
Tomas Ukkonen, tomas.ukkonen@iki.fi
Problem
1. Generate samples from a given probability distribution P(x).
2. Estimate expectations of functions under P(x).
The second problem can be solved using random samples drawn from P(x).
Why is sampling hard?
- Densities may be unnormalized: it is hard to know how probable a given point is when the function is unknown everywhere else (the normalizing constant is unknown).
- The curse of dimensionality.
Brute force method
- Why not just compute the expected value directly?
- Because the problem grows exponentially with the dimension d: the number of states to check grows exponentially.
Brute force method, cont.
- Going through most of the states is likely to be unnecessary.
- High-dimensional, low-entropy densities are often concentrated in small regions.
Uniform sampling
- For low-dimensional problems: sample uniformly and weight each sample by the (possibly unnormalized) density P*(x).
- The number of samples required for a reliable estimate still grows exponentially with dimension.
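As a sketch of the idea, assuming an unnormalized one-dimensional density `p_star` and a test function `phi` (both names illustrative):

```python
import random
import math

def uniform_sampling_estimate(p_star, phi, lo, hi, n=100_000, seed=0):
    """Estimate the expectation of phi(x) under the density proportional
    to p_star by sampling x uniformly on [lo, hi] and weighting by p_star(x)."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.uniform(lo, hi)
        w = p_star(x)            # unnormalized density acts as the weight
        num += w * phi(x)
        den += w
    return num / den             # the normalization constant cancels in the ratio

# Example: the mean of a standard Gaussian restricted to [-5, 5] is ~0.
p_star = lambda x: math.exp(-0.5 * x * x)
est = uniform_sampling_estimate(p_star, lambda x: x, -5.0, 5.0)
```

In one dimension this works well; the slide's point is that the fraction of uniform samples landing where P*(x) is non-negligible shrinks exponentially as d grows.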
Importance sampling
- Idea: approximate the complicated distribution with a simpler proposal distribution Q(x) and reweight the samples.
- Only works when the correct shape of the distribution is roughly known.
- Does not scale to high dimensions, even when the approximation is almost right.
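A minimal one-dimensional sketch, assuming a standard Gaussian proposal Q and an unnormalized target `p_star` (names and example target are illustrative):

```python
import random
import math

def importance_sampling_estimate(p_star, phi, n=100_000, seed=1):
    """Estimate the expectation of phi(x) under the density proportional to
    p_star, drawing samples from a simpler proposal Q = N(0, 1) and
    reweighting them by w = p_star(x) / Q(x)."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)                          # sample from proposal Q
        q = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
        w = p_star(x) / q                                # importance weight
        num += w * phi(x)
        den += w
    return num / den                                     # self-normalized estimate

# Example target: unnormalized Gaussian with mean 1 and sd 0.5.
p_star = lambda x: math.exp(-0.5 * ((x - 1.0) / 0.5) ** 2)
est = importance_sampling_estimate(p_star, lambda x: x)
```

The self-normalized form divides by the summed weights, so p_star never needs to be normalized; the failure mode in high dimensions is that a few huge weights dominate the sum.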
Rejection sampling
- An alternative approximation-based sampling method.
- Sample uniformly from the region under the envelope cQ(x), i.e. draw pairs (x, u) with x ~ Q(x) and u ~ Uniform(0, cQ(x)), and reject samples where u > P(x).
- Does not scale to high dimensions: the constant c needed to dominate P(x), and hence the rejection rate, typically grows exponentially.
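A minimal sketch of the accept/reject loop, assuming a one-dimensional target `p` and an envelope cQ(x) that dominates it everywhere (names and the example target are illustrative):

```python
import random

def rejection_sample(p, q_sample, c_q_density, n, seed=2):
    """Draw n exact samples from the density p, given a proposal sampler
    q_sample and an envelope c*Q(x) satisfying c*Q(x) >= p(x) everywhere."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = q_sample(rng)                       # draw x from the proposal Q
        u = rng.uniform(0.0, c_q_density(x))    # uniform height under cQ(x)
        if u <= p(x):                           # keep the point if it lies under p
            out.append(x)
    return out

# Example: triangular density p(x) = 2x on [0, 1] under the envelope
# c*Q(x) = 2, with Q uniform on [0, 1] and c = 2.
samples = rejection_sample(
    p=lambda x: 2.0 * x,
    q_sample=lambda rng: rng.uniform(0.0, 1.0),
    c_q_density=lambda x: 2.0,
    n=20_000,
)
mean = sum(samples) / len(samples)   # true mean of p is 2/3
```

Here the acceptance rate is 1/c = 1/2; in high dimensions c tends to blow up, which is exactly why the method does not scale.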
The Metropolis-Hastings method
- The previous approaches do not scale to high dimensions.
- In the Metropolis algorithm the proposal distribution depends on the current state, i.e. on the samples drawn so far.
The Metropolis-Hastings method, cont.
- A new state is drawn from a proposal distribution and accepted with a probability chosen so that the chain converges to the target density.
- The method does not depend directly on the dimensionality of the problem, but the samples are correlated and random-walk exploration is slow.
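The accept/reject rule above can be sketched as a random-walk Metropolis sampler; the example target here is a standard Gaussian known only up to its normalization (names are illustrative):

```python
import random
import math

def metropolis(log_p_star, x0, step, n, seed=3):
    """Random-walk Metropolis: propose x' = x + N(0, step^2) and accept
    with probability min(1, p*(x') / p*(x)). Only an unnormalized density
    is needed, since the constant cancels in the ratio."""
    rng = random.Random(seed)
    x, lp = x0, log_p_star(x0)
    chain = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)           # symmetric random-walk proposal
        lpp = log_p_star(xp)
        if lpp >= lp or rng.random() < math.exp(lpp - lp):
            x, lp = xp, lpp                     # accept the move
        chain.append(x)                         # note: samples are correlated
    return chain

# Example target: standard Gaussian, up to a constant.
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=50_000)
mean = sum(chain) / len(chain)
```

Because rejected proposals repeat the current state, consecutive samples are correlated, and the random-walk step length must be tuned by hand, which is the weakness the later slides address.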
Gibbs sampling
- A special case of the Metropolis method in which only a single dimension is updated per iteration.
- Useful when only the conditional densities are known.
- One-dimensional distributions are easier to work with.
Gibbs sampling, cont.
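As a concrete sketch of the alternating conditional updates, consider a Gibbs sampler for a bivariate standard Gaussian with correlation rho, where both conditionals are known in closed form (the example target and names are illustrative):

```python
import random
import math

def gibbs_bivariate_gaussian(rho, n, seed=4):
    """Gibbs sampling for a bivariate standard Gaussian with correlation rho:
    alternately resample each coordinate from its conditional,
    x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)    # conditional standard deviation
    x1 = x2 = 0.0
    chain = []
    for _ in range(n):
        x1 = rng.gauss(rho * x2, sd)   # update one dimension at a time
        x2 = rng.gauss(rho * x1, sd)
        chain.append((x1, x2))
    return chain

chain = gibbs_bivariate_gaussian(rho=0.8, n=50_000)
m1 = sum(x for x, _ in chain) / len(chain)          # ~ 0
corr = sum(x * y for x, y in chain) / len(chain)    # ~ rho
```

Every conditional draw is accepted, but with strongly correlated coordinates the axis-aligned moves are small, so the chain still random-walks slowly through the joint density.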
Slice sampling
- A newer method that combines ideas from rejection, Gibbs, and Metropolis sampling.
- Still a random-walk method, but with a self-tuning step length.
Slice sampling, cont.
- A faster integer-based version of the algorithm has also been developed.
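A sketch of a one-dimensional slice sampler with the stepping-out and shrinkage procedure, which is where the self-tuning step length comes from (names are illustrative; the example target is a standard Gaussian):

```python
import random
import math

def slice_sample(p_star, x0, w, n, seed=5):
    """1-D slice sampling: draw a height u uniformly under p*(x), step out
    an interval of width w until it brackets the slice {x : p*(x) > u},
    then shrink the interval until a point inside the slice is found."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n):
        u = rng.uniform(0.0, p_star(x))         # auxiliary height under the curve
        r = rng.random()
        lo, hi = x - r * w, x + (1.0 - r) * w   # randomly placed initial bracket
        while p_star(lo) > u:                   # step out left until outside slice
            lo -= w
        while p_star(hi) > u:                   # step out right
            hi += w
        while True:                             # shrink the bracket until accepted
            xp = rng.uniform(lo, hi)
            if p_star(xp) > u:
                x = xp
                break
            if xp < x:                          # shrink toward the current point
                lo = xp
            else:
                hi = xp
        chain.append(x)
    return chain

chain = slice_sample(lambda t: math.exp(-0.5 * t * t), x0=0.0, w=2.0, n=20_000)
mean = sum(chain) / len(chain)
```

The rejection-style height u, the one-dimensional Gibbs-like update, and the Metropolis-like exploration are all visible here, and the bracket adapts to the local width of the slice, so a poor initial w is far less harmful than a poor Metropolis step size.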
Practical issues
- It is hard to know for certain when a Monte Carlo simulation has converged.
- Calculating the normalization constant is difficult.
- Allocation of computational resources: one long simulation, or several shorter ones?
Practical issues, cont.
- Big models: with the Metropolis method and Gibbs sampling, update the variables in batches.
- How many samples? Depends on how much accuracy is needed; typically 10-1000 independent samples are enough.
Exercises & References
- Exercise 29.4
- Exercise NN.N.
- David J.C. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.