Monte Carlo Methods: Integration and Sampling Techniques



THE BOOK by THE MAN

PROBLEM STATEMENT
- A system of equations and inequalities defines a region in m-space
- Determine the volume of the region

HISTORY
- 19th c.: simple integrals such as E[X] estimated by straightforward sampling
- Systems of PDEs solved using sample paths of Markov chains
  - Rayleigh 1899
  - Markov 1931
- Particle transport through a medium solved using Poisson processes and random walks
  - Manhattan Project
- Combinatorics in the '80s in RTP, NC

GROOMING
- R = volumetric region
- R confined to [0,1]^m
- λ(R) = volume of R
- A generalized area-under-the-curve problem

ALGORITHM
for i = 1 to n
  generate x in [0,1]^m
  if x in R then S = S + 1
end
λ(R) ≈ S/n

MESH
- Generate the x's as a mesh of evenly spaced points
- Each point is 1/k from its nearest neighbor
- n = k^m
- Many varieties of this method, generally called multi-grid
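A minimal sketch of the mesh estimator in Python (NumPy assumed). The indicator here is the ball of radius 0.5 centered in the unit cube — the sphere problem used later on the SPREADSHEET slide — and k = 20 is an illustrative choice:

```python
import numpy as np

# Illustrative region R: the ball of radius 0.5 centered in [0,1]^3.
def in_region(x):
    return np.sum((x - 0.5) ** 2, axis=-1) <= 0.25

def mesh_volume(in_region, m, k):
    """Estimate lambda(R) from a mesh of n = k^m evenly spaced points,
    each 1/k from its nearest neighbor (cell-centered grid)."""
    axis = (np.arange(k) + 0.5) / k                    # 1/(2k), 3/(2k), ...
    grid = np.stack(np.meshgrid(*([axis] * m)), axis=-1).reshape(-1, m)
    return np.mean(in_region(grid))                    # fraction of mesh points in R

print(mesh_volume(in_region, m=3, k=20))   # should be near 4/3*pi*0.5^3 ~ 0.5236
```

Note the curse of dimensionality the slides are about to quantify: holding k fixed, the point count n = k^m explodes as m grows.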

ERROR CONTROL
- Define a(R) = the surface area of R
- a(R)/k = volume of a swath around the surface, 1/k thick
- a(R)/k = a(R)/n^(1/m) bounds the error

...more ERROR CONTROL

...more ERROR
- If we require error less than ε...
- ...then the required sample size n = k^m grows like (a(R)/ε)^m

PROBABLY NOT THAT BAD
- Reaction: the boundary of R isn't usually so aligned
- A probability statement on the functions?
  - this math exists but is only marginally helpful with applied problems

ALTERNATIVE: the Monte Carlo method
for i = 1 to n
  sample x from Uniform[0,1]^m
  if x in R then S = S + 1
end
λ̂ = S/n
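The Monte Carlo loop, sketched in Python with NumPy. The region (the simplex x1 + x2 + x3 ≤ 1, true volume 1/6) and the sample size are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_volume(in_region, m, n):
    """Hit-or-miss Monte Carlo: S counts hits, lambda-hat = S/n."""
    x = rng.uniform(size=(n, m))       # sample x from Uniform[0,1]^m, n times
    S = np.sum(in_region(x))           # is x in R?  S = S + 1
    return S / n

# Illustrative region: the simplex {x : x1 + x2 + x3 <= 1}, true volume 1/6.
in_simplex = lambda x: np.sum(x, axis=-1) <= 1.0
print(mc_volume(in_simplex, m=3, n=100_000))
```

Unlike the mesh, the sample size n is chosen freely here; the next slides show how its error behaves statistically rather than geometrically.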

STATISTICAL TREATMENT
- S is now a RANDOM VARIABLE
- P[x in R] = λ = (volume of R)/(volume of the unit hypercube)
- S is a sum of Bernoulli trials
- S is Binomial(n, λ)
- E[S] = λn
- VAR[S] = nλ(1-λ)

ESTIMATOR
λ̂ = S/n, with E[λ̂] = λ and VAR[λ̂] = λ(1-λ)/n

CHEBYSHEV'S INEQUALITY
- Bounds the tails of distributions
- For Z ~ F with E[Z] = 0, VAR[Z] = σ², and ε > 0:
  P[|Z| ≥ ε] ≤ σ²/ε²
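A quick empirical check of the bound; the standard normal is chosen purely for illustration (so σ² = 1):

```python
import numpy as np

rng = np.random.default_rng(1)

# Chebyshev: for E[Z] = 0 and VAR[Z] = sigma^2, P[|Z| >= eps] <= sigma^2/eps^2.
Z = rng.standard_normal(1_000_000)          # sigma^2 = 1
for eps in (1.0, 2.0, 3.0):
    tail = np.mean(np.abs(Z) >= eps)        # empirical tail probability
    bound = 1.0 / eps ** 2                  # Chebyshev bound sigma^2/eps^2
    print(eps, tail, bound, tail <= bound)
```

For well-behaved distributions the bound is loose — a point the SPREADSHEET slide below makes — but it holds with no assumptions beyond a finite variance.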

To get a statistical error bounded by ε with probability at least 1-δ, Chebyshev applied to λ̂ requires n ≥ λ(1-λ)/(ε²δ)

SIMPLER BOUNDS
- λ(1-λ) is bounded by 1/4
- n = 1/(4ε²δ)
- Does not depend on m!

SPREADSHEET
- Find the volume of a sphere centered at (0.5, 0.5, 0.5) with radius 0.5 in [0,1]³
- Chebyshev bounds look very loose compared with VAR(λ̂)
- Use λ̂ for λ in the sample size formula
- Slow convergence
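The spreadsheet experiment sketched in Python (NumPy and n = 50,000 are illustrative assumptions; the true volume is 4/3·π·0.5³ ≈ 0.5236):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ball of radius 0.5 centered at (0.5, 0.5, 0.5) inside [0,1]^3.
def in_ball(x):
    return np.sum((x - 0.5) ** 2, axis=-1) <= 0.25

n = 50_000
x = rng.uniform(size=(n, 3))
lam_hat = np.mean(in_ball(x))             # lambda-hat = S/n
var_hat = lam_hat * (1 - lam_hat) / n     # plug lambda-hat in for lambda
print(lam_hat, np.sqrt(var_hat))          # estimate and its standard error
print(4 / 3 * np.pi * 0.5 ** 3)           # true volume ~ 0.5236
```

The standard error here is a few thousandths, far tighter than the Chebyshev guarantee — the looseness the slide refers to.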

STRATIFIED SAMPLING
- The best of the mesh and sampling methods
- A very general application of variance reduction:
  - survey sampling
  - experimental design
  - optimization via simulation

PARAMETERS AND DEFINITIONS
- n = total number of sample points
- The sample region [0,1]^m is divided into r subregions A_1, A_2, ..., A_r
- p_i = P[x in A_i]
- k(x) = 1 if x in R, 0 otherwise
  - so E[k(x)] = λ

DENSITY OF SAMPLES x
- f(x) is the m-dimensional density function of x
  - for generality, so we keep track of expectations
  - in our current scheme, f(x) = 1

LAMBDA AYE
λ_i = E[k(x) | x in A_i] = P[x in R | x in A_i], so λ = Σ p_i λ_i

STRATIFICATION
- old method: generate x's across the whole region
- new method: generate the EXPECTED number of samples in each subregion

Let X_j be the jth sample in the old method (capitals indicate random samples!)

VARIANCE OF THE ESTIMATOR
VAR[λ̂] = λ(1-λ)/n

STRATIFICATION
- Generate n_1, n_2, ..., n_r samples from A_1, A_2, ..., A_r
  - on purpose
- n_i = np_i
- the n_i sum to n
- X_i,j is the jth sample from A_i

λ_i is a conditional expectation


HOW THAT LAST BIT WORKED

...AND SO...
- Stratification reduces the variance of the estimator
- A random quantity (the number of samples falling in each A_i) is replaced by its expectation
- This only works because everything is a SUM, with no other complicated functions

FOR THE SPHERE PROBLEM
- 500 samples
  - Divide evenly among 64 cubes
  - 4 × 4 × 4
  - 7 or 8 samples in each cube
- 64 separate λ̂_i's
- Add them together (each cube has p_i = 1/64)
- How did we know to start with 500?
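A sketch of the sphere computation with stratification. For a clean split I use 512 samples (8 per cube) instead of the slide's 500 (7 or 8 per cube); NumPy is assumed:

```python
import numpy as np

rng = np.random.default_rng(3)

# Ball of radius 0.5 centered at (0.5, 0.5, 0.5) inside [0,1]^3.
def in_ball(x):
    return np.sum((x - 0.5) ** 2, axis=-1) <= 0.25

def stratified_volume(r=4, per_cell=8):
    """Split [0,1]^3 into r^3 equal cubes (so p_i = 1/r^3), draw per_cell
    samples in each, and combine: lambda-hat = sum_i p_i * lambda_i-hat."""
    total = 0.0
    for i in range(r):
        for j in range(r):
            for k in range(r):
                lo = np.array([i, j, k]) / r               # cube's lower corner
                x = lo + rng.uniform(size=(per_cell, 3)) / r
                total += np.mean(in_ball(x))               # lambda_i-hat for this cube
    return total / r ** 3

print(stratified_volume())   # near 0.5236
```

Cubes lying entirely inside or outside the ball contribute zero variance, so only the cubes straddling the boundary add noise — which is why the stratified estimate beats plain sampling at the same n.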

Discussion of applications...