Lecture 10: Monte Carlo Methods

Lecture 10 Outline
- Monte Carlo methods
- History of methods
- Sequential random number generators
- Parallel random number generators
- Generating non-uniform random numbers
- Monte Carlo case studies

Monte Carlo Methods
Monte Carlo is another name for statistical sampling methods, which are of great importance in physics and computer science.
Applications of the Monte Carlo method:
- Evaluating integrals of arbitrary functions in 6+ dimensions
- Predicting future values of stocks
- Solving partial differential equations
- Sharpening satellite images
- Modeling cell populations
- Finding approximate solutions to NP-hard problems

An Interesting History of Statistical Physics
In 1738, the Swiss physicist and mathematician Daniel Bernoulli published Hydrodynamica, which laid the basis for the kinetic theory of gases: great numbers of molecules move in all directions, their impact on a surface causes the gas pressure that we feel, and what we experience as heat is simply the kinetic energy of their motion.
In 1859, the Scottish physicist James Clerk Maxwell formulated the distribution of molecular velocities, which gave the proportion of molecules having a velocity in a specific range. This was the first-ever statistical law in physics. Maxwell used a simple thought experiment: particles must move independently of any chosen coordinate system, hence the only possible distribution of velocities must be normal in each coordinate.
In 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his long, distinguished, and tortured life developing the subject further.

History of the Monte Carlo Method
Credit for inventing the Monte Carlo method is shared by Stanislaw Ulam, John von Neumann, and Nicholas Metropolis.
Ulam, a Polish-born mathematician, worked for John von Neumann on the Manhattan Project and is known for designing the hydrogen bomb with Edward Teller in 1951. He conceived of the Monte Carlo method in 1946 while pondering the probabilities of winning a card game of solitaire.
Ulam, von Neumann, and Metropolis developed algorithms for computer implementations, and explored means of transforming non-random problems into random forms whose solution could be facilitated via statistical sampling. This work transformed statistical sampling from a mathematical curiosity into a formal methodology applicable to a wide variety of problems. It was Metropolis who named the new methodology after the casinos of Monte Carlo.
Metropolis and Ulam published a paper called "The Monte Carlo Method" in the Journal of the American Statistical Association, 44 (247), 335-341, in 1949.

Errors in Estimation and Two Important Questions for Monte Carlo
Errors arise from two sources:
- Statistical: using a finite number of samples to estimate an infinite sum.
- Computational: using finite state machines to produce statistically independent random numbers.
Questions:
- To ensure a level of statistical accuracy, i.e., m significant digits with probability 1 - 1/n, how many samples are needed?
- Given n samples, how statistically accurate is the estimate?
"The best methods rarely offer more than 2-3 digit accuracy." -- G. Fishman, Monte Carlo Methods, Springer

Controlling Error
Two tenets from statistics:
- Weak Law of Large Numbers: the sum of n i.i.d. random variables converges to nμ (μ is the mean), so as n grows the approximation error vanishes.
- Central Limit Theorem: the sum converges to a normal distribution with mean nμ and variance nσ² (standard deviation σ√n), so as n grows the error can be estimated using normal tables and theorems about the tails of the distribution.
Caveats:
- Convergence rates differ and can be complex.
- This results in additional error estimates, or the need for a larger sample size than the normal distribution would indicate.
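Both tenets can be checked with a minimal Python sketch (seed and sample sizes are arbitrary illustrative choices): the error of a sample mean of Uniform(0,1) draws shrinks as n grows, and the CLT gives an approximate confidence half-width of 1.96 σ/√n.

```python
import random

random.seed(1)

def mean_estimate(n):
    """Monte Carlo estimate of E[U] for U ~ Uniform(0,1) from n samples."""
    return sum(random.random() for _ in range(n)) / n

# Weak Law of Large Numbers: the error of the estimate shrinks as n grows.
errors = {n: abs(mean_estimate(n) - 0.5) for n in (100, 10_000, 1_000_000)}

# Central Limit Theorem: the standard error of the mean is sigma / sqrt(n),
# so an approximate 95% confidence half-width is 1.96 * sigma / sqrt(n).
sigma = (1.0 / 12.0) ** 0.5              # std. dev. of Uniform(0,1)
half_width = 1.96 * sigma / 1_000_000 ** 0.5
```

At n = 1,000,000 the half-width is under 0.0006, illustrating the 1/√n convergence rate that limits Monte Carlo accuracy.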

Sample Size and Chebyshev's Theorem
Theorem: If X is a discrete random variable with mean μ and standard deviation σ, then for every positive real number h,

    P(|X - μ| ≥ hσ) ≤ 1/h²

or equivalently,

    P(|X - μ| < hσ) ≥ 1 - 1/h².

That is, the probability that a random variable assumes a value more than h standard deviations away from its mean is never greater than the square of the reciprocal of h.
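The bound can be verified empirically; the sketch below (the Exponential(1) distribution, seed, and sample size are illustrative choices, not from the slides) checks the h = 2 case, where Chebyshev allows at most 1/4 of the mass outside 2σ.

```python
import math
import random

random.seed(0)

# Exponential(1): mean and standard deviation are both 1.
mu, sigma, h, n = 1.0, 1.0, 2.0, 100_000

samples = [random.expovariate(1.0) for _ in range(n)]
outside = sum(1 for x in samples if abs(x - mu) >= h * sigma)
fraction = outside / n

# Chebyshev guarantees the tail mass is at most 1/h^2 = 0.25; the true
# value for this distribution, P(X >= 3) = e^-3, is about 0.0498.
```

The gap between 0.25 and 0.05 shows why Chebyshev-based sample-size estimates are conservative compared to normal-distribution estimates.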

An Example: Buffon's Needle
Over 200 years ago, Comte de Buffon proposed the following problem: if a needle of length l is dropped at random on a horizontal surface ruled with parallel lines a distance d > l apart, what is the probability that the needle will cross one of the lines?
The position of the needle relative to the nearby lines can be described by a random vector (A, θ) uniformly distributed on the region [0, d) × [0, π); its joint probability density function is therefore f(a, θ) = 1/(dπ) on that region. The probability that the needle crosses one of the lines is given by the integral of this density over the set of crossing configurations.

Estimation and Variance Reduction
The integral is easily solved to obtain the probability P = 2l / (πd).
Suppose Buffon's experiment is performed with the needle dropped n times. Let M be the random variable for the number of times the needle crosses a line; then M/n estimates P, so π can be estimated as 2ln / (dM).
Certain techniques can be used to reduce variance. For example, an experimenter named Fox used a five-inch needle and ruled the lines on the surface just two inches apart, so that the needle could cross as many as three lines in a single toss (with l > d the expected number of crossings per toss is still 2l/(πd)). In 590 tosses, Fox obtained 939 line crossings, giving π ≈ (2 · 5 · 590) / (2 · 939) ≈ 3.1416. This was a precursor of today's variance reduction techniques.
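Buffon's experiment itself is easy to simulate. The sketch below uses the short-needle case (l ≤ d, with illustrative values l = 1, d = 2) and the standard crossing condition: the needle crosses a line when its center lies within (l/2)·sin θ of the nearest line.

```python
import math
import random

random.seed(42)

def buffon_pi(n, l=1.0, d=2.0):
    """Estimate pi by dropping a needle of length l on lines d apart (l <= d)."""
    crossings = 0
    for _ in range(n):
        a = random.uniform(0.0, d / 2)        # distance from center to nearest line
        theta = random.uniform(0.0, math.pi)  # angle of the needle
        if a <= (l / 2) * math.sin(theta):    # needle reaches the line
            crossings += 1
    # P(cross) = 2l / (pi d), so pi is estimated by 2 l n / (d * crossings).
    return 2 * l * n / (d * crossings)

estimate = buffon_pi(1_000_000)
```

Even with a million tosses the estimate is only good to two or three digits, which is exactly Fishman's point about Monte Carlo accuracy quoted earlier.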

Variance Reduction
There are several ways to reduce the variance:
- Importance sampling
- Stratified sampling
- Quasi-random sampling
- Metropolis random mutations

Simulations Often Rely on Other Distributions
Methods for generating samples from non-uniform distributions:
- Analytical transformation (inverse CDF)
- Box-Muller transformation
- Rejection method

Analytical Transformation
Given a probability density function f(x) with cumulative distribution function F(x), a sample can be generated by drawing u uniformly from (0, 1) and computing x = F⁻¹(u). In probability theory, the quantile function of a distribution is exactly this inverse of its cumulative distribution function.

Exponential Distribution
An exponential distribution arises naturally when modeling the time between independent events that happen at a constant average rate; it is memoryless. It is one of the few cases where the quantile function is known analytically: F(x) = 1 - e^(-x/μ), so F⁻¹(u) = -μ ln(1 - u).

Example 1: Produce four samples from an exponential distribution with mean 3.
Uniform sample: 0.540, 0.619, 0.452, 0.095
Take the natural log of each value and multiply by -3 (since 1 - U is also uniform, -μ ln u works as well as -μ ln(1 - u)).
Exponential sample: 1.849, 1.439, 2.382, 7.062
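The transformation can be checked directly in Python, reproducing this arithmetic:

```python
import math

uniform = [0.540, 0.619, 0.452, 0.095]
# If U is uniform on (0,1), so is 1 - U, hence X = -3 ln(U) is
# exponential with mean 3, just like -3 ln(1 - U).
exponential = [-3.0 * math.log(u) for u in uniform]
# Yields approximately 1.849, 1.439, 2.382, 7.062.
```

Note that the smallest uniform value (0.095) maps to the largest exponential value (7.062), as the inversion of a decreasing tail requires.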

Example 2: A simulation advances in time steps of 1 second. The time until an event happens follows an exponential distribution with mean 5 seconds. What is the probability that the event will happen in the next second?
F(1) = 1 - exp(-1/5) ≈ 0.1813
Use a uniform random number to test for the occurrence of the event: if u < 0.1813 then 'event', else 'no event'.
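A minimal sketch of this per-second event test (the seed and trial count are illustrative):

```python
import math
import random

random.seed(7)

mean = 5.0
p_event = 1.0 - math.exp(-1.0 / mean)   # probability of the event in the next second

# Drive the per-second test with uniform random numbers and count events:
n = 200_000
events = sum(1 for _ in range(n) if random.random() < p_event)
observed = events / n
```

Over many simulated seconds the observed event frequency converges to F(1) ≈ 0.1813, confirming the per-step test reproduces the exponential model.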

Normal Distributions: Box-Muller Transformation
The cumulative distribution function of the normal (Gaussian) distribution cannot be inverted in closed form, so there is no direct formula yielding normal random numbers from uniform ones. The Box-Muller transformation instead produces a pair of standard normal deviates g1 and g2 from a pair of uniform deviates u1 and u2.

Box-Muller Transformation (polar form)
repeat
    draw u1, u2 uniformly from (0, 1)
    v1 ← 2 u1 - 1
    v2 ← 2 u2 - 1
    r ← v1² + v2²
until 0 < r < 1
f ← sqrt(-2 ln r / r)
g1 ← f · v1
g2 ← f · v2
This works because the chi-square distribution with two degrees of freedom is an easily generated exponential random variable: -2 ln r is exponential when r is uniform on (0, 1).
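A direct Python rendering of the pseudocode above, with a quick statistical sanity check on the output (seed and sample count are arbitrary):

```python
import math
import random

random.seed(0)

def box_muller_polar():
    """Polar form of the Box-Muller transform: returns two standard normal deviates."""
    while True:
        v1 = 2.0 * random.random() - 1.0
        v2 = 2.0 * random.random() - 1.0
        r = v1 * v1 + v2 * v2
        if 0.0 < r < 1.0:                  # accept points inside the unit circle
            f = math.sqrt(-2.0 * math.log(r) / r)
            return f * v1, f * v2

pairs = [box_muller_polar() for _ in range(100_000)]
xs = [x for x, _ in pairs]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
```

The sample mean is close to 0 and the sample variance close to 1, as expected of standard normal deviates. The rejection loop discards about 21% of candidate pairs (those outside the unit circle), the price paid for avoiding sine and cosine calls.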

Example
Produce four samples from a normal distribution with mean 0 and standard deviation 1:

u1      u2      v1      v2      r       f       g1      g2
0.234   0.784   -0.532  0.568   0.605   1.290   -0.686  0.732
0.824   0.039   0.648   -0.922  1.270   (rejected: r > 1)
0.430   0.176   -0.140  -0.648  0.439   1.935   -0.271  -1.254

The two accepted pairs give the four samples: -0.686, 0.732, -0.271, -1.254.

Von Neumann's Rejection Method
To sample from a density f(x) on [a, b], enclose its graph in a bounding box: draw a candidate x uniformly from [a, b] and a height y uniformly from [0, max f]; accept x if y < f(x), otherwise reject and try again. Accepted points are distributed according to f.
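A minimal sketch of rejection sampling; the target density here (an unnormalized standard normal truncated to [-3, 3]) is an illustrative choice, not from the slides:

```python
import math
import random

random.seed(3)

def rejection_sample(f, f_max, lo, hi):
    """Von Neumann rejection: draw from the density proportional to f on [lo, hi]."""
    while True:
        x = random.uniform(lo, hi)        # candidate position
        y = random.uniform(0.0, f_max)    # uniform height in the bounding box
        if y < f(x):                      # accept if the point lies under the curve
            return x

# Illustrative target: unnormalized standard normal restricted to [-3, 3].
f = lambda x: math.exp(-x * x / 2.0)
samples = [rejection_sample(f, 1.0, -3.0, 3.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)
```

A convenience of the method is that f need not be normalized; the cost is the rejected draws, so a tight bounding envelope matters for efficiency.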

Case Studies (Topics Introduced)
- Temperature inside a 2-D plate (random walk)
- Two-dimensional Ising model (Metropolis algorithm)
- Room assignment problem (simulated annealing)
- Parking garage (Monte Carlo time)
- Traffic circle (simulating queues)

Temperature Inside a 2-D Plate (Random Walk)
The steady-state temperature at an interior grid point equals the average of its four neighbors (the discrete Laplace equation), and hence equals the expected boundary temperature reached by a random walk started at that point.
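A sketch of this case study under assumed boundary conditions (top edge held at 100, the other edges at 0, a choice for which the exact center temperature is 25 by symmetry); grid size, seed, and trial count are illustrative:

```python
import random

random.seed(11)

N = 10           # grid runs 0..N; interior points are 1..N-1
TOP = 100.0      # assumed boundary condition: top edge at 100, others at 0

def walk_temperature(x, y):
    """Walk randomly until a boundary is hit; return the temperature there."""
    while 0 < x < N and 0 < y < N:
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return TOP if y == N else 0.0

trials = 20_000
estimate = sum(walk_temperature(N // 2, N // 2) for _ in range(trials)) / trials
# By symmetry, the exact steady-state temperature at the center is 25.
```

Unlike a relaxation solver, this estimates the temperature at a single point without computing the whole field, which is the appeal of the random-walk formulation.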

Example of Random Walk
[figure: grid showing example random walks and per-cell visit counts]

Parking Garage
- The parking garage has S stalls.
- Car arrivals fit a Poisson distribution with mean A: exponentially distributed inter-arrival times.
- Time spent in the garage fits a normal distribution with mean M and standard deviation M/S.
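An event-driven sketch of the garage; the parameter values for S, A, and M are illustrative (the slides leave them symbolic). The clock jumps from arrival to arrival rather than ticking in fixed steps, which is the "Monte Carlo time" idea named in the summary:

```python
import heapq
import random

random.seed(5)

S = 100      # stalls                      (illustrative value)
A = 2.0      # mean inter-arrival time     (illustrative value)
M = 120.0    # mean stay                   (illustrative value)

def simulate(horizon):
    """Jump the clock from arrival to arrival (Monte Carlo time)."""
    departures = []                  # min-heap of departure times of parked cars
    t, rejected = 0.0, 0
    while t < horizon:
        t += random.expovariate(1.0 / A)          # next car arrives
        while departures and departures[0] <= t:  # free stalls whose cars have left
            heapq.heappop(departures)
        if len(departures) < S:
            stay = max(0.0, random.gauss(M, M / S))
            heapq.heappush(departures, t + stay)  # stall occupied until t + stay
        else:
            rejected += 1
    return len(departures), rejected

parked, rejected = simulate(10_000.0)
```

With these parameters the offered load is M/A = 60 cars against 100 stalls, so rejections are rare; raising A's rate or M pushes the garage toward saturation.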

Implementation Idea
[figure: simulation state — a sorted list of times at which spaces become available (64.2, 70.3, 91.7, 101.2, 142.1, 223.1), the current time (64.2), the car count (15), and the number of cars rejected (2)]

Summary
Concepts revealed in the case studies:
- Monte Carlo time
- Random walk
- Metropolis algorithm
- Simulated annealing
- Modeling queues