#10 MONTE CARLO SIMULATION Systems 303 - Fall 2000 Instructor: Peter M. Hahn


1 #10 MONTE CARLO SIMULATION Systems 303 - Fall 2000 Instructor: Peter M. Hahn hahn@seas.upenn.edu

2 MONTE CARLO SIMULATION
- Based upon the generation of random numbers
- Used for solving stochastic (or deterministic) problems where the passage of time plays no substantive role
- Widely used to solve certain problems in statistics that are not analytically tractable
- Specifically used for determining the critical values for the Kolmogorov-Smirnov test
- We will describe the concept, determine the number of random number samples required and give several examples

3 MODELING ALTERNATIVES

4 MONTE CARLO SIMULATION
- Consider the problem of evaluating the definite integral
  A = ∫[a,b] r(x) dx
  where, for simplicity, r(x) ≥ 0 on [a,b], r(x) is bounded above by y = c, and an analytical solution is not possible for r(x)
- The graph of r bounds a region R within the rectangle defined by x = a, x = b, y = 0 and y = c
- We present a method for determining A

5 MONTE CARLO SIMULATION
[Figure: the curve y = r(x) and the region R beneath it, inside the rectangle bounded by x = a, x = b, y = 0 and y = c]

6 METHOD FOR ESTIMATING A
- A is simply the area of region R
- Select a point (x,y) at random in the rectangle [a,b] × [0,c]
- With probability p, (x,y) will satisfy y ≤ r(x)
- If we can determine p, then it is clear that
  A = p · (b−a) · c
- The Monte Carlo method involves estimating p by generating statistically independent points (x_i, y_i) from the uniform distribution of points on the rectangle

7 METHOD FOR ESTIMATING A
- The pdf of the points (x_i, y_i) is given by
  f(x,y) = 1 / ((b−a)·c) for a ≤ x ≤ b, 0 ≤ y ≤ c (and 0 elsewhere)
- Equivalently, we can do the same by generating x_i uniform on [a,b] and y_i uniform on [0,c], independently

8 METHOD FOR ESTIMATING A
- We have N samples (x_1,y_1), (x_2,y_2), …, (x_N,y_N)
- An estimate of p is the fraction of points falling in R:
  p̂ = N_R / N, where N_R is the number of points with y_i ≤ r(x_i)
- How well p̂ approximates p depends on N
- The larger N, the better the estimate
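The hit-or-miss estimator of slides 4-8 can be sketched in a few lines of Python. The function name and the sin(x) test integrand are our choices for illustration, not from the slides:

```python
import math
import random

def hit_or_miss_area(r, a, b, c, n):
    """Estimate the area under r(x) on [a, b] by hit-or-miss sampling.

    Draws n points uniformly in the rectangle [a, b] x [0, c]; the
    fraction landing at or below the curve estimates p, and the area
    estimate is p_hat * (b - a) * c.
    """
    hits = sum(1 for _ in range(n)
               if random.uniform(0.0, c) <= r(random.uniform(a, b)))
    p_hat = hits / n
    return p_hat * (b - a) * c

# Example: area under sin(x) on [0, pi] (exact value 2), with bound c = 1.
random.seed(0)
estimate = hit_or_miss_area(math.sin, 0.0, math.pi, 1.0, 100_000)
```

With N = 100,000 samples the estimate typically lands within about 0.01 of the exact area 2.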

9 BIASEDNESS OF ESTIMATOR OF p?
- If p̂ is an unbiased estimator, its variance is a measure of how good an estimate it is
- Each point falls in R with probability p, so N_R (the number of the N points falling in R) is binomial(N, p) and
  E[p̂] = E[N_R]/N = N·p/N = p
- p̂ is indeed an unbiased estimator

10 HOW GOOD IS OUR ESTIMATOR?
- The variance of p̂ is computed as follows:
  Var(p̂) = Var(N_R/N) = Var(N_R)/N² = N·p(1−p)/N² = p(1−p)/N

11 CHEBYCHEV’S INEQUALITY (A BOUND)
- For any ε > 0,
  P(|p̂ − p| ≥ ε) ≤ Var(p̂)/ε² = p(1−p)/(N·ε²)
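The Chebychev bound P(|p̂ − p| ≥ ε) ≤ p(1−p)/(N·ε²) is easy to check empirically by repeating the N-sample experiment many times for a known p. The parameter values below are our illustrative choices:

```python
import random

# Repeat the N-sample estimation of a known p many times and compare the
# observed frequency of |p_hat - p| >= eps against the Chebychev bound.
random.seed(1)
p, N, eps, trials = 0.3, 500, 0.05, 2000

exceed = 0
for _ in range(trials):
    hits = sum(1 for _ in range(N) if random.random() < p)
    if abs(hits / N - p) >= eps:
        exceed += 1

empirical = exceed / trials
bound = p * (1 - p) / (N * eps * eps)   # 0.21 / 1.25 = 0.168
```

As expected for a Chebychev-type bound, the empirical exceedance frequency comes out far below the bound; the bound is loose but requires no distributional assumptions.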

12 [Figure: illustration of the Chebychev bound; the probability that p̂ falls outside (p − ε, p + ε) is at most α, split between the two tail regions A and B]

13 SAMPLES REQUIRED FOR p ESTIMATE
- To guarantee P(|p̂ − p| ≥ ε) ≤ α, the Chebychev bound requires
  N ≥ p(1−p) / (α·ε²)

14
- But, we really don’t know p at the start
- What if p = 0.1 and we guess 0.3? We simulate with the N implied by the guess, then use the resulting estimate p̂ to recompute N
- A third try will give us the required N = 3600
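The guess-then-revise procedure on slides 13-14 can be sketched as follows. The slide does not show its exact ε and α, so ε = 0.01 and α = 0.25 below are illustrative assumptions, as is the whole function interface:

```python
import math
import random

def required_n(p, eps, alpha):
    """Chebychev-based sample count: smallest N with p(1-p)/(N*eps^2) <= alpha."""
    return math.ceil(p * (1 - p) / (alpha * eps * eps))

def iterate_sample_size(trial, p_guess, eps, alpha, max_rounds=10):
    """Guess p, simulate with the implied N, re-estimate p, and recompute N,
    stopping once the current N already suffices for the estimated p.
    trial() runs one Bernoulli experiment and returns True on a 'hit'."""
    n = required_n(p_guess, eps, alpha)
    p_hat = p_guess
    for _ in range(max_rounds):
        hits = sum(1 for _ in range(n) if trial())
        p_hat = hits / n
        n_needed = required_n(p_hat, eps, alpha)
        if n_needed <= n:
            break
        n = n_needed
    return p_hat, n

# Hypothetical use: true p is 0.1 but we guess 0.3 (cf. slide 14).
random.seed(1)
p_hat, n_used = iterate_sample_size(lambda: random.random() < 0.1,
                                    p_guess=0.3, eps=0.01, alpha=0.25)
```

Note that overguessing p toward 0.5 is conservative (N comes out larger than necessary); the iteration only needs to grow N when the initial guess was too optimistic.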

15 MONTE CARLO SIMULATION
- Monte Carlo is commonly used today to determine outcomes of complex processes
- The process is represented by steps or equations in a general-purpose computer
- The computer model is then exercised in a fashion similar to the real process to estimate the desired probabilities
- Often Monte Carlo is used for determining the probability of rare but important events
- The rarer the event, the longer the simulation

16 AMMUNITION DEPOT EXAMPLE
- Bombers attempting to destroy a depot of odd shape fly over it in the E-W direction
- The pdf of bomb impact is two-dimensional Gaussian, centered on the aim point, with σ_E-W = 600 m and σ_N-S = 300 m
- The problem is to determine the percentage of bombs that hit the depot
- This example is hand-simulated in Section 2.3 of B,C,N&N

17 AMMUNITION DEPOT EXAMPLE


19
- The aiming point is considered the (0,0) point of the impact distribution: X ~ N(0, 600²) in the E-W direction and Y ~ N(0, 300²) in the N-S direction, independent
- X and Y values can be computer-generated with the ‘direct transformation technique’
- Assuming p = 3/7, ε = 0.01 and α = 0.5

20 DIRECT TRANSFORMATION
- Generate U_1, U_2 independent and uniform on (0,1); then
  Z_1 = √(−2 ln U_1) · cos(2π·U_2)
  Z_2 = √(−2 ln U_1) · sin(2π·U_2)
  are independent standard normal variates (the Box-Muller transform)
- Scale to the desired standard deviations: X = σ_X·Z_1, Y = σ_Y·Z_2
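The direct transformation and the depot simulation loop can be sketched together in Python. Since the slides do not give the depot's actual outline (it is irregular in B,C,N&N), the 1000 m × 500 m rectangle below is a hypothetical stand-in used only to show the structure of the simulation:

```python
import math
import random

def direct_transform_pair(sigma_x, sigma_y):
    """Box-Muller ('direct transformation'): two independent U(0,1) draws
    become two independent zero-mean normals with the given std devs."""
    u1 = 1.0 - random.random()          # shift into (0, 1] so log() is safe
    u2 = random.random()
    radius = math.sqrt(-2.0 * math.log(u1))
    return (sigma_x * radius * math.cos(2.0 * math.pi * u2),
            sigma_y * radius * math.sin(2.0 * math.pi * u2))

# Hypothetical depot: a 1000 m x 500 m rectangle centered on the aim point.
random.seed(2)
n = 200_000
hits = 0
for _ in range(n):
    x, y = direct_transform_pair(600.0, 300.0)   # sigma_EW, sigma_NS
    if abs(x) <= 500.0 and abs(y) <= 250.0:
        hits += 1
hit_fraction = hits / n
```

The hit fraction is exactly the p of the earlier slides, so the Chebychev machinery above applies directly to deciding how many simulated bombs are enough.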

21 MONTE CARLO SIMULATION
- Suppose we can express the desired system measure as the RV X = g(Y_1,…,Y_k, Z_1,…,Z_m), where the Y’s represent random inputs and the Z’s represent control variables
- Using Monte Carlo we can estimate E[h(X)] or P(x_1 < h(X) < x_2), where h(·) is a real function or a constant
- We generate Y_1,…,Y_k using their statistical distributions and a random number generator
- We read in Z_1,…,Z_m from a data file or key them in manually

22 MONTE CARLO ALGORITHM
N = number of simulation iterations
i = 1, H = 0, P = 0 (initialize)
Read in Z_1,…,Z_m
Do while (i ≤ N) {
    Generate Y_1i,…,Y_ki
    X_i = g(Y_1i,…,Y_ki, Z_1,…,Z_m)
    H = H + h(X_i)    [h(X_i) can be equal to X_i]
    If (x_1 < h(X_i) < x_2) P = P + 1
    i = i + 1
}
E[h(X)] ≈ H/N
P(x_1 < h(X) < x_2) ≈ P/N
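The slide's loop translates almost line for line into Python. The function names and the toy problem at the bottom (X = Y_1 + Y_2 with uniform inputs and no control variables) are our illustrative choices:

```python
import random

def monte_carlo(g, h, gen_inputs, z, n, x1, x2):
    """Direct translation of the slide's loop: accumulate H = sum of h(X_i)
    and P = #{i : x1 < h(X_i) < x2}, then divide both by N."""
    total_h = 0.0
    count = 0
    for _ in range(n):
        ys = gen_inputs()            # random inputs Y_1, ..., Y_k
        x = g(*ys, *z)               # X = g(Y's, Z's)
        hx = h(x)
        total_h += hx
        if x1 < hx < x2:
            count += 1
    return total_h / n, count / n    # estimates of E[h(X)], P(x1 < h(X) < x2)

# Toy check: X = Y1 + Y2 with Y's ~ U(0,1), h the identity, no Z's.
# Here E[X] = 1 and P(0.5 < X < 1.5) = 0.75 (triangular distribution).
random.seed(3)
mean_est, prob_est = monte_carlo(lambda y1, y2: y1 + y2, lambda x: x,
                                 lambda: (random.random(), random.random()),
                                 z=(), n=50_000, x1=0.5, x2=1.5)
```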

23 MONTE CARLO SIMULATION
- Good practice: collect the cumulative variance for averages, and the Chebychev bounds for probabilities, in order to revise the number of samples N if needed
- For either we need the cumulative sample mean:
  X̄_n = (X_1 + X_2 + … + X_n) / n

24 CUMULATIVE SAMPLE VARIANCE
- The cumulative sample variance after n samples is
  S_n² = [(X_1 − X̄_n)² + … + (X_n − X̄_n)²] / (n − 1)
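Cumulative mean and variance are usually kept with an online update rather than by re-summing all samples; the slides do not prescribe a method, so the sketch below uses Welford's online algorithm as one standard choice:

```python
class RunningStats:
    """Cumulative sample mean and variance, updated one sample at a time
    (Welford's online algorithm), so N can be revised mid-run."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0   # running sum of squared deviations from the mean

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    def variance(self):
        """Sample variance S_n^2 with the (n - 1) divisor."""
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0
```

After each batch of iterations, `mean` and `variance()` can be plugged into the Chebychev bound to decide whether more samples are required.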

25 COMPUTER PRODUCTION EXAMPLE
- How many computers P to produce?
- We wish to maximize profit g(P) = I·min{D,P} − L − M, where
  D = market demand ~ 200 + exp(0.02)
  L = total labor cost = 100 + P
  M = total material cost = 10 + 3P
  I = income per computer ~ N(5,1)
- For HW#10, conduct a Monte Carlo simulation to estimate E[g(250)]. Assume N = 10,000 provides an accurate estimate
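A minimal sketch of the E[g(250)] estimate follows. Two readings are assumptions on our part: we take "exp(0.02)" to mean an exponential with rate 0.02 (mean 50), and N(5,1) to mean mean 5 and standard deviation 1; check the course's conventions before relying on either:

```python
import random

def profit(P, demand, income_per_unit):
    """g(P) = I * min(D, P) - L - M, with L = 100 + P and M = 10 + 3P."""
    return income_per_unit * min(demand, P) - (100 + P) - (10 + 3 * P)

random.seed(4)
N, P = 10_000, 250
total = 0.0
for _ in range(N):
    demand = 200 + random.expovariate(0.02)   # assumed: rate 0.02, mean 50
    income = random.gauss(5.0, 1.0)           # assumed: mean 5, std dev 1
    total += profit(P, demand, income)
expected_profit = total / N
```

Running the same loop inside an outer sweep over P would address the slide's actual question of which production level maximizes expected profit.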

