CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics (2007), and Michael Baron, Probability and Statistics for Computer Scientists, CRC 2006


1 CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics (2007), and Michael Baron, Probability and Statistics for Computer Scientists, CRC 2006. Instructor: Longin Jan Latecki. Ch. 6: Simulations

2 What is a simulation? One uses a model to create specific situations in order to study the response of the model to them, and then interprets this in terms of what would happen to the system "in the real world". Models for such systems involve random variables; we speak of probabilistic or stochastic models, and simulating them is stochastic simulation. Stochastic simulation of a system means generating values for all the random variables in the model, according to their specified distributions, and recording and analyzing what happens. We refer to the generated values as realizations of the random variables.

3 Quick exercise 6.1 Describe how you can simulate a coin toss when instead of a coin you have a die. Any ideas on how to simulate a roll of a die if you only have a coin?

4 Quick exercise 6.1 Describe how you can simulate a coin toss when instead of a coin you have a die. Any ideas on how to simulate a roll of a die if you only have a coin? Coin from a die: call an even outcome heads and an odd outcome tails. Die from a coin: toss the coin three times; this gives eight equally likely outcomes HHH, HHT, HTH, HTT, THH, THT, TTH, TTT, so map six of them to the faces 1-6 and repeat the three coin tosses if you get TTH or TTT.
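This rejection idea can be sketched in Python (the slides use Matlab); the helper `coin()` is an assumption introduced here, built on the standard library's `random`:

```python
import random

def coin():
    """Fair coin toss: returns 'H' or 'T'."""
    return 'H' if random.random() < 0.5 else 'T'

def coin_from_die(roll):
    """Simulate a coin toss from one die roll: even -> heads, odd -> tails."""
    return 'H' if roll % 2 == 0 else 'T'

def die_from_coin():
    """Simulate a fair die from coin tosses.

    Three tosses give 8 equally likely outcomes; six of them are mapped
    to the faces 1..6, and TTH/TTT are rejected (the tosses are repeated).
    """
    faces = {'HHH': 1, 'HHT': 2, 'HTH': 3, 'HTT': 4, 'THH': 5, 'THT': 6}
    while True:
        tosses = coin() + coin() + coin()
        if tosses in faces:      # TTH and TTT fall through and we re-toss
            return faces[tosses]
```

Each accepted triple has probability 1/8 before rejection, and conditioning on acceptance makes all six faces equally likely.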

5 Quick exercise 6.2 A random variable Y has outcomes 1, 3, and 4 with the following probabilities: P(Y = 1) = 3/5, P(Y = 3) = 1/5, and P(Y = 4) = 1/5. Describe how to construct Y from a U(0, 1) random variable.

6 6.2 Generating realizations: Bernoulli Simulations are almost always done by computer, which usually has one or more so-called (pseudo) random number generators that mimic U(0,1). Bernoulli random variables: suppose U has a U(0,1) distribution. To construct a Ber(p) random variable for some 0 < p < 1, define X = 1 if U < p and X = 0 otherwise, so that P(X = 1) = P(U < p) = p. Matlab code: U = rand; X = (U<p)
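The same Bernoulli construction in Python (a sketch; the slides use Matlab):

```python
import random

def bernoulli(p):
    """Ber(p): return 1 with probability p, else 0, from one U(0,1) draw."""
    u = random.random()   # u ~ U(0,1)
    return 1 if u < p else 0
```

Averaging many draws should give a value close to p.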

7 Binomial Binomial random variables: we can obtain a binomial RV as a sum of n independent Bernoulli RVs. For this we start with n uniform RVs. Matlab code: n = 20; p = 0.68; U = rand(n,1); X = sum(U<p)
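A Python equivalent of the sum-of-Bernoullis construction (a sketch, not the slides' Matlab):

```python
import random

def binomial(n, p):
    """Bin(n, p): sum of n independent Bernoulli(p) indicators."""
    return sum(random.random() < p for _ in range(n))
```

For n = 20 and p = 0.68, draws concentrate around the mean np = 13.6.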

8 Geometric Geometric random variables: a while loop of Bernoulli trials generates a geometric RV. We run the loop until the first success occurs. Variable X counts the number of trials up to and including the first success. Matlab code:
X = 1;               % at least one trial
while rand > p       % continue while there are failures
    X = X + 1;
end                  % stop at the first success
X
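The same while loop in Python (a sketch mirroring the slides' Matlab):

```python
import random

def geometric(p):
    """Geo(p): number of Bernoulli(p) trials up to and including the first success."""
    x = 1                       # at least one trial
    while random.random() > p:  # failure: try again
        x += 1
    return x
```

For p = 0.5 the expected value is 1/p = 2.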

9 Arbitrary discrete distribution Arbitrary discrete random variable X takes values x0, x1, … with probabilities pi = P(X = xi) such that p0 + p1 + … = 1. Algorithm: 1. Divide the interval [0, 1] into subintervals of lengths pi: A0 = [0, p0), A1 = [p0, p0+p1), A2 = [p0+p1, p0+p1+p2), … 2. Obtain U from Uniform(0, 1), the standard uniform distribution. 3. Set X = xi if U belongs to interval Ai. X has the desired distribution, since P(X = xi) = P(U in Ai) = pi.
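The three steps above can be sketched in Python (the slides' examples use Matlab), walking the cumulative sums of the probabilities:

```python
import random
from itertools import accumulate

def discrete_sample(values, probs, u=None):
    """Return values[i] when u lands in the i-th subinterval A_i of [0,1)."""
    if u is None:
        u = random.random()
    # accumulate(probs) yields the right endpoints p0, p0+p1, ... of A_0, A_1, ...
    for v, cum in zip(values, accumulate(probs)):
        if u < cum:
            return v
    return values[-1]   # guard against floating-point round-off at u ~ 1
```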

10 Continuous random variables GaussianEx1.m plots a histogram of samples drawn from a Gaussian with mean 0 and std 1, and plots the Gaussian density with mean 0 and std 1. The histogram values are normalized so that the total area of the histogram bars (rectangles) equals 1. This way both plots are directly comparable, since the area under the Gaussian curve is also one.
x = -5:0.1:5;
g = normpdf(x,0,1);
samples = randn(1000,1);
%samples = normrnd(0,1,1000,1);
[hs, hx] = hist(samples,x);
hs = (hs/1000)*(101/10);   % relative frequency divided by the bin width 10/101
figure; bar(x,hs,'r');
hold on
plot(x,g,'LineWidth',3);
Here we drew samples directly from the target distribution, which is usually impossible.

11 A simple yet surprising fact: regardless of the initial distribution of RV X, the cdf of X, F_X(X), has a uniform distribution. Let X be a continuous RV with a strictly increasing cdf F_X(x). Define the RV U = F_X(X). Then the distribution of U is Uniform(0, 1). Proof: Since F_X(x) is a cdf, 0 ≤ F_X(x) ≤ 1 for all x; hence the values of U lie in [0, 1]. We show that the cdf of U is F_U(u) = u, hence U is Uniform(0, 1): F_U(u) = P(U ≤ u) = P(F_X(X) ≤ u) = P(X ≤ F_X^-1(u)) (since F_X is strictly increasing) = F_X(F_X^-1(u)) (by definition of cdf) = u.
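This fact can be checked empirically (a Python sketch, not from the slides): apply the exponential cdf F(x) = 1 - e^(-λx) to Exp(λ) samples; the transformed values should behave like U(0,1) draws, e.g. with mean close to 1/2 and all values in [0, 1].

```python
import math
import random

random.seed(0)
lam = 2.0
xs = [random.expovariate(lam) for _ in range(10000)]  # X ~ Exp(lambda)
us = [1 - math.exp(-lam * x) for x in xs]             # U = F_X(X)
mean_u = sum(us) / len(us)                            # should be near 1/2
```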

12 Generating continuous random variables In order to generate a continuous RV X with cdf F_X, invert the formula U = F_X(X). Then X can be obtained from the standard uniform variable U as X = F_X^-1(U). Proof: P(X ≤ x) = P(F_X^-1(U) ≤ x) = P(U ≤ F_X(x)) = F_X(x), since U is Uniform(0, 1). Algorithm: 1. Obtain a sample u from the standard uniform RV U. 2. Compute a sample from X as x = F_X^-1(u).
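The two-step algorithm in Python (a sketch; the inverse cdf is passed in as a function, and the example cdf F(x) = x^2 on [0, 1] is an illustration chosen here, not from the slides):

```python
import math
import random

def sample_from_cdf(F_inv):
    """Inverse-cdf method: draw u ~ U(0,1), return x = F^{-1}(u)."""
    u = random.random()   # step 1: sample from the standard uniform RV
    return F_inv(u)       # step 2: push it through the inverse cdf

# Example: F(x) = x**2 on [0, 1] has inverse F_inv(u) = sqrt(u).
x = sample_from_cdf(math.sqrt)
```

The sample mean of such draws should approach E[sqrt(U)] = 2/3.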

13 Exponential random variables Applying this method to the exponential distribution F(x) = 1 - e^(-λx): F(x) = u ↔ x = -(1/λ) ln(1 - u) = F_inv(u). So if U has a U(0,1) distribution, the random variable X defined by X = F_inv(U) = -(1/λ) ln(1 - U) = -(1/λ) ln(U) has an Exp(λ) distribution (the last step uses that 1 - U is also U(0,1)). So we can simulate an RV with an Exp(λ) distribution using a Uniform(0,1) RV.
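The same inverse-cdf formula in Python (a sketch; the slides implement it in Matlab):

```python
import math
import random

def exponential(lam):
    """Exp(lam) via the inverse cdf: X = -(1/lam) * ln(1 - U)."""
    u = random.random()
    return -math.log(1 - u) / lam
```

For lam = 0.5 the sample mean should approach 1/lam = 2.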

14 SamplingEx1.m plots a histogram of samples drawn from the exponential distribution and plots the exponential density. The histogram values are normalized so that the total area of the histogram bars (rectangles) equals one; this way both plots are directly comparable.
x = 0:0.1:10;
lambda = 1;
f = lambda*exp(-lambda*x);
%sample from uniform dist. U(0,1)
samples = rand(1000,1);
samples = -(1/lambda)*log(1-samples);
[hs, hx] = hist(samples,x);
hs = (hs/1000)*(101/10);   % relative frequency divided by the bin width 10/101
figure; bar(x,hs,'r');
hold on
plot(x,f,'LineWidth',3);

15 Quick exercise 6.3 A distribution function F is 0 for x < 1, equal to 1 for x > 3, and F(x) = (1/4)(x - 1)^2 if 1 ≤ x ≤ 3. Let U be a U(0, 1) random variable. Construct a random variable with distribution F from U.

16 Quick exercise 6.3 A distribution function F is 0 for x < 1, equal to 1 for x > 3, and F(x) = (1/4)(x - 1)^2 if 1 ≤ x ≤ 3. Let U be a U(0, 1) random variable. Construct a random variable with distribution F from U. Solving u = F(x) = (1/4)(x - 1)^2 for x on [1, 3] gives x = 1 + 2*sqrt(u), so X = 1 + 2*sqrt(U) has distribution function F.
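Solving u = (1/4)(x - 1)^2 for x gives the inverse x = 1 + 2*sqrt(u); a quick Python check of this construction (a sketch, not from the slides):

```python
import math
import random

def sample_F(u=None):
    """Inverse of F(x) = (x - 1)**2 / 4 on [1, 3]: x = 1 + 2*sqrt(u)."""
    if u is None:
        u = random.random()
    return 1 + 2 * math.sqrt(u)
```

Values land in [1, 3], and F(sample_F(u)) recovers u.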

17 6.4 A single server queue There is a well, and people want to determine how powerful a pump they should buy. More power means shorter waiting times but also higher cost. Let T_i denote the time between consecutive customers, called the interarrival time; e.g., the time between customers 1 and 2 is T_2. Let S_i be the length of time that customer i needs to use the pump. The pump capacity v (liters per minute) is the model parameter that we wish to determine. If customer i requires R_i liters of water, then S_i = R_i / v. To complete the model description, we need to specify the distributions of T and R. Interarrival times: every T_i has an Exp(0.5) distribution (minutes). Service requirement: every R_i has a U(2,5) distribution (liters).

18 6.4 cont. W_i denotes the waiting time of customer i; W_1 = 0, since the first customer does not have to wait. The waiting time of customer i depends on customer i-1: W_i = max{W_(i-1) + S_(i-1) - T_i, 0}. We are interested in the average waiting time over the first n customers, (W_1 + … + W_n)/n, plotted against n for customers n = 1, 2, … The average waiting time for v = 2 turns out to be around 2 minutes; for v = 3 it is around 0.5 minutes.
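The model can be simulated directly from the recursion W_i = max{W_(i-1) + S_(i-1) - T_i, 0}. A Python sketch (the slides use Matlab), with Exp(0.5) interarrival times and U(2,5) service requirements as specified above; the function name and seed handling are choices made here:

```python
import random

def avg_waiting_time(n, v, seed=0):
    """Average waiting time of the first n customers at pump capacity v.

    Interarrival times T_i ~ Exp(0.5), requirements R_i ~ U(2,5) liters,
    service times S_i = R_i / v, and W_i = max(W_{i-1} + S_{i-1} - T_i, 0).
    """
    rng = random.Random(seed)
    total = 0.0
    w_prev = 0.0                      # W_1 = 0: first customer never waits
    s_prev = rng.uniform(2, 5) / v    # S_1 = R_1 / v
    for _ in range(2, n + 1):
        t = rng.expovariate(0.5)      # interarrival time T_i
        w = max(w_prev + s_prev - t, 0.0)
        total += w
        w_prev, s_prev = w, rng.uniform(2, 5) / v
    return total / n
```

As the section notes, a larger v (a more powerful pump) gives a smaller average waiting time.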

19 6.4 Work in system One approach to determining how busy the pump is at any one time is to record at every moment how much work there is in the system. For example, if I am halfway through filling my 4-liter container and 3 people are waiting who require 2, 3, and 5 liters, then there are 12 liters to go; at v = 2 there are 6 minutes of work in the system, and at v = 3 there are just 4 minutes. The amount of work in the system just before a customer arrives equals the waiting time of that customer; this is also called the virtual waiting time. Figure 6.8 illustrates the work in the system for v = 2 and v = 3.

