Simulation and Random Number Generation


Summary
Discrete time vs. discrete event simulation
Random number generation: generating a random sequence; generating random variates from a uniform distribution; testing the quality of the random number generator
Some probability results
Evaluating integrals using Monte-Carlo simulation
Generating random numbers from various distributions: generating discrete random variates from a given pmf; generating continuous random variates from a given distribution

Discrete Time vs. Discrete Event Simulation Solve a finite difference equation, e.g. a second-order recursion of the form x_{k+1} = a_1 x_k + a_2 x_{k-1} + b u_k. Given the initial conditions x_0, x_{-1} and the input function u_k, k = 0, 1, …, we can simulate the output! This is a time-driven simulation: the state is advanced one fixed step at a time. For the systems we are interested in, this method has two main problems.
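As a minimal sketch of such a time-driven simulation (the coefficients and the unit-step input below are illustrative assumptions, not taken from the slides):

    # Time-driven simulation of x_{k+1} = a1*x_k + a2*x_{k-1} + b*u_k
    a1, a2, b = 0.5, -0.2, 1.0          # hypothetical coefficients
    x_prev, x_curr = 0.0, 0.0           # initial conditions x_{-1}, x_0
    u = lambda k: 1.0                   # hypothetical unit-step input
    trajectory = [x_curr]
    for k in range(20):                 # one update per fixed time step
        x_next = a1 * x_curr + a2 * x_prev + b * u(k)
        x_prev, x_curr = x_curr, x_next
        trajectory.append(x_curr)
    print(trajectory[-1])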

Issues with Time-Driven Simulation [Figure: sample input signals u1(t) and u2(t), the state trajectory x(t), and the time step Δt.] Inefficiency: during most time steps nothing happens, yet the state is still updated. Accuracy: the order of events occurring within the same interval Δt is lost.

Time Driven vs. Event Driven Simulation Models
Time-driven dynamics: time is divided into intervals of length Δt, and the state is updated at every step.
Event-driven dynamics: the state is updated only at the occurrence of a discrete event.

Pseudo-Random Number Generation It is difficult to construct a truly random number generator. Random number generators instead produce a sequence of deterministically generated numbers that “look” random. A deterministic sequence can never be truly random. This property, though seemingly problematic, is actually desirable because it makes it possible to reproduce the results of an experiment!
Multiplicative Congruential Method: x_{k+1} = a x_k mod m. The constants a and m should satisfy the following criteria:
For any initial seed x_0, the resulting sequence should look random.
For any initial seed, the period (the number of values that can be generated before the sequence repeats) should be large.
The values can be computed efficiently on a digital computer.

Pseudo-Random Number Generation To meet the above requirements, m should be chosen to be a large prime number. Good choices for the parameters are
m = 2^31 − 1 and a = 7^5 for 32-bit word machines,
m = 2^35 − 1 and a = 5^5 for 36-bit word machines.
Mixed Congruential Method: x_{k+1} = (a x_k + c) mod m. A more general recursion of this type uses the n most recent values, e.g. x_{k+1} = (a_1 x_k + a_2 x_{k-1} + … + a_n x_{k-n+1} + c) mod m, where n is a positive constant. The transformation u_k = x_k / m maps the random variates into the interval [0,1).
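A minimal sketch of the multiplicative congruential generator with the 32-bit parameters above (the seed value is an arbitrary nonzero choice):

    # Multiplicative congruential generator: x_{k+1} = a * x_k mod m
    M = 2**31 - 1      # large prime modulus
    A = 7**5           # multiplier suggested for 32-bit machines
    def make_lehmer(seed=12345):       # seed: arbitrary value in [1, M-1]
        x = seed
        def next_uniform():
            nonlocal x
            x = (A * x) % M
            return x / M               # variate in [0, 1)
        return next_uniform
    rng = make_lehmer()
    print([round(rng(), 4) for _ in range(5)])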

Quality of the Random Number Generator (RNG) The random variates u_k = x_k / m should be uniformly distributed in the interval [0,1): divide [0,1) into k subintervals, count the number of variates that fall within each subinterval, and run the chi-square test. The sequence of random variates should also be uncorrelated: define an auto-correlation coefficient with lag k > 0, for example C_k = (1/(N−k)) Σ_{i=1}^{N−k} (u_i − 1/2)(u_{i+k} − 1/2), where u_i is the i-th random variate in the interval [0,1), and verify that E[C_k] approaches 0 for all k = 1, 2, …
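A short sketch of this lag-k correlation check under the sample definition assumed above (the generator under test and the sample size are arbitrary):

    import random
    def autocorrelation(u, k):
        # Sample lag-k auto-correlation about the uniform mean 1/2
        n = len(u) - k
        return sum((u[i] - 0.5) * (u[i + k] - 0.5) for i in range(n)) / n
    u = [random.random() for _ in range(100_000)]
    for k in (1, 2, 3):
        print(k, autocorrelation(u, k))   # values should be close to 0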

Some Probability Review/Results
Probability mass function of a discrete random variable X taking values 1, 2, 3: Pr{X=1} = p_1, Pr{X=2} = p_2, Pr{X=3} = p_3.
Cumulative mass function: F(1) = p_1, F(2) = p_1 + p_2, F(3) = p_1 + p_2 + p_3 = 1.

Some Probability Results/Review
Probability density function f_X(x) of a continuous random variable: note that Pr{X = x_1} = 0 for any single point x_1!
Cumulative distribution function F_X(x) = Pr{X ≤ x}, which increases monotonically from 0 to 1.

Independence of Random Variables
Joint cumulative probability distribution: F_{XY}(x, y) = Pr{X ≤ x, Y ≤ y}.
Independent random variables: F_{XY}(x, y) = F_X(x) F_Y(y) for all x, y.
For discrete variables: p_{XY}(x, y) = p_X(x) p_Y(y).
For continuous variables: f_{XY}(x, y) = f_X(x) f_Y(y).

Conditional Probability
Conditional probability for two events: Pr{A | B} = Pr{A ∩ B} / Pr{B}, for Pr{B} > 0.
Bayes’ Rule: Pr{A | B} = Pr{B | A} Pr{A} / Pr{B}.
Total Probability: if A_1, A_2, … partition the sample space, then Pr{B} = Σ_i Pr{B | A_i} Pr{A_i}.

Expectation and Variance
Expected value of a random variable: E[X] = Σ_x x p(x) for discrete X, or E[X] = ∫ x f_X(x) dx for continuous X.
Expected value of a function of a random variable: E[g(X)] = Σ_x g(x) p(x), or ∫ g(x) f_X(x) dx.
Variance of a random variable: Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2.

Covariance of two random variables The covariance between two random variables is defined as Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]. The covariance between two independent random variables is 0 (why? because independence implies E[XY] = E[X]E[Y]).

Markov Inequality If the random variable X takes only non-negative values, then Pr{X ≥ a} ≤ E[X] / a for any value a > 0. Note that for any value 0 < a < E[X] the bound exceeds 1, so the inequality says nothing!

Chebyshev’s Inequality If X is a random variable having mean μ and variance σ^2, then Pr{|X − μ| ≥ kσ} ≤ 1/k^2 for any value k > 0. This may be a very conservative bound! For X ~ N(μ, σ^2) and k = 2, Chebyshev’s inequality gives Pr{|X − μ| ≥ 2σ} ≤ 0.25, when in fact the actual value is less than 0.05.

The Law of Large Numbers Weak: Let X_1, X_2, … be a sequence of independent and identically distributed random variables having mean μ. Then, for any ε > 0, Pr{|(X_1 + … + X_n)/n − μ| > ε} → 0 as n → ∞. Strong: with probability 1, lim_{n→∞} (X_1 + … + X_n)/n = μ.

The Central Limit Theorem Let X_1, X_2, …, X_n be a sequence of independent random variables with means μ_i and variances σ_i^2. Define the standardized sum X = (Σ_{i=1}^n (X_i − μ_i)) / sqrt(Σ_{i=1}^n σ_i^2). Then (under mild regularity conditions) the distribution of X approaches the standard normal distribution as n increases, and if the X_i are continuous, the density of X approaches f(x) = (1/√(2π)) e^{−x^2/2}.

Chi-Square Test Let k be the number of subintervals, so that p_i = 1/k, i = 1, …, k. Let N_i be the number of samples in each subinterval; note that E[N_i] = N p_i, where N is the total number of samples. Null hypothesis H_0: the observed random variates are indeed uniformly distributed in (0,1). Let the test statistic be T = Σ_{i=1}^k (N_i − N p_i)^2 / (N p_i). Define the p-value = P_{H0}{T > t}, the probability of observing a value larger than the measured t assuming H_0 is correct. For large N, T is well approximated by a chi-square distribution with k − 1 degrees of freedom, so we can use this approximation to evaluate the p-value. H_0 is accepted if the p-value is greater than the chosen significance level (typically 0.05 or 0.01).
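A brief sketch of this test (the generator under test and the choice of k = 10 bins are arbitrary; scipy is assumed to be available for the chi-square tail probability):

    import random
    from scipy.stats import chi2
    N, k = 100_000, 10
    counts = [0] * k
    for _ in range(N):
        counts[int(random.random() * k)] += 1   # bin each variate
    expected = N / k                            # E[N_i] = N * p_i, p_i = 1/k
    T = sum((c - expected) ** 2 / expected for c in counts)
    p_value = chi2.sf(T, k - 1)                 # P{chi2_{k-1} > T}
    print(T, p_value)                           # accept H0 if p_value > 0.05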

Monte-Carlo Approach to Evaluating Integrals Suppose that you want to estimate θ = ∫_0^1 g(x) dx, but it is rather difficult to evaluate the integral analytically, and suppose also that you don’t want to use numerical integration. Let U be a uniformly distributed random variable in the interval [0,1), and let u_i be random variates of the same distribution. Consider the estimator θ̂ = (1/N) Σ_{i=1}^N g(u_i). By the Strong Law of Large Numbers, θ̂ → E[g(U)] with probability 1 as N → ∞. Also note that E[g(U)] = ∫_0^1 g(x) dx = θ.
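A minimal sketch of this estimator; the integrand g(x) = e^{x^2} is an arbitrary example of a function with no elementary antiderivative:

    import math, random
    def mc_integral(g, N=1_000_000):
        # Estimate the integral of g over [0,1) as the sample mean of g(U)
        return sum(g(random.random()) for _ in range(N)) / N
    print(mc_integral(lambda x: math.exp(x * x)))   # close to 1.4627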

Monte-Carlo Approach to Evaluating Integrals Use Monte-Carlo simulation to estimate an integral over a general interval, θ = ∫_a^b g(x) dx. Let y = (x − a)/(b − a); then the integral becomes θ = ∫_0^1 g(a + (b − a) y) (b − a) dy, which can be estimated as before. What if θ = ∫_0^∞ g(x) dx? Use the substitution y = 1/(1 + x), so that dy = −y^2 dx and θ = ∫_0^1 g(1/y − 1) / y^2 dy. What if the integral is multidimensional? Draw independent uniform variates for each coordinate and average g over the resulting points.

Example: Estimate the value of π. Consider the square with corners (−1,−1), (1,−1), (1,1), (−1,1) and the unit circle inscribed in it. Let X, Y be independent random variables uniformly distributed in the interval [−1,1]. The probability that a point (X, Y) falls in the circle is Pr{X^2 + Y^2 ≤ 1} = (area of circle)/(area of square) = π/4. SOLUTION Generate N pairs of uniformly distributed random variates (u1, u2) in the interval [0,1). Transform them to become uniform over the interval [−1,1), using (2u1 − 1, 2u2 − 1). Form the ratio of the number of points that fall in the circle over N; this estimates π/4, so multiply by 4 to estimate π.
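A direct sketch of this estimator:

    import random
    N = 1_000_000
    inside = 0
    for _ in range(N):
        x = 2 * random.random() - 1   # uniform on [-1, 1)
        y = 2 * random.random() - 1
        if x * x + y * y <= 1:        # point falls inside the unit circle
            inside += 1
    print(4 * inside / N)             # estimate of pi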

Discrete Random Variates Suppose we would like to generate a sequence of discrete random variates according to a probability mass function Pr{X = x_j} = p_j, j = 0, 1, … Inverse Transform Method: generate u from U(0,1) and set X = x_j, where j is the smallest index such that u < p_0 + p_1 + … + p_j; graphically, u is placed on the vertical axis of the cdf and mapped back to a value X.

Discrete Random Variate Algorithm (D-RNG-1)
Generate u = U(0,1)
If u < p0, set X = x0, return;
If u < p0 + p1, set X = x1, return;
…
Set X = xn, return;
Recall the requirement for efficient implementation: the above search should be made as efficient as possible! Example: Suppose that X ∈ {0, 1, …, n} and p0 = p1 = … = pn = 1/(n + 1); then X = ⌊(n + 1) u⌋, with no search needed.

Discrete Random Variate Algorithm Assume p0 = 0.1, p1 = 0.2, p2 = 0.4, p3 = 0.3. What is an efficient RNG?
D-RNG-1, Version 1:
Generate u = U(0,1)
If u < 0.1, set X = x0, return;
If u < 0.3, set X = x1, return;
If u < 0.7, set X = x2, return;
Set X = x3, return;
D-RNG-1, Version 2 (more efficient):
Generate u = U(0,1)
If u < 0.4, set X = x2, return;
If u < 0.7, set X = x3, return;
If u < 0.9, set X = x1, return;
Set X = x0, return;
Version 2 checks the most probable outcomes first, so on average it needs fewer comparisons. A general sketch follows below.
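A sketch of the inverse-transform search for an arbitrary pmf (the outcome labels here are simply the indices 0, 1, …):

    import random
    def discrete_variate(pmf):
        # pmf: list of probabilities summing to 1; returns an index
        u = random.random()
        cumulative = 0.0
        for j, p in enumerate(pmf):
            cumulative += p
            if u < cumulative:
                return j
        return len(pmf) - 1               # guard against rounding error
    print([discrete_variate([0.1, 0.2, 0.4, 0.3]) for _ in range(10)])

As the slide notes, reordering the pmf by decreasing probability shortens the average search.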

Geometric Random Variables Let p be the probability of success and q = 1 − p the probability of failure; then X, the time of the first success, has pmf Pr{X = j} = p q^{j−1}, j = 1, 2, … Using the previous discrete random variate algorithm, X = j if 1 − q^{j−1} ≤ u < 1 − q^j, since Pr{X ≤ j} = 1 − q^j. As a result: X = min{ j : q^j < 1 − u } = ⌊ log(1 − u) / log(q) ⌋ + 1, and since 1 − u is also uniform on (0,1), we may equivalently use X = ⌊ log(u) / log(q) ⌋ + 1.
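A one-line sketch of this generator (p = 0.3 is an arbitrary example):

    import math, random
    def geometric(p):
        # Closed-form inverse transform for the first-success time
        u = 1.0 - random.random()     # in (0, 1], avoids log(0)
        return math.floor(math.log(u) / math.log(1 - p)) + 1
    print([geometric(0.3) for _ in range(10)])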

Poisson and Binomial Distributions Poisson distribution with rate λ: p_j = Pr{X = j} = e^{−λ} λ^j / j!, j = 0, 1, … Note: the ratio p_{j+1} / p_j = λ/(j + 1) lets successive terms be computed recursively, so the inverse-transform search needs no factorials. The binomial distribution (n, p) gives the number of successes in n trials given that the probability of success is p: p_j = C(n, j) p^j (1 − p)^{n−j}, j = 0, 1, …, n. Note: similarly, p_{j+1} / p_j = ((n − j)/(j + 1)) (p/(1 − p)).
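A sketch of the recursive inverse transform for the Poisson case:

    import math, random
    def poisson(lam):
        # Inverse transform using p_{j+1} = p_j * lam / (j + 1)
        u = random.random()
        j, p = 0, math.exp(-lam)       # p_0 = e^{-lam}
        F = p                          # running cdf
        while u >= F:
            p *= lam / (j + 1)
            F += p
            j += 1
        return j
    print([poisson(4.0) for _ in range(10)])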

Accept/Reject Method (D-RNG-AR) Suppose that we would like to generate random variates from a pmf {p_j, j ≥ 0}, and that we have an efficient way to generate variates from a pmf {q_j, j ≥ 0}. Choose a constant c such that p_j / q_j ≤ c for all j with p_j > 0. In this case, use the following algorithm.
D-RNG-AR:
1. Generate a random variate Y from pmf {q_j, j ≥ 0}.
2. Generate u = U(0,1).
3. If u < p_Y / (c q_Y), set X = Y and return; else repeat from Step 1.

Accept/Reject Method (D-RNG-AR) Show that the D-RNG-AR algorithm generates a random variate with the required pmf {p_j, j ≥ 0}. In a single iteration, Pr{Y = j and Y is accepted} = q_j · p_j / (c q_j) = p_j / c. Summing over j, Pr{accept} = Σ_j p_j / c = 1/c. Therefore Pr{X = j} = Pr{Y = j | accepted} = (p_j / c) / (1/c) = p_j.

D-RNG-AR Example Determine an algorithm for generating random variates for a random variable that takes values 1, 2, …, 10 with probabilities 0.11, 0.12, 0.09, 0.08, 0.12, 0.10, 0.09, 0.09, 0.10, 0.10 respectively. Take q_j = 1/10 (a discrete uniform Y), so that c = max_j p_j / q_j = 0.12/0.1 = 1.2 and the acceptance test u2 < p_Y / (c q_Y) reduces to u2 < p_Y / 0.12.
D-RNG-1:
Generate u = U(0,1);
k = 1;
while (u > cdf(k)) k = k + 1; end
x(i) = k;
D-RNG-AR:
u1 = U(0,1); u2 = U(0,1);
Y = floor(10*u1 + 1);
while (u2 > p(Y)/0.12)      % p_Y/(c*q_Y) with c = 1.2, q_Y = 0.1
  u1 = U(0,1); u2 = U(0,1);
  Y = floor(10*u1 + 1);
end
y(i) = Y;

The Composition Approach Suppose that we have an efficient way to generate variates from two pmfs, {q_j, j ≥ 0} and {r_j, j ≥ 0}, and that we would like to generate random variates for a random variable having pmf p_j = a q_j + (1 − a) r_j, where a ∈ (0,1). Let X1 have pmf {q_j, j ≥ 0} and X2 have pmf {r_j, j ≥ 0}, and define X to equal X1 with probability a and X2 with probability 1 − a.
Algorithm D-RNG-C:
Generate u = U(0,1)
If u ≤ a, generate X1;
Else generate X2.

Continuous Random Variates Inverse Transform Method Suppose we would like to generate a sequence of continuous random variates having cumulative distribution function F_X(x). Algorithm C-RNG-1: Let U be a random variable uniformly distributed in the interval (0,1). For any continuous distribution function, the random variate X is given by X = F_X^{−1}(U): generate u and read off the x-value at which the cdf equals u.

Example: Exponentially Distributed Random Variable Suppose we would like to generate a sequence of random variates having density function f_X(x) = λ e^{−λx}, x ≥ 0. Solution Find the cumulative distribution: F_X(x) = 1 − e^{−λx}. Setting a uniformly distributed random variate u = 1 − e^{−λx} and solving gives x = −(1/λ) ln(1 − u). Equivalently, since 1 − u is also uniformly distributed in (0,1), x = −(1/λ) ln(u).
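As a sketch (the rate λ = 2.0 is arbitrary):

    import math, random
    def exponential(lam):
        # Inverse transform: x = -ln(u)/lam, u uniform on (0, 1]
        return -math.log(1.0 - random.random()) / lam
    print([round(exponential(2.0), 3) for _ in range(5)])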

Convolution Techniques and the Erlang Distribution Suppose the random variable X is the sum of a number of independent identically distributed random variables, X = Y1 + Y2 + … + Yn.
Algorithm C-RNG-Cv:
Generate Y1, …, Yn from the given distribution;
Set X = Y1 + Y2 + … + Yn.
An example of such a random variable is the Erlang of order n, which is the sum of n iid exponential random variables with rate λ.
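A sketch of the convolution approach for the Erlang case, reusing the inverse-transform exponential generator above:

    import math, random
    def erlang(n, lam):
        # Sum of n iid exponential(lam) variates
        return sum(-math.log(1.0 - random.random()) / lam for _ in range(n))
    print(round(erlang(3, 2.0), 3))   # mean is n/lam = 1.5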

Accept/Reject Method (C-RNG-AR) Suppose that we would like to generate random variates from a pdf f_X(x), and that we have an efficient way to generate variates from a pdf g_X(x). Choose a constant c such that f_X(x) / g_X(x) ≤ c for all x. In this case, use the following algorithm.
C-RNG-AR:
1. Generate a random variate Y from density g_X(x).
2. Generate u = U(0,1).
3. If u < f_X(Y) / (c g_X(Y)), set X = Y and return; else repeat from Step 1.

Accept/Reject Method (C-RNG-AR) C-RNG-AR is similar to the D-RNG-AR algorithm except for the comparison step: rather than comparing the two probabilities, we compare the values of the density functions. Theorem: The random variates generated by the Accept/Reject method have density f_X(x). The number of iterations of the algorithm that are needed is a geometric random variable with mean c. Note: The constant c is important since it determines the expected number of iterations before a variate is accepted; therefore c should be chosen as small as possible.

C-RNG-AR Example Use the C-RNG-AR method to generate random variates X that are normally distributed with mean 0 and variance 1, N(0,1). First consider the pdf of the absolute value |X|: f(x) = (2/√(2π)) e^{−x^2/2}, x ≥ 0. We know how to generate exponentially distributed random variates Y with rate λ = 1, i.e., g(x) = e^{−x}, x ≥ 0. Determine c as the maximum of the ratio f(x)/g(x) = √(2/π) e^{x − x^2/2}; the maximum occurs at x = 1, giving c = √(2e/π) ≈ 1.32, and the acceptance test becomes u < f(Y)/(c g(Y)) = e^{−(Y−1)^2/2}.

C-RNG-AR Example
C-RNG-AR for N(0,1):
u1 = U(0,1); u2 = U(0,1);
Y = -log(u1);
while (u2 > exp(-0.5*(Y-1)*(Y-1)))
  u1 = U(0,1); u2 = U(0,1);
  Y = -log(u1);
end
u3 = U(0,1);
if u3 < 0.5, X = Y; else X = -Y; end
The loop produces |X|; the random sign from u3 makes X ~ N(0,1) by symmetry. Suppose we would like Z ~ N(μ, σ^2); then set Z = μ + σX.

Generating a Homogeneous Poisson Process A homogeneous Poisson process is a sequence of points (events) where the inter-event times are exponentially distributed with rate λ. (The Poisson process will be studied in detail in later classes.) Let t_i denote the i-th point of a Poisson process; then the algorithm for generating the first N points of the sequence {t_i, i = 1, 2, …, N} is given by
Algorithm Poisson-λ:
k = 0; t(k) = 0;
while k < N
  k = k + 1;
  Generate u = U(0,1);
  t(k) = t(k-1) - log(u)/lambda;
end
Return t.

Generating a Non-Homogeneous Poisson Process Suppose that the process is non-homogeneous, i.e., the rate varies with time, with λ(t) ≤ λ for all t < T. Let t_i denote the i-th point of the process and τ the actual time. The algorithm generates candidate points at the constant rate λ and keeps (“thins”) each candidate with probability λ(τ)/λ:
Algorithm Thinning Poisson-λ:
k = 0; t(k) = 0; tau = 0;
while k < N
  Generate u1 = U(0,1);
  tau = tau - log(u1)/lambda;
  Generate u2 = U(0,1);
  if u2 < lambda(tau)/lambda
    k = k + 1; t(k) = tau;
  end
end
Return t.

Generating a Non-Homogeneous Poisson Process Again, suppose that the process is non-homogeneous, i.e., λ(t) ≤ λ for all t < T, but now we would like to generate all points t_i directly, without thinning. Assuming that we are at point t_i, the question we need to answer is: what is the cdf of S_i, the time until the next event? It is F_{S_i}(s) = Pr{S_i ≤ s} = 1 − exp(−∫_{t_i}^{t_i + s} λ(t) dt). Thus, to simulate this process, we start from t_0 and generate S_1 from F_{S_1} to go to t_1 = t_0 + S_1. Then, from t_1, we generate S_2 from F_{S_2} to go to t_2 = t_1 + S_2, and so on.

Example of a Non-Homogeneous Poisson Process Suppose that λ(t) = 1/(t + α), t ≥ 0, for some positive constant α. Generate variates from this non-homogeneous Poisson process. First, let us determine the cdf of S_i: F_{S_i}(s) = 1 − exp(−∫_{t_i}^{t_i+s} dt/(t + α)) = 1 − (t_i + α)/(t_i + s + α). Inverting this yields s = F_{S_i}^{−1}(u) = (t_i + α) u / (1 − u).

Example of a Non-Homogeneous Poisson Process Inverse Transform: generate u = U(0,1) and set S_i = (t_{i−1} + α) u / (1 − u), then t_i = t_{i−1} + S_i. Thus we start from t_0 = 0 and apply the recursion for i = 1, 2, …
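A sketch of this direct method (α = 1.0 and N = 5 points are arbitrary choices):

    import random
    alpha, N = 1.0, 5
    t = 0.0
    points = []
    for _ in range(N):
        u = random.random()
        t += (t + alpha) * u / (1.0 - u)   # S_i = (t + alpha) * u / (1 - u)
        points.append(t)
    print(points)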

The Composition Approach Suppose that we have an efficient way to generate variates from the cdfs G_1(x), …, G_n(x), and that we would like to generate random variates for a random variable having cdf F_X(x) = Σ_{i=1}^n p_i G_i(x), where p_i ≥ 0 and Σ_i p_i = 1.
Algorithm C-RNG-C:
Generate u = U(0,1)
If u < p1, get X from G1(x), return;
If u < p1 + p2, get X from G2(x), return;
…
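A sketch with a hypothetical mixture of two exponential cdfs (the weights and rates below are arbitrary illustrative choices):

    import math, random
    def exp_variate(lam):
        return -math.log(1.0 - random.random()) / lam
    def hyperexponential():
        # F(x) = 0.3*G1(x) + 0.7*G2(x), with G_i exponential cdfs (assumed mix)
        u = random.random()
        return exp_variate(1.0) if u < 0.3 else exp_variate(5.0)
    print([round(hyperexponential(), 3) for _ in range(5)])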

Polar Method for Generating Normal Random Variates Let X and Y be independent normally distributed random variables with zero mean and variance 1. Then the joint density function is f_{XY}(x, y) = (1/(2π)) e^{−(x^2 + y^2)/2}. Change to polar coordinates (R, θ), with d = R^2 = X^2 + Y^2 and θ = tan^{−1}(Y/X). The new joint density factors into two independent parts: d = R^2 is exponentially distributed with rate 1/2, and θ is uniform in the interval [0, 2π].

Polar Method for Generating Normal Random Variates
Algorithm C-RNG-N1:
Generate u1 = U(0,1), u2 = U(0,1);
R = -2*log(u1);        % R^2, exponential with rate 1/2
W = 2*pi*u2;           % uniform angle
X = sqrt(R)*cos(W);
Y = sqrt(R)*sin(W);
This generates 2 independent RVs, but the sine and cosine evaluations are inefficient!
Algorithm C-RNG-N2:
1. Generate u1 = U(0,1), u2 = U(0,1);
2. Set V1 = 2*u1 - 1; V2 = 2*u2 - 1;   % (V1,V2) uniform on the square [-1,1] x [-1,1]
3. S = V1*V1 + V2*V2;
4. If S > 1, go to 1;                  % reject points outside the unit circle
5. R = sqrt(-2*log(S)/S);
6. X = R*V1; Y = R*V2;

Simulation of Discrete Event Systems [Diagram: the main simulation loop, with blocks INITIALIZE, CLOCK STRUCTURE, RNG, STATE, TIME, and an EVENT CALENDAR holding pairs (e1, t1), (e2, t2), …] The loop: update the time, t' = t1 (the time of the most imminent event); update the state, x' = f(x, e1); add new feasible events to the calendar and delete events that have become infeasible.

Verification of a Simulation Program Standard debugging techniques:
Debug “modules” or subroutines separately.
Create simple special cases where you know what to expect as output from each of the modules. Often, by choosing the system parameters carefully, the simulation model can be evaluated analytically.
Create a trace which keeps track of the state variables, the event list, and other variables.