CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, and B: Michael Baron, Probability and Statistics for Computer Scientists, CRC 2006. Instructor: Longin Jan Latecki. Ch. 6: Simulations

What is a simulation? One uses a model to create specific situations in order to study the response of the model to them, and then interprets this in terms of what would happen to the system "in the real world". Models for such systems involve random variables, and we speak of probabilistic or stochastic models; simulating them is stochastic simulation. Stochastic simulation of a system means generating values for all the random variables in the model, according to their specified distributions, and recording and analyzing what happens. We refer to the generated values as realizations of the random variables.

6.2 Generating realizations: Bernoulli
Simulations are almost always done by computer; computers usually have one or more so-called (pseudo) random number generators, which mimic the U(0,1) distribution. (Ex: How do you simulate a coin toss with a die?)
Bernoulli random variables: Suppose U has a U(0,1) distribution. To construct a Ber(p) random variable for some 0 < p < 1, define X = 1 if U < p and X = 0 if U ≥ p, so that P(X = 1) = P(U < p) = p and P(X = 0) = P(U ≥ p) = 1 - p.
A Matlab code: U = rand; X = (U < p)
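As a quick sanity check, a sketch of my own (not from the text; the values of p and the number of realizations are arbitrary choices) that draws many Bernoulli realizations and compares the empirical frequency of ones with p:
% Sketch: empirical check of the Ber(p) construction above.
p = 0.3;            % assumed success probability
N = 100000;         % number of simulated Bernoulli realizations (arbitrary)
U = rand(N,1);      % N independent U(0,1) values
X = (U < p);        % X(i) = 1 exactly when U(i) < p
sum(X) / N          % empirical frequency of ones; should be close to p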

Binomial
Binomial random variables: We can obtain a binomial RV as the sum of n independent Bernoulli RVs. For this we start with n uniform RVs, for example:
A Matlab code: n = 20; p = 0.68; U = rand(n,1); X = sum(U < p)
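A sketch of my own (not from the text) that repeats this construction many times to check that the sample mean is close to n*p; the number of repetitions is an arbitrary choice:
% Sketch: repeat the Bin(n,p) construction and compare the sample mean with n*p.
n = 20; p = 0.68;        % parameters from the slide above
N = 10000;               % number of binomial realizations (arbitrary)
U = rand(n,N);           % each column holds n Bernoulli trials
X = sum(U < p, 1);       % column sums: N realizations of Bin(n,p)
[mean(X), n*p]           % sample mean versus the theoretical mean n*p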

Geometric
Geometric random variables: A while loop of Bernoulli trials generates a geometric RV. We run the loop until the first success occurs. The variable X counts the number of trials up to and including the first success: it starts at 1 and is incremented once for every failure.
A Matlab code:
X = 1;              % at least one trial
while rand > p      % continue while the trials fail
    X = X + 1;      % one more trial is needed
end                 % stop at the first success
X
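A sketch of my own wrapping this loop to compare the sample mean with the theoretical mean 1/p (the value of p and the number of repetitions are arbitrary choices):
% Sketch: repeat the geometric construction and compare the sample mean with 1/p.
p = 0.25;                 % assumed success probability
N = 10000;                % number of geometric realizations (arbitrary)
G = zeros(N,1);
for k = 1:N
    X = 1;                % at least one trial
    while rand > p        % keep going while the trials fail
        X = X + 1;
    end
    G(k) = X;             % number of trials until the first success
end
[mean(G), 1/p]            % sample mean versus the theoretical mean 1/p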

Arbitrary discrete distribution
Let X be an arbitrary discrete random variable that takes values x_0, x_1, ... with probabilities p_i = P(X = x_i) such that p_0 + p_1 + ... = 1.
Algorithm:
1. Divide the interval [0, 1] into subintervals of lengths p_i:
A_0 = [0, p_0), A_1 = [p_0, p_0 + p_1), A_2 = [p_0 + p_1, p_0 + p_1 + p_2), ...
2. Obtain U from Uniform(0, 1), the standard uniform distribution.
3. Set X = x_i if U belongs to the interval A_i.
X has the desired distribution, since P(X = x_i) = P(U ∈ A_i) = p_i, the length of A_i.
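A minimal MATLAB sketch of this algorithm (my own; the vectors x and prob and their values are hypothetical choices):
% Sketch: sample from an arbitrary discrete distribution via subintervals of [0,1].
x    = [0 1 2 5];             % assumed values x_0, x_1, ...
prob = [0.1 0.4 0.3 0.2];     % assumed probabilities p_i, summing to 1
U = rand;                     % one standard uniform realization
edges = cumsum(prob);         % right endpoints of A_0, A_1, ...
i = find(U < edges, 1);       % index of the subinterval containing U
X = x(i)                      % realization with P(X = x(i)) = prob(i)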

Continuous Random Variables
A simple yet surprising fact: regardless of the initial distribution of the RV X, the cdf of X evaluated at X itself, F_X(X), has a uniform distribution.
Let X be a continuous RV with a strictly increasing cdf F_X(x), and define the RV U = F_X(X). Then the distribution of U is Uniform(0, 1).
Proof: Since F_X(x) is a cdf, 0 ≤ F_X(x) ≤ 1 for all x, hence the values of U lie in [0, 1]. We show that the cdf of U is F_U(u) = u for 0 ≤ u ≤ 1, hence U is Uniform(0, 1):
F_U(u) = P(U ≤ u)               (by definition of cdf)
       = P(F_X(X) ≤ u)
       = P(X ≤ F_X^{-1}(u))     (F_X is strictly increasing)
       = F_X(F_X^{-1}(u)) = u.
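A quick numerical illustration of this fact (a sketch of my own, using an Exp(1) sample as the example; the sample is generated with the inverse-transform formula derived on the following slides):
% Sketch: apply the Exp(1) cdf to an Exp(1) sample; the result should look U(0,1).
lambda = 1;                          % rate of the exponential example (my choice)
X = -log(rand(100000,1)) / lambda;   % Exp(lambda) sample via inverse transform
U = 1 - exp(-lambda * X);            % U = F_X(X)
[mean(U), var(U)]                    % should be close to 1/2 and 1/12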

Generating Continuous Random Variables
In order to generate a continuous RV X with cdf F_X, we invert the relation U = F_X(X): X can be obtained from the standard uniform variable U as X = F_X^{-1}(U), often written X = F^{inv}(U).
Proof: P(X ≤ x) = P(F_X^{-1}(U) ≤ x) = P(U ≤ F_X(x)) = F_X(x), since U is Uniform(0, 1) and therefore P(U ≤ t) = t for 0 ≤ t ≤ 1.
Algorithm:
1. Obtain a sample u from the standard uniform RV U.
2. Compute a sample from X as x = F_X^{-1}(u).
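As an illustration of the algorithm (my own example, not from the text), take F_X(x) = x^2 on [0, 1], so that F_X^{-1}(u) = sqrt(u):
% Sketch: inverse-transform sampling for the cdf F_X(x) = x^2 on [0,1] (example of my choosing).
% Here F_X^{-1}(u) = sqrt(u), so X = sqrt(U) has density 2x on [0,1] and mean 2/3.
N = 100000;
U = rand(N,1);            % standard uniform samples
X = sqrt(U);              % samples with cdf x^2
[mean(X), 2/3]            % sample mean versus the theoretical mean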

Exponential random variables
Applying this method to the exponential distribution, F(x) = 1 - e^(-λx): if U has a U(0,1) distribution, then the random variable X defined below has an Exp(λ) distribution, so we can simulate an Exp(λ) RV using a Uniform(0,1) RV.
F(x) = u  ⇔  x = -(1/λ) ln(1 - u) = F^{inv}(u)
X = F^{inv}(U) = -(1/λ) ln(1 - U) = -(1/λ) ln U,
where the last step uses that 1 - U is also Uniform(0,1).
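A minimal MATLAB sketch of this recipe (the rate λ = 0.5 is chosen to match the interarrival times of Section 6.4 below; the sample size is arbitrary):
% Sketch: simulate Exp(lambda) via the inverse-transform formula X = -log(U)/lambda.
lambda = 0.5;             % rate parameter (assumed)
N = 100000;               % sample size (arbitrary)
U = rand(N,1);            % standard uniform samples
X = -log(U) / lambda;     % Exp(lambda) samples
[mean(X), 1/lambda]       % sample mean versus the theoretical mean 1/lambda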

6.4 A single server queue
There is a well, and people want to determine how powerful a pump they should buy: more pump power means shorter waiting times but also higher cost.
Let T_i denote the time between consecutive customers, called the interarrival time; e.g., the time between customers 1 and 2 is T_2. Let S_i be the length of time that customer i needs to use the pump. The pump capacity v (liters per minute) is the model parameter that we wish to determine. If customer i requires R_i liters of water, then S_i = R_i / v.
To complete the model description, we need to specify the distributions of T and R:
Interarrival times: every T_i has an Exp(0.5) distribution (minutes).
Service requirement: every R_i has a U(2,5) distribution (liters).

6.4 cont.
W_i denotes the waiting time of customer i; W_1 = 0, since the first customer does not have to wait. The subsequent waiting times follow from customer i-1 via
W_i = max{W_{i-1} + S_{i-1} - T_i, 0}    (see pg. 81).
We are interested in the average waiting time over the first n customers, W̄_n = (W_1 + W_2 + ... + W_n) / n. Figure 6.7 is the plot of the points (n, W̄_n) for n = 1, 2, ...
The average waiting time for v = 2 turns out to be around 2 minutes; for v = 3 it is around 0.5 minutes.
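A compact MATLAB sketch of this simulation (my own implementation of the recursion above; the number of customers is an arbitrary choice):
% Sketch: simulate the single-server pump queue and estimate the average waiting time.
lambda = 0.5;  v = 2;  n = 10000;        % arrival rate, pump capacity, number of customers
T = -log(rand(n,1)) / lambda;            % interarrival times, Exp(0.5) (minutes)
R = 2 + 3*rand(n,1);                     % water requirements, U(2,5) (liters)
S = R / v;                               % service times (minutes)
W = zeros(n,1);                          % W(1) = 0: the first customer does not wait
for i = 2:n
    W(i) = max(W(i-1) + S(i-1) - T(i), 0);   % recursion from the slide above
end
Wbar = cumsum(W) ./ (1:n)';              % running averages, as plotted in Figure 6.7
Wbar(end)                                % estimate of the average waiting time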

6.4 Work in system
One approach to determining how busy the pump is at any one time is to record at every moment how much work there is in the system. For example, if I am halfway through filling my 4-liter container and three people are waiting who require 2, 3, and 5 liters, then there are 12 liters to go; at v = 2 there are 6 minutes of work in the system, and at v = 3 there are just 4 minutes. The amount of work in the system just before a customer arrives equals the waiting time of that customer; this is also called the virtual waiting time. Figures 6.8 and 6.9 illustrate the work in the system for v = 2 and v = 3, respectively.