CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, and Michael Baron, Probability and Statistics for Computer Scientists, CRC 2006. Instructor: Longin Jan Latecki. Ch. 6: Simulations

What is a simulation? One uses a model to create specific situations in order to study the response of the model to them, and then interprets the results in terms of what would happen to the system "in the real world." Models of such systems involve random variables; we speak of probabilistic or stochastic models, and simulating them is stochastic simulation. Stochastic simulation of a system means generating values for all the random variables in the model according to their specified distributions, and recording and analyzing what happens. We refer to the generated values as realizations of the random variables.

Quick exercise 6.1 Describe how you can simulate a coin toss when instead of a coin you have a die. Any ideas on how to simulate a roll of a die if you only have a coin?

Quick exercise 6.1 (answer) A coin toss from a die: let an even number of dots count as heads and an odd number as tails. A die roll from a coin: toss the coin three times; of the 8 equally likely outcomes, map HHH, HHT, HTH, HTT, THH, THT to the values 1 through 6, and repeat the three coin tosses if you get TTH or TTT.

Quick exercise 6.2 A random variable Y has outcomes 1, 3, and 4 with the following probabilities: P(Y = 1) = 3/5, P(Y = 3) = 1/5, and P(Y = 4) = 1/5. Describe how to construct Y from a U(0, 1) random variable.
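One possible construction, sketched in Python (the slides use Matlab; the function name is mine): split [0, 1) into intervals of lengths 3/5, 1/5, and 1/5, and map a standard uniform draw to the outcome whose interval it falls in.

```python
import random

def y_from_u(u):
    """Map a U(0,1) value u to an outcome of Y with
    P(Y=1)=3/5, P(Y=3)=1/5, P(Y=4)=1/5, by splitting [0,1)
    into subintervals of those lengths."""
    if u < 3/5:
        return 1
    elif u < 4/5:
        return 3
    return 4

y = y_from_u(random.random())  # one realization of Y
```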

6.2 Generating realizations: Bernoulli
Simulations are almost always done by computer, and computers usually have one or more so-called (pseudo) random number generators, which mimic U(0,1).
Bernoulli random variables: Suppose U has a U(0,1) distribution. To construct a Ber(p) random variable for some 0 < p < 1, define
X = 1 if U < p, and X = 0 if U ≥ p,
so that P(X = 1) = P(U < p) = p and P(X = 0) = P(U ≥ p) = 1 − p.
Matlab code:
U = rand;
X = (U < p)
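The same construction sketched in Python (the slides use Matlab; the optional u argument is mine, for testing with a fixed uniform value):

```python
import random

def bernoulli(p, u=None):
    """Ber(p) realization: 1 if u < p, else 0.
    If u is not supplied, draw it from the (pseudo) U(0,1) generator."""
    if u is None:
        u = random.random()
    return 1 if u < p else 0
```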

Binomial
Binomial random variables: We can obtain a binomial RV as a sum of n independent Bernoulli RVs. For this we start with n uniform RVs, for example:
Matlab code:
n = 20; p = 0.68;
U = rand(n,1);
X = sum(U < p)
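A Python sketch of the same sum-of-Bernoullis idea (the slides use Matlab):

```python
import random

def binomial(n, p):
    """Bin(n, p) realization as the sum of n independent Ber(p) draws."""
    return sum(1 for _ in range(n) if random.random() < p)

x = binomial(20, 0.68)  # one realization, as in the Matlab example
```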

Geometric
Geometric random variables: A while-loop of Bernoulli trials generates a geometric RV. We run the loop until the first success occurs; the variable X counts the number of trials, up to and including that first success.
Matlab code:
X = 1;            % at least one trial
while rand > p    % continue while there are failures
    X = X + 1;
end               % stop at the first success
X
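The same loop sketched in Python (the slides use Matlab):

```python
import random

def geometric(p):
    """Geo(p) realization: number of Bernoulli trials up to and
    including the first success."""
    x = 1                       # at least one trial
    while random.random() > p:  # failure: try again
        x += 1
    return x                    # stop at the first success
```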

Arbitrary discrete distribution
Arbitrary discrete random variable X takes values x0, x1, … with probabilities p_i = P(X = x_i) such that p0 + p1 + … = 1.
Algorithm:
1. Divide the interval [0, 1] into subintervals of lengths p_i:
A0 = [0, p0)
A1 = [p0, p0 + p1)
A2 = [p0 + p1, p0 + p1 + p2), …
2. Obtain U from Uniform(0, 1), the standard uniform distribution.
3. Set X = x_i if U belongs to interval A_i.
Then X has the desired distribution, since P(X = x_i) = P(U ∈ A_i) = p_i.
(Diagram: the intervals A0, A1, A2, A3 partition [0, 1].)
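The algorithm above can be sketched in Python (the slides use Matlab; the helper name is mine). Precomputing the cumulative sums p0, p0+p1, … gives the right endpoints of the intervals A_i, and a binary search locates the interval containing u:

```python
import bisect
import itertools
import random

def make_discrete_sampler(values, probs):
    """Build a sampler for P(X = values[i]) = probs[i]."""
    cum = list(itertools.accumulate(probs))  # endpoints of A0, A1, ...
    def sample(u=None):
        if u is None:
            u = random.random()
        # index i such that u lies in A_i; the min guards against
        # cumulative sums falling just short of 1.0 in floating point
        i = min(bisect.bisect_right(cum, u), len(values) - 1)
        return values[i]
    return sample
```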

Continuous random variables
GaussianEx1.m plots a histogram of samples drawn from a Gaussian with mean zero and std one, together with the Gaussian pdf with mean zero and std one. The histogram values are normalized so that the total area of the histogram bars (rectangles) equals 1. This makes the two plots directly comparable, since the area under the Gaussian curve is also one. Note that here we drew samples directly from the target distribution, which is usually impossible.
x = -5:0.1:5;
g = normpdf(x, 0, 1);            % target pdf
samples = randn(1000, 1);        % 1000 draws from N(0,1)
%samples = normrnd(0, 1, 1000, 1);
[hs, hx] = hist(samples, x);     % counts at the bin centers x
hs = (hs/1000)*(101/10);         % normalize so the bar areas sum to ~1
figure;
bar(x, hs, 'r');
hold on
plot(x, g, 'LineWidth', 3);
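The normalization idea can be checked numerically with a stdlib-only Python sketch (the slides use Matlab and plot the result; here we only verify that the bar areas sum to one and that the density estimate near zero matches the N(0,1) pdf):

```python
import math
import random

def normalized_histogram(samples, lo, hi, width):
    """Histogram over [lo, hi) scaled so that the bar areas sum to ~1."""
    nbins = round((hi - lo) / width)
    counts = [0] * nbins
    for s in samples:
        if lo <= s < hi:
            counts[min(int((s - lo) / width), nbins - 1)] += 1
    n = len(samples)
    return [c / (n * width) for c in counts]  # density estimates

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]
hs = normalized_histogram(samples, -5.0, 5.0, 0.1)

pdf0 = 1.0 / math.sqrt(2 * math.pi)  # N(0,1) pdf at 0, ~0.3989
```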

A simple yet surprising fact: regardless of the initial distribution of a RV X, applying its own cdf to it yields a uniform distribution.
Let X be a continuous RV with a strictly increasing cdf F_X(x), and define the RV U = F_X(X). Then U has the Uniform(0, 1) distribution.
Proof: Since F_X(x) is a cdf, 0 ≤ F_X(x) ≤ 1 for all x, hence the values of U lie in [0, 1]. We show that the cdf of U is F_U(u) = u, hence U is Uniform(0, 1):
F_U(u) = P(U ≤ u)                 by definition of cdf
       = P(F_X(X) ≤ u)
       = P(X ≤ F_X^{-1}(u))       since F_X is strictly increasing
       = F_X(F_X^{-1}(u))         by definition of cdf
       = u.
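A quick empirical check of this fact in Python (a sketch, not from the slides): take X ~ Exp(1), whose cdf is F(x) = 1 − e^{−x}, and verify that U = F(X) looks Uniform(0, 1).

```python
import math
import random

random.seed(3)
# apply the cdf of Exp(1) to 10,000 Exp(1) draws
us = [1.0 - math.exp(-random.expovariate(1.0)) for _ in range(10_000)]
```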

Generating continuous random variables
To generate a continuous RV X with cdf F_X, we invert the formula U = F_X(X): X can be obtained from the standard uniform variable U as X = F_X^{-1}(U).
Proof: P(F_X^{-1}(U) ≤ x) = P(U ≤ F_X(x)) = F_X(x), since U is Uniform(0, 1).
Algorithm:
1. Obtain a sample u from the standard uniform RV U.
2. Compute a sample from X as x = F_X^{-1}(u).

Exponential random variables
Applying this method to the exponential distribution F(x) = 1 − e^{−λx}: if U has a U(0,1) distribution, then
F(x) = u ↔ x = −(1/λ) ln(1 − u) = F^{-1}(u),
so the random variable X defined by
X = F^{-1}(U) = −(1/λ) ln(1 − U)
has the Exp(λ) distribution. Since 1 − U also has a U(0,1) distribution, X = −(1/λ) ln(U) works as well. So we can simulate a RV with the Exp(λ) distribution using a Uniform(0,1) RV.
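The inverse-transform recipe for Exp(λ), sketched in Python (the slides use Matlab; the optional u argument is mine, for testing with a fixed uniform value):

```python
import math
import random

def exponential(lam, u=None):
    """Exp(lam) realization via inverse transform: X = -(1/lam) ln(1 - U)."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / lam
```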

SamplingEx1.m plots a histogram of samples drawn from the exponential distribution together with the exponential pdf. The histogram values are normalized so that the total area of the histogram bars (rectangles) equals one. This makes the two plots directly comparable.
x = 0:0.1:10;
lambda = 1;
f = lambda*exp(-lambda*x);          % target pdf
% sample from the uniform distribution U(0,1)
samples = rand(1000, 1);
% apply the inverse cdf: x = -(1/lambda) ln(1 - u)
samples = -(1/lambda)*log(1 - samples);
[hs, hx] = hist(samples, x);
hs = (hs/1000)*(101/10);            % normalize so the bar areas sum to ~1
figure;
bar(x, hs, 'r');
hold on
plot(x, f, 'LineWidth', 3);

Quick exercise 6.3 A distribution function F satisfies F(x) = 0 for x < 1, F(x) = 1 for x > 3, and F(x) = 1/4 (x − 1)^2 if 1 ≤ x ≤ 3. Let U be a U(0, 1) random variable. Construct a random variable with distribution F from U.

Quick exercise 6.3 (answer) Solving F(x) = u for 1 ≤ x ≤ 3: 1/4 (x − 1)^2 = u gives x = 1 + 2√u, so X = 1 + 2√U has distribution function F.
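In Python (a sketch; the function name is mine): solving 1/4 (x − 1)^2 = u for x on [1, 3] gives the inverse cdf x = 1 + 2√u, so applying it to a uniform draw produces a realization with distribution F.

```python
import math
import random

def x_from_u(u):
    """Inverse of F(x) = (1/4)(x - 1)^2 on [1, 3]: x = 1 + 2*sqrt(u)."""
    return 1.0 + 2.0 * math.sqrt(u)

x = x_from_u(random.random())  # one realization with distribution F
```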

6.4 A single server queue
There is a well, and people want to determine how powerful a pump they should buy: more power means shorter wait times, but also higher cost.
Let T_i represent the time between consecutive customers, called the interarrival time; e.g., T_2 is the time between customers 1 and 2. Let S_i be the length of time that customer i needs to use the pump. The pump capacity v (liters per minute) is the model parameter that we wish to determine. If customer i requires R_i liters of water, then S_i = R_i / v.
To complete the model description, we specify the distributions of T and R:
Interarrival times: every T_i has an Exp(0.5) distribution (minutes).
Service requirement: every R_i has a U(2, 5) distribution (liters).
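The model inputs can be sketched in Python (the book's examples use Matlab; the helper names are mine):

```python
import random

def interarrival():
    """T_i ~ Exp(0.5): mean interarrival time of 2 minutes."""
    return random.expovariate(0.5)

def service_time(v):
    """S_i = R_i / v, with the requirement R_i ~ U(2, 5) liters
    and pump capacity v liters per minute."""
    return random.uniform(2.0, 5.0) / v
```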

6.4 cont.
W_i denotes the waiting time of customer i; W_1 = 0, since the first customer does not have to wait. The waiting time of customer i depends on customer i − 1:
W_i = max{W_{i−1} + S_{i−1} − T_i, 0}.
We are interested in the average waiting time over the first n customers:
W̄_n = (W_1 + W_2 + … + W_n) / n.
The average wait time for v = 2 turns out to be around 2 minutes; for v = 3 it turns out to be around 0.5 minutes.
(Figure: plot of (n, W̄_n) for customers n = 1, 2, …)
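The waiting-time recursion and the resulting average can be sketched in Python (a sketch under the model's assumptions of Exp(0.5) interarrivals and U(2, 5) requirements; function names are mine):

```python
import random

def waiting_times(t, s):
    """w[0] = 0 and w[i] = max(w[i-1] + s[i-1] - t[i], 0), where t[i]
    is the interarrival time before customer i and s[i] is the service
    time of customer i."""
    w = [0.0]
    for i in range(1, len(t)):
        w.append(max(w[i - 1] + s[i - 1] - t[i], 0.0))
    return w

def simulate_avg_wait(n, v, seed=0):
    """Average waiting time of the first n customers for pump capacity v."""
    rng = random.Random(seed)
    t = [rng.expovariate(0.5) for _ in range(n)]       # T_i ~ Exp(0.5)
    s = [rng.uniform(2.0, 5.0) / v for _ in range(n)]  # S_i = R_i / v
    return sum(waiting_times(t, s)) / n
```

A more powerful pump (larger v) shrinks every service time and hence, by the recursion, every waiting time.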

6.4 Work in system
One approach to determining how busy the pump is at any one time is to record at every moment how much work there is in the system. For example, if I am halfway through filling my 4-liter container and 3 people are waiting who require 2, 3, and 5 liters, then there are 12 liters to go; at v = 2 there are 6 minutes of work in the system, and at v = 3 there are just 4.
The amount of work in the system just before a customer arrives equals the waiting time of that customer; it is also called the virtual waiting time. Figure 6.8 illustrates the work in the system for v = 2 and v = 3.
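The arithmetic in the example above can be sketched as (the helper name is mine):

```python
def work_in_system(liters_remaining, v):
    """Minutes of work in the system: liters still to be pumped,
    at capacity v liters per minute."""
    return sum(liters_remaining) / v

# the example above: 2 liters left of my 4-liter container,
# plus waiting customers needing 2, 3, and 5 liters -> 12 liters to go
remaining = [2, 2, 3, 5]
```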