Monte Carlo Approximations – Introduction


Monte Carlo Approximations – Introduction

Suppose X1, X2, … is a sequence of independent and identically distributed random variables with unknown mean µ. Let

  Mn = (X1 + X2 + … + Xn) / n.

The laws of large numbers indicate that for large n, Mn ≈ µ. Therefore, it is possible to use Mn as an estimator, or approximation, of µ. Estimators like Mn can also be used to estimate purely mathematical quantities that are too difficult to compute directly. Such estimators are called Monte Carlo approximations.

STA347 - week 12
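As a quick illustration of the idea (not from the slides), here is a minimal Python sketch of approximating an unknown mean by the sample average Mn; the function name mc_mean and the Uniform[0, 1] example are made up for illustration:

```python
import random

def mc_mean(draw, n, seed=0):
    """Approximate mu = E[X] by the sample average M_n = (X_1 + ... + X_n) / n."""
    rng = random.Random(seed)
    return sum(draw(rng) for _ in range(n)) / n

# Example: X ~ Uniform[0, 1] has mu = 1/2, so for large n, M_n should be near 1/2.
m = mc_mean(lambda rng: rng.random(), 100_000)
```

With n = 100,000 draws the estimate lands within a few thousandths of the true mean 0.5, as the law of large numbers predicts.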

Example

Suppose we wish to evaluate the integral

  I = ∫₀^∞ 25 x² cos(x²) e^(−25x) dx.

This integral cannot easily be solved exactly, but it can be computed approximately using a Monte Carlo approximation. We first note that I = E(X² cos(X²)), where X ~ Exponential(25). Hence, for large n the integral I is approximately equal to Mn = (T1 + … + Tn) / n, where Ti = Xi² cos(Xi²), with X1, X2, … i.i.d. Exponential(25). Further, in the example on slide 10, we established a method to simulate X ~ Exponential(25).

Putting things together we obtain the following algorithm for approximating the integral I.

Step 1: Select a large positive integer n.
Step 2: Obtain Ui ~ Uniform[0, 1], independently for i = 1, …, n.
Step 3: Set Xi = −(1/25) ln(1 − Ui), for i = 1, …, n.
Step 4: Set Ti = Xi² cos(Xi²), for i = 1, …, n.
Step 5: Estimate I by Mn = (T1 + … + Tn) / n.

For large n this algorithm will provide a good estimate of the integral I.
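The steps of the algorithm can also be sketched in Python (the function name approx_I is hypothetical; the inverse-CDF transform X = −(1/25) ln(1 − U) is the Exponential(25) simulation used in the R code below):

```python
import math
import random

def approx_I(n, seed=1):
    """Monte Carlo estimate of I = E(X^2 cos(X^2)) with X ~ Exponential(25)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()                    # Step 2: U_i ~ Uniform[0, 1]
        x = -math.log(1.0 - u) / 25.0       # Step 3: inverse-CDF transform
        total += x * x * math.cos(x * x)    # Step 4: T_i = X_i^2 cos(X_i^2)
    return total / n                        # Step 5: M_n = average of the T_i

I_hat = approx_I(100_000)
```

The result should be close to the value 0.00319 obtained from the R run on the next slide.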

Example Using R

The following is the R code for approximating the integral I in the example above.

> U = runif(100000)
> X = -(1/25)*log((1-U), base = exp(1))
> T = X^2*cos(X^2)
> I = mean(T)
> I
[1] 0.00318922

Assessing Error of MC Approximations

Any time we approximate or estimate a quantity, we must also indicate how much error there is in the estimate. However, we cannot say exactly what the error is, since we are approximating an unknown quantity. Nevertheless, the central limit theorem provides a natural approach to assessing this error, using three times the standard error of the estimate. Thus, we can approximate an unknown quantity such as the integral I in the example above by quoting Mn and the interval

  (Mn − 3 S/√n, Mn + 3 S/√n),

where S is the sample standard deviation of T1, …, Tn.

Assessing Error Using R

The following is the R code for assessing the error in the approximation of the integral I.

> llimit = I - 3*sd(T)/sqrt(100000)
> llimit
[1] 0.003121188
> ulimit = I + 3*sd(T)/sqrt(100000)
> ulimit
[1] 0.003257252

Conclusion: the value of I is approximately 0.003189, and the true value is almost certainly in the interval (0.003121, 0.003257).

Conditional Probability on a joint discrete distribution

Given the joint pmf of X and Y, we want to find

  P(X = x | Y = y)  and  P(Y = y | X = x).

These are the basis for defining conditional distributions…

Definition

For X, Y discrete random variables with joint pmf pX,Y(x, y) and marginal mass functions pX(x) and pY(y): if x is a number such that pX(x) > 0, then the conditional pmf of Y given X = x is

  pY|X(y | x) = pX,Y(x, y) / pX(x).

Is this a valid pmf? Similarly, the conditional pmf of X given Y = y is

  pX|Y(x | y) = pX,Y(x, y) / pY(y).

Note, from the above conditional pmf we get

  pX,Y(x, y) = pX|Y(x | y) pY(y).

Summing both sides over all possible values of Y we get

  pX(x) = Σy pX|Y(x | y) pY(y).

This is an extremely useful application of the law of total probability. Note: if X, Y are independent random variables then pX|Y(x | y) = pX(x).
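The definition pY|X(y | x) = pX,Y(x, y) / pX(x) is easy to check numerically; here is a Python sketch on a small made-up joint pmf (the table and function names are illustrative, not from the slides):

```python
# Hypothetical joint pmf of (X, Y), stored as {(x, y): probability}.
joint = {(0, 0): 0.1, (0, 1): 0.2,
         (1, 0): 0.3, (1, 1): 0.4}

def p_X(x):
    """Marginal pmf: p_X(x) = sum over y of p_{X,Y}(x, y)."""
    return sum(p for (xx, _), p in joint.items() if xx == x)

def p_Y_given_X(y, x):
    """Conditional pmf: p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x), for p_X(x) > 0."""
    return joint.get((x, y), 0.0) / p_X(x)

# It is a valid pmf: the conditional probabilities sum to 1 over y.
total = sum(p_Y_given_X(y, 1) for y in (0, 1))
```

Here p_Y_given_X(1, 1) = 0.4 / 0.7, and the conditional probabilities for X = 1 sum to one, answering the "is this a valid pmf?" question in the affirmative for this table.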

Example

Suppose we roll a fair die; whatever number comes up we toss a coin that many times. What is the distribution of the number of heads? Let X = number of heads, Y = number on die. We know that

  pY(y) = 1/6 for y = 1, …, 6, and X | Y = y ~ Binomial(y, 1/2).

We want to find pX(x). The conditional probability function of X given Y = y is given by

  pX|Y(x | y) = (y choose x) (1/2)^y, for x = 0, 1, …, y.

By the Law of Total Probability we have

  pX(x) = Σy pX|Y(x | y) pY(y) = (1/6) Σy (y choose x) (1/2)^y.

Possible values of x: 0, 1, 2, …, 6.
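The law-of-total-probability sum for this example can be evaluated in Python (the function name p_heads is hypothetical; math.comb(y, x) conveniently returns 0 when x > y):

```python
from math import comb

# p_X(x) = sum_y p_{X|Y}(x|y) p_Y(y), with Y uniform on {1, ..., 6}
# and X | Y = y ~ Binomial(y, 1/2).
def p_heads(x):
    return sum(comb(y, x) * 0.5 ** y * (1 / 6) for y in range(1, 7))

pmf = [p_heads(x) for x in range(7)]
```

The seven probabilities sum to 1, and the mean of this distribution is 1.75 heads, matching the conditional-expectation computation later in these notes.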

Conditional densities

If X, Y are jointly distributed continuous random variables, the conditional density function of Y | X is defined to be

  fY|X(y | x) = fX,Y(x, y) / fX(x)

if fX(x) > 0, and 0 otherwise. If X, Y are independent then fY|X(y | x) = fY(y). Also,

  fX,Y(x, y) = fY|X(y | x) fX(x).

Integrating both sides over x we get

  fY(y) = ∫ fY|X(y | x) fX(x) dx.

This is a useful application of the law of total probability for the continuous case.
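The identity fY(y) = ∫ fY|X(y | x) fX(x) dx can be checked numerically on a made-up example: take X ~ Uniform[0, 1] and Y | X = x ~ Uniform[0, x], so fY|X(y | x) = 1/x for 0 < y < x, and the integral works out to fY(y) = −ln(y). A Python sketch using a midpoint-rule quadrature:

```python
import math

def f_Y(y, steps=100_000):
    """Midpoint-rule approximation of f_Y(y) = integral over x in (y, 1)
    of f_{Y|X}(y|x) * f_X(x) dx = integral of (1/x) * 1 dx = -ln(y)."""
    h = (1.0 - y) / steps
    return sum(1.0 / (y + (i + 0.5) * h) for i in range(steps)) * h

val = f_Y(0.2)   # should be close to -ln(0.2) = ln(5)
```

The numerical integral agrees with −ln(0.2) ≈ 1.6094 to several decimal places.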

Example

Consider the joint density

Find the conditional density of X given Y and the conditional density of Y given X.

Conditional Expectation

For X, Y discrete random variables, the conditional expectation of Y given X = x is

  E(Y | X = x) = Σy y pY|X(y | x)

and the conditional variance of Y given X = x is

  Var(Y | X = x) = Σy (y − E(Y | X = x))² pY|X(y | x),

where these are defined only if the sums converge absolutely. In general,

  E(g(Y) | X = x) = Σy g(y) pY|X(y | x).
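These sums are easy to evaluate directly for a small joint pmf; a Python sketch (the joint pmf below is a made-up example, not from the slides):

```python
# Hypothetical joint pmf of (X, Y) as {(x, y): probability}.
joint = {(0, 1): 0.1, (0, 2): 0.2,
         (1, 1): 0.3, (1, 2): 0.4}

def cond_exp(x):
    """E(Y | X = x) = sum_y y * p_{Y|X}(y|x)."""
    px = sum(p for (xx, _), p in joint.items() if xx == x)
    return sum(y * p for (xx, y), p in joint.items() if xx == x) / px

def cond_var(x):
    """Var(Y | X = x) = sum_y (y - E(Y|X=x))^2 * p_{Y|X}(y|x)."""
    px = sum(p for (xx, _), p in joint.items() if xx == x)
    mu = cond_exp(x)
    return sum((y - mu) ** 2 * p for (xx, y), p in joint.items() if xx == x) / px
```

For this table, cond_exp(1) = 1.1 / 0.7 = 11/7 and cond_var(1) = 12/49, which can be verified by hand from the definitions.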

For X, Y continuous random variables, the conditional expectation of Y given X = x is

  E(Y | X = x) = ∫ y fY|X(y | x) dy

and the conditional variance of Y given X = x is

  Var(Y | X = x) = ∫ (y − E(Y | X = x))² fY|X(y | x) dy.

In general,

  E(g(Y) | X = x) = ∫ g(y) fY|X(y | x) dy.

Example

Suppose X, Y are continuous random variables with joint density function

Find E(X | Y = 2).

More on Conditional Expectation

Assume that E(Y | X = x) exists for every x in the range of X. Then E(Y | X) is a random variable. The expectation of this random variable is E[E(Y | X)].

Theorem. E[E(Y | X)] = E(Y). This is called the “Law of Total Expectation”.

Proof (discrete case):

  E[E(Y | X)] = Σx E(Y | X = x) pX(x)
            = Σx Σy y pY|X(y | x) pX(x)
            = Σy y Σx pX,Y(x, y)
            = Σy y pY(y) = E(Y).
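The Law of Total Expectation E[E(Y | X)] = E(Y) can also be verified numerically on a small made-up joint pmf (the table is illustrative, not from the slides):

```python
joint = {(0, 1): 0.1, (0, 2): 0.2,
         (1, 1): 0.3, (1, 2): 0.4}

def p_X(x):
    return sum(p for (xx, _), p in joint.items() if xx == x)

def cond_exp(x):
    """E(Y | X = x) computed from the conditional pmf."""
    return sum(y * p for (xx, y), p in joint.items() if xx == x) / p_X(x)

# E[E(Y | X)]: average the conditional means over the distribution of X.
lhs = sum(cond_exp(x) * p_X(x) for x in (0, 1))
# E(Y) computed directly from the joint pmf.
rhs = sum(y * p for (_, y), p in joint.items())
```

Both sides come out to 1.6 for this table, as the theorem guarantees.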

Example

Suppose we roll a fair die; whatever number comes up we toss a coin that many times. What is the expected number of heads? With X = number of heads and Y = number on the die, E(X | Y = y) = y/2, so by the Law of Total Expectation

  E(X) = E[E(X | Y)] = E(Y/2) = (1/2)(7/2) = 7/4.
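By the Law of Total Expectation, E(X) = E[E(X | Y)] = E(Y/2); a one-line Python check of this computation:

```python
# Conditional on Y = y tosses, the expected number of heads is y / 2;
# averaging over the six equally likely die faces gives E(X) = E(Y/2).
expected_heads = sum((y / 2) * (1 / 6) for y in range(1, 7))
```

The sum evaluates to 7/4 = 1.75, agreeing with the pmf computed earlier for this example.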

Theorem

For random variables X, Y,

  V(Y) = V[E(Y | X)] + E[V(Y | X)].

Proof:

  E[V(Y | X)] = E[E(Y² | X) − (E(Y | X))²] = E(Y²) − E[(E(Y | X))²]
  V[E(Y | X)] = E[(E(Y | X))²] − (E[E(Y | X)])² = E[(E(Y | X))²] − (E(Y))²

Adding the two lines gives E(Y²) − (E(Y))² = V(Y).
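The decomposition V(Y) = V[E(Y | X)] + E[V(Y | X)] can be checked numerically on a small made-up joint pmf (illustrative table, not from the slides):

```python
joint = {(0, 1): 0.1, (0, 2): 0.2,
         (1, 1): 0.3, (1, 2): 0.4}
xs = {x for x, _ in joint}

def p_X(x):
    return sum(p for (xx, _), p in joint.items() if xx == x)

def cond_moment(x, k):
    """E(Y^k | X = x)."""
    return sum(y ** k * p for (xx, y), p in joint.items() if xx == x) / p_X(x)

EY = sum(y * p for (_, y), p in joint.items())
# V[E(Y|X)]: variance of the conditional mean over the distribution of X.
v_of_e = sum(p_X(x) * (cond_moment(x, 1) - EY) ** 2 for x in xs)
# E[V(Y|X)]: average of the conditional variances.
e_of_v = sum(p_X(x) * (cond_moment(x, 2) - cond_moment(x, 1) ** 2) for x in xs)
# V(Y) computed directly from the joint pmf.
VY = sum(y ** 2 * p for (_, y), p in joint.items()) - EY ** 2
```

For this table V(Y) = 0.24, and the two pieces of the decomposition sum to exactly that value.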

Example

Let X ~ Geometric(p). Given X = x, let Y have conditionally the Binomial(x, p) distribution. Scenario: we perform Bernoulli trials with success probability p until the 1st success, so X = number of trials. Then we do x more trials and count the number of successes, which is Y. Find E(Y) and V(Y).
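Working through the two theorems above (using the standard moments E(X) = 1/p and V(X) = (1 − p)/p²) gives E(Y) = E(pX) = 1 and V(Y) = E[Xp(1 − p)] + V(pX) = (1 − p) + (1 − p) = 2(1 − p). These answers are not shown on the slide, so as a sanity check here is a hypothetical Python simulation for p = 0.5, where E(Y) = 1 and V(Y) = 1:

```python
import random

def simulate(p, n, seed=42):
    """Draw n pairs: X ~ Geometric(p) (trials until 1st success),
    then Y | X = x ~ Binomial(x, p); return sample mean and variance of Y."""
    rng = random.Random(seed)
    ys = []
    for _ in range(n):
        x = 1
        while rng.random() >= p:   # count trials until the first success
            x += 1
        y = sum(rng.random() < p for _ in range(x))   # x more trials, count successes
        ys.append(y)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    return mean, var

m, v = simulate(0.5, 200_000)
```

With 200,000 replications the simulated mean and variance both land close to 1, consistent with the conditional-expectation and conditional-variance formulas.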