Distributions of Functions of Random Variables Probability and Statistical Inference (9th Edition), Chapter 5 (Part 1/2) November 18, 2015
Outline 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique
Functions of One Random Variable Let X be a continuous random variable with pdf f(x). If we consider a function of X, say Y=u(X), then Y is also a random variable with its own distribution. The cdf of Y is G(y) = P(Y<=y) = P(u(X)<=y). The pdf of Y is g(y) = G'(y) (where the apostrophe ' denotes the derivative).
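As a small illustration of this cdf technique (an example not in the original slides), take X ~ U(0,1) and Y = X². Then, for 0 < y < 1,
$$ G(y) = P(X^2 \le y) = P(X \le \sqrt{y}) = \sqrt{y}, \qquad g(y) = G'(y) = \frac{1}{2\sqrt{y}}. $$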
Functions of One Random Variable Change-of-variable technique: Let X be a continuous random variable with pdf f(x) and support c1<x<c2. We begin by taking Y=u(X) to be a continuous increasing function of X with inverse function X=v(Y). The support of X maps onto the support of Y, d1=u(c1)<y<d2=u(c2). Then, the cdf of Y is G(y) = P(Y<=y) = P(u(X)<=y) = P(X<=v(y)). Thus, G(y) = ∫ from c1 to v(y) of f(x) dx = F(v(y)), d1<y<d2, where F is the cdf of X.
Functions of One Random Variable The derivative, g(y)=G'(y), of such an expression is g(y) = f(v(y)) v'(y), d1<y<d2. Suppose now that the function Y=u(X) and its inverse X=v(Y) are continuous decreasing functions. Then, G(y) = P(u(X)<=y) = P(X>=v(y)) = 1 - F(v(y)). Thus, g(y) = G'(y) = -f(v(y)) v'(y), where -v'(y) > 0 because v is decreasing.
Functions of One Random Variable Thus, for both the increasing and decreasing cases, we can write the pdf of Y as g(y) = f(v(y)) |v'(y)|, d1<y<d2.
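A quick worked instance of this change-of-variable formula (an illustrative example not from the slides, assuming X has the exponential pdf f(x) = e^{-x}, x > 0): with Y = √X (increasing), v(y) = y² and v'(y) = 2y, so
$$ g(y) = f(v(y))\,|v'(y)| = 2y\,e^{-y^{2}}, \qquad 0 < y < \infty. $$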
Functions of One Random Variable Example 1: all intervals of equal length within the support of the resulting distribution are equally probable, so the resulting distribution is uniform.
Functions of One Random Variable Theorem 1: Suppose that a random variable X has a continuous distribution with cdf F. Then, the random variable Y=F(X) has a uniform distribution on (0,1). Thus, random variables from any given continuous distribution can be converted to random variables having a uniform distribution, and vice versa.
Functions of One Random Variable Figure: the pdf and cdf F of N(0,1); if X ~ N(0,1), then Y = F(X) ~ U(0,1).
Functions of One Random Variable Theorem 1 (converse statement): If U is a uniform random variable on (0,1), then the random variable X=F^(-1)(U) has cdf F (where F is a continuous cdf and F^(-1) is its inverse function). Proof: P(X<=x) = P(F^(-1)(U)<=x) = P(U<=F(x)) = F(x).
Functions of One Random Variable Theorem 1 (converse statement) can be used to generate random variables of any distribution To generate values of X which are distributed according to the cdf F: 1. Generate a random number u from U (uniform random variable on (0,1)) 2. Compute the value x such that F(x) = u 3. Take x to be the random number distributed according to the cdf F
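A minimal Python sketch of this procedure, assuming we want Exponential(rate) samples, whose inverse cdf is F^(-1)(u) = -ln(1-u)/rate (the function name and the choice of distribution are illustrative):
```python
import numpy as np

def inverse_transform_exponential(n, rate=1.0, seed=0):
    """Generate n Exponential(rate) samples via the inverse-cdf method.

    Step 1: draw u ~ U(0,1).
    Step 2: solve F(x) = u, i.e. x = -ln(1 - u) / rate.
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)          # uniform random numbers on (0,1)
    return -np.log(1.0 - u) / rate   # x = F^{-1}(u)

samples = inverse_transform_exponential(100_000, rate=2.0)
print(samples.mean())  # should be close to 1/rate = 0.5
```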
Functions of One Random Variable Example 2 (the transformation Y=u(X) is not one-to-one): Let Y=X², where X is Cauchy with pdf f(x) = 1/(π(1+x²)), -∞<x<∞. Then, for y>0, G(y) = P(X²<=y) = P(-√y<=X<=√y) = F(√y) - F(-√y), where F is the cdf of X. Thus, g(y) = G'(y) = [f(√y) + f(-√y)] / (2√y) = 1/(π √y (1+y)), 0<y<∞. In this two-to-one transformation, we need to sum two terms, each of which is similar to the one-to-one case.
Functions of One Random Variable Consider the discrete case. Let X be a discrete random variable with pmf f(x)=P(X=x). Let Y=u(X) be a one-to-one transformation with inverse X=v(Y). Then, the pmf of Y is g(y) = P(Y=y) = P(X=v(y)) = f(v(y)), for y in the support of Y. Note that, in the discrete case, the derivative |v'(y)| is not needed.
Functions of One Random Variable Example 3: Let X be a uniform random variable on {1,2,…,n}. Then Y=X+a is a uniform random variable on {a+1,a+2,…,a+n}
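A tiny Python sketch of the discrete rule g(y) = f(v(y)) for this example (n and a are chosen arbitrarily for illustration):
```python
n, a = 6, 10
f = {x: 1 / n for x in range(1, n + 1)}   # pmf of X, uniform on {1,...,n}
g = {x + a: p for x, p in f.items()}      # pmf of Y = X + a: g(y) = f(y - a)
print(g)  # uniform on {a+1,...,a+n}, each value with probability 1/n
```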
Transformations of Two Random Variables If X1 and X2 are two continuous random variables with joint pdf f(x1,x2), and if Y1=u1(X1,X2), Y2=u2(X1,X2) has the single-valued inverse X1=v1(Y1,Y2), X2=v2(Y1,Y2), then the joint pdf of Y1 and Y2 is g(y1,y2) = f(v1(y1,y2), v2(y1,y2)) |J|, where |J| denotes the absolute value of the Jacobian, the determinant J = det [ ∂x1/∂y1  ∂x1/∂y2 ; ∂x2/∂y1  ∂x2/∂y2 ]. Compared with the one-variable case, the derivative is replaced by the Jacobian.
Transformations of Two Random Variables Example 1: Let X1 and X2 be independent random variables, each with pdf f(x); hence, their joint pdf is f(x1) f(x2). Let Y1=X1-X2, Y2=X1+X2. Thus, x1=(y1+y2)/2, x2=(y2-y1)/2, and the Jacobian is J = det [ 1/2  1/2 ; -1/2  1/2 ] = 1/2.
Transformations of Two Random Variables Then, the joint pdf of Y1 and Y2 is g(y1,y2) = f((y1+y2)/2) f((y2-y1)/2) · (1/2), on the support of (Y1,Y2) induced by the support of (X1,X2).
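For concreteness (an assumption on our part, since the slide does not show the common pdf), suppose each Xi has pdf f(x) = e^{-x} for 0 < x < ∞. Then
$$ g(y_1, y_2) = e^{-(y_1+y_2)/2}\, e^{-(y_2-y_1)/2}\cdot \tfrac{1}{2} = \tfrac{1}{2}\, e^{-y_2}, \qquad -y_2 < y_1 < y_2,\ 0 < y_2 < \infty. $$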
Transformations of Two Random Variables Example 2 (Box-Muller Transformation): Let X1 and X2 be i.i.d. U(0,1). Let Z1 = √(-2 ln X1) cos(2π X2) and Z2 = √(-2 ln X1) sin(2π X2). Thus, X1 = e^(-Q/2) and X2 = (1/(2π)) arctan(Z2/Z1), where Q = Z1² + Z2², and the Jacobian is J = -(1/(2π)) e^(-Q/2), so |J| = (1/(2π)) e^(-Q/2).
Transformations of Two Random Variables Since the joint pdf of X1 and X2 is f(x1,x2) = 1 for 0<x1<1, 0<x2<1, it follows that the joint pdf of Z1 and Z2 is g(z1,z2) = |J| = (1/(2π)) e^(-(z1²+z2²)/2) = [(1/√(2π)) e^(-z1²/2)] [(1/√(2π)) e^(-z2²/2)]. This is the joint pdf of two i.i.d. N(0,1) random variables. Hence, we can generate two i.i.d. N(0,1) random variables from two i.i.d. U(0,1) random variables using this method.
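A minimal Python sketch of the Box-Muller method described above (the function name is ours):
```python
import numpy as np

def box_muller(n, seed=0):
    """Generate 2*n i.i.d. N(0,1) samples from 2*n i.i.d. U(0,1) samples."""
    rng = np.random.default_rng(seed)
    x1 = rng.uniform(size=n)
    x2 = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(1.0 - x1))  # 1 - X1 is also U(0,1) and avoids log(0)
    z1 = r * np.cos(2.0 * np.pi * x2)     # Z1 = sqrt(-2 ln X1) cos(2 pi X2)
    z2 = r * np.sin(2.0 * np.pi * x2)     # Z2 = sqrt(-2 ln X1) sin(2 pi X2)
    return np.concatenate([z1, z2])

z = box_muller(50_000)
print(z.mean(), z.var())  # close to 0 and 1, as expected for N(0,1)
```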
Random Samples Assume that we conduct an experiment n times independently. Let Xk be the random variable corresponding to the outcome of the k-th run of the experiment. Then X1, X2,…, Xn form a random sample of size n.
Random Samples For example, if we toss a die n times and let Xk be the random variable corresponding to the outcome of the k-th toss, then X1, X2,…, Xn form a random sample of size n.
Theorems about Independent Random Variables Let X1, X2,…, Xn be n independent discrete random variables with pmfs f1, f2,…, fn, and let h() be a function of n variables. Then, the expected value of the random variable Z=h(X1, X2,…, Xn) is E[Z] = Σ_{x1} Σ_{x2} … Σ_{xn} h(x1, x2,…, xn) f1(x1) f2(x2) … fn(xn).
Theorems about Independent Random Variables Likewise, if X1, X2,…, Xn are independent continuous random variables with pdfs f1, f2,…, fn, then E[Z] = ∫…∫ h(x1, x2,…, xn) f1(x1) f2(x2) … fn(xn) dx1 dx2 … dxn.
Theorems about Independent Random Variables Theorem: If X1, X2,…, Xn are independent random variables and, for i = 1, 2,…, n, E[hi(Xi)] exists, then E[h1(X1) h2(X2) … hn(Xn)] = E[h1(X1)] E[h2(X2)]… E[hn(Xn)]
Theorems about Independent Random Variables Proof for the discrete case: E[h1(X1) h2(X2) … hn(Xn)] = Σ_{x1} … Σ_{xn} h1(x1) … hn(xn) f1(x1) … fn(xn) = [Σ_{x1} h1(x1) f1(x1)] … [Σ_{xn} hn(xn) fn(xn)] = E[h1(X1)] E[h2(X2)] … E[hn(Xn)], where the factorization of the joint pmf uses independence. The proof for the continuous case is similar, with sums replaced by integrals.
Theorems about Independent Random Variables Theorem: Assume that X1, X2,…, Xn are n independent random variables with respective means μ1, μ2,…, μn and variances σ1², σ2²,…, σn². Then, the mean and variance of the random variable Y = a1X1 + a2X2 + … + anXn, where a1, a2,…, an are real constants, are μY = a1μ1 + a2μ2 + … + anμn and σY² = a1²σ1² + a2²σ2² + … + an²σn².
Theorems about Independent Random Variables Proof: μY = E[Y] = E[a1X1 + … + anXn] = a1E[X1] + … + anE[Xn] = a1μ1 + … + anμn. For the variance, σY² = E[(Y - μY)²] = E[(Σ_i ai(Xi - μi))²] = Σ_i Σ_j ai aj E[(Xi - μi)(Xj - μj)].
Theorems about Independent Random Variables Since Xi and Xj are independent for i≠j, E[(Xi - μi)(Xj - μj)] = E[Xi - μi] E[Xj - μj] = 0. Therefore, σY² = Σ_i ai² E[(Xi - μi)²] = Σ_i ai² σi².
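A quick numerical sanity check of these two formulas (the constants, means, and variances below are arbitrary illustrations; any independent distributions would do):
```python
import numpy as np

rng = np.random.default_rng(0)
a = np.array([2.0, -1.0, 0.5])                  # constants a1, a2, a3
mu = np.array([1.0, 3.0, -2.0])                 # means mu_i
sigma = np.array([1.0, 0.5, 2.0])               # standard deviations sigma_i

x = rng.normal(mu, sigma, size=(1_000_000, 3))  # independent X1, X2, X3
y = x @ a                                       # Y = a1*X1 + a2*X2 + a3*X3

print(y.mean(), (a * mu).sum())                 # both close to sum of a_i * mu_i
print(y.var(), (a**2 * sigma**2).sum())         # both close to sum of a_i^2 * sigma_i^2
```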
Moment-Generating Function Technique Let X be a random variable. The moment-generating function (mgf) of X is defined as M_X(t) = E[e^(tX)], provided this expectation exists for t in an open interval around 0. It is called the mgf because all of the moments of X can be obtained by successively differentiating M_X(t).
Moment-Generating Function Technique For example, M_X'(t) = d/dt E[e^(tX)] = E[X e^(tX)]. Thus, M_X'(0) = E[X]. Similarly, M_X''(t) = E[X² e^(tX)], so M_X''(0) = E[X²].
Moment-Generating Function Technique In general, the nth derivative of M_X(t) evaluated at t=0 equals E[X^n], i.e., M_X^(n)(0) = E[X^n], where M_X^(n)(t) denotes the nth derivative of M_X(t).
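A short symbolic check of this fact in Python, using the N(0,1) mgf M(t) = e^(t²/2) derived in Example 1 below (the use of sympy here is our illustration, not part of the slides):
```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)                                        # mgf of N(0,1)

moments = [sp.diff(M, t, n).subs(t, 0) for n in range(1, 5)]
print(moments)  # [0, 1, 0, 3] = E[X], E[X^2], E[X^3], E[X^4] for X ~ N(0,1)
```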
Moment-Generating Function Technique The moment-generating function uniquely determines the distribution. That is, there is a one-to-one correspondence between the moment-generating function (mgf) and the distribution function (pmf/pdf) of a random variable.
Moment-Generating Function Technique Example 1 (mgf of N(0,1)): For Z ~ N(0,1), M_Z(t) = E[e^(tZ)] = ∫ e^(tz) (1/√(2π)) e^(-z²/2) dz = e^(t²/2) ∫ (1/√(2π)) e^(-(z-t)²/2) dz = e^(t²/2), where the last equality follows from the fact that the expression in the integral is the pdf of a normal random variable with mean t and variance 1, which integrates to one.
Moment-Generating Function Technique Exercise (mgf of N(m,s²)): show that M_X(t) = e^(mt + s²t²/2).
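One way to do this exercise (a sketch, using the representation X = m + sZ with Z ~ N(0,1) and the N(0,1) mgf from Example 1):
$$ M_X(t) = E\!\left[e^{t(m + sZ)}\right] = e^{mt}\, E\!\left[e^{(st)Z}\right] = e^{mt}\, e^{(st)^2/2} = \exp\!\left(mt + \tfrac{s^2 t^2}{2}\right). $$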
Moment-Generating Function Technique Theorem: If X1, X2,…, Xn are independent random variables with respective mgfs Mi(t), i=1,2,…,n, then the mgf of Y = X1 + X2 + … + Xn is M_Y(t) = M1(t) M2(t) … Mn(t).
Moment-Generating Function Technique Proof: M_Y(t) = E[e^(tY)] = E[e^(t(X1+X2+…+Xn))] = E[e^(tX1) e^(tX2) … e^(tXn)] = E[e^(tX1)] E[e^(tX2)] … E[e^(tXn)] = M1(t) M2(t) … Mn(t), where the factorization of the expectation uses the independence of X1,…, Xn.
Moment-Generating Function Technique Corollary: If X1, X2,…, Xn correspond to independent random samples from a distribution with mgf M(t), then the mgf of Y = X1 + X2 + … + Xn is M_Y(t) = [M(t)]^n.
Moment-Generating Function Technique The mgf of the sum of independent random variables is just the product of the individual mgfs
Moment-Generating Function Technique Example 2: Recall that if Z1, Z2,…, Zn are independent N(0,1), then W = Z1² + Z2² + … + Zn² has a chi-square distribution with n degrees of freedom, denoted by χ²(n). Let X1, X2,…, Xn be independent chi-square random variables with r1, r2,…, rn degrees of freedom, respectively. Show that Y = X1 + X2 + … + Xn is χ²(r1 + r2 + … + rn).
Moment-Generating Function Technique Use the moment-generating function technique: the mgf of a χ²(r) random variable is (1-2t)^(-r/2), t < 1/2, so M_Y(t) = (1-2t)^(-r1/2) (1-2t)^(-r2/2) … (1-2t)^(-rn/2) = (1-2t)^(-(r1+r2+…+rn)/2), which is the mgf of a χ²(r1+r2+…+rn) random variable. Thus, Y is χ²(r1+r2+…+rn).
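A small symbolic check of this computation for two chi-square variables (the degrees of freedom below are chosen arbitrarily for illustration):
```python
import sympy as sp

t = sp.symbols('t')
r1, r2 = 3, 5
M1 = (1 - 2*t) ** sp.Rational(-r1, 2)   # mgf of chi-square with r1 degrees of freedom
M2 = (1 - 2*t) ** sp.Rational(-r2, 2)   # mgf of chi-square with r2 degrees of freedom

M_Y = sp.powsimp(M1 * M2)               # mgf of Y = X1 + X2
print(M_Y)  # (1 - 2*t)**(-4), i.e. the mgf of chi-square(r1 + r2) = chi-square(8)
```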