Functions of Random Variables
Methods for determining the distribution of functions of random variables:
1. Distribution function method
2. Moment generating function method
3. Transformation method
Distribution function method
Let X, Y, Z, … have joint density f(x, y, z, …) and let W = h(X, Y, Z, …).
First step: find the distribution function of W:
G(w) = P[W ≤ w] = P[h(X, Y, Z, …) ≤ w]
Second step: find the density function of W:
g(w) = G′(w)
Example 1
Let X have a normal distribution with mean 0 and variance 1 (the standard normal distribution). Let W = X². Find the distribution of W.
First step: find the distribution function of W:
G(w) = P[W ≤ w] = P[X² ≤ w] = P[−√w ≤ X ≤ √w] = Φ(√w) − Φ(−√w) = 2Φ(√w) − 1 for w ≥ 0,
where Φ denotes the standard normal distribution function.
Second step: find the density function of W:
g(w) = G′(w) = 2φ(√w) · (1/(2√w)) = (1/√(2πw)) e^(−w/2) for w > 0,
where φ is the standard normal density.
Thus if X has a standard normal distribution, then W = X² has density
g(w) = (1/√(2π)) w^(−1/2) e^(−w/2), w > 0.
This is the Gamma distribution with λ = ½ and α = ½. It is also the χ² distribution with ν = 1 degree of freedom.
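A quick simulation sketch (not part of the original slides) of this result: W = X² should have the χ² distribution with 1 degree of freedom, whose mean is 1 and variance 2.

```python
import random

# If X ~ N(0, 1), then W = X^2 is chi-square with 1 degree of freedom,
# which has mean 1 and variance 2. Check both by simulation.
random.seed(0)
n = 200_000
w = [random.gauss(0.0, 1.0) ** 2 for _ in range(n)]

mean_w = sum(w) / n
var_w = sum((x - mean_w) ** 2 for x in w) / n
```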
Example 2
Suppose that X and Y are independent random variables, each having an exponential distribution with parameter λ (mean 1/λ). Let W = X + Y. Find the distribution of W.
First step: find the distribution function of W = X + Y:
G(w) = P[W ≤ w] = P[X + Y ≤ w] = ∫₀^w ∫₀^(w−x) λe^(−λx) λe^(−λy) dy dx = 1 − e^(−λw) − λw e^(−λw) for w ≥ 0.
Second step: find the density function of W:
g(w) = G′(w) = λ²w e^(−λw) for w > 0.
Hence if X and Y are independent random variables, each having an exponential distribution with parameter λ, then W has density
g(w) = λ²w e^(−λw), w > 0.
This can be recognized as the Gamma distribution with parameters α = 2 and λ.
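A simulation sketch (illustrative parameter value, not from the slides): the Gamma distribution with α = 2 and parameter λ has mean 2/λ and variance 2/λ².

```python
import random

# X, Y independent Exponential(lam) (mean 1/lam); W = X + Y should be
# Gamma with alpha = 2 and the same lam, so E[W] = 2/lam = 4 and
# Var(W) = 2/lam^2 = 8 for lam = 0.5.
random.seed(1)
lam = 0.5
n = 200_000
w = [random.expovariate(lam) + random.expovariate(lam) for _ in range(n)]

mean_w = sum(w) / n
var_w = sum((x - mean_w) ** 2 for x in w) / n
```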
Example: Student’s t distribution
Let Z and U be two independent random variables with:
1. Z having a standard normal distribution, and
2. U having a χ² distribution with ν degrees of freedom.
Find the distribution of T = Z / √(U/ν).
The density of Z is f(z) = (1/√(2π)) e^(−z²/2).
The density of U is h(u) = (1/(2^(ν/2) Γ(ν/2))) u^(ν/2 − 1) e^(−u/2), u > 0.
Therefore the joint density of Z and U is f(z, u) = f(z) h(u).
The distribution function of T is
G(t) = P[T ≤ t] = P[Z / √(U/ν) ≤ t] = P[Z ≤ t√(U/ν)].
Therefore:
G(t) = ∫₀^∞ ∫_(−∞)^(t√(u/ν)) f(z) h(u) dz du.
[Figure: the region of integration z ≤ t√(u/ν) in the (z, u) plane, illustrated for t > 0.]
Differentiating under the integral sign (by the fundamental theorem of calculus) gives
g(t) = G′(t) = ∫₀^∞ f(t√(u/ν)) √(u/ν) h(u) du
= ∫₀^∞ (1/√(2π)) e^(−t²u/(2ν)) √(u/ν) (1/(2^(ν/2) Γ(ν/2))) u^(ν/2 − 1) e^(−u/2) du.
Collecting powers of u, the integrand is u^((ν+1)/2 − 1) e^(−u(1 + t²/ν)/2) times constants. Using
∫₀^∞ u^(a−1) e^(−bu) du = Γ(a)/b^a with a = (ν+1)/2 and b = (1 + t²/ν)/2,
we get
g(t) = [Γ((ν+1)/2) / (√(νπ) Γ(ν/2))] (1 + t²/ν)^(−(ν+1)/2).
Student’s t distribution
g(t) = K (1 + t²/ν)^(−(ν+1)/2), where K = Γ((ν+1)/2) / (√(νπ) Γ(ν/2)).
Student – W. S. Gosset
Worked for a distillery. Not allowed to publish under his own name, he published under the pseudonym “Student”.
[Figure: the t distribution compared with the standard normal distribution.]
Distribution of the Max and Min Statistics
Let x1, x2, …, xn denote a sample of size n from the density f(x). Let M = max(xi); determine the distribution of M. Repeat the computation for m = min(xi). Assume that the density is the uniform density from 0 to θ.
Hence f(x) = 1/θ for 0 ≤ x ≤ θ (and 0 otherwise), and the distribution function is
F(x) = 0 for x < 0, F(x) = x/θ for 0 ≤ x ≤ θ, F(x) = 1 for x > θ.
Finding the distribution function of M:
G(t) = P[M ≤ t] = P[x1 ≤ t, …, xn ≤ t] = F(t)ⁿ = (t/θ)ⁿ for 0 ≤ t ≤ θ.
Differentiating, we find the density function of M:
g(t) = n tⁿ⁻¹ / θⁿ for 0 ≤ t ≤ θ.
[Figure: the population density f(x) and the density g(t) of M, which concentrates near θ.]
Finding the distribution function of m:
G(t) = P[m ≤ t] = 1 − P[x1 > t, …, xn > t] = 1 − (1 − F(t))ⁿ = 1 − (1 − t/θ)ⁿ for 0 ≤ t ≤ θ.
Differentiating, we find the density function of m:
g(t) = (n/θ)(1 − t/θ)ⁿ⁻¹ for 0 ≤ t ≤ θ.
[Figure: the population density f(x) and the density g(t) of m, which concentrates near 0.]
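A simulation sketch of the max/min results (parameter values are illustrative): integrating t·g(t) for the densities above gives E[M] = nθ/(n+1) and E[m] = θ/(n+1).

```python
import random

# Sample n uniforms on (0, theta) many times; the sample max should
# average n*theta/(n+1) and the sample min theta/(n+1).
random.seed(2)
theta, n, reps = 2.0, 5, 100_000
maxima, minima = [], []
for _ in range(reps):
    sample = [random.uniform(0.0, theta) for _ in range(n)]
    maxima.append(max(sample))
    minima.append(min(sample))

mean_max = sum(maxima) / reps   # expect 5 * 2 / 6 = 1.666...
mean_min = sum(minima) / reps   # expect 2 / 6 = 0.333...
```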
The probability integral transformation
This transformation converts observations that come from a uniform distribution on (0, 1) into observations that come from an arbitrary distribution. Let U denote an observation having a uniform distribution on (0, 1).
Let f(x) denote an arbitrary density function and F(x) its corresponding cumulative distribution function. Let X = F⁻¹(U); find the distribution of X. Hence
G(x) = P[X ≤ x] = P[F⁻¹(U) ≤ x] = P[U ≤ F(x)] = F(x).
Thus if U has a uniform distribution on (0, 1), then X = F⁻¹(U) has density f(x).
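A sketch of the probability integral transformation in use (the exponential case, chosen because F⁻¹ has a closed form; the parameter value is illustrative): for F(x) = 1 − e^(−λx), the inverse is F⁻¹(u) = −ln(1 − u)/λ.

```python
import math
import random

# Inverse-CDF sampling: X = F^{-1}(U) with U ~ Uniform(0, 1) should
# follow the exponential distribution with mean 1/lam.
random.seed(3)
lam = 2.0
n = 200_000
x = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

mean_x = sum(x) / n   # expect 1/lam = 0.5
```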
The Transformation Method
Theorem. Let X denote a random variable with probability density function f(x) and let U = h(X). Assume that h(x) is strictly increasing (or strictly decreasing). Then the probability density of U is
g(u) = f(h⁻¹(u)) |d h⁻¹(u)/du|.
Proof. Use the distribution function method.
Step 1: find the distribution function G(u).
Step 2: differentiate G(u) to find the probability density function g(u).
If h is increasing, G(u) = P[h(X) ≤ u] = P[X ≤ h⁻¹(u)] = F(h⁻¹(u)); hence
g(u) = G′(u) = f(h⁻¹(u)) d h⁻¹(u)/du.
If h is decreasing, G(u) = P[X ≥ h⁻¹(u)] = 1 − F(h⁻¹(u)), so g(u) = −f(h⁻¹(u)) d h⁻¹(u)/du. In either case
g(u) = f(h⁻¹(u)) |d h⁻¹(u)/du|.
Example
Suppose that X has a normal distribution with mean μ and variance σ². Find the distribution of U = h(X) = e^X.
Solution: h⁻¹(u) = ln u and d h⁻¹(u)/du = 1/u.
Hence
g(u) = (1/(uσ√(2π))) exp(−(ln u − μ)²/(2σ²)) for u > 0.
This distribution is called the log-normal distribution.
[Figure: the log-normal density.]
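A simulation sketch of the log-normal result (parameter values are illustrative): if X ~ N(μ, σ²), then U = e^X has median e^μ and mean e^(μ + σ²/2).

```python
import math
import random
import statistics

# Exponentiate normal draws and check the log-normal median and mean.
random.seed(4)
mu, sigma = 0.5, 0.4
n = 200_000
u = [math.exp(random.gauss(mu, sigma)) for _ in range(n)]

median_u = statistics.median(u)   # expect e^mu
mean_u = sum(u) / n               # expect e^{mu + sigma^2/2}
```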
The Transformation Method (many variables)
Theorem. Let x1, x2, …, xn denote random variables with joint probability density function f(x1, x2, …, xn). Let
u1 = h1(x1, x2, …, xn),
u2 = h2(x1, x2, …, xn),
…
un = hn(x1, x2, …, xn)
define an invertible transformation from the x’s to the u’s.
Then the joint probability density function of u1, u2, …, un is given by
g(u1, …, un) = f(x1(u), …, xn(u)) |J|,
where J = det[∂xi/∂uj] is the Jacobian of the transformation.
Example
Suppose that x1, x2 are independent with density functions f1(x1) and f2(x2). Find the distribution of
u1 = x1 + x2, u2 = x1 − x2.
Solving for x1 and x2, we get the inverse transformation
x1 = (u1 + u2)/2, x2 = (u1 − u2)/2.
The Jacobian of the transformation is
J = det [ ∂x1/∂u1 ∂x1/∂u2 ; ∂x2/∂u1 ∂x2/∂u2 ] = det [ ½ ½ ; ½ −½ ] = −½, so |J| = ½.
The joint density of x1, x2 is f(x1, x2) = f1(x1) f2(x2). Hence the joint density of u1 and u2 is
g(u1, u2) = ½ f1((u1 + u2)/2) f2((u1 − u2)/2).
From the joint density we can determine the distribution of u1 = x1 + x2 by integrating out u2:
g1(u1) = ∫ g(u1, u2) du2.
Hence, substituting x = (u1 + u2)/2,
g1(u1) = ∫ f1(x) f2(u1 − x) dx.
This is called the convolution of the two densities f1 and f2.
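A numeric sketch of the convolution formula (the grid, step size, and helper names are illustrative choices, not from the slides), using two Exponential(1) densities, whose convolution is the Gamma density u·e^(−u) derived in Example 2.

```python
import math

# Approximate g1(u) = ∫ f1(x) f2(u - x) dx by a Riemann sum on a grid.
lam = 1.0
dx = 0.02
grid = [i * dx for i in range(1000)]   # x values in [0, 20)

def f(x):
    # Exponential(lam) density, zero for negative arguments.
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def convolve_at(u):
    # Riemann-sum approximation of the convolution integral at u.
    return sum(f(x) * f(u - x) for x in grid) * dx

g = [convolve_at(u) for u in grid]
total = sum(g) * dx   # a density should integrate to about 1
```

At u = 1 the numeric value should be close to the exact Gamma density 1·e^(−1) ≈ 0.368.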
Example: the ex-Gaussian distribution
Let X and Y be two independent random variables such that:
1. X has an exponential distribution with parameter λ.
2. Y has a normal (Gaussian) distribution with mean μ and standard deviation σ.
Find the distribution of U = X + Y. This distribution is used in psychology as a model for response time to perform a task.
Now the density of U = X + Y is the convolution
g(u) = ∫₀^∞ λe^(−λx) (1/(σ√(2π))) exp(−(u − x − μ)²/(2σ²)) dx.
Completing the square in the exponent, the integral can be written as a probability for V, where V has a normal distribution with mean μ and variance σ². Hence
g(u) = λ e^(−λ(u − μ) + λ²σ²/2) Φ((u − μ − λσ²)/σ),
where Φ(z) is the cdf of the standard normal distribution.
[Figure: the ex-Gaussian density g(u).]
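A simulation sketch of the ex-Gaussian (parameter values are illustrative): since U = X + Y with the components independent, E[U] = μ + 1/λ and Var(U) = σ² + 1/λ².

```python
import random

# Sum an exponential and a normal draw; check mean mu + 1/lam = 0.8
# and variance sigma^2 + 1/lam^2 = 0.26.
random.seed(5)
lam, mu, sigma = 2.0, 0.3, 0.1
n = 200_000
u = [random.expovariate(lam) + random.gauss(mu, sigma) for _ in range(n)]

mean_u = sum(u) / n
var_u = sum((x - mean_u) ** 2 for x in u) / n
```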
Use of moment generating functions
Definition
Let X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete). Then the moment generating function of X is
mX(t) = E(e^(tX)) = ∫ e^(tx) f(x) dx (X continuous) or Σx e^(tx) p(x) (X discrete).
The distribution of a random variable X is described by any of:
1. the density function f(x) if X is continuous (probability mass function p(x) if X is discrete), or
2. the cumulative distribution function F(x), or
3. the moment generating function mX(t).
Properties
1. mX(0) = 1.
2. mX^(k)(0) = E(X^k) = μk, the k-th moment of X.
3. mX(t) = 1 + μ1 t + μ2 t²/2! + μ3 t³/3! + …
4. Let X be a random variable with moment generating function mX(t), and let Y = bX + a. Then
mY(t) = m(bX+a)(t) = E(e^((bX+a)t)) = e^(at) E(e^(X(bt))) = e^(at) mX(bt).
5. Let X and Y be two independent random variables with moment generating functions mX(t) and mY(t). Then
m(X+Y)(t) = E(e^((X+Y)t)) = E(e^(Xt) e^(Yt)) = E(e^(Xt)) E(e^(Yt)) = mX(t) mY(t).
6. Let X and Y be two random variables with moment generating functions mX(t) and mY(t) and distribution functions FX(x) and FY(y) respectively. If mX(t) = mY(t), then FX(x) = FY(x). This ensures that the distribution of a random variable can be identified by its moment generating function.
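A numeric sketch of property 5 (the exponential case and the value of t are illustrative choices): the empirical MGF of X + Y should match the product of the two exact MGFs, λ/(λ − t) each, for t < λ.

```python
import math
import random

# Estimate E[e^{t(X+Y)}] by simulation and compare with
# m_X(t) * m_Y(t) = (lam/(lam - t))^2.
random.seed(6)
lam, t = 2.0, 0.5
n = 200_000
emp = sum(
    math.exp(t * (random.expovariate(lam) + random.expovariate(lam)))
    for _ in range(n)
) / n
exact = (lam / (lam - t)) ** 2
```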
M.G.F.’s – continuous distributions:
Uniform on (a, b): (e^(bt) − e^(at))/((b − a)t); Normal(μ, σ²): e^(μt + σ²t²/2); Exponential(λ): λ/(λ − t), t < λ; Gamma(λ, α): (λ/(λ − t))^α, t < λ.
M.G.F.’s – discrete distributions:
Bernoulli(p): q + pe^t; Binomial(n, p): (q + pe^t)^n; Poisson(λ): e^(λ(e^t − 1)); Geometric(p): pe^t/(1 − qe^t), where q = 1 − p.
Moment generating function of the gamma distribution:
m(t) = ∫₀^∞ e^(tx) (λ^α/Γ(α)) x^(α−1) e^(−λx) dx,
where Γ(α) = ∫₀^∞ x^(α−1) e^(−x) dx.
Using ∫₀^∞ x^(α−1) e^(−bx) dx = Γ(α)/b^α with b = λ − t, then
m(t) = (λ^α/Γ(α)) · Γ(α)/(λ − t)^α = (λ/(λ − t))^α for t < λ.
Moment generating function of the standard normal distribution:
m(t) = ∫ e^(tz) (1/√(2π)) e^(−z²/2) dz.
Completing the square, tz − z²/2 = −(z − t)²/2 + t²/2; thus
m(t) = e^(t²/2) ∫ (1/√(2π)) e^(−(z−t)²/2) dz = e^(t²/2).
We will use the expansion e^u = 1 + u + u²/2! + u³/3! + …
Note: applying it to m(t) = e^(t²/2),
m(t) = 1 + (t²/2) + (t²/2)²/2! + … = Σk (t²/2)^k / k! = Σk t^(2k)/(2^k k!).
Also, by property 3,
m(t) = 1 + μ1 t + μ2 t²/2! + μ3 t³/3! + … = Σj μj t^j / j!.
Equating coefficients of t^(2k) (the odd coefficients vanish), we get
μ(2k) = E(Z^(2k)) = (2k)!/(2^k k!), and μ(2k+1) = 0.
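A quick check of the moment formula (not from the slides): (2k)!/(2^k k!) gives 1, 3, 15, … for k = 1, 2, 3, and the k = 2 case E[Z⁴] = 3 can be compared against a simulated fourth moment.

```python
import math
import random

# Even moments of the standard normal from equating coefficients:
# E[Z^{2k}] = (2k)! / (2^k k!).
moments = [math.factorial(2 * k) // (2 ** k * math.factorial(k))
           for k in range(1, 4)]   # [1, 3, 15]

# Simulated fourth moment as a sanity check of E[Z^4] = 3.
random.seed(7)
n = 400_000
fourth = sum(random.gauss(0.0, 1.0) ** 4 for _ in range(n)) / n
```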
Using moment generating functions to find the distribution of functions of random variables
Example
Suppose that X has a normal distribution with mean μ and standard deviation σ. Find the distribution of Y = aX + b.
Solution: mY(t) = e^(bt) mX(at) = e^(bt) e^(μ(at) + σ²(at)²/2) = e^((aμ + b)t + a²σ²t²/2)
= the moment generating function of the normal distribution with mean aμ + b and variance a²σ².
Thus Y = aX + b has a normal distribution with mean aμ + b and variance a²σ².
Special case: the z transformation. With a = 1/σ and b = −μ/σ, Z = (X − μ)/σ has mean 0 and variance 1. Thus Z has a standard normal distribution.
Example
Suppose that X and Y are independent, each having a normal distribution with means μX and μY and standard deviations σX and σY. Find the distribution of S = X + Y.
Solution: mS(t) = mX(t) mY(t) = e^(μX t + σX² t²/2) e^(μY t + σY² t²/2).
or mS(t) = e^((μX + μY)t + (σX² + σY²)t²/2)
= the moment generating function of the normal distribution with mean μX + μY and variance σX² + σY².
Thus S = X + Y has a normal distribution with mean μX + μY and variance σX² + σY².
Example
Suppose that X and Y are independent, each having a normal distribution with means μX and μY and standard deviations σX and σY. Find the distribution of L = aX + bY.
Solution: mL(t) = maX(t) mbY(t) = mX(at) mY(bt).
or mL(t) = e^((aμX + bμY)t + (a²σX² + b²σY²)t²/2)
= the moment generating function of the normal distribution with mean aμX + bμY and variance a²σX² + b²σY².
Thus L = aX + bY has a normal distribution with mean aμX + bμY and variance a²σX² + b²σY².
Special case: a = +1 and b = −1.
Thus D = X − Y has a normal distribution with mean μX − μY and variance σX² + σY².
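A simulation sketch of the a = +1, b = −1 case (parameter values are illustrative): the variances add even though the means subtract.

```python
import random

# D = X - Y for independent normals: mean mu_x - mu_y = 0.7,
# variance sigma_x^2 + sigma_y^2 = 0.25 + 1.44 = 1.69.
random.seed(8)
mu_x, sigma_x, mu_y, sigma_y = 1.0, 0.5, 0.3, 1.2
n = 200_000
d = [random.gauss(mu_x, sigma_x) - random.gauss(mu_y, sigma_y)
     for _ in range(n)]

mean_d = sum(d) / n
var_d = sum((x - mean_d) ** 2 for x in d) / n
```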
Example (extension to n independent RV’s)
Suppose that X1, X2, …, Xn are independent, each having a normal distribution with mean μi and standard deviation σi (for i = 1, 2, …, n). Find the distribution of L = a1X1 + a2X2 + … + anXn.
Solution: mXi(t) = e^(μi t + σi² t²/2) (for i = 1, 2, …, n).
or mL(t) = mX1(a1 t) ⋯ mXn(an t) = e^((a1μ1 + … + anμn)t + (a1²σ1² + … + an²σn²)t²/2)
= the moment generating function of the normal distribution with mean Σ aiμi and variance Σ ai²σi².
Thus L = a1X1 + … + anXn has a normal distribution with mean a1μ1 + … + anμn and variance a1²σ1² + … + an²σn².
Special case: X1, X2, …, Xn is a sample from a normal distribution with mean μ and standard deviation σ (so μi = μ and σi = σ), and ai = 1/n, giving L = x̄ = (x1 + … + xn)/n.
Thus x̄ has a normal distribution with mean μ and variance σ²/n.
Summary
If x1, x2, …, xn is a sample from a normal distribution with mean μ and standard deviation σ, then x̄ has a normal distribution with mean μ and variance σ²/n.
[Figure: the population distribution and the sampling distribution of x̄.]
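A simulation sketch of the sampling distribution of x̄ (parameter values are illustrative): repeated samples of size n from N(μ, σ²) should give sample means centered at μ with variance σ²/n.

```python
import random
import statistics

# Draw many samples of size n and look at the distribution of the mean.
random.seed(9)
mu, sigma, n, reps = 10.0, 2.0, 16, 50_000
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

grand_mean = sum(means) / reps                          # expect mu
var_means = sum((m - grand_mean) ** 2 for m in means) / reps  # expect sigma^2/n = 0.25
```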
The Law of Large Numbers
Suppose x1, x2, …, xn is a sample (independent, identically distributed – i.i.d.) from a distribution with mean μ. Let x̄ = (x1 + … + xn)/n. Then x̄ converges to μ.
Proof: previously we used Tchebychev’s theorem; that proof assumes the variance (σ²) is finite. Here we use moment generating functions instead.
We will use the following fact: let m1(t), m2(t), … denote a sequence of moment generating functions corresponding to the sequence of distribution functions F1(x), F2(x), …, and let m(t) be a moment generating function corresponding to the distribution function F(x). If mn(t) → m(t) for all t, then Fn(x) → F(x) at every continuity point of F.
Let x1, x2, … denote a sequence of independent random variables coming from a distribution with moment generating function m(t) and distribution function F(x). Let Sn = x1 + x2 + … + xn; then
m_x̄(t) = m_Sn(t/n) = [m(t/n)]^n.
Using L’Hôpital’s rule with u = 1/n → 0,
lim n ln m(t/n) = lim [ln m(tu)]/u = t m′(0)/m(0) = μt,
so [m(t/n)]^n → e^(μt).
e^(μt) is the moment generating function of a random variable that takes on the value μ with probability 1.
Now lim m_x̄(t) = e^(μt), so the limiting distribution of x̄ is the distribution concentrated at μ. Q.E.D.
The Central Limit Theorem
If x1, x2, …, xn is a sample from a distribution with mean μ and standard deviation σ, then for large n, x̄ has (approximately) a normal distribution with mean μ and variance σ²/n.
The proof again uses the fact that convergence of moment generating functions implies convergence of distribution functions. Let Sn = x1 + x2 + … + xn and consider the standardized variable
z = (x̄ − μ)/(σ/√n) = (Sn − nμ)/(σ√n),
whose moment generating function can be expanded as before.
e^(t²/2) is the moment generating function of the standard normal distribution. Thus the limiting distribution of z is the standard normal distribution. Q.E.D.
The Central Limit theorem illustrated
Let x1, x2 be independent from the uniform distribution on (0, 1). Find the distribution of x̄ = (x1 + x2)/2.
Now the density of S = x1 + x2 is the convolution g(s) = ∫ f(x) f(s − x) dx, which gives the triangular density
g(s) = s for 0 ≤ s ≤ 1, g(s) = 2 − s for 1 ≤ s ≤ 2.
Now the density of x̄ = S/2 is
g(v) = 4v for 0 ≤ v ≤ ½, g(v) = 4(1 − v) for ½ ≤ v ≤ 1.
[Figure: densities of x̄ on (0, 1) for n = 1 (uniform), n = 2 (triangular), and n = 3.]
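A simulation sketch of this illustration (sample sizes are illustrative): the mean of n Uniform(0, 1) observations has mean ½ and variance 1/(12n), and its distribution rapidly becomes bell-shaped.

```python
import random

# Mean of n = 3 uniforms: expect mean 1/2 and variance 1/(12*3) = 1/36.
random.seed(10)
n, reps = 3, 200_000
means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]

m = sum(means) / reps
v = sum((x - m) ** 2 for x in means) / reps
```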
Distributions of functions of random variables: the Gamma distribution, χ² distribution, and exponential distribution
Theorem. Let X and Y denote independent random variables having gamma distributions with parameters (λ, α1) and (λ, α2). Then W = X + Y has a gamma distribution with parameters (λ, α1 + α2).
Proof: m_W(t) = m_X(t) m_Y(t) = (λ/(λ − t))^α1 (λ/(λ − t))^α2 = (λ/(λ − t))^(α1 + α2).
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α1 + α2), we conclude that W = X + Y has a gamma distribution with parameters (λ, α1 + α2).
Theorem (extension to n RV’s). Let x1, x2, …, xn denote n independent random variables, each having a gamma distribution with parameters (λ, αi), i = 1, 2, …, n. Then W = x1 + x2 + … + xn has a gamma distribution with parameters (λ, α1 + α2 + … + αn).
Proof: m_W(t) = m_x1(t) ⋯ m_xn(t) = (λ/(λ − t))^α1 ⋯ (λ/(λ − t))^αn.
Therefore m_W(t) = (λ/(λ − t))^(α1 + α2 + … + αn). Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α1 + α2 + … + αn), we conclude that W = x1 + x2 + … + xn has a gamma distribution with parameters (λ, α1 + α2 + … + αn).
Theorem. Suppose that x is a random variable having a gamma distribution with parameters (λ, α). Then W = ax has a gamma distribution with parameters (λ/a, α).
Proof: m_W(t) = m_x(at) = (λ/(λ − at))^α = ((λ/a)/((λ/a) − t))^α.
Special cases
1. Let X and Y be independent random variables having an exponential distribution with parameter λ; then X + Y has a gamma distribution with α = 2 and parameter λ.
2. Let x1, x2, …, xn be independent random variables having an exponential distribution with parameter λ; then S = x1 + x2 + … + xn has a gamma distribution with α = n and parameter λ.
3. Let x1, x2, …, xn be independent random variables having an exponential distribution with parameter λ; then x̄ = S/n has a gamma distribution with α = n and parameter nλ.
Another illustration of the central limit theorem
[Figure: the distribution of the population is exponential; the sampling distribution of x̄ for increasing n.]
Special cases – continued
4. Let X and Y be independent random variables having χ² distributions with ν1 and ν2 degrees of freedom respectively; then X + Y has a χ² distribution with ν1 + ν2 degrees of freedom.
5. Let x1, x2, …, xn be independent random variables having χ² distributions with ν1, ν2, …, νn degrees of freedom respectively; then x1 + x2 + … + xn has a χ² distribution with ν1 + … + νn degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.
Recall: if z has a standard normal distribution, then z² has a χ² distribution with 1 degree of freedom. Thus if z1, z2, …, zν are independent random variables, each having a standard normal distribution, then z1² + z2² + … + zν² has a χ² distribution with ν degrees of freedom.
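A simulation sketch of this recollection (ν = 4 is an illustrative choice): a sum of ν squared standard normals should behave like χ² with ν degrees of freedom, which has mean ν and variance 2ν.

```python
import random

# Sum of nu squared standard normals: expect mean nu = 4, variance 2*nu = 8.
random.seed(11)
nu, reps = 4, 200_000
w = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu))
     for _ in range(reps)]

mean_w = sum(w) / reps
var_w = sum((x - mean_w) ** 2 for x in w) / reps
```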
Theorem. Suppose that U1 and U2 are independent random variables and that U = U1 + U2. Suppose that U1 and U have χ² distributions with ν1 and ν degrees of freedom respectively (ν1 < ν). Then U2 has a χ² distribution with ν2 = ν − ν1 degrees of freedom.
Proof: m_U(t) = m_U1(t) m_U2(t), so
m_U2(t) = m_U(t)/m_U1(t) = (1 − 2t)^(−ν/2) / (1 − 2t)^(−ν1/2) = (1 − 2t)^(−(ν − ν1)/2).
Q.E.D.
Tables for Standard Normal distribution