Use of moment generating functions
1. Using the moment generating functions of X, Y, Z, …, determine the moment generating function of W = h(X, Y, Z, …).
2. Identify the distribution of W from its moment generating function.
This procedure works well for sums, linear combinations, etc.
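As a reminder of the standard fact this procedure rests on (not on the original slide): for independent random variables the MGF of a linear combination factors,

M_{aX+bY}(t) = E\left[e^{t(aX+bY)}\right] = M_X(at)\,M_Y(bt), \qquad X, Y \text{ independent}.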
Theorem: Let X and Y denote independent random variables having gamma distributions with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂). Proof:
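The algebra displayed on this slide did not survive extraction; a sketch of the standard MGF argument, assuming the parameterization in which a gamma(λ, α) random variable has MGF M(t) = (1 − t/λ)^(−α) for t < λ:

M_W(t) = M_X(t)\,M_Y(t) = (1 - t/\lambda)^{-\alpha_1}\,(1 - t/\lambda)^{-\alpha_2} = (1 - t/\lambda)^{-(\alpha_1 + \alpha_2)}, \qquad t < \lambda.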
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
Theorem (extension to n RVs): Let x₁, x₂, …, xₙ denote n independent random variables, the i-th having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = x₁ + x₂ + … + xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Proof:
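Again the displayed algebra is missing; a sketch under the same MGF parameterization as above:

M_W(t) = \prod_{i=1}^{n} M_{x_i}(t) = \prod_{i=1}^{n} (1 - t/\lambda)^{-\alpha_i} = (1 - t/\lambda)^{-(\alpha_1 + \cdots + \alpha_n)}, \qquad t < \lambda.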
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = x₁ + x₂ + … + xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).
Theorem: Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX has a gamma distribution with parameters (λ/a, α). Proof:
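A sketch of the omitted MGF calculation, assuming a > 0 and the same parameterization as above:

M_{aX}(t) = E\left[e^{taX}\right] = M_X(at) = (1 - at/\lambda)^{-\alpha} = \bigl(1 - t/(\lambda/a)\bigr)^{-\alpha}, \qquad t < \lambda/a,

which is the MGF of a gamma distribution with parameters (λ/a, α).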
Special Cases
1. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively; then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
2. Let x₁, x₂, …, xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom respectively; then x₁ + x₂ + … + xₙ has a χ² distribution with ν₁ + … + νₙ degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.
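In MGF terms (a standard restatement, not on the slide), the χ² distribution with ν degrees of freedom has MGF M(t) = (1 − 2t)^(−ν/2), so for independent X and Y:

M_{X+Y}(t) = (1 - 2t)^{-\nu_1/2}\,(1 - 2t)^{-\nu_2/2} = (1 - 2t)^{-(\nu_1 + \nu_2)/2}, \qquad t < \tfrac{1}{2}.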
Recall: if z has a Standard Normal distribution, then z² has a χ² distribution with 1 degree of freedom. Thus if z₁, z₂, …, z_ν are independent random variables each having a Standard Normal distribution, then z₁² + z₂² + … + z_ν² has a χ² distribution with ν degrees of freedom.
Theorem: Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom. Proof:
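The proof displayed on the slide is missing; a sketch of the usual MGF argument, using M(t) = (1 − 2t)^(−k/2) for a χ² variable with k degrees of freedom and the independence of U₁ and U₂:

M_U(t) = M_{U_1}(t)\,M_{U_2}(t) \;\Longrightarrow\; M_{U_2}(t) = \frac{M_U(t)}{M_{U_1}(t)} = \frac{(1 - 2t)^{-\nu/2}}{(1 - 2t)^{-\nu_1/2}} = (1 - 2t)^{-(\nu - \nu_1)/2},

which is the MGF of a χ² distribution with ν − ν₁ degrees of freedom.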
Q.E.D.
Distribution of the sample variance
Properties of the sample variance. Proof:
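The property and its proof appear only as images on the original slide. The identity the following slides refer to, reconstructed here (for any constant a):

\sum_{i=1}^{n}(x_i - a)^2 = \sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - a)^2,

which follows by writing x_i − a = (x_i − x̄) + (x̄ − a), expanding the square, and using that the deviations from x̄ sum to zero.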
Special Cases
1. Setting a = 0 gives the computing formula (reconstructed below).
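The formula itself is missing from the extracted text; with a = 0 the identity above becomes the familiar computing formula:

\sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n}(x_i - \bar{x})^2 + n\bar{x}^2, \qquad \text{i.e.} \qquad \sum_{i=1}^{n}(x_i - \bar{x})^2 = \sum_{i=1}^{n} x_i^2 - \frac{1}{n}\Bigl(\sum_{i=1}^{n} x_i\Bigr)^{2}.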
2. Setting a = μ.
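Again only the text survived; with a = μ the identity reads:

\sum_{i=1}^{n}(x_i - \mu)^2 = \sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - \mu)^2.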
Distribution of the sample variance. Let x₁, x₂, …, xₙ denote a sample from the normal distribution with mean μ and variance σ². Let U = [(x₁ − μ)² + … + (xₙ − μ)²]/σ². Then U has a χ² distribution with n degrees of freedom.
Note: U can be written as U = U₂ + U₁ (see the decomposition sketched below), and U has a χ² distribution with n degrees of freedom.
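The decomposition shown on the slide is only an image; reconstructed from the a = μ identity above, dividing through by σ²:

U = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2 = \underbrace{\frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \bar{x})^2}_{U_2} + \underbrace{\frac{n(\bar{x} - \mu)^2}{\sigma^2}}_{U_1}.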
x̄ has a normal distribution with mean μ and variance σ²/n. We also know that z = (x̄ − μ)/(σ/√n) has a Standard Normal distribution, and thus U₁ = z² = n(x̄ − μ)²/σ² has a χ² distribution with 1 degree of freedom.
If we can show that U₁ and U₂ are independent, then U₂ = [(x₁ − x̄)² + … + (xₙ − x̄)²]/σ² = (n − 1)s²/σ² has a χ² distribution with n − 1 degrees of freedom. The final task would be to show that U₁ and U₂ (equivalently, x̄ and the deviations xᵢ − x̄) are independent.
Summary. Let x₁, x₂, …, xₙ denote a sample from the normal distribution with mean μ and variance σ².
1. x̄ has a normal distribution with mean μ and variance σ²/n.
2. (n − 1)s²/σ² = [(x₁ − x̄)² + … + (xₙ − x̄)²]/σ² has a χ² distribution with ν = n − 1 degrees of freedom.
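As an illustration (not part of the original slides), a minimal Python simulation sketch checking point 2 numerically: the statistic (n − 1)s²/σ² should have mean about n − 1 and variance about 2(n − 1). The values of n, μ, σ and the number of replications below are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 10, 5.0, 2.0             # arbitrary example parameters
reps = 100_000

samples = rng.normal(mu, sigma, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)        # sample variance s^2 for each replication
u2 = (n - 1) * s2 / sigma**2            # should behave like chi-square with n - 1 df

print(round(u2.mean(), 3))              # close to n - 1 = 9
print(round(u2.var(), 3))               # close to 2(n - 1) = 18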
The Transformation Method. Theorem: Let X denote a random variable with probability density function f(x), and let U = h(X). Assume that h(x) is either strictly increasing or strictly decreasing; then the probability density of U is:
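The displayed formula did not survive extraction; the standard statement of this result:

g(u) = f\bigl(h^{-1}(u)\bigr)\,\left|\frac{d}{du}\,h^{-1}(u)\right|.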
Proof. Use the distribution function method.
Step 1: Find the distribution function G(u).
Step 2: Differentiate G(u) to find the probability density function g(u).
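A sketch of the omitted algebra for the strictly increasing case (the decreasing case is analogous and produces the absolute value):

G(u) = P(U \le u) = P\bigl(h(X) \le u\bigr) = P\bigl(X \le h^{-1}(u)\bigr) = F\bigl(h^{-1}(u)\bigr),

g(u) = G'(u) = f\bigl(h^{-1}(u)\bigr)\,\frac{d}{du}\,h^{-1}(u).

For h strictly decreasing, G(u) = 1 − F(h^{-1}(u)) and the derivative picks up a minus sign, which is absorbed into the absolute value in the theorem.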
Example: Suppose that X has a Normal distribution with mean μ and variance σ². Find the distribution of U = h(X) = e^X. Solution:
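The worked solution is only an image on the original slide; applying the theorem with h(x) = e^x, so that h^{-1}(u) = ln u for u > 0:

g(u) = f(\ln u)\,\left|\frac{d}{du}\ln u\right| = \frac{1}{u\,\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(\ln u - \mu)^2}{2\sigma^2}\right), \qquad u > 0.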
This distribution is called the log-normal distribution.
[Figure: the log-normal density]
The Transformation Method (many variables). Theorem: Let x₁, x₂, …, xₙ denote random variables with joint probability density function f(x₁, x₂, …, xₙ). Let
u₁ = h₁(x₁, x₂, …, xₙ),
u₂ = h₂(x₁, x₂, …, xₙ),
…,
uₙ = hₙ(x₁, x₂, …, xₙ)
define an invertible transformation from the x's to the u's.
Then the joint probability density function of u₁, u₂, …, uₙ is given by the expression below, where J is the Jacobian of the transformation.
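The displayed formula is missing from the extracted text; the standard multivariate change-of-variables result, writing xᵢ = wᵢ(u₁, …, uₙ) for the inverse transformation:

g(u_1, \ldots, u_n) = f\bigl(w_1(u_1,\ldots,u_n), \ldots, w_n(u_1,\ldots,u_n)\bigr)\,\lvert J\rvert, \qquad J = \det\!\left[\frac{\partial x_i}{\partial u_j}\right]_{i,j=1}^{n}.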
Example: Suppose that x₁, x₂ are independent with density functions f₁(x₁) and f₂(x₂). Find the distribution of u₁ = x₁ + x₂ and u₂ = x₁ − x₂. Solving for x₁ and x₂, we get the inverse transformation x₁ = (u₁ + u₂)/2, x₂ = (u₁ − u₂)/2.
The Jacobian of the transformation
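The matrix computation shown on the slide, reconstructed from the inverse transformation above:

J = \det\begin{bmatrix} \dfrac{\partial x_1}{\partial u_1} & \dfrac{\partial x_1}{\partial u_2} \\[4pt] \dfrac{\partial x_2}{\partial u_1} & \dfrac{\partial x_2}{\partial u_2} \end{bmatrix} = \det\begin{bmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{2} & -\tfrac{1}{2} \end{bmatrix} = -\tfrac{1}{2}, \qquad \lvert J\rvert = \tfrac{1}{2}.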
The joint density of x₁, x₂ is f(x₁, x₂) = f₁(x₁) f₂(x₂). Hence the joint density of u₁ and u₂ is:
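The resulting expression (missing from the extracted text), obtained by substituting the inverse transformation and |J| = 1/2:

g(u_1, u_2) = f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right)\frac{1}{2}.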
From the joint density g(u₁, u₂) we can determine the distribution of u₁ = x₁ + x₂ by integrating out u₂.
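The marginal density, written out (a reconstruction of the omitted display):

g_1(u_1) = \int_{-\infty}^{\infty} g(u_1, u_2)\,du_2 = \int_{-\infty}^{\infty} f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right)\frac{1}{2}\,du_2.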
Hence we arrive at the formula below, which is called the convolution of the two densities f₁ and f₂.
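Substituting x = (u₁ + u₂)/2 (so du₂ = 2 dx and (u₁ − u₂)/2 = u₁ − x) gives the convolution formula the slide refers to:

g_1(u_1) = \int_{-\infty}^{\infty} f_1(x)\,f_2(u_1 - x)\,dx.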
Example: The ex-Gaussian distribution. Let X and Y be two independent random variables such that:
1. X has an exponential distribution with parameter λ;
2. Y has a normal (Gaussian) distribution with mean μ and standard deviation σ.
Find the distribution of U = X + Y. This distribution is used in psychology as a model for the response time to perform a task.
Now the density of U = X + Y is the convolution of the exponential and normal densities:
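The integral and the completion of the square are only images on the original slides; a sketch of the standard calculation:

g(u) = \int_{0}^{\infty} \lambda e^{-\lambda x}\,\frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{(u - x - \mu)^2}{2\sigma^2}\right) dx
= \lambda\,e^{\lambda\mu + \lambda^2\sigma^2/2 - \lambda u}\int_{-\infty}^{u} \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{(y - \mu - \lambda\sigma^2)^2}{2\sigma^2}\right) dy,

where the second expression follows from the substitution y = u − x and completing the square in the exponent.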
The remaining integral is P(V ≤ u), where V has a Normal distribution with mean μ + λσ² and variance σ². Hence g(u) = λ exp(λμ + λ²σ²/2 − λu) Φ((u − μ − λσ²)/σ), where Φ(z) is the cdf of the standard Normal distribution.
[Figure: the ex-Gaussian density g(u)]
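As a final illustration (not from the slides), a small Python sketch comparing the closed-form density above with a histogram-style estimate from simulated X + Y values; the values of λ, μ, σ and the evaluation points are arbitrary choices, and scipy is assumed to be available for the normal cdf.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
lam, mu, sigma = 0.5, 3.0, 1.0          # arbitrary example parameters
u = rng.exponential(1 / lam, 200_000) + rng.normal(mu, sigma, 200_000)

def ex_gaussian_pdf(x, lam, mu, sigma):
    # g(u) = lam * exp(lam*mu + lam^2*sigma^2/2 - lam*u) * Phi((u - mu - lam*sigma^2) / sigma)
    return lam * np.exp(lam * mu + 0.5 * lam**2 * sigma**2 - lam * x) * \
           norm.cdf((x - mu - lam * sigma**2) / sigma)

for x0 in (2.0, 4.0, 6.0, 8.0):
    est = np.mean((u > x0 - 0.1) & (u < x0 + 0.1)) / 0.2   # local density estimate
    print(x0, round(est, 4), round(ex_gaussian_pdf(x0, lam, mu, sigma), 4))  # should roughly agree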