Published by Eunice Collins. Modified over 9 years ago.
Multivariate distributions

The Normal distribution
1. The Normal distribution – parameters μ and σ (or σ²)

Comment: If μ = 0 and σ = 1 the distribution is called the standard normal distribution.

[Figures: normal distribution with μ = 50 and σ = 15; normal distribution with μ = 70 and σ = 20.]

The probability density of the normal distribution:

   f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²))

If a random variable, X, has a normal distribution with mean μ and variance σ², then we will write X ~ N(μ, σ²).
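The density above can be evaluated directly from the formula; a minimal sketch (the function name `norm_pdf` is ours, not from the slides):

```python
import math

def norm_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x, computed straight from the formula."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# The density peaks at x = mu, where it equals 1 / (sigma * sqrt(2*pi)).
```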
The multivariate Normal distribution

Let x = (x1, x2, …, xp)′ be a random vector, and let μ = (μ1, μ2, …, μp)′ be a vector of constants (the mean vector).

Let Σ be a p × p positive definite matrix (the covariance matrix).
Definition: The matrix A is positive semi-definite if x′Ax ≥ 0 for all x. Further, the matrix A is positive definite if x′Ax > 0 for all x ≠ 0.

Suppose that the joint density of the random vector x = (x1, x2, …, xp)′ is

   f(x) = (2π)^(−p/2) |Σ|^(−1/2) exp(−½ (x − μ)′ Σ⁻¹ (x − μ)).

Then x is said to have a p-variate normal distribution with mean vector μ and covariance matrix Σ. We will write x ~ Np(μ, Σ).
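The p-variate density can be evaluated numerically; a sketch (function name ours), using a linear solve rather than an explicit inverse of Σ:

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """p-variate normal density at x, evaluated from the formula above."""
    x, mu = np.asarray(x, float), np.asarray(mu, float)
    p = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(sigma, diff)   # (x-mu)' Sigma^{-1} (x-mu)
    norm_const = (2.0 * np.pi) ** (-p / 2) / np.sqrt(np.linalg.det(sigma))
    return norm_const * np.exp(-0.5 * quad)
```

For p = 1 this reduces to the univariate normal density.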
Example: the bivariate Normal distribution, with

   μ = (μ1, μ2)′  and  Σ = [ σ1²     ρσ1σ2 ]
                           [ ρσ1σ2   σ2²   ]

Now

   |Σ| = σ1²σ2²(1 − ρ²)

and

   Σ⁻¹ = (1 / (σ1²σ2²(1 − ρ²))) [ σ2²      −ρσ1σ2 ]
                                 [ −ρσ1σ2   σ1²    ]

Hence

   f(x1, x2) = (1 / (2πσ1σ2√(1 − ρ²))) exp(−Q/2)

where

   Q = (1 / (1 − ρ²)) [ ((x1 − μ1)/σ1)² − 2ρ((x1 − μ1)/σ1)((x2 − μ2)/σ2) + ((x2 − μ2)/σ2)² ].

Note: f(x1, x2) is constant when Q is constant. This is true when (x1, x2) lie on an ellipse centered at (μ1, μ2).
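As a consistency check on the algebra, the explicit bivariate formula and the general matrix form give the same value; a sketch with hand-picked illustrative parameters (all numbers ours):

```python
import math
import numpy as np

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    """Bivariate normal density via the explicit formula with Q."""
    z1, z2 = (x1 - mu1) / s1, (x2 - mu2) / s2
    q = (z1 * z1 - 2.0 * rho * z1 * z2 + z2 * z2) / (1.0 - rho * rho)
    return math.exp(-0.5 * q) / (2.0 * math.pi * s1 * s2 * math.sqrt(1.0 - rho * rho))

def matrix_pdf(x, mu, sigma):
    """Same density via the general matrix form, as a cross-check."""
    diff = np.asarray(x, float) - np.asarray(mu, float)
    quad = diff @ np.linalg.solve(sigma, diff)
    return math.exp(-0.5 * quad) / (2.0 * math.pi * math.sqrt(np.linalg.det(sigma)))
```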
Surface plots of the bivariate Normal distribution

Contour plots of the bivariate Normal distribution

Scatter plots of data from the bivariate Normal distribution
Trivariate Normal distribution – contour maps

[Figures: contour surfaces of the trivariate Normal distribution in (x1, x2, x3), centered at the mean vector μ.]
Example: In the following study, data were collected for a sample of n = 183 females on the variables Age, Height (Ht), Weight (Wt), Birth control pill use (Bpl – 1 = no pill, 2 = pill) and the following blood chemistry measurements: Cholesterol (Chl), Albumin (Alb), Calcium (Ca) and Uric Acid (UA). The data are tabulated on the next page.
The data:

[Table of the n = 183 observations.]

[3D scatter plot: Alb, Chl, Bp.]
Marginal and Conditional distributions

Theorem (Marginal distributions for the Multivariate Normal distribution): Suppose that

   x = (x(1)′, x(2)′)′, with x(1) = (x1, …, xq)′ and x(2) = (xq+1, …, xp)′,

has a p-variate Normal distribution with mean vector μ = (μ(1)′, μ(2)′)′ and covariance matrix

   Σ = [ Σ11  Σ12 ]
       [ Σ21  Σ22 ]

Then the marginal distribution of x(i) is a qi-variate Normal distribution (q1 = q, q2 = p − q) with mean vector μ(i) and covariance matrix Σii.
Theorem (Conditional distributions for the Multivariate Normal distribution): Suppose that x = (x(1)′, x(2)′)′ has a p-variate Normal distribution with mean vector μ = (μ(1)′, μ(2)′)′ and covariance matrix Σ partitioned as above. Then the conditional distribution of x(2) given x(1) is a q2-variate Normal distribution with mean vector

   μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1))

and covariance matrix

   Σ22·1 = Σ22 − Σ21 Σ11⁻¹ Σ12.
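The partitioned formulas translate directly into code; a sketch (function name ours) returning the conditional mean and covariance of x(2) given x(1):

```python
import numpy as np

def conditional_mvn(mu, sigma, q, x1):
    """Conditional distribution of x(2) given x(1) = x1 for x ~ N_p(mu, sigma).

    The first q coordinates form x(1). Returns (conditional mean, conditional
    covariance) from the partitioned formulas in the theorem above.
    """
    mu, x1 = np.asarray(mu, float), np.asarray(x1, float)
    sigma = np.asarray(sigma, float)
    s11, s12 = sigma[:q, :q], sigma[:q, q:]
    s21, s22 = sigma[q:, :q], sigma[q:, q:]
    w = s21 @ np.linalg.inv(s11)        # matrix of regression coefficients
    cond_mean = mu[q:] + w @ (x1 - mu[:q])
    cond_cov = s22 - w @ s12            # partial covariance matrix
    return cond_mean, cond_cov
```

For the bivariate case with unit variances and correlation ρ this gives the familiar E[x2 | x1] = ρ x1 and Var[x2 | x1] = 1 − ρ².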
Proof (of the previous two theorems): The joint density of x(1) and x(2) is

   f(x) = (2π)^(−p/2) |Σ|^(−1/2) exp(−½ (x − μ)′ Σ⁻¹ (x − μ)),

where the quadratic form splits as

   (x − μ)′ Σ⁻¹ (x − μ)
     = (x(1) − μ(1))′ Σ11⁻¹ (x(1) − μ(1)) + (x(2) − b)′ Σ22·1⁻¹ (x(2) − b),

with

   b = μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1))  and  Σ22·1 = Σ22 − Σ21 Σ11⁻¹ Σ12.

Also,

   |Σ| = |Σ11| |Σ22·1|,

and hence the joint density factors as f(x) = f1(x(1)) f2(x(2) | x(1)).
The marginal distribution of x(1) is

   f1(x(1)) = (2π)^(−q/2) |Σ11|^(−1/2) exp(−½ (x(1) − μ(1))′ Σ11⁻¹ (x(1) − μ(1))).

The conditional distribution of x(2) given x(1) is:

   f2(x(2) | x(1)) = (2π)^(−(p−q)/2) |Σ22·1|^(−1/2) exp(−½ (x(2) − b)′ Σ22·1⁻¹ (x(2) − b)).
Σ22·1 = Σ22 − Σ21 Σ11⁻¹ Σ12 is called the matrix of partial variances and covariances. Its (i, j) element, σij·1…q, is called the partial covariance (variance if i = j) between xi and xj given x1, …, xq, and

   ρij·1…q = σij·1…q / √(σii·1…q σjj·1…q)

is called the partial correlation between xi and xj given x1, …, xq.

Σ21 Σ11⁻¹ is called the matrix of regression coefficients for predicting xq+1, xq+2, …, xp from x1, …, xq. The mean vector of xq+1, xq+2, …, xp given x1, …, xq is:

   μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1)).
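The partial correlations can be computed by rescaling the partial covariance matrix to unit diagonal; a sketch (function name ours):

```python
import numpy as np

def partial_corr(sigma, q):
    """Partial correlations among x_{q+1}, ..., x_p given x_1, ..., x_q.

    Forms Sigma_22.1 = Sigma22 - Sigma21 Sigma11^{-1} Sigma12 and divides
    each entry by the square roots of the corresponding partial variances.
    """
    sigma = np.asarray(sigma, float)
    s11, s12 = sigma[:q, :q], sigma[:q, q:]
    s21, s22 = sigma[q:, :q], sigma[q:, q:]
    part = s22 - s21 @ np.linalg.inv(s11) @ s12   # partial covariance matrix
    d = np.sqrt(np.diag(part))
    return part / np.outer(d, d)
```

For three variables with all pairwise correlations 0.5, conditioning on the first gives partial correlation (0.5 − 0.25) / 0.75 = 1/3 between the other two.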
Example: Suppose that x = (x1, x2, x3, x4)′ is 4-variate normal with given mean vector μ and covariance matrix Σ.

The marginal distribution of (x1, x2)′ is bivariate normal with mean vector (μ1, μ2)′ and covariance matrix Σ11. The marginal distribution of (x2, x3, x4)′ is trivariate normal with the corresponding subvector of μ and submatrix of Σ.

Find the conditional distribution of (x3, x4)′ given x1 and x2. Now

   μ(2|1) = μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1))  and  Σ22·1 = Σ22 − Σ21 Σ11⁻¹ Σ12.

Σ21 Σ11⁻¹ is the matrix of regression coefficients for predicting x3, x4 from x1, x2.

Thus the conditional distribution of (x3, x4)′ given (x1, x2)′ is bivariate Normal with mean vector μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1)) and partial covariance matrix Σ22 − Σ21 Σ11⁻¹ Σ12.
Using SPSS

Note: The use of another statistical package such as Minitab is similar to using SPSS.

The first step is to input the data. The data are usually contained in some type of file:
1. Text files
2. Excel files
3. Other types of files

After starting the SPSS program, the following dialogue box appears.

If you select "Open an existing file" and press OK, the following dialogue box appears.

Once you have selected the file and its type, the following dialogue box appears.

If the variable names are in the file, ask it to read the names. If you do not specify the Range, the program will identify the Range. Once you click OK, two windows will appear.
One window contains the output; the other contains the data.

To perform any statistical analysis, select the Analyze menu.

To compute correlations, select Correlate, then Bivariate. To compute partial correlations, select Correlate, then Partial.

For bivariate correlation, the following dialogue appears, followed by the output for bivariate correlation.

For partial correlation, the following dialogue appears.
The output for partial correlation:

- - -  P A R T I A L   C O R R E L A T I O N   C O E F F I C I E N T S  - - -

Controlling for..  AGE  HT  WT

          CHL        ALB        CA         UA
CHL      1.0000      .1299      .2957      .2338
        (    0)    (  178)    (  178)    (  178)
        P= .       P= .082    P= .000    P= .002

ALB       .1299     1.0000      .4778      .1226
        (  178)    (    0)    (  178)    (  178)
        P= .082    P= .       P= .000    P= .101

CA        .2957      .4778     1.0000      .1737
        (  178)    (  178)    (    0)    (  178)
        P= .000    P= .000    P= .       P= .020

UA        .2338      .1226      .1737     1.0000
        (  178)    (  178)    (  178)    (    0)
        P= .002    P= .101    P= .020    P= .

(Coefficient / (D.F.) / 2-tailed Significance)
" . " is printed if a coefficient cannot be computed
Compare these with the bivariate correlations:

          CHL        ALB        CA         UA
CHL      1.0000      .1299      .2957      .2338
ALB       .1299     1.0000      .4778      .1226
CA        .2957      .4778     1.0000      .1737
UA        .2338      .1226      .1737     1.0000

[Partial correlations and bivariate correlations shown side by side on the slide.]
In the last example the bivariate and partial correlations were roughly in agreement. This is not necessarily the case in all situations.

An example: The following data were collected on three variables:
1. Age
2. Calcium intake in diet (CAI)
3. Bone mass density (BMI)
The data: [table]

Bivariate correlations: [output]

Partial correlations: [output]

Scatter plot of CAI vs BMI (r = −0.447).

3D plot of Age, CAI and BMI.
Independence

Note: two vectors, x(1) and x(2), are independent if

   f(x(1), x(2)) = f1(x(1)) f2(x(2)).

If x = (x(1)′, x(2)′)′ is multivariate Normal with mean vector μ and covariance matrix Σ, then the two vectors x(1) and x(2) are independent if Σ12 = 0. In that case the conditional distribution of x(2) given x(1) is equal to the marginal distribution of x(2).

The components of the vector x are independent if σij = 0 for all i and j (i ≠ j), i.e. Σ is a diagonal matrix.
Transformations

Theorem: Let x1, x2, …, xn denote random variables with joint probability density function f(x1, x2, …, xn). Let

   u1 = h1(x1, x2, …, xn)
   u2 = h2(x1, x2, …, xn)
   ⋮
   un = hn(x1, x2, …, xn)

define an invertible transformation from the x's to the u's.

Then the joint probability density function of u1, u2, …, un is given by:

   g(u1, …, un) = f(x1(u), …, xn(u)) |J|,

where

   J = det[ ∂xi / ∂uj ]

is the Jacobian of the transformation.
Example: Suppose that u1, u2 are independent with a uniform distribution from 0 to 1. Find the distribution of

   z1 = √(−2 ln u1) cos(2π u2)
   z2 = √(−2 ln u1) sin(2π u2).

Solving for u1 and u2 we get the inverse transformation

   u1 = exp(−(z1² + z2²)/2)  and  u2 = (1/(2π)) arctan(z2/z1).

The Jacobian of the transformation is

   J = det[ ∂ui / ∂zj ] = −(1/(2π)) exp(−(z1² + z2²)/2).

The joint density of u1, u2 is f(u1, u2) = f1(u1) f2(u2) = 1 on the unit square. Hence the joint density of z1 and z2 is:

   g(z1, z2) = |J| = (1/(2π)) exp(−(z1² + z2²)/2)
             = [(1/√(2π)) exp(−z1²/2)] [(1/√(2π)) exp(−z2²/2)].

Thus z1 and z2 are independent standard normal. The transformation is useful for converting uniform random variables into independent standard normal random variables.
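The transformation above (the Box–Muller transform) is a few lines of code; a sketch (function name ours):

```python
import math

def box_muller(u1, u2):
    """Box-Muller transform: two independent Uniform(0,1) draws mapped to
    two independent standard normal draws, as derived above."""
    r = math.sqrt(-2.0 * math.log(u1))   # radius
    theta = 2.0 * math.pi * u2           # angle
    return r * math.cos(theta), r * math.sin(theta)
```

Feeding it pairs of `random.random()` draws yields a stream of standard normal variates; note the identity z1² + z2² = −2 ln u1 holds exactly.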
Example: Suppose that x1, x2 are independent with density functions f1(x1) and f2(x2). Find the distribution of

   u1 = x1 + x2
   u2 = x1 − x2.

Solving for x1 and x2 we get the inverse transformation

   x1 = (u1 + u2)/2  and  x2 = (u1 − u2)/2.

The Jacobian of the transformation is

   J = det [ ∂x1/∂u1  ∂x1/∂u2 ] = det [ 1/2   1/2  ] = −1/2.
           [ ∂x2/∂u1  ∂x2/∂u2 ]       [ 1/2  −1/2  ]

The joint density of x1, x2 is f(x1, x2) = f1(x1) f2(x2). Hence the joint density of u1 and u2 is:

   g(u1, u2) = ½ f1((u1 + u2)/2) f2((u1 − u2)/2).
Theorem: Let x1, x2, …, xn denote random variables with joint probability density function f(x1, x2, …, xn). Let

   u1 = a11 x1 + a12 x2 + … + a1n xn + c1
   u2 = a21 x1 + a22 x2 + … + a2n xn + c2
   ⋮
   un = an1 x1 + an2 x2 + … + ann xn + cn

define an invertible linear transformation from the x's to the u's, i.e. u = Ax + c with A = (aij) nonsingular.

Then the joint probability density function of u1, u2, …, un is given by:

   g(u1, …, un) = (1/|det A|) f(A⁻¹(u − c)).
Theorem: Suppose that the random vector x = (x1, x2, …, xp)′ has a p-variate normal distribution with mean vector μ and covariance matrix Σ, and let u = Ax + c with A a nonsingular p × p matrix. Then u has a p-variate normal distribution with mean vector Aμ + c and covariance matrix AΣA′.

Proof: By the previous theorem,

   g(u) = (1/|det A|) f(A⁻¹(u − c)).

Since f is the Np(μ, Σ) density, the quadratic form becomes

   (A⁻¹(u − c) − μ)′ Σ⁻¹ (A⁻¹(u − c) − μ) = (u − Aμ − c)′ (AΣA′)⁻¹ (u − Aμ − c).

Also

   |AΣA′| = (det A)² |Σ|, and hence (1/|det A|) |Σ|^(−1/2) = |AΣA′|^(−1/2).

Thus g(u) is the Np(Aμ + c, AΣA′) density. QED
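The two sides of the proof can be checked numerically: the density of u = Ax + c obtained by the change-of-variables formula must equal the Np(Aμ + c, AΣA′) density. A sketch with one hand-picked illustrative case (all numbers ours):

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """p-variate normal density, from the general formula."""
    diff = np.asarray(x, float) - np.asarray(mu, float)
    p = len(diff)
    quad = diff @ np.linalg.solve(sigma, diff)
    return (2.0 * np.pi) ** (-p / 2) / np.sqrt(np.linalg.det(sigma)) * np.exp(-0.5 * quad)

mu = np.array([1.0, 2.0])
sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
A = np.array([[1.0, 1.0], [0.0, 2.0]])
c = np.array([0.5, -1.0])

x0 = np.array([0.3, 1.7])
u0 = A @ x0 + c
lhs = mvn_pdf(x0, mu, sigma) / abs(np.linalg.det(A))   # change of variables
rhs = mvn_pdf(u0, A @ mu + c, A @ sigma @ A.T)         # theorem's claim
```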
Theorem (Linear transformations of Normal RVs): Suppose that the random vector x has a p-variate normal distribution with mean vector μ and covariance matrix Σ. Let A be a q × p matrix of rank q ≤ p. Then u = Ax has a q-variate normal distribution with mean vector Aμ and covariance matrix AΣA′.

Proof: Let B be a (p − q) × p matrix so that

   C = [ A ]
       [ B ]

is invertible. Then Cx is p-variate normal with mean vector Cμ and covariance matrix CΣC′. Thus the marginal distribution of u = Ax (the first q components of Cx) is q-variate normal with mean vector Aμ and covariance matrix AΣA′.
Summary – Distribution Theory for the Multivariate Normal

Suppose x = (x(1)′, x(2)′)′ ~ Np(μ, Σ), partitioned as above.

Marginal distribution: x(i) ~ Nqi(μ(i), Σii).

Conditional distribution: x(2) given x(1) is normal with mean μ(2) + Σ21 Σ11⁻¹ (x(1) − μ(1)) and covariance Σ22 − Σ21 Σ11⁻¹ Σ12.

Linear transformations of Normal RVs: if x ~ Np(μ, Σ) and A is a q × p matrix of rank q ≤ p, then u = Ax has a q-variate normal distribution with mean vector Aμ and covariance matrix AΣA′.
Recall: definition of eigenvector and eigenvalue. Let A be an n × n matrix. If

   Ax = λx  for some x ≠ 0,

then λ is called an eigenvalue of A and x is called an eigenvector of A.

Theorem: If the matrix A is symmetric with distinct eigenvalues λ1, …, λn and corresponding eigenvectors x1, …, xn (assume each of unit length, xi′xi = 1), then the eigenvectors are orthogonal (xi′xj = 0 for i ≠ j) and

   A = λ1 x1 x1′ + … + λn xn xn′ = PΛP′,

where P = [x1, …, xn] and Λ = diag(λ1, …, λn).
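The spectral decomposition A = PΛP′ can be verified numerically with an eigensolver for symmetric matrices; a sketch (the example matrix is ours):

```python
import numpy as np

# Small symmetric example matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, P = np.linalg.eigh(A)          # eigh: eigendecomposition for symmetric matrices
reconstructed = P @ np.diag(lam) @ P.T   # should recover A = P Lambda P'
```

`eigh` returns the eigenvalues in ascending order and the eigenvectors as orthonormal columns of P, so P′P = I.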
Applications of these results to statistics: Suppose that the random vector x = (x1, x2, …, xp)′ has a p-variate normal distribution with mean vector μ and covariance matrix Σ. Then Σ is positive definite. Suppose λ1, …, λp are the eigenvalues of Σ and p1, …, pp the corresponding eigenvectors of unit length. Note λ1 > 0, …, λp > 0.

Let P = [p1, …, pp] and Λ = diag(λ1, …, λp), so that Σ = PΛP′. Define

   Σ^(1/2) = PΛ^(1/2)P′  and  Σ^(−1/2) = PΛ^(−1/2)P′.

Suppose that the random vector x = (x1, x2, …, xp)′ has a p-variate normal distribution with mean vector μ and covariance matrix Σ. Then

   z = Σ^(−1/2)(x − μ)

has a p-variate normal distribution with mean vector 0 and covariance matrix Σ^(−1/2) Σ Σ^(−1/2) = I.

Thus the components of z are independent normal with mean 0 and variance 1, and

   z′z = (x − μ)′ Σ⁻¹ (x − μ)

has a χ² distribution with p degrees of freedom.
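The whitening step z = Σ^(−1/2)(x − μ) can be sketched directly from the spectral decomposition (function name and example covariance are ours):

```python
import numpy as np

def inv_sqrt(sigma):
    """Sigma^{-1/2} = P Lambda^{-1/2} P' via the spectral decomposition."""
    lam, P = np.linalg.eigh(sigma)
    return P @ np.diag(lam ** -0.5) @ P.T

# Whitening check on an example covariance matrix:
sigma = np.array([[4.0, 1.0],
                  [1.0, 2.0]])
W = inv_sqrt(sigma)
whitened_cov = W @ sigma @ W.T      # covariance of z = W (x - mu); should be I
```

Since W is symmetric, W² = Σ⁻¹, which is what makes z′z equal to the quadratic form (x − μ)′Σ⁻¹(x − μ).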