Marginal and Conditional distributions
Theorem (Marginal distributions for the multivariate normal distribution): Suppose that $\mathbf{x} = \begin{pmatrix}\mathbf{x}_1 \\ \mathbf{x}_2\end{pmatrix}$ has a p-variate normal distribution with mean vector $\boldsymbol{\mu} = \begin{pmatrix}\boldsymbol{\mu}_1 \\ \boldsymbol{\mu}_2\end{pmatrix}$ and covariance matrix $\boldsymbol{\Sigma} = \begin{pmatrix}\boldsymbol{\Sigma}_{11} & \boldsymbol{\Sigma}_{12} \\ \boldsymbol{\Sigma}_{21} & \boldsymbol{\Sigma}_{22}\end{pmatrix}$. Then the marginal distribution of $\mathbf{x}_i$ is a $q_i$-variate normal distribution ($q_1 = q$, $q_2 = p - q$) with mean vector $\boldsymbol{\mu}_i$ and covariance matrix $\boldsymbol{\Sigma}_{ii}$.
Theorem (Conditional distributions for the multivariate normal distribution): Suppose that $\mathbf{x} = \begin{pmatrix}\mathbf{x}_1 \\ \mathbf{x}_2\end{pmatrix}$ has a p-variate normal distribution with mean vector $\boldsymbol{\mu} = \begin{pmatrix}\boldsymbol{\mu}_1 \\ \boldsymbol{\mu}_2\end{pmatrix}$ and covariance matrix $\boldsymbol{\Sigma} = \begin{pmatrix}\boldsymbol{\Sigma}_{11} & \boldsymbol{\Sigma}_{12} \\ \boldsymbol{\Sigma}_{21} & \boldsymbol{\Sigma}_{22}\end{pmatrix}$. Then the conditional distribution of $\mathbf{x}_2$ given $\mathbf{x}_1$ is a $(p-q)$-variate normal distribution with mean vector $\boldsymbol{\mu}_{2|1} = \boldsymbol{\mu}_2 + \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}_1)$ and covariance matrix $\boldsymbol{\Sigma}_{2|1} = \boldsymbol{\Sigma}_{22} - \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}\boldsymbol{\Sigma}_{12}$.
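As a computational companion to these two theorems (not part of the original slides), the following minimal NumPy sketch extracts the marginal blocks of a partitioned mean vector and covariance matrix and computes the conditional mean and covariance; the function name and argument layout are choices made here for illustration.

```python
import numpy as np

def conditional_mvn(mu, Sigma, q, x1):
    """Parameters of x2 | x1 for a partitioned p-variate normal.

    mu    : length-p mean vector
    Sigma : p x p covariance matrix
    q     : dimension of the conditioning block x1 (the first q components)
    x1    : observed value of the first q components
    """
    mu1, mu2 = mu[:q], mu[q:]
    S11 = Sigma[:q, :q]          # marginal covariance of x1
    S12 = Sigma[:q, q:]
    S21 = Sigma[q:, :q]
    S22 = Sigma[q:, q:]          # marginal covariance of x2

    # matrix of regression coefficients: Sigma21 Sigma11^{-1}
    B = S21 @ np.linalg.inv(S11)

    cond_mean = mu2 + B @ (x1 - mu1)   # mu2 + Sigma21 Sigma11^{-1} (x1 - mu1)
    cond_cov = S22 - B @ S12           # Sigma22 - Sigma21 Sigma11^{-1} Sigma12
    return cond_mean, cond_cov
```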
$\boldsymbol{\Sigma}_{22} - \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}\boldsymbol{\Sigma}_{12}$ is called the matrix of partial variances and covariances. Its $(i, j)$ element, $\sigma_{ij \mid 1,\ldots,q}$, is called the partial covariance (variance if $i = j$) between $x_i$ and $x_j$ given $x_1, \ldots, x_q$. The quantity $\rho_{ij \mid 1,\ldots,q} = \dfrac{\sigma_{ij \mid 1,\ldots,q}}{\sqrt{\sigma_{ii \mid 1,\ldots,q}\,\sigma_{jj \mid 1,\ldots,q}}}$ is called the partial correlation between $x_i$ and $x_j$ given $x_1, \ldots, x_q$.
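A short sketch (again an illustration, not from the slides) of converting the matrix of partial variances and covariances into partial correlations, following the definition above:

```python
import numpy as np

def partial_correlations(cond_cov):
    """Convert a matrix of partial variances/covariances into partial correlations.

    cond_cov : the conditional covariance matrix Sigma22 - Sigma21 Sigma11^{-1} Sigma12
    """
    sd = np.sqrt(np.diag(cond_cov))        # partial standard deviations
    return cond_cov / np.outer(sd, sd)     # rho_ij|1..q = sigma_ij|1..q / (sd_i sd_j)
```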
$\boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}$ is called the matrix of regression coefficients for predicting $x_{q+1}, x_{q+2}, \ldots, x_p$ from $x_1, \ldots, x_q$. The mean vector of $x_{q+1}, x_{q+2}, \ldots, x_p$ given $x_1, \ldots, x_q$ is $\boldsymbol{\mu}_{2|1} = \boldsymbol{\mu}_2 + \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}_1)$.
Example: Suppose that $\mathbf{x} = (x_1, x_2, x_3, x_4)'$ is 4-variate normal with a specified mean vector $\boldsymbol{\mu}$ and covariance matrix $\boldsymbol{\Sigma}$.
The marginal distribution of $(x_1, x_2)'$ is bivariate normal with the corresponding mean subvector and covariance submatrix of $\boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$. Likewise, the marginal distribution of any three of the components is trivariate normal with the corresponding subvector and submatrix.
Find the conditional distribution of $(x_3, x_4)'$ given $(x_1, x_2)'$. We first compute $\boldsymbol{\Sigma}_{11}^{-1}$ and the products $\boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}$ and $\boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}\boldsymbol{\Sigma}_{12}$.
The matrix of regression coefficients for predicting $x_3, x_4$ from $x_1, x_2$ is $\boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}$.
Thus the conditional distribution of $(x_3, x_4)'$ given $(x_1, x_2)'$ is bivariate normal with mean vector $\boldsymbol{\mu}_2 + \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}_1)$ and partial covariance matrix $\boldsymbol{\Sigma}_{22} - \boldsymbol{\Sigma}_{21}\boldsymbol{\Sigma}_{11}^{-1}\boldsymbol{\Sigma}_{12}$.
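The slide's numeric mean vector and covariance matrix are not reproduced in this text, so the following worked sketch uses made-up values for a 4-variate normal, purely to illustrate the computation just described; none of these numbers come from the example.

```python
import numpy as np

# Illustrative values only; not the mean vector / covariance matrix from the slides.
mu = np.array([10.0, 20.0, 30.0, 40.0])
Sigma = np.array([[ 4.0, 1.0,  1.5,  0.5],
                  [ 1.0, 9.0,  2.0,  1.0],
                  [ 1.5, 2.0, 16.0,  3.0],
                  [ 0.5, 1.0,  3.0, 25.0]])

q = 2                                   # condition on the first two components (x1, x2)
mu1, mu2 = mu[:q], mu[q:]
S11, S12 = Sigma[:q, :q], Sigma[:q, q:]
S21, S22 = Sigma[q:, :q], Sigma[q:, q:]

B = S21 @ np.linalg.inv(S11)            # regression coefficients Sigma21 Sigma11^{-1}
x1_obs = np.array([11.0, 18.0])         # an observed value of (x1, x2)

cond_mean = mu2 + B @ (x1_obs - mu1)    # conditional mean of (x3, x4)
cond_cov = S22 - B @ S12                # partial covariance matrix
sd = np.sqrt(np.diag(cond_cov))
partial_corr = cond_cov / np.outer(sd, sd)   # partial correlations given x1, x2

print(cond_mean, cond_cov, partial_corr, sep="\n")
```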
Using SPSS
Note: the use of another statistical package, such as Minitab, is similar to using SPSS.
The first step is to input the data. The data is usually contained in some type of file:
1. Text files
2. Excel files
3. Other types of files
After starting the SPSS program, the following dialogue box appears:
If you select "Opening an existing file" and press OK, the following dialogue box appears:
Once you have selected the file and its type, the following dialogue box appears:
If the variable names are in the file, ask the program to read the names. If you do not specify the range, the program will identify it. Once you click OK, two windows will appear:
One window contains the output; the other contains the data.
To perform any statistical analysis, select the Analyze menu:
To compute correlations, select Correlate, then Bivariate. To compute partial correlations, select Correlate, then Partial.
For bivariate correlation, the following dialogue appears:
The output for bivariate correlation:
For partial correlation, the following dialogue appears:
The output for partial correlation:

- - -  P A R T I A L   C O R R E L A T I O N   C O E F F I C I E N T S  - - -

Controlling for..    AGE   HT   WT

             CHL              ALB              CA               UA
CHL       1.0000 (   0)    .1299 ( 178)    .2957 ( 178)    .2338 ( 178)
          P= .             P= .082         P= .000         P= .002
ALB        .1299 ( 178)   1.0000 (   0)    .4778 ( 178)    .1226 ( 178)
          P= .082         P= .             P= .000         P= .101
CA         .2957 ( 178)    .4778 ( 178)   1.0000 (   0)    .1737 ( 178)
          P= .000         P= .000         P= .             P= .020
UA         .2338 ( 178)    .1226 ( 178)    .1737 ( 178)   1.0000 (   0)
          P= .002         P= .101         P= .020         P= .

(Coefficient / (D.F.) / 2-tailed Significance)
". " is printed if a coefficient cannot be computed
Compare these with the bivariate correlation:
Partial correlations:

        CHL      ALB      CA       UA
CHL   1.0000    .1299    .2957    .2338
ALB    .1299   1.0000    .4778    .1226
CA     .2957    .4778   1.0000    .1737
UA     .2338    .1226    .1737   1.0000

Bivariate correlations: (shown alongside on the original slide)
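The partial correlations in the SPSS output can also be reproduced from a sample covariance matrix with the conditional-covariance formula from the earlier theorem. The sketch below is a hypothetical illustration: it assumes a data array whose columns are ordered AGE, HT, WT, CHL, ALB, CA, UA, and the function and variable names are invented here.

```python
import numpy as np

def partial_corr_matrix(S, control_idx, target_idx):
    """Partial correlations among the target variables, controlling for the control variables.

    S : sample covariance matrix of all variables
    """
    S11 = S[np.ix_(control_idx, control_idx)]
    S12 = S[np.ix_(control_idx, target_idx)]
    S21 = S[np.ix_(target_idx, control_idx)]
    S22 = S[np.ix_(target_idx, target_idx)]

    cond = S22 - S21 @ np.linalg.inv(S11) @ S12   # partial variances and covariances
    sd = np.sqrt(np.diag(cond))
    return cond / np.outer(sd, sd)

# Usage, assuming `data` is an (n x 7) array with columns AGE, HT, WT, CHL, ALB, CA, UA:
# S = np.cov(data, rowvar=False)
# print(partial_corr_matrix(S, control_idx=[0, 1, 2], target_idx=[3, 4, 5, 6]))
```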
In the last example the bivariate and partial correlations were roughly in agreement. This is not necessarily the case in all situations. An example: the following data were collected on three variables:
1. Age
2. Calcium intake in diet (CAI)
3. Bone mass density (BMI)
The data
Bivariate correlations
Partial correlations
Scatter plot CAI vs BMI (r = -0.447)
3D Plot Age, CAI and BMI
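The raw Age/CAI/BMI data behind these plots are not reproduced here, but the general phenomenon (a noticeable bivariate correlation that shrinks or changes once a third variable is partialled out) can be demonstrated with synthetic data. The sketch below fabricates Age, CAI and BMI values under one hypothetical mechanism, chosen only for illustration: CAI rises with age while BMI falls with age, so the bivariate CAI-BMI correlation is negative while the partial correlation given Age is near zero. None of these numbers are the course data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic, made-up data chosen only to illustrate the point; not the course data.
age = rng.uniform(25, 75, n)
cai = 600 + 10 * age + rng.normal(0, 80, n)        # hypothetical: calcium intake rises with age
bmi = 1.5 - 0.01 * age + rng.normal(0, 0.05, n)    # hypothetical: bone mass density falls with age

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

def partial_corr(a, b, c):
    """Partial correlation of a and b given c, via residuals from regressions on c."""
    X = np.column_stack([np.ones_like(c), c])
    res_a = a - X @ np.linalg.lstsq(X, a, rcond=None)[0]
    res_b = b - X @ np.linalg.lstsq(X, b, rcond=None)[0]
    return corr(res_a, res_b)

print("bivariate corr(CAI, BMI)     :", round(corr(cai, bmi), 3))               # clearly negative
print("partial corr(CAI, BMI | Age) :", round(partial_corr(cai, bmi, age), 3))  # near zero
```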
Transformations

Theorem: Let $x_1, x_2, \ldots, x_n$ denote random variables with joint probability density function $f(x_1, x_2, \ldots, x_n)$. Let
$$u_1 = h_1(x_1, x_2, \ldots, x_n),\quad u_2 = h_2(x_1, x_2, \ldots, x_n),\;\ldots,\; u_n = h_n(x_1, x_2, \ldots, x_n)$$
define an invertible transformation from the x's to the u's.
Then the joint probability density function of $u_1, u_2, \ldots, u_n$ is given by
$$g(u_1, \ldots, u_n) = f\big(x_1(u_1, \ldots, u_n), \ldots, x_n(u_1, \ldots, u_n)\big)\,\lvert J \rvert,$$
where
$$J = \det\!\left[\frac{\partial x_i}{\partial u_j}\right]$$
is the Jacobian of the transformation.
Example: Suppose that $x_1, x_2$ are independent with density functions $f_1(x_1)$ and $f_2(x_2)$. Find the distribution of $u_1 = x_1 + x_2$, $u_2 = x_1 - x_2$. Solving for $x_1$ and $x_2$, we get the inverse transformation $x_1 = \dfrac{u_1 + u_2}{2}$, $x_2 = \dfrac{u_1 - u_2}{2}$.
The Jacobian of the transformation is
$$J = \det\begin{bmatrix} \dfrac{\partial x_1}{\partial u_1} & \dfrac{\partial x_1}{\partial u_2} \\[1ex] \dfrac{\partial x_2}{\partial u_1} & \dfrac{\partial x_2}{\partial u_2} \end{bmatrix} = \det\begin{bmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{2} & -\tfrac{1}{2} \end{bmatrix} = -\tfrac{1}{2}.$$
The joint density of $x_1, x_2$ is $f(x_1, x_2) = f_1(x_1)\, f_2(x_2)$. Hence the joint density of $u_1$ and $u_2$ is
$$g(u_1, u_2) = f_1\!\left(\frac{u_1 + u_2}{2}\right) f_2\!\left(\frac{u_1 - u_2}{2}\right)\cdot \frac{1}{2}.$$
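As a quick numerical check of this result (not in the original slides), the sketch below assumes for concreteness that $f_1$ and $f_2$ are standard normal densities. In that special case $u_1$ and $u_2$ are known to be independent N(0, 2) variables, so the change-of-variables density can be compared against the exact answer.

```python
import numpy as np
from scipy.stats import norm

# Assumption for illustration: take f1 = f2 = standard normal density.
def g(u1, u2):
    """Density of (u1, u2) = (x1 + x2, x1 - x2) via the change-of-variables formula."""
    return norm.pdf((u1 + u2) / 2) * norm.pdf((u1 - u2) / 2) * 0.5   # |J| = 1/2

# Cross-check: for independent N(0, 1) inputs, u1 and u2 are independent N(0, 2),
# so the exact joint density factorises as N(0, 2) x N(0, 2).
def g_exact(u1, u2):
    return norm.pdf(u1, scale=np.sqrt(2)) * norm.pdf(u2, scale=np.sqrt(2))

print(g(0.7, -1.2), g_exact(0.7, -1.2))   # the two values agree
```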
Theorem: Let $x_1, x_2, \ldots, x_n$ denote random variables with joint probability density function $f(x_1, x_2, \ldots, x_n)$. Let
$$u_1 = a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n + c_1,\quad u_2 = a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n + c_2,\;\ldots,\; u_n = a_{n1}x_1 + a_{n2}x_2 + \cdots + a_{nn}x_n + c_n$$
define an invertible linear transformation from the x's to the u's.
Then the joint probability density function of $u_1, u_2, \ldots, u_n$ is given by
$$g(u_1, \ldots, u_n) = \frac{1}{\lvert \det A \rvert}\, f\big(A^{-1}(\mathbf{u} - \mathbf{c})\big),$$
where $A = [a_{ij}]$, $\mathbf{u} = (u_1, \ldots, u_n)'$ and $\mathbf{c} = (c_1, \ldots, c_n)'$.
Theorem: Suppose that the random vector $\mathbf{x} = (x_1, x_2, \ldots, x_p)'$ has a p-variate normal distribution with mean vector $\boldsymbol{\mu}$ and covariance matrix $\boldsymbol{\Sigma}$. Then $\mathbf{u} = A\mathbf{x} + \mathbf{c}$, where $A$ is a nonsingular $p \times p$ matrix, has a p-variate normal distribution with mean vector $A\boldsymbol{\mu} + \mathbf{c}$ and covariance matrix $A\boldsymbol{\Sigma}A'$.
Proof: Since $A$ is nonsingular, the transformation $\mathbf{u} = A\mathbf{x} + \mathbf{c}$ is invertible with $\mathbf{x} = A^{-1}(\mathbf{u} - \mathbf{c})$, so
$$g(\mathbf{u}) = \frac{1}{\lvert \det A \rvert}\, f\big(A^{-1}(\mathbf{u} - \mathbf{c})\big) = \frac{1}{(2\pi)^{p/2}\,\lvert \det A \rvert\,\lvert \boldsymbol{\Sigma} \rvert^{1/2}} \exp\!\left\{-\tfrac{1}{2}\big(A^{-1}(\mathbf{u} - \mathbf{c}) - \boldsymbol{\mu}\big)'\boldsymbol{\Sigma}^{-1}\big(A^{-1}(\mathbf{u} - \mathbf{c}) - \boldsymbol{\mu}\big)\right\}.$$
Since $A^{-1}(\mathbf{u} - \mathbf{c}) - \boldsymbol{\mu} = A^{-1}\big(\mathbf{u} - (A\boldsymbol{\mu} + \mathbf{c})\big)$, the exponent equals $-\tfrac{1}{2}\big(\mathbf{u} - (A\boldsymbol{\mu} + \mathbf{c})\big)'(A\boldsymbol{\Sigma}A')^{-1}\big(\mathbf{u} - (A\boldsymbol{\mu} + \mathbf{c})\big)$. Also $\lvert \det A \rvert\,\lvert \boldsymbol{\Sigma} \rvert^{1/2} = \lvert A\boldsymbol{\Sigma}A' \rvert^{1/2}$, and hence $\mathbf{u}$ is p-variate normal with mean vector $A\boldsymbol{\mu} + \mathbf{c}$ and covariance matrix $A\boldsymbol{\Sigma}A'$. QED
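A simulation sketch (illustrative parameters only, not from the slides) that checks the theorem numerically: draw a large sample from a multivariate normal, apply $\mathbf{u} = A\mathbf{x} + \mathbf{c}$, and compare the sample mean and covariance of $\mathbf{u}$ with $A\boldsymbol{\mu} + \mathbf{c}$ and $A\boldsymbol{\Sigma}A'$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (made-up) parameters for a 3-variate normal.
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
A = np.array([[1.0, 1.0,  0.0],
              [0.0, 2.0, -1.0],
              [1.0, 0.0,  3.0]])       # nonsingular 3 x 3
c = np.array([10.0, 0.0, -5.0])

x = rng.multivariate_normal(mu, Sigma, size=200_000)
u = x @ A.T + c                        # u = A x + c, applied row-wise

print("sample mean of u      :", u.mean(axis=0))
print("theoretical A mu + c  :", A @ mu + c)
print("sample cov of u       :\n", np.cov(u, rowvar=False))
print("theoretical A Sigma A':\n", A @ Sigma @ A.T)
```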
Theorem: Suppose that the random vector $\mathbf{x}$ has a p-variate normal distribution with mean vector $\boldsymbol{\mu}$ and covariance matrix $\boldsymbol{\Sigma}$. Let $A$ be a $q \times p$ matrix of rank $q \le p$. Then $\mathbf{u} = A\mathbf{x}$ has a q-variate normal distribution with mean vector $A\boldsymbol{\mu}$ and covariance matrix $A\boldsymbol{\Sigma}A'$.
Proof: Let $B$ be a $(p - q) \times p$ matrix so that $C = \begin{pmatrix} A \\ B \end{pmatrix}$ is invertible. Then $\mathbf{w} = C\mathbf{x} = \begin{pmatrix} A\mathbf{x} \\ B\mathbf{x} \end{pmatrix}$ is p-variate normal with mean vector $C\boldsymbol{\mu} = \begin{pmatrix} A\boldsymbol{\mu} \\ B\boldsymbol{\mu} \end{pmatrix}$ and covariance matrix $C\boldsymbol{\Sigma}C' = \begin{pmatrix} A\boldsymbol{\Sigma}A' & A\boldsymbol{\Sigma}B' \\ B\boldsymbol{\Sigma}A' & B\boldsymbol{\Sigma}B' \end{pmatrix}$.
Thus, by the marginal-distribution theorem, the marginal distribution of $\mathbf{u} = A\mathbf{x}$ is q-variate normal with mean vector $A\boldsymbol{\mu}$ and covariance matrix $A\boldsymbol{\Sigma}A'$. QED