1
Orthogonal Linear Contrasts A technique for partitioning ANOVA sum of squares into individual degrees of freedom
2
Definition Let μ_1, μ_2, ..., μ_p denote p means and c_1, c_2, ..., c_p denote p coefficients such that c_1 + c_2 + ... + c_p = 0. Then the linear combination L = c_1μ_1 + c_2μ_2 + ... + c_pμ_p is called a linear contrast of the p means μ_1, μ_2, ..., μ_p.
3
Definition Let A = a_1μ_1 + a_2μ_2 + ... + a_pμ_p and B = b_1μ_1 + b_2μ_2 + ... + b_pμ_p be two linear contrasts of the p means μ_1, μ_2, ..., μ_p. Then A and B are called Orthogonal Linear Contrasts if, in addition to a_1 + a_2 + ... + a_p = 0 and b_1 + b_2 + ... + b_p = 0, it is also true that a_1 b_1 + a_2 b_2 + ... + a_p b_p = 0.
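As a quick illustration (hypothetical coefficients, not taken from the slides), the two defining conditions can be checked numerically:

```python
import numpy as np

# Hypothetical coefficients for two contrasts of p = 4 means
a = np.array([1, -1, 0, 0])    # compares the 1st mean with the 2nd
b = np.array([1, 1, -1, -1])   # compares means 1 and 2 with means 3 and 4

print(a.sum(), b.sum())  # 0 and 0: each set of coefficients sums to zero
print(a @ b)             # 0: a_1*b_1 + ... + a_4*b_4 = 0, so A and B are orthogonal
```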
4
Definition Let A = a_1μ_1 + a_2μ_2 + ... + a_pμ_p, B = b_1μ_1 + b_2μ_2 + ... + b_pμ_p, ..., and L = l_1μ_1 + l_2μ_2 + ... + l_pμ_p be a set of linear contrasts of the p means μ_1, μ_2, ..., μ_p. Then the set is called a set of Mutually Orthogonal Linear Contrasts if each linear contrast in the set is orthogonal to every other linear contrast in the set.
5
Theorem: The maximum number of linear contrasts in a set of Mutually Orthogonal Linear Contrasts of the quantities μ_1, μ_2, ..., μ_p is p - 1. p - 1 is called the degrees of freedom (d.f.) for comparing the quantities μ_1, μ_2, ..., μ_p.
6
Comments
1. Linear contrasts make comparisons amongst the p values μ_1, μ_2, ..., μ_p.
2. Orthogonal linear contrasts make independent comparisons amongst the p values μ_1, μ_2, ..., μ_p.
3. The number of independent comparisons amongst the p values μ_1, μ_2, ..., μ_p is p - 1.
7
Definition Let L = c_1μ_1 + c_2μ_2 + ... + c_pμ_p denote a linear contrast of the p means, and let L̂ = c_1ȳ_1 + c_2ȳ_2 + ... + c_pȳ_p denote its estimate, where each mean, ȳ_i, is calculated from n observations.
8
Then the Sum of Squares for testing the linear contrast L, i.e. H_0: L = 0 against H_A: L ≠ 0, is defined to be:
SS_L = L̂² / [(c_1² + c_2² + ... + c_p²)/n] = n L̂² / (c_1² + c_2² + ... + c_p²)
9
The degrees of freedom (df) for testing the linear contrast L is defined to be 1, and the F-ratio for testing the linear contrast L is defined to be:
F = SS_L / MS_Error
10
To test whether a set of mutually orthogonal linear contrasts are all zero, i.e. H_0: L_1 = 0, L_2 = 0, ..., L_k = 0, the Sum of Squares is
SS = SS_L1 + SS_L2 + ... + SS_Lk,
the degrees of freedom (df) is k, and the F-ratio is:
F = (SS / k) / MS_Error
11
Theorem: Let L_1, L_2, ..., L_{p-1} denote p - 1 mutually orthogonal linear contrasts for comparing the p means. Then the Sum of Squares for comparing the p means based on p - 1 degrees of freedom, SS_Between, satisfies:
SS_Between = SS_L1 + SS_L2 + ... + SS_L(p-1)
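A small numerical sketch of this theorem (hypothetical balanced data, not the diet example used later):

```python
import numpy as np

# Hypothetical balanced one-way layout: p = 3 groups, n = 4 observations per group
rng = np.random.default_rng(1)
y = rng.normal(loc=[10.0, 12.0, 15.0], scale=2.0, size=(4, 3))  # columns = groups
n, p = y.shape
means = y.mean(axis=0)
grand = y.mean()

# Sum of Squares for comparing the p means on p - 1 df
ss_between = n * np.sum((means - grand) ** 2)

# A complete set of p - 1 = 2 mutually orthogonal contrasts (Helmert-style)
contrasts = [np.array([1, -1, 0]), np.array([1, 1, -2])]
ss_each = [n * (c @ means) ** 2 / (c @ c) for c in contrasts]

print(round(ss_between, 6), round(sum(ss_each), 6))  # the two totals agree
```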
12
Comment Defining a set of orthogonal linear contrasts for comparing the p means allows the researcher to "break apart" the Sum of Squares for comparing the p means, SS_Between, and make an individual test of each linear contrast.
13
Helmert contrasts

Contrast coefficients:
L1:  -1   1   0   0   0
L2:  -1  -1   2   0   0
L3:  -1  -1  -1   3   0
L4:  -1  -1  -1  -1   4

Contrast explanation:
L1: 2nd versus 1st
L2: 3rd versus 1st and 2nd
L3: 4th versus 1st, 2nd and 3rd
L4: 5th versus 1st, 2nd, 3rd and 4th
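A short sketch (a hypothetical helper, not part of the slides) that builds these Helmert coefficients for any number of means p and verifies both orthogonality conditions:

```python
import numpy as np

def helmert_contrasts(p):
    """Rows are the p - 1 Helmert contrasts for p means:
    row k compares the (k+1)-th mean against the first k means."""
    H = np.zeros((p - 1, p), dtype=int)
    for k in range(1, p):
        H[k - 1, :k] = -1   # the first k means get coefficient -1
        H[k - 1, k] = k     # the (k+1)-th mean gets coefficient k
    return H

H = helmert_contrasts(5)
print(H)                # reproduces the coefficient table above
print(H.sum(axis=1))    # every row sums to zero
print(H @ H.T)          # off-diagonal entries are zero: mutually orthogonal
```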
14
The Diet-Weight Gain example. The Sum of Squares for comparing the 6 means is given in the ANOVA table:
15
Five mutually orthogonal contrasts are given below (together with a description of the purpose of each contrast):
L1: a comparison of the High protein diets with the Low protein diets
L2: a comparison of the Beef source of protein with the Pork source of protein
16
L3: a comparison of the Meat (Beef and Pork) sources of protein with the Cereal source of protein
L4: a comparison representing the interaction between Level of protein and Source of protein for the Meat sources of protein
L5: a comparison representing the interaction between Level of protein and the Cereal source of protein
17
Table of Coefficients

Contrast   Diet 1   Diet 2   Diet 3   Diet 4   Diet 5   Diet 6
L1             1        1        1       -1       -1       -1
L2             1        0       -1        1        0       -1
L3            -1        2       -1       -1        2       -1
L4             1        0       -1       -1        0        1
L5            -1        2       -1        1       -2        1

Note: L4 = L1 × L2 and L5 = L1 × L3
L1 is the 1 df for the Level main effect
L2 and L3 are the 2 df for the Source main effect
L4 and L5 are the 2 df for the Source-Level interaction
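The note L4 = L1 × L2 and L5 = L1 × L3 refers to element-wise products of the coefficient rows. A quick check of mutual orthogonality, assuming for illustration that diets 1-6 are ordered Beef-High, Cereal-High, Pork-High, Beef-Low, Cereal-Low, Pork-Low (the slide labels the columns only as diets 1 to 6):

```python
import numpy as np

# Coefficient rows from the table above
# (assumed diet order: Beef-High, Cereal-High, Pork-High, Beef-Low, Cereal-Low, Pork-Low)
L1 = np.array([ 1,  1,  1, -1, -1, -1])   # High vs Low protein (Level)
L2 = np.array([ 1,  0, -1,  1,  0, -1])   # Beef vs Pork (Source)
L3 = np.array([-1,  2, -1, -1,  2, -1])   # Cereal vs Meat (Source)
L4 = L1 * L2                              # Level x (Beef vs Pork) interaction
L5 = L1 * L3                              # Level x (Cereal vs Meat) interaction

C = np.vstack([L1, L2, L3, L4, L5])
print(C @ C.T)   # all off-diagonal entries are 0: the five contrasts are mutually orthogonal
```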
18
The ANOVA table for testing these contrasts is given below. The mutually orthogonal contrasts that are eventually selected should be chosen prior to observing the data and should be determined by the objectives of the experiment.
19
Another five mutually orthogonal contrasts are given below (together with a description of the purpose of each contrast):
L1: a comparison of the Beef source of protein with the Pork source of protein
L2: a comparison of the Meat (Beef and Pork) sources of protein with the Cereal source of protein
20
L3: a comparison of the high and low protein diets for the Beef source of protein
L4: a comparison of the high and low protein diets for the Cereal source of protein
L5: a comparison of the high and low protein diets for the Pork source of protein
21
Table of Coefficients

Contrast   Diet 1   Diet 2   Diet 3   Diet 4   Diet 5   Diet 6
L1             1        0       -1        1        0       -1
L2             1       -2        1        1       -2        1
L3             1        0        0       -1        0        0
L4             0        1        0        0       -1        0
L5             0        0        1        0        0       -1

Note: L1 and L2 are the 2 df for the Source main effect
L3, L4 and L5 are the 3 df comparing the Level within each Source.
22
The ANOVA table for testing these contrasts is given below:
23
Orthogonal Linear Contrasts: Polynomial Regression
24
Let μ_1, μ_2, ..., μ_p denote p means and consider the first differences Δ_i = μ_i - μ_{i-1}. If μ_1 = μ_2 = ... = μ_p then Δ_i = μ_i - μ_{i-1} = 0. If the points (1, μ_1), (2, μ_2), …, (p, μ_p) lie on a straight line with non-zero slope then Δ_i = μ_i - μ_{i-1} ≠ 0, but the Δ_i are all equal.
25
Consider the 2nd differences Δ²_i = (μ_i - μ_{i-1}) - (μ_{i-1} - μ_{i-2}) = μ_i - 2μ_{i-1} + μ_{i-2}. If the points (1, μ_1), (2, μ_2), …, (p, μ_p) lie on a straight line then Δ²_i = μ_i - 2μ_{i-1} + μ_{i-2} = 0. If the points lie on a quadratic curve then Δ²_i = μ_i - 2μ_{i-1} + μ_{i-2} ≠ 0, but the Δ²_i are all equal.
26
Consider the 3rd differences Δ³_i = μ_i - 3μ_{i-1} + 3μ_{i-2} - μ_{i-3}. If the points (1, μ_1), (2, μ_2), …, (p, μ_p) lie on a quadratic curve then Δ³_i = μ_i - 3μ_{i-1} + 3μ_{i-2} - μ_{i-3} = 0. If the points lie on a cubic curve then Δ³_i = μ_i - 3μ_{i-1} + 3μ_{i-2} - μ_{i-3} ≠ 0, but the Δ³_i are all equal.
27
Continuing, the 4th differences, Δ⁴_i, will be non-zero but equal if the points (1, μ_1), (2, μ_2), …, (p, μ_p) lie on a quartic curve (4th degree); the 5th differences, Δ⁵_i, will be non-zero but equal if the points lie on a quintic curve (5th degree); etc.
28
Let
L  = a_2Δ_2 + a_3Δ_3 + … + a_pΔ_p
Q_2 = b_3Δ²_3 + … + b_pΔ²_p
C  = c_4Δ³_4 + … + c_pΔ³_p
Q_4 = d_5Δ⁴_5 + … + d_pΔ⁴_p
etc., where a_2, …, a_p, b_3, …, b_p, c_4, … etc. are chosen so that L, Q_2, C, Q_4, … etc. are mutually orthogonal contrasts.
29
If the means are equal then L = Q_2 = C = Q_4 = … = 0. If the means are linear then L ≠ 0 but Q_2 = C = Q_4 = … = 0. If the means are quadratic then Q_2 ≠ 0 but C = Q_4 = … = 0. If the means are cubic then C ≠ 0 but Q_4 = … = 0.
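A sketch of how such mutually orthogonal polynomial contrasts can be generated numerically (one possible construction, by orthogonalizing the powers of the level scores with a QR decomposition; this is not necessarily the construction used on the slides):

```python
import numpy as np

def orthogonal_poly_contrasts(p):
    """Columns are normalized orthogonal polynomial contrasts (linear, quadratic, ...)
    for p equally spaced levels, built by orthogonalizing 1, x, x^2, ..., x^(p-1)."""
    x = np.arange(1, p + 1, dtype=float)
    V = np.vander(x, N=p, increasing=True)   # columns: 1, x, x^2, ..., x^(p-1)
    Q, _ = np.linalg.qr(V)
    return Q[:, 1:]                          # drop the constant column

C = orthogonal_poly_contrasts(5)
print(np.round(C.sum(axis=0), 10))   # each column sums to ~0: a linear contrast
print(np.round(C.T @ C, 10))         # identity matrix: mutually orthogonal
```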
30
Orthogonal Linear Contrasts for Polynomial Regression
32
Example In this example we are measuring the "Life" of an electronic component and how it depends on the temperature at activation.
33
The ANOVA Table

Source      SS       df   MS       F
Treat       660.00    4   165.00   23.57
  Linear    187.50    1   187.50   26.79
  Quadratic 433.93    1   433.93   61.99
  Cubic       0.00    1     0.00    0.00
  Quartic    38.57    1    38.57    5.51
Error        70.00   10     7.00
Total       730.00   14

L = 25.00   Q_2 = -45.00   C = 0.00   Q_4 = 30.00
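As a check on the table (assuming the usual orthogonal polynomial linear coefficients (-2, -1, 0, 1, 2) for five equally spaced temperatures, and n = 3 observations per temperature, which is consistent with the 14 total df), the linear row follows from the contrast value L = 25.00:

$$ SS_{\text{Linear}} = \frac{n\,\hat{L}^2}{\sum c_i^2} = \frac{3\,(25.00)^2}{10} = 187.50, \qquad F = \frac{187.50}{7.00} = 26.79 $$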
34
The ANOVA tables for determining the degree of the polynomial and for testing for an effect of the factor:
35
Testing for departure from Linear: Q_2 + C + Q_4
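Using the sums of squares from the ANOVA table above, this pooled test has 3 df:

$$ F = \frac{(433.93 + 0.00 + 38.57)/3}{7.00} = \frac{157.50}{7.00} = 22.5 $$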
36
Testing for departure from Quadratic: C + Q_4
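Again using the table above, this pooled test has 2 df:

$$ F = \frac{(0.00 + 38.57)/2}{7.00} = \frac{19.29}{7.00} \approx 2.76 $$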
39
The Analysis of Covariance (ANACOVA)
40
Multiple Regression
1. Dependent variable Y (continuous)
2. Continuous independent variables X_1, X_2, …, X_p
The continuous independent variables X_1, X_2, …, X_p are quite often measured and observed (not set at specific values or levels).
41
Analysis of Variance
1. Dependent variable Y (continuous)
2. Categorical independent variables (factors) A, B, C, …
The categorical independent variables A, B, C, … are set at specific values or levels.
42
Analysis of Covariance
1. Dependent variable Y (continuous)
2. Categorical independent variables (factors) A, B, C, …
3. Continuous independent variables (covariates) X_1, X_2, …, X_p
43
Example
1. Dependent variable Y – weight gain
2. Categorical independent variables (factors):
   i. A = level of protein in the diet (High, Low)
   ii. B = source of protein (Beef, Cereal, Pork)
3. Continuous independent variables (covariates):
   i. X_1 = initial weight of the animal
44
Statistical Technique    Continuous independent variables    Categorical independent variables
Multiple Regression                    ×
ANOVA                                                                      ×
ANACOVA                                ×                                   ×

The dependent variable is continuous in each case. It is possible to treat categorical independent variables in Multiple Regression using dummy variables.
45
The Multiple Regression Model
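In standard notation (a sketch assuming p continuous predictors and independent normal errors), the model is:

$$ Y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_p x_{pi} + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2) $$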
46
The ANOVA Model
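For two factors A (a levels) and B (b levels), one standard form of the model (a sketch, not necessarily the exact notation used on the slide) is:

$$ Y_{ijk} = \mu + \alpha_i + \beta_j + (\alpha\beta)_{ij} + \varepsilon_{ijk}, \qquad \varepsilon_{ijk} \sim N(0, \sigma^2) $$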
47
The ANACOVA Model
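Combining the two, a standard form of the ANACOVA model with covariates X_1, …, X_p (again a sketch of the usual parameterization) is:

$$ Y_{ijk} = \mu + \alpha_i + \beta_j + (\alpha\beta)_{ij} + \gamma_1 x_{1ijk} + \cdots + \gamma_p x_{pijk} + \varepsilon_{ijk} $$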
48
ANOVA Tables
49
The Multiple Regression Model

Source       S.S.       d.f.
Regression   SS_Reg     p
Error        SS_Error   n - p - 1
Total        SS_Total   n - 1
50
The ANOVA Model

Source         S.S.       d.f.
Main Effects
  A            SS_A       a - 1
  B            SS_B       b - 1
Interactions
  AB           SS_AB      (a - 1)(b - 1)
Error          SS_Error   n - ab
Total          SS_Total   n - 1
51
The ANACOVA Model

Source         S.S.             d.f.
Covariates     SS_Covariates    p
Main Effects
  A            SS_A             a - 1
  B            SS_B             b - 1
Interactions
  AB           SS_AB            (a - 1)(b - 1)
Error          SS_Error         n - ab - p
Total          SS_Total         n - 1
52
Example
1. Dependent variable Y – weight gain
2. Categorical independent variables (factors):
   i. A = level of protein in the diet (High, Low)
   ii. B = source of protein (Beef, Cereal, Pork)
3. Continuous independent variable (covariate): X = initial weight of the animal
53
The data
54
The ANOVA Table
55
Using SPSS to perform ANACOVA
56
The data file
57
Select Analyze → General Linear Model → Univariate
58
Choose the Dependent Variable, the Fixed Factor(s) and the Covariates
59
The following ANOVA table appears
60
The Process of Analysis of Covariance (plot of the dependent variable against the covariate)
61
The Process of Analysis of Covariance (plot of the adjusted dependent variable against the covariate)
62
The dependent variable (Y) is adjusted so that the covariate takes on its average value for each case. The effects of the factors (A, B, etc.) are then determined using the adjusted values of the dependent variable.
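For a single covariate X this adjustment can be written as (a sketch, with β̂ the pooled within-group slope of Y on X and x̄ the overall covariate mean):

$$ Y_{\text{adj}} = Y - \hat{\beta}\,(X - \bar{x}) $$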
63
ANOVA and ANACOVA can be handled by a Multiple Regression package through the use of dummy variables for the categorical independent variables. The results would be the same.
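A minimal sketch of the dummy-variable idea (hypothetical data values and column names, using pandas only to build the design matrix):

```python
import pandas as pd

# Hypothetical rows in the style of the diet example
df = pd.DataFrame({
    "gain":    [95, 71, 99, 61, 82, 73],                              # dependent variable Y
    "level":   ["High", "High", "High", "Low", "Low", "Low"],         # factor A
    "source":  ["Beef", "Cereal", "Pork", "Beef", "Cereal", "Pork"],  # factor B
    "init_wt": [35, 38, 41, 33, 39, 40],                              # covariate X
})

# drop_first=True creates a - 1 and b - 1 dummy columns, matching the ANOVA df
X = pd.get_dummies(df[["level", "source", "init_wt"]],
                   columns=["level", "source"], drop_first=True)
print(X)  # this design matrix can be fed to any multiple regression routine
```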