
1 Orthogonality One way to delve further into the impact a factor has on the yield is to break the sum of squares (SSQ) into "orthogonal" components. If SSB_col has (C-1) df (corresponding to C levels, i.e. C columns), then SSB_col can be broken up into (C-1) individual SSQ values, each with a single degree of freedom and each addressing a different question about the data.

2 Orthogonality If each "question" asked of the data is orthogonal to all the other "questions," two generally desirable properties result: 1. Each "question" is independent of every other one; the probabilities of Type I and Type II errors in the ensuing hypothesis tests are independent and "stand alone."

3 Orthogonality 2. The (C-1) SSQ values are guaranteed to add up exactly to the total SSB_col you started with. What? How? Watch! Consider 4 column means:

Column:   1    2    3    4
Mean:     6    4    1   -3        Grand Mean = 2

4 Call these values Y_1, Y_2, Y_3, Y_4, and define

Z_1 = Σ_{j=1..4} a_{1j} Y_j,
Z_2 = Σ_{j=1..4} a_{2j} Y_j,
Z_3 = Σ_{j=1..4} a_{3j} Y_j.

5 Under what conditions will Σ_{i=1..3} Z_i² = Σ_{j=1..4} (Y_j - Ȳ)² ?

One answer:
(1) Σ_{j=1..4} a_{ij}² = 1 for all i (i = 1, 2, 3)
(2) Σ_{j=1..4} a_{ij} = 0 for all i
(3) Σ_{j=1..4} a_{i1,j} · a_{i2,j} = 0 for all i1 ≠ i2

A linear combination of the treatment means satisfying (2) is called a contrast; contrasts satisfying (3) are orthogonal.

6 Writing the a_ij's as a "matrix," one possibility among many:

   1/2    1/2   -1/2   -1/2
   1/2   -1/2    1/2   -1/2
   1/2   -1/2   -1/2    1/2

With Y_1 = 6, Y_2 = 4, Y_3 = 1, Y_4 = -3 (Ȳ = 2):

Z_1 = (1/2)(6) + (1/2)(4) - (1/2)(1) - (1/2)(-3) =  6
Z_2 = (1/2)(6) - (1/2)(4) + (1/2)(1) - (1/2)(-3) =  3
Z_3 = (1/2)(6) - (1/2)(4) - (1/2)(1) + (1/2)(-3) = -1
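Conditions (1) through (3) are easy to verify numerically for this matrix. The following is a minimal sketch, assuming numpy is available; the code is illustrative and not part of the original slides.

```python
import numpy as np

# Contrast matrix from the slide: each row is one "question".
A = np.array([
    [ 0.5,  0.5, -0.5, -0.5],
    [ 0.5, -0.5,  0.5, -0.5],
    [ 0.5, -0.5, -0.5,  0.5],
])

# Condition (1): each row's squared coefficients sum to 1.
print(np.allclose((A**2).sum(axis=1), 1.0))   # True
# Condition (2): each row's coefficients sum to 0 (each row is a contrast).
print(np.allclose(A.sum(axis=1), 0.0))        # True
# Conditions (1) and (3) together: A A' is the identity (off-diagonals 0).
print(np.allclose(A @ A.T, np.eye(3)))        # True
```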

7 Σ_j (Y_j - Ȳ)² = (6-2)² + (4-2)² + (1-2)² + (-3-2)² = 16 + 4 + 1 + 25 = 46

Z_1² + Z_2² + Z_3² = 36 + 9 + 1 = 46. OK! How does this help us?
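A quick numeric confirmation of this identity, written as a numpy sketch (not part of the original slides), using the matrix of slide 6 and the column means 6, 4, 1, -3:

```python
import numpy as np

Y = np.array([6.0, 4.0, 1.0, -3.0])   # the four column means; grand mean = 2
A = np.array([
    [ 0.5,  0.5, -0.5, -0.5],
    [ 0.5, -0.5,  0.5, -0.5],
    [ 0.5, -0.5, -0.5,  0.5],
])

Z = A @ Y                              # Z = (6, 3, -1)
print((Z**2).sum())                    # 46.0
print(((Y - Y.mean())**2).sum())       # 46.0 -- the two sums agree
```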

8 Consider the following data, which, let's say, are the column means of a one-factor ANOVA, the one factor being "DRUG":

Y.1 = 5    Y.2 = 6    Y.3 = 7    Y.4 = 10        Y.. = 7

and Σ_j (Y.j - Y..)² = 14  (so SSB_c = 14·R, where R = # rows).

9 Consider the following two examples. Example 1: the four columns are Placebo, Sulfa type S1, Sulfa type S2, and Antibiotic type A. Suppose the questions of interest are
(1) Placebo vs. non-placebo
(2) S1 vs. S2
(3) (average) S vs. A

10 How would you combine columns to address each question?

                 P    S1    S2    A
P vs. non-P:     3    -1    -1   -1
S1 vs. S2:       0     1    -1    0
S vs. A:         0     1     1   -2

Note that conditions 2 and 3 are satisfied.

11 Divide the top row by √12, the middle row by √2, and the bottom row by √6 (to satisfy condition 1).

12                      Y.1       Y.2       Y.3       Y.4
                         P         S1        S2        A         Z_i²
Placebo vs. drugs      3/√12    -1/√12    -1/√12    -1/√12       16/3
S1 vs. S2                0        1/√2     -1/√2       0          1/2
Average S vs. A          0        1/√6      1/√6     -2/√6       49/6
                                                       Total:      14
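Here is a minimal numpy sketch (illustrative, not part of the slides) that reproduces these Z_i² values from the column means of slide 8:

```python
import numpy as np

means = np.array([5.0, 6.0, 7.0, 10.0])          # Placebo, S1, S2, A; grand mean = 7

contrasts = np.array([
    [ 3, -1, -1, -1],     # Placebo vs. drugs
    [ 0,  1, -1,  0],     # S1 vs. S2
    [ 0,  1,  1, -2],     # average S vs. A
], dtype=float)
# Normalize each row so its squared coefficients sum to 1 (condition 1).
contrasts /= np.sqrt((contrasts**2).sum(axis=1, keepdims=True))

Z = contrasts @ means
print(Z**2)               # [16/3, 1/2, 49/6], i.e. about [5.33, 0.50, 8.17]
print((Z**2).sum())       # 14.0 -- equals the sum of (column mean - grand mean)^2
```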

13 Example 2:

Y.1              Y.2              Y.3                 Y.4
sulfa type S1    sulfa type S2    antibiotic type A1  antibiotic type A2

14 Exercise: Suppose the questions of interest are:
1. The difference between sulfa types
2. The difference between antibiotic types
3. The difference between sulfa and antibiotic types, on average
Write down the three corresponding contrasts. Are they orthogonal? If not, can we make them orthogonal?

15 OK! Now to the analysis:

                     Y.1      Y.2      Y.3      Y.4
                     S1       S2       A1       A2
                     (5)      (6)      (7)      (10)      Z_i²
S1 vs. S2           1/√2    -1/√2      0        0          1/2
A1 vs. A2             0       0       1/√2    -1/√2        9/2
Ave. S vs. Ave. A    1/2      1/2     -1/2     -1/2          9
                                                 Total:     14
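The same check for Example 2, as a short numpy sketch (not from the slides):

```python
import numpy as np

means = np.array([5.0, 6.0, 7.0, 10.0])           # S1, S2, A1, A2

contrasts = np.array([
    [ 1/np.sqrt(2), -1/np.sqrt(2), 0, 0],         # S1 vs. S2
    [ 0, 0,  1/np.sqrt(2), -1/np.sqrt(2)],        # A1 vs. A2
    [ 0.5, 0.5, -0.5, -0.5],                      # average S vs. average A
])

Z = contrasts @ means
print(Z**2)             # [0.5, 4.5, 9.0]
print((Z**2).sum())     # 14.0 -- again recovers the full between-column sum of squares
```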

16 Example (R = 8 observations per column):

Column:        Placebo   ASP1   ASP2   Buff
Column mean:      5        6      7     10        Y.. = 7

17 ANOVA, with F.05(3,28) = 2.95:

Source    SSQ    df
Drugs     112     3
Error            28

18 Now, an orthogonal breakdown:

                      Placebo    ASP1     ASP2     Buff        Z
Placebo vs. others    -3/√12    1/√12    1/√12    1/√12      8/√12
ASP1 vs. ASP2            0       1/√2    -1/√2      0        -1/√2
ASP vs. Buff             0       1/√6     1/√6    -2/√6      -7/√6
Pl'bo vs. Buff         1/√2       0        0      -1/√2      -5/√2

(The fourth contrast, Placebo vs. Buff, is not orthogonal to the first and third; it is handled separately on slide 20.)

19
                       Z         Z²              Z² × R   (R = 8)
Placebo vs. others   8/√12     16/3 ≈ 5.33      128/3 ≈ 42.67
ASP1 vs. ASP2       -1/√2       1/2               4
ASP vs. Buff        -7/√6      49/6 ≈ 8.17      196/3 ≈ 65.33
                                      Total:     112
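A quick numpy check of this table (illustrative, not from the slides): multiplying each Z_i² by R = 8 gives single-degree-of-freedom pieces that sum to SSB_col = 112.

```python
import numpy as np

R = 8                                              # observations per column
means = np.array([5.0, 6.0, 7.0, 10.0])            # Placebo, ASP1, ASP2, Buff

contrasts = np.array([
    [-3,  1,  1,  1],     # Placebo vs. the others
    [ 0,  1, -1,  0],     # ASP1 vs. ASP2
    [ 0,  1,  1, -2],     # ASP vs. Buff
], dtype=float)
contrasts /= np.sqrt((contrasts**2).sum(axis=1, keepdims=True))

Z = contrasts @ means
print(R * Z**2)            # [42.67, 4.0, 65.33] -- single-df pieces of SSB_col
print((R * Z**2).sum())    # 112.0 = R * 14
```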

20 ANOVA with the orthogonal breakdown:

Source                         SSQ      df
Drugs                          112       3
   Z1: Placebo vs. others     42.67      1
   Z2: ASP1 vs. ASP2            4        1
   Z3: ASP vs. Buff           65.33      1
Error                                   28

F.95(1,28) = 4.20; for the additional (non-orthogonal) contrast Z4, Placebo vs. Buff, use the Bonferroni-adjusted cutoff F_{1-.05/3}(1,28) < 7.64.

Conclusion: a significant difference between Placebo and the rest, and between the ASPs and BUFF, but not between the two different ASPs.

21 Another Example: The variable (coded) is mileage per gallon.

Gasoline I:   Standard gasoline
Gasoline II:  Standard, plus additive A made by P
Gasoline III: Standard, plus additive B made by P
Gasoline IV:  Standard, plus additive A made by Q
Gasoline V:   Standard, plus additive B made by Q

22 Questions actually chosen:
(Z1) Standard gasoline vs. gasoline with an additive
(Z2) P vs. Q
(Z3) Between the two additives of P
(Z4) Between the two additives of Q

23 With the appropriate orthogonal matrix and Z² values:

           I         II        III       IV        V
Z1       4/√20    -1/√20    -1/√20    -1/√20    -1/√20
Z2         0        1/2       1/2      -1/2      -1/2
Z3         0        1/√2     -1/√2       0         0
Z4         0         0         0        1/√2     -1/√2

By far, the largest part of the total variability in yields is associated with standard gasoline vs. gasoline with an additive.
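Since the yield values themselves did not survive in this transcript, the sketch below (numpy assumed, illustrative only) just verifies that the four chosen rows are contrasts and are mutually orthogonal once normalized; computing the Z² values would require the yield data.

```python
import numpy as np

# Columns: I (standard), II (P+A), III (P+B), IV (Q+A), V (Q+B)
raw = np.array([
    [ 4, -1, -1, -1, -1],     # Z1: standard vs. any additive
    [ 0,  1,  1, -1, -1],     # Z2: P vs. Q
    [ 0,  1, -1,  0,  0],     # Z3: between the two additives of P
    [ 0,  0,  0,  1, -1],     # Z4: between the two additives of Q
], dtype=float)
C = raw / np.sqrt((raw**2).sum(axis=1, keepdims=True))

print(np.allclose(C.sum(axis=1), 0))        # each row sums to 0: all are contrasts
print(np.allclose(C @ C.T, np.eye(4)))      # normalized and mutually orthogonal
```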

24 Orthogonal Breakdowns in 2^k and 2^(k-p) Designs. Let n = 4, and let the four observed yields be the four yields of a 2² factorial experiment:

Y_1 = (1),   Y_2 = a,   Y_3 = b,   Y_4 = ab

25 Example: Miles per Gallon by Gas Type and Auto Make Group

26 Suppose:

Y_1 = (1) = 15,   Y_2 = a = 23,   Y_3 = b = 18,   Y_4 = ab = 24

A = Gas Type = 0, 1;   B = Auto Make = 0, 1   (a 2² design)

27 Earlier we formed the estimates of 2A, 2B, and 2AB from the four yields, using these signs:

         A     B     AB
(1)      -     -     +
a        +     -     -
b        -     +     -
ab       +     +     +

28 Which, for present purposes, we replace by:

          A        B        AB
(1)     -1/2     -1/2      +1/2
a       +1/2     -1/2      -1/2
b       -1/2     +1/2      -1/2
ab      +1/2     +1/2      +1/2

Now we can see that these coefficients of the yields are elements of an orthogonal matrix. So A, B, and AB constitute orthogonal estimates.

29 Standard one-way ANOVA, with F.95(3,20) ≈ 3.1:

Source    SSQ    df    MSQ     F
Col       324     3    108    5.4
Error     400    20     20

30 Then,

A  = (-15 + 23 - 18 + 24)/2 =  7
B  = (-15 - 23 + 18 + 24)/2 =  2
AB = ( 15 - 23 - 18 + 24)/2 = -1

A² = 49,   B² = 4,   AB² = 1
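As a check, here is a minimal numpy sketch (not part of the slides) that applies the ±1/2 sign matrix of slide 28 to the four cell yields and reproduces these estimates:

```python
import numpy as np

one, a, b, ab = 15.0, 23.0, 18.0, 24.0       # cell means of the 2x2 layout

signs = np.array([
    [-1,  1, -1,  1],     # A
    [-1, -1,  1,  1],     # B
    [ 1, -1, -1,  1],     # AB
]) / 2.0

A_, B_, AB_ = signs @ np.array([one, a, b, ab])
print(A_, B_, AB_)                 # 7.0  2.0  -1.0
print(A_**2, B_**2, AB_**2)        # 49.0  4.0  1.0
```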

31 Multiply each of these by the number of data points in each column:

A²:  6(49) = 294
B²:  6(4)  =  24
AB²: 6(1)  =   6
TOTAL:      324

32 And the ANOVA, with F.95(1,20) ≈ 4.3:

Source    SSQ    df    MS     F_calc
Col       324     3
  A       294     1    294     14.7
  B        24     1     24      1.2
  AB        6     1      6      0.3
Error     400    20     20
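A brief sketch of how these single-df entries arise, assuming numpy and scipy are available (the data values are those of slides 30 and 31; the only library lookup is the F cutoff):

```python
import numpy as np
from scipy import stats

n_per_cell = 6
mse, df_error = 20.0, 20                        # Error: SSQ 400 on 20 df

ss = n_per_cell * np.array([49.0, 4.0, 1.0])    # single-df SSQ for A, B, AB
F = ss / mse                                    # each component has 1 df
print(ss)                                       # [294.  24.   6.]  (total 324)
print(F)                                        # [14.7  1.2  0.3]
print(stats.f.ppf(0.95, 1, df_error))           # about 4.35, the F.95(1,20) cutoff
```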

33 If:

Y_1 = c = 15,   Y_2 = a = 23,   Y_3 = b = 18,   Y_4 = abc = 24

A = Gas Type,   B = Auto Make,   C = Highway   (a 2^(3-1) design)

34 ANOVA: We'd get the same breakdown of the SSQ, but, this being the + block of I = ABC, each single-df term estimates an alias pair:

Source     SSQ    df
A + BC     294     1
B + AC      24     1
AB + C       6     1
Error      400    20
etc.
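A small illustrative check (not from the slides; plain numpy and itertools) that the A and BC contrasts have identical coefficients on the + block of I = ABC, which is why they are aliased:

```python
import numpy as np
from itertools import product

# The + block of I = ABC in a 2^3 layout: runs where A*B*C = +1, i.e. {c, b, a, abc}
runs = [r for r in product([-1, 1], repeat=3) if r[0] * r[1] * r[2] == 1]

A_coef  = np.array([r[0] for r in runs])            # contrast coefficients for A
BC_coef = np.array([r[1] * r[2] for r in runs])     # contrast coefficients for BC
print(runs)                               # [(-1,-1,1), (-1,1,-1), (1,-1,-1), (1,1,1)]
print(np.array_equal(A_coef, BC_coef))    # True: A and BC are aliased in this fraction
```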

35 What if the contrasts of interest are not orthogonal? Let k be the number of contrasts of interest.

* Bonferroni method: the same F test (SSQ = R × Z_i²), but test each contrast at level α/k, where α is the overall error rate.
* Scheffé method: p. 108, skipped.

Reference: Statistical Principles of Research Design and Analysis, by Robert O. Kuehl.

1. If k ≤ c-1: Bonferroni method.
2. If k > c-1: Bonferroni or Scheffé method.
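As an illustration (not from the slides), the Bonferroni-adjusted critical value for the aspirin example of slide 20 can be obtained from scipy; the k = 3 and 28 error df below are taken from that slide, and the printed value is only an approximate reproduction of the cutoff quoted there.

```python
from scipy import stats

alpha, k = 0.05, 3          # overall error rate split across k contrasts of interest
df_error = 28               # error df from the aspirin example (slide 20)

per_contrast_alpha = alpha / k
crit = stats.f.ppf(1 - per_contrast_alpha, 1, df_error)
print(per_contrast_alpha)   # about 0.0167
print(crit)                 # roughly 6.5, consistent with the "< 7.64" bound on slide 20
# Each contrast's F = R * Z_i**2 / MSE is compared to this stricter cutoff.
```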

36
1. Can k be larger than c-1? If the k contrasts are orthogonal, no.
2. Can k be smaller than c-1? Yes.
For case 2, do the same F test (but the sum of the SSQ values will not equal SSB). See Slide 16.