ANOVA and Multiple Comparison Tests


ANOVA and Multiple Comparison Tests BUSI 6480 Lecture 3

Contrasts Among Means A contrast (or comparison) among means is a difference among means. The symbol ψi is used to denote the i-th contrast, for example ψi = μj − μk, estimated by the corresponding difference of sample means, ψ̂i = Ȳj − Ȳk. Sometimes contrasts involve the comparison of the average of two experimental groups with a control group, for example ψ = (μ1 + μ2)/2 − μ3.

Contrast equations, pairwise comparisons, and orthogonal contrasts Contrasts can be expressed as linear combinations of the group means, in the form of contrast equations ψ = Σ cj μj, where the coefficients sum to zero. When all coefficients cj except two are equal to zero (one +1 and one −1), the contrast is a pairwise comparison. Contrasts that are mutually non-redundant are called orthogonal contrasts; for equal group sizes, two contrasts are orthogonal when the products of their coefficients sum to zero, and the coefficient matrix of a full set of orthogonal contrast equations has full rank.
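These two conditions (coefficients sum to zero; cross-products of coefficient vectors sum to zero for orthogonality under equal group sizes) can be sketched in Python. The coefficient vectors below are hypothetical examples for a four-level factor.

```python
# Hypothetical contrast coefficient vectors for k = 4 groups.
c1 = [1, -1, 0, 0]            # pairwise: group 1 vs group 2
c2 = [0.5, 0.5, -1, 0]        # average of groups 1-2 vs group 3
c3 = [1/3, 1/3, 1/3, -1]      # average of groups 1-3 vs group 4

def is_contrast(c, tol=1e-9):
    """A contrast's coefficients must sum to zero."""
    return abs(sum(c)) < tol

def are_orthogonal(c, d, tol=1e-9):
    """With equal group sizes, contrasts are orthogonal when sum(c_j * d_j) = 0."""
    return abs(sum(ci * di for ci, di in zip(c, d))) < tol

print(is_contrast(c1), is_contrast(c2), is_contrast(c3))  # True True True
print(are_orthogonal(c1, c2))  # True: 0.5 - 0.5 + 0 + 0 = 0
print(are_orthogonal(c2, c3))  # True: 1/6 + 1/6 - 1/3 + 0 = 0
```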

A Priori and A Posteriori Contrasts Confirmatory data analysis typically involves a priori tests: tests the designer of the experiment had in mind during the planning phase. Exploratory data analysis typically involves a posteriori tests. These are used in the preliminary stages of a research program, when the researcher does not yet have the knowledge to formulate testable models.

Error rates When the experiment involves multiple contrasts, the probability of making a Type I error (a false significance) becomes harder to control. For example, for c independent contrasts each tested at level α: Probability of one or more Type I errors = 1 − (1 − α)^c. A number of multiple comparison procedures have been developed. These aim to control the per-contrast, familywise, or per-family error rate, while offering a good combination of conceptual simplicity, ease of computation, power, availability of confidence intervals, and robustness (see Table 4.1.2).
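A quick Python sketch of the formula above shows how fast the familywise rate grows, and how a Bonferroni-style per-contrast adjustment (α/c) pulls it back near α. The numbers are just illustrations.

```python
# Familywise Type I error rate for c independent contrasts tested at level alpha.
def familywise_error(alpha, c):
    """P(at least one Type I error) = 1 - (1 - alpha)^c."""
    return 1 - (1 - alpha) ** c

for c in (1, 5, 10, 20):
    print(c, round(familywise_error(0.05, c), 3))
# With alpha = 0.05, the rate climbs from 0.05 (c = 1) to about 0.64 (c = 20).

# A Bonferroni-style fix tests each contrast at alpha / c:
print(round(familywise_error(0.05 / 10, 10), 3))  # about 0.049
```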

Contrasts Contradicting ANOVA Results Is it possible to get a 'significant' result from a post test even when the overall ANOVA was not significant? Yes, it is possible. The exception is Scheffé's test, which is intertwined with the overall F test: if the overall ANOVA has a p-value greater than 0.05, Scheffé's test won't find any significant post tests. In that case, performing post tests after a nonsignificant overall ANOVA is a waste of time but won't lead to invalid conclusions. Other multiple comparison tests, however, can sometimes find significant differences even when the overall ANOVA showed no significant differences among groups.

Contrasts Contradicting ANOVA Results How can I understand the apparent contradiction between an ANOVA saying, in effect, that all group means are identical and a post test finding differences? The overall one-way ANOVA tests the null hypothesis that all the treatment groups have identical mean values, so that any observed difference is due to random sampling. Each post test tests the null hypothesis that two particular groups have identical means. The post tests are more focused, so they can have power to find differences between groups even when the overall ANOVA is not significant.

Contrasts Contradicting ANOVA Results Are the results of the overall ANOVA useful at all? ANOVA tests the overall null hypothesis that all the data come from groups that have identical means. If that is your experimental question -- do the data provide convincing evidence that the means are not all identical -- then ANOVA is exactly what you want. More often, your experimental questions are more focused and are answered by multiple comparison tests (post tests). Note that the multiple comparison calculations all use the mean-square error from the ANOVA table, so even if you don't care about the value of F or the p-value, the post tests still require that the ANOVA table be computed.

Difference Between Estimate and Contrast Statements in SAS Estimate and Contrast in SAS (the effect name, Level, must appear before the coefficients):

proc glm data=myAnovaData;
   class Level;
   model Resp = Level;
   estimate '1and2Versus3' Level -.5 -.5 1;
   contrast '1and2Versus3' Level -.5 -.5 1;
   ods output overallanova = atable;
run;

OUTPUT

Contrast         DF    Contrast SS    Mean Square    F Value    Pr > F
1and2Versus3      1     3.75000000     3.75000000       0.53    0.4732

                                  Standard
Parameter           Estimate         Error    t Value    Pr > |t|
1and2Versus3     -0.75000000    1.03091114      -0.73     0.4732

The two statements give the same p-value (0.4732): for a single-degree-of-freedom contrast, F = t², here (−0.73)² ≈ 0.53. Contrast reports a sum of squares and an F test; Estimate reports the estimated value of the contrast with its standard error and t test.

Different Ways of Representing Contrast Coefficients

proc glm data=myAnovaData;
   class Level;
   model Resp = Level;
   means Level / sidak bon;
   estimate 'Diff 1_2_3 minus 4_5' Level -2 -2 -2 3 3 / divisor = 6;
   estimate 'Diff 1_2_3 minus 4_5' Level -.33 -.33 -.34 .5 .5;
   estimate 'Diff 1_2_3 minus 4_5' Level -2 -2 -2 3 3;
   ods output overallanova = atable;
run;

The first two estimates represent (essentially) the same contrast, the mean of levels 1-3 versus the mean of levels 4-5: dividing (−2, −2, −2, 3, 3) by 6 gives (−1/3, −1/3, −1/3, 1/2, 1/2). The third, undivided version estimates 6 times that difference; its t test and p-value are unchanged, but the estimate and standard error are scaled by 6.

Data Input: Y in One Column, Grouping Variable in Another You always need to input data with Y in one column and a grouping variable in another, whether using Minitab, SAS, or SPSS. One way to translate data to a single column with Y and a grouping variable using SAS (assuming 10 observations per treatment; the If statements must run in this order, so that the later ones overwrite Level for the earlier observations):

data myAnovaData;
   set Treat1 Treat2 Treat3;
   if _N_ <= 30 then Level = 3;
   if _N_ <= 20 then Level = 2;
   if _N_ <= 10 then Level = 1;
   keep resp Level;
run;

Data Input in SAS Another way to translate data to a single column with Y and a grouping variable using SAS (each data step needs its own set statement):

data Treat1; set Learning; resp = a1; Level = 1;
data Treat2; set Learning; resp = a2; Level = 2;
data Treat3; set Learning; resp = a3; Level = 3;
data myAnovaData;
   set Treat1 Treat2 Treat3;
   keep resp Level;
run;
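The same wide-to-long reshaping can be sketched in Python with pandas. The data frame below is hypothetical, mirroring the a1/a2/a3 columns of the Learning data set in the SAS example.

```python
import pandas as pd

# Hypothetical wide-format data: one column of responses per treatment level.
learning = pd.DataFrame({
    "a1": [3, 5, 4],
    "a2": [6, 7, 5],
    "a3": [9, 8, 10],
})

# Stack into one response column (resp) plus one grouping column (Level).
long = learning.melt(var_name="Level", value_name="resp")
long["Level"] = long["Level"].str.replace("a", "").astype(int)
print(long)  # 9 rows: 3 observations at each of levels 1, 2, 3
```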

The One-Way Procedure in SPSS Quick to use, but doesn’t have as many options as the Univariate GLM

Univariate Contrasts in SPSS Analyze > General Linear Model > Univariate But you can only select one contrast per run, unlike the one-way ANOVA.

Univariate Contrasts in SPSS A simple contrast compares each level to a reference category. NOTE: The Change button must be clicked to select the contrast type.

Contrast types in SPSS Definitions of the contrast types available in SPSS are given on the following slides. Contrasts other than these must be programmed by hand.

Deviation Contrasts Deviation contrasts compare each level with the grand mean of all the levels. A contrast is not made for the reference category.

Simple Contrasts Simple contrasts compare each level with a reference level, which can be either the first or the last level of the factor.

Difference Contrasts Difference (reverse Helmert) contrasts compare: level 2 with level 1, level 3 with the mean of the previous two levels, level 4 with the mean of the previous three levels, and so forth.

Helmert Contrasts Helmert contrasts compare: the first level of the factor with the mean of all later levels, the second level with the mean of all later levels, the third level with the mean of all later levels, and so forth.

Repeated Contrasts Repeated contrasts compare consecutive pairs of levels, for example level 1 with level 2, level 2 with level 3, level 3 with level 4, and so forth.
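Two of these coefficient schemes can be sketched in Python for a factor with k levels (the function names are our own; rows are contrasts, columns are levels 1..k):

```python
def helmert(k):
    """Helmert-style contrasts: level i versus the mean of all later levels."""
    rows = []
    for i in range(k - 1):
        row = [0.0] * k
        row[i] = 1.0
        for j in range(i + 1, k):
            row[j] = -1.0 / (k - i - 1)
        rows.append(row)
    return rows

def repeated(k):
    """Repeated contrasts: consecutive pairs, level i versus level i + 1."""
    rows = []
    for i in range(k - 1):
        row = [0.0] * k
        row[i], row[i + 1] = 1.0, -1.0
        rows.append(row)
    return rows

for row in helmert(4):
    print([round(x, 3) for x in row])
# [1.0, -0.333, -0.333, -0.333]
# [0.0, 1.0, -0.5, -0.5]
# [0.0, 0.0, 1.0, -1.0]
```

Note that every row sums to zero, as a contrast must, and that the Helmert rows for equal group sizes are mutually orthogonal.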

Omega Squared Omega squared (ω²) is an estimate of the proportion of the population variance accounted for by the variability of the treatment levels. For a contrast, ω² can be computed as: ω² = (SS(contrast) − MSerror) / (MSerror + SStotal). It is not difficult to compute if you know the group means and the ANOVA output: SS(contrast) = (estimate of contrast)² / (sum of squared contrast coefficients / group size).
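A Python sketch of these two formulas, reusing the estimate (−0.75) and coefficients (−.5, −.5, 1) from the SAS contrast output earlier; the group size of 10 and the MSerror and SStotal values are hypothetical.

```python
# Omega squared for a contrast (equal group sizes assumed).
def contrast_ss(estimate, coeffs, n_per_group):
    """SS(contrast) = estimate^2 / sum(c_j^2 / n_j)."""
    return estimate ** 2 / sum(c ** 2 / n_per_group for c in coeffs)

def omega_sq_contrast(ss_contrast, ms_error, ss_total):
    return (ss_contrast - ms_error) / (ms_error + ss_total)

ss_c = contrast_ss(estimate=-0.75, coeffs=[-0.5, -0.5, 1], n_per_group=10)
print(ss_c)  # 3.75, matching the SAS Contrast SS above

# Hypothetical MSerror and SStotal:
print(round(omega_sq_contrast(ss_c, ms_error=7.1, ss_total=260.0), 4))
# Negative here: a nonsignificant contrast; negative omega squared is reported as 0.
```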

Eta Squared and Partial Eta Squared Eta squared (η²) is the proportion of the total variance that is attributed to an effect. It is calculated as the ratio of the effect sum of squares (SSeffect) to the total sum of squares (SStotal): η² = SSeffect / SStotal. Partial eta squared (ηp²) differs in that the denominator is SSeffect plus SSerror rather than SStotal: ηp² = SSeffect / (SSeffect + SSerror). For a one-way ANOVA, where SStotal = SSeffect + SSerror, partial eta squared is the same as η² and as R-square.
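A short Python sketch with hypothetical sums of squares shows when the two measures agree and when they differ:

```python
def eta_sq(ss_effect, ss_total):
    """Eta squared: SSeffect / SStotal."""
    return ss_effect / ss_total

def partial_eta_sq(ss_effect, ss_error):
    """Partial eta squared: SSeffect / (SSeffect + SSerror)."""
    return ss_effect / (ss_effect + ss_error)

# One-way ANOVA: SStotal = SSeffect + SSerror, so the two measures agree.
ss_effect, ss_error = 60.0, 140.0
print(eta_sq(ss_effect, ss_effect + ss_error))   # 0.3
print(partial_eta_sq(ss_effect, ss_error))       # 0.3

# Multi-way ANOVA: SStotal also contains other effects, so partial eta
# squared exceeds eta squared for the same effect.
ss_other = 50.0
print(eta_sq(ss_effect, ss_effect + ss_other + ss_error))  # 0.24
```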

Random Effects Model ANOVA Same ANOVA table as for the fixed effects model (this is true only for the one-way ANOVA), but the interpretation is different. Since the levels in a random effects model were selected at random, multiple comparisons and contrasts are not of interest.

Random Factors: No multiple comparisons Note that in SPSS, the Post Hoc dialog box doesn’t show any variables under Factor(s) when the grouping variable is placed in Random Factors.

Random Factors: Intraclass correlation For random effects, use the intraclass correlation instead of omega squared. Easy computation from the ANOVA F statistic, with n observations per group: ρI = (F − 1) / ((n − 1) + F)
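The formula above is a one-liner in Python (the F value and group size below are hypothetical):

```python
def intraclass_corr(f_stat, n_per_group):
    """rho_I = (F - 1) / ((n - 1) + F), equivalent to
    (MSbetween - MSwithin) / (MSbetween + (n - 1) * MSwithin)."""
    return (f_stat - 1) / ((n_per_group - 1) + f_stat)

print(round(intraclass_corr(5.0, 10), 4))  # 0.2857
# F = 1 (no group-to-group variability beyond chance) gives rho_I = 0.
print(intraclass_corr(1.0, 10))  # 0.0
```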

Box Plots The generic SAS code is:

proc boxplot data=test;
   plot response * group;
run;
quit;

In SPSS, click Graphs > Boxplots.