Multiple comparisons - multiple pairwise tests - orthogonal contrasts


Multiple comparisons - multiple pairwise tests - orthogonal contrasts - independent tests - labelling conventions

Multiple tests
Problem: because we examine the same data in multiple comparisons, the result of the first comparison affects our expectation of the next. This is a general problem in statistics: we often need to do several tests on our data, but when two tests involve the same data, the outcome of one affects the outcome of the other. They are not INDEPENDENT.
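To see how quickly non-independent tests inflate the chance of a false positive, here is a minimal simulation sketch (hypothetical data and Python/SciPy code, an assumed illustration rather than anything from the slides): three groups are drawn from the same population, all three pairwise t-tests are run at alpha = 0.05, and the fraction of simulated experiments with at least one "significant" result is reported.

```python
# Minimal sketch (assumed example, not from the slides): family-wise error rate
# when all pairwise t-tests are run at alpha = 0.05 on three identical groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, n_sims, n_per_group = 0.05, 5000, 10
false_positives = 0

for _ in range(n_sims):
    # Three groups drawn from the SAME population: any "difference" is a fluke.
    a, b, c = (rng.normal(0, 1, n_per_group) for _ in range(3))
    pvals = [stats.ttest_ind(x, y).pvalue for x, y in ((a, b), (a, c), (b, c))]
    if min(pvals) < alpha:          # at least one comparison looks "significant"
        false_positives += 1

print(f"Family-wise error rate: {false_positives / n_sims:.3f}")  # well above 0.05
```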

Example: look at the effect of each treatment on orang-utan births.
3 treatments:
a) no active management
b) selective logging
c) replanting

Multiple tests
The ANOVA on births per km² gives F(2,27) = 5.82, p < 0.05. Therefore at least one treatment differs, but which one(s)? One of the most common instances of multiple tests is the set of "post hoc" tests run after a significant treatment effect in an ANOVA. Remember, the ANOVA only tells us that at least one level differs from the rest; we then have to figure out which. Often we run t-tests on all pairwise combinations, as in the figure: two of the pairwise t-tests are significant and one (1 vs 3) is not.
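A minimal sketch of this workflow (hypothetical birth-rate numbers, Python with SciPy; the dataset and names are assumptions, not the slides' data): a one-way ANOVA followed by t-tests on every pairwise combination.

```python
# Sketch of the post-hoc workflow (made-up data): one-way ANOVA, then t-tests
# on all pairwise combinations of treatments.
from itertools import combinations
from scipy import stats

births = {  # births per km^2 under each treatment (hypothetical numbers)
    "no management":     [2.1, 2.4, 1.9, 2.6, 2.3, 2.0, 2.2, 2.5, 2.1, 2.4],
    "selective logging": [1.4, 1.6, 1.2, 1.5, 1.3, 1.7, 1.4, 1.5, 1.6, 1.3],
    "replanting":        [2.0, 2.2, 1.8, 2.4, 2.1, 1.9, 2.3, 2.0, 2.2, 2.1],
}

f_stat, p_val = stats.f_oneway(*births.values())
print(f"ANOVA: F(2,27) = {f_stat:.2f}, p = {p_val:.4f}")

# Naive follow-up: every pairwise t-test at alpha = 0.05 (the non-independence problem).
for (name1, x), (name2, y) in combinations(births.items(), 2):
    t, p = stats.ttest_ind(x, y)
    print(f"{name1} vs {name2}: t = {t:.2f}, p = {p:.4f}")
```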

Multiple tests
However, as we just discovered in the card example, these pairwise tests are non-independent. Setting alpha = 0.05 tells us that, although the second treatment could be higher than the first just by fluke, this will happen less than 5% of the time. But if the second treatment did just happen to be high, that would also affect the probability of it being higher than the third. So the alpha for our second test is not really 0.05 anymore: our expectations have changed. In the figure: a t-test showing a <5% chance that the first difference was a fluke affects the likelihood of finding a difference in the other pair.

Multiple tests
Solution: make alpha your overall "experiment-wise" error rate. So how do we adjust our alphas to get around the problem of non-independence?

Multiple tests
Solution: make alpha your overall "experiment-wise" error rate, e.g. a simple Bonferroni correction: divide alpha by the number of tests. With three pairwise tests, each one is judged at alpha/3 = 0.05/3 ≈ 0.0167. This simplest solution is the Bonferroni test. The regression and multiple comparison summary sheet gives some variations on this theme, but they all have the same basic purpose: adjust our expectations for non-independent tests.
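A minimal sketch of the Bonferroni adjustment (Python; the three p-values are hypothetical placeholders, and statsmodels' multipletests is shown as one common implementation, not as the slides' own method):

```python
# Sketch: Bonferroni correction of three pairwise p-values (hypothetical values).
from statsmodels.stats.multitest import multipletests

pvals = [0.012, 0.350, 0.018]          # 1 vs 2, 1 vs 3, 2 vs 3 (made-up)
alpha = 0.05

# Manual version: compare each p-value to alpha / number of tests.
per_test_alpha = alpha / len(pvals)     # 0.05 / 3 ~= 0.0167
print([p < per_test_alpha for p in pvals])

# Equivalent, using statsmodels.
reject, p_adj, _, _ = multipletests(pvals, alpha=alpha, method="bonferroni")
print(reject, p_adj)                    # p_adj = min(p * 3, 1) for each test
```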

Multiple tests
Tukey post-hoc testing does this for you: it uses the q-distribution and does all the pairwise comparisons. For the births-per-km² example:
1 vs 2: p < 0.05
1 vs 3: p > 0.05
2 vs 3: p < 0.05
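A sketch of a Tukey HSD test in Python (statsmodels' pairwise_tukeyhsd, run on made-up plot-level birth rates; the numbers and group names are assumptions for illustration only):

```python
# Sketch: Tukey HSD on hypothetical births-per-km^2 data (3 treatments, 10 plots each).
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(7)
groups = np.repeat(["no management", "selective logging", "replanting"], 10)
values = np.concatenate([rng.normal(2.2, 0.3, 10),    # made-up plot-level birth rates
                         rng.normal(1.5, 0.3, 10),
                         rng.normal(2.1, 0.3, 10)])

result = pairwise_tukeyhsd(endog=values, groups=groups, alpha=0.05)
print(result)   # one row per pairwise comparison, already adjusted for multiplicity
```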

Orthogonal contrasts
Orthogonal = perpendicular = independent. Contrast = comparison.
Example: we compare the growth of three types of plants: legumes, graminoids, and asters. These two contrasts are orthogonal:
1. Legumes vs. non-legumes (graminoids, asters)
2. Graminoids vs. asters
"Independent" is also called "orthogonal". The reasons for this will be clear to students who have taken linear algebra; otherwise, we'll just say that orthogonal means perpendicular, and two axes that are perpendicular to each other are uncorrelated, i.e. independent. Here's a stats example completely analogous to the card example…

How do we figure out if our contrasts (= comparisons) are orthogonal? We could try to reason it out, or use this easy trick…
Trick for determining if contrasts are orthogonal:
1. In the first contrast, label all treatments in one group with "+" and all treatments in the other group with "-" (it doesn't matter which way round).

              Legumes   Graminoids   Asters
                 +          -           -

Trick for determining if contrasts are orthogonal:
1. In the first contrast, label all treatments in one group with "+" and all treatments in the other group with "-" (it doesn't matter which way round).
2. In each group composed of t treatments, put 1/t as the coefficient. If a treatment is not in the contrast, give it the value 0.

              Legumes   Graminoids   Asters
Contrast 1:     +1         -1/2        -1/2

Trick for determining if contrasts are orthogonal:
1. In the first contrast, label all treatments in one group with "+" and all treatments in the other group with "-" (it doesn't matter which way round).
2. In each group composed of t treatments, put 1/t as the coefficient. If a treatment is not in the contrast, give it the value 0.
3. Repeat for all other contrasts.

              Legumes   Graminoids   Asters
Contrast 1:     +1         -1/2        -1/2
Contrast 2:      0         +1          -1

Trick for determining if contrasts are orthogonal:
4. Multiply each column, then sum these products.

              Legumes   Graminoids   Asters
Contrast 1:     +1         -1/2        -1/2
Contrast 2:      0         +1          -1
Products:        0         -1/2        +1/2

Sum of products = 0

Trick for determining if contrasts are orthogonal:
4. Multiply each column, then sum these products.
5. If this sum = 0, then the contrasts are orthogonal! In other words, if the sum is zero, the tests are orthogonal to each other.

              Legumes   Graminoids   Asters
Contrast 1:     +1         -1/2        -1/2
Contrast 2:      0         +1          -1
Products:        0         -1/2        +1/2

Sum of products = 0
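The sum-of-products check is just a dot product of the two coefficient vectors; a minimal sketch in Python/NumPy (coefficients exactly as in the table above):

```python
# Sketch: orthogonality check as a dot product of contrast coefficient vectors.
import numpy as np

# Coefficients in the order (legumes, graminoids, asters), as in the table above.
contrast_1 = np.array([+1.0, -0.5, -0.5])   # legumes vs. non-legumes
contrast_2 = np.array([ 0.0, +1.0, -1.0])   # graminoids vs. asters

print(np.dot(contrast_1, contrast_2))       # 0.0  -> the contrasts are orthogonal
```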

What about these contrasts?
1. Monocots (graminoids) vs. dicots (legumes and asters)
2. Legumes vs. non-legumes
Group work! Try these two contrasts… are they orthogonal to each other? Hint for those who are lost: non-legumes are graminoids and asters.

Important! You need to assess orthogonality for each pairwise combination of contrasts. So with 4 contrasts: contrasts 1 and 2, 1 and 3, 1 and 4, 2 and 3, 2 and 4, 3 and 4.
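A sketch that runs this pairwise check automatically (Python/NumPy; the four coefficient vectors are hypothetical placeholders, not contrasts from the slides):

```python
# Sketch: check orthogonality for every pair among several contrasts.
from itertools import combinations
import numpy as np

contrasts = {                      # hypothetical coefficient vectors for 4 contrasts
    1: np.array([+1.0, -1/3, -1/3, -1/3]),
    2: np.array([ 0.0, +1.0, -0.5, -0.5]),
    3: np.array([ 0.0,  0.0, +1.0, -1.0]),
    4: np.array([+0.5, +0.5, -0.5, -0.5]),
}

for (i, ci), (j, cj) in combinations(contrasts.items(), 2):
    ok = np.isclose(np.dot(ci, cj), 0.0)
    print(f"contrasts {i} and {j}: {'orthogonal' if ok else 'NOT orthogonal'}")
```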

How do you program contrasts in JMP (etc.)?
Orthogonal contrasts are just ways to divide up the treatment effect (treatment SS). You can perfectly divide up the k levels of a treatment into k-1 contrasts (e.g. the example we did a few minutes ago).

How do you program contrasts in JMP (etc.)?
Sound hard? Not really. Just code the treatments that you are averaging over in the same way. For example, recode the plant types for the "legumes vs. non-legumes" contrast:

             Normal treatment code   Legumes vs. non-legumes code
Legume               1                          1
Graminoid            2                          2
Aster                3                          2

Then run your ANOVA again with your new "treatment". JMP will give you a new SS. What about df? Simple rule of thumb: each contrast costs 1 df (JMP will tell you this anyway). Then look at the F test: is it significant?

             Full model   Legumes vs. non-legumes contrast
SS_treat        122              67
df_treat          2               1
MS_treat         60
MS_error         10   (from the full model)
df_error         20

F(1,20) = (67 / 1) / 10 = 6.7

Example write-up: "There was a significant treatment effect (F…). About 53% of the variation between treatments was due to differences between legumes and non-legumes (F(1,20) = 6.7)."

Q: What is the SS for the graminoid vs. aster contrast? 122 - 67 = 55.
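To make the arithmetic concrete, here is a minimal sketch of computing a single-contrast F outside JMP (Python, with made-up plant-growth numbers; the dataset and variable names are assumptions, not the slides' data). It uses the standard contrast sum-of-squares formula, which for a balanced design matches the "recode and re-run the ANOVA" trick above.

```python
# Sketch: computing a single-df contrast F by hand (hypothetical, balanced plant data).
# Contrast SS = (sum c_i * mean_i)^2 / sum(c_i^2 / n_i); F = (SS_contrast / 1) / MS_error.
import numpy as np

growth = {   # made-up growth measurements, 7 plants per type
    "legume":    [8.1, 7.9, 8.4, 8.0, 7.6, 8.3, 8.2],
    "graminoid": [6.2, 6.5, 6.0, 6.4, 6.1, 6.6, 6.3],
    "aster":     [6.8, 7.1, 6.9, 6.5, 7.0, 6.7, 6.6],
}
coeffs = {"legume": 1.0, "graminoid": -0.5, "aster": -0.5}   # legumes vs. non-legumes

means = {g: np.mean(v) for g, v in growth.items()}
n = {g: len(v) for g, v in growth.items()}

# Contrast sum of squares (1 df).
L = sum(coeffs[g] * means[g] for g in growth)
ss_contrast = L**2 / sum(coeffs[g]**2 / n[g] for g in growth)

# Error mean square from the full one-way model.
ss_error = sum(np.sum((np.array(v) - means[g])**2) for g, v in growth.items())
df_error = sum(n.values()) - len(growth)
ms_error = ss_error / df_error

F = ss_contrast / ms_error          # numerator df = 1
print(f"SS_contrast = {ss_contrast:.1f}, F(1,{df_error}) = {F:.2f}")
```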

Even different statistical tests may not be independent!
Example: we examined the effects of fertilizer on the growth of dandelions in a pasture using an ANOVA, then repeated the test for the growth of grass in the same plots. Problem? If the overall data show one trend, then subsets of those data are likely to show it as well.

Multiple tests
Labelling convention: treatments with a common letter are not significantly different. In the figure, the bars (births per km²) are labelled a, b, and a,b: pairs with no letter in common are significantly different, while pairs sharing a letter are not.

The data on the next graph are from a study of forestry effects on Cameroonian butterflies (Diane's data).

Challenge: The first person to give Spencer the correct datasheet (all combinations identified as sig or ns) can hand in the lab of their choice 2 days late without penalty! Reminder: We need to take the link to the answer sheet off the website!