Multiple comparisons - multiple pairwise tests - orthogonal contrasts - independent tests - labelling conventions.



Card example number 1

Multiple tests Problem: Because we examine the same data in multiple comparisons, the result of the first comparison affects our expectation of the next comparison.

Multiple tests A significant ANOVA tells us that at least one mean differs, but not which one(s). One approach: t-tests of all pairwise combinations of treatments (some comparisons come out significant, others not).

Multiple tests Each t-test says: <5% chance that this difference was a fluke. But because the comparisons share the same data, the outcome of one test affects the likelihood of finding a difference in another pair!

Multiple tests Solution: Make alpha your overall "experiment-wise" error rate, so that the chance of any false positive across the whole family of pairwise tests is held at alpha, rather than applying alpha separately to each t-test.

Multiple tests Solution: Make alpha your overall "experiment-wise" error rate. e.g. simple Bonferroni: divide alpha by the number of tests. With three pairwise tests and alpha = 0.05, each test is run at alpha / 3 ≈ 0.017.
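The Bonferroni logic above can be sketched in a few lines (a minimal illustration; the group labels are hypothetical):

```python
from itertools import combinations

groups = ["legumes", "graminoids", "asters"]
pairs = list(combinations(groups, 2))   # the 3 possible pairwise tests
m = len(pairs)
alpha = 0.05

# If each test is run at alpha, the chance of at least one
# false positive across the family of tests is inflated:
familywise = 1 - (1 - alpha) ** m       # about 0.14, not 0.05

# Simple Bonferroni correction: run each test at alpha / m instead
per_test_alpha = alpha / m              # about 0.017
```

Each pairwise t-test is then judged against the stricter per-test alpha, which caps the experiment-wise error rate at (roughly) 0.05.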

Card example 2

Orthogonal contrasts Orthogonal = perpendicular = independent. Contrast = comparison. Example: we compare the growth of three types of plants: legumes, graminoids, and asters. These two contrasts are orthogonal: 1. Legumes vs. non-legumes (graminoids, asters). 2. Graminoids vs. asters.

Trick for determining if contrasts are orthogonal:

1. In the first contrast, label all treatments in one group with "+" and all treatments in the other group with "-" (it doesn't matter which way round).
2. In each group composed of t treatments, use 1/t as the size of the coefficient. If a treatment is not in the contrast, give it the value 0.
3. Repeat for all other contrasts.
4. Multiply the coefficients down each column, then sum these products.
5. If this sum = 0, then the contrasts were orthogonal!

                                      Legumes   Graminoids   Asters
Contrast 1 (legumes vs. non-legumes):   +1        -1/2        -1/2
Contrast 2 (graminoids vs. asters):      0        +1          -1
Products:                                0        -1/2        +1/2

Sum of products = 0, so the two contrasts are orthogonal.
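The sum-of-products check is simply a dot product of the two coefficient vectors, which takes one line (a minimal sketch using the legume/graminoid/aster contrasts):

```python
# Coefficients in treatment order: legumes, graminoids, asters
c1 = [1, -1/2, -1/2]   # contrast 1: legumes vs. non-legumes
c2 = [0, 1, -1]        # contrast 2: graminoids vs. asters

# Multiply each column, then sum the products (a dot product).
sum_of_products = sum(a * b for a, b in zip(c1, c2))
# sum_of_products is 0, so the contrasts are orthogonal
```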

What about these contrasts? 1. Monocots (graminoids) vs. dicots (legumes and asters). 2. Legumes vs. non-legumes

Important! You need to assess orthogonality in each pairwise combination of contrasts. So with 4 contrasts there are 6 pairs to check: contrasts 1 & 2, 1 & 3, 1 & 4, 2 & 3, 2 & 4, and 3 & 4.
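Checking every pairwise combination can be automated with the same dot-product test (a sketch; the coefficient vectors here are hypothetical, for four treatments):

```python
from itertools import combinations

# Hypothetical contrast coefficients over four treatments.
# Every pair of contrasts must have a sum of products of zero.
contrasts = {
    "C1": [1, -1/3, -1/3, -1/3],
    "C2": [0, 1, -1/2, -1/2],
    "C3": [0, 0, 1, -1],
}

for (name_i, ci), (name_j, cj) in combinations(contrasts.items(), 2):
    dot = sum(a * b for a, b in zip(ci, cj))
    print(name_i, "&", name_j, "-> sum of products =", dot)
```

If any pair gives a non-zero sum, the set of contrasts is not mutually orthogonal, even if some individual pairs are.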

How do you program contrasts in JMP (etc.)? The treatment sum of squares is partitioned into a component for Contrast 1 and a component for Contrast 2.

How do you program contrasts in JMP (etc.)? Code the treatments in two ways:

Treatment    Normal coding    Legumes vs. non-legumes
Legume       1                1
Graminoid    2                2
Aster        3                2

Df treat = 2 (normal coding) or 1 (contrast); MStreat = 60; MSerror = 10 and Df error = 20, both taken from the full model.

F 1,20 = (67/1) / 10 = 6.7

"There was a significant treatment effect (F…). About 53% of the variation between treatments was due to differences between legumes and non-legumes (F 1,20 = 6.7)."
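The contrast F statistic on this slide can be reproduced from the reported quantities (a minimal sketch; the numbers are taken from the slide):

```python
# Single-degree-of-freedom contrast:
#   F = (SS_contrast / df_contrast) / MS_error,
# where MS_error and its df come from the full model.
ss_contrast = 67    # legumes vs. non-legumes
df_contrast = 1
ms_error = 10       # from the full ANOVA model (df_error = 20)

f_stat = (ss_contrast / df_contrast) / ms_error   # 6.7, on 1 and 20 df
```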

Even different statistical tests may not be independent! Example. We examined effects of fertilizer on growth of dandelions in a pasture using an ANOVA. We then repeated the test for growth of grass in the same plots. Problem?

Multiple tests Labelling convention: treatments with a common letter are not significantly different. e.g. with three treatments labelled a, ab, and b: a vs. ab is not significant, a vs. b is significant, and ab vs. b is not significant.