SPSS Problem (Part 1)

Presentation transcript:

SPSS Problem (Part 1)

SPSS Problem (Part 2) Due Wed 12.1

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Multiple t-tests: Good if you have just a couple of planned comparisons. Do a normal t-test, but use the other groups to help estimate your error term; this helps increase your df.

Hyp 1: Juniors and Seniors will have different levels of happiness. Hyp 2: Seniors and Freshmen will have different levels of happiness.

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Linear Contrasts: You think that Freshmen and Seniors will have different levels of happiness than Sophomores and Juniors.

Linear Contrasts Allows for the comparison of one group or set of groups with another group or set of groups

Linear Contrasts a = weight given to a group
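The contrast equation itself does not survive in this transcript; the standard formula these slides refer to, consistent with the worked numbers that follow, is a weighted sum of the group means:

```latex
L = \sum_{j=1}^{k} a_j \bar{X}_j , \qquad \text{with } \sum_{j=1}^{k} a_j = 0
```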

Linear Contrasts a1 = 0, a2 = 0, a3 = 1, a4 = -1 L = -23

SS Contrast: You can use the linear contrast to compute an SS contrast. SS contrast is like SS between. Because SS contrast has only 1 df, it also equals its MS, so it is like MS between as well.

SS Contrast
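The formula on this slide is likewise not reproduced in the transcript; for the equal-n examples used here it is:

```latex
SS_{\text{contrast}} = \frac{n L^{2}}{\sum_{j} a_j^{2}}
```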

SS Contrasts a1 = .5, a2 = -.5, a3 = -.5, a4 = .5 L = 80.5 – 67 = 13.5

SS Contrasts: a1 = .5, a2 = -.5, a3 = -.5, a4 = .5; L = 80.5 – 67 = 13.5; Σa² = .5² + (-.5)² + (-.5)² + .5² = 1

SS Contrasts a1 = 1, a2 = -1, a3 = -1, a4 = 1 L = 161 – 134 = 27

SS Contrasts: a1 = 1, a2 = -1, a3 = -1, a4 = 1; L = 161 – 134 = 27; n = 6; Σa² = 1² + (-1)² + (-1)² + 1² = 4
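Completing the arithmetic these two slides set up (n = 6), both weightings give exactly the same sum of squares, because the second set of weights is just the first set doubled:

```latex
SS_{\text{contrast}} = \frac{6(13.5)^{2}}{1} = \frac{6(27)^{2}}{4} = 1093.5
```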

F Test. Note: MS contrast = SS contrast, because the contrast has only 1 df.
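The F ratio itself (its equation is also not reproduced in this transcript) is the contrast mean square over the within-groups mean square:

```latex
F(1, df_{\text{within}}) = \frac{MS_{\text{contrast}}}{MS_{\text{within}}} = \frac{SS_{\text{contrast}}}{MS_{\text{within}}}
```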

F Test: Fresh & Senior vs. Sophomore & Junior

F Test Fresh & Senior vs. Sophomore & Junior F crit (1, 20) = 4.35
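As a rough cross-check of this contrast, here is a minimal Python sketch of the same arithmetic (not the course's SPSS procedure). The group means 76, 72, 62, and 85 are the values implied by the L figures on the earlier slides; ms_within is a hypothetical placeholder that would come from the ANOVA table.

```python
import numpy as np
from scipy import stats

def contrast_f_test(means, weights, n, ms_within, df_within):
    """F test for a linear contrast with equal group sizes n."""
    means = np.asarray(means, dtype=float)
    weights = np.asarray(weights, dtype=float)
    L = np.sum(weights * means)                    # contrast value
    ss_contrast = n * L**2 / np.sum(weights**2)    # SS (and MS, since df = 1)
    F = ss_contrast / ms_within
    p = stats.f.sf(F, 1, df_within)                # upper-tail p value
    return L, ss_contrast, F, p

# Fresh & Senior vs. Sophomore & Junior, weights from the slide
means = [76, 72, 62, 85]                 # implied by the slides' L values
L, ss, F, p = contrast_f_test(means, [0.5, -0.5, -0.5, 0.5],
                              n=6, ms_within=100.0,   # ms_within is hypothetical
                              df_within=20)
print(L, ss, F, p)                       # L = 13.5, SS = 1093.5
```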

SPSS

Make contrasts to determine: 1) if seniors are happier than everyone else; 2) if juniors and sophomores have different levels of happiness.

1) If seniors are happier than everyone else: a1 = -1, a2 = -1, a3 = -1, a4 = 3; L = 45; F crit (1, 20) = 4.35

2) If juniors and sophomores have different levels of happiness: a1 = 0, a2 = -1, a3 = 1, a4 = 0; L = -10; F crit (1, 20) = 4.35
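Applying the SS contrast formula to these two practice contrasts (n = 6, as in the earlier example) gives the sums of squares; dividing each by MS within from the SPSS output yields the F that is compared with 4.35:

```latex
SS_{1} = \frac{6(45)^{2}}{(-1)^{2}+(-1)^{2}+(-1)^{2}+3^{2}} = \frac{12150}{12} = 1012.5 , \qquad
SS_{2} = \frac{6(-10)^{2}}{0^{2}+(-1)^{2}+1^{2}+0^{2}} = \frac{600}{2} = 300
```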

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Contrasts: Some contrasts are independent: Freshman vs. Sophomore (1, -1, 0, 0) and Junior vs. Senior (0, 0, 1, -1). Some are not: Freshman vs. Sophomore, Junior, & Senior (3, -1, -1, -1) and Freshman vs. Sophomore & Junior (2, -1, -1, 0).

Orthogonal Contrasts If you have a complete set of orthogonal contrasts The sum of SScontrast = SSbetween

Orthogonal Contrasts: 1) ∑ aj = 0 (already talked about). 2) ∑ aj bj = 0; this ensures the contrasts are independent of one another. 3) Number of comparisons = K - 1; this ensures enough comparisons are used.

Orthogonal Contrasts: checking ∑ aj bj = 0 for Fresh, Sophomore, Junior, Senior with (3, -1, -1, -1) and (2, -1, -1, 0): (3×2) + (-1×-1) + (-1×-1) + (-1×0) = 8, which is not 0, so these two contrasts are not orthogonal.

Orthogonal Contrasts: ∑ aj bj = 0 for Fresh, Sophomore, Junior, Senior with (-1, 1, 0, 0) & (0, 0, -1, 1): (-1×0) + (1×0) + (0×-1) + (0×1) = 0, so these two are orthogonal. *Note: this is not a complete set of contrasts (rule 3 requires K - 1 = 3 contrasts for four groups)

Orthogonal Contrasts: Let's go to five groups. What would a complete set of contrasts be that satisfies the earlier rules?

Orthogonal Contrasts General rule There is more than one right answer

Orthogonal Contrasts Fresh, Soph, Jun, Sen, Grad Fresh & Soph vs. Jun, Sen, & Grad Fresh vs. Soph Jun & Sen vs. Grad Jun vs. Sen

Orthogonal Contrasts: Fresh, Soph, Jun, Sen, Grad. Fresh & Soph vs. Jun, Sen, & Grad; Fresh vs. Soph; Jun & Sen vs. Grad; Jun vs. Sen. Two limbs are created. Elements on different limbs cannot be combined with each other; elements on the same limb can be combined with each other (making new limbs).


Orthogonal Contrasts Fresh, Soph, Jun, Sen, Grad Fresh & Soph vs. Jun, Sen, & Grad Fresh vs. Soph Jun & Sen vs. Grad Jun vs. Sen 3, 3, -2, -2, -2

Orthogonal Contrasts Fresh, Soph, Jun, Sen, Grad Fresh & Soph vs. Jun, Sen, & Grad Fresh vs. Soph Jun & Sen vs. Grad Jun vs. Sen 3, 3, -2, -2, -2 1, -1, 0, 0, 0

Orthogonal Contrasts Fresh, Soph, Jun, Sen, Grad Fresh & Soph vs. Jun, Sen, & Grad Fresh vs. Soph Jun & Sen vs. Grad Jun vs. Sen 3, 3, -2, -2, -2 1, -1, 0, 0, 0 0, 0, 1, 1, -2

Orthogonal Contrasts Fresh, Soph, Jun, Sen, Grad Fresh & Soph vs. Jun, Sen, & Grad Fresh vs. Soph Jun & Sen vs. Grad Jun vs. Sen 3, 3, -2, -2, -2 1, -1, 0, 0, 0 0, 0, 1, 1, -2 0, 0, 1, -1, 0

Orthogonal Contrasts 1) ∑ aj = 0 2) ∑ aj bj = 0 3) Number of comparisons = K -1 3, 3, -2, -2, -2 1, -1, 0, 0, 0 0, 0, 1, 1, -2 0, 0, 1, -1, 0

Orthogonal Contrasts 1) ∑ aj = 0 2) ∑ aj bj = 0 3) Number of comparisons = K -1 3, 3, -2, -2, -2 = 0 1, -1, 0, 0, 0 = 0 0, 0, 1, 1, -2 = 0 0, 0, 1, -1, 0 = 0

Orthogonal Contrasts: A) 3, 3, -2, -2, -2; B) 1, -1, 0, 0, 0; C) 0, 0, 1, 1, -2; D) 0, 0, 1, -1, 0. A·B = 0; A·C = 0; A·D = 0; B·C = 0; B·D = 0; C·D = 0
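A quick way to verify all three rules for this five-group set; a minimal NumPy sketch (not part of the original slides):

```python
import numpy as np
from itertools import combinations

def is_complete_orthogonal_set(contrasts, k):
    """Rule 1: each row sums to 0. Rule 2: every pair has a zero dot product.
    Rule 3: there are exactly k - 1 contrasts."""
    c = np.asarray(contrasts, dtype=float)
    sums_ok = np.allclose(c.sum(axis=1), 0)
    pairs_ok = all(np.isclose(np.dot(a, b), 0) for a, b in combinations(c, 2))
    return sums_ok and pairs_ok and len(c) == k - 1

contrasts = [[3,  3, -2, -2, -2],
             [1, -1,  0,  0,  0],
             [0,  0,  1,  1, -2],
             [0,  0,  1, -1,  0]]
print(is_complete_orthogonal_set(contrasts, k=5))   # True
```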

Orthogonal Contrasts If you have a complete set of orthogonal contrasts The sum of SScontrast = SSbetween

Compute a complete set of orthogonal contrasts for the following data. Test each of the contrasts you create for significance

Orthogonal Contrasts Fresh, Soph, Jun, Sen Fresh & Soph vs. Jun & Sen Fresh vs. Soph Jun vs. Sen 1, 1, -1, -1 1, -1, 0, 0 0, 0, 1, -1

1, 1, -1, -1 L = 1 SScontrast = 1.5; F = .014 1, -1, 0, 0 L = 4 SScontrast = 48; F = .48 0, 0, 1, -1 L = -23 SScontrast = 1587; F = 15.72* F crit (1, 20) = 4.35

SScontrast = 1.5 SScontrast = 48 SScontrast = 1587 1.5 + 48 + 1587 = 1636.50 F crit (1, 20) = 4.35
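Each of those SS values comes straight from the contrast formula, and their total reproduces the figure on this slide, which for a complete orthogonal set equals SS between:

```latex
\frac{6(1)^{2}}{4} + \frac{6(4)^{2}}{2} + \frac{6(-23)^{2}}{2} = 1.5 + 48 + 1587 = 1636.5
```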

Orthogonal Contrasts: Why use them? People like that they sum together; people like that they are independent; history. I would rather have contrasts based on reason than use them simply because they are orthogonal!

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Trend Analysis The logic of trend analysis is exactly the same logic we just talked about with contrasts!

Example: You collect axon firing rate scores from rats in one of five conditions. Condition 1 = 10 mm of Zeta inhibitor; Condition 2 = 20 mm of Zeta inhibitor; Condition 3 = 30 mm of Zeta inhibitor; Condition 4 = 40 mm of Zeta inhibitor; Condition 5 = 50 mm of Zeta inhibitor. You think Zeta inhibitor reduces the number of times an axon fires – are you right?

What does this tell you?

Trend Analysis Contrast Codes! -2 -1 0 1 2

Trend Analysis

a1 = -2, a2 = -1, a3 = 0, a4 = 1, a5 = 2 L = 7.2 F crit (1, 20) = 4.35
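With five groups and F crit (1, 20), the within-groups df of 20 implies n = 5 rats per condition (an inference from the df, not a value stated on the slide), so the linear-trend sum of squares is:

```latex
SS_{\text{linear}} = \frac{5(7.2)^{2}}{(-2)^{2}+(-1)^{2}+0^{2}+1^{2}+2^{2}} = \frac{259.2}{10} = 25.92
```

Dividing this by MS within from the output gives the F that is compared with 4.35.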

Note

Example You place subjects into one of five different conditions of anxiety. 1) Low anxiety 2) Low-Moderate anxiety 3) Moderate anxiety 4) High-Moderate anxiety 5) High anxiety You think subjects will perform best on a test at level 3 (and will do worse at both lower and higher levels of anxiety)

What does this tell you?

-2 1 2 1 -2 Contrast Codes!

Trend Analysis Create contrast codes that will examine a quadratic trend. -2, 1, 2, 1, -2

a1 = -2, a2 = 1, a3 = 2, a4 = 1, a5 = -2 L = 10 F crit (1, 20) = 4.35

Trend Analysis How do you know which numbers to use? Page 742
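For reference, the orthogonal polynomial coefficients tabled in most ANOVA texts (the page cited on this slide) for three to five groups are: k = 3: linear -1, 0, 1; quadratic 1, -2, 1. k = 4: linear -3, -1, 1, 3; quadratic 1, -1, -1, 1; cubic -1, 3, -3, 1. k = 5: linear -2, -1, 0, 1, 2; quadratic 2, -1, -2, -1, 2; cubic -1, 2, 0, -2, 1. The quadratic codes used above (-2, 1, 2, 1, -2) are the tabled k = 5 quadratic set with the signs flipped, which changes only the direction of L, not the SS or the F.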

Linear (NO BENDS)

Quadratic (ONE BEND)

Cubic (TWO BENDS)

Practice: You believe a balance between school and one's social life is the key to happiness. Therefore you hypothesize that people who focus too much on school (i.e., people who get good grades) and people who focus too much on their social life (i.e., people who get bad grades) will be more depressed. You collect data from 25 subjects: 5 subjects = F, 5 subjects = D, 5 subjects = C, 5 subjects = B, 5 subjects = A. You measured their depression.

Practice Below are your findings – interpret!

Trend Analysis Create contrast codes that will examine a quadratic trend. -2, 1, 2, 1, -2

a1 = -2, a2 = 1, a3 = 2, a4 = 1, a5 = -2 L = -12.8 F crit (1, 20) = 4.35
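With 5 subjects per grade, the quadratic-trend sum of squares for this practice problem is:

```latex
SS_{\text{quadratic}} = \frac{5(-12.8)^{2}}{10} = \frac{819.2}{10} = 81.92
```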

Remember Freshman, Sophomore, Junior, Senior Measure Happiness (1-100)

ANOVA: The traditional F test just tells you that not all the means are equal. It does not tell you which means are different from which other means.

Why not do t-tests for all pairs? Fresh vs. Sophomore, Fresh vs. Junior, Fresh vs. Senior, Sophomore vs. Junior, Sophomore vs. Senior, Junior vs. Senior

Problem: What if there were more than four groups? The probability of a Type I error increases. Maximum familywise value = (number of comparisons)(.05); here, 6(.05) = .30.
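The .30 figure is the Bonferroni upper bound; assuming the six tests were independent, the exact familywise error rate would be slightly lower:

```latex
\alpha_{\text{familywise}} \le c\,\alpha = 6(.05) = .30 , \qquad 1 - (1 - .05)^{6} \approx .265
```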

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Bonferroni t: Controls for Type I error by using a more conservative alpha.

Do t-tests for all pairs Fresh vs. Sophomore Fresh vs. Junior Fresh vs. Senior Sophomore vs. Junior Sophomore vs. Senior Junior vs. Senior

Maximum probability of a Type I error: 6(.05) = .30. But what if we use alpha = .05/C? .00833 = .05/6, and 6(.00833) = .05.

t-table: Compute the t-value the exact same way. Problem: a normal t table does not have these p values. Test for significance using the Bonferroni t table (page 751).
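If you do not have the Bonferroni table handy, essentially the same critical value can be computed in Python with SciPy (a sketch, not part of the original homework):

```python
from scipy import stats

alpha, n_comparisons, df = 0.05, 6, 20
alpha_per_test = alpha / n_comparisons            # .05 / 6 = .00833
t_crit = stats.t.ppf(1 - alpha_per_test / 2, df)  # two-tailed critical t
print(round(t_crit, 2))                           # approximately 2.93
```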

Practice

Practice: Fresh vs. Sophomore t = .69; Fresh vs. Junior t = 2.41; Fresh vs. Senior t = -1.55; Sophomore vs. Junior t = 1.72; Sophomore vs. Senior t = -2.24; Junior vs. Senior t = -3.97*. Critical t (6 comparisons, df = 20) = 2.93

Bonferroni t Problems: It can seem silly (what should you use as the value of C?), and it increases the chances of a Type II error!

Fisher Least Significant Difference: Simple. 1) Do a normal omnibus ANOVA. 2) If it is significant, you know that there is a difference somewhere! 3) Do individual t-tests to determine where the significance is located.
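A rough Python sketch of those steps (hypothetical names; as in the LSD procedure, the follow-up t-tests use MS within from the omnibus ANOVA as the pooled error term):

```python
import numpy as np
from itertools import combinations
from scipy import stats

def fisher_lsd(groups, alpha=0.05):
    """Omnibus ANOVA first; pairwise t-tests only if it is significant."""
    F, p = stats.f_oneway(*groups)
    if p >= alpha:
        return F, p, []                        # no "permission" to go further
    k = len(groups)
    df_within = sum(len(g) for g in groups) - k
    ms_within = sum((len(g) - 1) * np.var(g, ddof=1) for g in groups) / df_within
    results = []
    for i, j in combinations(range(k), 2):
        diff = np.mean(groups[i]) - np.mean(groups[j])
        se = np.sqrt(ms_within * (1 / len(groups[i]) + 1 / len(groups[j])))
        t = diff / se
        results.append((i, j, t, 2 * stats.t.sf(abs(t), df_within)))
    return F, p, results
```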

Fisher Least Significant Difference: Problem. You may have an ANOVA that is not significant and still have results that occur in the manner you predicted! If you used this method you would not have "permission" to look for those effects.

Remember

Chapter 12 A Priori and Post Hoc Comparisons: Multiple t-tests, Linear Contrasts, Orthogonal Contrasts, Trend Analysis, Bonferroni t, Fisher Least Significant Difference, Studentized Range Statistic, Dunnett's Test

Studentized Range Statistic Which groups would you likely select to determine if they are different?

Studentized Range Statistic: Which groups would you likely select to determine if they are different? This statistic controls for Type I error if (after looking at the data) you select the two means that are most different.

Studentized Range Statistic Easy! 1) Do a normal t-test

Studentized Range Statistic Easy! 2) Convert the t to a q
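The conversion formula itself is not reproduced in the transcript; it is the standard relationship between t and the studentized range for a comparison of two means, and it reproduces the q value used on the later slide:

```latex
q = \sqrt{2}\, t , \qquad \text{e.g. } q = \sqrt{2}\,(-3.97) \approx -5.61
```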

Studentized Range Statistic 3) Critical value of q (note: this is a two-tailed test) Figure out df (same as t) Example = 20 Figure out r r = the number of groups

Studentized Range Statistic 3) Critical value of q (note: this is a two-tailed test) Figure out df (same as t) Example = 20 Figure out r r = the number of groups Example = 4

Studentized Range Statistic 3) Critical value of q Page 744 Example q critical = +/- 3.96
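The same tabled critical value can be reproduced in Python; SciPy 1.7 or later provides a studentized range distribution (a sketch, not part of the original slides):

```python
from scipy.stats import studentized_range

# arguments: cumulative probability, number of groups r, error df
q_crit = studentized_range.ppf(0.95, 4, 20)
print(round(q_crit, 2))    # approximately 3.96
```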

Studentized Range Statistic 4) Compare q obs and q critical same way as t values q = -5.61 q critical = +/– 3.96

Practice: You collect axon firing rate scores from rats in one of five conditions. Condition 1 = 10 mm of Zeta inhibitor; Condition 2 = 20 mm of Zeta inhibitor; Condition 3 = 30 mm of Zeta inhibitor; Condition 4 = 40 mm of Zeta inhibitor; Condition 5 = 50 mm of Zeta inhibitor. You are simply interested in determining if any two groups are different from each other – use the Studentized Range Statistic.

Studentized Range Statistic Easy! 1) Do a normal t-test

Studentized Range Statistic Easy! 2) Convert the t to a q

Studentized Range Statistic: 3) Critical value of q (note: this is a two-tailed test). Figure out df (same as t); example = 20. Figure out r; r = the number of groups; example = 5.

Studentized Range Statistic 3) Critical value of q Page 744 Example q critical = +/- 4.23

Studentized Range Statistic 4) Compare q obs and q critical same way as t values q = -4.34 q critical = +/– 4.23

Dunnett’s Test Used when there are several experimental groups and one control group (or one reference group) Example: Effect of psychotherapy on happiness Group 1) Psychoanalytic Group 2) Humanistic Group 3) Behaviorism Group 4) Control (no therapy)

Psyana vs. Control Human vs. Control Behavior vs. Control

Psyana vs. Control = 47.8 – 51.4 = -3.6; Human vs. Control = 50.8 – 51.4 = -0.6; Behavior vs. Control = 59 – 51.4 = 7.6

Psyana vs. Control = 47.8 – 51.4 = -3.6; Human vs. Control = 50.8 – 51.4 = -0.6; Behavior vs. Control = 59 – 51.4 = 7.6. How different do these means need to be in order to reach significance?

Dunnett’s t is on page 753 df = Within groups df / k = number of groups

Dunnett’s t is on page 753 df = 16 / k = 4

Dunnett’s t is on page 753 df = 16 / k = 4

Psyana vs. Control = 47.8 – 51.4 = -3.6; Human vs. Control = 50.8 – 51.4 = -0.6; Behavior vs. Control = 59 – 51.4 = 7.6*. How different do these means need to be in order to reach significance?
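The decision rule behind that asterisk is the standard Dunnett comparison (the formula is not shown on these slides): a difference from the control is significant when it exceeds a critical difference built from the tabled t_d and the pooled error term,

```latex
\left| \bar{X}_i - \bar{X}_{\text{control}} \right| \;\ge\; t_d(k, df_{\text{within}}) \sqrt{\frac{2\,MS_{\text{within}}}{n}}
```

where n is the (equal) number of subjects per group.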

Practice: As a graduate student, you wonder which undergraduate students (freshmen, sophomores, etc.) have different levels of happiness than you.

Dunnett’s t is on page 753 df = 25 / k = 5

Fresh vs. Grad = -17.5* Soph vs. Grad = -21.5* Jun vs. Grad = -31.5* Senior vs. Grad = -8.5