Analysis of Variance: Some Final Issues
Topics: Degrees of Freedom; Familywise Error Rate (Bonferroni Adjustment); Magnitude of Effect: Eta Squared, Omega Squared; Review: Main Effects, Interaction Effects, and Simple Effects

Degrees of Freedom (df)
The number of "observations" free to vary. There will be one df associated with the effect you are reporting and one associated with error.
- df total = N - 1 (N observations)
- df groups = g - 1 (g = number of groups)
- df error = g(n - 1) (an easier way to compute it: df total - df groups)
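As a quick check, here is a minimal Python sketch of these df computations, using hypothetical group sizes (g = 4, n = 18 per group) chosen to match the F(3, 68) reported in the summary table below.

```python
# Hypothetical one-way design: g = 4 groups, n = 18 observations per group.
g, n = 4, 18
N = g * n

df_total = N - 1          # 71
df_groups = g - 1         # 3
df_error = g * (n - 1)    # 68 (equivalently: df_total - df_groups)

assert df_error == df_total - df_groups
print(df_total, df_groups, df_error)  # 71 3 68
```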

Summary Table: F(3, 68) = 35.8, p < .05

Familywise Error Rate
Suppose you are comparing 5 groups. A significant F only shows that not all groups are equal; it does not tell you which groups differ.
- With 5 groups, the number of pairwise comparisons is 10.
- The error rate operates at the level of each comparison.
- The familywise error rate increases with the number of comparisons.
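To see why the familywise rate grows, here is a minimal sketch (it assumes, for simplicity, that the comparisons are independent, which pairwise tests are not exactly):

```python
from math import comb

# Familywise error rate when every pairwise comparison is run at alpha = .05.
alpha = 0.05
groups = 5
c = comb(groups, 2)                 # 10 pairwise comparisons
familywise = 1 - (1 - alpha) ** c   # ~ .40 if the comparisons were independent
print(c, round(familywise, 2))      # 10 0.4
```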

In Case of Multiple Comparisons: Bonferroni Adjustment
The more tests we run, the more likely we are to make a Type I error; this is a good reason to hold down the number of tests.
- Run t tests between pairs of groups, as usual, but hold down the number of t tests.
- Reject if t exceeds the critical value in the Bonferroni table.
- The adjustment works by using a stricter significance level for each comparison.

Bonferroni t (cont.)
- The critical value of alpha for each test is set at .05/c, where c = the number of tests run (assuming a familywise alpha of .05).
- e.g., with 3 tests, each t must be significant at the .05/3 = .0167 level.
- With computer printout, just make sure the calculated probability is < .05/c.
- The necessary table is in the book.
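A minimal sketch of the procedure with made-up data (the three groups of scores below are hypothetical; scipy is used only for the ordinary pairwise t tests):

```python
from itertools import combinations
from scipy import stats

# Hypothetical scores for three groups.
groups = {
    "g1": [4, 5, 6, 5, 7],
    "g2": [6, 7, 8, 7, 9],
    "g3": [9, 10, 11, 10, 12],
}
pairs = list(combinations(groups, 2))
c = len(pairs)                      # number of tests run
alpha_per_test = 0.05 / c           # Bonferroni-adjusted criterion

for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}, "
          f"significant at .05/c: {p < alpha_per_test}")
```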

Magnitude of Effect
Why you need to compute magnitude-of-effect indices:
- The level of significance tells us nothing about the size of the effect.
- t and F values inflate with sample size.
- Significance statistics are hard to compare across other kinds of analysis.

Visual Explanation of Magnitude of Effect

Magnitude of Effect (cont.)
Eta squared (η²)
- Easy to calculate.
- Somewhat biased on the high side.
- The percent of variation in the data that can be attributed to treatment differences.
Omega squared (ω²)
- Much less biased than η².
- Not as intuitive.
- We adjust both the numerator and the denominator with MS error.
- Formula on the next slide (a sketch is also given below).
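Since the formula slide itself is not reproduced in this transcript, here is a sketch of the usual textbook formulas, written as small Python helpers that take their inputs straight from the one-way ANOVA summary table:

```python
# Standard one-way effect-size formulas computed from summary-table quantities.
def eta_squared(ss_treatment, ss_total):
    # Proportion of total variation attributable to treatment differences.
    return ss_treatment / ss_total

def omega_squared(ss_treatment, df_treatment, ms_error, ss_total):
    # Both numerator and denominator are adjusted with MS_error,
    # giving a less biased estimate than eta squared.
    return (ss_treatment - df_treatment * ms_error) / (ss_total + ms_error)
```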

η² and ω² for the example problem
- η² = .18: 18% of the variability in symptoms can be accounted for by treatment.
- ω² = .12: this is a less biased estimate; note that it is 33% smaller.

R² is also often used; it is based on the sums of squares. For experiments use omega squared; for correlations use R squared. The value of R² is greater than that of ω². Cohen classified effects as: small effect = .01, medium effect = .06, large effect = .15.

Factorial Analysis of Variance
- What is a factorial design?
- Main effects
- Interactions
- Simple effects
- Magnitude of effect

Main Effects
There are two factors in the analysis: Type of Rating and Knowing Why the item was recommended. If you examine the effect of Knowing Why (ignoring Type of Rating for the time being), you are looking at the main effect of Knowing Why. If you look at the effect of Type of Rating, ignoring Knowing Why, you are looking at the main effect of Type of Rating.

Simple Effects
If you restrict yourself to one level of one IV and look at the effect of the other IV within that level, you are looking at a simple effect. For example, the effect of Knowing Why at one level of Type of Rating (i.e., Interest) is the simple effect of Knowing Why on Interest ratings. This comparison is identical to a t test.
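A minimal sketch of that idea with hypothetical scores: restrict the data to the Interest level of Type of Rating and run an ordinary t test on Knowing Why.

```python
from scipy import stats

# Hypothetical Interest ratings for the two Knowing Why conditions.
interest_know_why  = [7, 8, 6, 9, 7]
interest_no_reason = [5, 6, 4, 6, 5]

# The simple effect of Knowing Why at the Interest level is just this t test.
t, p = stats.ttest_ind(interest_know_why, interest_no_reason)
print(f"t = {t:.2f}, p = {p:.4f}")
```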

Interactions
An interaction is present when the effect of one variable depends on the level of the other. Does Knowing Why affect Interest and Confidence ratings differentially?

Example Data with computation (cell means and standard deviations)

Plotting Results

Effects to Be Estimated
- Differences due to instructions: errors are more frequent in the condition without instructions.
- Differences due to gender: males appear higher than females.
- Interaction of video and gender: what is an interaction? Do instructions affect males and females equally?
(cont.)

Estimated Effects (cont.)
- Error: the average within-cell variance.
- Sums of squares and mean squares: an extension of the same concepts from the one-way ANOVA.

Calculations
- Total sum of squares
- Main effect sums of squares
(cont.)

Calculations (cont.)
- Interaction sum of squares: calculate SS cells and subtract SS V and SS G.
- SS error = SS total - SS cells (or, MS error can be found as the average of the cell variances).
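A sketch of these computational steps on a made-up balanced 2 x 2 data set (the cell scores below are hypothetical; the factor names follow the slides' video/instructions and gender example):

```python
import numpy as np

# Hypothetical balanced 2 x 2 design: instructions x gender, n = 4 per cell.
cells = {
    ("instr", "male"):      np.array([10, 12, 11, 13]),
    ("instr", "female"):    np.array([11, 12, 10, 12]),
    ("no_instr", "male"):   np.array([15, 17, 16, 18]),
    ("no_instr", "female"): np.array([14, 16, 15, 17]),
}
data = np.concatenate(list(cells.values()))
grand_mean = data.mean()
n = 4                                    # observations per cell

ss_total = ((data - grand_mean) ** 2).sum()

def marginal_mean(level, position):
    # Mean of all scores at one level of one factor (2 cells x n scores).
    scores = [v for k, v in cells.items() if k[position] == level]
    return np.concatenate(scores).mean()

# Main effects: weighted squared deviations of marginal means from the grand mean.
ss_video = sum(2 * n * (marginal_mean(lvl, 0) - grand_mean) ** 2
               for lvl in ("instr", "no_instr"))
ss_gender = sum(2 * n * (marginal_mean(lvl, 1) - grand_mean) ** 2
                for lvl in ("male", "female"))

# SS cells, then interaction and error by subtraction, as on the slides.
ss_cells = sum(n * (v.mean() - grand_mean) ** 2 for v in cells.values())
ss_vxg = ss_cells - ss_video - ss_gender
ss_error = ss_total - ss_cells
print(ss_video, ss_gender, ss_vxg, ss_error)
```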

Degrees of Freedom
- df for a main effect = number of levels - 1
- df for the interaction = product of the main-effect dfs
- df error = N - ab = N - number of cells
- df total = N - 1
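A small sketch of these df rules for a hypothetical a x b between-subjects factorial (a = 2, b = 2, 18 observations per cell):

```python
# Hypothetical 2 x 2 factorial with n = 18 observations per cell.
a, b, n = 2, 2, 18
N = a * b * n

df_A = a - 1                 # main effect of factor A
df_B = b - 1                 # main effect of factor B
df_AxB = df_A * df_B         # interaction: product of the main-effect dfs
df_error = N - a * b         # N minus the number of cells
df_total = N - 1

# The component dfs partition the total df.
assert df_total == df_A + df_B + df_AxB + df_error
```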

Calculations for the Data
- SS total requires the raw data.
- SS video
(cont.)

Calculations (cont.)
- SS gender
(cont.)

Calculations (cont.)
- SS cells
- SS VxG = SS cells - SS instruction - SS gender = 0.125
(cont.)

Calculations (cont.)
- MS error = the average of the cell variances = (sum of the four cell variances)/4 = 58.89/4 = 14.72
- Note that this is MS error and not SS error.

Summary Table

Elaborate on Interactions
- Diagrammed on the next slide as a line graph.
- Note the parallelism of the lines: instruction differences did not depend on gender.

Line Graph of Interaction
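The graph itself is not reproduced in the transcript; the following sketch shows how such a line graph could be drawn with hypothetical cell means that mirror the effects described above (more errors without instructions, males somewhat higher, roughly parallel lines indicating little interaction):

```python
import matplotlib.pyplot as plt

# Hypothetical cell means: instruction condition on the x-axis, one line per gender.
conditions = ["Instructions", "No instructions"]
means = {"Male": [11.5, 16.5], "Female": [11.25, 15.5]}

for gender, ys in means.items():
    plt.plot(conditions, ys, marker="o", label=gender)

plt.ylabel("Mean errors")
plt.legend()
plt.title("Instructions x Gender (roughly parallel lines = little interaction)")
plt.show()
```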