Analysis of Differences Between Two Groups and Between Multiple Groups


1 Concept Map for Statistics as taught in IS271 (a work in progress), Rashmi Sinha

Analysis of Differences
- Between Two Groups
  - Independent Groups: Independent Samples t-test
  - Dependent Groups: Repeated Measures t-test
- Between Multiple Groups
  - Independent Groups: Independent Samples ANOVA
  - Dependent Groups: Repeated Measures ANOVA
- Nominal / Ordinal Data: Chi-Square (frequency data); some kinds of Regression

Analysis of Relationships (by type of data)
- One Predictor
  - Interval Data: Correlation (Pearson), Regression
  - Ordinal Data: Correlation (Spearman), Ordinal Regression
- Multiple Predictors: Multiple Regression

2 Analysis of Variance (F test)
ANOVA is a technique for using differences between sample means to draw inferences about the presence or absence of differences between population means.
- The logic of ANOVA and calculation in SPSS
- Magnitude of effect: eta squared, omega squared
Note: with two groups, ANOVA is equivalent to the t-test (F = t²)

3 Logic of Analysis of Variance
- Null hypothesis (H0): population means from different conditions are equal
  - μ1 = μ2 = μ3 = μ4
- Alternative hypothesis (H1):
  - Not all population means are equal

4 Let's visualize the total amount of variance in an experiment
Total Variance (Mean Square Total) =
- Between-Group Differences (Mean Square Group)
- Error Variance (individual differences + random variance) (Mean Square Error)
The F ratio is the proportion MS group / MS error.
- The larger the group differences, the bigger the F
- The larger the error variance, the smaller the F

5 Logic
- Create a measure of variability among group means: MS group
- Create a measure of variability within groups: MS error
- Form the ratio MS group / MS error
  - Ratio approximately 1 if the null is true
  - Ratio significantly larger than 1 if the null is false
  - "Approximately 1" can actually be as high as 2 or 3, but not much higher
  - Look up statistical tables to see if the F ratio is significant for the specified degrees of freedom
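The steps above can be sketched directly. This is a minimal one-way ANOVA computed by hand; the three groups of scores are made-up illustration data, not the deck's dataset:

```python
# One-way ANOVA by hand: F = MS_group / MS_error.
# The scores below are hypothetical illustration data.
groups = [
    [3, 4, 5, 4],   # condition 1
    [6, 7, 6, 7],   # condition 2
    [2, 3, 2, 3],   # condition 3
]

n_total = sum(len(g) for g in groups)
grand_mean = sum(x for g in groups for x in g) / n_total

# Variability among group means (weighted by group size): SS_group.
ss_group = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
# Variability within groups (deviations from each group's own mean): SS_error.
ss_error = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

df_group = len(groups) - 1           # g - 1
df_error = n_total - len(groups)     # g(n - 1) with equal n

ms_group = ss_group / df_group
ms_error = ss_error / df_error
F = ms_group / ms_error
print(F)
```

With clearly separated groups like these, F comes out far above 1, which is the "null false" pattern the slide describes.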

6 Hypothetical Data (grand mean = 3.78; data table not reproduced in the transcript)

7 Calculations
- Start with sums of squares (SS). We need:
  - SS total
  - SS groups
  - SS error
- Compute degrees of freedom (df)
- Compute mean squares and F

8 Calculations--cont.

9 Degrees of Freedom (df)
Number of "observations" free to vary
- df total = N - 1 (N observations)
- df groups = g - 1 (g means)
- df error = g(n - 1) (n - 1 df in each group, times g groups)
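The df rules above partition df total exactly, which is a useful check. A small sketch (the function name is mine, not from the deck):

```python
# Degrees of freedom for a one-way ANOVA with g groups of n observations each.
def anova_df(g, n):
    N = g * n
    df_total = N - 1
    df_groups = g - 1
    df_error = g * (n - 1)
    # The partition must add up: df_total = df_groups + df_error.
    assert df_total == df_groups + df_error
    return df_total, df_groups, df_error

print(anova_df(4, 5))  # 4 groups of 5 observations -> (19, 3, 16)
```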

10 Summary Table

11 When there are more than two groups
- A significant F only shows that not all groups are equal
  - We want to know which groups differ, which calls for multiple-comparison procedures
- Such procedures are designed to control the familywise error rate
  - Familywise error rate defined
  - Contrast with the per-comparison error rate

12 In case of multiple comparisons: Bonferroni adjustment
- The more tests we run, the more likely we are to make a Type I error
  - Good reason to hold down the number of tests
- Run t tests between pairs of groups, as usual
  - Hold down the number of t tests
  - Reject if t exceeds the critical value in the Bonferroni table
- Works by using a stricter level of significance for each comparison

13 Bonferroni t--cont.
- Critical value of α for each test is set at .05/c, where c = number of tests run
  - Assuming familywise α = .05
  - e.g., with 3 tests, each t must be significant at the .05/3 = .0167 level
- With computer printout, just make sure the calculated probability < .05/c
- The necessary table is in the book
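The printout rule above is easy to mechanize. A minimal sketch; the p values are hypothetical, standing in for a computer printout:

```python
# Bonferroni adjustment: with familywise alpha = .05 and c pairwise tests,
# each individual test is judged against alpha / c.
def bonferroni_threshold(alpha_fw, c):
    return alpha_fw / c

# Three pairwise t tests among three groups:
per_test = bonferroni_threshold(0.05, 3)
print(round(per_test, 4))  # .05 / 3 = .0167

# Compare each reported p value to alpha / c:
p_values = [0.001, 0.020, 0.300]   # hypothetical printout p values
decisions = [p < per_test for p in p_values]
print(decisions)  # [True, False, False]
```

Note that p = .020 would pass an unadjusted .05 criterion but fails the Bonferroni-corrected one, which is exactly the point of the adjustment.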

14 Magnitude of Effect
- Why you need to compute magnitude-of-effect indices
- Eta squared (η²)
  - Easy to calculate
  - Somewhat biased on the high side
  - Percent of variation in the data that can be attributed to treatment differences

15 Magnitude of Effect--cont.
- Omega squared (ω²)
  - Much less biased than η²
  - Not as intuitive
  - We adjust both numerator and denominator with MS error
  - Formula on next slide

16 η² and ω² for Foa et al.
- η² = .18: 18% of the variability in symptoms can be accounted for by treatment
- ω² = .12: this is a less biased estimate, and note that it is 33% smaller
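Both indices come straight from the ANOVA summary table. A minimal sketch using the standard textbook formulas (the summary-table numbers below are hypothetical, not the Foa et al. values):

```python
# Effect-size indices from one-way ANOVA summary quantities, assuming the
# standard formulas:
#   eta^2   = SS_group / SS_total
#   omega^2 = (SS_group - df_group * MS_error) / (SS_total + MS_error)
ss_group, ss_total = 32.67, 36.67   # hypothetical summary-table values
df_group = 2
ms_error = 0.44

eta_sq = ss_group / ss_total
omega_sq = (ss_group - df_group * ms_error) / (ss_total + ms_error)
print(round(eta_sq, 3), round(omega_sq, 3))
```

As the slides note, ω² adjusts both numerator and denominator with MS error, so it always comes out smaller than η².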

17 Factorial Analysis of Variance
- What is a factorial design?
- Main effects
- Interactions
- Simple effects
- Magnitude of effect

18 What is a Factorial?
- At least two independent variables
- All combinations of each variable's levels
- Rows X Columns factorial
- Cells
Example: a 2 X 2 factorial

19 Main effects
There are two factors in the experiment: Source of Review and Type of Product. If you examine the effect of Source of Review (ignoring Type of Product for the time being), you are looking at the main effect of Source of Review. If you look at the effect of Type of Product, ignoring Source of Review, you are looking at the main effect of Type of Product.

20 Simple effects
If you restrict yourself to one level of one IV and look at the effect of the other IV within that level, you are examining a simple effect. The effect of Source of Review at one level of Product Type (e.g., for one kind of product) is a simple effect. Likewise, the effect of Product Type at one level of Source of Review (e.g., for one kind of source, such as Expert Review) is a simple effect.

21 Interactions (the effect of one variable depends on the level of the other)

22 Types of Interactions
And this is when there are only two variables!

23 Magnitude of Effect
The F ratio is biased because it goes up with sample size. For a truer estimate of the treatment effect size, use eta squared (the proportion of the treatment effect relative to the total variance in the experiment). Eta squared is a better estimate than F, but it is still biased. A better index is omega squared.

24 Effect-size indices
- Eta squared
  - Interpretation
- Omega squared
  - Less biased estimate
  - k = number of levels for the effect in question

25 Omega Squared

26 R² is also often used; it is based on sums of squares. For experiments use omega squared; for correlations use R squared. The value of R squared is greater than omega squared. Cohen classified effects as:
- Small effect: .01
- Medium effect: .06
- Large effect: .15
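Cohen's benchmarks above are just cut points on the effect-size scale. A small sketch (the function name and the "negligible" label below the small cut point are my additions):

```python
# Classify an omega-squared value against Cohen's benchmarks from the slide:
# small = .01, medium = .06, large = .15.
def classify_effect(omega_sq):
    if omega_sq >= 0.15:
        return "large"
    if omega_sq >= 0.06:
        return "medium"
    if omega_sq >= 0.01:
        return "small"
    return "negligible"

print(classify_effect(0.12))  # the Foa et al. omega-squared from slide 16
```

By these benchmarks, the ω² = .12 reported earlier counts as a medium-to-large effect.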

27 The Data (cell means and standard deviations)

28 Plotting Results

29 Effects to be estimated
- Differences due to instructions
  - More errors in the condition without instructions
- Differences due to gender
  - Males appear higher than females
- Interaction of video and gender
  - What is an interaction?
  - Do instructions affect males and females equally?

30 Estimated Effects--cont.
- Error
  - Average within-cell variance
- Sums of squares and mean squares
  - Extension of the same concepts from the one-way case

31 Calculations
- Total sum of squares
- Main effect sums of squares

32 Calculations--cont.
- Interaction sum of squares
  - Calculate SS cells and subtract SS V and SS G
- SS error = SS total - SS cells
  - or, MS error can be found as the average of the cell variances

33 Degrees of Freedom
- df for main effects = number of levels - 1
- df for interaction = product of the main-effect dfs
- df error = N - ab = N - number of cells
- df total = N - 1
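As in the one-way case, these dfs partition df total, which makes a handy sanity check. A minimal sketch for an a x b factorial (the function name is mine):

```python
# Degrees of freedom for an a x b factorial with n observations per cell.
def factorial_df(a, b, n):
    N = a * b * n
    df_A = a - 1                 # first main effect: levels - 1
    df_B = b - 1                 # second main effect: levels - 1
    df_AB = df_A * df_B          # interaction: product of main-effect dfs
    df_error = N - a * b         # N minus the number of cells
    # Partition check: df_total = df_A + df_B + df_AB + df_error.
    assert N - 1 == df_A + df_B + df_AB + df_error
    return df_A, df_B, df_AB, df_error

print(factorial_df(2, 2, 10))  # a 2 x 2 design, 10 per cell -> (1, 1, 1, 36)
```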

34 Calculations for Data
- SS total requires the raw data.
  - It is actually = 171.50
- SS video

35 Calculations--cont.
- SS gender

36 Calculations--cont.
- SS cells
- SS VxG = SS cells - SS instruction - SS gender = 171.375 - 105.125 - 66.125 = 0.125

37 Calculations--cont.
- MS error = average of cell variances = (4.6² + 3.5² + 4.2² + 2.8²)/4 = 58.89/4 = 14.723
- Note that this is MS error and not SS error
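The arithmetic on the last two slides is easy to verify directly (the variable names below are generic labels of mine, not the deck's notation):

```python
# Interaction SS by subtraction, using the numbers from the slides:
ss_cells, ss_factor1, ss_factor2 = 171.375, 105.125, 66.125
ss_interaction = ss_cells - ss_factor1 - ss_factor2
print(ss_interaction)  # 0.125

# MS_error as the average of the four cell variances (SDs squared);
# a simple average is appropriate here because the cell sizes are equal.
cell_sds = [4.6, 3.5, 4.2, 2.8]
ms_error = sum(sd ** 2 for sd in cell_sds) / len(cell_sds)
print(ms_error)  # 58.89/4, i.e. about 14.72, matching the slide after rounding
```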

38 Summary Table

39 Elaborate on Interactions
- Diagrammed on the next slide as a line graph
- Note the parallelism of the lines
  - Instruction differences did not depend on gender

40 Line Graph of Interaction

