Analysis of Differences: Between Two Groups and Between Multiple Groups (Independent and Dependent Groups)

Presentation transcript:

Concept Map for Statistics as taught in IS271 (a work in progress), Rashmi Sinha

Analysis of Differences
- Between Two Groups
  - Independent Groups: Independent Samples t-test
  - Dependent Groups: Repeated Measures t-test
- Between Multiple Groups
  - Independent Groups: Independent Samples ANOVA
  - Dependent Groups: Repeated Measures ANOVA

Analysis of Relationships
- One Predictor
  - Interval Data: Correlation (Pearson), Regression
  - Ordinal Data: Correlation (Spearman), Ordinal Regression
- Multiple Predictors: Multiple Regression

Type of Data
- Nominal / Ordinal Data: Frequency, Chi-Square, some kinds of regression

Analysis of Variance (the F test) ANOVA is a technique for using differences between sample means to draw inferences about the presence or absence of differences between population means. Topics: the logic of ANOVA and calculation in SPSS; magnitude of effect (eta squared, omega squared). Note: ANOVA is equivalent to the t test in the two-group case.
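The two-group equivalence noted above can be checked numerically. A minimal sketch in plain Python, computing both statistics by hand (the data are made up for illustration):

```python
# Demonstrate that for two groups, the one-way ANOVA F equals t^2
# (illustrative data, not from the slides).

def pooled_t(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)                # pooled variance
    return (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5

def one_way_f(groups):
    all_x = [x for g in groups for x in g]
    grand = sum(all_x) / len(all_x)
    ss_group = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_error = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_group = len(groups) - 1
    df_error = len(all_x) - len(groups)
    return (ss_group / df_group) / (ss_error / df_error)

a = [4, 5, 6, 5, 4]
b = [7, 8, 6, 9, 8]
t = pooled_t(a, b)
f = one_way_f([a, b])
print(round(t ** 2, 6), round(f, 6))  # the two values agree
```

The agreement is exact: with two groups, F has 1 and N - 2 degrees of freedom, and F = t².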

Logic of Analysis of Variance
- Null hypothesis (H0): the population means from the different conditions are equal: μ1 = μ2 = μ3 = μ4
- Alternative hypothesis (H1): not all population means are equal.

Let's Visualize the Total Amount of Variance in an Experiment
Total variance (Mean Square Total) = between-group differences (Mean Square Group) + error variance (individual differences + random variance, Mean Square Error).
The F ratio is MS group / MS error: the larger the group differences, the bigger the F; the larger the error variance, the smaller the F.

Logic
- Create a measure of variability among group means: MS group
- Create a measure of variability within groups: MS error
- Form the ratio MS group / MS error
  - The ratio is approximately 1 if the null hypothesis is true
  - The ratio is significantly larger than 1 if the null hypothesis is false
  - "Approximately 1" can actually be as high as 2 or 3, but not much higher
  - Look up statistical tables to see whether the F ratio is significant for the specified degrees of freedom

Hypothetical Data (grand mean = 3.78)

Calculations
- Start with the sums of squares (SS); we need SS total, SS groups, and SS error
- Compute the degrees of freedom (df)
- Compute the mean squares and F

Calculations (cont.) [computational formulas shown on slide]

Degrees of Freedom (df)
The number of "observations" free to vary.
- df total = N - 1 (N observations)
- df groups = g - 1 (g means)
- df error = g(n - 1) (n observations in each group gives n - 1 df, times g groups)
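The quantities on the last few slides fit together in one worked example. This sketch computes the sums of squares, degrees of freedom, mean squares, and F for a one-way design with equal group sizes (the data are illustrative, not the slide's hypothetical data):

```python
# One-way ANOVA summary from the slides' formulas:
# SS_total = SS_groups + SS_error, df_total = N - 1,
# df_groups = g - 1, df_error = g * (n - 1) for equal group sizes n.

def anova_summary(groups):
    n = len(groups[0])                 # assumes equal group sizes
    g = len(groups)
    all_x = [x for grp in groups for x in grp]
    grand = sum(all_x) / len(all_x)

    ss_total = sum((x - grand) ** 2 for x in all_x)
    ss_groups = sum(n * (sum(grp) / n - grand) ** 2 for grp in groups)
    ss_error = ss_total - ss_groups    # the partition SS_total = SS_groups + SS_error

    df_groups, df_error, df_total = g - 1, g * (n - 1), len(all_x) - 1
    ms_groups = ss_groups / df_groups
    ms_error = ss_error / df_error
    return {"SS": (ss_groups, ss_error, ss_total),
            "df": (df_groups, df_error, df_total),
            "MS": (ms_groups, ms_error),
            "F": ms_groups / ms_error}

groups = [[3, 5, 4], [6, 7, 8], [2, 3, 1]]
s = anova_summary(groups)
print(s["df"], round(s["F"], 2))  # (2, 6, 8) 19.0
```

The returned dictionary holds the same rows a textbook summary table would: SS, df, and MS for groups and error, plus F.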

Summary Table

When There Are More Than Two Groups
- A significant F only shows that not all groups are equal; we want to know which groups differ.
- Multiple-comparison procedures are designed to control the familywise error rate.
  - The familywise error rate is the probability of making at least one Type I error anywhere in the full set of comparisons.
  - Contrast this with the per-comparison error rate, the error rate for a single test.

In Case of Multiple Comparisons: the Bonferroni Adjustment
- The more tests we run, the more likely we are to make a Type I error: a good reason to hold down the number of tests.
- Run t tests between pairs of groups, as usual, but hold down the number of t tests and reject only if t exceeds the critical value in the Bonferroni table.
- Works by using a stricter level of significance for each comparison.

Bonferroni t (cont.)
- The critical value of α for each test is set at .05/c, where c = number of tests run (assuming familywise α = .05).
  - e.g., with 3 tests, each t must be significant at the .05/3 = .0167 level.
- With computer printout, just make sure the calculated probability is < .05/c.
- The necessary table is in the book.
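The adjustment itself is a single division; a sketch (the p-values are made up for illustration):

```python
# Bonferroni adjustment: with c pairwise tests and familywise alpha = .05,
# each individual test must be significant at alpha / c.

def bonferroni_alpha(familywise_alpha, n_tests):
    return familywise_alpha / n_tests

# e.g. three pairwise t tests among the groups, as in the slide's example:
per_test = bonferroni_alpha(0.05, 3)
print(round(per_test, 4))  # 0.0167

# with computer printout: reject only when the reported p-value < alpha / c
p_values = [0.012, 0.030, 0.240]
decisions = [p < per_test for p in p_values]
print(decisions)  # [True, False, False]
```

Note that the middle test (p = .030) would pass an unadjusted .05 criterion but fails the Bonferroni-corrected one.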

Magnitude of Effect
- Why you need to compute magnitude-of-effect indices
- Eta squared (η²)
  - Easy to calculate
  - Somewhat biased on the high side
  - The percentage of variation in the data that can be attributed to treatment differences

Magnitude of Effect (cont.)
- Omega squared (ω²)
  - Much less biased than η²
  - Not as intuitive
  - We adjust both numerator and denominator with MS error
  - Formula on next slide

η² and ω² for Foa et al.
- η² = .18: 18% of the variability in symptoms can be accounted for by treatment.
- ω² = .12: this is a less biased estimate; note that it is 33% smaller.
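Both indices can be computed from the one-way summary quantities. This sketch uses the standard formulas (η² = SS_treat / SS_total; ω² adjusts numerator and denominator with MS error); the numbers are illustrative, not Foa et al.'s:

```python
# Effect-size indices from a one-way ANOVA summary table.
# eta^2   = SS_treat / SS_total                       (biased on the high side)
# omega^2 = (SS_treat - (k - 1) * MS_error) / (SS_total + MS_error)
# where k = number of groups. Values below are made up for illustration.

def eta_squared(ss_treat, ss_total):
    return ss_treat / ss_total

def omega_squared(ss_treat, ss_total, ms_error, k):
    return (ss_treat - (k - 1) * ms_error) / (ss_total + ms_error)

ss_treat, ss_error, k, n_per = 38.0, 6.0, 3, 3
ss_total = ss_treat + ss_error
ms_error = ss_error / (k * (n_per - 1))

print(round(eta_squared(ss_treat, ss_total), 3))              # 0.864
print(round(omega_squared(ss_treat, ss_total, ms_error, k), 3))  # 0.8
```

As the slide notes for Foa et al., ω² comes out smaller than η² because of the MS error adjustment.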

Factorial Analysis of Variance
- What is a factorial design?
- Main effects
- Interactions
- Simple effects
- Magnitude of effect

What Is a Factorial?
- At least two independent variables
- All combinations of each variable
- Rows × Columns factorial
- Cells
[2 × 2 factorial diagram shown on slide]

Main Effects
There are two factors in the experiment: Source of Review and Type of Product. If you examine the effect of Source of Review (ignoring Type of Product for the time being), you are looking at the main effect of Source of Review. If you look at the effect of Type of Product, ignoring Source of Review, you are looking at the main effect of Type of Product.

Simple Effects
If you restrict yourself to one level of one IV and look at the effect of the other IV within that level, you are looking at a simple effect. The effect of Source of Review at one level of Product Type (e.g., for one kind of product) is a simple effect; so is the effect of Product Type at one level of Source of Review (e.g., for one kind of source, such as Expert Review).

Interactions (when the effect of one variable depends on the level of the other)

Types of Interactions
And this is when there are only two variables!

Magnitude of Effect
The F ratio is biased because it goes up with sample size. For a truer estimate of the treatment effect size, use eta squared (the proportion of the total variance in the experiment accounted for by the treatment effect). Eta squared is a better estimate than F, but it is still biased. A better index is omega squared.

- Eta squared (η²): interpretation [formula shown on slide]
- Omega squared (ω²): a less biased estimate; k = number of levels for the effect in question [formula shown on slide]

Omega Squared: ω² = (SS_treat - (k - 1) MS_error) / (SS_total + MS_error)

R² is also often used; it is based on the sums of squares. For experiments use omega squared; for correlations use R squared. The value of R² is greater than omega squared. Cohen classified effects as: small effect = .01, medium effect = .06, large effect = .15.
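Cohen's cutoffs can be wrapped in a small helper. Treating each cutoff as an inclusive lower bound for its label is an assumption of this sketch (Cohen's values are rough benchmarks, not hard boundaries):

```python
# Cohen's rough benchmarks for omega^2, as listed on the slide:
# small = .01, medium = .06, large = .15.
# Boundary handling (>=) is a choice made here, not part of the slide.

def cohen_label(omega_sq):
    if omega_sq >= 0.15:
        return "large"
    if omega_sq >= 0.06:
        return "medium"
    if omega_sq >= 0.01:
        return "small"
    return "negligible"

print(cohen_label(0.12))  # medium -- e.g. the omega^2 = .12 from Foa et al.
```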

The Data (cell means and standard deviations)

Plotting Results

Effects to Be Estimated
- Differences due to instructions: more errors in the condition without instructions
- Differences due to gender: males appear higher than females
- Interaction of video and gender
  - What is an interaction?
  - Do instructions affect males and females equally?

Estimated Effects (cont.)
- Error: the average within-cell variance
- Sums of squares and mean squares: an extension of the same concepts from the one-way design

Calculations
- Total sum of squares
- Main-effect sums of squares

Calculations (cont.)
- Interaction sum of squares: calculate SS cells and subtract SS V and SS G
- SS error = SS total - SS cells (or, MS error can be found as the average of the cell variances)

Degrees of Freedom
- df for a main effect = number of levels - 1
- df for the interaction = product of the main-effect dfs
- df error = N - ab = N - number of cells
- df total = N - 1
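These df rules can be checked in a few lines; the built-in sanity check, that the component dfs sum to df total, follows directly from the formulas on the slide:

```python
# Degrees of freedom for an a x b between-subjects factorial with N total scores:
# df_A = a - 1, df_B = b - 1, df_AxB = (a - 1)(b - 1),
# df_error = N - ab, df_total = N - 1.

def factorial_df(a, b, n_total):
    df_a, df_b = a - 1, b - 1
    df_ab = df_a * df_b
    df_error = n_total - a * b
    df_total = n_total - 1
    # the components partition df_total: (a-1) + (b-1) + (a-1)(b-1) + (N-ab) = N - 1
    assert df_a + df_b + df_ab + df_error == df_total
    return df_a, df_b, df_ab, df_error, df_total

# e.g. a 2 x 2 design with 10 subjects per cell (N = 40):
print(factorial_df(2, 2, 40))  # (1, 1, 1, 36, 39)
```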

Calculations for the Data
- SS total requires the raw data; its value is [shown on slide].
- SS video: [formula shown on slide]

Calculations (cont.)
- SS gender: [formula shown on slide]

Calculations (cont.)
- SS cells: [formula shown on slide]
- SS V×G = SS cells - SS instruction - SS gender = 0.125

Calculations (cont.)
- MS error = average of the cell variances = 58.89/4 = 14.72
- Note that this is MS error and not SS error
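The partitioning on the last few slides (SS interaction = SS cells - SS A - SS B, SS error = SS total - SS cells) can be sketched for an arbitrary two-factor layout. The cell data here are made up for illustration, not the deck's instruction-by-gender data:

```python
# Partition the sums of squares in a two-factor between-subjects design.
# cells: dict mapping (row_level, col_level) -> list of scores.

def two_way_ss(cells):
    n = len(next(iter(cells.values())))            # assumes equal cell sizes
    rows = sorted({r for r, c in cells})
    cols = sorted({c for r, c in cells})
    all_x = [x for s in cells.values() for x in s]
    grand = sum(all_x) / len(all_x)

    def mean(scores):
        return sum(scores) / len(scores)

    # pool the scores at each level of each factor
    row_scores = {r: [x for (rr, cc), s in cells.items() if rr == r for x in s]
                  for r in rows}
    col_scores = {c: [x for (rr, cc), s in cells.items() if cc == c for x in s]
                  for c in cols}

    ss_total = sum((x - grand) ** 2 for x in all_x)
    ss_a = sum(len(s) * (mean(s) - grand) ** 2 for s in row_scores.values())
    ss_b = sum(len(s) * (mean(s) - grand) ** 2 for s in col_scores.values())
    ss_cells = sum(n * (mean(s) - grand) ** 2 for s in cells.values())
    ss_ab = ss_cells - ss_a - ss_b                 # interaction, by subtraction
    ss_error = ss_total - ss_cells
    return ss_a, ss_b, ss_ab, ss_error, ss_total

cells = {("male", "A"): [3, 5], ("male", "B"): [4, 6],
         ("female", "A"): [6, 8], ("female", "B"): [7, 9]}
print(two_way_ss(cells))  # (18.0, 2.0, 0.0, 8.0, 28.0)
```

In this made-up example the interaction SS comes out exactly 0 because the row differences are identical at both column levels, which is the "parallel lines" pattern the later slides describe.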

Summary Table

Elaborate on Interactions
- Diagrammed on the next slide as a line graph
- Note the parallelism of the lines: instruction differences did not depend on gender

Line Graph of Interaction