Concept Map for Statistics as taught in IS271 (a work in progress), Rashmi Sinha

Type of Data
- Interval data
  - Analysis of Differences
    - Between two groups
      - Independent groups: Independent Samples t-test
      - Dependent groups: Repeated Measures t-test
    - Between multiple groups
      - Independent groups: Independent Samples ANOVA
      - Dependent groups: Repeated Measures ANOVA
  - Analysis of Relationships
    - One predictor: Correlation (Pearson), Regression
    - Multiple predictors: Multiple Regression
- Nominal / ordinal data: Frequency (Chi Square), Correlation (Spearman), Ordinal Regression, some kinds of Regression

Analysis of Variance (the F test)
ANOVA is a technique for using differences between sample means to draw inferences about the presence or absence of differences between population means.
Topics:
- The logic of ANOVA
- Calculations (in SPSS)
- Magnitude of effect: eta squared, omega squared
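The slides run the calculations in SPSS; for readers working outside SPSS, here is a minimal sketch of the same one-way ANOVA in Python with SciPy. The group scores are made-up illustration values, not data from the slides:

    from scipy import stats

    # Made-up scores for three independent groups (illustration only)
    group1 = [3, 5, 4, 6, 5]
    group2 = [7, 6, 8, 7, 9]
    group3 = [4, 3, 5, 4, 4]

    # One-way ANOVA: tests H0 that all population means are equal
    f_stat, p_value = stats.f_oneway(group1, group2, group3)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")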

Assumptions of ANOVA
Assume:
- Observations are normally distributed within each population
- Population variances are equal (homogeneity of variance, or homoscedasticity)
- Observations are independent
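A minimal sketch of how the first two assumptions might be checked in Python: a Shapiro-Wilk test for normality within each group and Levene's test for equal variances. The groups reuse the made-up illustration values from the earlier sketch:

    from scipy import stats

    groups = {
        "group1": [3, 5, 4, 6, 5],   # made-up illustration data
        "group2": [7, 6, 8, 7, 9],
        "group3": [4, 3, 5, 4, 4],
    }

    # Shapiro-Wilk: H0 = scores in this group come from a normal distribution
    for name, scores in groups.items():
        w, p = stats.shapiro(scores)
        print(f"{name}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

    # Levene's test: H0 = population variances are equal across groups
    stat, p = stats.levene(*groups.values())
    print(f"Levene statistic = {stat:.3f}, p = {p:.3f}")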

Assumptions--cont.
Analysis of variance is generally robust to violations of the first two assumptions.
- A robust test is one that is not greatly affected by violations of its assumptions.

Logic of Analysis of Variance
Null hypothesis (H0): population means from the different conditions are equal
- H0: μ1 = μ2 = μ3 = μ4
Alternative hypothesis (H1):
- Not all population means are equal.

Let's visualize the total amount of variance in an experiment
Total variance (Mean Square Total) is partitioned into:
- Between-group differences (Mean Square Group)
- Error variance: individual differences + random variance (Mean Square Error)
The F ratio is the ratio MS group / MS error.
- The larger the group differences, the bigger the F.
- The larger the error variance, the smaller the F.

Logic--cont.
Create a measure of variability among group means
- MS group
Create a measure of variability within groups
- MS error

Logic--cont.
Form the ratio F = MS group / MS error
- Ratio is approximately 1 if the null hypothesis is true
- Ratio is significantly larger than 1 if the null hypothesis is false
- "Approximately 1" can actually be as high as 2 or 3, but not much higher
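A minimal simulation sketch of this point, under made-up parameters: four groups of 10 are drawn from the same normal population, so the null hypothesis is true by construction, and the average F across many simulated experiments should land near 1:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    f_values = []
    for _ in range(5000):
        # Four groups of 10 drawn from the SAME population: H0 is true
        groups = [rng.normal(loc=50, scale=10, size=10) for _ in range(4)]
        f_stat, _ = stats.f_oneway(*groups)
        f_values.append(f_stat)

    print(f"Mean F under a true null: {np.mean(f_values):.2f}")  # close to 1
    print(f"Share of F values above 3: {np.mean(np.array(f_values) > 3):.3f}")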

Example data table (image on the original slide): grand mean = 3.78

Calculations
Start with Sums of Squares (SS). We need:
- SS total
- SS groups
- SS error
Compute degrees of freedom (df)
Compute mean squares and F
Cont.

Calculations--cont.
- SS total = Σ (X − grand mean)², summed over all observations
- SS groups = Σ n_j (group mean_j − grand mean)², summed over groups
- SS error = SS total − SS groups (the within-group squared deviations)

Degrees of Freedom (df)
Number of "observations" free to vary
- df total = N − 1 (N observations)
- df groups = g − 1 (g means)
- df error = g(n − 1) (n observations in each group give n − 1 df, times g groups)

Summary Table
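The summary table on this slide is an image that did not survive the transcript. As a stand-in, here is a minimal Python sketch that walks through the steps from the preceding slides (SS, df, MS, F) and prints a summary table. The three groups are the same made-up illustration values used in the earlier sketches, not the data from the slides:

    import numpy as np
    from scipy import stats

    groups = [
        np.array([3, 5, 4, 6, 5]),   # made-up illustration data
        np.array([7, 6, 8, 7, 9]),
        np.array([4, 3, 5, 4, 4]),
    ]

    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    N, g = len(all_scores), len(groups)

    # Sums of squares
    ss_total = np.sum((all_scores - grand_mean) ** 2)
    ss_group = sum(len(grp) * (grp.mean() - grand_mean) ** 2 for grp in groups)
    ss_error = ss_total - ss_group

    # Degrees of freedom, mean squares, F, and the p-value from the F distribution
    df_group, df_error = g - 1, N - g
    ms_group, ms_error = ss_group / df_group, ss_error / df_error
    f_stat = ms_group / ms_error
    p_value = stats.f.sf(f_stat, df_group, df_error)

    print(f"{'Source':<8}{'SS':>8}{'df':>4}{'MS':>8}{'F':>7}{'p':>8}")
    print(f"{'Group':<8}{ss_group:>8.2f}{df_group:>4}{ms_group:>8.2f}{f_stat:>7.2f}{p_value:>8.4f}")
    print(f"{'Error':<8}{ss_error:>8.2f}{df_error:>4}{ms_error:>8.2f}")
    print(f"{'Total':<8}{ss_total:>8.2f}{N - 1:>4}")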

When there are more than two groups
A significant F only shows that not all group means are equal
- We want to know which groups differ; that is the job of multiple-comparison procedures.
Such procedures are designed to control the familywise error rate.
- Familywise error rate: the probability of at least one Type I error across the whole family of comparisons
- Contrast with the per-comparison error rate: the α applied to each individual test

Multiple Comparisons
The more tests we run, the more likely we are to make a Type I error.
- Good reason to hold down the number of tests
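A quick worked illustration of why this matters: if each of c independent tests uses α = .05, the chance of at least one Type I error is 1 − (1 − .05)^c. A minimal sketch:

    alpha = 0.05
    for c in (1, 3, 5, 10):
        # Probability of at least one Type I error across c independent tests
        familywise = 1 - (1 - alpha) ** c
        print(f"{c:>2} tests: familywise error rate = {familywise:.3f}")

With 3 tests the familywise rate is already about .14, and with 10 tests it is about .40.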

Bonferroni t Test
Run t tests between pairs of groups, as usual
- Hold down the number of t tests
- Reject if t exceeds the critical value in the Bonferroni table
Works by using a more strict level of significance for each comparison
Cont.

Bonferroni t--cont.
Critical value of α for each test is set at .05/c, where c = the number of tests run
- Assuming familywise α = .05
- e.g., with 3 tests, each t must be significant at the .05/3 = .0167 level
With computer printout, just make sure the calculated probability < .05/c
The necessary table is in the book
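A minimal sketch of the same idea in Python: run the pairwise independent-samples t tests and compare each p-value to the Bonferroni-adjusted level .05/c. The groups are the made-up illustration values from the earlier sketches:

    from itertools import combinations
    from scipy import stats

    groups = {
        "group1": [3, 5, 4, 6, 5],   # made-up illustration data
        "group2": [7, 6, 8, 7, 9],
        "group3": [4, 3, 5, 4, 4],
    }

    pairs = list(combinations(groups, 2))
    adjusted_alpha = 0.05 / len(pairs)   # .05/c; here .05/3 = .0167

    for name_a, name_b in pairs:
        t, p = stats.ttest_ind(groups[name_a], groups[name_b])
        verdict = "significant" if p < adjusted_alpha else "not significant"
        print(f"{name_a} vs {name_b}: t = {t:.2f}, p = {p:.4f} ({verdict})")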

Magnitude of Effect
Why you need to compute magnitude-of-effect indices
Eta squared (η²)
- Easy to calculate
- Somewhat biased on the high side
- Formula: see slide #33
- Percent of variation in the data that can be attributed to treatment differences
Cont.

Magnitude of Effect--cont.
Omega squared (ω²)
- Much less biased than η²
- Not as intuitive
- We adjust both numerator and denominator with MS error
- Formula on next slide
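The formulas themselves appear as images on the original slides. As a stand-in, a minimal sketch using the standard one-way definitions, η² = SS group / SS total and ω² = (SS group − df group × MS error) / (SS total + MS error). The summary-table numbers passed in are made up for illustration, not the Foa et al. values:

    def effect_sizes(ss_group, ss_total, df_group, ms_error):
        """Standard one-way ANOVA effect-size estimates: eta squared and omega squared."""
        eta_sq = ss_group / ss_total
        omega_sq = (ss_group - df_group * ms_error) / (ss_total + ms_error)
        return eta_sq, omega_sq

    # Made-up summary-table values for illustration
    eta_sq, omega_sq = effect_sizes(ss_group=30.0, ss_total=120.0,
                                    df_group=2, ms_error=7.5)
    print(f"eta squared = {eta_sq:.3f}, omega squared = {omega_sq:.3f}")

As expected, ω² comes out smaller than η², since it corrects the upward bias.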

η² and ω² for Foa et al.
- η² = .18: 18% of the variability in symptoms can be accounted for by treatment
- ω² = .12: this is a less biased estimate, and note that it is 33% smaller