ANOVA Demo Part 2: Analysis Psy 320 Cal State Northridge Andrew Ainsworth PhD.

Review: Sample Variance. The sample variance is the sum of squared deviations from the mean divided by the degrees of freedom, s^2 = Sum(X - Mean)^2 / (N - 1), which can be re-written as s^2 = SS / df.
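For readers following along in code, here is a minimal Python sketch (not part of the original slides) showing that the sample variance is just SS divided by its df; the three scores used are the group-1 values (7, 1, 10) that appear later in the deck.

```python
import numpy as np

# Three illustrative scores (the group-1 values 7, 1, 10 quoted later in the slides)
x = np.array([7.0, 1.0, 10.0])

ss = np.sum((x - x.mean()) ** 2)   # sum of squared deviations from the mean (SS)
df = x.size - 1                    # degrees of freedom
print(ss / df)                     # 21.0
print(np.var(x, ddof=1))           # 21.0 -- numpy's sample variance agrees
```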

Data Set FYI: N = 9

Total Sums of Squares. Let's calculate the Sums of Squares (SS) for this data set as it is. As we can see, the mean of 9.67 has already been calculated for us, and we are going to treat that 9.67 as the Grand Mean (i.e., the ungrouped mean).

Total Sums of Squares. Treating all nine scores as a single group, SS_Total = Sum(X - Grand Mean)^2 = Sum(X - 9.67)^2, which for this data set works out to 164.
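A short Python sketch of the Total SS computation. The nine scores below are placeholders I have assumed for illustration, chosen only so that the summary numbers roughly match those quoted in the slides (grand mean of about 9.67, SS_Total = 164); only the first three scores (7, 1, 10) are actually quoted in the deck.

```python
import numpy as np

# Placeholder data: three groups of three scores (assumed, not taken from the slides,
# apart from the first group 7, 1, 10)
scores = np.array([7, 1, 10,   7, 8, 11,   13, 14, 16], dtype=float)

grand_mean = scores.mean()                      # the single "ungrouped" mean
ss_total = np.sum((scores - grand_mean) ** 2)   # every score compared to the grand mean
print(grand_mean, ss_total)
```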

Between Groups Sums of Squares. So, the Total Sums of Squares applies to this data if all 9 data points were collected as part of a single 9-member group. However, what if the data were instead collected in three groups of 3? Let's imagine that each group is receiving a different form of treatment (i.e., an independent variable) that we think will affect the subjects' scores, and that each individual group's mean is reported along with its scores.

Between Groups Sums of Squares. Note that each group has its own mean that describes the central tendency of the participants in that group (e.g., 6 is the mean of 7, 1, and 10). Also note that, with equal numbers of subjects in each group, the average of the 3 group means is the Grand Mean from earlier (i.e., 9.67 is the average of the three group means).

Between Groups Sums of Squares. So, if we want to understand the effect that the different treatments are having on the groups via the participants (i.e., how the treatments are moving the participants away from the Grand Mean), we can pretend for a second that every participant scored exactly at their own group mean. Note: the group means and the Grand Mean stay the same.

Between Groups Sums of Squares. Let's ignore the individual scores for a second and calculate the SS pretending that every participant scored exactly at their group mean.

Between Groups Sums of Squares. Let's take a look at that last formula and notice that, when it is calculated, there is a lot of redundancy within each group. For instance, for the first group we are subtracting and squaring the same number 3 times (i.e., once for each participant). Couldn't we come to the same answer by simply doing it once and multiplying by the number of participants in that group (i.e., 3)?

Between Groups Sums of Squares. This is why we typically don't substitute the mean in for every person's score but instead just weight the squared difference by the number of scores in each group (i.e., n_g): SS_BG = Sum of n_g * (group mean - Grand Mean)^2, summed across the groups.
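A sketch of that weighted between-groups SS in Python, using the same assumed placeholder groups as above (only group 1 is quoted in the slides):

```python
import numpy as np

# Placeholder groups (assumed for illustration; only group 1 comes from the slides)
groups = [np.array([7.0, 1.0, 10.0]),
          np.array([7.0, 8.0, 11.0]),
          np.array([13.0, 14.0, 16.0])]

grand_mean = np.concatenate(groups).mean()

# Weight each squared (group mean - grand mean) difference by that group's n
ss_bg = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
print(ss_bg)   # between-groups SS for these placeholder scores
```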

Within Groups Sums of Squares. Looking back at the original data, we can see that the subjects rarely, if ever, scored exactly at their group mean. So, something else, besides our hypothesized treatment, is causing our subjects to differ within each of the groups. We haven't hypothesized it, and therefore we can't explain why it's there, so we are going to assume that it is just random variation; but we still need to identify it.

Within Groups Sums of Squares. To identify the random variation, let's look inside each group separately to see if we can find an average degree of variation. Remembering that variance is SS/df, let's identify the within-group SS value for each group. For Group 1 (scores 7, 1, and 10, with mean 6): SS_1 = (7 - 6)^2 + (1 - 6)^2 + (10 - 6)^2 = 1 + 25 + 16 = 42.

Within Groups Sums of Squares. Doing the same for Group 2 gives that group's within-group SS value. Note: the resulting SS happens to equal that group's mean, but that is just a coincidence.

Within Groups Sums of Squares. And doing the same for Group 3 gives the third within-group SS value.
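In code, each within-group SS is just that group's own sum of squared deviations from its own mean; a sketch using the assumed placeholder groups 2 and 3 from earlier (group 1 comes from the slides):

```python
import numpy as np

# Group 1 is quoted in the slides (7, 1, 10); groups 2 and 3 are placeholders.
groups = [np.array([7.0, 1.0, 10.0]),
          np.array([7.0, 8.0, 11.0]),
          np.array([13.0, 14.0, 16.0])]

for j, g in enumerate(groups, start=1):
    ss_j = np.sum((g - g.mean()) ** 2)   # SS within group j
    print(f"Group {j}: SS = {ss_j:.2f}")
```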

Within Groups Variance. These Within Groups Sums of Squares can be used to tell us how people just "randomly" spread out within each of the groups. Remembering that variance is SS/df, let's divide each group's SS by its degrees of freedom (df): s^2_j = SS_j / (n_j - 1), where n_j is the number of participants in each group (e.g., for our example n_j = 3 for all of the groups).

The Variance Within Each Group. For group 1: s^2_1 = SS_1 / (n_1 - 1) = 42 / 2 = 21.

The Variance Within Each Group. For group 2: s^2_2 = SS_2 / (n_2 - 1) = SS_2 / 2.

The Variance Within Each Group. For group 3: s^2_3 = SS_3 / (n_3 - 1) = SS_3 / 2.

Average Within Groups Variance. Now that we have the variances within each of the groups, we can calculate an average within-groups variance, which is an extension of the pooled variance from the independent-samples t-test: MS_WG = (s^2_1 + s^2_2 + s^2_3) / 3. Because the values of n_j are equal, this is just a simple average. (Note: if the n_j values are not equal, you can perform a weighted average or just calculate the WG variance directly, as on the next slide.) Note: there is no subscript because this is not for any particular group but an average across the groups.
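A sketch of that averaging step; with equal group sizes the simple mean of the three group variances is the pooled (within-groups) variance. Groups 2 and 3 are the same assumed placeholders as before.

```python
import numpy as np

# Placeholder groups (only group 1 comes from the slides)
groups = [np.array([7.0, 1.0, 10.0]),
          np.array([7.0, 8.0, 11.0]),
          np.array([13.0, 14.0, 16.0])]

group_vars = [np.var(g, ddof=1) for g in groups]   # SS_j / (n_j - 1) for each group
ms_wg = np.mean(group_vars)                        # simple average is fine because the n_j are equal
print(group_vars, ms_wg)
```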

Within Groups Sums of Squares. The value for the overall WG Sums of Squares could also have been calculated directly by simply combining the SS_WG formula across the groups: SS_WG = the sum, over groups, of Sum(X - group mean)^2. Note again that there is no subscript on the SS value because it is computed for all groups at the same time.

Within Groups Sums of Squares. Remembering the three group means (6 for the first group, and the corresponding means for the other two groups), we simply take every individual score, subtract the mean of the group that score belongs to, square the difference, and add them all together.
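A sketch of that direct calculation, pooling the squared deviations of every score from its own group mean (placeholder groups 2 and 3 as before):

```python
import numpy as np

groups = [np.array([7.0, 1.0, 10.0]),    # group 1 from the slides
          np.array([7.0, 8.0, 11.0]),    # placeholder
          np.array([13.0, 14.0, 16.0])]  # placeholder

ss_wg = sum(np.sum((g - g.mean()) ** 2) for g in groups)   # overall within-groups SS
df_wg = sum(g.size - 1 for g in groups)                    # n - 1 per group, i.e. N - g
print(ss_wg, ss_wg / df_wg)                                # SS_WG and MS_WG
```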

ANOVA Summary Table. Let's take what we know and see if we can't put it together and summarize it in an ANOVA summary table (Source, SS, df, MS, F). We know that SS_Total = 164, we have the SS_BG and SS_WG values, and we also know the WG variance (i.e., MS_WG) from the average of the 3 group variances. Note: the SS values for Between and Within add up to SS_Total, as they should.

ANOVA Summary Table. We have the SS value for the BG source of variability, but we need to convert it to a variance. Remembering that variance is SS/df, we just need to figure out what the BG degrees of freedom are.

ANOVA Summary Table. The BG degrees of freedom equal the number of groups minus 1: df_BG = g - 1 = 3 - 1 = 2. Remembering that variance is SS/df, we now just need to divide the SS value by the df value for the Between Groups source.

ANOVA Summary Table. We have both the SS value and the MS value for the WG source of variability, and these two values should be connected somehow. Remembering that variance is SS/df, we just need to figure out the Within Groups degrees of freedom and check that, if we divide in the same way as with the BG source, we get the same MS_WG value we calculated earlier.

ANOVA Summary Table. When we calculated the Within Groups variance we did so by averaging over the three individual group variances, each of which had n - 1 degrees of freedom. So, that's n - 1 for each group, or g * (n - 1). Or you can think of it this way: you need to calculate a mean for each group, so you simply take the total number of scores (i.e., N) and subtract 1 for every group (i.e., g), giving N - g. Note that if all of the n_j values are equal, then g * (n - 1) = N - g.

ANOVA Summary Table. We have 9 total subjects and 3 groups, so that's df_WG = N - g = 9 - 3 = 6 (equivalently, g * (n - 1) = 3 * 2 = 6).

ANOVA Summary Table If we divide the SS value by the df value for the Within Groups source of variance we in fact get the same value we calculated earlier using the pooling method

ANOVA Summary Table. In order to calculate the total SS we needed to estimate a single Grand Mean, and because of this we lose one degree of freedom. The total degrees of freedom is simply the total number of participants (i.e., N) minus 1: df_Total = 9 - 1 = 8. Note: the degrees of freedom for BG and WG (2 + 6) sum to the total df, as they should.

ANOVA Summary Table. In ANOVA Demo Part 1 we talked about how the Between Groups variance is a measure of how far the groups are from the Grand Mean, which in turn tells us how far apart they are from each other on average. In order to know whether the groups are varying far away from each other (i.e., whether they are significantly different), we need a measure of random variability to see if our groups are differing more than just randomly. The Within Groups variance tells us how much individuals vary from one another on average across the groups; this is our best estimate of random variability, so we use it to judge whether the groups are different by creating the F-ratio.

ANOVA Summary Table. The F-ratio is simply the ratio of the Between Groups variance over the Within Groups variance: F = MS_BG / MS_WG.
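Putting the pieces together, here is a Python sketch that builds the whole summary table and cross-checks the F-ratio against scipy's built-in one-way ANOVA. Groups 2 and 3 are the assumed placeholders used throughout these examples, so the printed numbers illustrate the structure of the table rather than reproduce the deck's exact values.

```python
import numpy as np
from scipy import stats

groups = [np.array([7.0, 1.0, 10.0]),    # group 1 from the slides
          np.array([7.0, 8.0, 11.0]),    # placeholder
          np.array([13.0, 14.0, 16.0])]  # placeholder

all_scores = np.concatenate(groups)
gm = all_scores.mean()

ss_bg = sum(g.size * (g.mean() - gm) ** 2 for g in groups)     # between-groups SS
ss_wg = sum(np.sum((g - g.mean()) ** 2) for g in groups)        # within-groups SS
df_bg = len(groups) - 1                                         # g - 1
df_wg = all_scores.size - len(groups)                           # N - g

ms_bg, ms_wg = ss_bg / df_bg, ss_wg / df_wg
f_ratio = ms_bg / ms_wg

print(f"Between: SS={ss_bg:.2f} df={df_bg} MS={ms_bg:.2f}")
print(f"Within:  SS={ss_wg:.2f} df={df_wg} MS={ms_wg:.2f}")
print(f"F = {f_ratio:.2f}")

# Cross-check with scipy's one-way ANOVA (same F, plus a p-value)
print(stats.f_oneway(*groups))
```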

ANOVA Summary Table. The Between Groups variance contains both real and random variability, while the Within Groups variance contains only random variability (at least, that's what we are assuming). So, in order for an F-ratio to be large, the real group differences have to be large enough for us to see them through the random differences. If no real differences exist, then you are left with random variability divided by random variability, and the F-ratio will be close to 1.

ANOVA Summary Table. The values found in the F-table indicate how much "real" variability must exist between the groups, compared to the random variability, controlling for the number of groups (i.e., df_BG) and the number of people in each group (i.e., df_WG). For our example, with df_BG = 2 and df_WG = 6, the critical value at the .05 level is 5.14.

ANOVA Summary Table. That critical value of 5.14 tells us that any F value of 5.14 or larger is not likely to occur by accident (i.e., it has a .05 or lower probability) given the number of groups and the number of subjects per group. Since our F-value is 5.89, and that is larger than 5.14, we can conclude that a significant difference occurs somewhere between at least 2 of our group means.
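The critical value and the exact tail probability can also be pulled from the F distribution directly. A minimal sketch, assuming alpha = .05 and the df values worked out above (2 and 6), and using the F of 5.89 reported in the slides:

```python
from scipy import stats

alpha, df_bg, df_wg = 0.05, 2, 6

f_crit = stats.f.ppf(1 - alpha, df_bg, df_wg)   # critical F at alpha = .05 -> about 5.14
p_value = stats.f.sf(5.89, df_bg, df_wg)        # probability of an F this large if H0 is true
print(f_crit, p_value)                          # p comes out below .05, matching the decision above
```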