STAT Single-Factor ANOVA


STAT 312
Chapter 10 - Analysis of Variance (ANOVA)
Introduction
10.1 - Single-Factor ANOVA
10.2 - Multiple Comparisons in ANOVA
10.3 - More on Single-Factor ANOVA

Grand Mean (Estimator)
For simplicity, take k = 3 treatment groups, independent, normal, and equivariant: σ1² = σ2² = σ3² = σ². The null hypothesis is H0: μ1 = μ2 = μ3, with (true) grand mean μ. Write each observation as Yij, where i = 1, 2, 3 indexes the group (row) and j indexes the jth value within the group (column). Each row of group samples has a group mean Ȳi, and averaging all of the observations together gives the grand mean estimator Ȳ.

How far is each (true) group mean μi from the (true) grand mean μ? Define the group effects αi = μi − μ. In general, the same definition applies for all k groups, i = 1, …, k. Recall: the sum of deviations of any set of values from its mean is 0, so in the balanced case the αi sum to 0. Moreover, if H0 is true, then every αi = 0!
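The effect decomposition on this slide can be summarized in one line (a restatement of the slide's claims for the balanced case, written out explicitly):

```latex
\mu_i = \mu + \alpha_i \quad (i = 1,\dots,k), \qquad
\sum_{i=1}^{k} \alpha_i = 0, \qquad
H_0 :\ \alpha_1 = \alpha_2 = \cdots = \alpha_k = 0 .
```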

How far is each sample value Yij from its (true) group mean μi? Define the residual εij = Yij − μi; in general this holds for all k groups, with (true) group means μi and (true) grand mean μ. Recall: the sum of deviations of any set of values from its mean is 0, so the estimated residuals (deviations from the sample group mean) sum to 0 within each group.

Combining these pieces gives the ANOVA model for the k groups: Yij = μ + αi + εij, where the εij are the "residuals." The residuals are assumed independent, normally distributed about 0, and equivariant: εij ~ N(0, σ²).
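As a quick numerical illustration of the "deviations sum to zero" fact, here is a plain-Python sketch using the concrete curing-time data that appears later in these slides (the slides themselves use R; this cross-check is added here, not taken from the slides):

```python
# Estimated residuals for the curing-time data (groups A, B, C from the
# slides). Within each group, residuals are deviations from the sample
# group mean, so they must sum to (numerically) zero.
from statistics import mean

groups = {
    "A": [4, 5, 4, 3, 2, 4, 3, 4, 4],
    "B": [6, 8, 4, 5, 4, 6, 5, 8, 6],
    "C": [6, 7, 6, 6, 7, 5, 6, 5, 5],
}

for name, ys in groups.items():
    ybar = mean(ys)                     # group mean (estimates mu_i)
    resid = [y - ybar for y in ys]      # estimated residuals e_ij
    print(name, round(ybar, 4), round(sum(resid), 10))
```

Each printed residual sum is 0 up to floating-point rounding.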

ANOVA Model with indicator variables
For simplicity, take k = 3 treatment groups, independent, normal, equivariant: σ1² = σ2² = σ3², with H0: μ1 = μ2 = μ3. Take group 1 as the "reference group," and let X1 = 1 for observations in group 2 (0 otherwise) and X2 = 1 for observations in group 3 (0 otherwise). X1 and X2 are called indicator or dummy variables.
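With these indicators, the k = 3 ANOVA model can be written as a regression equation. The coefficient identities below follow from standard dummy coding with group 1 as the reference group:

```latex
Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \varepsilon, \qquad
\beta_0 = \mu_1, \quad \beta_1 = \mu_2 - \mu_1, \quad \beta_2 = \mu_3 - \mu_1,
\qquad \varepsilon \sim N(0, \sigma^2).
```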

ANOVA Model in R
For simplicity, take k = 3 treatment groups, independent, normal, equivariant, with H0: μ1 = μ2 = μ3 and Cure A as the "reference group." The curing times (in days) are:

Cure A: 4 5 4 3 2 4 3 4 4
Cure B: 6 8 4 5 4 6 5 8 6
Cure C: 6 7 6 6 7 5 6 5 5

> days = c(4,5,4,3,...,5,6,5,5)   # all 27 values, entered in the order A, B, C
> mean(days[1:9])     # Cure A
3.666667
> mean(days[10:18])   # Cure B
5.777778
> mean(days[19:27])   # Cure C
5.888889
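The slides compute the group means in R; as a cross-check (added here, not part of the slides), the same three means in plain Python using the data listed above:

```python
# Curing-time data from the slide: 9 observations per cure method,
# stored in one vector in the order A, B, C (mirroring the R code).
from statistics import mean

days = [4, 5, 4, 3, 2, 4, 3, 4, 4,   # Cure A
        6, 8, 4, 5, 4, 6, 5, 8, 6,   # Cure B
        6, 7, 6, 6, 7, 5, 6, 5, 5]   # Cure C

print(round(mean(days[0:9]), 6))     # → 3.666667  (Cure A)
print(round(mean(days[9:18]), 6))    # → 5.777778  (Cure B)
print(round(mean(days[18:27]), 6))   # → 5.888889  (Cure C)
```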

ANOVA Model in R (continued)
Attach a factor identifying the cure method, collect both columns in a data frame, and fit the one-way ANOVA:

cure = c(rep("A",9), ..., rep("C",9))
concrete = data.frame(days, cure)

results = aov(days ~ cure, data = concrete)
summary(results)   # gives the full ANOVA table
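The ANOVA table produced by summary(results) is not reproduced in this transcript, but its main entries can be computed by hand. The following Python sketch (added as a cross-check; the slides use R) derives the treatment and error sums of squares and the F statistic from the slide's data:

```python
# One-way ANOVA quantities computed from the definitions:
#   SSTr = sum_i n_i * (Ybar_i - Ybar)^2   (between groups, df = k - 1)
#   SSE  = sum_ij (Y_ij - Ybar_i)^2        (within groups,  df = N - k)
#   F    = MSTr / MSE
from statistics import mean

groups = [[4, 5, 4, 3, 2, 4, 3, 4, 4],    # Cure A
          [6, 8, 4, 5, 4, 6, 5, 8, 6],    # Cure B
          [6, 7, 6, 6, 7, 5, 6, 5, 5]]    # Cure C

all_vals = [y for g in groups for y in g]
grand = mean(all_vals)                     # grand mean
k, N = len(groups), len(all_vals)

ss_trt = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
ss_err = sum((y - mean(g)) ** 2 for g in groups for y in g)

ms_trt = ss_trt / (k - 1)                  # df = 2
ms_err = ss_err / (N - k)                  # df = 24
f_stat = ms_trt / ms_err
print(round(ss_trt, 3), round(ss_err, 3), round(f_stat, 3))
# → 28.222 28.444 11.906
```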

ANOVA Model in R (continued)
With Cure A as the "reference group," the fitted coefficients are:

results$coeff
(Intercept)       cureB       cureC
   3.666667    2.111111    2.222222

The intercept is the Cure A sample mean, while cureB = 5.777778 − 3.666667 = 2.111111 and cureC = 5.888889 − 3.666667 = 2.222222 are the differences of the Cure B and Cure C sample means from it.
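The dummy-coding identities can be verified directly: with A as the reference group, the coefficients are just differences of group means. A plain-Python check (added here as a cross-check of the R output):

```python
# Group means from the slide data; under dummy coding with A as the
# reference group, the fitted coefficients equal (mean_A, B - A, C - A).
from statistics import mean

a = mean([4, 5, 4, 3, 2, 4, 3, 4, 4])   # Cure A mean -> (Intercept)
b = mean([6, 8, 4, 5, 4, 6, 5, 8, 6])   # Cure B mean
c = mean([6, 7, 6, 6, 7, 5, 6, 5, 5])   # Cure C mean
print(round(a, 6), round(b - a, 6), round(c - a, 6))
# → 3.666667 2.111111 2.222222
```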

ANOVA Model = Linear Regression
With the indicator variables X1 and X2 and group A as the "reference group," the k = 3 single-factor ANOVA model (independent, normal, equivariant groups, H0: μ1 = μ2 = μ3) is exactly a linear regression model, and H0 is equivalent to testing that the indicator coefficients are all zero.