G89.2229 Lect 10M: Contrasting coefficients: a review; ANOVA and Regression software; Interactions of categorical predictors; Type I, II, and III sums of squares


1 G89.2229 Lect 10M — Multiple Regression, Week 10 (Monday)
» Contrasting coefficients: a review
» ANOVA and Regression software
» Interactions of categorical predictors
» Type I, II, and III sums of squares

2 G89.2229 Lect 10M — Contrasting Coefficients
Suppose dummy codes are used to represent categories:
» Group k is the reference group.
» B_i − B_j contrasts the means of groups i and j.
The standard error of the contrast can be computed in two different ways:
» By recognizing that groups i and j are independent, and using the usual standard error of a mean difference to compute a t test.
» By using the standard errors of the B estimates and the estimated correlation of the two estimates to compute a general contrast.

3 G89.2229 Lect 10M — Numerical Example
The approach using the standard errors of the regression weights must correct for the shared reference group (the 12-year-olds in the example), which makes the b's correlated.
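The equivalence of the two routes on slides 2–3 can be checked numerically. Below is a minimal sketch with invented group means, n's, and seed (not the lecture's 12-year-old data): the contrast SE built from the coefficient variances and their covariance matches the usual pooled SE of a mean difference.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three groups; the first is the reference category (hypothetical data)
y = np.concatenate([rng.normal(10, 2, 8),    # reference group, n = 8
                    rng.normal(12, 2, 10),   # group i, n = 10
                    rng.normal(15, 2, 12)])  # group j, n = 12
g = np.repeat([0, 1, 2], [8, 10, 12])

# Dummy-coded design matrix: intercept, D_i, D_j
X = np.column_stack([np.ones(y.size),
                     (g == 1).astype(float),
                     (g == 2).astype(float)])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s2 = resid @ resid / (y.size - X.shape[1])   # error variance (MSE)
cov_b = s2 * np.linalg.inv(X.T @ X)          # covariance of the estimates

# Route 1: SE of B_i - B_j from the coefficient covariance matrix,
# correcting for the covariance induced by the shared reference group
se_contrast = np.sqrt(cov_b[1, 1] + cov_b[2, 2] - 2 * cov_b[1, 2])

# Route 2: usual SE for the difference of two independent group means
se_pooled = np.sqrt(s2 * (1 / 10 + 1 / 12))
```

Because Var(B_i) = s²(1/n_ref + 1/n_i) and Cov(B_i, B_j) = s²/n_ref under dummy coding, the reference-group terms cancel and the two routes agree exactly.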

4 G89.2229 Lect 10M — Two Crossed Categorical Independent Variables
Suppose subjects can be classified into one of six categories according to a 2x3 crossed design:

            Factor B
Factor A    c  d  e
            f  g  h

A main-effects ANOVA model attempts to represent the six means with four degrees of freedom: a grand mean, one effect for Factor A, and two effects for Factor B. Main effects imply that the differences between levels of Factor B are consistent across both levels of Factor A.

5 G89.2229 Lect 10M — Main Effects and Interactions
[Figures: plots of the six cell means c–h. One panel shows a main-effect pattern (parallel profiles across Factor B); the other panels show interaction patterns (non-parallel profiles).]

6 G89.2229 Lect 10M — Six Cells with Dummy Variables
Note that the products of the dummy variables allow cells c and d to be fit exactly. This flexibility allows any pattern of six means to be fit perfectly.
» The effect of A can be moderated within a level of B.
In general, if Factor A has J levels and Factor B has K levels, the model will have (J−1)(K−1) interaction terms.
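The claim that dummies plus their products fit any pattern of six means can be verified directly. This sketch uses invented cell means and n's; with all (J−1)(K−1) = 2 interaction terms included, the model is saturated and reproduces every observed cell mean.

```python
import numpy as np

rng = np.random.default_rng(1)
# Any six cell means for the 2x3 design (cells c..h); values are invented
cell_means = {(0, 0): 3.0, (0, 1): 5.0, (0, 2): 4.0,
              (1, 0): 6.0, (1, 1): 2.0, (1, 2): 7.0}
labels, y = [], []
for (ai, bi), mu in cell_means.items():
    for _ in range(4):                      # 4 observations per cell
        labels.append((ai, bi))
        y.append(mu + rng.normal(0, 0.5))
y = np.array(y)
A  = np.array([ai for ai, bi in labels], float)
B1 = np.array([float(bi == 1) for ai, bi in labels])
B2 = np.array([float(bi == 2) for ai, bi in labels])

# Dummies plus their products: (2-1)(3-1) = 2 interaction columns
X = np.column_stack([np.ones(y.size), A, B1, B2, A * B1, A * B2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# The saturated model reproduces every observed cell mean exactly
for (ai, bi) in cell_means:
    x = np.array([1.0, ai, float(bi == 1), float(bi == 2),
                  ai * float(bi == 1), ai * float(bi == 2)])
    obs = y[[lab == (ai, bi) for lab in labels]].mean()
    assert np.isclose(x @ b, obs)
```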

7 G89.2229 Lect 10M — Weighted and Unweighted Means
When the cell n's differ, the marginal means are confounded with the cell means. Example: depression among PR adolescents (age group by gender).
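The confounding is easiest to see in a small hypothetical 2x2 table (these cell means and n's are invented, not the PR adolescent data): the weighted marginal means are pulled toward the cells with the larger n's, while the unweighted marginal means average the cell means as if the design were balanced.

```python
import numpy as np

# Hypothetical cell means and unequal n's; rows and columns are the two factors
means = np.array([[10.0, 14.0],
                  [12.0, 20.0]])
ns    = np.array([[40, 10],
                  [10, 40]])

# Unweighted marginal means: simple average of each row's cell means
unweighted = means.mean(axis=1)              # [12., 16.]

# Weighted marginal means: cells weighted by their n's
weighted = (means * ns).sum(axis=1) / ns.sum(axis=1)   # [10.8, 18.4]
```

The row-1 weighted mean (10.8) sits close to the n = 40 cell mean of 10, not the midpoint 12, so an apparent marginal difference can reflect the cell-size pattern rather than the factor itself.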

8 G89.2229 Lect 10M — Type I Sums of Squares
Suppose we have factors A and B and their interaction A*B. When the cell n's are representative of a population, a hierarchical regression approach is appropriate:
» Sets for A, B, and A*B are entered sequentially.
» The first set is entered ignoring the others.
» The second set is adjusted for the first, but ignores the later sets.
» The last set is adjusted for all sets before it.
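The hierarchical entry above can be sketched with ordinary least squares on an unbalanced design (cell n's and effects invented for illustration). Each Type I SS is the increase in model SS when its set is entered, so the pieces always partition the full-model SS.

```python
import numpy as np

def model_ss(X, y):
    """Sum of squares explained by the model with design matrix X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((X @ b - y.mean()) ** 2).sum())

rng = np.random.default_rng(2)
# Unbalanced 2x2 layout: cell n's of 20, 5, 5, 20 (invented)
a  = np.concatenate([np.zeros(25), np.ones(25)])
bf = np.concatenate([np.zeros(20), np.ones(5), np.zeros(5), np.ones(20)])
y  = 2 * a + 3 * bf + rng.normal(0, 1, 50)

one   = np.ones(50)
XA    = np.column_stack([one, a])
XAB   = np.column_stack([one, a, bf])
Xfull = np.column_stack([one, a, bf, a * bf])

ss_A   = model_ss(XA, y)                        # A first, ignoring B and A*B
ss_B   = model_ss(XAB, y) - ss_A                # B adjusted for A only
ss_AxB = model_ss(Xfull, y) - model_ss(XAB, y)  # A*B adjusted for A and B
```

Because the entry is sequential, ss_A + ss_B + ss_AxB equals the full-model SS exactly, which is the defining property of Type I decompositions.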

9 G89.2229 Lect 10M — Type II Sums of Squares
When the cell n's are representative of a population, and when there is a strong belief that interactions are unimportant, Type II sums of squares may be appropriate:
» All sets, A, B, and A*B, are considered.
» A is adjusted for B, but not for A*B.
» B is adjusted for A, but not for A*B.
» A*B is adjusted for both A and B.
Type II SS are not much used in practice.
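The Type II rule differs from Type I only in what each main effect is adjusted for. A minimal numpy sketch on the same kind of unbalanced layout (data invented): each main effect's SS is the drop in model SS when that factor is removed from the main-effects-only model.

```python
import numpy as np

def model_ss(X, y):
    """Sum of squares explained by the model with design matrix X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((X @ b - y.mean()) ** 2).sum())

rng = np.random.default_rng(3)
# Unbalanced 2x2 layout (invented cell n's of 20, 5, 5, 20)
a  = np.concatenate([np.zeros(25), np.ones(25)])
bf = np.concatenate([np.zeros(20), np.ones(5), np.zeros(5), np.ones(20)])
y  = 2 * a + 3 * bf + rng.normal(0, 1, 50)

one   = np.ones(50)
XA    = np.column_stack([one, a])
XB    = np.column_stack([one, bf])
XAB   = np.column_stack([one, a, bf])
Xfull = np.column_stack([one, a, bf, a * bf])

ss_A_II  = model_ss(XAB, y) - model_ss(XB, y)      # A adjusted for B, not A*B
ss_B_II  = model_ss(XAB, y) - model_ss(XA, y)      # B adjusted for A, not A*B
ss_AxB   = model_ss(Xfull, y) - model_ss(XAB, y)   # A*B adjusted for A and B
```

Unlike Type I, these pieces need not sum to the full-model SS in an unbalanced design; each is a separate model comparison.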

10 G89.2229 Lect 10M — Type III Sums of Squares
When the cell means are constructed by design, and the cell n's are not representative, Type III SS are appropriate:
» Conceptually, Type III SS contrast the marginal means in the unweighted-means table.
» In practice, this is accomplished by fitting a specific regression model:
  - Unweighted effect codes are used.
  - A is adjusted for B and A*B.
  - B is adjusted for A and A*B.
  - A*B is adjusted for A and B.
Type III is the default in most ANOVA programs.
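The "unweighted effect codes" step can be sketched directly (invented unbalanced data again): each Type III SS is the drop in model SS when one effect-coded term is removed from the otherwise-full model. A useful check that the coding targets the unweighted table is that the intercept of the full effect-coded model equals the unweighted grand mean of the cell means.

```python
import numpy as np

def model_ss(X, y):
    """Sum of squares explained by the model with design matrix X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((X @ b - y.mean()) ** 2).sum())

rng = np.random.default_rng(4)
# Unbalanced 2x2 layout (invented cell n's of 20, 5, 5, 20)
a  = np.concatenate([np.zeros(25), np.ones(25)])
bf = np.concatenate([np.zeros(20), np.ones(5), np.zeros(5), np.ones(20)])
y  = 2 * a + 3 * bf + rng.normal(0, 1, 50)

# Unweighted effect codes (-1 / +1), as the slide specifies
ea, eb = 2 * a - 1, 2 * bf - 1
one   = np.ones(50)
Xfull = np.column_stack([one, ea, eb, ea * eb])

# Each Type III SS drops one term from the otherwise-full model
ss_A_III   = model_ss(Xfull, y) - model_ss(np.column_stack([one, eb, ea * eb]), y)
ss_B_III   = model_ss(Xfull, y) - model_ss(np.column_stack([one, ea, ea * eb]), y)
ss_AxB_III = model_ss(Xfull, y) - model_ss(np.column_stack([one, ea, eb]), y)

# With unweighted effect codes, the intercept is the unweighted grand mean
b_full, *_ = np.linalg.lstsq(Xfull, y, rcond=None)
cell_means = [y[(a == i) & (bf == j)].mean() for i in (0, 1) for j in (0, 1)]
assert np.isclose(b_full[0], np.mean(cell_means))
```

With balanced cell n's the effect-coded columns become orthogonal and Types I, II, and III all coincide; the distinctions above only matter when the design is unbalanced.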