Multiple Regression II
KNNL – Chapter 7

Extra Sums of Squares
- For a given dataset, the total sum of squares (SSTO) remains the same no matter which predictors are included (provided no observations are lost to missing values).
- As we include more predictors, the regression sum of squares (SSR) never decreases and the error sum of squares (SSE) never increases.
- SSR + SSE = SSTO, regardless of which predictors are in the model.
- When a model contains just X1, denote the sums of squares SSR(X1) and SSE(X1); for a model containing X1 and X2, denote them SSR(X1,X2) and SSE(X1,X2).
- The predictive contribution of X2 above that of X1 is the extra sum of squares SSR(X2|X1) = SSE(X1) − SSE(X1,X2) = SSR(X1,X2) − SSR(X1).
- This extends to any number of predictors.
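
As a concrete illustration, here is a minimal numpy sketch on simulated data (the names sse, x1, x2 and all numeric values are illustrative assumptions, not from the slides) showing that SSR(X2|X1) is the drop in SSE when X2 joins a model already containing X1, and that the two definitions above agree:

```python
import numpy as np

def sse(X, y):
    """Error sum of squares from an OLS fit of y on X (intercept included)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2 + 1.5 * x1 + 0.8 * x2 + rng.normal(size=n)

sse_x1 = sse(x1.reshape(-1, 1), y)             # SSE(X1)
sse_x1x2 = sse(np.column_stack([x1, x2]), y)   # SSE(X1, X2)
ssr_x2_given_x1 = sse_x1 - sse_x1x2            # SSR(X2 | X1)

# The same quantity as a difference of regression sums of squares:
ssto = ((y - y.mean()) ** 2).sum()
ssr_x1, ssr_x1x2 = ssto - sse_x1, ssto - sse_x1x2
assert np.isclose(ssr_x2_given_x1, ssr_x1x2 - ssr_x1)
print(ssr_x2_given_x1)
```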

Definitions and Decomposition of SSR
- Note that as the number of predictors increases, so does the number of ways of decomposing SSR.
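
For instance, with two predictors the same SSR(X1, X2) splits in either order, consistent with the definitions above:

\[
\begin{aligned}
SSR(X_1, X_2) &= SSR(X_1) + SSR(X_2 \mid X_1) = SSR(X_2) + SSR(X_1 \mid X_2),\\
SSTO &= SSR(X_1) + SSR(X_2 \mid X_1) + SSE(X_1, X_2).
\end{aligned}
\]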

ANOVA – Sequential Sum of Squares
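
The sequential decomposition such a table is built from, for three predictors entered in the order X1, X2, X3, is:

\[
SSR(X_1, X_2, X_3) = SSR(X_1) + SSR(X_2 \mid X_1) + SSR(X_3 \mid X_1, X_2),
\]

with each extra sum of squares carrying one degree of freedom when predictors are added one at a time, so the rows of the table sum back to SSR.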

Extra Sums of Squares & Tests of Regression Coefficients (Single bk)
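
In the standard KNNL form, the partial F-test for a single coefficient in a model with p parameters is:

\[
H_0: \beta_k = 0 \ \text{ vs. } \ H_a: \beta_k \neq 0, \qquad
F^* = \frac{SSR(X_k \mid \text{all other } X)/1}{MSE(\text{full model})} \sim F(1,\, n-p) \ \text{under } H_0,
\]

and \(F^* = (t^*)^2\) with \(t^* = b_k / s\{b_k\}\), so this test is equivalent to the usual t-test for \(\beta_k\).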

Extra Sums of Squares & Tests of Regression Coefficients (Multiple bk)
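
Similarly, to test that several coefficients are simultaneously zero, say \(H_0: \beta_q = \beta_{q+1} = \cdots = \beta_{p-1} = 0\), the standard form is:

\[
F^* = \frac{SSR(X_q, \ldots, X_{p-1} \mid X_1, \ldots, X_{q-1}) / (p - q)}{MSE(\text{full model})} \sim F(p - q,\, n - p) \ \text{under } H_0.
\]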

Extra Sums of Squares & Tests of Regression Coefficients (General Case)
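
Both tests above are special cases of the general linear test, which compares a full model F to a reduced model R obtained by imposing the null hypothesis:

\[
F^* = \frac{\left[SSE(R) - SSE(F)\right] / (df_R - df_F)}{SSE(F)/df_F} \sim F(df_R - df_F,\, df_F) \ \text{under } H_0.
\]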

Other Linear Tests
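
Examples of the kind typically covered here: any linear constraint on the coefficients can be tested by fitting the corresponding reduced model and applying the general linear test, e.g.

\[
H_0: \beta_1 = \beta_2 \ \Rightarrow\ \text{reduced model } E\{Y_i\} = \beta_0 + \beta_c (X_{i1} + X_{i2}); \qquad
H_0: \beta_1 = 3 \ \Rightarrow\ \text{reduced model } E\{Y_i\} = \beta_0 + 3 X_{i1} + \beta_2 X_{i2},
\]

where the second reduced model is fit by regressing \(Y_i - 3X_{i1}\) on \(X_{i2}\).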

Coefficients of Partial Determination-I
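
In the standard notation, the coefficient of partial determination between Y and X2, given X1, is:

\[
R^2_{Y2 \mid 1} = \frac{SSR(X_2 \mid X_1)}{SSE(X_1)} = 1 - \frac{SSE(X_1, X_2)}{SSE(X_1)},
\]

the proportion of the variation in Y left unexplained by X1 that is removed when X2 is added to the model.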

Coefficients of Partial Determination-II
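
The same idea extends to any predictor given any set of others:

\[
R^2_{Yk \mid \text{others}} = \frac{SSR(X_k \mid \text{all other } X)}{SSE(\text{all other } X)},
\]

and the coefficient of partial correlation is the signed square root, taking the sign of the corresponding regression coefficient.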

Standardized Regression Model - I
- Useful for reducing round-off errors in computing (X'X)^(-1).
- Makes it easier to compare the magnitudes of effects of predictors measured on different scales.
- Coefficients represent the change in Y (in standard-deviation units) as each predictor increases by 1 SD, holding all others constant.
- Since all variables are centered, there is no intercept term.
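
A minimal numpy sketch of this idea on simulated data (all names and values here are illustrative assumptions, not from the slides): the transformed fit has no intercept, and its coefficients are directly comparable across predictors even when the original scales differ wildly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
X = rng.normal(size=(n, 2)) * [1.0, 100.0]   # predictors on very different scales
y = 3 + 0.5 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(size=n)

def corr_transform(v):
    """Correlation transformation: center, then scale by sqrt(n-1) * SD."""
    return (v - v.mean(axis=0)) / (np.sqrt(len(v) - 1) * v.std(axis=0, ddof=1))

Xs, ys = corr_transform(X), corr_transform(y)

# No intercept column: the transformed variables are centered.
b_std, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(b_std)    # standardized coefficients, comparable across predictors

# Back-transform to the original units: b_k = (s_Y / s_k) * b*_k
b_orig = b_std * y.std(ddof=1) / X.std(axis=0, ddof=1)
print(b_orig)
```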

Standardized Regression Model - II
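
In KNNL's notation, the correlation transformation underlying the standardized model is:

\[
Y_i' = \frac{1}{\sqrt{n-1}} \left( \frac{Y_i - \bar{Y}}{s_Y} \right), \qquad
X_{ik}' = \frac{1}{\sqrt{n-1}} \left( \frac{X_{ik} - \bar{X}_k}{s_k} \right),
\]

after which \(X'^{\mathsf T} X'\) is the correlation matrix of the predictors, \(r_{XX}\), and \(X'^{\mathsf T} Y'\) is the vector of correlations between Y and the predictors, \(r_{YX}\).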

Standardized Regression Model - III
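
To return to the original units, the standardized coefficients \(b_k^*\) back-transform as:

\[
b_k = \left( \frac{s_Y}{s_k} \right) b_k^*, \quad k = 1, \ldots, p-1, \qquad
b_0 = \bar{Y} - b_1 \bar{X}_1 - \cdots - b_{p-1} \bar{X}_{p-1}.
\]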

Multicollinearity
- Consider a model with 2 predictors (this generalizes to any number of predictors): Yi = β0 + β1Xi1 + β2Xi2 + εi.
- When X1 and X2 are uncorrelated, the regression coefficients b1 and b2 are the same whether we fit simple regressions or a multiple regression, and SSR(X1) = SSR(X1|X2) and SSR(X2) = SSR(X2|X1).
- When X1 and X2 are highly correlated, their estimated regression coefficients become unstable and their standard errors become larger (smaller t-statistics, wider confidence intervals), which can lead to strange conclusions when the simple and partial effects of each predictor are compared.
- Fitted mean responses and predicted values are not affected.
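
A simulation sketch of the instability described above (fit_ols, rho, and all data here are our own illustrative assumptions): as the correlation between X1 and X2 approaches 1, the coefficient standard errors inflate even though each true coefficient is the same in both scenarios.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100

def fit_ols(X, y):
    """OLS with intercept; returns coefficients and their standard errors."""
    Xd = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ b
    mse = resid @ resid / (len(y) - Xd.shape[1])
    se = np.sqrt(mse * np.diag(np.linalg.inv(Xd.T @ Xd)))
    return b, se

for rho in (0.0, 0.99):
    # Construct two predictors with correlation rho.
    z1, z2 = rng.normal(size=n), rng.normal(size=n)
    x1 = z1
    x2 = rho * z1 + np.sqrt(1 - rho**2) * z2
    y = 2 + x1 + x2 + rng.normal(size=n)
    b, se = fit_ols(np.column_stack([x1, x2]), y)
    print(f"rho={rho}: b1={b[1]:.2f} (SE {se[1]:.2f}), "
          f"b2={b[2]:.2f} (SE {se[2]:.2f})")
```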