More Multiple Regression


Chapter 15, continued

III. Adjusted R² In simple (single-variable) regression, we assess goodness of fit with R², the coefficient of determination: R² = SSR/SST. This value is interpreted as the proportion of the variability in y that is explained by the estimated regression equation.
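As a minimal sketch (the observed and fitted values below are made up for illustration, not the textbook's data), R² can be computed directly from the two sums of squares:

```python
# Sketch: R^2 = SSR/SST for a fitted regression.
# Observed and fitted values are hypothetical.
y = [2.0, 4.0, 5.0, 4.0, 5.0]        # observed values of the dependent variable
y_hat = [2.8, 3.4, 4.0, 4.6, 5.2]    # fitted values from some estimated equation
y_bar = sum(y) / len(y)

sst = sum((yi - y_bar) ** 2 for yi in y)        # total sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)    # sum of squares due to regression
r_squared = ssr / sst
print(round(r_squared, 3))  # -> 0.6: 60% of the variability in y is explained
```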

A. Inclusion of more variables An unfortunate property of adding independent variables to our regression is that R² will increase (it can never decrease), even if we are adding insignificant variables. For example, if we had added x2 = color of the car to our repair-cost regression, R² would have increased marginally, despite the ridiculous idea that the color of a car should influence its repair cost.

B. Adjustment To compensate for variables added merely to inflate R², we adjust for the number of independent variables in the model. With n denoting the number of observations in the sample and p the number of independent variables included in the model, the adjusted coefficient of determination is Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1).
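A small sketch of this adjustment (the R² values and sample size are hypothetical) shows how a marginal increase in R² from a useless variable can still lower the adjusted value:

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - p - 1).
    n = number of observations, p = number of independent variables."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical fits: adding a second, insignificant variable nudges R^2
# up from .40 to .41, yet the adjusted value goes down.
print(round(adjusted_r2(0.40, 60, 1), 4))  # one predictor  -> 0.3897
print(round(adjusted_r2(0.41, 60, 2), 4))  # two predictors -> 0.3893
```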

C. An Example Y is the number of hours of television watched in a week. X1 is the amount of alcohol consumed in a typical week. Can you interpret these estimated coefficients and test their significance? Can you correctly evaluate the fit of the equation?

Include one more variable Now I'll add X2 = age of the student, which I don't believe affects television viewing, but am adding to make a point. If you looked simply at R², you would conclude that the goodness of fit improved slightly. However, looking at Ra² you can see that adding this insignificant variable actually decreased the fit. Alcohol is still significant and positive, but Age is insignificant.

IV. Model Assumptions These assumptions are modified from Chapter 14 to accommodate the inclusion of multiple independent variables. The error term ε is a normally distributed random variable with mean zero. The variance of ε is constant for all values of x1, x2, …, xp. The errors ε are independent: no error term is influenced by any other error term.

V. Testing for Significance Now that we have multiple independent variables, we can conduct a true F-test of overall significance. H0: β1 = β2 = … = βp = 0. Ha: One or more of the parameters is not equal to zero.

A. The F-test As described in Chapter 14, the test statistic is calculated by F = MSR/MSE, where: MSR = SSR/p (p is the number of independent variables) and MSE = SSE/(n − p − 1).
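Under those same definitions, the statistic is easy to sketch (the ANOVA sums of squares below are hypothetical, not from the example):

```python
def f_statistic(ssr, sse, n, p):
    """Overall F-test statistic: F = MSR/MSE."""
    msr = ssr / p              # mean square due to regression, SSR/p
    mse = sse / (n - p - 1)    # mean square error, SSE/(n - p - 1)
    return msr / mse

# Hypothetical ANOVA quantities for n = 60 observations, p = 2 variables:
print(round(f_statistic(ssr=100.0, sse=500.0, n=60, p=2), 4))  # -> 5.7
```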

B. Rejection Rule The critical F is based on an F distribution with p degrees of freedom in the numerator and (n − p − 1) degrees of freedom in the denominator. So I'll test the overall significance of my television-watching model.

C. The Example I have a sample of n = 60 and p = 2 independent variables, so I have d.f. = 2 in the numerator and d.f. = 57 in the denominator. At the .05 level of significance, my critical F is approximately 3.15. If my test F is greater than 3.15, I reject the null and conclude that at least one of my coefficients is NOT zero and my model has overall significance.
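Assuming scipy is available, that critical value can be checked directly rather than read from a table:

```python
from scipy.stats import f

# Critical value of F(2, 57) at the .05 significance level.
f_crit = f.ppf(0.95, dfn=2, dfd=57)
print(round(f_crit, 2))  # close to the ~3.15 table value quoted above
```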

D. Excel Output My test statistic is greater than 3.15, so I reject H0. You can see that the p-value is less than α = .05, which also indicates rejecting H0. However, it is not less than α = .01. Thus my model is significant at the .05 level, but not at the .01 level.

E. T-Tests A t-test of a coefficient's statistical significance is done the same way as in Chapter 14. If |t| > tα/2, reject the null that β = 0 for that coefficient. Reproducing my Excel output reveals that the coefficient on Age is insignificant: you can't reject the null that that coefficient is zero. You CAN reject the null for the Alcohol coefficient; it is statistically significant.
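A sketch of that decision rule (the coefficients and standard errors below are hypothetical stand-ins for the Excel output, and scipy is again assumed available):

```python
from scipy.stats import t

def coef_significant(b, se_b, n, p, alpha=0.05):
    """Two-tailed t-test of H0: beta = 0 for a single coefficient.
    b = estimated coefficient, se_b = its standard error."""
    t_stat = b / se_b
    t_crit = t.ppf(1 - alpha / 2, df=n - p - 1)  # critical t at alpha/2
    return abs(t_stat) > t_crit

# Hypothetical numbers in the spirit of the example (n = 60, p = 2):
print(coef_significant(0.9, 0.2, n=60, p=2))    # Alcohol-like: |t| = 4.5, significant
print(coef_significant(0.05, 0.08, n=60, p=2))  # Age-like: |t| = 0.625, insignificant
```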