Multiple Regression and Model Building

Presentation transcript:

Chapter 14: Multiple Regression and Model Building

Multiple Regression and Model Building
14.1 The Multiple Regression Model and the Least Squares Point Estimate
14.2 Model Assumptions and the Standard Error
14.3 R² and Adjusted R²
14.4 The Overall F Test
14.5 Testing the Significance of an Independent Variable
14.6 Confidence and Prediction Intervals

Multiple Regression and Model Building, Continued
14.7 The Sales Territory Performance Case: Evaluating Employee Performance
14.8 Using Dummy Variables to Model Qualitative Independent Variables
14.9 Using Squared and Interaction Variables
14.10 Model Building and the Effects of Multicollinearity
14.11 Residual Analysis in Multiple Regression
14.12 Logistic Regression

14.1 The Multiple Regression Model and the Least Squares Point Estimate
LO 14-1: Explain the multiple regression model and the related least squares point estimates.
Simple linear regression used one independent variable to explain the dependent variable.
Some relationships are too complex to be described using a single independent variable.
Multiple regression uses two or more independent variables to describe the dependent variable.
This allows multiple regression models to handle more complex situations.
There is no limit to the number of independent variables a model can use.
Multiple regression has only one dependent variable.
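The slides contain no code, but a minimal sketch may make the least squares point estimates concrete. It assumes Python with NumPy (not part of the original deck) and uses a small hypothetical data set with k = 2 independent variables.

```python
# Minimal sketch (not from the slides): least squares point estimates
# b0, b1, b2 for y = b0 + b1*x1 + b2*x2, using hypothetical data.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])    # hypothetical independent variable 1
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])    # hypothetical independent variable 2
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])  # hypothetical dependent variable

# Design matrix with a leading column of 1s for the intercept b0
X = np.column_stack([np.ones_like(x1), x1, x2])

# The least squares point estimates minimize the sum of squared residuals
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("b0, b1, b2 =", b)
```

The same hypothetical data and fitted values are reused in the sketches that follow.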

14.2 Model Assumptions and the Standard Error
LO 14-2: Explain the assumptions behind multiple regression and calculate the standard error.
The model is y = β0 + β1x1 + β2x2 + … + βkxk + ε.
The assumptions for multiple regression are stated about the model error terms, the ε's: they are assumed to have mean zero and constant variance σ², to be normally distributed, and to be independent of one another.
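As an illustration only, the standard error s can be computed from the residuals of the fit above; the data are the same hypothetical values, and NumPy is assumed.

```python
# Sketch: the standard error s estimates the standard deviation of the
# error terms, with n - (k + 1) degrees of freedom. Hypothetical data.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])

b, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ b
n, k = len(y), 2

SSE = np.sum(residuals ** 2)       # unexplained variation (sum of squared errors)
mse = SSE / (n - (k + 1))          # mean square error
s = np.sqrt(mse)                   # standard error
print("s =", s)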

14.3 R² and Adjusted R²
LO 14-3: Calculate and interpret the multiple and adjusted multiple coefficients of determination.
Total variation is given by the formula Σ(yi − ȳ)².
Explained variation is given by the formula Σ(ŷi − ȳ)².
Unexplained variation is given by the formula Σ(yi − ŷi)².
Total variation is the sum of explained and unexplained variation.
The multiple coefficient of determination is R² = explained variation / total variation; the adjusted R² = 1 − (1 − R²)(n − 1)/(n − (k + 1)) adjusts R² for the number of independent variables used.
This section can be read anytime after reading Section 14.1.
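A short sketch of these quantities, again using the hypothetical data and NumPy (both assumptions, not from the slides):

```python
# Sketch: total, explained, and unexplained variation, plus R^2 and
# adjusted R^2, for the hypothetical data used above.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b
n, k = len(y), 2

total       = np.sum((y - y.mean()) ** 2)       # sum of (yi - ybar)^2
explained   = np.sum((y_hat - y.mean()) ** 2)   # sum of (yhat_i - ybar)^2
unexplained = np.sum((y - y_hat) ** 2)          # sum of (yi - yhat_i)^2

r2 = explained / total
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - (k + 1))
print("R^2 =", r2, " adjusted R^2 =", adj_r2)
```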

14.4 The Overall F Test
LO 14-4: Test the significance of a multiple regression model by using an F test.
We test H0: β1 = β2 = … = βk = 0 versus Ha: at least one of β1, β2, …, βk ≠ 0.
Reject H0 in favor of Ha if F(model) > F* or if the p-value < α.
F* is based on k numerator and n − (k + 1) denominator degrees of freedom.
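A minimal sketch of the overall F test on the same hypothetical data, assuming SciPy is available for the F distribution (an assumption, not part of the slides):

```python
# Sketch: F(model) compares explained to unexplained variation; the p-value
# comes from the F distribution with k and n - (k + 1) degrees of freedom.
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b
n, k = len(y), 2

explained   = np.sum((y_hat - y.mean()) ** 2)
unexplained = np.sum((y - y_hat) ** 2)

F = (explained / k) / (unexplained / (n - (k + 1)))
p_value = stats.f.sf(F, k, n - (k + 1))   # upper-tail probability
print("F(model) =", F, " p-value =", p_value)
```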

14.5 Testing the Significance of an Independent Variable
LO 14-5: Test the significance of a single independent variable.
A variable in a multiple regression model is not likely to be useful unless there is a significant relationship between it and y.
To test significance, we use the null hypothesis H0: βj = 0 versus the alternative hypothesis Ha: βj ≠ 0.
The test statistic is t = bj/sbj, which is based on n − (k + 1) degrees of freedom.
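A sketch of the individual t tests for the same hypothetical fit, again assuming NumPy and SciPy. The standard error of bj is s times the square root of the j-th diagonal element of (X'X)⁻¹.

```python
# Sketch: t test of H0: beta_j = 0 for each coefficient, hypothetical data.
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
n, k = len(y), 2

s = np.sqrt(np.sum((y - X @ b) ** 2) / (n - (k + 1)))   # standard error
se_b = s * np.sqrt(np.diag(np.linalg.inv(X.T @ X)))     # std errors of b0..bk
t = b / se_b
p = 2 * stats.t.sf(np.abs(t), n - (k + 1))              # two-sided p-values
print("t statistics:", t)
print("p-values:    ", p)
```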

14.6 Confidence and Prediction Intervals
LO 14-6: Find and interpret a confidence interval for a mean value and a prediction interval for an individual value.
The point estimate given by the regression equation at a particular set of values x1, x2, …, xk of the independent variables is ŷ = b0 + b1x1 + b2x2 + … + bkxk.
It is unlikely that this value will exactly equal the mean value of y for these x values, so we need to place bounds on how far away the predicted value might be.
We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y.
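A sketch of both intervals at a hypothetical new point (x1, x2) = (3.5, 3.5), using the standard formulas ŷ ± t·s·√(distance value) for the mean and ŷ ± t·s·√(1 + distance value) for an individual value; NumPy and SciPy are assumed.

```python
# Sketch: 95% confidence interval for mean y and 95% prediction interval
# for an individual y at a new point, using the hypothetical data above.
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
n, k = len(y), 2
s = np.sqrt(np.sum((y - X @ b) ** 2) / (n - (k + 1)))

x0 = np.array([1.0, 3.5, 3.5])               # leading 1 for the intercept
y_hat0 = x0 @ b                              # point prediction
dist = x0 @ np.linalg.inv(X.T @ X) @ x0      # "distance value"
t_crit = stats.t.ppf(0.975, n - (k + 1))     # 95% two-sided critical value

ci = (y_hat0 - t_crit * s * np.sqrt(dist),     y_hat0 + t_crit * s * np.sqrt(dist))
pi = (y_hat0 - t_crit * s * np.sqrt(1 + dist), y_hat0 + t_crit * s * np.sqrt(1 + dist))
print("95% CI for mean y:      ", ci)
print("95% PI for individual y:", pi)
```

The prediction interval is always wider than the confidence interval, because it accounts for the variation of an individual observation around its mean as well as the uncertainty in estimating that mean.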

14.8 Using Dummy Variables to Model Qualitative Independent Variables
LO 14-7: Use dummy variables to model qualitative independent variables.
So far, we have only looked at including quantitative data in a regression model.
However, we may wish to include descriptive qualitative data as well; for example, we might want to include the gender of respondents.
We can model the effects of different levels of a qualitative variable by using what are called dummy variables, also known as indicator variables.
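A minimal sketch of coding a two-level qualitative variable (a hypothetical "gender" column, not data from the slides) as a 0/1 dummy variable before adding it to the design matrix:

```python
# Sketch: 0/1 dummy coding of a qualitative variable; all data hypothetical.
import numpy as np

gender = np.array(["F", "M", "F", "M", "F", "M"])   # hypothetical qualitative data
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])       # a quantitative variable

d = (gender == "M").astype(float)                   # dummy: 1 for "M", 0 for "F"
X = np.column_stack([np.ones_like(x1), x1, d])
print(X)

# The coefficient on d estimates the shift in the mean of y for the "M"
# level relative to the baseline "F" level, holding x1 fixed.
# In general, a qualitative variable with m levels needs m - 1 dummy variables.
```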

14.9 Using Squared and Interaction Variables
LO 14-8: Use squared and interaction variables.
The quadratic regression model is y = β0 + β1x + β2x² + ε, where β0 + β1x + β2x² is μy, the mean value of y.
β0, β1, and β2 are the regression parameters and ε is an error term.
An interaction variable such as x1x2 can be added when the relationship between y and one independent variable depends on the value of another.
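A sketch of building the squared and interaction columns by hand, using the same hypothetical variables and NumPy (both assumptions):

```python
# Sketch: design matrices with a squared term and with an interaction term.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])

# Quadratic model: y = b0 + b1*x1 + b2*x1^2
X_quadratic = np.column_stack([np.ones_like(x1), x1, x1 ** 2])

# Model with an x1*x2 interaction variable added to the two linear terms
X_interaction = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])

print(X_quadratic)
print(X_interaction)
```

Either matrix can then be passed to the same least squares fit shown earlier; only the columns of the design matrix change.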

14.10 Model Building and the Effects of Multicollinearity
LO 14-9: Describe multicollinearity and build a multiple regression model.
Multicollinearity exists when the "independent" variables are related to one another.
It is considered severe when a simple correlation between two independent variables exceeds 0.9, and even moderate multicollinearity can be a problem.
Another measure is the variance inflation factor (VIF): multicollinearity is a severe problem when VIF > 10 and a moderate problem when VIF > 5.
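A sketch of computing variance inflation factors from first principles, VIFj = 1/(1 − Rj²), where Rj² comes from regressing xj on the other independent variables; the data are hypothetical and deliberately correlated, and NumPy is assumed.

```python
# Sketch: variance inflation factors computed directly from the definition.
import numpy as np

def vif(X_vars):
    """X_vars: columns are the independent variables (no intercept column)."""
    n, k = X_vars.shape
    out = []
    for j in range(k):
        xj = X_vars[:, j]
        others = np.delete(X_vars, j, axis=1)
        Z = np.column_stack([np.ones(n), others])        # intercept + other x's
        bj, *_ = np.linalg.lstsq(Z, xj, rcond=None)
        r2_j = 1 - np.sum((xj - Z @ bj) ** 2) / np.sum((xj - xj.mean()) ** 2)
        out.append(1.0 / (1.0 - r2_j))
    return np.array(out)

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.1, 1.9, 4.2, 3.8, 6.1, 5.9])   # strongly related to x1 on purpose
print(vif(np.column_stack([x1, x2])))            # values above 10 signal a severe problem
```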

14.11 Residual Analysis in Multiple Regression
LO 14-10: Use residual analysis to check the assumptions of multiple regression.
For an observed value yi, the residual is ei = yi − ŷi = yi − (b0 + b1xi1 + … + bkxik).
If the regression assumptions hold, the residuals should look like a random sample from a normal distribution with mean 0 and variance σ².
Useful residual plots include:
Residuals versus each independent variable
Residuals versus the predicted y values
Residuals in time order (if the response is a time series)
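A sketch of the first two kinds of residual plots, assuming matplotlib is available (an assumption, not part of the slides) and reusing the hypothetical data. Under the regression assumptions, each plot should show a horizontal band of points around zero with no obvious pattern.

```python
# Sketch: residuals versus each independent variable and versus predicted y.
import numpy as np
import matplotlib.pyplot as plt

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.1, 5.0, 8.9, 9.8, 14.2, 14.9])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b
residuals = y - y_hat

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, (label, values) in zip(axes, [("x1", x1), ("x2", x2), ("predicted y", y_hat)]):
    ax.scatter(values, residuals)
    ax.axhline(0.0, color="gray", linewidth=1)   # reference line at zero
    ax.set_xlabel(label)
    ax.set_ylabel("residual")
fig.tight_layout()
plt.show()
```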