Copyright © 2013, 2009, and 2007, Pearson Education, Inc. Chapter 13 Multiple Regression Section 13.2 Extending the Correlation and R-Squared for Multiple Regression

To summarize how well a multiple regression model predicts y, we analyze how well the observed y-values correlate with the predicted ŷ-values. The multiple correlation is the correlation between the observed y-values and the predicted ŷ-values. It is denoted by R. Multiple Correlation

For each subject, the regression equation provides a predicted value. Each subject has an observed y-value and a predicted ŷ-value. Table 13.4 Selling Prices and Their Predicted Values. These values refer to two of the home sales in the data set. The predictors are x₁ = house size and x₂ = number of bedrooms. Multiple Correlation

The correlation computed between all pairs of observed y-values and predicted ŷ-values is the multiple correlation, R. The larger the multiple correlation, the better the set of explanatory variables predicts y. Multiple Correlation

The R-value always falls between 0 and 1. In this way, the multiple correlation R differs from the bivariate correlation r between y and a single variable x, which falls between -1 and +1. Multiple Correlation
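As an illustration of computing R, here is a minimal sketch using simulated data (numpy assumed; the variable names and numbers are invented for illustration, not the textbook's house-price data): fit a two-predictor least-squares model and correlate the observed y-values with the predicted values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(2.0, 0.5, n)                # e.g., house size (simulated)
x2 = rng.integers(1, 6, n).astype(float)    # e.g., bedroom count (simulated)
y = 50 + 100 * x1 + 15 * x2 + rng.normal(0, 40, n)  # response with noise

# Least-squares fit of y on an intercept, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Multiple correlation R: the ordinary correlation between observed and predicted y
R = np.corrcoef(y, y_hat)[0, 1]
print(round(R, 3))  # R always falls between 0 and 1
```

Unlike the bivariate r, R cannot be negative: the least-squares predictions ŷ are, by construction, positively associated with y.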

For predicting y, the square of R describes the relative improvement from using the prediction equation instead of using the sample mean, ȳ. The error in using the prediction equation to predict y is summarized by the residual sum of squares: Residual SS = Σ(y − ŷ)². R-squared

The error in using ȳ to predict y is summarized by the total sum of squares: Total SS = Σ(y − ȳ)². R-squared

The proportional reduction in error is: R² = (Total SS − Residual SS) / Total SS = (Σ(y − ȳ)² − Σ(y − ŷ)²) / Σ(y − ȳ)². R-squared

The better the predictions using the regression equation, the larger R² is. For multiple regression, R² is the square of the multiple correlation, R. R-squared
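A numerical check of these formulas, again with simulated data (numpy assumed; all names and values are illustrative): compute the residual and total sums of squares for a two-predictor least-squares fit and verify that the proportional reduction in error equals the square of the multiple correlation R.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(2.0, 0.5, n)
x2 = rng.integers(1, 6, n).astype(float)
y = 50 + 100 * x1 + 15 * x2 + rng.normal(0, 40, n)

# Least-squares fit with intercept
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

sse = np.sum((y - y_hat) ** 2)         # Residual SS = sum of (y - y_hat)^2
tss = np.sum((y - y.mean()) ** 2)      # Total SS = sum of (y - y_bar)^2
r_squared = (tss - sse) / tss          # proportional reduction in error

R = np.corrcoef(y, y_hat)[0, 1]        # multiple correlation
print(np.isclose(r_squared, R ** 2))   # R-squared equals the square of R
```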

For the 200 observations on y = selling price, x₁ = house size, and x₂ = number of bedrooms, an ANOVA (analysis of variance) table was created. The table displays the sums of squares in the SS column. Example: Predicting House Selling Prices Table 13.5 ANOVA Table and R-Squared for Predicting House Selling Price (in thousands of dollars) Using House Size (in thousands of square feet) and Number of Bedrooms.

The R² value can be computed from the sums of squares in the ANOVA table: R² = (Total SS − Residual SS) / Total SS. Example: Predicting House Selling Prices

Using house size and number of bedrooms together to predict selling price reduces the prediction error by 52%, relative to using the sample mean ȳ alone to predict selling price. Example: Predicting House Selling Prices

Find and interpret the multiple correlation: R = √R² = √0.52 ≈ 0.72. There is a moderately strong association between the observed and the predicted selling prices. House size and number of bedrooms are very helpful in predicting selling prices. Example: Predicting House Selling Prices

If we used a bivariate regression model to predict selling price with house size as the sole predictor, the r² value would be nearly as large as the multiple regression R². If we instead used number of bedrooms as the sole predictor, the r² value would be only 0.1. Example: Predicting House Selling Prices

The multiple regression model has R² = 0.52, a value similar to that from using only house size as a predictor. There is clearly more to this prediction than using only one variable in the model. Interpretation of results is important: larger lot sizes in this area could mean older homes with smaller size or fewer bedrooms or bathrooms. Example: Predicting House Selling Prices

Table 13.6 R² Values for Multiple Regression Models for y = House Selling Price Example: Predicting House Selling Prices

Although R² goes up by only small amounts after house size is in the model, this does not mean that the other predictors are only weakly correlated with selling price. Because the predictors are themselves highly correlated, once one or two of them are in the model, the remaining ones don't help much in adding to the predictive power. For instance, lot size is highly positively correlated with number of bedrooms and with size of house. So, once number of bedrooms and size of house are included as predictors in the model, there's not much benefit to including lot size as an additional predictor. Example: Predicting House Selling Prices
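The point about highly correlated predictors can be sketched numerically. In this hypothetical example (numpy assumed; data simulated, not the textbook's data), a second predictor that is nearly a linear copy of the first is strongly correlated with y on its own, yet adds almost nothing to R² once the first predictor is in the model.

```python
import numpy as np

def r2(X, y):
    """R-squared of a least-squares fit of y on the columns of X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(2)
n = 200
size = rng.normal(2.0, 0.5, n)            # simulated house size
lot = 5 * size + rng.normal(0, 0.2, n)    # lot size nearly collinear with house size
y = 50 + 100 * size + rng.normal(0, 40, n)

r2_size = r2(size.reshape(-1, 1), y)              # house size alone
r2_lot = r2(lot.reshape(-1, 1), y)                # lot size alone: also strong
r2_both = r2(np.column_stack([size, lot]), y)     # both together

# The gain from adding the correlated predictor is tiny
print(round(r2_both - r2_size, 4))
```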

Properties of R² The previous example showed that R² for the multiple regression model was larger than r² for a bivariate model using only one of the explanatory variables. A key property of R² is that it cannot decrease when predictors are added to a model.

 R² falls between 0 and 1.  The larger the R² value, the better the explanatory variables collectively predict y.  R² = 1 only when all residuals are 0, that is, when all regression predictions are perfect.  R² = 0 when the correlation between y and each explanatory variable equals 0. Properties of R²

 R² gets larger, or at worst stays the same, whenever an explanatory variable is added to the multiple regression model.  The value of R² does not depend on the units of measurement. Properties of R²
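Both properties can be verified directly. A minimal sketch (numpy assumed; data simulated for illustration): adding even a pure-noise predictor never lowers R², and rescaling a predictor's units leaves R² unchanged.

```python
import numpy as np

def r2(X, y):
    """R-squared of a least-squares fit of y on the columns of X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(2.0, 0.5, n)              # e.g., size in thousands of square feet
y = 50 + 100 * x1 + rng.normal(0, 40, n)
noise = rng.normal(size=n)                # a predictor unrelated to y

r2_base = r2(x1.reshape(-1, 1), y)
r2_more = r2(np.column_stack([x1, noise]), y)   # adding any predictor cannot lower R²
r2_rescaled = r2((x1 * 1000).reshape(-1, 1), y) # change units to square feet

print(r2_more >= r2_base, np.isclose(r2_base, r2_rescaled))
```

Adding the noise column typically nudges R² up by a tiny chance amount (roughly 1/n on average), which is why model comparison often uses an adjusted R² instead.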