Copyright © 2013, 2009, and 2007, Pearson Education, Inc. Chapter 13 Multiple Regression Section 13.1 Using Several Variables to Predict a Response.

Presentation transcript:

Copyright © 2013, 2009, and 2007, Pearson Education, Inc. Chapter 13 Multiple Regression Section 13.1 Using Several Variables to Predict a Response

Regression Models: The model that contains only two variables, x and y, is called a bivariate model.

Regression Models: Suppose there are two predictors, denoted by x1 and x2. A model that uses both of them to predict y is called a multiple regression model.

Multiple Regression Model: The multiple regression model relates the mean μ_y of a quantitative response variable y to a set of explanatory variables x1, x2, ….

Multiple Regression Model: Example: For three explanatory variables, the multiple regression equation is μ_y = α + β1x1 + β2x2 + β3x3.

Multiple Regression Model: Example: The sample prediction equation with three explanatory variables is ŷ = a + b1x1 + b2x2 + b3x3.
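
As a concrete illustration of how such a sample prediction equation is obtained, the sketch below fits a least-squares multiple regression with three explanatory variables using NumPy. The data values are made up for illustration and are not the textbook data set; only the form of the model, ŷ = a + b1x1 + b2x2 + b3x3, comes from the slides.

import numpy as np

# Made-up illustrative data: 6 observations, three explanatory variables.
x1 = np.array([1500, 1700, 2100, 1300, 1900, 2400], dtype=float)
x2 = np.array([3, 3, 4, 2, 3, 4], dtype=float)
x3 = np.array([1, 2, 2, 1, 2, 3], dtype=float)
y = np.array([95000, 110000, 150000, 80000, 130000, 175000], dtype=float)

# Design matrix with a leading column of 1s for the intercept a.
X = np.column_stack([np.ones_like(x1), x1, x2, x3])

# Least-squares estimates (a, b1, b2, b3) for y-hat = a + b1*x1 + b2*x2 + b3*x3.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2, b3 = coef

# Fitted values and residuals.
y_hat = X @ coef
residuals = y - y_hat
print(a, b1, b2, b3)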

Example: Predicting Selling Price Using House Size and Number of Bedrooms: The data set "house selling prices" contains observations on 100 home sales in Florida in November. A multiple regression analysis was done with selling price as the response variable and with house size and number of bedrooms as the explanatory variables.

Example: Predicting Selling Price Using House Size and Number of Bedrooms: Output from the analysis is shown in Table 13.3, Regression of Selling Price on House Size and Bedrooms. The estimated slopes are 63.0 for house size and 15,170 for number of bedrooms, so the regression equation has the form: predicted price = a + 63.0(house size) + 15,170(bedrooms), where a is the estimated intercept reported in the output.

Example: Predicting Selling Price Using House Size and Number of Bedrooms: Prediction equation: ŷ = a + b1x1 + b2x2, where y = selling price, x1 = house size, and x2 = number of bedrooms. From the output, b1 = 63.0 and b2 = 15,170.

Example: Predicting Selling Price Using House Size and Number of Bedrooms: One house listed in the data set had house size x1 = 1679 square feet and number of bedrooms x2 = 3. Its predicted selling price is found by substituting these values into the prediction equation: ŷ = a + 63.0(1679) + 15,170(3).

Example: Predicting Selling Price Using House Size and Number of Bedrooms: Its residual is y − ŷ, the observed selling price minus the predicted selling price, which for this house equals $21,111. The residual tells us that the actual selling price was $21,111 higher than predicted.
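
A minimal sketch of the prediction-and-residual calculation for this house, assuming the slope estimates quoted on the slides (63.0 per square foot of house size and 15,170 per bedroom); the intercept value is a placeholder to be read from the regression output, and the actual selling price is expressed through the $21,111 residual reported above.

# Slopes from the output quoted on the slides; the intercept is a placeholder.
intercept = 0.0            # replace with the estimated intercept from the output
b_size, b_bed = 63.0, 15170.0

house_size, bedrooms = 1679, 3
predicted = intercept + b_size * house_size + b_bed * bedrooms

# The slides report that this house sold for $21,111 more than predicted.
actual = predicted + 21111
residual = actual - predicted   # observed minus predicted
print(predicted, residual)      # residual is 21,111 by construction here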

The Number of Explanatory Variables: You should not use many explanatory variables in a multiple regression model unless you have lots of data. A rough guideline is that the sample size n should be at least 10 times the number of explanatory variables. For example, to use two explanatory variables, you should have at least n = 20.
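
The guideline above is easy to turn into a quick check; the helper below is a hypothetical convenience for readers following along in Python, not part of the textbook.

def enough_data(n, num_explanatory):
    # Rough guideline from the slide: n should be at least
    # 10 times the number of explanatory variables.
    return n >= 10 * num_explanatory

print(enough_data(100, 2))   # True: 100 home sales comfortably supports two predictors
print(enough_data(15, 2))    # False: fewer than the suggested n = 20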

Plotting Relationships: Always look at the data before doing a multiple regression. Most software has the option of constructing scatterplots on a single graph for each pair of variables. This is called a scatterplot matrix.

Plotting Relationships: Figure 13.1, Scatterplot Matrix for Selling Price, House Size, and Number of Bedrooms. The middle plot in the top row has house size on the x-axis and selling price on the y-axis. The first plot in the second row reverses this, with selling price on the x-axis and house size on the y-axis. Question: Why are the plots of main interest the ones in the first row?
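
For readers working in Python, a scatterplot matrix like Figure 13.1 can be drawn with pandas; the data frame below uses made-up values in place of the house selling prices data set.

import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import scatter_matrix

# Made-up stand-in for the house selling prices data.
df = pd.DataFrame({
    "price":      [95000, 110000, 150000, 80000, 130000, 175000],
    "house_size": [1500, 1700, 2100, 1300, 1900, 2400],
    "bedrooms":   [3, 3, 4, 2, 3, 4],
})

# One scatterplot for each pair of variables, as in Figure 13.1.
scatter_matrix(df, figsize=(6, 6))
plt.show()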

Interpretation of Multiple Regression Coefficients: The simplest way to interpret a multiple regression equation is to look at it in two dimensions, as a function of a single explanatory variable. We can do this by fixing the values of the other explanatory variable(s).

Interpretation of Multiple Regression Coefficients: Example using the housing data: Suppose we fix the number of bedrooms at x2 = 3. The prediction equation then becomes a function of house size alone: ŷ = (a + 15,170 × 3) + 63.0x1, a straight line in x1 with slope 63.0.

Interpretation of Multiple Regression Coefficients: Since the slope coefficient of x1 is 63.0, the predicted selling price increases, for houses with this number of bedrooms, by $63.00 for every additional square foot of house size. For a 100-square-foot increase in house size, the predicted selling price increases by 100(63.00) = $6,300.
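
The calculation below illustrates this interpretation numerically: with the number of bedrooms held fixed, a 100-square-foot increase in house size raises the predicted price by 100 × 63.0 = $6,300, whatever the intercept happens to be. The intercept used here is a placeholder, not the textbook estimate.

b_size, b_bed = 63.0, 15170.0
intercept = 0.0   # placeholder; the difference computed below does not depend on it

def predicted_price(house_size, bedrooms):
    return intercept + b_size * house_size + b_bed * bedrooms

# Increase house size by 100 square feet, holding bedrooms fixed at 3.
change = predicted_price(1779, 3) - predicted_price(1679, 3)
print(change)   # 6300.0, matching 100 * 63.00 on the slide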

Summarizing the Effect While Controlling for a Variable: The multiple regression model assumes that the slope for a particular explanatory variable is identical for all fixed values of the other explanatory variables.

Summarizing the Effect While Controlling for a Variable: For example, the coefficient of x1 in the prediction equation is 63.0 regardless of the value at which we fix x2, the number of bedrooms.

Summarizing the Effect While Controlling for a Variable: Figure 13.2, The Relationship Between ŷ and x1 for the Multiple Regression Equation, shows how the equation simplifies for several fixed values of the number of bedrooms, x2. Question: The lines move upward (to higher ŷ-values) as x2 increases. How would you interpret this fact?

Slopes in Multiple Regression and in Bivariate Regression: In multiple regression, a slope describes the effect of an explanatory variable while controlling for the effects of the other explanatory variables in the model. Bivariate regression has only a single explanatory variable; a slope in bivariate regression describes the effect of that variable while ignoring all other possible explanatory variables.
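
The simulation below, on made-up data, shows the difference between the two kinds of slope: when larger houses also tend to have more bedrooms, the bivariate slope for house size (bedrooms ignored) absorbs part of the bedroom effect, while the multiple regression slope (bedrooms controlled) stays close to the value used to generate the data.

import numpy as np

rng = np.random.default_rng(0)
n = 200

# Bedrooms acts as a lurking variable: it drives both house size and price.
bedrooms = rng.integers(2, 6, size=n).astype(float)
house_size = 400 * bedrooms + rng.normal(0, 150, size=n)
price = 50 * house_size + 20000 * bedrooms + rng.normal(0, 10000, size=n)

# Bivariate slope for house size (bedrooms ignored).
X_biv = np.column_stack([np.ones(n), house_size])
slope_ignoring = np.linalg.lstsq(X_biv, price, rcond=None)[0][1]

# Multiple regression slope for house size (bedrooms controlled).
X_mult = np.column_stack([np.ones(n), house_size, bedrooms])
slope_controlling = np.linalg.lstsq(X_mult, price, rcond=None)[0][1]

print(slope_ignoring, slope_controlling)   # the controlled slope stays near 50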

Importance of Multiple Regression: One of the main uses of multiple regression is to identify potential lurking variables and control for them by including them as explanatory variables in the model. Doing so can have a major impact on a variable's effect. When we control for a variable, we keep that variable from influencing the associations among the other variables in the study.