Chapter 13 Multiple Regression

Chapter 13 Multiple Regression Section 13.1 Using Several Variables to Predict a Response

Regression Models The model that contains only two variables, x and y, is called a bivariate model: μ_y = α + βx.

Regression Models Suppose there are two predictors, denoted by x_1 and x_2. This is called a multiple regression model: μ_y = α + β_1x_1 + β_2x_2.

Multiple Regression Model The multiple regression model relates the mean μ_y of a quantitative response variable y to a set of explanatory variables x_1, x_2, …

Multiple Regression Model Example: For three explanatory variables, the multiple regression equation is: μ_y = α + β_1x_1 + β_2x_2 + β_3x_3.

Multiple Regression Model Example: The sample prediction equation with three explanatory variables is: ŷ = a + b_1x_1 + b_2x_2 + b_3x_3.

Example: Predicting Selling Price Using House Size and Number of Bedrooms The data set “house selling prices” contains observations on 100 home sales in Florida in November 2003. A multiple regression analysis was done with selling price as the response variable and with house size and number of bedrooms as the explanatory variables.

Example: Predicting Selling Price Using House Size and Number of Bedrooms Output from the analysis: Table 13.3 Regression of Selling Price on House Size and Bedrooms. The regression equation is price = 60,102 + 63.0 house size + 15,170 bedrooms.
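
Below is a minimal sketch of how such a fit could be reproduced in Python with statsmodels; the file name house_selling_prices.csv and the column names price, size, and bedrooms are hypothetical, since the actual data set is not reproduced here.

```python
# Minimal sketch: fitting selling price on house size and bedrooms with statsmodels.
# File and column names are hypothetical placeholders for the "house selling prices" data.
import pandas as pd
import statsmodels.api as sm

houses = pd.read_csv("house_selling_prices.csv")       # hypothetical file
X = sm.add_constant(houses[["size", "bedrooms"]])      # adds the intercept column
y = houses["price"]

fit = sm.OLS(y, X).fit()
print(fit.params)      # intercept and the two slope estimates
print(fit.summary())   # full regression output, as in Table 13.3
```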

Example: Predicting Selling Price Using House Size and Number of Bedrooms Prediction Equation: ŷ = 60,102 + 63.0x_1 + 15,170x_2, where y = selling price, x_1 = house size, and x_2 = number of bedrooms.

Example: Predicting Selling Price Using House Size and Number of Bedrooms One house listed in the data set had house size = 1679 square feet and number of bedrooms = 3. Find its predicted selling price: ŷ = 60,102 + 63.0(1679) + 15,170(3) = $211,389.

Example: Predicting Selling Price Using House Size and Number of Bedrooms Find its residual: y − ŷ = 232,500 − 211,389 = $21,111. The residual tells us that the actual selling price was $21,111 higher than predicted.
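
The same arithmetic as a quick check; the observed selling price of $232,500 is not stated above but is implied by the predicted value and the reported $21,111 residual.

```python
# Predicted price and residual for the listed house, using the coefficients above.
a, b_size, b_bed = 60_102, 63.0, 15_170
size, bedrooms = 1679, 3
observed = 232_500                                  # implied by predicted + residual

predicted = a + b_size * size + b_bed * bedrooms    # 211,389.0
residual = observed - predicted                     # 21,111.0
print(predicted, residual)
```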

The Number of Explanatory Variables You should not use many explanatory variables in a multiple regression model unless you have lots of data. A rough guideline is that the sample size n should be at least 10 times the number of explanatory variables. For example, to use two explanatory variables, you should have at least n = 20.

Plotting Relationships Always look at the data before doing a multiple regression. Most software has the option of constructing scatterplots on a single graph for each pair of variables. This is called a scatterplot matrix.

Plotting Relationships Figure 13.1 Scatterplot Matrix for Selling Price, House Size, and Number of Bedrooms. The middle plot in the top row has house size on the x-axis and selling price on the y-axis. The first plot in the second row reverses this, with selling price on the x-axis and house size on the y-axis. Question: Why are the plots of main interest the ones in the first row?
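
A short sketch of how a scatterplot matrix like Figure 13.1 could be drawn with pandas, again assuming the hypothetical file and column names used above:

```python
# Scatterplot matrix: one scatterplot for each pair of price, house size, and bedrooms.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import scatter_matrix

houses = pd.read_csv("house_selling_prices.csv")   # same hypothetical file as above
scatter_matrix(houses[["price", "size", "bedrooms"]], figsize=(8, 8))
plt.show()
```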

Interpretation of Multiple Regression Coefficients The simplest way to interpret a multiple regression equation is to look at it in two dimensions, as a function of a single explanatory variable, by fixing values for the other explanatory variable(s).

Interpretation of Multiple Regression Coefficients Example using the housing data: Suppose we fix the number of bedrooms at three. The prediction equation becomes: ŷ = 60,102 + 63.0x_1 + 15,170(3) = 105,612 + 63.0x_1.

Interpretation of Multiple Regression Coefficients Since the slope coefficient of x_1 is 63.0, the predicted selling price increases, for houses with this number of bedrooms, by $63.00 for every additional square foot of house size. For a 100 square-foot increase in house size, the predicted selling price increases by 100(63.00) = $6,300.
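
The same substitution, sketched in code, showing the $6,300 predicted increase for a 100 square-foot difference in house size when bedrooms are held at three:

```python
# Fixing bedrooms = 3 collapses the equation to a line in house size alone.
def predicted_price(size, bedrooms):
    return 60_102 + 63.0 * size + 15_170 * bedrooms

print(predicted_price(1600, 3))                              # baseline house, 3 bedrooms
print(predicted_price(1700, 3) - predicted_price(1600, 3))   # 6300.0 for 100 extra sq ft
```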

Summarizing the Effect While Controlling for a Variable The multiple regression model assumes that the slope for a particular explanatory variable is identical for all fixed values of the other explanatory variables.

Summarizing the Effect While Controlling for a Variable For example, the coefficient of x_1 in the prediction equation ŷ = 60,102 + 63.0x_1 + 15,170x_2 is 63.0 regardless of whether we plug in x_2 = 2, x_2 = 3, or x_2 = 4.

Summarizing the Effect While Controlling for a Variable Figure 13.2 The Relationship Between x_1 and ŷ for the Multiple Regression Equation ŷ = 60,102 + 63.0x_1 + 15,170x_2. This shows how the equation simplifies when the number of bedrooms is x_2 = 2, x_2 = 3, or x_2 = 4. Question: The lines move upward (to higher ŷ-values) as x_2 increases. How would you interpret this fact?
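
A sketch of a plot like Figure 13.2 with matplotlib: drawing the same equation for 2, 3, and 4 bedrooms gives parallel lines with common slope 63.0, each shifted up by $15,170 per extra bedroom.

```python
# Predicted price vs. house size for 2, 3, and 4 bedrooms: three parallel lines.
import numpy as np
import matplotlib.pyplot as plt

size = np.linspace(500, 4000, 200)
for bedrooms in (2, 3, 4):
    plt.plot(size, 60_102 + 63.0 * size + 15_170 * bedrooms, label=f"{bedrooms} bedrooms")
plt.xlabel("house size (square feet)")
plt.ylabel("predicted selling price ($)")
plt.legend()
plt.show()
```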

Slopes in Multiple Regression and in Bivariate Regression In multiple regression, a slope describes the effect of an explanatory variable while controlling for the effects of the other explanatory variables in the model. Bivariate regression has only a single explanatory variable, so a slope in bivariate regression describes the effect of that variable while ignoring all other possible explanatory variables.
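
A sketch of this contrast with statsmodels, again using the hypothetical data file from above: the bivariate slope for house size ignores bedrooms, while the multiple regression slope controls for it, so the two estimates will generally differ.

```python
# Bivariate slope (ignoring bedrooms) vs. multiple regression slope (controlling for bedrooms).
import pandas as pd
import statsmodels.api as sm

houses = pd.read_csv("house_selling_prices.csv")   # same hypothetical file as above
y = houses["price"]

bivariate = sm.OLS(y, sm.add_constant(houses[["size"]])).fit()
multiple = sm.OLS(y, sm.add_constant(houses[["size", "bedrooms"]])).fit()
print("slope ignoring bedrooms:    ", bivariate.params["size"])
print("slope controlling bedrooms: ", multiple.params["size"])
```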

Importance of Multiple Regression One of the main uses of multiple regression is to identify potential lurking variables and control for them by including them as explanatory variables in the model. Doing so can have a major impact on a variable’s effect. When we control for a variable, we keep that variable from influencing the associations among the other variables in the study.