Chapter 3 Multiple Linear Regression
Linear Regression Analysis 5E Montgomery, Peck & Vining
3.1 Multiple Regression Models

Suppose that the yield in pounds of conversion in a chemical process depends on the temperature and the catalyst concentration. A multiple regression model that might describe this relationship is

y = β0 + β1x1 + β2x2 + ε

where y is the yield, x1 the temperature, and x2 the catalyst concentration. This is a multiple linear regression model in two variables.
Figure 3.1 (a) The regression plane for the model E(y) = 50 + 10x1 + 7x2. (b) The contour plot.
In general, the multiple linear regression model with k regressors is

y = β0 + β1x1 + β2x2 + … + βkxk + ε
Linear regression models may also contain interaction effects:

y = β0 + β1x1 + β2x2 + β12x1x2 + ε

If we let x3 = x1x2 and β3 = β12, then the model can be written in the form

y = β0 + β1x1 + β2x2 + β3x3 + ε
3.2 Estimation of the Model Parameters
3.2.1 Least Squares Estimation of the Regression Coefficients

Notation:
n – number of observations available
k – number of regressor variables
y – response or dependent variable
xij – ith observation or level of regressor j
The sample regression model can be written as

yi = β0 + β1xi1 + β2xi2 + … + βkxik + εi,  i = 1, 2, …, n
The least squares function is

S(β0, β1, …, βk) = Σi=1..n (yi − β0 − Σj=1..k βjxij)²

The function S must be minimized with respect to the coefficients β0, β1, …, βk.
The least squares estimates of the coefficients must satisfy

∂S/∂β0 = 0  and  ∂S/∂βj = 0, j = 1, 2, …, k,

evaluated at the estimates β̂0, β̂1, …, β̂k.
Simplifying, we obtain the least squares normal equations:

n β̂0 + β̂1 Σxi1 + β̂2 Σxi2 + … + β̂k Σxik = Σyi
β̂0 Σxi1 + β̂1 Σxi1² + β̂2 Σxi1xi2 + … + β̂k Σxi1xik = Σxi1yi
⋮
β̂0 Σxik + β̂1 Σxikxi1 + β̂2 Σxikxi2 + … + β̂k Σxik² = Σxikyi

There are p = k + 1 equations, one for each coefficient. The ordinary least squares estimators are the solutions to the normal equations.
Matrix notation is typically used. The model can be written as

y = Xβ + ε

where y is an n × 1 vector of observations, X is an n × p matrix of the levels of the regressor variables, β is a p × 1 vector of regression coefficients, and ε is an n × 1 vector of random errors.
Minimizing S(β) = (y − Xβ)'(y − Xβ) leads to the least-squares normal equations in matrix form:

X'X β̂ = X'y

The solution, provided (X'X)⁻¹ exists, is

β̂ = (X'X)⁻¹X'y
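As a minimal numerical sketch of this solution, the following NumPy snippet solves the normal equations for a small set of made-up data (the X and y values are purely illustrative, not from the text):

```python
import numpy as np

# Sketch: compute the least-squares estimator by solving the normal equations
# X'X beta = X'y. The data below are hypothetical, made up for illustration.
X = np.array([
    [1.0, 1.0, 2.0],
    [1.0, 2.0, 1.0],
    [1.0, 3.0, 4.0],
    [1.0, 4.0, 3.0],
    [1.0, 5.0, 5.0],
])  # n = 5 rows; columns: intercept, x1, x2 (so p = 3)
y = np.array([6.0, 8.0, 15.0, 16.0, 22.0])

# Solving X'X beta = X'y directly is numerically preferable to forming (X'X)^{-1}.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

y_hat = X @ beta_hat   # fitted values
e = y - y_hat          # residuals; by the normal equations, X'e = 0
```

Note the design choice: `np.linalg.solve` is used rather than explicitly inverting X'X, which is both faster and numerically more stable.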
The vector of fitted values is ŷ = Xβ̂, and the n residuals can be written in matrix form as

e = y − ŷ = y − Xβ̂ = (I − H)y

where H = X(X'X)⁻¹X' is the hat matrix. This alternative form will prove useful in several situations.
20
Linear Regression Analysis 5E Montgomery, Peck & Vining
Example 3.1 The Delivery Time Data

The model of interest is

y = β0 + β1x1 + β2x2 + ε

where y is the delivery time, x1 is the number of cases, and x2 is the distance walked.
Figure 3.4 Scatterplot matrix for the delivery time data from Example 3.1.
Figure 3.5 Three-dimensional scatterplot of the delivery time data from Example 3.1.
MINITAB Output
3.2.2 Geometrical Interpretation of Least-Squares
3.2.3 Properties of Least-Squares Estimators
Statistical properties: the least-squares estimator β̂ is unbiased, E(β̂) = β, and its variances and covariances are contained in Cov(β̂) = σ²(X'X)⁻¹.
3.2.4 Estimation of σ²

The residual sum of squares can be shown to be

SSRes = y'y − β̂'X'y

The residual mean square for the model with p parameters is

MSRes = SSRes / (n − p)

and MSRes is an unbiased estimator of σ².
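The two quantities above can be checked numerically. The sketch below, on hypothetical simulated data, verifies that the computational form y'y − β̂'X'y equals the direct sum of squared residuals e'e:

```python
import numpy as np

# Sketch: SS_Res = y'y - beta_hat' X'y, and MS_Res = SS_Res/(n - p) estimates
# sigma^2. The data are hypothetical (simulated).
rng = np.random.default_rng(0)
n, k = 12, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = 1.0 + X[:, 1] - 2.0 * X[:, 2] + rng.normal(scale=0.3, size=n)

p = k + 1
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_hat

SS_res = y @ y - beta_hat @ (X.T @ y)  # computational form of SS_Res
MS_res = SS_res / (n - p)              # unbiased estimator of sigma^2
```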
Recall that the estimator of σ² is model dependent; that is, if the form of the model is changed, the estimate of σ² will almost invariably change. Note that the variance estimate is a function of the residuals, the "unexplained noise" about the fitted regression line.
Example 3.2 Delivery Time Data
3.2.5 Inadequacy of Scatter Diagrams in Multiple Regression
Scatter diagrams of the regressor variables against the response may be of little value in multiple regression; these plots can actually be misleading. If there is interdependence between two or more of the regressor variables, the true relationship between xi and y may be masked.
Illustration of the inadequacy of scatter diagrams in multiple regression.
3.3 Hypothesis Testing in Multiple Linear Regression
Once we have estimated the parameters in the model, we face two immediate questions:
1. What is the overall adequacy of the model?
2. Which specific regressors seem important?
This section considers four cases:
1. Test for significance of regression (sometimes called the global test of model adequacy)
2. Tests on individual regression coefficients (or groups of coefficients)
3. The special case of hypothesis testing with orthogonal columns in X
4. Testing the general linear hypothesis
3.3.1 Test for Significance of Regression
The test for significance of regression determines whether there is a linear relationship between the response and any of the regressor variables. The hypotheses are

H0: β1 = β2 = … = βk = 0
H1: βj ≠ 0 for at least one j
As in Chapter 2, the total sum of squares can be partitioned into two parts:

SST = SSR + SSRes

This leads to an ANOVA procedure with the test statistic

F0 = (SSR / k) / (SSRes / (n − k − 1)) = MSR / MSRes
The standard ANOVA is conducted with

SSR = β̂'X'y − (Σyi)²/n,  SSRes = y'y − β̂'X'y,  SST = y'y − (Σyi)²/n
The ANOVA table summarizes these quantities. Reject H0 if

F0 > Fα,k,n−k−1
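The ANOVA partition and the F statistic can be sketched numerically. In the snippet below (hypothetical data, not the delivery time data), the identity SST = SSR + SSRes is verified directly:

```python
import numpy as np

# Sketch of the global F test, H0: beta_1 = ... = beta_k = 0, using the ANOVA
# identity SS_T = SS_R + SS_Res. Data are hypothetical.
X = np.array([
    [1.0,  2.0,  5.0],
    [1.0,  4.0,  3.0],
    [1.0,  6.0,  8.0],
    [1.0,  8.0,  6.0],
    [1.0, 10.0,  9.0],
    [1.0, 12.0, 11.0],
])
y = np.array([10.0, 14.0, 23.0, 25.0, 32.0, 38.0])

n, p = X.shape
k = p - 1

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
SS_T = y @ y - n * y.mean() ** 2        # total (corrected) sum of squares
SS_res = y @ y - beta_hat @ (X.T @ y)   # residual sum of squares
SS_R = SS_T - SS_res                    # regression sum of squares

# Under H0, F0 follows an F distribution with k and n - k - 1 df.
F0 = (SS_R / k) / (SS_res / (n - k - 1))
```

F0 would then be compared with the tabled critical value Fα,k,n−k−1.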
Example 3.3 Delivery Time Data
To test H0: β1 = β2 = 0, we calculate the F statistic F0 = MSR/MSRes from the ANOVA table.
3.3.1 Test for Significance of Regression
R² is calculated exactly as in simple linear regression. However, R² can be inflated simply by adding more terms to the model, even insignificant terms. The adjusted R²,

R²adj = 1 − (SSRes/(n − p)) / (SST/(n − 1)),

penalizes you for adding terms to the model that are not significant.
Adjusted R² example: say SST = …, n = 15. Suppose that for a model with three regressors, SSRes = 90.00. Now suppose that a fourth regressor is added and SSRes = 88.00. Although SSRes decreases, the adjusted R² can decrease if the new term does not reduce the residual sum of squares enough to offset the lost degree of freedom.
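The penalty effect can be illustrated with a short calculation. The numbers below are hypothetical (they are not the values from the text's example), but they show the same phenomenon: SSRes drops from 90 to 88, yet adjusted R² goes down:

```python
# Sketch: adjusted R^2 penalizes extra parameters. All numbers here are
# hypothetical, chosen only to illustrate the penalty.
def r2_adjusted(ss_res, ss_t, n, p):
    # R^2_adj = 1 - (SS_Res / (n - p)) / (SS_T / (n - 1))
    return 1.0 - (ss_res / (n - p)) / (ss_t / (n - 1))

n, ss_t = 15, 200.0
r2_adj_3 = r2_adjusted(ss_res=90.0, ss_t=ss_t, n=n, p=4)  # three regressors
r2_adj_4 = r2_adjusted(ss_res=88.0, ss_t=ss_t, n=n, p=5)  # add a fourth
```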
3.3.2 Tests on Individual Regression Coefficients
The hypothesis test on any single regression coefficient is H0: βj = 0 versus H1: βj ≠ 0, with test statistic

t0 = β̂j / se(β̂j),  se(β̂j) = √(σ̂² Cjj),

where Cjj is the jth diagonal element of (X'X)⁻¹. Reject H0 if |t0| > tα/2,n−k−1. This is a partial or marginal test! See Example 3.4, p. 88, of the text.
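The marginal t statistics can be sketched as follows, on hypothetical simulated data (the true β used to simulate y is made up for illustration):

```python
import numpy as np

# Sketch of the marginal t test: t0 = beta_hat_j / sqrt(MS_res * C_jj), where
# C_jj is the j-th diagonal element of (X'X)^{-1}. Data are hypothetical.
rng = np.random.default_rng(1)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(scale=0.5, size=n)

p = k + 1
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ (X.T @ y)
e = y - X @ beta_hat
MS_res = (e @ e) / (n - p)

se = np.sqrt(MS_res * np.diag(XtX_inv))  # standard errors se(beta_hat_j)
t_stats = beta_hat / se                  # compare |t0| with t_{alpha/2, n-p}
```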
The extra-sum-of-squares method can also be used to test hypotheses on individual model parameters or groups of parameters. Partition the coefficients as β = (β1', β2')' and write the full model as

y = Xβ + ε = X1β1 + X2β2 + ε

The contribution of the regressors in X2, given those already in X1, is measured by the extra sum of squares

SSR(β2 | β1) = SSR(β) − SSR(β1)
3.3.3 Special Case of Orthogonal Columns in X
If the columns of X1 are orthogonal to the columns of X2, then SSR(β2 | β1) = SSR(β2); that is, the sum of squares due to β2 is free of any dependence on the regressors in X1.
Example: consider a data set with four regressor variables and a single response. Fitting the equation with all four regressors gives

ŷ = β̂0 + β̂1x1 + β̂2x2 + β̂3x3 + β̂4x4

Looking at the t tests, suppose that x3 is insignificant, so it is removed. What is the equation now? Generally, it is not ŷ = β̂0 + β̂1x1 + β̂2x2 + β̂4x4 with the original coefficient estimates.
The model must be refit with the insignificant regressor left out, giving a new regression equation of the form ŷ = β̂0 + β̂1x1 + β̂2x2 + β̂4x4. The refitting is necessary because the coefficient estimate for an individual regressor depends on all of the other regressors, xj, in the model.
However, if the columns of X are orthogonal to each other, then there is no need to refit. Can you think of some situations where we would have orthogonal columns? (Data from a designed experiment are a common example.)
3.4.1 Confidence Intervals on the Regression Coefficients
A 100(1 − α) percent confidence interval for the regression coefficient βj is

β̂j − tα/2,n−p se(β̂j) ≤ βj ≤ β̂j + tα/2,n−p se(β̂j)

or, equivalently, β̂j ± tα/2,n−p √(σ̂² Cjj).
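As a small worked sketch, suppose (hypothetically) β̂j = 1.62, se(β̂j) = 0.25, and n − p = 21, so the 2.5% t critical value is about 2.080:

```python
# Sketch: 95% CI for a single coefficient, beta_hat_j +/- t_{alpha/2, n-p} * se.
# The estimate, standard error, and degrees of freedom below are hypothetical.
beta_hat_j = 1.62
se_j = 0.25
t_crit = 2.080  # t_{0.025, 21}, i.e. n - p = 21 degrees of freedom

lower = beta_hat_j - t_crit * se_j  # 1.62 - 0.52 = 1.10
upper = beta_hat_j + t_crit * se_j  # 1.62 + 0.52 = 2.14
```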
3.4.2 Confidence Interval Estimation of the Mean Response
A 100(1 − α) percent confidence interval on the mean response at the point x01, x02, …, x0k is

ŷ0 ± tα/2,n−p √(σ̂² x0'(X'X)⁻¹x0)

where ŷ0 = x0'β̂. See Example 3.9 on page 95 and the discussion that follows.
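The interval can be sketched numerically; the data, the point x0, and the approximate critical value below are all hypothetical:

```python
import numpy as np

# Sketch: CI on the mean response at x0 uses the quadratic form x0'(X'X)^{-1}x0.
# Data, x0, and the critical value are hypothetical/approximate.
rng = np.random.default_rng(2)
n = 25
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, 2))])
y = 2.0 + 1.5 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(size=n)

p = X.shape[1]
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ (X.T @ y)
e = y - X @ beta_hat
MS_res = (e @ e) / (n - p)

x0 = np.array([1.0, 5.0, 5.0])  # includes the leading 1 for the intercept
y_hat0 = x0 @ beta_hat
se_mean = np.sqrt(MS_res * (x0 @ XtX_inv @ x0))

t_crit = 2.074  # approximately t_{0.025, 22}, since n - p = 22
ci = (y_hat0 - t_crit * se_mean, y_hat0 + t_crit * se_mean)
```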
3.4.3 Simultaneous Confidence Intervals on Regression Coefficients
It can be shown that

(β̂ − β)'X'X(β̂ − β) / (p MSRes) ~ Fp,n−p

From this result, a joint 100(1 − α) percent confidence region for all parameters in β is

(β̂ − β)'X'X(β̂ − β) / (p MSRes) ≤ Fα,p,n−p
3.5 Prediction of New Observations
A 100(1 − α) percent prediction interval for a future observation at the point x0 is

ŷ0 ± tα/2,n−p √(σ̂² (1 + x0'(X'X)⁻¹x0))
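The "1 +" inside the square root is what distinguishes prediction of a new observation from estimation of the mean response, and it makes the prediction interval strictly wider. A tiny sketch with hypothetical numbers:

```python
import math

# Sketch: the prediction interval adds the "1 +" term, so it is always wider
# than the CI on the mean response. All numbers here are hypothetical.
MS_res = 1.8
quad = 0.12      # x0'(X'X)^{-1} x0 at the point of interest
t_crit = 2.086   # approximately t_{0.025, 20}

half_ci = t_crit * math.sqrt(MS_res * quad)        # mean-response half width
half_pi = t_crit * math.sqrt(MS_res * (1 + quad))  # prediction half width
```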
3.6 Hidden Extrapolation in Multiple Regression
In prediction, exercise care about potentially extrapolating beyond the region containing the original observations.

Figure 3.10 An example of extrapolation in multiple regression.
We define the smallest convex set containing all of the original n data points (xi1, xi2, …, xik), i = 1, 2, …, n, as the regressor variable hull (RVH). If a point x01, x02, …, x0k lies inside or on the boundary of the RVH, then prediction or estimation involves interpolation; if the point lies outside the RVH, extrapolation is required.
The diagonal elements hii of the hat matrix H = X(X'X)⁻¹X' can aid in detecting hidden extrapolation. Let hmax denote the largest hii. The set of points x (not necessarily data points used to fit the model) that satisfy

x'(X'X)⁻¹x ≤ hmax

is an ellipsoid enclosing all points inside the RVH.
Let x0 be a point at which prediction or estimation is of interest, and let

h00 = x0'(X'X)⁻¹x0

If h00 > hmax, then x0 lies outside the ellipsoid and is a point of extrapolation.
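The check above can be sketched as follows. The data and the candidate point are hypothetical; x0 is deliberately placed far outside the region of the data so that h00 exceeds hmax:

```python
import numpy as np

# Sketch of the hidden-extrapolation check: compare h00 = x0'(X'X)^{-1}x0 with
# h_max, the largest diagonal element of H over the data. Data are hypothetical.
rng = np.random.default_rng(3)
n = 30
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, 2))])

XtX_inv = np.linalg.inv(X.T @ X)
H_diag = np.einsum('ij,jk,ik->i', X, XtX_inv, X)  # diagonal of X(X'X)^{-1}X'
h_max = H_diag.max()

x0 = np.array([1.0, 50.0, -20.0])  # deliberately far outside [0, 10]^2
h00 = x0 @ XtX_inv @ x0
extrapolating = h00 > h_max        # True here: x0 requires extrapolation
```

A useful sanity check on the hat diagonals is that they sum to p, the number of parameters.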
Example 3.13 Consider prediction or estimation at the candidate points shown in the scatterplot of cases and distance.
Figure 3.10 Scatterplot of cases and distance for the delivery time data.
3.7 Standardized Regression Coefficients
It is often difficult to compare regression coefficients directly because the regressors may be measured in different units. It can therefore be helpful to work with dimensionless regression coefficients, usually called standardized regression coefficients. There are two common methods of scaling:
1. Unit normal scaling
2. Unit length scaling
Unit Normal Scaling. The first approach employs unit normal scaling for the regressors and the response variable. That is,

zij = (xij − x̄j) / sj,  i = 1, 2, …, n, j = 1, 2, …, k
yi* = (yi − ȳ) / sy,  i = 1, 2, …, n
where

sj² = Σi (xij − x̄j)² / (n − 1)  and  sy² = Σi (yi − ȳ)² / (n − 1)

are the sample variances of regressor j and of the response.
All of the scaled regressors and the scaled response have sample mean equal to zero and sample variance equal to one. The model becomes

yi* = b1zi1 + b2zi2 + … + bkzik + εi

and the least squares estimator is b̂ = (Z'Z)⁻¹Z'y*.
Unit Length Scaling. In unit length scaling,

wij = (xij − x̄j) / √Sjj,  yi0 = (yi − ȳ) / √SST

where Sjj = Σi (xij − x̄j)² is the corrected sum of squares for regressor xj.
Each scaled regressor has mean 0 and unit length, √(Σi wij²) = 1. The regression model becomes

yi0 = b1wi1 + b2wi2 + … + bkwik + εi

and the vector of least squares regression coefficients is b̂ = (W'W)⁻¹W'y0.
In unit length scaling, the W'W matrix is in the form of a correlation matrix:

W'W =
[ 1    r12  r13  …  r1k ]
[ r12  1    r23  …  r2k ]
[ ⋮                  ⋮  ]
[ r1k  r2k  r3k  …  1   ]

where rij is the simple correlation between xi and xj.
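This property is easy to verify numerically. The sketch below, on hypothetical regressor data, scales the columns to unit length and checks that W'W reproduces the sample correlation matrix:

```python
import numpy as np

# Sketch: after unit length scaling, W'W is the correlation matrix of the
# regressors. The regressor data are hypothetical.
rng = np.random.default_rng(4)
Xreg = rng.normal(size=(50, 3))  # regressors only; no intercept column

centered = Xreg - Xreg.mean(axis=0)
W = centered / np.sqrt((centered ** 2).sum(axis=0))  # each column: unit length

WtW = W.T @ W  # ones on the diagonal, r_ij off the diagonal
```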
Example 3.14
3.8 Multicollinearity

A serious problem that may dramatically impact the usefulness of a regression model is multicollinearity: near-linear dependence among the regressors. The regressors are the columns of the X matrix, so an exact linear dependence would make X'X singular. The presence of multicollinearity can dramatically impair our ability to estimate the regression coefficients and can affect other uses of the regression model.
The main diagonal elements of the inverse of the X'X matrix in correlation form, (W'W)⁻¹, are often called variance inflation factors (VIFs), and they are an important multicollinearity diagnostic. For the soft drink delivery data, the VIFs can be read directly off the diagonal of (W'W)⁻¹.
The variance inflation factors can also be written as

VIFj = 1 / (1 − Rj²)

where Rj² is the coefficient of multiple determination obtained from regressing xj on the other regressor variables. If xj is highly correlated with any other regressor variable, then Rj² will be near one and VIFj will be large.
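The equivalence between the two definitions (diagonal of (W'W)⁻¹ and 1/(1 − Rj²)) can be sketched with deliberately collinear hypothetical data, where x3 is built as a near-linear combination of x1 and x2:

```python
import numpy as np

# Sketch: VIF_j = 1/(1 - R_j^2) equals the j-th diagonal element of (W'W)^{-1}.
# Hypothetical data in which x3 is nearly a linear combination of x1 and x2.
rng = np.random.default_rng(5)
n = 40
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + x2 + rng.normal(scale=0.05, size=n)  # near-linear dependence
Xreg = np.column_stack([x1, x2, x3])

centered = Xreg - Xreg.mean(axis=0)
W = centered / np.sqrt((centered ** 2).sum(axis=0))  # unit length scaling
vif = np.diag(np.linalg.inv(W.T @ W))  # large values flag multicollinearity
```

With the tight dependence built into x3, its VIF comes out far above the common rule-of-thumb cutoff of 10.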
3.9 Why Do Regression Coefficients Have The Wrong Sign?
Regression coefficients may have the wrong sign for the following reasons:
1. The range of some of the regressors is too small.
2. Important regressors have not been included in the model.
3. Multicollinearity is present.
4. Computational errors have been made.