Slide 1: Chapter 13 Multiple Regression (© 2003 Thomson/South-Western)
- Multiple Regression Model
- Least Squares Method
- Multiple Coefficient of Determination
- Model Assumptions
- Testing for Significance
- Using the Estimated Regression Equation for Estimation and Prediction
- Qualitative Independent Variables
Slide 2: Multiple Regression Model
- The equation that describes how the dependent variable y is related to the independent variables x1, x2, ..., xp and an error term is called the multiple regression model.
- The multiple regression model is:
  y = β0 + β1x1 + β2x2 + ... + βpxp + ε
  β0, β1, β2, ..., βp are the parameters.
  ε is a random variable called the error term.
Slide 3: Multiple Regression Equation
- The equation that describes how the mean value of y is related to x1, x2, ..., xp is called the multiple regression equation.
- The multiple regression equation is:
  E(y) = β0 + β1x1 + β2x2 + ... + βpxp
Slide 4: Estimated Multiple Regression Equation
- A simple random sample is used to compute sample statistics b0, b1, b2, ..., bp that are used as the point estimators of the parameters β0, β1, β2, ..., βp.
- The estimated multiple regression equation is:
  ŷ = b0 + b1x1 + b2x2 + ... + bpxp
Slide 5: Estimation Process
- Multiple regression model: y = β0 + β1x1 + β2x2 + ... + βpxp + ε
- Multiple regression equation: E(y) = β0 + β1x1 + β2x2 + ... + βpxp
- Unknown parameters: β0, β1, β2, ..., βp
- Sample data: x1, x2, ..., xp, y
- Estimated multiple regression equation: ŷ = b0 + b1x1 + b2x2 + ... + bpxp
- b0, b1, b2, ..., bp are sample statistics that provide estimates of β0, β1, β2, ..., βp.
Slide 6: Least Squares Method
- Least Squares Criterion: choose b0, b1, ..., bp to minimize Σ(yi − ŷi)².
- Computation of Coefficient Values: the formulas for the regression coefficients b0, b1, b2, ..., bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations.
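The slide above leaves the matrix algebra to software. As a rough illustration (an assumption on my part, since the deck itself uses Minitab), the sketch below solves the normal equations b = (XᵀX)⁻¹Xᵀy directly in Python/NumPy, using the 20 programmer observations listed later on slide 12.

```python
# Minimal sketch: least squares coefficients via the normal equations.
# Data are the Exper/Score/Salary values from slide 12 of this deck.
import numpy as np

exper = np.array([4, 7, 1, 5, 8, 10, 0, 1, 6, 6,
                  9, 2, 10, 5, 6, 8, 4, 6, 3, 3], dtype=float)
score = np.array([78, 100, 86, 82, 86, 84, 75, 80, 83, 91,
                  88, 73, 75, 81, 74, 87, 79, 94, 70, 89], dtype=float)
salary = np.array([24, 43, 23.7, 34.3, 35.8, 38, 22.2, 23.1, 30, 33,
                   38, 26.6, 36.2, 31.6, 29, 34, 30.1, 33.9, 28.2, 30])

X = np.column_stack([np.ones(len(exper)), exper, score])  # intercept column first
b = np.linalg.solve(X.T @ X, X.T @ salary)                # solves (X'X) b = X'y
print(b)  # roughly [3.174, 1.404, 0.251], matching the Minitab output on slide 15
```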
Slide 7: Least Squares Method
- A Note on Interpretation of Coefficients: bi represents an estimate of the change in y corresponding to a one-unit change in xi when all other independent variables are held constant.
Slide 8: Multiple Coefficient of Determination
- Relationship Among SST, SSR, SSE:
  SST = SSR + SSE
Slide 9: Multiple Coefficient of Determination
- Multiple Coefficient of Determination: R² = SSR/SST
- Adjusted Multiple Coefficient of Determination: Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1)
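As a quick check of both formulas, the short sketch below plugs in the sums of squares reported later in the Minitab ANOVA table on slide 22 (SSR = 500.33, SST = 599.79) with n = 20 and p = 2.

```python
# Sketch: R-squared and adjusted R-squared from the ANOVA sums of squares.
SSR, SST = 500.33, 599.79
n, p = 20, 2
R2 = SSR / SST                                 # ≈ 0.834 -> R-sq = 83.4%
R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1)  # ≈ 0.815 -> R-sq(adj) = 81.5%
print(round(R2, 3), round(R2_adj, 3))
```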
Slide 10: Model Assumptions
Assumptions about the error term ε:
1. The error ε is a random variable with mean of zero.
2. The variance of ε, denoted by σ², is the same for all values of the independent variables.
3. The values of ε are independent.
4. The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + ... + βpxp.
Slide 11: Example: Programmer Salary Survey
A software firm collected data for a sample of 20 computer programmers. A suggestion was made that regression analysis could be used to determine if salary was related to the years of experience and the score on the firm's programmer aptitude test. The years of experience, score on the aptitude test, and corresponding annual salary ($1000s) for the sample of 20 programmers are shown on the next slide.
Slide 12: Example: Programmer Salary Survey

Exper.  Score  Salary    Exper.  Score  Salary
4       78     24        9       88     38
7       100    43        2       73     26.6
1       86     23.7      10      75     36.2
5       82     34.3      5       81     31.6
8       86     35.8      6       74     29
10      84     38        8       87     34
0       75     22.2      4       79     30.1
1       80     23.1      6       94     33.9
6       83     30        3       70     28.2
6       91     33        3       89     30
Slide 13: Example: Programmer Salary Survey
- Multiple Regression Model
  Suppose we believe that salary (y) is related to the years of experience (x1) and the score on the programmer aptitude test (x2) by the following regression model:
  y = β0 + β1x1 + β2x2 + ε
  where
  y = annual salary ($1000s)
  x1 = years of experience
  x2 = score on programmer aptitude test
Slide 14: Example: Programmer Salary Survey
- Solving for the Estimates of β0, β1, β2:
  Input Data (x1, x2, y: 4 78 24; 7 100 43; ...; 3 89 30) -> Computer Package for Solving Multiple Regression Problems -> Least Squares Output (b0, b1, b2, R², etc.)
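The deck's computer package is Minitab. As an alternative illustration (an assumption, not the authors' tool), the sketch below fits the same model with Python's statsmodels, reusing the exper, score, and salary arrays defined in the least squares sketch after slide 6.

```python
# Sketch: fitting salary ~ exper + score with statsmodels OLS.
# Assumes exper, score, salary are the NumPy arrays defined in the earlier sketch.
import numpy as np
import statsmodels.api as sm

X = sm.add_constant(np.column_stack([exper, score]))  # adds the intercept column
fit = sm.OLS(salary, X).fit()
print(fit.params)     # roughly [3.174, 1.404, 0.251]
print(fit.rsquared)   # roughly 0.834
print(fit.summary())  # coefficient table comparable to the Minitab output on slide 15
```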
Slide 15: Example: Programmer Salary Survey
- Minitab Computer Output

  The regression equation is
  Salary = 3.174 + 1.404 Exper + 0.251 Score

  Predictor    Coef      Stdev     t-ratio   p
  Constant     3.174     6.156     .52       .613
  Exper        1.4039    .1986     7.07      .000
  Score        .25089    .07735    3.24      .005

  s = 2.419   R-sq = 83.4%   R-sq(adj) = 81.5%
Slide 16: Example: Programmer Salary Survey
- Estimated Regression Equation
  SALARY = 3.174 + 1.404(EXPER) + 0.2509(SCORE)
  Note: predicted salary will be in thousands of dollars.
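For instance, plugging the first sampled programmer from slide 12 (4 years of experience, aptitude score 78) into the estimated equation gives a point prediction; the values below are just that data row, not a new case from the text.

```python
# Sketch: a point prediction from the estimated regression equation.
b0, b1, b2 = 3.174, 1.404, 0.2509
exper_years, score_points = 4, 78          # first observation on slide 12
salary_hat = b0 + b1 * exper_years + b2 * score_points
print(salary_hat)  # ≈ 28.36, i.e. about $28,360
```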
Slide 17: Testing for Significance
- In simple linear regression, the F and t tests provide the same conclusion.
- In multiple regression, the F and t tests have different purposes.
Slide 18: Testing for Significance: F Test
- The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables.
- The F test is referred to as the test for overall significance.
Slide 19: Testing for Significance: t Test
- If the F test shows an overall significance, the t test is used to determine whether each of the individual independent variables is significant.
- A separate t test is conducted for each of the independent variables in the model.
- We refer to each of these t tests as a test for individual significance.
Slide 20: Testing for Significance: F Test
- Hypotheses:
  H0: β1 = β2 = ... = βp = 0
  Ha: One or more of the parameters is not equal to zero.
- Test Statistic: F = MSR/MSE
- Rejection Rule: Reject H0 if F > Fα, where Fα is based on an F distribution with p degrees of freedom in the numerator and n − p − 1 degrees of freedom in the denominator.
Slide 21: Testing for Significance: t Test
- Hypotheses:
  H0: βi = 0
  Ha: βi ≠ 0
- Test Statistic: t = bi / s_bi
- Rejection Rule: Reject H0 if t < −t_(α/2) or t > t_(α/2), where t_(α/2) is based on a t distribution with n − p − 1 degrees of freedom.
Slide 22: Example: Programmer Salary Survey
- Minitab Computer Output (continued)

  Analysis of Variance
  SOURCE       DF    SS        MS       F       P
  Regression   2     500.33    250.16   42.76   0.000
  Error        17    99.46     5.85
  Total        19    599.79
Slide 23: Example: Programmer Salary Survey
- F Test
  Hypotheses:
    H0: β1 = β2 = 0
    Ha: One or both of the parameters is not equal to zero.
  Rejection Rule:
    For α = .05 and d.f. = 2, 17: F.05 = 3.59
    Reject H0 if F > 3.59.
Slide 24: Example: Programmer Salary Survey
- F Test
  Test Statistic:
    F = MSR/MSE = 250.16/5.85 = 42.76
  Conclusion:
    F = 42.76 > 3.59, so we can reject H0.
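The same decision can be reproduced numerically. The sketch below recomputes F from the ANOVA table on slide 22 and looks up the critical value with SciPy; this is an added illustration, not part of the original slides.

```python
# Sketch: the overall F test for the programmer salary model.
from scipy.stats import f

MSR, MSE = 250.16, 5.85                 # from the ANOVA table on slide 22
p, n = 2, 20
F_stat = MSR / MSE                      # ≈ 42.76
F_crit = f.ppf(0.95, p, n - p - 1)      # ≈ 3.59 for alpha = .05, d.f. = 2 and 17
print(F_stat, F_crit, F_stat > F_crit)  # True -> reject H0
```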
Slide 25: Example: Programmer Salary Survey
- t Test for Significance of Individual Parameters
  Hypotheses:
    H0: βi = 0
    Ha: βi ≠ 0
  Rejection Rule:
    For α = .05 and d.f. = 17: t.025 = 2.11
    Reject H0 if |t| > 2.11.
Slide 26: Example: Programmer Salary Survey
- t Test for Significance of Individual Parameters
  Test Statistics:
    t = b1/s_b1 = 1.4039/.1986 = 7.07
    t = b2/s_b2 = .25089/.07735 = 3.24
  Conclusions:
    Reject H0: β1 = 0 and reject H0: β2 = 0.
    Both independent variables are significant.
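As with the F test, these t statistics and the critical value can be reproduced from the coefficient table on slide 15; the SciPy lookup below is an added illustration.

```python
# Sketch: individual t tests for Exper and Score.
from scipy.stats import t

t_crit = t.ppf(1 - 0.05 / 2, df=17)  # ≈ 2.11 for alpha = .05, 17 d.f.
t_exper = 1.4039 / 0.1986            # ≈ 7.07
t_score = 0.25089 / 0.07735          # ≈ 3.24
print(t_crit, abs(t_exper) > t_crit, abs(t_score) > t_crit)  # both True -> reject H0
```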
Slide 27: Testing for Significance: Multicollinearity
- The term multicollinearity refers to the correlation among the independent variables.
- When the independent variables are highly correlated (say, |r| > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.
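A quick way to screen for this is to compute the sample correlation between the predictors. The sketch below does so for the programmer data; the approximate value in the comment is my own calculation, not a figure from the slides.

```python
# Sketch: correlation between the two predictors in the programmer data (slide 12).
import numpy as np

exper = np.array([4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3])
score = np.array([78, 100, 86, 82, 86, 84, 75, 80, 83, 91,
                  88, 73, 75, 81, 74, 87, 79, 94, 70, 89])
r = np.corrcoef(exper, score)[0, 1]
print(r)  # ≈ 0.34, well below the .7 guideline, so multicollinearity is not a concern here
```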
Slide 28: Testing for Significance: Multicollinearity
- If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem.
- Every attempt should be made to avoid including independent variables that are highly correlated.
Slide 29: Using the Estimated Regression Equation for Estimation and Prediction
- The procedures for estimating the mean value of y and predicting an individual value of y in multiple regression are similar to those in simple regression.
- We substitute the given values of x1, x2, ..., xp into the estimated regression equation and use the corresponding value of ŷ as the point estimate.
- The formulas required to develop interval estimates for the mean value of y and for an individual value of y are beyond the scope of the text.
- Software packages for multiple regression will often provide these interval estimates.
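For example, statsmodels is one package that reports both kinds of intervals. The sketch below is an assumption on my part: it reuses the fitted model `fit` from the statsmodels sketch after slide 14 and asks for the intervals at 4 years of experience and a score of 78.

```python
# Sketch: interval estimates from a fitted statsmodels model.
# Assumes `fit` is the OLS result from the earlier statsmodels sketch.
import numpy as np

x_new = np.array([[1.0, 4.0, 78.0]])   # intercept, exper, score
pred = fit.get_prediction(x_new)
print(pred.summary_frame(alpha=0.05))  # mean CI and individual prediction interval
```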
Slide 30: Qualitative Independent Variables
- In many situations we must work with qualitative independent variables such as gender (male, female), method of payment (cash, check, credit card), etc.
- For example, x2 might represent gender where x2 = 0 indicates male and x2 = 1 indicates female.
- In this case, x2 is called a dummy or indicator variable.
Slide 31: Qualitative Independent Variables
- If a qualitative variable has k levels, k − 1 dummy variables are required, with each dummy variable being coded as 0 or 1.
- For example, a variable with levels A, B, and C would be represented by x1 and x2 values of (0, 0), (1, 0), and (0, 1), respectively.
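A minimal sketch of that k − 1 coding in Python; the level names A, B, C are the slide's example, and the sample observations are hypothetical.

```python
# Sketch: coding a 3-level qualitative variable with k - 1 = 2 dummy variables.
dummy_codes = {"A": (0, 0), "B": (1, 0), "C": (0, 1)}  # (x1, x2) as on the slide

observations = ["B", "A", "C", "C", "A"]               # hypothetical sample values
x1 = [dummy_codes[obs][0] for obs in observations]
x2 = [dummy_codes[obs][1] for obs in observations]
print(x1)  # [1, 0, 0, 0, 0]
print(x2)  # [0, 0, 1, 1, 0]
```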
Slide 32: Example: Programmer Salary Survey (B)
As an extension of the problem involving the computer programmer salary survey, suppose that management also believes that the annual salary is related to whether or not the individual has a graduate degree in computer science or information systems. The years of experience, the score on the programmer aptitude test, whether or not the individual has a relevant graduate degree, and the annual salary ($1000s) for each of the 20 sampled programmers are shown on the next slide.
Slide 33: Example: Programmer Salary Survey (B)

Exp.  Score  Degr.  Salary    Exp.  Score  Degr.  Salary
4     78     No     24        9     88     Yes    38
7     100    Yes    43        2     73     No     26.6
1     86     No     23.7      10    75     Yes    36.2
5     82     Yes    34.3      5     81     No     31.6
8     86     Yes    35.8      6     74     No     29
10    84     Yes    38        8     87     Yes    34
0     75     No     22.2      4     79     No     30.1
1     80     No     23.1      6     94     Yes    33.9
6     83     No     30        3     70     No     28.2
6     91     Yes    33        3     89     No     30
Slide 34: Example: Programmer Salary Survey (B)
- Multiple Regression Equation
  E(y) = β0 + β1x1 + β2x2 + β3x3
- Estimated Regression Equation
  ŷ = b0 + b1x1 + b2x2 + b3x3
  where
  y = annual salary ($1000s)
  x1 = years of experience
  x2 = score on programmer aptitude test
  x3 = 0 if the individual does not have a graduate degree, 1 if the individual does have a graduate degree
  Note: x3 is referred to as a dummy variable.
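A sketch of fitting this extended model in statsmodels (again an assumption rather than the deck's Minitab workflow): the degree flags are the Yes/No column from slide 33 coded as 1/0, and exper, score, salary are the arrays from the earlier least squares sketch.

```python
# Sketch: salary ~ exper + score + degree (dummy), using statsmodels OLS.
# Assumes exper, score, salary are the arrays defined in the earlier sketch.
import numpy as np
import statsmodels.api as sm

degree = np.array([0, 1, 0, 1, 1, 1, 0, 0, 0, 1,
                   1, 0, 1, 0, 0, 1, 0, 1, 0, 0])  # 1 = has a graduate degree (slide 33)

X = sm.add_constant(np.column_stack([exper, score, degree]))
fit_b = sm.OLS(salary, X).fit()
print(fit_b.params)  # roughly [7.95, 1.15, 0.197, 2.28], matching the Minitab output on slide 35
```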
Slide 35: Example: Programmer Salary Survey (B)
- Minitab Computer Output

  The regression equation is
  Salary = 7.95 + 1.15 Exp + 0.197 Score + 2.28 Deg

  Predictor    Coef      Stdev     t-ratio   p
  Constant     7.945     7.381     1.08      .298
  Exp          1.1476    .2976     3.86      .001
  Score        .19694    .0899     2.19      .044
  Deg          2.280     1.987     1.15      .268

  s = 2.396   R-sq = 84.7%   R-sq(adj) = 81.8%
Slide 36: Example: Programmer Salary Survey (B)
- Minitab Computer Output (continued)

  Analysis of Variance
  SOURCE       DF    SS        MS       F       P
  Regression   3     507.90    169.30   29.48   0.000
  Error        16    91.89     5.74
  Total        19    599.79
Slide 37: Example: Programmer Salary Survey (B)
- Interpreting the Parameters
  b1 = 1.15
  Salary is expected to increase by $1,150 for each additional year of experience (when all other independent variables are held constant).
Slide 38: Example: Programmer Salary Survey (B)
- Interpreting the Parameters
  b2 = 0.197
  Salary is expected to increase by $197 for each additional point scored on the programmer aptitude test (when all other independent variables are held constant).
Slide 39: Example: Programmer Salary Survey (B)
- Interpreting the Parameters
  b3 = 2.28
  Salary is expected to be $2,280 higher for an individual with a graduate degree than one without a graduate degree (when all other independent variables are held constant).
Slide 40: End of Chapter 13