Lecture 3: Introduction to Multiple Regression
Business and Economic Forecasting
The Multiple Regression Model

Idea: examine the linear relationship between one dependent variable (Y) and two or more independent variables (Xi).

Multiple regression model with k independent variables:

Y = β0 + β1 X1 + β2 X2 + ... + βk Xk + ε

where β0 is the Y-intercept, β1, ..., βk are the population slopes, and ε is a random error term.
Multiple Regression Equation

The coefficients of the multiple regression model are estimated using sample data. The multiple regression equation with k independent variables is

Ŷ = b0 + b1 X1 + b2 X2 + ... + bk Xk

where Ŷ is the estimated (or predicted) value of Y, b0 is the estimated intercept, and b1, ..., bk are the estimated slope coefficients. In this lecture we will use Excel or Stata to obtain the regression slope coefficients and other regression summary measures.
Multiple Regression Equation (continued)

Two-variable model: Ŷ = b0 + b1 X1 + b2 X2. [Figure: fitted regression plane in (X1, X2, Y) space; b1 is the slope for variable X1 and b2 is the slope for variable X2.]
Example: Two Independent Variables

A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
Dependent variable: pie sales (units per week)
Independent variables: price (in $) and advertising (in $100s)
Data are collected for 15 weeks.
Pie Sales Example

Multiple regression equation: Sales = b0 + b1(Price) + b2(Advertising)

Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50            3.3
  2       460        7.50            3.3
  3       350        8.00            3.0
  4       430        8.00            4.5
  5       350        6.80            3.0
  6       380        7.50            4.0
  7       430        4.50            3.0
  8       470        6.40            3.7
  9       450        7.00            3.5
 10       490        5.00            4.0
 11       340        7.20            3.5
 12       300        7.90            3.2
 13       440        5.90            4.0
 14       450        5.00            3.5
 15       300        7.00            2.7
Excel Multiple Regression Output

Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

ANOVA
              df        SS           MS           F       Significance F
  Regression   2    29460.027    14730.013     6.53861       0.01201
  Residual    12    27033.306     2252.776
  Total       14    56493.333

              Coefficients   Standard Error    t Stat    P-value   Lower 95%    Upper 95%
  Intercept     306.52619      114.25389      2.68285    0.01993    57.58835    555.46404
  Price         -24.97509       10.83213     -2.30565    0.03979   -48.57626     -1.37392
  Advertising    74.13096       25.96732      2.85478    0.01449    17.55303    130.70888
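The coefficients and summary measures above can also be reproduced outside Excel. Below is a minimal sketch in Python with pandas and statsmodels (the slides use Excel or Stata; the library choice and variable names here are the editor's illustration):

import pandas as pd
import statsmodels.formula.api as smf

# Pie sales data transcribed from the 15-week table above
pies = pd.DataFrame({
    "Sales":       [350, 460, 350, 430, 350, 380, 430, 470, 450, 490, 340, 300, 440, 450, 300],
    "Price":       [5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40, 7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00],
    "Advertising": [3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7],
})

# Fit Sales = b0 + b1*Price + b2*Advertising by ordinary least squares
model = smf.ols("Sales ~ Price + Advertising", data=pies).fit()
print(model.summary())   # coefficients, R Square, and ANOVA F as in the Excel output above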
The Multiple Regression Equation

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

where Sales is the number of pies sold per week, Price is in $, and Advertising is in $100s.

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising.
b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price.
Using the Equation to Make Predictions

Predict sales for a week in which the selling price is $5.50 and advertising is $350. Note that advertising is in $100s, so $350 means X2 = 3.5:

Predicted Sales = 306.526 - 24.975(5.50) + 74.131(3.5) = 428.62

Predicted sales are 428.62 pies.
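Continuing the Python sketch from the Excel output slide (this assumes the fitted `model` object from that snippet), the same point prediction can be obtained with:

import pandas as pd

new_week = pd.DataFrame({"Price": [5.50], "Advertising": [3.5]})
print(model.predict(new_week))   # about 428.6 pies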
Coefficient of Multiple Determination

Reports the proportion of total variation in Y explained by all X variables taken together:

r² = SSR / SST = regression sum of squares / total sum of squares
Multiple Coefficient of Determination in Excel

From the output above, r² = SSR / SST = 29460.0 / 56493.3 = 0.52148: 52.1% of the variation in pie sales is explained by the variation in price and advertising.
Adjusted r²

r² never decreases when a new X variable is added to the model. This can be a disadvantage when comparing models.
What is the net effect of adding a new variable? We lose a degree of freedom when a new X variable is added. Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?
Adjusted r² (continued)

Shows the proportion of variation in Y explained by all X variables, adjusted for the number of X variables used:

adjusted r² = 1 - [(1 - r²)(n - 1) / (n - k - 1)]

(where n = sample size, k = number of independent variables)

It penalizes excessive use of unimportant independent variables, is smaller than r², and is useful in comparing models.
Adjusted r² in Excel

From the output above, adjusted r² = 1 - (1 - 0.52148)(14 / 12) = 0.44172: 44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and the number of independent variables.
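As a check, the adjusted r² formula can be applied directly to the r² from the fitted `model` in the earlier Python sketch (n = 15 observations, k = 2 independent variables):

n, k = 15, 2
r2 = model.rsquared                           # 0.52148
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(adj_r2, model.rsquared_adj)             # both about 0.44172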
Is the Model Significant?

F test for overall significance of the model: shows whether there is a linear relationship between all of the X variables considered together and Y. Use the F-test statistic.

Hypotheses:
H0: β1 = β2 = ... = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent variable affects Y)
F Test for Overall Significance

Test statistic:

F_STAT = MSR / MSE = (SSR / k) / (SSE / (n - k - 1))

where F_STAT has numerator d.f. = k and denominator d.f. = (n - k - 1).
F Test for Overall Significance in Excel (continued)

From the ANOVA table above: F_STAT = MSR / MSE = 14730.0 / 2252.8 = 6.5386, with 2 and 12 degrees of freedom. The p-value for the F test (Significance F) is 0.01201.
F Test for Overall Significance (continued)

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05, df1 = 2, df2 = 12
Critical value: F0.05 = 3.885
Test statistic: F_STAT = 6.5386
Decision: since the F_STAT test statistic is in the rejection region (6.5386 > 3.885, p-value = .0120 < .05), reject H0.
Conclusion: there is evidence that at least one independent variable affects Y.
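The critical value and p-value quoted above can be verified with scipy (an addition by the editor, not part of the slides):

from scipy import stats

f_stat = 14730.013 / 2252.776           # MSR / MSE from the ANOVA table
print(f_stat)                           # about 6.5386
print(stats.f.ppf(0.95, 2, 12))         # critical value F(0.05; 2, 12), about 3.885
print(stats.f.sf(f_stat, 2, 12))        # p-value, about 0.012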
Residuals in Multiple Regression

[Figure: two-variable model showing a sample observation Yi and its fitted value Ŷi above the point (x1i, x2i) on the regression plane.]

Residual = ei = Yi - Ŷi

The best-fit equation is found by minimizing the sum of squared errors, Σe².
Multiple Regression Assumptions

Errors (residuals) from the regression model: ei = Yi - Ŷi

Assumptions:
The errors are normally distributed.
The errors have a constant variance.
The model errors are independent.
Residual Plots Used in Multiple Regression

These residual plots are used in multiple regression:
Residuals vs. Ŷi
Residuals vs. X1i
Residuals vs. X2i
Residuals vs. time (if time-series data)

Use the residual plots to check for violations of the regression assumptions.
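In the Python sketch these plots can be drawn from the fitted `model` object, for example (the matplotlib usage is the editor's choice, not part of the slides):

import matplotlib.pyplot as plt

plt.scatter(model.fittedvalues, model.resid)   # residuals vs. predicted values
plt.axhline(0, linestyle="--")
plt.xlabel("Predicted pie sales")
plt.ylabel("Residual")
plt.show()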
Are Individual Variables Significant?

Use t tests of the individual variable slopes. Each test shows whether there is a linear relationship between the variable Xj and Y, holding constant the effects of the other X variables.

Hypotheses:
H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (a linear relationship does exist between Xj and Y)
Are Individual Variables Significant? (continued)

H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (a linear relationship does exist between Xj and Y)

Test statistic:

t_STAT = (bj - 0) / Sbj        (d.f. = n - k - 1)
Are Individual Variables Significant? Excel Output (continued)

From the coefficients table above:
t Stat for Price is t_STAT = -2.306, with p-value .0398
t Stat for Advertising is t_STAT = 2.855, with p-value .0145
Inferences about the Slope: t Test Example

H0: βj = 0; H1: βj ≠ 0
d.f. = 15 - 2 - 1 = 12, α = .05, so t(α/2) = 2.1788 (rejection regions below -2.1788 and above 2.1788, with α/2 = .025 in each tail)

From the Excel output:
For Price, t_STAT = -2.306 with p-value .0398
For Advertising, t_STAT = 2.855 with p-value .0145

Decision: the test statistic for each variable falls in the rejection region (p-values < .05), so reject H0 for each variable.
Conclusion: there is evidence that both price and advertising affect pie sales at α = .05.
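The critical value and p-values can again be checked with scipy (editor's illustration):

from scipy import stats

print(stats.t.ppf(1 - 0.05 / 2, 12))    # two-tailed critical value, about 2.1788
print(2 * stats.t.sf(2.30565, 12))      # p-value for the Price slope, about 0.0398
print(2 * stats.t.sf(2.85478, 12))      # p-value for the Advertising slope, about 0.0145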
Confidence Interval Estimate for the Slope

Confidence interval for the population slope βj:

bj ± t(α/2) Sbj,    where t has (n - k - 1) d.f.

              Coefficients   Standard Error
  Intercept    306.52619       114.25389
  Price        -24.97509        10.83213
  Advertising   74.13096        25.96732

Here, t has (15 - 2 - 1) = 12 d.f.

Example: form a 95% confidence interval for the effect of changes in price (X1) on pie sales:
-24.975 ± (2.1788)(10.832), so the interval is (-48.576, -1.374).
(This interval does not contain zero, so price has a significant effect on sales.)
Confidence Interval Estimate for the Slope (continued)

Excel output also reports these interval endpoints:

              Coefficients   Standard Error   ...   Lower 95%    Upper 95%
  Intercept    306.52619       114.25389      ...    57.58835    555.46404
  Price        -24.97509        10.83213      ...   -48.57626     -1.37392
  Advertising   74.13096        25.96732      ...    17.55303    130.70888

Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price, holding the effect of advertising constant.
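In the Python sketch the same 95% limits come from the fitted model directly:

print(model.conf_int(alpha=0.05))   # lower and upper 95% limits for each coefficient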
Using Dummy Variables

A dummy variable is a categorical independent variable with two levels (yes or no, on or off, male or female), coded as 0 or 1.
Using one assumes the slopes associated with the numerical independent variables do not change with the value of the categorical variable.
If a categorical variable has more than two levels, the number of dummy variables needed is (number of levels - 1).
Dummy-Variable Example (with 2 Levels)

Ŷ = b0 + b1 X1 + b2 X2

Let: Y = pie sales, X1 = price, X2 = holiday (X2 = 1 if a holiday occurred during the week, X2 = 0 if there was no holiday that week).
Dummy-Variable Example (with 2 Levels) (continued)

Holiday (X2 = 1):    Ŷ = (b0 + b2) + b1(Price)
No holiday (X2 = 0): Ŷ = b0 + b1(Price)

The two lines have the same slope b1 but different intercepts (b0 + b2 vs. b0). [Figure: two parallel lines of sales against price, with the holiday line shifted up by b2.]

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.
Interpreting the Dummy Variable Coefficient (with 2 Levels)

Example:
Sales: number of pies sold per week
Price: pie price in $
Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred

b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.
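A dummy regressor of this kind would be added to the earlier Python sketch as an extra 0/1 column. The holiday indicator is not part of the data shown on the slides, so the column below is purely hypothetical, for illustration only:

import statsmodels.formula.api as smf

# Hypothetical 0/1 holiday indicator (NOT from the slide data), added to the `pies` DataFrame
pies["Holiday"] = [0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0]
holiday_model = smf.ols("Sales ~ Price + Holiday", data=pies).fit()
print(holiday_model.params)   # the Holiday coefficient plays the role of b2 above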
Dummy-Variable Models (more than 2 Levels)

The number of dummy variables is one less than the number of levels.
Example: Y = house price; X1 = square feet.
If the style of the house is also thought to matter (style = ranch, split level, colonial): three levels, so two dummy variables are needed.
Dummy-Variable Models (more than 2 Levels) (continued)

Example: let "colonial" be the default category, and let X2 and X3 be used for the other two categories:
Y = house price
X1 = square feet
X2 = 1 if ranch, 0 otherwise
X3 = 1 if split level, 0 otherwise

The multiple regression equation is:

Ŷ = b0 + b1 X1 + b2 X2 + b3 X3
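The (levels - 1) coding can be produced mechanically, for example with pandas (the style values below are made up purely to illustrate the coding):

import pandas as pd

# Hypothetical style column, only to illustrate (levels - 1) dummy coding
styles = pd.Series(["colonial", "ranch", "split level", "ranch", "colonial"])
dummies = pd.get_dummies(styles)[["ranch", "split level"]]
print(dummies)   # colonial is the default category: both dummy columns are 0 for it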
Interpreting the Dummy Variable Coefficients (with 3 Levels)

Consider a fitted regression equation in which the coefficient of X2 (ranch) is 23.53 and the coefficient of X3 (split level) is 18.84, with house price in thousands of dollars:

For a colonial:    X2 = X3 = 0 (baseline)
For a ranch:       X2 = 1, X3 = 0
For a split level: X2 = 0, X3 = 1

With the same square feet, a ranch will have an estimated average price 23.53 thousand dollars higher than a colonial.
With the same square feet, a split level will have an estimated average price 18.84 thousand dollars higher than a colonial.
Interaction Between Independent Variables

Hypothesizes interaction between pairs of X variables: the response to one X variable may vary at different levels of another X variable.
The model contains two-way cross-product terms, for example:

Y = β0 + β1 X1 + β2 X2 + β3 X1 X2 + ε
Effect of Interaction

Given: Y = β0 + β1 X1 + β2 X2 + β3 X1 X2 + ε

Without the interaction term, the effect of X1 on Y is measured by β1.
With the interaction term, the effect of X1 on Y is measured by β1 + β3 X2, so the effect changes as X2 changes.
Interaction Example

Suppose X2 is a dummy variable and the estimated regression equation is Ŷ = 1 + 2X1 + 3X2 + 4X1X2.

X2 = 1: Ŷ = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1
X2 = 0: Ŷ = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1

The slopes are different if the effect of X1 on Y depends on the value of X2. [Figure: the two lines plotted for X1 from 0 to 1.5 and Y from 0 to 12.]
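In the pie-sales Python sketch, an interaction term could be added with a cross-product in the formula (a what-if illustration; no interaction model is estimated on the slides):

import statsmodels.formula.api as smf

# Reuses the `pies` DataFrame from the earlier sketch
inter_model = smf.ols("Sales ~ Price + Advertising + Price:Advertising", data=pies).fit()
print(inter_model.params)   # the Price:Advertising coefficient estimates beta3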
Logistic Regression

Used when the dependent variable Y is binary (i.e., Y takes on only two values). Examples:
Customer prefers Brand A or Brand B
Employee chooses to work full-time or part-time
Loan is delinquent or is not delinquent
Person voted in the last election or did not

Logistic regression allows you to predict the probability of a particular categorical response.
Logistic Regression (continued)

Logistic regression model:

ln(odds) = β0 + β1 X1i + β2 X2i + ... + βk Xki + εi

Logistic regression equation (estimated from the sample):

ln(estimated odds) = b0 + b1 X1i + b2 X2i + ... + bk Xki

where k = number of independent variables in the model, εi = random error in observation i, and the odds are the probability of success divided by the probability of failure.
Estimated Odds Ratio and Probability of Success

Once you have the logistic regression equation, compute the estimated odds:

estimated odds = e^(b0 + b1 X1i + b2 X2i + ... + bk Xki)

The estimated probability of success is:

estimated probability of success = estimated odds / (1 + estimated odds)
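A minimal sketch of a logistic fit in Python using statsmodels' Logit; the binary data and variable names below are hypothetical, chosen only to show the odds-to-probability calculation:

import numpy as np
import statsmodels.api as sm

# Hypothetical data: did the customer purchase (1) or not (0), by income ($000s)
income = np.array([21.0, 25, 28, 31, 36, 40, 44, 49, 53, 60])
bought = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

logit_fit = sm.Logit(bought, sm.add_constant(income)).fit()
log_odds = logit_fit.params[0] + logit_fit.params[1] * 45   # at income = 45
odds = np.exp(log_odds)          # estimated odds
print(odds / (1 + odds))         # estimated probability of success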
Nonlinear Relationships

The relationship between the dependent variable and an independent variable may not be linear. Review the scatter plot to check for non-linear relationships.
Example: quadratic model, in which the second independent variable is the square of the first variable.
Quadratic Regression Model

Model form:

Yi = β0 + β1 X1i + β2 X1i² + εi

where:
β0 = Y intercept
β1 = regression coefficient for the linear effect of X on Y
β2 = regression coefficient for the quadratic effect on Y
εi = random error in Y for observation i
Linear vs. Nonlinear Fit

[Figure: scatter plots of Y vs. X with a linear fit and with a nonlinear fit, together with the corresponding residual plots.] A linear fit does not give random residuals; a nonlinear fit gives random residuals.
Quadratic Regression Model (continued)

Quadratic models may be considered when the scatter plot takes on one of the following shapes: [Figure: four curves of Y against X1, for the sign combinations β1 < 0 or β1 > 0 with β2 > 0, and β1 < 0 or β1 > 0 with β2 < 0.]

β1 = the coefficient of the linear term
β2 = the coefficient of the squared term
Testing for Significance: Quadratic Effect (continued)

Testing the quadratic effect: compare the adjusted r² from the simple regression model to the adjusted r² from the quadratic model. If the adjusted r² from the quadratic model is larger than the adjusted r² from the simple model, then the quadratic model is likely a better model.
Example: Quadratic Model

Purity increases as filter time increases:

Purity   Filter Time
   3          1
   7          2
   8          3
  15          5
  22          7
  33          8
  40         10
  54         12
  67         13
  70         14
  78         15
  85         15
  87         16
  99         17
Example: Quadratic Model (continued)

Simple regression results: Ŷ = -11.283 + 5.985(Time)

Regression Statistics
  R Square            0.96888
  Adjusted R Square   0.96628
  Standard Error      6.15997

              Coefficients   Standard Error    t Stat      P-value
  Intercept    -11.28267        3.46805       -3.25332     0.00691
  Time           5.98520        0.30966       19.32819     2.078E-10

  F            Significance F
  373.57904    2.0778E-10

The t statistic, F statistic, and r² are all high, but the residuals are not random.
Example: Quadratic Model (continued)

Quadratic regression results: Ŷ = 1.539 + 1.565(Time) + 0.245(Time)²

                Coefficients   Standard Error    t Stat     P-value
  Intercept        1.53870        2.24465        0.68550    0.50722
  Time             1.56496        0.60179        2.60052    0.02467
  Time-squared     0.24516        0.03258        7.52406    1.165E-05

The quadratic term is statistically significant (its p-value is very small).
Example: Quadratic Model (continued)

Quadratic regression results: Ŷ = 1.539 + 1.565(Time) + 0.245(Time)²

Regression Statistics
  R Square            0.99494
  Adjusted R Square   0.99402
  Standard Error      2.59513

The adjusted r² of the quadratic model is higher than the adjusted r² of the simple regression model. The quadratic model explains 99.4% of the variation in Y.
Example: Quadratic Model Residual Plots (continued)

Quadratic regression results: Ŷ = 1.539 + 1.565(Time) + 0.245(Time)²

The residuals plotted versus both Time and Time-squared show a random pattern.
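The quadratic fit can be reproduced in the Python sketch by adding a squared term in the formula (assuming the 14 purity/filter-time pairs above were transcribed correctly from the slide):

import pandas as pd
import statsmodels.formula.api as smf

purity_df = pd.DataFrame({
    "Purity": [3, 7, 8, 15, 22, 33, 40, 54, 67, 70, 78, 85, 87, 99],
    "Time":   [1, 2, 3, 5, 7, 8, 10, 12, 13, 14, 15, 15, 16, 17],
})
quad = smf.ols("Purity ~ Time + I(Time ** 2)", data=purity_df).fit()
print(quad.params)          # compare with the intercept, Time, and Time-squared estimates above
print(quad.rsquared_adj)    # compare with the adjusted r-square above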
Using Transformations in Regression Analysis

Idea: non-linear models can often be transformed to a linear form and can then be estimated by least squares.
Transform X or Y or both to get a better fit or to deal with violations of the regression assumptions.
Transformations can be based on theory, logic, or scatter plots.
The Square Root Transformation

Yi = β0 + β1 √X1i + εi

The square-root transformation is used to overcome violations of the constant-variance assumption and to fit a non-linear relationship.
The Square Root Transformation (continued)

[Figure: the shape of the original relationship between Y and X, and the relationship after the square-root transformation, for b1 > 0 and for b1 < 0.]
The Log Transformation

The multiplicative model:
Original multiplicative model:    Yi = β0 X1i^β1 X2i^β2 εi
Transformed multiplicative model: ln(Yi) = ln(β0) + β1 ln(X1i) + β2 ln(X2i) + ln(εi)

The exponential model:
Original exponential model:    Yi = e^(β0 + β1 X1i + β2 X2i) εi
Transformed exponential model: ln(Yi) = β0 + β1 X1i + β2 X2i + ln(εi)
Interpretation of Coefficients

For the multiplicative model, when both the dependent and independent variables are logged, the coefficient of the independent variable Xk can be interpreted as follows: a 1 percent change in Xk leads to an estimated bk percent change in the average value of Y. Therefore bk is the elasticity of Y with respect to a change in Xk.
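In the pie-sales Python sketch, a multiplicative (log-log) model of this kind could be fit as follows, with the slope on log price read as a price elasticity (editor's illustration, not a model estimated on the slides):

import numpy as np
import statsmodels.formula.api as smf

# Reuses the `pies` DataFrame from the earlier sketch
loglog = smf.ols("np.log(Sales) ~ np.log(Price) + np.log(Advertising)", data=pies).fit()
print(loglog.params)   # coefficient on np.log(Price) = estimated elasticity of sales w.r.t. price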
Collinearity

Collinearity: high correlation exists among two or more independent variables. This means the correlated variables contribute redundant information to the multiple regression model.
Collinearity (continued)

Including two highly correlated independent variables can adversely affect the regression results:
No new information is provided
Can lead to unstable coefficients (large standard errors and low t-values)
Coefficient signs may not match prior expectations
Some Indications of Strong Collinearity

Incorrect signs on the coefficients
Large change in the value of a previous coefficient when a new variable is added to the model
A previously significant variable becomes non-significant when a new independent variable is added
The estimate of the standard deviation of the model increases when a variable is added to the model
Detecting Collinearity (Variance Inflationary Factor)

VIFj is used to measure collinearity:

VIFj = 1 / (1 - R²j)

where R²j is the coefficient of determination of variable Xj regressed on all the other X variables. If VIFj > 5, Xj is highly correlated with the other independent variables.
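The VIFs can also be computed in the Python sketch with statsmodels' helper function (reusing the `pies` DataFrame; the function call is the editor's illustration):

import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(pies[["Price", "Advertising"]])
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, variance_inflation_factor(X.values, i))   # both about 1.0009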
Example: Pie Sales

Recall the multiple regression equation of Chapter 14, Sales = b0 + b1(Price) + b2(Advertising), estimated from the 15 weeks of pie sales, price, and advertising data shown earlier.
Detecting Collinearity in Excel using PHStat

PHStat: Regression / Multiple Regression; check the "Variance Inflationary Factor (VIF)" box.

Output for the pie sales example (Price regressed on all other X):

Regression Statistics
  Multiple R           0.030438
  R Square             0.000926
  Adjusted R Square   -0.075925
  Standard Error       1.21527
  Observations        15
  VIF                  1.000927

Since there are only two independent variables, only one VIF is reported. The VIF is < 5, so there is no evidence of collinearity between Price and Advertising.
Model Building

The goal is to develop a model with the best set of independent variables. A model is easier to interpret if unimportant variables are removed, and there is a lower probability of collinearity.

Stepwise regression procedure: provides evaluation of alternative models as variables are added and deleted.
Best-subsets approach: try all combinations and select the best using the highest adjusted r² and lowest standard error.
Stepwise Regression

Idea: develop the least squares regression equation in steps, adding one independent variable at a time and evaluating whether existing variables should remain or be removed.
The coefficient of partial determination is the measure of the marginal contribution of each independent variable, given that the other independent variables are in the model.
Best Subsets Regression

Idea: estimate all possible regression equations using all possible combinations of independent variables. Choose the best fit by looking for the highest adjusted r² and lowest standard error.
Stepwise regression and best subsets regression can be performed using PHStat.
Alternative Best Subsets Criterion

Calculate the value Cp for each potential regression model. Consider models with Cp values close to or below k + 1, where k is the number of independent variables in the model under consideration.
Alternative Best Subsets Criterion (continued)

The Cp statistic:

Cp = [(1 - R²k)(n - T) / (1 - R²T)] - [n - 2(k + 1)]

where
k = number of independent variables included in a particular regression model
T = total number of parameters to be estimated in the full regression model
R²k = coefficient of multiple determination for the model with k independent variables
R²T = coefficient of multiple determination for the full model with all T estimated parameters
n = number of observations
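A direct transcription of this formula into Python (the function name and arguments are the editor's, and the example values in the last line are arbitrary illustrations):

def cp_statistic(r2_k, r2_full, n, k, T):
    """Cp as defined above: r2_k and r2_full are the coefficients of multiple
    determination for the k-variable model and the full model, n is the number
    of observations, and T is the number of parameters in the full model."""
    return (1 - r2_k) * (n - T) / (1 - r2_full) - (n - 2 * (k + 1))

# Example use with hypothetical values: a 2-variable candidate model, 15 observations,
# and a full model with T = 4 estimated parameters
print(cp_statistic(r2_k=0.52, r2_full=0.60, n=15, k=2, T=4))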
Steps in Model Building

1. Compile a listing of all independent variables under consideration.
2. Estimate the full model and check the VIFs.
3. Check whether any VIF > 5:
   If no VIF > 5, go to step 4.
   If one VIF > 5, remove this variable.
   If more than one VIF > 5, eliminate the variable with the highest VIF and go back to step 2.
4. Perform best subsets regression with the remaining variables.
Steps in Model Building (continued)

5. List all models with Cp close to or less than (k + 1).
6. Choose the best model. Consider parsimony: do extra variables make a significant contribution?
7. Perform a complete analysis with the chosen model, including residual analysis.
8. Transform the model if necessary to deal with violations of linearity or other model assumptions.
9. Use the model for prediction and inference.
Model Building Flowchart

[Flowchart:] Choose X1, X2, ..., Xk and run the regression to find the VIFs. If any VIF > 5: when more than one variable has VIF > 5, remove the variable with the highest VIF and re-run; when only one does, remove that X and re-run. Once no VIF > 5, run best subsets regression to obtain the "best" models in terms of Cp, do the complete analysis, add quadratic and/or interaction terms or transform variables as needed, and perform predictions.