
1 Statistics for Business and Economics 8 th Edition Chapter 12 Multiple Regression Ch. 12-1 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

2 Chapter Goals After completing this chapter, you should be able to: Apply multiple regression analysis to business decision- making situations Analyze and interpret the computer output for a multiple regression model Perform a hypothesis test for all regression coefficients or for a subset of coefficients Fit and interpret nonlinear regression models Incorporate qualitative variables into the regression model by using dummy variables Discuss model specification and analyze residuals Ch. 12-2 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

3 The Multiple Regression Model Idea: examine the linear relationship between 1 dependent variable (Y) and 2 or more independent variables (Xj). Multiple regression model with K independent variables: Y = β0 + β1x1 + β2x2 + ... + βKxK + ε, where β0 is the Y-intercept, β1, ..., βK are the population slopes, and ε is the random error. Ch. 12-3 12.1 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

4 Multiple Regression Equation The coefficients of the multiple regression model are estimated using sample data. Multiple regression equation with K independent variables: ŷ = b0 + b1x1 + b2x2 + ... + bKxK, where b0 is the estimated intercept, b1, ..., bK are the estimated slope coefficients, and ŷ is the estimated (or predicted) value of y. In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures. Ch. 12-4 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

5 Three Dimensional Graphing Two variable model: [3D plot of y against x1 and x2, showing the fitted regression plane with one slope for variable x1 and one slope for variable x2] Ch. 12-5 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

6 Three Dimensional Graphing (continued) Two variable model: for a sample observation (x1i, x2i, yi), the residual is ei = (yi - ŷi). The best-fit equation, ŷ, is found by minimizing the sum of squared errors, Σe². Ch. 12-6 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

7 Estimation of Coefficients 1. The x ji terms are fixed numbers, or they are realizations of random variables X j that are independent of the error terms, ε i 2. The expected value of the random variable Y is a linear function of the independent X j variables. 3. The error terms are normally distributed random variables with mean 0 and a constant variance,  2. (The constant variance property is called homoscedasticity) Ch. 12-7 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall 12.2 Standard Multiple Regression Assumptions

8 4. The random error terms, εi, are not correlated with one another, so that E(εi εj) = 0 for all i ≠ j. 5. It is not possible to find a set of numbers, c0, c1, ..., cK, not all zero, such that c0 + c1x1i + c2x2i + ... + cKxKi = 0 for every observation. (This is the property of no linear relation for the Xj's.) (continued) Ch. 12-8 Standard Multiple Regression Assumptions Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

9 Example: 2 Independent Variables A distributor of frozen dessert pies wants to evaluate factors thought to influence demand Dependent variable: Pie sales (units per week) Independent variables: Price (in $), Advertising ($100's) Data are collected for 15 weeks Ch. 12-9 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

10 Pie Sales Example Multiple regression equation: Sales = b0 + b1(Price) + b2(Advertising)

Week  Pie Sales  Price ($)  Advertising ($100s)
  1     350        5.50        3.3
  2     460        7.50        3.3
  3     350        8.00        3.0
  4     430        8.00        4.5
  5     350        6.80        3.0
  6     380        7.50        4.0
  7     430        4.50        3.0
  8     470        6.40        3.7
  9     450        7.00        3.5
 10     490        5.00        4.0
 11     340        7.20        3.5
 12     300        7.90        3.2
 13     440        5.90        4.0
 14     450        5.00        3.5
 15     300        7.00        2.7

Ch. 12-10 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
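As a sketch of what the software does behind the scenes, the least-squares coefficients for this two-variable model can be obtained by solving the normal equations (X'X)b = X'y directly. The data are transcribed from the table above; the solver below is a plain Gaussian-elimination routine written for this example, not a library API.

```python
# Pie-sales data from the 15-week table
sales = [350, 460, 350, 430, 350, 380, 430, 470, 450, 490, 340, 300, 440, 450, 300]
price = [5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40, 7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00]
adv   = [3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7]

X = [[1.0, p, a] for p, a in zip(price, adv)]  # design matrix: intercept, Price, Advertising

def solve_normal_equations(X, y):
    """Solve (X'X) b = X'y by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row + [rhs] for row, rhs in zip(XtX, Xty)]   # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

b0, b1, b2 = solve_normal_equations(X, sales)
print(b0, b1, b2)  # close to the Excel output on the next slides: ≈306.526, ≈-24.975, ≈74.131
```

The same coefficients appear in the Excel regression output shown on the following slide.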

11 Estimating a Multiple Linear Regression Equation Excel can be used to generate the coefficients and measures of goodness of fit for multiple regression Data / Data Analysis / Regression Ch. 12-11 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

12 Multiple Regression Output

Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error      47.46341
Observations        15

ANOVA         df        SS          MS         F       Significance F
Regression     2    29460.027   14730.013   6.53861       0.01201
Residual      12    27033.306    2252.776
Total         14    56493.333

              Coefficients  Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept       306.52619     114.25389     2.68285    0.01993    57.58835   555.46404
Price           -24.97509      10.83213    -2.30565    0.03979   -48.57626    -1.37392
Advertising      74.13096      25.96732     2.85478    0.01449    17.55303   130.70888

Ch. 12-12 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

13 The Multiple Regression Equation Sales = 306.526 - 24.975(Price) + 74.131(Advertising), where Sales is in number of pies per week, Price is in $, and Advertising is in $100's. b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising. b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price. Ch. 12-13 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

14 Example 12.3 Y: profit margins; X1: net revenue per deposit dollar; X2: number of offices. Ŷ = 1.56 + 0.237X1 - 0.000249X2

Correlations
         Y        X1
X1    -0.704
X2    -0.868    0.941

15 Simple regressions: Ŷ = 1.33 - 0.169X1 and Ŷ = 1.55 - 0.000120X2

16 Explanatory Power of a Multiple Regression Equation Coefficient of Determination, R 2 Reports the proportion of total variation in y explained by all x variables taken together This is the ratio of the explained variability to total sample variability Ch. 12-16 12.3 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

17 Coefficient of Determination, R² (regression output as on slide 12) R² = SSR/SST = 29460.027/56493.333 = 0.52148: 52.1% of the variation in pie sales is explained by the variation in price and advertising Ch. 12-17 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

18 Estimation of Error Variance Consider the population regression model Y = β0 + β1x1 + ... + βKxK + ε. The unbiased estimate of the variance of the errors is s²e = SSE/(n - K - 1), where SSE = Σe²i. The square root of the variance, se, is called the standard error of the estimate Ch. 12-18 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

19 Standard Error, se (regression output as on slide 12) se = 47.46341. The magnitude of this value can be compared to the average y value Ch. 12-19 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

20 Adjusted Coefficient of Determination, R 2 never decreases when a new X variable is added to the model, even if the new variable is not an important predictor variable This can be a disadvantage when comparing models What is the net effect of adding a new variable? We lose a degree of freedom when a new X variable is added Did the new X variable add enough explanatory power to offset the loss of one degree of freedom? Ch. 12-20 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

21 Adjusted Coefficient of Determination (continued) Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares: adjusted R² = 1 - [(1 - R²)(n - 1)/(n - K - 1)] (where n = sample size, K = number of independent variables) Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables Penalizes excessive use of unimportant independent variables Value is less than R² Ch. 12-21 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
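The adjustment can be checked with the pie-sales numbers (R² = 0.52148, n = 15, K = 2); a quick sketch:

```python
# Adjusted R-squared: 1 - (1 - R2) * (n - 1) / (n - K - 1)
R2, n, K = 0.52148, 15, 2
R2_adj = 1 - (1 - R2) * (n - 1) / (n - K - 1)
print(round(R2_adj, 5))  # ≈ 0.4417, matching "Adjusted R Square" in the output
```

The tiny difference from the printed 0.44172 comes from rounding R² before the calculation.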

22 (regression output as on slide 12) Adjusted R Square = 0.44172: 44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables Ch. 12-22 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

23 Coefficient of Multiple Correlation The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable Is the square root of the multiple coefficient of determination Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables Comparable to the correlation between Y and X in simple regression Ch. 12-23 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

24 Conf. Intervals and Hypothesis Tests for Regression Coefficients The variance of a coefficient estimate is affected by: the sample size the spread of the X variables the correlations between the independent variables, and the model error term We are typically more interested in the regression coefficients b j than in the constant or intercept b 0 Ch. 12-24 12.4 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

25 Confidence Intervals Confidence interval limits for the population slope βj: bj ± t(n-K-1, α/2) s(bj), where t has (n - K - 1) d.f. Example: form a 95% confidence interval for the effect of changes in price (x1) on pie sales. Here, t has (15 - 2 - 1) = 12 d.f., so the interval is -24.975 ± (2.1788)(10.832), that is, -48.576 < β1 < -1.374. (From the output: Intercept 306.52619, s.e. 114.25389; Price -24.97509, s.e. 10.83213; Advertising 74.13096, s.e. 25.96732.) Ch. 12-25 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
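The interval arithmetic is easy to reproduce, using the coefficient and standard error from the output and the tabulated t value:

```python
# 95% CI for the price slope: b1 ± t_{12, 0.025} * s_{b1}
b1, s_b1, t_crit = -24.97509, 10.83213, 2.1788
margin = t_crit * s_b1
lower, upper = b1 - margin, b1 + margin
print(lower, upper)  # ≈ -48.576 and ≈ -1.374, the interval quoted on the slide
```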

26 Confidence Intervals (continued) Confidence interval for the population slope βj. Example: Excel output also reports these interval endpoints (the Lower 95% and Upper 95% columns in the output on slide 12). Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each increase of $1 in the selling price Ch. 12-26 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

27 Hypothesis Tests Use t-tests for individual coefficients Shows if a specific independent variable is conditionally important Hypotheses: H 0 : β j = 0 (no linear relationship) H 1 : β j ≠ 0 (linear relationship does exist between x j and y) Ch. 12-27 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

28 Evaluating Individual Regression Coefficients (continued) H0: βj = 0 (no linear relationship) H1: βj ≠ 0 (linear relationship does exist between xj and y) Test statistic: t = bj / s(bj), with df = n - K - 1 Ch. 12-28 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

29 Evaluating Individual Regression Coefficients (continued) (regression output as on slide 12) t-value for Price is t = -2.306, with p-value .0398; t-value for Advertising is t = 2.855, with p-value .0145 Ch. 12-29 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
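The reported t statistics are simply each coefficient divided by its standard error:

```python
# t = b_j / s_{b_j} (hypothesized value is 0) for each slope in the output
coefs = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}
for name, (b, s_b) in coefs.items():
    print(name, round(b / s_b, 5))  # ≈ -2.306 for Price, ≈ 2.855 for Advertising
```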

30 Example: Evaluating Individual Regression Coefficients H0: βj = 0; H1: βj ≠ 0; d.f. = 15 - 2 - 1 = 12; α = .05; t(12, .025) = 2.1788. From the Excel output: Price b = -24.97509, s.e. = 10.83213, t = -2.30565, p = .03979; Advertising b = 74.13096, s.e. = 25.96732, t = 2.85478, p = .01449. Decision: reject H0 for each variable, since the test statistic for each falls in the rejection region (|t| > 2.1788, p-values < .05). Conclusion: there is evidence that both Price and Advertising affect pie sales at α = .05 Ch. 12-30 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

31 Tests on Regression Coefficients Tests on All Coefficients F-Test for Overall Significance of the Model Shows if there is a linear relationship between all of the X variables considered together and Y Use F test statistic Hypotheses: H 0 : β 1 = β 2 = … = β K = 0 (no linear relationship) H 1 : at least one β i ≠ 0 (at least one independent variable affects Y) Ch. 12-31 12.5 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

32 F-Test for Overall Significance Test statistic: F = MSR/MSE = (SSR/K) / (SSE/(n - K - 1)), where F has K (numerator) and (n - K - 1) (denominator) degrees of freedom. The decision rule is: reject H0 if F > F(K, n-K-1, α) Ch. 12-32 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

33 F-Test for Overall Significance (continued) (regression output as on slide 12) F = 6.53861, with 2 and 12 degrees of freedom; the P-value for the F-Test is 0.01201 Ch. 12-33 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
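The overall F statistic can be recovered directly from the ANOVA sums of squares in the output:

```python
# F = MSR / MSE = (SSR/K) / (SSE/(n - K - 1)) for the pie-sales model
SSR, SSE, n, K = 29460.027, 27033.306, 15, 2
MSR = SSR / K                  # 14730.013, the "MS" for Regression
MSE = SSE / (n - K - 1)        # 2252.776, the "MS" for Residual
F = MSR / MSE
print(F)  # ≈ 6.5386, as in the ANOVA table
```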

34 F-Test for Overall Significance (continued) H0: β1 = β2 = 0; H1: β1 and β2 not both zero; α = .05; df1 = 2, df2 = 12. Critical value: F(.05) = 3.885. Test statistic: F = 6.5386. Decision: since the F test statistic is in the rejection region (F > 3.885, p-value < .05), reject H0. Conclusion: there is evidence that at least one independent variable affects Y Ch. 12-34 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

35 Test on a Subset of Regression Coefficients Consider a multiple regression model involving variables Xj and Zj, Y = β0 + β1x1 + ... + βKxK + α1z1 + ... + αRzR + ε, and the null hypothesis that the Z variable coefficients are all zero: H0: α1 = α2 = ... = αR = 0; H1: at least one αj ≠ 0 Ch. 12-35 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

36 Tests on a Subset of Regression Coefficients (continued) Goal: compare the error sum of squares for the complete model with the error sum of squares for the restricted model. First run a regression for the complete model and obtain SSE. Next run a restricted regression that excludes the Z variables (the number of variables excluded is R) and obtain the restricted error sum of squares SSE(R). Compute the F statistic F = [(SSE(R) - SSE)/R] / [SSE/(n - K - R - 1)] and reject H0 if F > F(R, n-K-R-1, α) for a significance level α Ch. 12-36 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

37 Newbold 12.41 n = 30 families. Explain household milk consumption. Y: milk consumption (quarts per week); X1: family income (dollars per week); X2: family size. LS estimates: b0 = -0.025, b1 = 0.052, b2 = 1.14; SST = 162.1, SSE = 88.2. Does adding X3, the number of preschool children, help? With X3 included, SSE = 83.7

38 Test the null hypothesis that the number of preschool children in a family does not significantly affect weekly family milk consumption
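A sketch of the partial F test for this exercise, using the two error sums of squares from the previous slide (the full model has 3 regressors in total, so the error degrees of freedom are 30 - 3 - 1 = 26):

```python
# Partial F test: F = ((SSE_restricted - SSE_full) / r) / (SSE_full / error_df)
SSE_restricted, SSE_full = 88.2, 83.7   # without and with X3
n, total_regressors, r = 30, 3, 1       # r = number of coefficients being tested
error_df = n - total_regressors - 1     # 26
F = ((SSE_restricted - SSE_full) / r) / (SSE_full / error_df)
print(F)  # ≈ 1.40; below F(1, 26, 0.05) ≈ 4.23, so do not reject H0
```

At the 5% level the number of preschool children does not significantly improve the model.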


40 Prediction Given a population regression model, then given a new observation of a data point (x1,n+1, x2,n+1, ..., xK,n+1) the best linear unbiased forecast of yn+1 is ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + ... + bKxK,n+1. It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we do not have data to support that the linear model extends beyond the observed range. Ch. 12-40 12.6 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

41 Predictions from a Multiple Regression Model Predict sales for a week in which the selling price is $5.50 and advertising is $350: Predicted sales is 428.62 pies Note that Advertising is in $100’s, so $350 means that X 2 = 3.5 Ch. 12-41 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
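Plugging the new observation into the estimated equation gives the forecast quoted on the slide:

```python
# Prediction at price = $5.50 and advertising = $350 (x2 = 3.5 in $100s)
b0, b1, b2 = 306.526, -24.975, 74.131   # estimated pie-sales coefficients
sales_hat = b0 + b1 * 5.50 + b2 * 3.5
print(sales_hat)  # ≈ 428.62 pies
```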

42 Transformations for Nonlinear Regression Models The relationship between the dependent variable and an independent variable may not be linear Can review the scatter diagram to check for non-linear relationships Example: Quadratic model The second independent variable is the square of the first variable Ch. 12-42 12.7 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

43 Quadratic Model Transformations Quadratic model form: Yi = β0 + β1Xi + β2Xi² + εi. Let x1 = X and x2 = X², and specify the model as Yi = β0 + β1x1i + β2x2i + εi, where: β0 = Y intercept, β1 = regression coefficient for linear effect of X on Y, β2 = regression coefficient for quadratic effect on Y, εi = random error in Y for observation i Ch. 12-43 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

44 Linear vs. Nonlinear Fit A linear fit does not give random residuals; a nonlinear fit gives random residuals. [Scatter plots of Y vs. X with linear and nonlinear fitted curves, and the corresponding residual plots] Ch. 12-44 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

45 Quadratic Regression Model Quadratic models may be considered when the scatter diagram takes on one of the following shapes: [four curves combining β1 < 0 or β1 > 0 with β2 > 0 or β2 < 0] β1 = the coefficient of the linear term, β2 = the coefficient of the squared term Ch. 12-45 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

46 Testing for Significance: Quadratic Effect Testing the Quadratic Effect Compare the linear regression estimate with the quadratic regression estimate Hypotheses: H0: β2 = 0 (the quadratic term does not improve the model); H1: β2 ≠ 0 (the quadratic term improves the model) Ch. 12-46 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

47 Testing for Significance: Quadratic Effect (continued) Testing the Quadratic Effect Hypotheses: H0: β2 = 0 (the quadratic term does not improve the model); H1: β2 ≠ 0 (the quadratic term improves the model). The test statistic is t = (b2 - β2) / s(b2), where: b2 = squared-term slope coefficient, β2 = hypothesized slope (zero), s(b2) = standard error of the slope Ch. 12-47 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

48 Testing for Significance: Quadratic Effect Testing the Quadratic Effect Compare R 2 from simple regression to R 2 from the quadratic model If R 2 from the quadratic model is larger than R 2 from the simple model, then the quadratic model is a better model (continued) Ch. 12-48 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

49 Example Performance by age: performance first increases and then falls again at older ages, a quadratic relation


52 Example: Quadratic Model Purity increases as filter time increases:

Purity   Filter Time
   3          1
   7          2
   8          3
  15          5
  22          7
  33          8
  40         10
  54         12
  67         13
  70         14
  78         15
  85         15
  87         16
  99         17

Ch. 12-52 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

53 Example: Quadratic Model (continued) Simple regression results: ŷ = -11.283 + 5.985 Time

Regression Statistics
R Square             0.96888
Adjusted R Square    0.96628
Standard Error       6.15997

            Coefficients  Standard Error   t Stat     P-value
Intercept    -11.28267       3.46805      -3.25332    0.00691
Time           5.98520       0.30966      19.32819    2.078E-10

F = 373.57904, Significance F = 2.0778E-10

t statistic, F statistic, and R² are all high, but the residuals are not random Ch. 12-53 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

54 Example: Quadratic Model (continued) Quadratic regression results: ŷ = 1.539 + 1.565 Time + 0.245 (Time)²

Regression Statistics
R Square             0.99494
Adjusted R Square    0.99402
Standard Error       2.59513

               Coefficients  Standard Error   t Stat    P-value
Intercept        1.53870        2.24465      0.68550    0.50722
Time             1.56496        0.60179      2.60052    0.02467
Time-squared     0.24516        0.03258      7.52406    1.165E-05

F = 1080.7330, Significance F = 2.368E-13

The quadratic term is significant and improves the model: R² is higher and se is lower, residuals are now random Ch. 12-54 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
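The quadratic fit above can be reproduced from the purity data by adding a Time² column to the design matrix and solving the normal equations, the same approach as for the pie-sales model (a hand-rolled sketch, not a library routine):

```python
# Purity data from the table on slide 52
time   = [1, 2, 3, 5, 7, 8, 10, 12, 13, 14, 15, 15, 16, 17]
purity = [3, 7, 8, 15, 22, 33, 40, 54, 67, 70, 78, 85, 87, 99]

X = [[1.0, t, t * t] for t in time]   # columns: intercept, Time, Time^2
k = len(X[0])
# Build and solve the normal equations (X'X) b = X'y by Gaussian elimination
XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
Xty = [sum(r[i] * y for r, y in zip(X, purity)) for i in range(k)]
A = [row + [rhs] for row, rhs in zip(XtX, Xty)]
for col in range(k):
    piv = max(range(col, k), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    for r in range(col + 1, k):
        f = A[r][col] / A[col][col]
        A[r] = [a - f * b for a, b in zip(A[r], A[col])]
b = [0.0] * k
for i in reversed(range(k)):
    b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
print(b)  # ≈ [1.539, 1.565, 0.245], the quadratic output above
```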

55 Logarithmic Transformations The Exponential Model: original exponential model Y = β0 X1^β1 X2^β2 ε; transformed logarithmic model log Y = log β0 + β1 log X1 + β2 log X2 + log ε Ch. 12-55 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

56 Interpretation of coefficients For the logarithmic model: When both dependent and independent variables are logged: The estimated coefficient b k of the independent variable X k can be interpreted as a 1 percent change in X k leads to an estimated b k percentage change in the average value of Y b k is the elasticity of Y with respect to a change in X k Ch. 12-56 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

57 Example: Cobb-Douglas Production Function Production function: output as a function of labor and capital

58 Original and New Variables Original data and the new (log-transformed) variables

59 Regression Output

60 Parameter values b1 = 0.71, b2 = 1 - 0.71 = 0.29, ln(b0) = 2.577 so b0 = exp(2.577) = 13.15. The estimated Cobb-Douglas production function is Output = 13.15 L^0.71 K^0.29
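A quick check of the back-transformation, plus an illustration of why forcing b2 = 1 - b1 gives constant returns to scale (the labor and capital values below are made up for the illustration):

```python
import math

# Recover the scale parameter from the log-linear fit: ln(b0) = 2.577
b0 = math.exp(2.577)
print(round(b0, 2))  # ≈ 13.16 (the slide rounds this to 13.15)

# Constant returns to scale: since the exponents sum to 0.71 + 0.29 = 1,
# doubling both inputs doubles output (inputs here are illustrative values)
q1 = b0 * (100 ** 0.71) * (50 ** 0.29)
q2 = b0 * (200 ** 0.71) * (100 ** 0.29)
print(q2 / q1)  # ≈ 2.0
```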

61 Dummy Variables for Regression Models A dummy variable is a categorical independent variable with two levels: yes or no, on or off, male or female recorded as 0 or 1 Regression intercepts are different if the variable is significant Assumes equal slopes for other variables If more than two levels, the number of dummy variables needed is (number of levels - 1) Ch. 12-61 12.8 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

62 Dummy Variable Example Let: y = Pie Sales x 1 = Price x 2 = Holiday (X 2 = 1 if a holiday occurred during the week) (X 2 = 0 if there was no holiday that week) Ch. 12-62 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

63 Dummy Variable Example (continued) [Plot of y (sales) vs. x1 (Price): two parallel lines with the same slope but different intercepts, b0 + b2 for Holiday (x2 = 1) and b0 for No Holiday (x2 = 0)] If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales Ch. 12-63 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

64 Interpreting the Dummy Variable Coefficient Example: Sales: number of pies sold per week; Price: pie price in $; Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred. b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price Ch. 12-64 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

65 Differences in Slope Hypothesizes interaction between pairs of x variables Response to one x variable may vary at different levels of another x variable Contains two-way cross product terms Ch. 12-65 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

66 Effect of Interaction Given: Y = β0 + β1X1 + β2X2 + β3X1X2 + ε. Without the interaction term, the effect of X1 on Y is measured by β1. With the interaction term, the effect of X1 on Y is measured by β1 + β3X2: the effect changes as X2 changes Ch. 12-66 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

67 Interaction Example Suppose x2 is a dummy variable and the estimated regression equation is ŷ = 1 + 2x1 + 3x2 + 4x1x2. x2 = 1: ŷ = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1. x2 = 0: ŷ = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1. Slopes are different if the effect of x1 on y depends on the x2 value. [Plot of the two fitted lines] Ch. 12-67 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall
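The two slopes can be read off the fitted equation directly:

```python
# Fitted equation from the interaction example: y_hat = 1 + 2*x1 + 3*x2 + 4*x1*x2
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope in x1 at each level of the dummy x2: 2 when x2 = 0, 2 + 4 = 6 when x2 = 1
slope_x2_0 = y_hat(1, 0) - y_hat(0, 0)
slope_x2_1 = y_hat(1, 1) - y_hat(0, 1)
print(slope_x2_0, slope_x2_1)  # 2 6
```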

68 Significance of Interaction Term The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1 compared to when x2 = 0 The t statistic for b3 can be used to test the hypothesis H0: β3 = 0 against H1: β3 ≠ 0 If we reject the null hypothesis we conclude that there is a difference in the slope coefficient for the two subgroups Ch. 12-68 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

69 Multiple Regression Analysis Application Procedure Assumptions: The errors are normally distributed; errors have a constant variance; the model errors are independent. Errors (residuals) from the regression model: ei = (yi - ŷi) Ch. 12-69 12.9 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

70 Analysis of Residuals These residual plots are used in multiple regression: Residuals vs. ŷi; Residuals vs. x1i; Residuals vs. x2i; Residuals vs. time (if time series data). Use the residual plots to check for violations of regression assumptions Ch. 12-70 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

71 Chapter Summary Developed the multiple regression model Tested the significance of the multiple regression model Discussed adjusted R 2 ( R 2 ) Tested individual regression coefficients Tested portions of the regression model Used quadratic terms and log transformations in regression models Explained dummy variables Evaluated interaction effects Discussed using residual plots to check model assumptions Ch. 12-71 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

72 Exercise 12.78 n = 93 students rate, on a scale of 1 (low) to 10 (high), their opinion of residence hall life. The levels of satisfaction with roommates, floor, hall, and resident advisor are X1 through X4; Y is the overall opinion of residence hall life

73

Source   df    SS       MS       F       R-sq
Model     4   37.016   9.2540   9.958   0.312
Error    88   81.780   0.9293
Total    92  118.79

Parameter  Estimate    t     Std Error
Intercept   3.950     5.84     0.676
x1          0.106     1.69     0.063
x2          0.122     1.70     0.072
x3          0.092     1.75     0.053
x4          0.169     2.64     0.064
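The F and R-sq entries in this ANOVA table follow from the sums of squares, which makes a quick consistency check possible:

```python
# Consistency checks on the Exercise 12.78 ANOVA table
SS_model, SS_error, SS_total = 37.016, 81.780, 118.79
df_model, df_error = 4, 88
R2 = SS_model / SS_total                              # coefficient of determination
F = (SS_model / df_model) / (SS_error / df_error)     # MS_model / MS_error
print(round(R2, 3), round(F, 3))  # ≈ 0.312 and ≈ 9.958, as tabulated
```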


75 Exercise 12.82 Developing countries: 48 countries. Y: percentage manufacturing growth; X1: percentage agricultural growth; X2: percentage exports growth; X3: percentage rate of inflation


78 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall Ch. 12-78 All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America.

