Chapter 8-B Multiple Regression

Presentation transcript:

Chapter 8-B Multiple Regression. SMIS 331 Data Mining, 2018/2019 Fall. Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall. Ch. 11-1

Chapter 12 Multiple Regression. Statistics for Business and Economics, 8th Edition.

Chapter Goals. After completing this chapter, you should be able to: apply multiple regression analysis to business decision-making situations; analyze and interpret the computer output for a multiple regression model; perform a hypothesis test for all regression coefficients or for a subset of coefficients; fit and interpret nonlinear regression models; incorporate qualitative variables into the regression model by using dummy variables; and discuss model specification and analyze residuals.

The Multiple Regression Model (12.1). Idea: examine the linear relationship between 1 dependent variable (Y) and 2 or more independent variables (Xj). Multiple regression model with K independent variables: Y = β0 + β1X1 + β2X2 + … + βKXK + ε, where β0 is the Y-intercept, β1, …, βK are the population slopes, and ε is the random error.

Multiple Regression Equation. The coefficients of the multiple regression model are estimated using sample data. Multiple regression equation with K independent variables: ŷ = b0 + b1x1 + b2x2 + … + bKxK, where ŷ is the estimated (or predicted) value of y, b0 is the estimated intercept, and b1, …, bK are the estimated slope coefficients. In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures.
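The slides rely on software for the estimates. As an illustrative sketch (the data and variable names below are made up, not from the pie-sales example), the same least-squares coefficients that Excel reports can be computed with NumPy:

```python
import numpy as np

# Hypothetical data: one response y and two predictors x1, x2
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y  = np.array([5.1, 6.9, 12.2, 13.1, 17.8])

# Design matrix with a leading column of ones for the intercept b0
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates b0, b1, b2 (what software reports as "Coefficients")
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b            # fitted values
residuals = y - y_hat    # e_i = y_i - y_hat_i
```

Because the intercept column is included, the residuals sum to zero, just as in the textbook's fitted models.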

Three Dimensional Graphing. Two-variable model: [figure: the regression plane plotted over the (x1, x2) axes against y, showing the slope for variable x1 and the slope for variable x2.]

Three Dimensional Graphing (continued). Two-variable model: [figure: a sample observation yi above the fitted plane, with fitted value ŷi and residual ei = (yi – ŷi).] The best-fit equation, ŷ, is found by minimizing the sum of squared errors, Σe².

Estimation of Coefficients (12.2). Standard multiple regression assumptions: 1. The xji terms are fixed numbers, or they are realizations of random variables Xj that are independent of the error terms, εi. 2. The expected value of the random variable Y is a linear function of the independent Xj variables. 3. The error terms are normally distributed random variables with mean 0 and a constant variance, σ². (The constant-variance property is called homoscedasticity.)

Standard Multiple Regression Assumptions (continued). 4. The random error terms, εi, are not correlated with one another, so that E[εiεj] = 0 for all i ≠ j. 5. It is not possible to find a set of numbers, c0, c1, …, cK, such that c0 + c1x1i + c2x2i + … + cKxKi = 0 for every observation. (This is the property of no linear relation among the Xjs.)

The coefficient estimators are computed using the following equations (shown here for the two-variable case): b1 = sy(rx1y – rx1x2 rx2y) / [sx1(1 – r²x1x2)], b2 = sy(rx2y – rx1x2 rx1y) / [sx2(1 – r²x1x2)], and b0 = ȳ – b1x̄1 – b2x̄2.

Example: 2 Independent Variables. A distributor of frozen dessert pies wants to evaluate factors thought to influence demand. Dependent variable: pie sales (units per week). Independent variables: price (in $) and advertising (in $100s). Data are collected for 15 weeks.

Pie Sales Example. Weekly data (values missing in the source are left blank):

Week  Pie Sales  Price ($)  Advertising ($100s)
 1      350        5.50        3.3
 2      460        7.50
 3                 8.00        3.0
 4      430                    4.5
 5                 6.80
 6      380                    4.0
 7                 4.50
 8      470        6.40        3.7
 9      450        7.00        3.5
10      490        5.00
11      340        7.20
12      300        7.90        3.2
13      440        5.90
14
15                             2.7

Multiple regression equation: Sales = b0 + b1(Price) + b2(Advertising)

Estimating a Multiple Linear Regression Equation Excel can be used to generate the coefficients and measures of goodness of fit for multiple regression Data / Data Analysis / Regression Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Multiple Regression Output

Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

ANOVA         df       SS          MS          F       Significance F
Regression     2   29460.027   14730.013   6.53861      0.01201
Residual      12   27033.306    2252.776
Total         14   56493.333

             Coefficients  Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept      306.52619     114.25389     2.68285    0.01993    57.58835   555.46404
Price          -24.97509      10.83213    -2.30565    0.03979   -48.57626    -1.37392
Advertising     74.13096      25.96732     2.85478    0.01449    17.55303   130.70888

The Multiple Regression Equation: Sales = 306.526 – 24.975(Price) + 74.131(Advertising), where Sales is in number of pies per week, Price is in $, and Advertising is in $100s. b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising. b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price.

Example 12.3. Y: profit margins; X1: net revenue per deposit dollar; X2: number of offices. Estimated model: Y = 1.56 + 0.237X1 – 0.000249X2.

Correlations:
         Y       X1
X1    -0.704
X2    -0.868   0.941

Separate simple regressions: Y = 1.33 – 0.169X1 and Y = 1.55 – 0.000120X2.

Explanatory Power of a Multiple Regression Equation (12.3). Coefficient of Determination, R²: reports the proportion of total variation in y explained by all x variables taken together, R² = SSR/SST, the ratio of the explained variability to total sample variability.

Coefficient of Determination, R². From the regression output above, R Square = 0.52148: 52.1% of the variation in pie sales is explained by the variation in price and advertising.

Estimation of Error Variance. Consider the population regression model Y = β0 + β1X1 + … + βKXK + ε. The unbiased estimate of the variance of the errors is s²e = SSE/(n – K – 1), where SSE = Σe²i is the sum of squared residuals. The square root of the variance, se, is called the standard error of the estimate.
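This formula can be checked numerically against the pie-sales ANOVA table (SSE = 27033.306, n = 15, K = 2):

```python
# Standard error of the estimate from the pie-sales ANOVA table
SSE, n, K = 27033.306, 15, 2
s2_e = SSE / (n - K - 1)   # unbiased estimate of the error variance
s_e = s2_e ** 0.5
print(round(s_e, 3))       # 47.463, matching the reported Standard Error 47.46341
```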

Standard Error, se. From the regression output above, se = 47.46341. The magnitude of this value can be compared to the average y value.

Adjusted Coefficient of Determination. R² never decreases when a new X variable is added to the model, even if the new variable is not an important predictor. This can be a disadvantage when comparing models. What is the net effect of adding a new variable? We lose a degree of freedom when a new X variable is added. Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?

Adjusted Coefficient of Determination (continued). Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares: adjusted R² = 1 – (1 – R²)(n – 1)/(n – K – 1), where n = sample size and K = number of independent variables. Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables: it penalizes excessive use of unimportant independent variables, and its value is less than R².
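Plugging the pie-sales numbers (R² = 0.52148, n = 15, K = 2) into this formula reproduces the Adjusted R Square reported by Excel:

```python
# Adjusted R-squared from the pie-sales output
R2, n, K = 0.52148, 15, 2
adj_R2 = 1 - (1 - R2) * (n - 1) / (n - K - 1)
print(round(adj_R2, 5))   # 0.44173, matching the reported 0.44172 up to rounding
```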

From the regression output above, Adjusted R Square = 0.44172: 44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables.

Coefficient of Multiple Correlation The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable Is the square root of the multiple coefficient of determination Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables Comparable to the correlation between Y and X in simple regression Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Conf. Intervals and Hypothesis Tests for Regression Coefficients 12.4 The variance of a coefficient estimate is affected by: the sample size the spread of the X variables the correlations between the independent variables, and the model error term We are typically more interested in the regression coefficients bj than in the constant or intercept b0 Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Confidence Intervals. Confidence interval limits for the population slope βj: bj ± t(n–K–1, α/2) s(bj), where t has (n – K – 1) d.f. Coefficients (standard errors): Intercept 306.52619 (114.25389); Price -24.97509 (10.83213); Advertising 74.13096 (25.96732). Here, t has (15 – 2 – 1) = 12 d.f. Example: form a 95% confidence interval for the effect of changes in price (x1) on pie sales: -24.975 ± (2.1788)(10.832), so the interval is -48.576 < β1 < -1.374.
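The interval arithmetic can be verified directly (the critical value 2.1788 is taken from the slide):

```python
# 95% CI for the price slope: b1 +/- t(12, .025) * s_b1
b1, s_b1, t_crit = -24.975, 10.832, 2.1788
lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(lower, 3), round(upper, 3))   # -48.576 -1.374
```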

Confidence Intervals (continued). Excel output also reports these interval endpoints: Intercept 306.52619 (114.25389), 57.58835 to 555.46404; Price -24.97509 (10.83213), -48.57626 to -1.37392; Advertising 74.13096 (25.96732), 17.55303 to 130.70888. Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each increase of $1 in the selling price.

Hypothesis Tests. Use t tests for individual coefficients; a t test shows whether a specific independent variable is conditionally important. Hypotheses: H0: βj = 0 (no linear relationship); H1: βj ≠ 0 (a linear relationship does exist between xj and y).

Evaluating Individual Regression Coefficients (continued). H0: βj = 0 (no linear relationship); H1: βj ≠ 0 (a linear relationship does exist between xj and y). Test statistic: t = bj / s(bj), with df = n – K – 1.
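Applying t = bj / s(bj) to the pie-sales coefficients reproduces the t Stat column of the output:

```python
# t statistics for each slope, b_j / s_bj, from the pie-sales output
coeffs = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}
for name, (b, s_b) in coeffs.items():
    t = b / s_b
    print(name, round(t, 4))   # Price -2.3056, Advertising 2.8548
```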

Evaluating Individual Regression Coefficients (continued). From the regression output above: the t value for Price is t = -2.306, with p-value .0398; the t value for Advertising is t = 2.855, with p-value .0145.

Example: Evaluating Individual Regression Coefficients. H0: βj = 0; H1: βj ≠ 0. From the Excel output: Price, coefficient -24.97509, standard error 10.83213, t Stat -2.30565, p-value 0.03979; Advertising, coefficient 74.13096, standard error 25.96732, t Stat 2.85478, p-value 0.01449. With d.f. = 15 – 2 – 1 = 12 and α = .05, t(12, .025) = 2.1788, so the rejection region is t < -2.1788 or t > 2.1788. The test statistic for each variable falls in the rejection region (both p-values < .05). Decision: reject H0 for each variable. Conclusion: there is evidence that both Price and Advertising affect pie sales at α = .05.

Tests on Regression Coefficients (12.5). Tests on all coefficients: the F test for overall significance of the model shows whether there is a linear relationship between all of the X variables considered together and Y. Use the F test statistic. Hypotheses: H0: β1 = β2 = … = βK = 0 (no linear relationship); H1: at least one βj ≠ 0 (at least one independent variable affects Y).

F-Test for Overall Significance. Test statistic: F = MSR/MSE = (SSR/K) / (SSE/(n – K – 1)), where F has K (numerator) and (n – K – 1) (denominator) degrees of freedom. The decision rule is to reject H0 if F > F(K, n–K–1, α).
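Using the sums of squares from the pie-sales ANOVA table, the overall F statistic is:

```python
# Overall F test from the pie-sales ANOVA table: F = MSR / MSE
SSR, SSE, K, n = 29460.027, 27033.306, 2, 15
MSR = SSR / K
MSE = SSE / (n - K - 1)
F = MSR / MSE
print(round(F, 4))   # 6.5386, as reported; this exceeds F(2, 12, .05) = 3.885
```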

F-Test for Overall Significance (continued). From the ANOVA table above: F = 6.53861, with 2 and 12 degrees of freedom; the p-value for the F test is 0.01201 (Significance F).

F-Test for Overall Significance (continued). H0: β1 = β2 = 0; H1: β1 and β2 not both zero; α = .05, df1 = 2, df2 = 12. Critical value: F.05 = 3.885. Test statistic: F = 6.5386. Decision: since the F test statistic is in the rejection region (p-value < .05), reject H0. Conclusion: there is evidence that at least one independent variable affects Y.

Test on a Subset of Regression Coefficients. Consider a multiple regression model involving variables Xj and Zj, y = β0 + β1x1 + … + βKxK + α1z1 + … + αRzR + ε, and the null hypothesis that the Z variable coefficients are all zero: H0: α1 = α2 = … = αR = 0; H1: at least one αj ≠ 0.

Tests on a Subset of Regression Coefficients (continued). Goal: compare the error sum of squares for the complete model with the error sum of squares for the restricted model. First run a regression for the complete model and obtain SSE. Next run a restricted regression that excludes the Z variables (the number of variables excluded is R) and obtain the restricted error sum of squares SSE(R). Compute the F statistic, F = [(SSE(R) – SSE)/R] / [SSE/(n – K – R – 1)], and reject H0 at significance level α if it exceeds the critical value F(R, n–K–R–1, α).

Newbold 12.41. n = 30 families. Explain household milk consumption: Y: milk consumption (quarts per week); X1: family income (dollars per week); X2: family size. LS estimators: b0 = -0.025, b1 = 0.052, b2 = 1.14; SST = 162.1, SSE = 88.2. Adding X3, the number of preschool children, gives SSE = 83.7.

Test the null hypothesis that the number of preschool children in a family does not significantly affect weekly family milk consumption.
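A sketch of the subset F computation for this exercise, using the sums of squares given above (the conclusion depends on the F(1, 26) critical value, roughly 4.2 at α = .05):

```python
# Subset F test for Newbold 12.41: does number of preschool children matter?
SSE_restricted = 88.2    # model with income (X1) and family size (X2) only
SSE_full = 83.7          # model that also includes preschool children (X3)
n, K, R = 30, 2, 1       # K regressors kept, R regressors tested under H0
F = ((SSE_restricted - SSE_full) / R) / (SSE_full / (n - K - R - 1))
print(round(F, 3))       # 1.398, well below typical F(1, 26) critical values
```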

Prediction (12.6). Given a population regression model y = β0 + β1x1 + … + βKxK + ε, then given a new observation of a data point (x1,n+1, x2,n+1, …, xK,n+1), the best linear unbiased forecast of yn+1 is ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + … + bKxK,n+1. It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we do not have data to support that the linear model extends beyond the observed range.

Predictions from a Multiple Regression Model. Predict sales for a week in which the selling price is $5.50 and advertising is $350: Sales = 306.526 – 24.975(5.50) + 74.131(3.5) = 428.62. Note that Advertising is in $100s, so $350 means that X2 = 3.5. Predicted sales is 428.62 pies.
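The same prediction, computed from the estimated coefficients:

```python
# Predicted pie sales at Price = $5.50 and Advertising = $350 (x2 = 3.5)
b0, b1, b2 = 306.52619, -24.97509, 74.13096
sales = b0 + b1 * 5.50 + b2 * 3.5
print(round(sales, 2))   # 428.62 pies
```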

Transformations for Nonlinear Regression Models 12.7 The relationship between the dependent variable and an independent variable may not be linear Can review the scatter diagram to check for non-linear relationships Example: Quadratic model The second independent variable is the square of the first variable Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Quadratic Model Transformations. Quadratic model form: Y = β0 + β1X + β2X² + ε. Let x1 = X and x2 = X², and specify the model as yi = β0 + β1x1i + β2x2i + εi, where: β0 = Y intercept, β1 = regression coefficient for the linear effect of X on Y, β2 = regression coefficient for the quadratic effect on Y, and εi = random error in Y for observation i.
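A minimal sketch of this transformation, with made-up noise-free data generated from a known quadratic so the fit is exact: adding X² as a second regressor turns the quadratic model into an ordinary multiple regression.

```python
import numpy as np

# Illustrative data from y = 2 + 0.5x + 0.3x^2 (no noise)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = 2.0 + 0.5 * x + 0.3 * x**2

# Design matrix [1, x, x^2]: the squared term is just another column
X = np.column_stack([np.ones_like(x), x, x**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 3))   # recovers [2.  0.5 0.3]
```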

Linear vs. Nonlinear Fit. [Figure: two scatterplots of Y against X with fitted curves and their residual plots.] A linear fit does not give random residuals; a nonlinear fit gives random residuals.

Quadratic Regression Model. Quadratic models may be considered when the scatter diagram takes on one of the following shapes: [figure: four curves of Y against X1, one for each sign combination of β1 (< 0 or > 0) and β2 (> 0 or < 0)], where β1 is the coefficient of the linear term and β2 is the coefficient of the squared term.

Testing for Significance: Quadratic Effect. Compare the linear regression estimate with the quadratic regression estimate. Hypotheses: H0: β2 = 0 (the quadratic term does not improve the model); H1: β2 ≠ 0 (the quadratic term improves the model).

Testing for Significance: Quadratic Effect (continued). Hypotheses: H0: β2 = 0 (the quadratic term does not improve the model); H1: β2 ≠ 0 (the quadratic term improves the model). The test statistic is t = (b2 – β2)/s(b2), where b2 = squared-term slope coefficient, β2 = hypothesized slope (zero), and s(b2) = standard error of the slope.

Testing for Significance: Quadratic Effect (continued) Testing the Quadratic Effect Compare R2 from simple regression to R2 from the quadratic model If R2 from the quadratic model is larger than R2 from the simple model, then the quadratic model is a better model Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Example: performance by age. Performance first increases with age and then falls again at older ages: a quadratic relation.

Example: Quadratic Model. Purity increases as filter time increases (values missing in the source are left blank):

Purity   Filter Time
  3          1
  7          2
  8
 15          5
 22
 33
 40         10
 54         12
 67         13
 70         14
 78
 85
 87         16
 99         17

Example: Quadratic Model (continued). Simple regression results: ŷ = -11.283 + 5.985(Time). Intercept: coefficient -11.28267, standard error 3.46805, t Stat -3.25332, p-value 0.00691. Time: coefficient 5.98520, standard error 0.30966, t Stat 19.32819, p-value 2.078E-10. R Square 0.96888, Adjusted R Square 0.96628, Standard Error 6.15997; F = 373.57904, Significance F = 2.0778E-10. The t statistic, F statistic, and R² are all high, but the residuals are not random.

Example: Quadratic Model (continued). Quadratic regression results: ŷ = 1.539 + 1.565(Time) + 0.245(Time)². Intercept: coefficient 1.53870, standard error 2.24465, t Stat 0.68550, p-value 0.50722. Time: coefficient 1.56496, standard error 0.60179, t Stat 2.60052, p-value 0.02467. Time-squared: coefficient 0.24516, standard error 0.03258, t Stat 7.52406, p-value 1.165E-05. R Square 0.99494, Adjusted R Square 0.99402, Standard Error 2.59513; F = 1080.7330, Significance F = 2.368E-13. The quadratic term is significant and improves the model: R² is higher, se is lower, and the residuals are now random.

Logarithmic Transformations. The exponential (multiplicative) model: original model Y = β0 X1^β1 X2^β2 ε; transformed logarithmic model: log Y = log β0 + β1 log X1 + β2 log X2 + log ε.

Interpretation of coefficients. For the logarithmic model, when both the dependent and independent variables are logged: the estimated coefficient bk of the independent variable Xk can be interpreted as follows: a 1 percent change in Xk leads to an estimated bk percent change in the average value of Y. That is, bk is the elasticity of Y with respect to a change in Xk.

Example: Cobb-Douglas Production Function. The production function gives output as a function of labor and capital: Output = β0 L^β1 K^β2.

Original data and new variables

Regression Output

Parameter values: b1 = 0.71, b2 = 1 – 0.71 = 0.29, ln(b0) = 2.577, so b0 = exp(2.577) ≈ 13.16. The estimated Cobb-Douglas production function is Output = 13.16 · L^0.71 · K^0.29.
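Back-transforming the log-log fit is a one-line computation:

```python
import math

# Back-transform the log-log estimates: ln(b0) = 2.577, b1 = 0.71, b2 = 1 - b1
ln_b0, b1 = 2.577, 0.71
b0 = math.exp(ln_b0)
b2 = 1 - b1
print(round(b0, 2), b1, round(b2, 2))   # 13.16 0.71 0.29
```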

Dummy Variables for Regression Models 12.8 A dummy variable is a categorical independent variable with two levels: yes or no, on or off, male or female recorded as 0 or 1 Regression intercepts are different if the variable is significant Assumes equal slopes for other variables If more than two levels, the number of dummy variables needed is (number of levels - 1) Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Dummy Variable Example Let: y = Pie Sales x1 = Price x2 = Holiday (X2 = 1 if a holiday occurred during the week) (X2 = 0 if there was no holiday that week) Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Dummy Variable Example (continued). [Figure: two parallel lines of pie sales y against price x1, with the same slope but different intercepts: b0 + b2 for holiday weeks (x2 = 1) and b0 for non-holiday weeks (x2 = 0).] If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.

Interpreting the Dummy Variable Coefficient Example: Sales: number of pies sold per week Price: pie price in $ Holiday: 1 If a holiday occurred during the week 0 If no holiday occurred b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price Copyright © 2013 Pearson Education, Inc. Publishing as Prentice Hall

Differences in Slope. An interaction model hypothesizes interaction between pairs of x variables: the response to one x variable may vary at different levels of another x variable. The model contains two-way cross-product terms.

Effect of Interaction. Given Y = β0 + β1X1 + β2X2 + β3X1X2 + ε: without the interaction term, the effect of X1 on Y is measured by β1; with the interaction term, the effect of X1 on Y is measured by β1 + β3X2, so the effect changes as X2 changes.

Interaction Example. Suppose x2 is a dummy variable and the estimated regression equation is ŷ = 1 + 2x1 + 3x2 + 4x1x2. For x2 = 1: ŷ = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1. For x2 = 0: ŷ = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1. The slopes are different if the effect of x1 on y depends on the value of x2.
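The two slopes in this example can be checked directly by differencing the fitted equation:

```python
# Estimated model with interaction: y_hat = 1 + 2*x1 + 3*x2 + 4*x1*x2
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope in x1 is 2 when x2 = 0, and 2 + 4 = 6 when x2 = 1
slope_x2_0 = y_hat(1, 0) - y_hat(0, 0)
slope_x2_1 = y_hat(1, 1) - y_hat(0, 1)
print(slope_x2_0, slope_x2_1)   # 2 6
```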

Significance of Interaction Term. The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1 compared to when x2 = 0. The t statistic for b3 can be used to test the hypothesis H0: β3 = 0 against H1: β3 ≠ 0. If we reject the null hypothesis, we conclude that there is a difference in the slope coefficient for the two subgroups.

Multiple Regression Analysis Application Procedure (12.9). Errors (residuals) from the regression model: ei = (yi – ŷi). Assumptions: the errors are normally distributed; the errors have a constant variance; the model errors are independent.

Analysis of Residuals. These residual plots are used in multiple regression: residuals vs. ŷi; residuals vs. x1i; residuals vs. x2i; residuals vs. time (if time-series data). Use the residual plots to check for violations of the regression assumptions.

Chapter Summary. Developed the multiple regression model. Tested the significance of the multiple regression model. Discussed the adjusted coefficient of determination. Tested individual regression coefficients. Tested portions of the regression model. Used quadratic terms and log transformations in regression models. Explained dummy variables. Evaluated interaction effects. Discussed using residual plots to check model assumptions.

Exercise 12.78. n = 93 students rate, on a scale of 1 (low) to 10 (high), their opinion of residence hall life: levels of satisfaction with roommates, floor, hall, and resident advisor. Y: overall opinion; X1 through X4: the four satisfaction ratings.

Source   df     SS       MSS      F       R-sq
model     4    37.016   9.2540   9.958    0.312
error    88    81.780   0.9293
total    92   118.79

param    estimate     t      std error
inter.     3.950     5.84      0.676
x1         0.106     1.69      0.063
x2         0.122     1.70      0.072
x3         0.092     1.75      0.053
x4         0.169     2.64      0.064

Exercise 12.82: developing countries (48 countries). Y = percentage manufacturing growth; x1 = percentage agricultural growth; x2 = percentage exports growth; x3 = percentage rate of inflation.


Chapter 13 Additional Topics in Regression Analysis (Statistics for Business and Economics, 8th Edition)

13.2 Dummy Variables and Experimental Design
Dummy variables can be used in situations in which the categorical variable of interest has more than two categories
Dummy variables can also be useful in experimental design
Experimental design is used to identify possible causes of variation in the value of the dependent variable
Y outcomes are measured at specific combinations of levels for treatment and blocking variables
The goal is to determine how the different treatments influence the Y outcome

Dummy Variable Models (More than Two Levels)
Consider a categorical variable with K levels
The number of dummy variables needed is one less than the number of levels: K − 1
Example: y = house price; x1 = square feet
If the style of the house is also thought to matter:
Style = ranch, split level, condo
Three levels, so two dummy variables are needed

Dummy Variable Models (More than Two Levels) (continued)
Example: Let "condo" be the default category, and let x2 and x3 be used for the other two categories:
y = house price
x1 = square feet
x2 = 1 if ranch, 0 otherwise
x3 = 1 if split level, 0 otherwise
The multiple regression equation is:
ŷ = b0 + b1x1 + b2x2 + b3x3

Interpreting the Dummy Variable Coefficients (with Three Levels)
Consider the estimated regression equation: ŷ = b0 + b1x1 + 23.53x2 + 18.84x3
For a condo: x2 = x3 = 0, so ŷ = b0 + b1x1
For a ranch: x2 = 1, x3 = 0. With the same square feet, a ranch will have an estimated average price 23.53 thousand dollars higher than a condo
For a split level: x2 = 0, x3 = 1. With the same square feet, a split level will have an estimated average price 18.84 thousand dollars higher than a condo
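This interpretation can be sketched in code. The dummy coefficients 23.53 and 18.84 come from the slide; the intercept and square-foot coefficient here are hypothetical placeholders, since the slide does not state them:

```python
b0, b1 = 20.0, 0.045     # assumed intercept and square-foot coefficient
b2, b3 = 23.53, 18.84    # ranch and split-level premiums over a condo

def price(sqft, style):
    """Estimated price (thousands of dollars) under the dummy coding:
    x2 = 1 for ranch, x3 = 1 for split level, condo is the default."""
    x2 = 1 if style == "ranch" else 0
    x3 = 1 if style == "split" else 0
    return b0 + b1 * sqft + b2 * x2 + b3 * x3

same_size = 2000
premium_ranch = price(same_size, "ranch") - price(same_size, "condo")
premium_split = price(same_size, "split") - price(same_size, "condo")
print(premium_ranch, premium_split)
```

Holding square feet fixed, the style premiums are exactly the dummy coefficients, which is the point of the interpretation above.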

Experimental Design
Consider an experiment in which four treatments will be used, and the outcome also depends on three environmental factors that cannot be controlled by the experimenter
Let variable z1 denote the treatment, where z1 = 1, 2, 3, or 4
Let z2 denote the environmental factor (the "blocking variable"), where z2 = 1, 2, or 3
To model the four treatments, three dummy variables are needed
To model the three environmental factors, two dummy variables are needed

Experimental Design (continued)
Define five dummy variables: x1, x2, x3, x4, and x5
Let treatment level 1 be the default (z1 = 1)
Define x1 = 1 if z1 = 2, x1 = 0 otherwise
Define x2 = 1 if z1 = 3, x2 = 0 otherwise
Define x3 = 1 if z1 = 4, x3 = 0 otherwise
Let environment level 1 be the default (z2 = 1)
Define x4 = 1 if z2 = 2, x4 = 0 otherwise
Define x5 = 1 if z2 = 3, x5 = 0 otherwise

Experimental Design: Dummy Variable Tables
The dummy variable values can be summarized in a table:

Z1   X1  X2  X3
1     0   0   0
2     1   0   0
3     0   1   0
4     0   0   1

Z2   X4  X5
1     0   0
2     1   0
3     0   1

Experimental Design Model
The experimental design model can be estimated using the equation
y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + ε
The estimated value for β2, for example, shows the amount by which the y value for treatment 3 exceeds the value for treatment 1
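The dummy definitions above amount to a simple encoding rule. A minimal sketch mapping each (z1, z2) combination to the five dummies, with level 1 as the default for each factor:

```python
def encode(z1, z2):
    """Map treatment z1 in {1,2,3,4} and block z2 in {1,2,3}
    to the dummies (x1..x5), level 1 being the default."""
    x1, x2, x3 = (1 if z1 == k else 0 for k in (2, 3, 4))
    x4, x5 = (1 if z2 == k else 0 for k in (2, 3))
    return [x1, x2, x3, x4, x5]

print(encode(1, 1))  # [0, 0, 0, 0, 0]  (both defaults)
print(encode(3, 2))  # [0, 1, 0, 1, 0]
```

Each row of the dummy variable tables above is reproduced by this function, so the regression design matrix can be built directly from the raw (z1, z2) codes.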

Exercise 13.3 Write the model specification and define the variables for a multiple regression model to predict the cost per unit produced as a function of factory type (indicated as classic technology, computer-controlled machines, and computer-controlled material handling) and as a function of country (indicated as Colombia, South Africa, and Japan)

Solution