Slides by John Loucks, updated by Spiros Velianitis. © 2008 Thomson South-Western. All Rights Reserved.
Chapter 15: Multiple Regression
- Multiple Regression Model
- Least Squares Method
- Multiple Coefficient of Determination
- Model Assumptions
- Testing for Significance
- Using the Estimated Regression Equation for Estimation and Prediction
- Qualitative Independent Variables
- Residual Analysis
- Logistic Regression
Multiple Regression Model
The equation that describes how the dependent variable y is related to the independent variables x1, x2, ..., xp and an error term is:
y = β0 + β1x1 + β2x2 + ... + βpxp + ε
where β0, β1, β2, ..., βp are the parameters and ε is a random variable called the error term.
Multiple Regression Equation
The equation that describes how the mean value of y is related to x1, x2, ..., xp is:
E(y) = β0 + β1x1 + β2x2 + ... + βpxp
Estimated Multiple Regression Equation
A simple random sample is used to compute sample statistics b0, b1, b2, ..., bp that serve as point estimators of the parameters β0, β1, β2, ..., βp:
ŷ = b0 + b1x1 + b2x2 + ... + bpxp
Estimation Process
Multiple regression model: y = β0 + β1x1 + β2x2 + ... + βpxp + ε
Multiple regression equation: E(y) = β0 + β1x1 + β2x2 + ... + βpxp
The parameters β0, β1, β2, ..., βp are unknown. Sample data (x1, x2, ..., xp, y) are used to compute the sample statistics b0, b1, b2, ..., bp, which provide estimates of β0, β1, β2, ..., βp and yield the estimated multiple regression equation.
Least Squares Method
- Least squares criterion: choose b0, b1, ..., bp to minimize Σ(yi - ŷi)², the sum of squared differences between the observed values yi and the predicted values ŷi.
- Computation of coefficient values: the formulas for the regression coefficients b0, b1, b2, ..., bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations.
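As an illustration of the matrix computation the software performs (a sketch, not part of the original slides; the data values here are invented), the least squares coefficients can be obtained by solving the normal equations b = (X'X)^(-1) X'y:

    import numpy as np

    # Invented example data: 5 observations on two independent variables and y
    x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
    y  = np.array([3.1, 3.9, 6.8, 7.2, 9.5])

    # Design matrix: a column of ones (for b0) followed by x1 and x2
    X = np.column_stack([np.ones_like(x1), x1, x2])

    # Solve the normal equations (X'X) b = X'y for b = [b0, b1, b2]
    b = np.linalg.solve(X.T @ X, X.T @ y)
    print(b)

In practice, np.linalg.lstsq or a regression package is preferred for numerical stability, but the idea is the same.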
Multiple Regression Model
Example: Programmer Salary Survey
A software firm collected data for a sample of 20 computer programmers. A suggestion was made that regression analysis could be used to determine whether salary was related to years of experience and the score on the firm's programmer aptitude test.
The years of experience, score on the aptitude test, and corresponding annual salary ($1000s) for the sample of 20 programmers are shown on the next slide.
Multiple Regression Model

Exper.  Score  Salary    Exper.  Score  Salary
  4      78     24.0       9      88     38.0
  7     100     43.0       2      73     26.6
  1      86     23.7      10      75     36.2
  5      82     34.3       5      81     31.6
  8      86     35.8       6      74     29.0
 10      84     38.0       8      87     34.0
  0      75     22.2       4      79     30.1
  1      80     23.1       6      94     33.9
  6      83     30.0       3      70     28.2
  6      91     33.0       3      89     30.0
Multiple Regression Model
Suppose we believe that salary (y) is related to the years of experience (x1) and the score on the programmer aptitude test (x2) by the following regression model:
y = β0 + β1x1 + β2x2 + ε
where
y = annual salary ($1000s)
x1 = years of experience
x2 = score on programmer aptitude test
Solving for the Estimates of β0, β1, β2
Input data (x1, x2, y):
4, 78, 24
7, 100, 43
...
3, 89, 30
The input data are fed to a computer package for solving multiple regression problems; the least squares output gives b0, b1, b2, R², and so on.
Solving for the Estimates of β0, β1, β2
Excel's regression equation output provides the coefficient estimates (columns F-I are not shown); the resulting equation appears on the next slide.
Estimated Regression Equation
SALARY = 3.174 + 1.404(EXPER) + 0.251(SCORE)
Note: predicted salary will be in thousands of dollars.
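As a sketch of how these estimates could be reproduced outside Excel (an illustration using Python's statsmodels, which the slides themselves do not use), fitting the 20 programmer observations listed earlier should give coefficients close to the equation above:

    import numpy as np
    import statsmodels.api as sm

    exper  = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
    score  = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91, 88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
    salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
              38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

    X = sm.add_constant(np.column_stack([exper, score]))  # intercept, EXPER, SCORE
    fit = sm.OLS(salary, X).fit()
    print(fit.params)      # expect roughly [3.174, 1.404, 0.251]
    print(fit.rsquared)    # expect roughly 0.834

    # Point estimate for a programmer with 4 years of experience and a test score of 78
    print(fit.predict([[1, 4, 78]]))   # roughly 3.174 + 1.404*4 + 0.251*78 = 28.4, i.e. about $28,400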
Interpreting the Coefficients
In multiple regression analysis, we interpret each regression coefficient as follows: bi represents an estimate of the change in y corresponding to a one-unit increase in xi when all other independent variables are held constant.
Interpreting the Coefficients
b1 = 1.404: Salary is expected to increase by $1,404 for each additional year of experience (when the score on the programmer aptitude test is held constant).
Interpreting the Coefficients
b2 = 0.251: Salary is expected to increase by $251 for each additional point scored on the programmer aptitude test (when years of experience is held constant).
Multiple Coefficient of Determination
Relationship among SST, SSR, and SSE:
SST = SSR + SSE
Σ(yi - ȳ)² = Σ(ŷi - ȳ)² + Σ(yi - ŷi)²
where
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Multiple Coefficient of Determination
Excel's ANOVA output reports the sums of squares; for this example, SSR = 500.3285 and SST = 599.7855 (used on the next slide).
Multiple Coefficient of Determination
R² = SSR/SST = 500.3285/599.7855 = .83418
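As a quick arithmetic check of the SST = SSR + SSE relationship (a derived figure, not shown on the slide): SSE = SST - SSR = 599.7855 - 500.3285 = 99.457. An R² of .83418 says that about 83.4% of the variability in salary is explained by years of experience and aptitude test score together.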
Adjusted Multiple Coefficient of Determination
The adjusted multiple coefficient of determination compensates for the number of independent variables in the model:
Ra² = 1 - (1 - R²)(n - 1)/(n - p - 1)
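Applied to the programmer salary example (a worked figure derived from the slide's R² with n = 20 and p = 2, not shown on the slide): Ra² = 1 - (1 - .83418)(20 - 1)/(20 - 2 - 1) = 1 - (.16582)(19/17) ≈ .8147.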
Assumptions About the Error Term ε
- The error ε is a random variable with mean of zero.
- The variance of ε, denoted by σ², is the same for all values of the independent variables.
- The values of ε are independent.
- The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + ... + βpxp.
Testing for Significance
- In simple linear regression, the F and t tests provide the same conclusion.
- In multiple regression, the F and t tests have different purposes.
Testing for Significance: F Test
- The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables.
- The F test is referred to as the test for overall significance.
Testing for Significance: t Test
- If the F test shows overall significance, the t test is used to determine whether each of the individual independent variables is significant.
- A separate t test is conducted for each of the independent variables in the model.
- We refer to each of these t tests as a test for individual significance.
Testing for Significance: F Test
Hypotheses:
H0: β1 = β2 = ... = βp = 0
Ha: One or more of the parameters is not equal to zero.
Test statistic:
F = MSR/MSE
Rejection rule:
Reject H0 if the p-value < α or if F ≥ Fα, where Fα is based on an F distribution with p d.f. in the numerator and n - p - 1 d.f. in the denominator.
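Worked example (figures derived from the sums of squares shown earlier, with p = 2 and n = 20): MSR = SSR/p = 500.3285/2 = 250.16, MSE = SSE/(n - p - 1) = 99.457/17 ≈ 5.85, so F = MSR/MSE ≈ 250.16/5.85 ≈ 42.8. The corresponding p-value is essentially zero, so H0 is rejected and the overall relationship is significant.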
Testing for Significance: t Test
Hypotheses:
H0: βi = 0
Ha: βi ≠ 0
Test statistic:
t = bi/sbi (the coefficient divided by its estimated standard error)
Rejection rule:
Reject H0 if the p-value < α or if t ≤ -tα/2 or t ≥ tα/2, where tα/2 is based on a t distribution with n - p - 1 degrees of freedom.
Testing for Significance: Multicollinearity
- The term multicollinearity refers to the correlation among the independent variables.
- When the independent variables are highly correlated (say, |r| > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.
Testing for Significance: Multicollinearity
- If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem.
- Every attempt should be made to avoid including independent variables that are highly correlated.
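One way to screen for the problem, sketched here in Python (an illustration, not part of the slides; it reuses the exper and score lists from the earlier fitting sketch), is to look at the pairwise correlations and variance inflation factors (VIF) of the independent variables:

    import numpy as np
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # exper and score: the two independent variables from the earlier fitting sketch
    X = np.column_stack([exper, score]).astype(float)

    print(np.corrcoef(X, rowvar=False))          # flag pairs with |r| > .7

    X1 = np.column_stack([np.ones(len(X)), X])   # intercept column needed for VIF
    for i in (1, 2):
        print(variance_inflation_factor(X1, i))  # large VIFs (a common rule of thumb is > 10) also signal trouble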
Using the Estimated Regression Equation for Estimation and Prediction
- The procedures for estimating the mean value of y and predicting an individual value of y in multiple regression are similar to those in simple regression.
- We substitute the given values of x1, x2, ..., xp into the estimated regression equation and use the corresponding value of ŷ as the point estimate.
Using the Estimated Regression Equation for Estimation and Prediction
- The formulas required to develop interval estimates for the mean value of y and for an individual value of y are beyond the scope of the textbook.
- Software packages for multiple regression will often provide these interval estimates.
Qualitative Independent Variables
- In many situations we must work with qualitative independent variables such as gender (male, female), method of payment (cash, check, credit card), etc.
- For example, x2 might represent gender, where x2 = 0 indicates male and x2 = 1 indicates female.
- In this case, x2 is called a dummy or indicator variable.
Qualitative Independent Variables
Example: Programmer Salary Survey
As an extension of the problem involving the computer programmer salary survey, suppose that management also believes that annual salary is related to whether the individual has a graduate degree in computer science or information systems.
The years of experience, the score on the programmer aptitude test, whether the individual has a relevant graduate degree, and the annual salary ($1000s) for each of the 20 sampled programmers are shown on the next slide.
Qualitative Independent Variables
The data now also record, for each programmer, a Yes/No indicator of whether he or she holds a relevant graduate degree, alongside the experience, test score, and salary values:

Exper.  Score  Salary    Exper.  Score  Salary
  4      78     24.0       9      88     38.0
  7     100     43.0       2      73     26.6
  1      86     23.7      10      75     36.2
  5      82     34.3       5      81     31.6
  8      86     35.8       6      74     29.0
 10      84     38.0       8      87     34.0
  0      75     22.2       4      79     30.1
  1      80     23.1       6      94     33.9
  6      83     30.0       3      70     28.2
  6      91     33.0       3      89     30.0
Estimated Regression Equation
ŷ = b0 + b1x1 + b2x2 + b3x3
where
ŷ = annual salary ($1000s)
x1 = years of experience
x2 = score on programmer aptitude test
x3 = 0 if the individual does not have a graduate degree, 1 if the individual does have a graduate degree
x3 is a dummy variable.
Qualitative Independent Variables
Excel's regression equation output (columns F-I are not shown) indicates that the coefficient on the graduate-degree dummy variable is not significant.
More Complex Qualitative Variables
- If a qualitative variable has k levels, k - 1 dummy variables are required, with each dummy variable coded as 0 or 1.
- For example, a variable with levels A, B, and C could be represented by x1 and x2 values of (0, 0) for A, (1, 0) for B, and (0, 1) for C.
- Care must be taken in defining and interpreting the dummy variables.
More Complex Qualitative Variables
For example, a variable indicating level of education could be represented by x1 and x2 values as follows:

Highest Degree   x1   x2
Bachelor's        0    0
Master's          1    0
Ph.D.             0    1
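A sketch of how this k - 1 coding might be generated with pandas (an illustration; the column name and data values are made up, not taken from the slides):

    import pandas as pd

    df = pd.DataFrame({"degree": ["Bachelor's", "Master's", "Ph.D.", "Master's", "Bachelor's"]})

    # Three levels -> two dummy variables; Bachelor's (the dropped first level) becomes the (0, 0) baseline
    dummies = pd.get_dummies(df["degree"], prefix="deg", drop_first=True).astype(int)
    print(dummies)   # columns deg_Master's and deg_Ph.D.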
Residual Analysis
- For simple linear regression, the residual plot against ŷ and the residual plot against x provide the same information.
- In multiple regression analysis it is preferable to use the residual plot against ŷ to determine whether the model assumptions are satisfied.
Standardized Residual Plot Against ŷ
- Standardized residuals are frequently used in residual plots for purposes of:
  - identifying outliers (typically, standardized residuals less than -2 or greater than +2)
  - providing insight about the assumption that the error term has a normal distribution.
- The computation of the standardized residuals in multiple regression analysis is too complex to be done by hand; Excel's Regression tool can be used.
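For readers working outside Excel, a sketch of the same computation in Python (an illustration, not part of the slides; it reuses the exper, score, and salary lists from the earlier fitting sketch):

    import numpy as np
    import statsmodels.api as sm
    import matplotlib.pyplot as plt

    # exper, score, salary: the programmer data from the earlier fitting sketch
    X = sm.add_constant(np.column_stack([exper, score]))
    fit = sm.OLS(salary, X).fit()

    std_resid = fit.get_influence().resid_studentized_internal  # standardized residuals
    y_hat = fit.fittedvalues                                     # predicted values

    plt.scatter(y_hat, std_resid)
    plt.axhline(2, linestyle="--")
    plt.axhline(-2, linestyle="--")        # points beyond +/-2 are potential outliers
    plt.xlabel("Predicted salary")
    plt.ylabel("Standardized residual")
    plt.show()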
Standardized Residual Plot Against ŷ
Excel value worksheet listing the predicted values and standardized residuals (rows 37-51 are not shown).
Standardized Residual Plot Against ŷ
Excel's standardized residual plot: one observation stands out as an outlier.
Logistic Regression
- Logistic regression can be used to model situations in which the dependent variable, y, may only assume two discrete values, such as 0 and 1.
- In many ways logistic regression is like ordinary regression: it requires a dependent variable, y, and one or more independent variables.
- The ordinary multiple regression model is not applicable.
Logistic Regression
Logistic Regression Equation
The relationship between E(y) and x1, x2, ..., xp is better described by the following nonlinear equation:
E(y) = e^(β0 + β1x1 + β2x2 + ... + βpxp) / (1 + e^(β0 + β1x1 + β2x2 + ... + βpxp))
Logistic Regression
Interpretation of E(y) as a Probability in Logistic Regression
If the two values of y are coded as 0 or 1, the value of E(y) provides the probability that y = 1 given a particular set of values for x1, x2, ..., xp.
Logistic Regression
Estimated Logistic Regression Equation
A simple random sample is used to compute sample statistics b0, b1, b2, ..., bp that are used as the point estimators of the parameters β0, β1, β2, ..., βp:
ŷ = estimate of P(y = 1 | x1, x2, ..., xp) = e^(b0 + b1x1 + b2x2 + ... + bpxp) / (1 + e^(b0 + b1x1 + b2x2 + ... + bpxp))
Logistic Regression
Example: Simmons Stores
Simmons' catalogs are expensive, and Simmons would like to send them only to those customers who have the highest probability of making a $200 purchase using the discount coupon included in the catalog.
Simmons' management thinks that annual spending at Simmons Stores and whether a customer has a Simmons credit card are two variables that might be helpful in predicting whether a customer who receives the catalog will use the coupon to make a $200 purchase.
Logistic Regression
Example: Simmons Stores
Simmons conducted a study by sending out 100 catalogs, 50 to customers who have a Simmons credit card and 50 to customers who do not. At the end of the test period, Simmons noted for each of the 100 customers: (1) the amount the customer spent last year at Simmons, (2) whether the customer had a Simmons credit card, and (3) whether the customer made a $200 purchase. A portion of the test data is shown on the next slide.
Logistic Regression
Simmons Test Data (partial)

Customer   Annual Spending ($1000), x1   Simmons Credit Card, x2   $200 Purchase, y
    1              2.291                          1                       0
    2              3.215                          1                       0
    3              2.135                          1                       0
    4              3.924                          0                       0
    5              2.528                          1                       0
    6              2.473                          0                       1
    7              2.384                          0                       0
    8              7.076                          0                       0
    9              1.182                          1                       1
   10              3.345                          0                       0
Logistic Regression
Simmons Logistic Regression Table (using Minitab)

Predictor   Coef      SE Coef    Z       p       Odds Ratio   95% CI Lower   95% CI Upper
Constant   -2.1464    0.5772    -3.72    0.000
Spending    0.3416    0.1287     2.66    0.008      1.41           1.09           1.81
Card        1.0987    0.4447     2.47    0.013      3.00           1.25           7.17

Log-Likelihood = -60.487
Test that all slopes are zero: G = 13.628, DF = 2, P-Value = 0.001
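For comparison with the Minitab output, a sketch of the same kind of fit in Python (an illustration only; just the ten rows shown on the previous slide are listed here, so a real run would need the full 100-customer data set to reproduce the coefficients in the table):

    import numpy as np
    import statsmodels.api as sm

    # Placeholder data: the 10 customers shown on the previous slide (the study used 100)
    spending = [2.291, 3.215, 2.135, 3.924, 2.528, 2.473, 2.384, 7.076, 1.182, 3.345]
    card     = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
    purchase = [0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

    X = sm.add_constant(np.column_stack([spending, card]))
    logit_fit = sm.Logit(purchase, X).fit()
    print(logit_fit.summary())              # with the full data: coefficients near -2.146, 0.342, 1.099
    print(np.exp(logit_fit.params[1:]))     # odds ratios for Spending and Card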
Logistic Regression
Simmons Estimated Logistic Regression Equation
ŷ = e^(-2.1464 + 0.3416x1 + 1.0987x2) / (1 + e^(-2.1464 + 0.3416x1 + 1.0987x2))
where x1 = annual spending ($1000s) and x2 = 1 if the customer has a Simmons credit card, 0 if not.
Logistic Regression
Using the Estimated Logistic Regression Equation
- For customers who spend $2000 annually and do not have a Simmons credit card (x1 = 2, x2 = 0), the estimated probability of making a $200 purchase is about 0.1880.
- For customers who spend $2000 annually and do have a Simmons credit card (x1 = 2, x2 = 1), the estimated probability is about 0.4099.
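The arithmetic behind these values (derived from the coefficients in the Minitab table): with x1 = 2 and x2 = 0, e^(-2.1464 + 0.3416(2)) = e^(-1.4632) ≈ 0.2315, so ŷ = 0.2315/1.2315 ≈ 0.1880; with x1 = 2 and x2 = 1, e^(-1.4632 + 1.0987) = e^(-0.3645) ≈ 0.6946, so ŷ = 0.6946/1.6946 ≈ 0.4099.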
Logistic Regression
Testing for Significance
Hypotheses:
H0: β1 = β2 = 0
Ha: One or both of the parameters is not equal to zero.
Test statistic:
z = bi/sbi (each coefficient divided by its estimated standard error)
Rejection rule:
Reject H0 if the p-value < α.
Logistic Regression
Testing for Significance: Conclusions
- For independent variable x1: z = 2.66 and the p-value = .008, so we reject H0: β1 = 0. In other words, x1 is statistically significant.
- For independent variable x2: z = 2.47 and the p-value = .013, so we reject H0: β2 = 0. In other words, x2 is also statistically significant.
Logistic Regression
With logistic regression it is difficult to interpret the relationship between the variables directly because the equation is not linear, so we use the concept called the odds ratio.
Odds in Favor of an Event Occurring: the odds in favor of an event occurring is the probability the event will occur divided by the probability the event will not occur,
odds = P(y = 1 | x1, x2, ..., xp) / [1 - P(y = 1 | x1, x2, ..., xp)]
Odds Ratio: the odds ratio measures the impact on the odds of a one-unit increase in one of the independent variables,
Odds Ratio = odds1/odds0
where odds1 is the odds that y = 1 after the one-unit increase and odds0 is the odds that y = 1 without it.
Logistic Regression
Estimated Probabilities

                          Annual Spending
Credit Card   $1000    $2000    $3000    $4000    $5000    $6000    $7000
   Yes        0.3305   0.4099   0.4943   0.5790   0.6593   0.7314   0.7931
   No         0.1413   0.1880   0.2457   0.3143   0.3921   0.4758   0.5609

(The $2000 values were computed on the earlier slide.)
Logistic Regression
Comparing Odds
Suppose we want to compare the odds of making a $200 purchase for customers who spend $2000 annually and have a Simmons credit card to the odds of making a $200 purchase for customers who spend $2000 annually and do not have a Simmons credit card.
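Using the estimated probabilities from the previous slides (a worked comparison with derived figures): the odds for a $2000 customer with the card are 0.4099/0.5901 ≈ 0.695, the odds for a $2000 customer without the card are 0.1880/0.8120 ≈ 0.232, so the estimated odds ratio is 0.695/0.232 ≈ 3.00, the same value reported for Card in the Minitab output (and equal to e^1.0987).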
57
57 Slide © 2008 Thomson South-Western. All Rights Reserved Chapter 15 Multiple Regression n Multiple Regression Model n Least Squares Method n Multiple Coefficient of Determination n Model Assumptions n Testing for Significance n Using the Estimated Regression Equation for Estimation and Prediction for Estimation and Prediction n Qualitative Independent Variables n Residual Analysis n Logistic Regression