Correlation and Regression Analysis (Lecture 11)


1 Correlation and Regression Analysis (Lecture 11)

2 Concepts About Relationships
- Presence
- Nature
- Direction
- Strength of Association

3 Relationship Presence
Relationship presence assesses whether a systematic relationship exists between two or more variables. If we find statistical significance between the variables, we say a relationship is present.

4 Nature of Relationships
Relationships between variables typically are described as either linear or nonlinear.
- Linear relationship = a "straight-line association" between two or more variables.
- Nonlinear relationship = often referred to as curvilinear; it is best described by a curve instead of a straight line.

5 Direction of Relationship
The direction of a relationship can be either positive or negative.
- Positive relationship = when one variable increases (e.g., loyalty to employer), so does a related one (e.g., effort put forth for the employer).
- Negative relationship = when one variable increases (e.g., satisfaction with the job), a related one decreases (e.g., likelihood of searching for another job).

6 Strength of Association
When a consistent and systematic relationship is present, the researcher must determine the strength of association. The strength ranges from very strong to slight.

7 Covariation
Covariation exists when one variable consistently and systematically changes relative to another variable. The correlation coefficient is used to assess this linkage.
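Covariation is what the sample covariance measures. A minimal pure-Python sketch with small illustrative numbers (not data from the lecture); the variable names echo the loyalty/effort example from the previous slide:

```python
def covariance(x, y):
    """Sample covariance with the n - 1 denominator."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    return sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / (n - 1)

# Illustrative data: effort tends to rise with loyalty, so the
# covariance comes out positive (here, 2.0).
loyalty = [1, 2, 3, 4, 5]
effort = [2, 4, 5, 4, 6]
print(covariance(loyalty, effort))
```

A positive result says the two variables rise and fall together; a negative result says one rises as the other falls.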

8 Values of the Correlation Coefficient
- Positive correlation (up to +1.0) = when the value of X increases, the value of Y also increases; when the value of X decreases, the value of Y also decreases.
- Zero correlation (0.0) = the value of Y does not increase or decrease with the value of X.
- Negative correlation (down to –1.0) = when the value of X increases, the value of Y decreases; when the value of X decreases, the value of Y increases.

9 Exhibit 11-1 Rules of Thumb about Correlation Coefficient Size

Coefficient Range      Strength of Association
+/– .91 to +/– 1.00    Very Strong
+/– .71 to +/– .90     High
+/– .41 to +/– .70     Moderate
+/– .21 to +/– .40     Small
+/– .01 to +/– .20     Slight
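These cutoffs are easy to encode. A small helper (the function name is ours, not part of the text) that maps a coefficient to the Exhibit 11-1 labels:

```python
def strength_of_association(r):
    """Map |r| to the Exhibit 11-1 rule-of-thumb labels."""
    size = abs(r)
    if size > 1.0:
        raise ValueError("a correlation coefficient must lie in [-1, 1]")
    if size >= 0.91:
        return "Very Strong"
    if size >= 0.71:
        return "High"
    if size >= 0.41:
        return "Moderate"
    if size >= 0.21:
        return "Small"
    if size >= 0.01:
        return "Slight"
    return "None"

print(strength_of_association(-0.585))  # Moderate
```

Note the sign is ignored: –.585 and +.585 represent the same strength, only opposite directions.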

10 Pearson Correlation
The Pearson correlation coefficient measures the linear association between two metric variables. It ranges from –1.00 to +1.00, with zero representing no association at all. The larger the absolute value of the coefficient, the stronger the linkage; the smaller, the weaker the relationship.

11 Coefficient of Determination
The coefficient of determination is the square of the correlation coefficient, or r². It ranges from 0.00 to 1.00 and is the proportion of variation in one variable explained by one or more other variables.
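Both quantities can be computed directly from the defining formulas. A self-contained sketch in plain Python, using illustrative data rather than the survey:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two metric variables."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 6]
r = pearson_r(x, y)
# r measures the linear association; r ** 2 is the coefficient of
# determination, i.e. the share of variance in y explained by x.
print(round(r, 3), round(r ** 2, 3))  # 0.853 0.727
```

So for this toy data about 73% of the variation in y is accounted for by x.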

12 Exhibit 11-3 Bivariate Correlation Between Work Group Cooperation and Intention to Search for Another Job

Descriptive Statistics
Variables                      Mean   Std. Deviation   N
X4 – Work Group Cooperation    3.89   1.345            63
X16 – Intention to Search      4.27   1.807            63

13 Exhibit 11-3 (continued)

Correlations
                                  X4 – Work Group   X16 – Intention
                                  Cooperation       to Search
X4 – Work Group Cooperation
  Pearson Correlation             1.00              -.585*
  Sig. (2-tailed)                 .                 .000
  N                               63                63
X16 – Intention to Search
  Pearson Correlation             -.585*            1.00
  Sig. (2-tailed)                 .000              .
  N                               63                63

* Coefficient is significant at the 0.01 level (2-tailed).
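The significance reported for r = –.585 can be sanity-checked by hand: for a correlation, t = r · sqrt((n – 2)/(1 – r²)). A quick check with the exhibit's values (the critical value quoted for df = 61 at the .01 level is approximate):

```python
import math

def t_for_correlation(r, n):
    """t statistic for testing whether a correlation differs from zero."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# r = -.585 with n = 63 respondents, as in Exhibit 11-3.
t = t_for_correlation(-0.585, 63)
print(round(t, 2))  # -5.63, far beyond the roughly 2.66 needed at .01 (df = 61)
```

A |t| this large is why SPSS prints the two-tailed significance as .000.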

14 Exhibit 11-5 Bar Charts for Rankings for Food Quality and Atmosphere

15 Exhibit 11-4 Correlation of Food Quality and Atmosphere Using Spearman's rho

                                  X13 – Food        X14 – Atmosphere
Spearman's rho                    Quality Ranking   Ranking
X13 – Food Quality Ranking
  Correlation Coefficient         1.000             -.801*
  Sig. (2-tailed)                 .                 .000
  N                               200               200
X14 – Atmosphere Ranking
  Correlation Coefficient         -.801*            1.000
  Sig. (2-tailed)                 .000              .
  N                               200               200

* Coefficient is significant at the 0.01 level (2-tailed).
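Spearman's rho is simply the Pearson correlation computed on ranks, with ties given the average of their rank positions. A pure-Python sketch with made-up rankings (not the survey data):

```python
import math

def ranks(values):
    """1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    return pearson_r(ranks(x), ranks(y))

# Illustrative rankings (1 = most important): these two are perfectly
# inverse, so rho comes out -1.0 -- an extreme version of the negative
# food-quality/atmosphere relationship in the exhibit.
food = [1, 2, 1, 3, 2, 4]
atmos = [4, 3, 4, 2, 3, 1]
print(round(spearman_rho(food, atmos), 3))  # -1.0
```

Because it works on ranks, rho is the appropriate coefficient for ordinal (ranking) data like these.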

16 Exhibit 11-6 Customer Rankings of Restaurant Selection Factors

Statistics
             X13 – Food        X14 – Atmosphere   X15 – Prices   X16 – Employees
             Quality Ranking   Ranking            Ranking        Ranking
N  Valid     200               200                200            200
   Missing   0                 0                  0              0
Median       4.00              3.00               2.00           1.00
Minimum      2                 2                  1              1
Maximum      4                 4                  3              4

17 Exhibit 11-7 Classification of Statistical Techniques

18 Exhibit 11-8 Definitions of Statistical Techniques

- ANOVA (analysis of variance) is used to examine statistical differences between the means of two or more groups. The dependent variable is metric and the independent variable(s) are nonmetric. One-way ANOVA has a single nonmetric independent variable; two-way ANOVA can have two or more nonmetric independent variables.
- Bivariate regression has a single metric dependent variable and a single metric independent variable.
- Cluster analysis enables researchers to place objects (e.g., customers, brands, products) into groups so that objects within a group are similar to each other, while objects in any particular group differ from objects in all other groups.
- Correlation examines the association between two metric variables. The strength of the association is measured by the correlation coefficient.
- Conjoint analysis enables researchers to determine the preferences individuals have for various products and services, and which product features are valued the most.

19 Exhibit 11-8 Definitions of Statistical Techniques (continued)

- Discriminant analysis enables the researcher to predict group membership using two or more metric independent variables. The group membership variable is a nonmetric dependent variable.
- Factor analysis is used to summarize the information from a large number of variables into a much smaller number of variables or factors. This technique is used to combine variables, whereas cluster analysis is used to identify groups with similar characteristics.
- Logistic regression is a special type of regression that can have a nonmetric/categorical dependent variable.
- Multiple regression has a single metric dependent variable and several metric independent variables.
- MANOVA is similar to ANOVA, but it can examine group differences across two or more metric dependent variables at the same time.
- Perceptual mapping uses information from other statistical techniques to map customer perceptions of products, brands, companies, and so forth.

20 Exhibit 11-10 Bivariate Regression of Satisfaction and Food Quality

Descriptive Statistics
X25 – Competitor   Variables            Mean
Samouel's          X17 – Satisfaction   4.78
                   X1 – Food Quality    5.24
Gino's             X17 – Satisfaction   5.96
                   X1 – Food Quality    5.81

21 Exhibit 11-10 (continued)

Model Summary
X25 – Competitor   Model   R       R Square
Samouel's          1       .513*   .263
Gino's             1       .331*   .110

* Predictors: (Constant), X1 – Excellent Food Quality

22 Exhibit 11-11 Other Aspects of Bivariate Regression

ANOVA
X25 – Competitor   Model        Sum of Squares   Mean Square   F        Sig.
Samouel's   1      Regression    35.001                        34.945   .000*
                   Residual      98.159          1.002
                   Total        133.160
Gino's      1      Regression    10.310                        12.095   .001*
                   Residual      83.530           .852
                   Total         93.840

* Predictors: (Constant), X1 – Excellent Food Quality
Dependent Variable: X17 – Satisfaction

23 Exhibit 11-11 (continued)

Coefficients*
                                        Unstandardized    Standardized
X25 – Competitor   Model                B      Std. Error  Beta    t       Sig.
Samouel's   1      (Constant)           2.376  .419                5.671   .000
                   X1 – Food Quality     .459  .078        .513    5.911   .000
Gino's      1      (Constant)           4.307  .484                8.897   .000
                   X1 – Food Quality     .284  .082        .331    3.478   .001

* Dependent Variable: X17 – Satisfaction

24 Calculating the "Explained" and "Unexplained" Variance in Regression
The unexplained variance in regression, referred to as residual variance, is calculated by dividing the residual sum of squares by the total sum of squares. For example, in Exhibit 11-11, divide the residual sum of squares for Samouel's of 98.159 by 133.160 and you get .737. This tells us that a lot of the variance (73.7%) in the dependent variable is not explained by this regression equation. The explained variance in regression, referred to as r², is calculated by dividing the regression sum of squares by the total sum of squares. For example, in Exhibit 11-11, divide the regression sum of squares for Samouel's of 35.001 by 133.160 and you get .263.
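The two shares are one-line divisions. Using the sums of squares quoted above for Samouel's:

```python
# Sums of squares from Exhibit 11-11 (Samouel's bivariate model).
ss_regression = 35.001
ss_residual = 98.159
ss_total = 133.160

r_squared = ss_regression / ss_total      # explained variance
residual_share = ss_residual / ss_total   # unexplained (residual) variance

print(round(r_squared, 3), round(residual_share, 3))  # 0.263 0.737
```

The two shares always sum to 1, since the regression and residual sums of squares partition the total.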

25 How to Calculate the t-Value
The t-value is calculated by dividing the regression coefficient by its standard error. In the Coefficients table of Exhibit 11-11, if you divide the unstandardized coefficient for Samouel's of .459 by the standard error of .078, the result is a t-value of 5.8846. Note that the number in the table for the t-value is 5.911. The difference between the calculated 5.8846 and the reported 5.911 arises because the computer prints rounded-off values for the unstandardized coefficient and the standard error, while the t-value itself is calculated and reported without rounding.
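The arithmetic in one line, using the rounded values printed in the table:

```python
# Values from Exhibit 11-11, Samouel's model.
b = 0.459    # unstandardized coefficient
se = 0.078   # its standard error

t = b / se
print(round(t, 4))  # 5.8846 -- SPSS reports 5.911 because it divides unrounded values
```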

26 How to Interpret the Regression Coefficient
The regression coefficient of .459 for Samouel's X1 – Food Quality reported in Exhibit 11-11 is interpreted as follows: "... for every unit that X1 increases, X17 will increase by .459 units." Recall that in this example X1 is the independent (predictor) variable and X17 is the dependent variable.
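The slope and intercept behind such an interpretation come from ordinary least squares. A from-scratch sketch with illustrative data (not the restaurant survey):

```python
def ols_bivariate(x, y):
    """Least-squares intercept b0 and slope b1 for y = b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    b0 = my - b1 * mx
    return b0, b1

# Toy data: the slope b1 is the change in y per one-unit change in x,
# which is exactly how the .459 coefficient is read in the exhibit.
x = [1, 2, 3, 4, 5]
y = [2.5, 3.0, 3.4, 4.1, 4.5]
b0, b1 = ols_bivariate(x, y)
print(round(b0, 3), round(b1, 3))  # 1.97 0.51
```

Here every extra unit of x adds about .51 units to the predicted y, just as every extra unit of food quality adds .459 units of satisfaction in the exhibit.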

27 Exhibit 11-13 Multiple Regression of Return in Future and Food Independent Variables

Descriptive Statistics
X25 – Competitor   Variables                         Mean
Samouel's          X18 – Return in Future            4.37
                   X1 – Excellent Food Quality       5.24
                   X4 – Excellent Food Taste         5.16
                   X9 – Wide Variety of Menu Items   5.45
Gino's             X18 – Return in Future            5.55
                   X1 – Excellent Food Quality       5.81
                   X4 – Excellent Food Taste         5.73
                   X9 – Wide Variety of Menu Items   5.56

28 Exhibit 11-13 Multiple Regression of Return in Future and Food Independent Variables (continued)

Model Summary
X25 – Competitor   Model   R       R Square   Adjusted R Square
Samouel's          1       .512*   .262       .239
Gino's             1       .482*   .232       .208

* Predictors: (Constant), X9 – Wide Variety of Menu Items, X1 – Excellent Food Quality, X4 – Excellent Food Taste
Dependent Variable: X18 – Return in Future
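Adjusted R² corrects R² for the number of predictors: adj R² = 1 − (1 − R²)(n − 1)/(n − k − 1). Assuming n = 100, which is what the residual degrees of freedom in Exhibit 11-14 imply (79.155 / .825 ≈ 96 = n − k − 1 with k = 3), the table's value is reproduced:

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 for n cases and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Samouel's model: R^2 = .262, k = 3 predictors, assumed n = 100.
print(round(adjusted_r2(0.262, 100, 3), 3))  # 0.239, matching the table
```

The adjustment always pulls R² down, and by more when predictors are many relative to the sample size.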

29 Exhibit 11-14 Other Information for Multiple Regression Models

ANOVA
X25 – Competitor   Model        Sum of Squares   Mean Square   F        Sig.
Samouel's   1      Regression    28.155          9.385         11.382   .000*
                   Residual      79.155           .825
                   Total        107.310
Gino's      1      Regression    22.019          7.340          9.688   .000*
                   Residual      72.731           .758
                   Total         94.750

* Predictors: (Constant), X9 – Wide Variety of Menu Items, X1 – Excellent Food Quality, X4 – Excellent Food Taste
Dependent Variable: X18 – Return in Future

30 Exhibit 11-14 Other Information for Multiple Regression Models (continued)

Coefficients*
                                              Unstandardized     Standardized
X25 – Competitor   Model                      B      Std. Error   Beta    t       Sig.
Samouel's   1      (Constant)                 2.206  .443                 4.985   .000
                   X1 – Exc. Food Quality      .260  .116          .324   2.236   .028
                   X4 – Exc. Food Taste        .242  .137          .291   1.770   .080
                   X9 – Wide Variety          -.082  .123         -.094   -.668   .506
Gino's      1      (Constant)                 2.877  .507                 5.680   .000
                   X1 – Exc. Food Quality      .272  .119          .316   2.295   .024
                   X4 – Exc. Food Taste        .241  .132          .264   1.823   .071
                   X9 – Wide Variety          -.053  .125         -.065   -.421   .675

* Dependent Variable: X18 – Return in Future

31 Exhibit 11-17 Summary Statistics for Employee Regression Model

Model Summary
Model   R      R Square   Adjusted R Square
1       .506   .256       .218

ANOVA
Model          Sum of Squares   Mean Square   F       Sig.
1 Regression    17.041          5.680         6.762   .001
  Residual      49.563           .840
  Total         66.603

* Predictors: (Constant), X12 – Benefits Reasonable, X9 – Pay Reflects Effort, X1 – Paid Fairly
Dependent Variable: X14 – Effort

32 Exhibit 11-18 Coefficients for Employee Regression Model

Coefficients*
                                Unstandardized     Standardized                  Collinearity Statistics
Model                           B       Std. Error Beta    t       Sig.   Tolerance   VIF
1  (Constant)                   3.089   .680               4.541   .000
   X1 – Paid Fairly              .178   .281        .157    .633   .529   .204        4.894
   X9 – Pay Reflects Effort      .553   .157        .516   3.521   .001   .588        1.701
   X12 – Benefits Reasonable    -.256   .300       -.203   -.855   .396   .224        4.456

* Dependent Variable: X14 – Effort
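Tolerance and VIF are reciprocals of one another: VIF_j = 1 / (1 − R_j²) = 1 / tolerance_j, where R_j² is the R² from regressing predictor j on the other predictors. A quick check against the table (the small discrepancies versus the printed 4.894 and 4.456 come from the rounding of the tolerances shown):

```python
def vif(tolerance):
    """Variance inflation factor from tolerance; high VIF flags collinearity."""
    return 1.0 / tolerance

# Tolerances as printed in Exhibit 11-18.
for name, tol in [("Paid Fairly", 0.204),
                  ("Pay Reflects Effort", 0.588),
                  ("Benefits Reasonable", 0.224)]:
    print(name, round(vif(tol), 3))
```

The low tolerances (high VIFs) for the pay-fairness and benefits variables foreshadow the collinearity problem made explicit in the correlation matrix of Exhibit 11-19.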

33 Exhibit 11-19 Bivariate Correlations of Effort and Compensation Variables

Pearson Correlations
                            X14 –    X1 – Paid   X9 – Pay          X12 – Reasonable
                            Effort   Fairly      Reflects Effort   Benefits
X14 – Effort                1.000    .309        .496              .241
X1 – Paid Fairly            .309     1.000       .639              .880
X9 – Pay Reflects Effort    .496     .639        1.000             .592
X12 – Reasonable Benefits   .241     .880        .592              1.000

34 Exhibit 11-19 (continued)

Statistical Significance of Pearson Correlations (1-tailed)
                            X14 –    X1 – Paid   X9 – Pay          X12 – Reasonable
                            Effort   Fairly      Reflects Effort   Benefits
X14 – Effort                .        .007        .000              .028
X1 – Paid Fairly            .007     .           .000              .000
X9 – Pay Reflects Effort    .000     .000        .
X12 – Reasonable Benefits   .028     .000                          .

35 Exhibit 11-20 Stepwise Regression Based on Samouel's Customer Survey

Model Summary
Model   R      R Square   Adjusted R Square   Std. Error of the Estimate
1       .513   .263       .255                1.00
2       .597   .356       .343                 .94

* Predictors: (Constant), X1 – Excellent Food Quality, X6 – Friendly Employees
Dependent Variable: X17 – Satisfaction

36 Exhibit 11-20 Stepwise Regression Based on Samouel's Customer Survey (continued)

ANOVA
Model           Sum of Squares   Mean Square   F        Sig.
1  Regression    35.001                        34.945   .000*
   Residual      98.159          1.002
   Total        133.160
2  Regression    47.421          23.711        26.825   .000*
   Residual      85.739
   Total        133.160

* Predictors: (Constant), X1 – Excellent Food Quality, X6 – Friendly Employees
Dependent Variable: X17 – Satisfaction
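The F statistic is the ratio of the regression and residual mean squares. For the step-2 model, assuming the residual degrees of freedom are 97 (n = 100 cases minus two predictors minus one):

```python
# Step-2 stepwise model, Exhibit 11-20.
ms_regression = 23.711
ss_residual = 85.739
df_residual = 97                 # assumed: 100 - 2 - 1

ms_residual = ss_residual / df_residual
f_stat = ms_regression / ms_residual
print(round(f_stat, 1))  # 26.8, matching the reported 26.825
```

That the reported F of 26.825 is recovered exactly confirms the assumed degrees of freedom.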

37 Exhibit 11-21 Means and Correlations for Selected Variables from Samouel's Customer Survey

Descriptive Statistics
Variables                         Mean
X17 – Satisfaction                4.78
X1 – Excellent Food Quality       5.24
X4 – Excellent Food Taste         5.16
X9 – Wide Variety of Menu Items   5.45
X6 – Friendly Employees           2.89
X11 – Courteous Employees         1.96
X12 – Competent Employees         1.62

38 Exhibit 11-20 Independent Variables in Stepwise Regression Model

Variables Entered/Removed
Model   Variables Entered             Variables Removed   Method
1       X1 – Excellent Food Quality   .                   Stepwise (Criteria: Probability-of-F-to-enter = .100)
2       X6 – Friendly Employees       .                   Stepwise

* Predictors: (Constant), X1 – Excellent Food Quality, X6 – Friendly Employees
Dependent Variable: X17 – Satisfaction

39 Exhibit 11-23 Coefficients for Stepwise Regression Model

Coefficients*
                                  Unstandardized     Standardized                 Collinearity Statistics
Model                             B       Std. Error Beta    t      Sig.   Tolerance   VIF
1  (Constant)                     2.376   .419              5.67    .000
   X1 – Excellent Food Quality     .459   .078        .513  5.91    .000   1.00        1.00
2  (Constant)                     1.716   .431              3.98    .000
   X1 – Excellent Food Quality     .402   .074        .449  5.39    .000    .958       1.044
   X6 – Friendly Employees         .332   .088        .312  3.75    .000    .958       1.044

* Dependent Variable: X17 – Satisfaction

40 THANK YOU

