Determining and Interpreting Associations Among Variables
Associative Analyses
Associative analyses determine whether stable relationships exist between two variables. Examples:
–What methods of doing business are associated with level of customer satisfaction?
–What demographic variables are associated with repeat buying of Brand A?
–Is type of sales training associated with sales performance of sales representatives?
–Are purchase intention scores for a new product associated with actual sales of the product?
Relationships Between Two Variables
Relationship: a consistent, systematic linkage between the levels or labels of two variables.
–"Levels" refers to the characteristics of description for interval or ratio scales (e.g., the level of temperature).
–"Labels" refers to the characteristics of description for nominal or ordinal scales (e.g., buyers vs. non-buyers).
As we shall see, this concept is important in understanding the type of relationship.
Relationships Between Two Variables
Nonmonotonic: two variables are associated, but only in a very general sense. We do not know the "direction" of the relationship, but we do know that the presence (or absence) of one variable is associated with the presence (or absence) of another.
–Example: when breakfast is ordered, orders for coffee tend to be present; when lunch is ordered, orders for coffee tend to be absent.
Nonmonotonic Relationship
Relationships Between Two Variables
Monotonic: the general direction of a relationship between two variables is known.
–Increasing
–Decreasing
Example: shoe store managers know that there is an association between the age of a child and shoe size. The older the child, the larger the shoe size. The direction is increasing, though we know only the general direction, not the exact amount of change.
Monotonic Increasing Relationship
Relationships Between Two Variables
Linear: a "straight-line" association between two variables. Here, knowledge of one variable yields knowledge of the other variable.
–Example: "100 customers produce $500 in revenue at Jack-in-the-Box" (p. 525).
Relationships Between Two Variables
Curvilinear: some smooth curve pattern describes the association.
–Example: research shows that job satisfaction is high when one first starts to work for a company, goes down after a few years, and then comes back up after workers have been with the same company for many years. This is a U-shaped relationship.
Characterizing Relationships Between Variables
1. Presence: whether any systematic relationship exists between the two variables of interest
2. Direction: whether the relationship is positive or negative
3. Strength of association: how strong the relationship is (strong? moderate? weak?)
Assess relationships in the order shown above.
Cross-Tabulations
Cross-tabulation: consists of rows and columns defined by the categories classifying each variable; used for nonmonotonic relationships.
A cross-tabulation table contains four types of numbers in each cell:
–Frequency
–Raw percentage
–Column percentage
–Row percentage
Cross-Tabulations
Using SPSS, commands are ANALYZE, DESCRIPTIVE STATISTICS, CROSSTABS. You will find a detailed discussion of cross-tabulation tables in your text.
Cross-Tabulations
When we have two nominal-scaled variables and want to know whether they are associated, we use cross-tabulations to examine the relationship and the chi-square test to test for the presence of a systematic relationship. In this situation (two variables, both with nominal scales), we are testing for a nonmonotonic relationship. A sketch of the cross-tabulation step is shown below.
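A minimal sketch of building a cross-tabulation in Python with pandas (the slides use SPSS; the data and column names here are hypothetical, chosen to mirror the studying/test example that follows):

```python
import pandas as pd

# Hypothetical survey responses: two nominal variables.
data = pd.DataFrame({
    "studied": ["yes", "yes", "no", "no", "yes", "no", "yes", "no"],
    "result":  ["pass", "pass", "fail", "fail", "pass", "pass", "fail", "fail"],
})

# Frequencies in each cell, with row and column totals (margins).
freq = pd.crosstab(data["studied"], data["result"], margins=True)

# Row percentages: each cell as a share of its row total.
row_pct = pd.crosstab(data["studied"], data["result"], normalize="index") * 100

print(freq)
print(row_pct.round(1))
```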
Chi-Square Analysis
Chi-square (χ²) analysis: the examination of frequencies for two nominal-scaled variables in a cross-tabulation table to determine whether the variables have a significant relationship. The null hypothesis is that the two variables are not related. The analysis compares observed and expected frequencies (defined on a later slide).
Cross-Tabulations
Example: suppose we want to know whether there is a relationship between studying and test performance, and both of these variables are measured using nominal scales.
Interpreting a Significant Cross-Tabulation Finding
If the chi-square analysis determines that you have a significant relationship (no support for the null hypothesis), you may use the following to determine the nature of the relationship:
–the column percentages table, or
–the row percentages table
Cross-Tabulations
Did you study for the midterm test? __yes __no
How did you perform on the midterm test? __pass __fail
Now, let's look at the data in a cross-tabulation table:
Cross-Tabulations
Do you "see" a relationship? Do you "see" the presence of studying paired with the presence of passing? Do you "see" the absence of passing paired with the absence of studying? Congratulations! You have just "seen" a nonmonotonic relationship.
Cross-Tabulations
Bar charts can be used to "see" nonmonotonic relationships; a plotting sketch is shown below.
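A sketch of a clustered bar chart for "seeing" a nonmonotonic relationship, using matplotlib in place of SPSS charts; the cell counts are hypothetical:

```python
import matplotlib.pyplot as plt
import numpy as np

groups = ["Studied", "Did not study"]
passed = [46, 9]    # hypothetical cell counts
failed = [4, 21]    # hypothetical cell counts

x = np.arange(len(groups))
width = 0.35

# Side-by-side bars: pass/fail counts for each study group.
plt.bar(x - width / 2, passed, width, label="Pass")
plt.bar(x + width / 2, failed, width, label="Fail")
plt.xticks(x, groups)
plt.ylabel("Number of students")
plt.legend()
plt.show()
```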
Cross-Tabulations
But while we can "see" this association, how do we know that a systematic association is present? In other words, is this association statistically significant? Would it likely appear again and again if we sampled other students? We use the chi-square test to tell us whether nonmonotonic relationships are really present.
Cross-Tabulations
Using SPSS, commands are ANALYZE, DESCRIPTIVE STATISTICS, CROSSTABS, and within the CROSSTABS dialog box, STATISTICS, CHI-SQUARE.
Chi-Square Analysis
Chi-square analysis assesses nonmonotonic associations in cross-tabulation tables and is based upon differences between observed and expected frequencies.
–Observed frequencies: the counts for each cell found in the sample
–Expected frequencies: the counts calculated under the null hypothesis of "no association" between the two variables under examination
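Under the null hypothesis of no association, each expected frequency is computed from the table margins. In standard notation:

$$E_{ij} = \frac{(\text{row } i \text{ total}) \times (\text{column } j \text{ total})}{n}$$

where $n$ is the total sample size.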
Chi-Square Analysis
Computed chi-square values:
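The computed value is the standard chi-square statistic, summed over every cell of the cross-tabulation table:

$$\chi^2 = \sum_{i=1}^{r}\sum_{j=1}^{c} \frac{(O_{ij} - E_{ij})^2}{E_{ij}}, \qquad df = (r-1)(c-1)$$

where $O_{ij}$ and $E_{ij}$ are the observed and expected frequencies, and $r$ and $c$ are the numbers of rows and columns.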
Chi-Square Analysis
The chi-square distribution's shape changes depending on the number of degrees of freedom. The computed chi-square value is compared to a table value to determine statistical significance.
Chi-Square Analysis
How do I interpret a chi-square result?
–The chi-square analysis yields the probability that the researcher would find evidence in support of the null hypothesis if he or she repeated the study many, many times with independent samples.
–If the p value is less than or equal to 0.05, there is little support for the null hypothesis (no association). Therefore, we have a significant association: the PRESENCE of a systematic relationship between the two variables.
Chi-Square Analysis
Read the p value (Asymp. Sig.) across from Pearson Chi-Square in the SPSS output. Since the p value is <0.05, we have a SIGNIFICANT association. A sketch of the same test in code follows.
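A sketch of the same test in Python with scipy (the counts below are hypothetical; SPSS's CROSSTABS with the CHI-SQUARE statistic reports the equivalent values):

```python
from scipy.stats import chi2_contingency

observed = [[46, 4],   # studied:     pass, fail (hypothetical counts)
            [9, 21]]   # not studied: pass, fail (hypothetical counts)

chi2, p_value, dof, expected = chi2_contingency(observed)

print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
if p_value <= 0.05:
    print("Little support for the null: the variables are associated.")
else:
    print("No evidence of a systematic association.")
```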
Chi-Square Analysis
How do I interpret a chi-square result?
–A significant chi-square result means the researcher should look at the cross-tabulation row and column percentages to "see" the association pattern.
–SPSS will calculate row, column, or both percentages for you; see the CELLS box at the bottom of the CROSSTABS dialog box.
Chi-Square Analysis
Look at the ROW %'s: 92% of those who studied passed; almost 70% of those who didn't study failed. "See" the relationship!
Presence, Direction, and Strength
Presence? Yes, our chi-square was significant. This means the pattern we observe between studying/not studying and passing/failing is a systematic relationship that would likely appear again if we ran our study many, many times.
Direction? Nonmonotonic relationships do not have direction, only presence and absence.
Presence, Direction, and Strength
Strength? Since the chi-square test tells us only presence, you must judge the strength by looking at the pattern. Don't you think there is a "strong" relationship between studying/not studying and passing/failing?
When can you use Crosstabs and the Chi-Square test?
–When you want to know if there is an association between two variables, and
–both of those variables have nominal (or ordinal) scales.
Correlation Coefficients and Covariation
The correlation coefficient is an index number constrained to fall within the range of −1.0 to +1.0. The correlation coefficient communicates both the strength and the direction of the linear relationship between two metric variables.
Correlation Coefficients and Covariation
The amount of linear relationship between two variables is communicated by the absolute size of the correlation coefficient. The direction of the association is communicated by the sign (+, −) of the correlation coefficient.
Covariation: the amount of change in one variable systematically associated with a change in another variable.
Measuring the Association Between Interval- or Ratio-Scaled Variables
In this case, we are trying to assess the presence, direction, and strength of a monotonic relationship. We are aided in doing this by the Pearson product moment correlation.
Using SPSS, commands are ANALYZE, CORRELATE, BIVARIATE.
Correlation Coefficients and Covariation
Covariation can be examined with the use of a scatter diagram; a plotting sketch is shown below.
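A sketch of a scatter diagram for inspecting covariation between two metric variables, using matplotlib and simulated data (the variable names and values are hypothetical):

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
ad_spend = rng.uniform(10, 100, 50)              # hypothetical metric variable
sales = 5 * ad_spend + rng.normal(0, 40, 50)     # roughly linear pattern with noise

plt.scatter(ad_spend, sales)
plt.xlabel("Advertising spend")
plt.ylabel("Sales")
plt.title("Scatter diagram: points clustering near a line suggest covariation")
plt.show()
```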
Pearson Product Moment Correlation Coefficient (r)
Presence? Determine if there is a significant association. The p value should be examined FIRST! If it is significant, there is a significant association. If not, there is no association.
Direction? Look at the coefficient. Is it positive or negative?
Pearson Product Moment Correlation Coefficient (r)
Strength? The correlation coefficient (r) is a number ranging from −1.0 to +1.0; the closer it is to 1.00 (+ or −), the stronger the association. There are "rules of thumb"…
Rules of Thumb Determining Strength of Association
A correlation coefficient's size indicates the strength of the association between two variables. The sign (+ or −) indicates the direction of the association.
Pearson Product Moment Correlation Coefficient (r)
Pearson product moment correlation: measures the degree of linear association between two variables. A computational sketch is shown below.
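A sketch of computing Pearson's r and its p value in Python with scipy (the data are simulated and hypothetical; SPSS's ANALYZE, CORRELATE, BIVARIATE reports the same two quantities):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
training_hours = rng.uniform(5, 40, 30)                  # hypothetical metric variable
sales = 2.0 * training_hours + rng.normal(0, 10, 30)     # positive linear pattern with noise

r, p_value = pearsonr(training_hours, sales)

# Check presence first (p value), then direction (sign) and strength (size of r).
print(f"r = {r:.2f}, p = {p_value:.4f}")
```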
Pearson Product Moment Correlation Coefficient (r)
Special considerations in linear procedures:
–Correlation takes into account only the relationship between two variables, not interaction with other variables.
–Correlation does not demonstrate cause and effect.
–Correlations will not detect non-linear relationships between variables.
When there is NO association, the p value for the Pearson r will be >0.05.
When there IS an association, the p value for the Pearson r will be ≤0.05. Examples: a negative association between sales force rewards and turnover; a positive association between length of sales force training and sales.
Example
What items are associated with preference for a waterfront view among restaurant patrons?
–Are preferences for unusual entrées, simple décor, and unusual desserts associated with preference for a waterfront view while dining?
–Since all of these variables are interval-scaled, we can run a Pearson correlation to determine the association of each variable with the preference for a waterfront view. A sketch of this calculation is shown below.
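A sketch of correlating several interval-scaled preference ratings with preference for a waterfront view, using pandas (the column names and the 1–7 rating data are hypothetical stand-ins for the survey items on the slide):

```python
import pandas as pd
import numpy as np

rng = np.random.default_rng(3)
n = 100
ratings = pd.DataFrame({
    "waterfront_view":  rng.integers(1, 8, n),   # hypothetical 1-7 preference scale
    "unusual_entrees":  rng.integers(1, 8, n),
    "simple_decor":     rng.integers(1, 8, n),
    "unusual_desserts": rng.integers(1, 8, n),
})

# Pearson correlation of each item with waterfront-view preference.
print(ratings.corr(method="pearson")["waterfront_view"].round(2))
```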
Using SPSS, commands are ANALYZE, CORRELATE, BIVARIATE.
The output shows the presence, direction, and strength of the association. Do you see any managerial significance to these associations?
Concluding Remarks on Associative Analyses
Researchers will always test the null hypothesis of NO relationship or no correlation. When the null hypothesis is rejected, the researcher may have a managerially important relationship to share with the manager.