
1 FACTOR ANALYSIS

2  The basic objective of Factor Analysis is data reduction or structure detection.
 The purpose of data reduction is to remove redundant (highly correlated) variables from the data file, perhaps replacing the entire data file with a smaller number of uncorrelated variables.
 The purpose of structure detection is to examine the underlying (or latent) relationships between the variables.

3 FACTOR ANALYSIS cont.  It represents a complex array of structure-analyzing procedures used to identify interrelationships among a large set of observed variables and then, through data reduction, to group these variables into a smaller set of dimensions or factors that have common characteristics.

4 FACTOR ANALYSIS  In short, it is a technique that analyses data on a relatively large set of variables and produces a smaller set of factors that represent the original variables as linear combinations of those factors, so that the set of factors captures as much information as possible from the original data.

5 Uses of Factor Analysis
 Scale construction: Factor analysis can be used to develop concise multiple-item scales for measuring various constructs.
 Establishing antecedents: The method reduces multiple input variables into grouped factors, so the independent variables can be grouped into broad factors.
 Psychographic profiling: Different independent variables are grouped to measure independent factors, which are then used to identify personality types.
 Segmentation analysis: Factor analysis can also be used for segmentation. For example, there could be different segments of two-wheeler customers who own two-wheelers because of the different importance they give to factors such as prestige, economy and functional features.

6 Uses of Factor Analysis
 Marketing studies: The technique has extensive use in the field of marketing and can be successfully applied to new product development, product acceptance research, development of advertising copy, pricing studies and branding studies. For example, we can use it to:
identify the attributes of brands that influence consumers’ choice;
get an insight into the media habits of various consumers;
identify the characteristics of price-sensitive customers.

7 Requirements for Factor Analysis The following conditions must be ensured before executing the technique:
 Factor analysis requires metric data, i.e. data measured on an interval or ratio scale.
 The variables for factor analysis are generally identified through exploratory research, which may be conducted by reviewing the literature on the subject, informal interviews with knowledgeable persons, qualitative methods such as focus group discussions, analysis of case studies, and the judgment of the researcher.
 If the responses to different statements are obtained on different scales, all the responses need to be standardized (a minimal sketch follows this slide).
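The standardization step above is simply a conversion to z-scores. Here is a minimal Python/NumPy sketch, not part of the original slides; the name `data` is assumed to hold an n x p array of metric responses (rows = respondents, columns = statements).

```python
import numpy as np

def standardize(data):
    """Convert each column (variable) to z-scores: mean 0, standard deviation 1."""
    data = np.asarray(data, dtype=float)
    return (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)
```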

8 Requirements for Factor Analysis
 The number of sample respondents should be at least four to five times the number of variables (number of statements).
 The basic principle behind the application of factor analysis is that the initial set of variables should be highly correlated. If the correlation coefficients between all the variables are small, factor analysis may not be an appropriate technique.

9 Bartlett's test of sphericity  It tests the hypothesis that the correlation matrix is an identity matrix, which would indicate that the variables are unrelated and therefore unsuitable for structure detection.
 H0: the correlation matrix is an identity matrix, i.e. the diagonal elements are one and the off-diagonal elements are zero.
H1: the correlation matrix is not an identity matrix, i.e. the variables are significantly correlated.
H0 must be rejected for the data to be suitable for Factor Analysis (a computational sketch follows this slide).
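As an illustration, Bartlett's chi-square statistic can be computed directly from the determinant of the correlation matrix. This is a minimal sketch using NumPy and SciPy under the assumption that `data` is an n x p array of metric responses; the function name is ours, not from the slides. A small p-value means H0 is rejected, which is what we want before proceeding.

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test of sphericity: H0 says the correlation matrix is an identity matrix."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)                    # p x p correlation matrix
    chi_square = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    dof = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi_square, dof)               # small p-value: reject H0
    return chi_square, p_value
```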

10 Kaiser-Meyer-Olkin (KMO) Test  The KMO is a statistic that indicates the proportion of variance in your variables that might be caused by underlying factors. High values (close to 1.0) generally indicate that a factor analysis may be useful with your data. If the value is less than 0.50, the results of the factor analysis probably won't be very useful.
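A rough sketch of the overall KMO measure: it is the ratio of the summed squared simple correlations to the summed squared simple plus partial (anti-image) correlations. This again assumes an n x p data array and is illustrative only, not the exact SPSS procedure.

```python
import numpy as np

def kmo_overall(data):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(data, rowvar=False)
    S = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(S), np.diag(S)))
    Q = -S / d                                   # partial (anti-image) correlations
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(R, 0.0)
    r2 = np.sum(R ** 2)                          # squared simple correlations (off-diagonal)
    q2 = np.sum(Q ** 2)                          # squared partial correlations (off-diagonal)
    return r2 / (r2 + q2)                        # values near 1.0 favour factor analysis
```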

11 FACTOR ANALYSIS... (assumptions)
 Variables are correlated and some have a high degree of co-variance.
 The co-variation among variables is due to some underlying factors, which are smaller in number than the observed variables.
 Observed variables can be expressed as linear combinations of the underlying factors.
 Reducing the data would not significantly affect the conclusions of the research.

12 TYPES OF FACTOR ANALYSIS  There are basically two types of Factor Analysis:
EXPLORATORY FACTOR ANALYSIS
CONFIRMATORY FACTOR ANALYSIS

13 EXPLORATORY FACTOR ANALYSIS… (EFA)  Researchers use EFA when they do not know how many factors are necessary to explain the interrelationships among a set of characteristics, indicators, or items.  It means that EFA is basically used to explore the underlying dimensions of the construct of interest.

14 CONFIRMATORY FACTOR ANALYSIS … (CFA)  Researchers use CFA when they want to assess the extent to which the hypothesized organization of a set of identified factors fits the data.  It means that CFA is used when the researcher has some knowledge about the underlying structure of the construct under investigation.

15 FACTOR ANALYSIS MODEL CONTINUED...

16 FACTOR  A factor is an unobservable / hypothetical underlying variable or construct. It is a linear combination of related observed variables that represents a specific underlying dimension of a construct, which is as distinct as possible from the other factors included in the solution.

17 Key terms in Factor Analysis  FACTOR LOADING: It is a coefficient of a factor in the factor model equation. If variables are standardized, then factor loadings are correlation coefficients between a factor and a variable.
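For reference, the factor model equation mentioned above is conventionally written as follows (standard notation, not reproduced from the slides): each standardized variable is a linear combination of the common factors plus a unique term.

```latex
X_i = a_{i1} F_1 + a_{i2} F_2 + \dots + a_{im} F_m + e_i
```

Here X_i is the i-th standardized variable, a_{ij} is the loading of variable i on factor j, F_1, ..., F_m are the common factors and e_i is the unique (specific) factor. With standardized variables, a_{ij} is the correlation between X_i and F_j.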

18 FACTOR LOADING  The correlation between a factor and the original variables is represented by the factor loading. Factor loadings appear in the “Component Matrix” table of the SPSS output.  The squared factor loading is the proportion of variance in the variable explained by that factor.

19 COMMUNALITY (h²)  It indicates how much of each variable is accounted for by the underlying factors taken together. In other words, it is a measure of the percentage of a variable's variation that is explained by the factors.  It is defined for each variable and is equal to the sum of the squared factor loadings of all factors for that variable.  It represents the variation due to the common factors, and (1 - h²) is called the specific (or unique) variation of the variable not explained by the factors.

20 Initial communalities  Initial communalities are estimates of the variance in each variable accounted for by all components or factors. For principal components extraction, this is always equal to 1.0 for correlation analyses.

21 Extraction communalities  Extraction communalities are estimates of the variance in each variable accounted for by the components. High communalities indicate that the extracted components represent the variables well. If any communalities are very low in a principal components extraction, you may need to extract another component.

22 Communality example  Extracted communality for variable 5 = (loading of var5 on Factor 1)² + (loading of var5 on Factor 2)² + ... + (loading of var5 on Factor N)²  where N is the total number of factors extracted (retained).  If we do the same computation over all the components (not just the retained ones), we get the initial communality for that variable, which is 1 in principal components extraction for correlation analyses.
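The same computation in Python, as a small sketch: `loadings` is a hypothetical variables-by-factors matrix of retained factor loadings, and the values below are made up for illustration only.

```python
import numpy as np

loadings = np.array([          # rows: variables, columns: retained factors
    [0.78, 0.21],
    [0.65, 0.30],
    [0.10, 0.84],
])

communalities = np.sum(loadings ** 2, axis=1)   # extracted h² for each variable
unique_variance = 1 - communalities             # specific variation, 1 - h²
```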

23 EIGENVALUES  The percentage of variance explained by each factor can be computed from its eigenvalue. The eigenvalue of a factor is obtained by taking the sum of the squared factor loadings of all variables on that factor.

24 EIGENVALUES
 An eigenvalue is calculated for each factor.
 It reflects the variance in all the variables that is accounted for by that factor.
 A factor's eigenvalue may be computed as the sum of its squared factor loadings over all the variables.
 Eigenvalues are contained in the “Total Variance Explained” table of the SPSS output.

25 EIGENVALUES  Eigenvalue of Factor 1 = (loading of var1 on Factor 1)² + (loading of var2 on Factor 1)² + ... + (loading of varN on Factor 1)²  where N is the total number of variables present.
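Using the same hypothetical `loadings` matrix as in the communality sketch, the eigenvalue of each factor is the column sum of its squared loadings; dividing by the number of variables gives the proportion of total variance explained (for standardized variables).

```python
import numpy as np

loadings = np.array([          # same illustrative loadings as above
    [0.78, 0.21],
    [0.65, 0.30],
    [0.10, 0.84],
])

eigenvalues = np.sum(loadings ** 2, axis=0)            # one eigenvalue per factor
pct_variance = 100 * eigenvalues / loadings.shape[0]   # % of total variance explained
```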

26 Extraction methods
 Principal Component Analysis: It begins by finding a linear combination of the variables (a component) that accounts for as much of the variation in the original variables as possible.
 It then finds another component that accounts for as much of the remaining variation as possible and is uncorrelated with the previous component, continuing in this way until there are as many components as original variables.
 Usually, a few components will account for most of the variation, and these components can be used to replace the original variables. This method is most often used to reduce the number of variables in the data file.

27 Principal Component Analysis
 The principal component methodology involves searching for the values of the weights w_i such that the first factor explains the largest portion of the total variance. This is called the first principal factor.
 This explained variance is then subtracted from the original input matrix to yield a residual matrix.
 A second principal factor is extracted from the residual matrix in such a way that it accounts for most of the residual variance.
 One point to keep in mind is that the second principal factor has to be statistically independent of the first. The same principle is repeated until there is little variance left to be explained (a computational sketch follows this slide).
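A rough NumPy sketch of principal component extraction from the correlation matrix (an eigendecomposition; the function name and arguments are illustrative, not from the slides). Component loadings are the eigenvectors scaled by the square roots of their eigenvalues, so the successive components are uncorrelated and ordered by variance explained.

```python
import numpy as np

def pca_loadings(data, n_components):
    """Extract principal-component loadings from the correlation matrix of `data` (n x p)."""
    R = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)                 # eigh: R is symmetric
    order = np.argsort(eigvals)[::-1]                    # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs[:, :n_components] * np.sqrt(eigvals[:n_components])
    return eigvals, loadings                             # loadings: variables x components
```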

28 Factor Rotation  Factor analysis can generate several solutions for any data set. Each solution is termed a particular factor rotation and is generated by a particular factor rotation scheme.  Broadly, there are two types of rotation: Orthogonal and Oblique.  Orthogonal rotation methods assume that the factors in the analysis are uncorrelated, whereas Oblique methods allow the factors to be correlated.

29 Orthogonal METHODS OF ROTATION:
Varimax. An orthogonal rotation method that minimizes the number of variables that have high loadings on each factor. It simplifies the interpretation of the factors (a rotation sketch follows this slide).
Quartimax. A rotation method that minimizes the number of factors needed to explain each variable. It simplifies the interpretation of the observed variables.
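As an illustration of orthogonal rotation, here is a common SVD-based implementation of the Varimax criterion in NumPy. It is a sketch of the general algorithm under our own naming, not the exact SPSS procedure; setting gamma toward 0 moves the criterion toward Quartimax.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a (variables x factors) loading matrix; gamma=1 gives Varimax, gamma=0 Quartimax."""
    p, k = loadings.shape
    R = np.eye(k)                                        # accumulated rotation matrix
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag(np.sum(L ** 2, axis=0)))
        )
        R = u @ vt
        d_new = np.sum(s)
        if d_new < d * (1 + tol):                        # stop when the criterion converges
            break
        d = d_new
    return loadings @ R                                  # rotated loading matrix
```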

30 Orthogonal METHODS OF ROTATION  Equamax. A rotation method that is a combination of the varimax method, which simplifies the factors, and the quartimax method, which simplifies the variables. The number of variables that load highly on a factor and the number of factors needed to explain a variable are minimized.

31 Other METHODS OF ROTATION: Orthomax and Oblimin
Orthomax. Specifies families of orthogonal rotations. Gamma specifies the member of the family to use; varying Gamma shifts the maximization of the variances of the loadings from columns (Varimax) to rows (Quartimax).
Oblimin. Specifies families of oblique (non-orthogonal) rotations. Gamma specifies the member of the family to use. Specify a Gamma of 0 for moderate correlations, positive values to allow higher correlations, and negative values to restrict correlations.

32 FACTOR ANALYSIS… How Many Factors?
 Rule of Thumb: All included factors (prior to rotation) must explain at least as much variance as an "average variable".
 Eigenvalue Criterion: The eigenvalue represents the amount of variance in the original variables associated with a factor; the sum of the squared factor loadings of all variables on a factor gives that factor's eigenvalue. Only factors with eigenvalues greater than 1.0 are retained (a sketch of this rule follows this slide).
 Predefined Number of Factors.
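A short sketch of the eigenvalue-greater-than-1 retention rule: with standardized variables each variable contributes one unit of variance, so a retained factor should explain more variance than an "average variable". The function name is illustrative and `data` is again assumed to be an n x p array.

```python
import numpy as np

def n_factors_kaiser(data):
    """Count factors whose correlation-matrix eigenvalues exceed 1.0 (Kaiser criterion)."""
    R = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(R)
    return int(np.sum(eigvals > 1.0))
```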

33 Example

34 FACTOR LOADING: CUSTOMERS' PERCEPTION OF A MANUFACTURER

35 Example

