1
FACTOR ANALYSIS LECTURE 11 EPSY 625
2
PURPOSES
SUPPORT VALIDITY OF TEST SCALE WITH RESPECT TO UNDERLYING TRAITS (FACTORS)
EFA - EXPLORE/UNDERSTAND UNDERLYING FACTORS FOR A TEST
CFA - CONFIRM THEORETICAL STRUCTURE IN A TEST
3
HISTORICAL DEVELOPMENT
PEARSON (1901) - eigenvalue/eigenvector problem (dimensional reduction), the "method of principal axes"
SPEARMAN (1904) - "General Intelligence, Objectively Determined and Measured"
Others: Burt, Thomson, Garnett, Holzinger, Harman, Thurstone
4
FACTOR MODELS
                            VARIABLES
                            Fixed                                   Sample
SUBJECTS   Fixed            Principal components, common factors    Image
           Sample           Alpha factor analysis                   Canonical factor analysis
5
EXPLORATORY FACTOR ANALYSIS
USE PRINCIPAL AXIS METHOD: ASSUMES THERE ARE 3 VARIANCE COMPONENTS IN EACH ITEM:
COMMUNALITY (h²)
UNIQUENESS: SPECIFICITY (s²)
ERROR (e²)
6
A SINGLE FACTOR REQUIRES AT LEAST 3 ITEMS OR MEASUREMENTS TO BE UNIQUELY DETERMINED
7
[Path diagram: a single FACTOR with loadings .7, .8, and .6 to ITEM1, ITEM2, and ITEM3; each item also has an error path e (.714, .6, .8 respectively) and a SPECIFICITY term. The loading is CALLED the FACTOR LOADING: the CORRELATION BETWEEN ITEM AND FACTOR. Specificity is ASSUMED = 0 FOR PARALLEL ITEMS.]
8
[Same path diagram: FACTOR with loadings .7, .8, .6 to ITEM1-ITEM3; error paths .714, .6, .8, where .714 = √(1 − .7²); SPECIFICITY ASSUMED = 0 FOR PARALLEL ITEMS.]
ALPHA = SPEARMAN-BROWN STEPPED-UP AVERAGE INTER-ITEM CORRELATION:
AVERAGE INTER-ITEM CORRELATION = (.56 + .42 + .48)/3 = .49
ALPHA = 3(.49)/[1 + 2(.49)] = .74
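An illustrative sketch (an addition, not part of the original slides): the short Python/NumPy function below reproduces the stepped-up alpha from the three loadings in the diagram. The loadings .7, .8, .6 are the slide's; the function and variable names are made up for this example.

```python
import numpy as np

def alpha_from_loadings(loadings):
    """Spearman-Brown stepped-up alpha from single-factor loadings.

    Under a one-factor model the correlation between two items is the
    product of their loadings; alpha is the k-item Spearman-Brown
    step-up of the average inter-item correlation.
    """
    loadings = np.asarray(loadings, dtype=float)
    k = len(loadings)
    r = np.outer(loadings, loadings)             # implied inter-item correlations
    pairs = r[np.triu_indices(k, k=1)]           # .56, .42, .48 for the slide's example
    r_bar = pairs.mean()                         # average inter-item correlation (about .49)
    return k * r_bar / (1 + (k - 1) * r_bar)     # Spearman-Brown step-up

print(round(alpha_from_loadings([0.7, 0.8, 0.6]), 2))  # about 0.74, matching the slide
```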
9
TWO FACTORS NEED AT LEAST 2 ITEMS OR MEASUREMENTS PER FACTOR, ASSUMING FACTORS ARE CORRELATED
10
[Path diagram: FACTOR 1 with loadings .7 and .8 to ITEM1 and ITEM2; FACTOR 2 with loadings .6 and .7 to ITEM3 and ITEM4; each item has an error term e; CORRELATION BETWEEN FACTORS = .5.]
11
[Same path diagram: FACTOR 1 → ITEM1 (.7), ITEM2 (.8); FACTOR 2 → ITEM3 (.6), ITEM4 (.7); CORRELATION BETWEEN FACTORS = .5.]
CORRELATION BETWEEN ANY TWO ITEMS = PRODUCT OF ALL PATHS BETWEEN THEM;
EX. r(ITEM1, ITEM4) = .7 × .5 × .7 = .245
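A minimal sketch of the tracing rule (my addition, not from the slides): the implied correlation matrix is AΦA′ with 1s restored on the diagonal. The loadings and the .5 factor correlation come from the diagram; all names are illustrative.

```python
import numpy as np

# Pattern matrix A: rows = items, columns = factors (loadings from the slide)
A = np.array([[0.7, 0.0],
              [0.8, 0.0],
              [0.0, 0.6],
              [0.0, 0.7]])
Phi = np.array([[1.0, 0.5],       # factor correlation matrix (r = .5)
                [0.5, 1.0]])

R_implied = A @ Phi @ A.T         # common-factor part of each correlation
np.fill_diagonal(R_implied, 1.0)  # unit diagonal (uniqueness absorbs the rest)

print(round(R_implied[0, 3], 3))  # ITEM1-ITEM4: .7 * .5 * .7 = 0.245
```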
12
SIMPLE STRUCTURE
TRY TO CREATE A SCALE IN WHICH EACH ITEM CORRELATES WITH ONLY ONE FACTOR:

            FACTOR
ITEM        1    2    3
ITEM 1      1    0    0
ITEM 2      1    0    0
ITEM 3      0    1    0
ETC.
13
CRITERIA FOR SIMPLE STRUCTURE
Structural equation modeling provides a chi-square test of fit
Compares the observed covariance (correlation) matrix with the predicted/fitted matrix
Alternatively, look at the RMSEA (root mean square error of approximation) of deviations from the fitted matrix
14
MATHEMATICAL MODEL
Z = persons-by-variables matrix of p × k standardized variables (mean = 0, SD = 1)
Z′Z = NR (covariance matrix), k × k
z_i = a_i F_i + e_i
15
MATHEMATICAL MODEL
Z = AF = C + U
ZZ′/N = R = AΦA′ + U²
S = ZF′/N (structure matrix: correlations between Z and F) = AFF′/N = AΦ
Φ = FF′/N (correlations among factors)
A = pattern matrix
16
MATHEMATICAL MODEL
S = AΦ,  A = SΦ⁻¹
(If factors are uncorrelated, A = S: pattern matrix = structure matrix)
R = ZZ′/N = CC′/N + U²
17
MATHEMATICAL MODEL
If we take the covariance matrix of F to be diagonal, and the metric of the variances of F_i to be 1.0, then
R = AFF′A′/N = AA′ = SA′ = AS′
18
MATHEMATICAL MODEL
Now let z_i = a_i F_i + s_i + e_i
Let Ŕ = R − D², where D² is a diagonal matrix of specificities and error: s²_i + e²_i
Then Ŕ = AFF′A′/N = AΦA′ = SA′ = AS′
If Φ = I, then Ŕ = AA′
19
MATHEMATICAL MODEL
How do we estimate s²_i? Instead, estimate [R − U²]_ii = 1 − s²_i − e²_i
Consider that each z_i is predictable from the rest: z_i = b₁z₁ + b₂z₂ + … + b_{i−1}z_{i−1} + …
Then R²_i = variance common to all the other variables (the squared multiple correlation, or SMC), used as the estimate of h²_i, the communality for item i
Due to Dwyer (1939)
20
MATHEMATICAL MODEL
SMC is estimable from the observed data, so Ŕ = R − [1 − SMC_i], where [SMC_i] is a diagonal matrix with the SMC of each variable on the diagonal and zeros off-diagonal.
A theorem states that using SMCs guarantees that the number of factors is at least the number of eigenvalues > 1.0.
21
MATHEMATICAL MODEL
Ŕ =
| R²_1.234…     0              0              0             … |
| 0             R²_2.134…      0              0             … |
| 0             0              R²_3.124…      0             … |
| 0             0              0              R²_4.123…     … |
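A small Python/NumPy sketch of the usual SMC computation (my addition): the SMC of variable i is 1 − 1/[R⁻¹]_ii, and replacing the unit diagonal of R with the SMCs gives the reduced matrix Ŕ. The correlation matrix here is hypothetical.

```python
import numpy as np

# Illustrative correlation matrix (hypothetical values)
R = np.array([[1.00, 0.56, 0.42, 0.10],
              [0.56, 1.00, 0.48, 0.12],
              [0.42, 0.48, 1.00, 0.08],
              [0.10, 0.12, 0.08, 1.00]])

R_inv = np.linalg.inv(R)
smc = 1.0 - 1.0 / np.diag(R_inv)   # squared multiple correlation of each variable
                                   # with all of the others

R_reduced = R.copy()
np.fill_diagonal(R_reduced, smc)   # Ŕ: SMCs replace the 1s on the diagonal

print(np.round(smc, 3))
```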
22
MATHEMATICAL MODEL
SOLUTIONS: PRINCIPAL COMPONENTS (R = Ŕ)
Rq = λq,  RQ = QΛ,  Λ = diag[λ_i]
Q⁻¹RQ = Λ
QQ′ = I, so Q⁻¹ = Q′
Q′RQ = Λ  (Spectral Theorem)
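The spectral decomposition is easy to verify numerically. The sketch below (an illustrative addition using NumPy and a hypothetical R) checks Q′RQ = Λ and QQ′ = I, and forms principal-component loadings as A = QΛ^(1/2).

```python
import numpy as np

R = np.array([[1.00, 0.56, 0.42],
              [0.56, 1.00, 0.48],
              [0.42, 0.48, 1.00]])

lam, Q = np.linalg.eigh(R)       # eigenvalues (ascending) and orthonormal eigenvectors
lam, Q = lam[::-1], Q[:, ::-1]   # reorder from largest to smallest

# Spectral theorem: Q'RQ = diag(lambda) and QQ' = I
assert np.allclose(Q.T @ R @ Q, np.diag(lam))
assert np.allclose(Q @ Q.T, np.eye(3))

A = Q @ np.diag(np.sqrt(lam))    # principal-component "loadings"
assert np.allclose(A @ A.T, R)   # R = AA' when all components are retained
print(np.round(lam, 3))
```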
23
MATHEMATICAL MODEL
SOLUTIONS: PRINCIPAL AXIS
(Ŕ − λI)q = 0
That is, solve for the first eigenvalue: |Ŕ − λI| = 0, solved by Rᵐq = λᵐq
Begin with m = 2: R²q = λ²q, then put the solution in R(Rq₁) = λ²q₁, and iterate for m = 4, …
24
MATHEMATICAL MODEL
Now compute the residual correlation matrix: R₁ = R − Ŕ, and iterate
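A hedged Python sketch of this iteration (my addition): power iteration, which is what applying Rᵐ amounts to, pulls out the dominant eigenvector of the reduced matrix; subtracting λ₁q₁q₁′ is one common way to form the residual matrix for the next factor. The reduced matrix values are hypothetical.

```python
import numpy as np

def first_eigenpair(M, iters=200):
    """Power iteration: repeatedly apply M (equivalent to using M**m for large m)."""
    q = np.ones(M.shape[0]) / np.sqrt(M.shape[0])
    for _ in range(iters):
        q = M @ q
        q /= np.linalg.norm(q)
    lam = q @ M @ q                            # Rayleigh quotient = leading eigenvalue
    return lam, q

# Reduced correlation matrix Ŕ (SMCs on the diagonal; values are illustrative)
R_red = np.array([[0.55, 0.56, 0.42],
                  [0.56, 0.62, 0.48],
                  [0.42, 0.48, 0.45]])

lam1, q1 = first_eigenpair(R_red)
a1 = np.sqrt(lam1) * q1                        # loadings on the first principal-axis factor
R_resid = R_red - lam1 * np.outer(q1, q1)      # residual matrix; iterate for the next factor
print(np.round(a1, 3))
```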
25
EIGENVALUES
λ_i = variance of the i-th factor
λ_i / Σλ_i = proportion of total variance accounted for by the i-th factor
λ_i < 1 suggests a chance factor
Scree plot (eigenvalues plotted by factor, ordered from greatest to lowest)
26
[SCREE PLOT: eigenvalue λ_k on the vertical axis, with a reference line at 1.0, plotted against factor number 1, 2, 3, …, K on the horizontal axis.]
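As an illustrative addition (not from the slides), this snippet computes the quantities one reads off a scree plot: each eigenvalue, its proportion of total variance, and how many eigenvalues exceed the 1.0 cutoff. The correlation matrix is hypothetical.

```python
import numpy as np

R = np.array([[1.00, 0.56, 0.42, 0.10],
              [0.56, 1.00, 0.48, 0.12],
              [0.42, 0.48, 1.00, 0.08],
              [0.10, 0.12, 0.08, 1.00]])

lam = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, largest first
prop = lam / lam.sum()                       # proportion of total variance per factor

for i, (l, p) in enumerate(zip(lam, prop), start=1):
    print(f"factor {i}: eigenvalue {l:.3f}, proportion {p:.3f}")
print("eigenvalues > 1.0:", int((lam > 1.0).sum()))
```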
27
ROTATION
MEANING CRITERION: SIMPLE STRUCTURE, POSITIVE MANIFOLD
B = AT
A = INITIAL FACTOR MATRIX
T = TRIANGULAR MATRIX
B = FINAL FACTOR MATRIX
TT′ = Φ
28
VARIMAX ROTATION (uncorrelated factors)
ORTHOGONAL (RIGID) ROTATION
Maximize V = n Σ_p Σ_j (b_jp / h_j)⁴ − Σ_p [ Σ_j (b²_jp / h²_j) ]²
Geometric problem:
(X, Y) = (x, y) | cos φ   −sin φ |
                | sin φ    cos φ |
29
VARIMAX ROTATION
(X, Y) = (x, y) | cos φ   −sin φ |
                | sin φ    cos φ |
u_j = x²_j − y²_j,   v_j = 2 x_j y_j
A = Σ u_j,   B = Σ v_j,   C = Σ (u²_j − v²_j),   D = 2 Σ u_j v_j
Solve tan 4φ = [D − 2AB/n] / [C − (A² − B²)/n],   −45° ≤ φ ≤ 45°
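In practice this planar-angle algebra is packaged as an iterative routine. Below is a hedged NumPy sketch (my addition) of the common SVD-based varimax update; it implements raw varimax, whereas the slide's criterion also divides each row by its communality h_j (Kaiser normalization). The loadings are made up.

```python
import numpy as np

def varimax(A, gamma=1.0, tol=1e-8, max_iter=100):
    """Raw varimax rotation via the usual SVD-based updates; returns B = A @ T."""
    p, k = A.shape
    T = np.eye(k)
    crit_old = 0.0
    for _ in range(max_iter):
        B = A @ T
        # gradient of the varimax criterion with respect to the rotation
        G = A.T @ (B**3 - (gamma / p) * B @ np.diag((B**2).sum(axis=0)))
        U, s, Vt = np.linalg.svd(G)
        T = U @ Vt
        if s.sum() < crit_old * (1 + tol):
            break
        crit_old = s.sum()
    return A @ T, T

# Hypothetical unrotated loadings
A = np.array([[0.70,  0.30],
              [0.65,  0.35],
              [0.40, -0.55],
              [0.35, -0.60]])
B, T = varimax(A)
print(np.round(B, 3))
print(np.round(T @ T.T, 3))   # T is orthogonal: TT' = I
```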
30
[Figure: orthogonal (perpendicular) rotation of axes, plotted in the plane of unrotated Factor 1 loading values (x-axis) and unrotated Factor 2 loading values (y-axis).]
31
OBLIQUE SOLUTION (correlated factors)
MINIMIZE S (OBLIMIN):
S = Σ_{p<g} [ n Σ_j (v²_jp / h²_j)(v²_jg / h²_j) − (Σ_j v²_jp / h²_j)(Σ_j v²_jg / h²_j) ]
PROMAX: Start with VARIMAX, B = AT, and transform with v_jp = (b⁴_jp) / b_jp
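A hedged sketch of one common promax recipe (my addition, following the slide's b⁴/b target): power the varimax loadings to sharpen simple structure, fit that target by least squares, rescale so factors have unit variance, then read off the pattern matrix and factor correlations. The varimax-rotated loadings below are hypothetical.

```python
import numpy as np

def promax(B, power=4):
    """Promax: oblique transformation of varimax-rotated loadings B."""
    P = B**power / B                               # target: keeps the sign, shrinks small loadings
    L = np.linalg.solve(B.T @ B, B.T @ P)          # least-squares transformation toward the target
    L = L @ np.diag(np.sqrt(np.diag(np.linalg.inv(L.T @ L))))  # rescale: unit factor variances
    pattern = B @ L
    L_inv = np.linalg.inv(L)
    Phi = L_inv @ L_inv.T                          # factor correlation matrix
    return pattern, Phi

# Hypothetical varimax-rotated loadings (two factors)
B = np.array([[0.75, 0.10],
              [0.70, 0.15],
              [0.12, 0.68],
              [0.08, 0.72]])
pattern, Phi = promax(B)
print(np.round(pattern, 3))
print(np.round(Phi, 3))
```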
32
FACTOR CORRELATION
Φ = TT′
T_ij = | cos(φ_ij)   −sin(φ_ij) |
       | sin(φ_ij)    cos(φ_ij) |
r_ij = [cos(φ_ij)][−sin(φ_ij)] + [sin(φ_ij)][cos(φ_ij)] = T₁₁T₁₂ + T₂₁T₂₂
33
FACTOR CORRELATION
S = PΦ (structure matrix = pattern matrix × factor correlation matrix)
P = A(T′)⁻¹
A = PT′
34
[Figure: oblique rotation of axes; φ_ij is the angle between the rotated factor axes.]
35
ALPHA FACTOR ANALYSIS
Estimates the population h²_i for each variable
Differs little from common factor analysis
36
Canonical Factor Analysis Uses canonical analysis to maximize R between factors and variables, iterative Maximum Likelihood analysis
37
IMAGE ANALYSIS
h²_i = R²_i·1,2,…,K
p_j = Σ_k w_jk z_k (standard regression); p_j is the image of z_j
e_j = z_j − p_j, called the anti-image
Var(e_j) > Var(ε_j), where Var(ε_j) = the anti-image for the regression of z_j on the factors F₁, F₂, …, F_K
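A hedged NumPy sketch of the image/anti-image idea (my addition): regress each standardized variable on all of the others; the predicted part is its image and the residual is its anti-image. The data are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((200, 4))
Z[:, 1] += 0.8 * Z[:, 0]                  # build in some shared variance
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)  # standardize

images = np.empty_like(Z)
anti_images = np.empty_like(Z)
for j in range(Z.shape[1]):
    others = np.delete(Z, j, axis=1)
    # least-squares regression of z_j on all the other variables
    w, *_ = np.linalg.lstsq(others, Z[:, j], rcond=None)
    images[:, j] = others @ w                     # p_j: the image of z_j
    anti_images[:, j] = Z[:, j] - images[:, j]    # e_j: the anti-image

smc = 1 - (anti_images**2).mean(axis=0)   # image variance = SMC of each variable (approximately)
print(np.round(smc, 3))
```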
38
FACTOR CONGRUENCE
An alternative to confirmatory analysis for two groups hypothesized to have the same factor structure:
S_pq = Σ_j a_jp b_jq / √[ (Σ_j a²_jp)(Σ_j b²_jq) ]
This is basically the correlation between the factor loadings on comparable factors for the two groups.
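A minimal sketch (my addition) of the congruence coefficient for one pair of comparable factors; the two sets of loadings are hypothetical.

```python
import numpy as np

def congruence(a, b):
    """Tucker-style congruence coefficient between two loading vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return (a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum())

# Hypothetical loadings for the same factor in two groups
group1 = [0.70, 0.80, 0.60, 0.10]
group2 = [0.65, 0.75, 0.55, 0.20]
print(round(congruence(group1, group2), 3))
```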
39
Example of a 2-factor structure
Achievement (reading, math) and IQ (verbal, nonverbal)
Quasi-multitrait-multimethod analysis: reading is "verbal," math is "nonverbal"
42
CONFIRMATORY FACTOR ANALYSIS
43
BASIC PRINCIPLES
Σ_xx = E(xx′) =
| σ²_x1                        |
| σ_x1x2    σ²_x2              |
| σ_x1x3    σ_x2x3    σ²_x3    |
44
BASIC PRINCIPLES
σ²_x1 = λ²_11 φ_11 + θ²_1
σ²_xk = λ²_k1 φ_11 + θ²_k
σ_xi,xk = λ_i1 φ_11 λ_k1
[Path diagram: latent factor ξ₁ with loadings λ_11, …, λ_k1 to indicators x₁, …, x_k, each with an error term.]
45
IDENTIFICATION RULES
t-rule: t ≤ q(q+1)/2, q = # manifest variables; necessary but not sufficient
3-indicator rule: 1 factor, 3 indicators; sufficient but not necessary
2-indicator rule: 2+ factors, 2 indicators each
Local vs. global identification:
  local: sample estimates of parameters independent - necessary but not sufficient
  global: population parameters independent - necessary and sufficient
46
ESTIMATION / MODEL EVALUATION
FIT: F_ML used to evaluate Σ̂, S
Residuals: E = S − Σ̂
RMR = SD(s_ij − σ̂_ij)
RMSEA = √[(χ²/df − 1)/(N − 1)]
Note: factor analyze E; it should be 0
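An illustrative Python sketch (my addition) of these two fit summaries from typical output quantities; the RMR here uses one common formulation (root mean square of the unique residuals), and the chi-square, df, and N are made-up values.

```python
import numpy as np

def rmsea(chi_sq, df, n):
    """RMSEA from the model chi-square, its degrees of freedom, and sample size."""
    val = (chi_sq / df - 1) / (n - 1)
    return np.sqrt(max(val, 0.0))      # truncated at 0 when chi-square < df

def rmr(S, Sigma_hat):
    """Root mean square residual over the unique elements of S - Sigma_hat."""
    E = S - Sigma_hat
    tri = E[np.tril_indices_from(E)]
    return np.sqrt((tri**2).mean())

# Hypothetical fit results
print(round(rmsea(chi_sq=45.2, df=24, n=300), 3))
```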
47
Hancock's Formula - reliability for a given factor
H_j = 1 / [ 1 + 1 / Σ_i ( l²_ij / (1 − l²_ij) ) ]
Ex. l₁ = .7, l₂ = .8, l₃ = .6
H = 1 / [ 1 + 1/(.49/.51 + .64/.36 + .36/.64) ]
  = 1 / [ 1 + 1/(.96 + 1.78 + .56) ]
  = 1 / (1 + 1/3.30) = .77
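A one-function sketch of coefficient H (my addition); it reproduces the slide's example from the loadings .7, .8, .6.

```python
import numpy as np

def hancock_h(loadings):
    """Coefficient H (construct reliability) for one factor from standardized loadings."""
    l2 = np.asarray(loadings, dtype=float) ** 2
    return 1.0 / (1.0 + 1.0 / np.sum(l2 / (1.0 - l2)))

print(round(hancock_h([0.7, 0.8, 0.6]), 2))   # about 0.77 for the slide's example
```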
48
Hancock's Formula Explained
H_j = 1 / [ 1 + 1 / Σ_i ( l²_ij / (1 − l²_ij) ) ]
Now assume strict parallelism: then l²_ij = ρ²_xt for every item, so
H_j = 1 / [ 1 + 1 / ( k ρ²_xt / (1 − ρ²_xt) ) ] = k ρ²_xt / [1 + (k − 1) ρ²_xt]
    = the Spearman-Brown formula
49
χ² TEST
(N − 1)F_ML ~ χ²_t
Used for a nested model: a model with one or more restrictions on the original
Restriction = a known (fixed) parameter, or equality of two or more parameters
Proof: Bollen shows −2 log(L₁/L₀) = (N − 1)F_ML, where L₀ is the unrestricted and L₁ the restricted model
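A small sketch of the resulting chi-square difference test for nested models (my addition, assuming SciPy is available); the fit statistics are hypothetical.

```python
from scipy.stats import chi2

def nested_chi_square_test(chi2_restricted, df_restricted, chi2_full, df_full):
    """Chi-square difference test for two nested models."""
    d_chi2 = chi2_restricted - chi2_full
    d_df = df_restricted - df_full
    return d_chi2, d_df, chi2.sf(d_chi2, d_df)   # difference, its df, and the p-value

# Hypothetical fit statistics for a restricted vs. unrestricted model
print(nested_chi_square_test(chi2_restricted=58.4, df_restricted=26,
                             chi2_full=45.2, df_full=24))
```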
50
INCREMENTAL FIT
Bentler and Bonett: Δ₁ = (F_b − F_m)/F_b = (χ²_b − χ²_m)/χ²_b
Can be used to compare improvements over the original model, or against a standard or baseline
51
Bentler & Bonett Baseline conventions
Σ_b = diag(S) (the independence/null baseline)
Alternatives: Σ_b = [.5], or Σ_b taken from a previous study
Example: Willson & Rupley (1997) was used as the baseline in Nichols's (1997) dissertation
52
Bollen's fit index
Δ₂ = (χ²_b − χ²_m) / (χ²_b − df_m)
Logic: the difference in the numerator has expected value equal to the denominator
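For comparison, a short sketch (my addition) computing both incremental indices from baseline and model chi-squares; the numbers are hypothetical.

```python
def incremental_fit(chi2_b, chi2_m, df_m):
    """Bentler-Bonett Delta1 (NFI) and Bollen's Delta2 (IFI)."""
    delta1 = (chi2_b - chi2_m) / chi2_b
    delta2 = (chi2_b - chi2_m) / (chi2_b - df_m)
    return delta1, delta2

# Hypothetical baseline (independence) and hypothesized-model chi-squares
d1, d2 = incremental_fit(chi2_b=480.0, chi2_m=45.2, df_m=24)
print(round(d1, 3), round(d2, 3))
```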
53
AMOS SEM PROGRAM
Uses SPSS to input data - select the SPSS file
Draw the factor model
Circles for factors, boxes for observed variables
Arrows from circles to boxes to indicate loadings
Errors for each box (special drawing character)
Label all circles and boxes with names - SPSS variable names for boxes, your own names for factors and circles
Correlate factors with curved arrows as needed
54
[AMOS drawing: a factor F1 with observed indicators ANX, DEP, and SE, each with its own error term (e1, e2, e3).]