THREE-WAY COMPONENT MODELS (880305, pages 66-76). By: Maryam Khoshkam


Tucker component models. Ledyard Tucker was one of the pioneers of multi-way analysis. He proposed a series of models nowadays called N-mode PCA or Tucker models [Tucker ].

TUCKER3 MODELS: the Tucker3 model allows nonzero off-diagonal elements in its core array.

In Kronecker product notation, the Tucker3 model reads X(I×JK) = A G(P×QR) (C ⊗ B)′ + E(I×JK), where X(I×JK) is the matricized X, G(P×QR) the matricized core array, and E the residual array.
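This Kronecker form can be checked numerically. A minimal NumPy sketch with hypothetical small dimensions; Fortran-order reshapes are used so that the mode-1 unfolding matches C ⊗ B:

```python
import numpy as np

# Hypothetical small dimensions: X is I x J x K, core G is P x Q x R
I, J, K, P, Q, R = 4, 5, 6, 2, 3, 2
rng = np.random.default_rng(0)
A = rng.standard_normal((I, P))      # first-mode loadings
B = rng.standard_normal((J, Q))      # second-mode loadings
C = rng.standard_normal((K, R))      # third-mode loadings
G = rng.standard_normal((P, Q, R))   # core array

# Build the three-way array X elementwise from the Tucker3 sum ...
X = np.einsum('ip,jq,kr,pqr->ijk', A, B, C, G)

# ... and via the Kronecker form X(IxJK) = A G(PxQR) (C kron B)'
G_mat = G.reshape(P, Q * R, order='F')   # matricized core
X_mat = A @ G_mat @ np.kron(C, B).T      # matricized model

# Both routes agree
print(np.allclose(X.reshape(I, J * K, order='F'), X_mat))  # True
```

The Fortran-order reshape is what makes the column ordering of the unfolded X consistent with the Kronecker product C ⊗ B.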

PROPERTIES OF THE TUCKER3 MODEL. The Tucker3 model has rotational freedom: the loading matrix A can be transformed by an arbitrary nonsingular matrix T_A, provided the core array is counter-rotated accordingly. Such a transformation can be defined similarly for B and C, using T_B and T_C, respectively.
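The rotational freedom in the first mode can be verified numerically. A sketch with hypothetical dimensions: T plays the role of the arbitrary nonsingular T_A, and the core is counter-rotated by its inverse, leaving the fitted X unchanged:

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, P, Q, R = 4, 5, 6, 2, 3, 2
A = rng.standard_normal((I, P))
B = rng.standard_normal((J, Q))
C = rng.standard_normal((K, R))
G = rng.standard_normal((P, Q * R))   # core array, matricized to P x QR

T = rng.standard_normal((P, P))       # arbitrary nonsingular T_A
A_rot = A @ T                         # rotated first-mode loadings
G_rot = np.linalg.inv(T) @ G          # counter-rotated core

X1 = A @ G @ np.kron(C, B).T          # original model
X2 = A_rot @ G_rot @ np.kron(C, B).T  # rotated model
print(np.allclose(X1, X2))            # True: fit unchanged
```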

The Tucker3 model has rotational freedom, but it is not possible to rotate the Tucker3 core array to a superdiagonal form (and thereby obtain a PARAFAC model). Hence the Tucker3 model does not give unique component matrices: it has rotational freedom.

rotational freedom Orthogonal component matrices (at no cost in fit by defining proper matrices T A, T B and T C ) convenient : to make the component matrices orthogonal  easy interpretation of the elements of the core- array and of the loadings by the loading plots 7

The sums of squares of the elements of the core array represent the amounts of variation explained by the combinations of factors in the different modes. The variation in X consists of a part explained by the model and an unexplained part; using a proper rotation, all the variance of the explained part can be gathered in the core.
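For a noise-free model part with orthonormal loading matrices, the total sum of squares of the core equals that of the modelled X. A quick NumPy check (dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, K, P, Q, R = 5, 6, 7, 2, 3, 2
# QR factorizations give loading matrices with orthonormal columns
A, _ = np.linalg.qr(rng.standard_normal((I, P)))
B, _ = np.linalg.qr(rng.standard_normal((J, Q)))
C, _ = np.linalg.qr(rng.standard_normal((K, R)))
G = rng.standard_normal((P, Q * R))   # matricized core array

X = A @ G @ np.kron(C, B).T           # modelled (explained) part of X

# Sum of squares of X equals sum of squares of the core elements
print(np.isclose(np.sum(X**2), np.sum(G**2)))  # True
```

This holds because C ⊗ B inherits orthonormal columns from B and C, so the multiplications by A and (C ⊗ B)′ preserve the Frobenius norm of G.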

The rotational freedom of Tucker3 models can also be used to rotate the core array to a simple structure, as is common in two-way analysis (this will be explained later).

Imposing the restrictions A’A = B’B = C’C = I is not sufficient for obtaining a unique solution. To obtain unique estimates of the parameters: 1. the loading matrices should be orthogonal; 2. A should also contain the eigenvectors of X(CC’ ⊗ BB’)X’ corresponding to decreasing eigenvalues of that same matrix; similar restrictions should be put on B and C [De Lathauwer 1997, Kroonenberg et al. 1989].
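The second restriction can be sketched as follows (the function name and dimensions are hypothetical): A is taken as the eigenvectors of X(CC’ ⊗ BB’)X’ belonging to the P largest eigenvalues.

```python
import numpy as np

def restricted_A(X_mat, B, C, P):
    """Eigenvectors of X (CC' kron BB') X' for the P largest eigenvalues."""
    M = X_mat @ np.kron(C @ C.T, B @ B.T) @ X_mat.T
    vals, vecs = np.linalg.eigh(M)   # symmetric M: eigenvalues ascending
    return vecs[:, ::-1][:, :P]      # reorder to decreasing eigenvalues

# Hypothetical sizes: X_mat is I x JK with J = 4, K = 5
rng = np.random.default_rng(3)
I, J, K, P = 6, 4, 5, 2
X_mat = rng.standard_normal((I, J * K))
B, _ = np.linalg.qr(rng.standard_normal((J, 2)))
C, _ = np.linalg.qr(rng.standard_normal((K, 2)))

A = restricted_A(X_mat, B, C, P)
print(A.shape)  # (6, 2); columns are orthonormal eigenvectors
```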

Unique Tucker. Simulated data: two components, PARAFAC model.

Unique Tucker3 component model with P = Q = R = 3: only two significant elements in the core.

In Tucker3 models all three modes are reduced. Models in which only two of the three modes are reduced are called Tucker2 models. A Tucker2 model arises from a Tucker3 model for X (I × J × K) when C is chosen to be the identity matrix I of size K × K: no reduction is sought in the third mode (its basis is not changed).
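A minimal sketch of the Tucker2 structure (dimensions hypothetical): choosing the identity matrix for the third mode leaves an extended P × QK core, with only the first two modes reduced.

```python
import numpy as np

rng = np.random.default_rng(4)
I, J, K, P, Q = 4, 5, 3, 2, 2
A = rng.standard_normal((I, P))       # first-mode loadings (reduced)
B = rng.standard_normal((J, Q))       # second-mode loadings (reduced)
G = rng.standard_normal((P, Q * K))   # extended core, P x QK

# Tucker2: X(I x JK) = A G (I_K kron B)'; third mode is not reduced
X_mat = A @ G @ np.kron(np.eye(K), B).T
print(X_mat.shape)  # (4, 15)
```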

The Tucker2 model also has rotational freedom: G can be postmultiplied by U ⊗ V and (B ⊗ A) multiplied by (U ⊗ V)−1, i.e. replaced by (B(U’)−1 ⊗ A(V’)−1), without changing the fit. Hence the component matrices A and B can be made orthogonal without loss of fit (using orthogonal U and V).

Tucker1 models reduce only one of the modes. X (and accordingly G) are matricized: X(I×JK) = A G(P×JK) + E(I×JK).
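Fitting a Tucker1 model amounts to a PCA (truncated SVD) of the matricized X. A sketch with hypothetical dimensions:

```python
import numpy as np

rng = np.random.default_rng(5)
I, J, K, P = 5, 4, 3, 2
X = rng.standard_normal((I, J, K))
X_mat = X.reshape(I, J * K, order='F')   # matricize X to I x JK

# Truncated SVD gives the best least-squares Tucker1 fit with P components
U, s, Vt = np.linalg.svd(X_mat, full_matrices=False)
A = U[:, :P]                   # I x P component matrix
G = np.diag(s[:P]) @ Vt[:P]    # P x JK matricized core
X_hat = A @ G                  # best rank-P approximation of X_mat
```

By the Eckart-Young theorem the residual sum of squares equals the sum of the discarded squared singular values.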

Overview of the different three-way component models for X (I × J × K) [Kiers 1991, Smilde 1997]. Notation: X(I×JK) is the matricized X; A is the (I × P) component matrix of the first (reduced) mode; A, B, C are component matrices; G denotes the differently matricized core arrays; I is the superdiagonal array (ones on the superdiagonal). The component matrices, core arrays and residual error arrays differ for each model. The PARAFAC model is a special case of the Tucker3 model.