Multivariate Transformation

Multivariate Transformations  Started in statistics of psychology and sociology.  Also called multivariate analyses and multivariate statistics.  Have been used by biological scientists since Fisher  Different from all other forms of statistics.  Explained in form of matrix algebra.

Bus Time-Table

Properties of Matrices
- The sum of the leading diagonal is called the trace.
- A symmetrical matrix may be singular.
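A minimal numpy sketch of these two properties, using an arbitrary symmetric matrix chosen purely for illustration (not one from the slides):

```python
import numpy as np

# Arbitrary symmetric matrix, built to be singular: row 3 = row 1 + row 2.
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])

trace = np.trace(A)        # sum of the leading diagonal
det = np.linalg.det(A)     # zero (up to rounding), so this symmetric matrix is singular

print(f"trace = {trace}, determinant = {det:.2e}")
```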

Singular Matrices
[example matrix not reproduced in the transcript]
a − 3b − c = 0

Singular Matrices
If A and B are symmetrical matrices of size p × p, then there are p values of Δ that make A − ΔB singular.
These values are called latent roots or eigen-values.
The multipliers (transformants) are called eigen-vectors.

[the 3 × 3 matrices A and B are not reproduced in the transcript]
There should be 3 values of Δ that make A − ΔB singular.
One would be Δ = 21: A − 21B is singular, with eigen-vector [8, −5, 1].
Another would be Δ = 6: A − 6B is singular, with eigen-vector [4, 1, −3].
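Because the slide's matrices are not reproduced in the transcript, the sketch below uses placeholder matrices to show how the values of Δ and their eigen-vectors would be obtained and checked; scipy's generalized eigenvalue routine does the work, and the A and B used here are assumptions, not the matrices from the slide.

```python
import numpy as np
from scipy.linalg import eig

# Placeholder symmetric matrices -- NOT the ones from the slide.
A = np.array([[6.0, 2.0, 1.0],
              [2.0, 5.0, 2.0],
              [1.0, 2.0, 4.0]])
B = np.eye(3)

# Generalized eigenvalue problem: find delta and v with (A - delta*B) v = 0.
deltas, vectors = eig(A, B)

for delta, v in zip(deltas.real, vectors.T.real):
    # Each delta makes A - delta*B singular ...
    print(f"delta = {delta:.3f}, det(A - delta*B) = {np.linalg.det(A - delta * B):.2e}")
    # ... and v is the corresponding eigen-vector (transformant).
    print("  (A - delta*B) v =", np.round((A - delta * B) @ v, 6))
```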

Eigen values and eigen vectors

Use of Singular Matrices
- Used in several multivariate transformations, where A and B represent the variability of sets of characters.
- Making A − ΔB singular may be regarded as subtracting B from A as often as possible, until the determinantal value is zero.

What do plant scientists do?
- They test hypotheses: “Does this treatment affect the crop?”
- They estimate quantities within a hypothesis: “What is the expected yield increase resulting from adding 100 lbs of nitrogen?”
- Multivariate transformations serve neither purpose; rather, they generate hypotheses!

Why use multivariate transformations?
- Reduce the dimensions of complex situations.
- Principal Components
- Canonical Analyses

Matrix of Interest
XX′ = [matrix not reproduced in the transcript] = A
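A small numpy sketch of the matrix of interest: for a data matrix X with characters in rows and observations in columns, the cross-product XX′ gives the matrix A that summarises the variability of the characters. The data values here are made up purely for illustration.

```python
import numpy as np

# Made-up data: 3 characters (rows) measured on 5 observations (columns).
X = np.array([[1.2, 0.8, 1.5, 0.9, 1.1],
              [3.4, 2.9, 3.8, 3.1, 3.3],
              [0.5, 0.7, 0.4, 0.6, 0.5]])

A = X @ X.T    # the "matrix of interest": 3 x 3 cross-product among characters
print(A)
```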

Principal Components Example # 1  Extracted from the work of Moore.  Concerned with the effect of size of apple trees at planting on future tree development  Tree weight (w); trunk circumference squared (x); length of laterals (y) and length of central leader (z)

Principal Components Example # 1
Correlation matrix among the four characters (Weight, Trunk, Lateral, Leader) = A
[the off-diagonal correlation values are not reproduced in the transcript; the diagonal entries are 1.00]

Principal Components Example # 1

Character   Weight   Trunk   Lateral   Leader
Weight       1.00     0        0         0
Trunk        0        1.00     0         0
Lateral      0        0        1.00      0
Leader       0        0        0         1.00

= B (the 4 × 4 identity matrix)

Principal Components Example # 1
- The sum of the eigen-values equals the trace of A (the original correlation matrix).
- The trace of A is the total variance of the four variables.
- Each eigen-value, expressed as a proportion of the trace, indicates the proportion of the total variation accounted for by that transformation.
- Because B is the identity matrix here, the values of Δ that make A − ΔB singular are simply the ordinary eigen-values of A.
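A brief numpy sketch of the point about the trace, using a made-up 4 × 4 correlation matrix in place of Moore's (the actual values are not in the transcript):

```python
import numpy as np

# Made-up correlation matrix standing in for A (Weight, Trunk, Lateral, Leader).
A = np.array([[1.00, 0.80, 0.60, 0.50],
              [0.80, 1.00, 0.55, 0.45],
              [0.60, 0.55, 1.00, 0.40],
              [0.50, 0.45, 0.40, 1.00]])

eigenvalues = np.linalg.eigvalsh(A)[::-1]   # B is the identity, so an ordinary eigen-analysis

print("trace of A          :", np.trace(A))        # total variance: 4 for a correlation matrix
print("sum of eigen-values :", eigenvalues.sum())  # equals the trace
print("proportion of variation per component:", np.round(eigenvalues / np.trace(A), 3))
```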

Principal Components Example # 2
- Twenty different Brassica cultivars.
- Effect of insect damage and plant morphology.
- Ten variables recorded, three treatments.
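To connect this to the percentages on the plot that follows, here is a sketch of how the proportion of variation for the first two components would be found; the data are randomly generated stand-ins, since the Brassica measurements are not in the transcript.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: 20 cultivars (rows) x 10 variables (columns).
data = rng.normal(size=(20, 10))

R = np.corrcoef(data, rowvar=False)          # 10 x 10 correlation matrix of the variables
eigenvalues = np.linalg.eigvalsh(R)[::-1]    # descending order

percent = 100 * eigenvalues / eigenvalues.sum()
print(f"First two components: {percent[0]:.0f}% and {percent[1]:.0f}% of the variation")
```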

[Figure: principal component plot; axis labels +54% and +23%]

Principal Components Example # 2

[Figure: S. alba, B. napus, and S. alba × B. napus]

Problems for Statisticians
- Multivariate transformations are often speculative.
- Analyses are laborious and require specialised computer software.
- There is a real danger of letting the computer reduce the dimensions of a problem in a non-biological manner.

Multivariate Transformations
- Applicable to multi-dimensional problems.
- Reduce the dimensions of complex problems.
- Must be applied with knowledge of the biological systems involved.
- Can be considered a “try it and see” technique.
- Can point researchers in the right direction and indicate possible hypotheses to be tested in future studies.

Summary
- Association between characters.
- Simple linear regression model.
- Estimation of parameters.
- Analysis of variance of regression.
- Testing regression parameters (t-tests).

Summary
- Prediction using regression.
- Outliers.
- Scatter diagrams.
- Making a curved line straight.
- Transformation, probit analysis.
- Optimal ascent, where straight lines meet.

Summary
- Correlation.
- Bi-variate distribution.
- Testing correlation coefficients.
- Transforming to z.
- Use of correlation.

Summary
- Multiple regression.
- Analysis of variance.
- Forward step-wise regression.
- Polynomial regression.
- Multivariate transformation.

Multiple Experiments
Genotype × Environment Interactions