Probability theory 2008, Lecture 5: The multivariate normal distribution


Outline of lecture 5: The multivariate normal distribution
- Characterizing properties of the univariate normal distribution
- Different definitions of normal random vectors
- Conditional distributions
- Independence
- Cochran's theorem

The univariate normal distribution - defining properties
- A distribution is normal if and only if it has the probability density f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)), where μ ∈ R and σ > 0.
- A distribution is normal if and only if the sample mean and the sample variance are independent for all n.
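As a quick numerical sanity check on the density above, a minimal sketch (the parameter values μ = 1, σ = 2 and the integration grid are arbitrary choices, not from the slides) that evaluates the pdf and verifies it integrates to 1:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Riemann-sum approximation of the integral over [-10, 10] for mu = 1, sigma = 2.
step = 0.001
total = sum(normal_pdf(-10 + i * step, mu=1.0, sigma=2.0) * step for i in range(20000))

# The density is symmetric about mu.
left = normal_pdf(0.5, mu=1.0, sigma=2.0)
right = normal_pdf(1.5, mu=1.0, sigma=2.0)
```

The sum comes out very close to 1, as it must for any valid choice of μ and σ.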

The univariate normal distribution - defining properties
- Suppose that X_1 and X_2 are independent of each other, and that the same is true for the pair of linear forms (a_11 X_1 + a_12 X_2, a_21 X_1 + a_22 X_2), where no coefficient vanishes. Then all four variables are normal (the Skitovich-Darmois theorem).
- Special case: rotations of the coordinate axes (x_1, x_2) other than multiples of 90 degrees.

The univariate normal distribution - defining properties
Let F be a class of distributions such that X ∈ F ⇒ a + bX ∈ F. Can F consist of distributions other than the normal distributions? Cf. the Cauchy distributions.

The multivariate normal distribution - a first definition
- A random vector is normal if and only if every linear combination of its components is normal.
Immediate consequences:
- Every component is normal
- The sum of all components is normal
- Every marginal distribution is normal
- Vectors whose components are independent normal random variables are normal
- Linear transformations of normal random vectors give rise to new normal vectors

Probability theory 2008 Illustrations of independent and dependent normal distributions


Parameterization of the multivariate normal distribution
- Is a multivariate normal distribution uniquely determined by the vector of expected values and the covariance matrix?
- Is there a multivariate normal distribution for any covariance matrix?

Fundamental results for covariance matrices
Let Σ be a covariance matrix. Since Σ is symmetric, there exists an orthogonal matrix C (C'C = CC' = I) such that C'ΣC = D and Σ = CDC', where D is a diagonal matrix. Since Σ is also nonnegative-definite, there exists a symmetric matrix B such that BB = Σ (take B = CD^(1/2)C'). If X has independent components with variance 1, then Y = BX has covariance matrix Σ.
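The construction of B from the spectral decomposition can be sketched numerically (the 2×2 matrix Σ below is a hypothetical example, not one from the lecture):

```python
import numpy as np

# Hypothetical covariance matrix: symmetric and positive definite.
Sigma = np.array([[4.0, 2.0],
                  [2.0, 3.0]])

# Spectral decomposition: Sigma = C D C' with C orthogonal, D diagonal.
eigvals, C = np.linalg.eigh(Sigma)
D_sqrt = np.diag(np.sqrt(eigvals))

# Symmetric square root B = C D^(1/2) C', so that B B = Sigma.
B = C @ D_sqrt @ C.T
```

With X having independent unit-variance components, Cov(BX) = B I B' = BB = Σ, which is exactly the property the slide uses.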

The multivariate normal distribution - a second definition
- A random vector is normal if and only if it has a characteristic function of the form φ(t) = exp(i t'μ − t'Σt/2), where Σ is a nonnegative-definite, symmetric matrix and μ is a vector of constants.
Proof of the equivalence of definitions I and II: Let X ∈ N(μ, Σ) according to definition I, and set Z = t'X. Then E(Z) = t'μ and Var(Z) = t'Σt, and φ_Z(1) gives the desired expression. Conversely, let X ∈ N(μ, Σ) according to definition II. Then we can derive the characteristic function of any linear combination of its components and show that it is normally distributed.

The multivariate normal distribution - a third definition
- Let Y be normal with independent standard normal components, and set X = μ + BY, where BB' = Σ. Then X has density
f(x) = (2π)^(−n/2) |Σ|^(−1/2) exp(−(x − μ)'Σ⁻¹(x − μ)/2),
provided that the determinant |Σ| is non-zero.
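The density formula can be checked directly: for a diagonal Σ the joint density must factor into a product of univariate normal densities. A minimal sketch (the values of μ, Σ, and the evaluation point x are arbitrary assumptions):

```python
import math
import numpy as np

def mvn_pdf(x, mu, Sigma):
    # f(x) = (2 pi)^(-n/2) |Sigma|^(-1/2) exp(-(x - mu)' Sigma^{-1} (x - mu) / 2)
    n = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.inv(Sigma) @ diff
    return math.exp(-0.5 * quad) / math.sqrt((2 * math.pi) ** n * np.linalg.det(Sigma))

def norm1(v, m, s):
    # Univariate normal density.
    return math.exp(-(v - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

# Diagonal Sigma: components are independent, so the joint pdf factors.
mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 9.0])
x = np.array([0.5, 0.0])
joint = mvn_pdf(x, mu, Sigma)
product = norm1(0.5, 1.0, 2.0) * norm1(0.0, -2.0, 3.0)
```

The two numbers agree, illustrating both the density formula and the independence of uncorrelated normal components discussed later.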

The multivariate normal distribution - a fourth definition
- Let Y be normal with independent standard normal components, and set X = μ + BY. Then X is said to be a normal random vector.

The multivariate normal distribution - conditional distributions
- All conditional distributions in a multivariate normal vector are normal.
- The conditional distribution of each component is equal to that of a linear combination of the other components plus a random error.

The multivariate normal distribution - conditional distributions and optimal predictors
- For any random vector X it is known that E(X_n | X_1, …, X_n−1) is an optimal predictor of X_n based on X_1, …, X_n−1, and that X_n = E(X_n | X_1, …, X_n−1) + ε, where ε is uncorrelated with the conditional expectation.
- For normal random vectors X, the optimal predictor E(X_n | X_1, …, X_n−1) is a linear expression in X_1, …, X_n−1.

The multivariate normal distribution - calculation of conditional distributions
- Let X ∈ N(0, Σ) for a given 3×3 covariance matrix Σ. Determine the conditional distribution of X_3 given X_1 and X_2.
- Set Z = aX_1 + bX_2 + c, and minimize the variance of the prediction error Z − X_3.
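The minimization above reduces to solving a small linear system. A sketch of the computation (the covariance matrix on the original slide is not recoverable, so the Σ below is a hypothetical positive-definite stand-in; c = 0 because the means are 0):

```python
import numpy as np

# Hypothetical covariance matrix for (X1, X2, X3).
Sigma = np.array([[2.0, 1.0, 1.0],
                  [1.0, 3.0, 2.0],
                  [1.0, 2.0, 4.0]])

S11 = Sigma[:2, :2]   # covariance of (X1, X2)
s13 = Sigma[:2, 2]    # covariances of (X1, X2) with X3

# Coefficients of Z = a*X1 + b*X2 that minimize Var(Z - X3):
# setting the gradient to zero gives S11 @ [a, b] = s13.
a, b = np.linalg.solve(S11, s13)

# The minimized variance Var(X3 - Z) = Var(X3) - s13' S11^{-1} s13
# is the conditional variance of X3 given X1, X2.
resid_var = Sigma[2, 2] - s13 @ np.linalg.solve(S11, s13)

# Check: the prediction error is uncorrelated with each predictor.
err_cov = s13 - S11 @ np.array([a, b])
```

For this Σ the conditional distribution of X_3 given (X_1, X_2) is normal with mean aX_1 + bX_2 and variance resid_var.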

Probability theory 2008 The multivariate normal vector - uncorrelated and independent components The components of a normal random vector are independent if and only if they are uncorrelated

The multivariate normal distribution - orthogonal transformations
- Let X be a normal random vector with independent standard normal components, and let C be an orthogonal matrix.
- Then Y = CX has independent, standard normal components.
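The covariance part of this claim is pure matrix algebra: Cov(CX) = C I C' = I for any orthogonal C. A minimal sketch with a rotation matrix (the angle 0.7 is an arbitrary choice):

```python
import numpy as np

# A rotation matrix is orthogonal; any orthogonal C would do.
theta = 0.7
C = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# If Cov(X) = I, then Cov(Y) = Cov(CX) = C I C' = C C'.
cov_Y = C @ np.eye(2) @ C.T
```

Since Y is again normal (it is a linear transformation of a normal vector) with identity covariance, uncorrelatedness implies independence, giving independent standard normal components.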

Quadratic forms of the components of a multivariate normal distribution – one-way analysis of variance
Let X_ij, i = 1, …, k, j = 1, …, n_i, be k samples of observations. Then the total variation in the X-values can be decomposed as follows:
Σ_ij (X_ij − X̄)² = Σ_ij (X_ij − X̄_i)² + Σ_i n_i (X̄_i − X̄)²,
where X̄_i is the mean of sample i and X̄ is the grand mean.
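The decomposition is an algebraic identity and can be verified on any data set. A sketch with hypothetical data (k = 3 small groups, chosen arbitrarily):

```python
# Verify: sum_ij (x_ij - xbar)^2
#       = sum_ij (x_ij - xbar_i)^2 + sum_i n_i (xbar_i - xbar)^2
groups = [[1.0, 2.0, 3.0], [2.0, 4.0], [5.0, 6.0, 7.0, 8.0]]

all_x = [x for g in groups for x in g]
n = len(all_x)
grand_mean = sum(all_x) / n

ss_total = sum((x - grand_mean) ** 2 for x in all_x)
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
```

The within-groups and between-groups sums of squares add up exactly to the total sum of squares about the grand mean.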

Decomposition theorem for nonnegative-definite quadratic forms
Let x'x = Q_1 + … + Q_p, where Q_1, …, Q_p are nonnegative-definite quadratic forms in x_1, …, x_n with ranks r_1, …, r_p satisfying r_1 + … + r_p = n. Then there exists an orthogonal matrix C such that, with x = Cy (y = C'x), each Q_i is a sum of squares of r_i of the y-variables, and no y-variable appears in more than one Q_i.

Decomposition theorem for nonnegative-definite quadratic forms (Cochran's theorem)
Let X_1, …, X_n be independent and N(0, σ²), and suppose that
X_1² + … + X_n² = Q_1 + … + Q_p,
where Q_1, …, Q_p are nonnegative-definite quadratic forms with ranks r_1, …, r_p satisfying r_1 + … + r_p = n. Then there exists an orthogonal matrix C such that, with X = CY (Y = C'X), each Q_i is a sum of squares of r_i of the Y-variables. Furthermore, Q_1, …, Q_p are independent and σ²χ²-distributed with r_1, …, r_p degrees of freedom.
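The simplest instance of such a decomposition is Σ X_i² = nX̄² + Σ (X_i − X̄)², with ranks 1 and n − 1 (this is the decomposition behind the independence of the sample mean and the sample variance). A sketch verifying the algebraic identity on arbitrary hypothetical data:

```python
# Cochran-type decomposition for one sample:
#   sum x_i^2 = n * xbar^2 + sum (x_i - xbar)^2
# The two quadratic forms have ranks 1 and n - 1, which sum to n.
x = [0.3, -1.2, 2.5, 0.8, -0.4]
n = len(x)
xbar = sum(x) / n

q1 = n * xbar ** 2                       # rank 1
q2 = sum((xi - xbar) ** 2 for xi in x)   # rank n - 1
total = sum(xi ** 2 for xi in x)
```

By Cochran's theorem, for N(0, σ²) data Q_1 and Q_2 are independent and σ²χ²-distributed with 1 and n − 1 degrees of freedom.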

Quadratic forms of the components of a multivariate normal distribution – one-way analysis of variance
Let X_ij, i = 1, …, k, j = 1, …, n_i, be independent and N(μ, σ²). Then the total sum of squares can be decomposed into three quadratic forms,
Σ_ij X_ij² = nX̄² + Σ_i n_i (X̄_i − X̄)² + Σ_ij (X_ij − X̄_i)²,
which are independent and σ²χ²-distributed with 1, k−1, and n−k degrees of freedom, respectively.

Probability theory 2008 Exercises: Chapter V 5.1, 5.2, 5.6, 5.8, 5.14, 5.16, 5.17, 5.27