

Chapter 4 Multivariate Normal Distribution

4.1 Random Vector

Random variable: X.
Random vector: x = (X_1, …, X_p)', where X_1, …, X_p are random variables.

A. Cumulative Distribution Function (c.d.f.)

Random variable: F(x) = P(X ≤ x).

Random vector: F(x) = F(x_1, …, x_p) = P(X_1 ≤ x_1, …, X_p ≤ x_p).

Marginal distributions:
F(x_1) = P(X_1 ≤ x_1) = P(X_1 ≤ x_1, X_2 < ∞, …, X_p < ∞) = F(x_1, ∞, …, ∞)
F(x_1, x_2) = P(X_1 ≤ x_1, X_2 ≤ x_2) = F(x_1, x_2, ∞, …, ∞)

B. Density

Random variable: the density f satisfies F(x) = ∫_{-∞}^{x} f(t) dt, i.e. f(x) = dF(x)/dx where F is differentiable.

Random vector: the joint density f(x_1, …, x_p) satisfies F(x_1, …, x_p) = ∫_{-∞}^{x_1} ⋯ ∫_{-∞}^{x_p} f(t_1, …, t_p) dt_p ⋯ dt_1, i.e. f = ∂^p F / (∂x_1 ⋯ ∂x_p).

C. Conditional Distribution

Random variable: the conditional probability of A given B (with P(B) > 0) is P(A | B) = P(A ∩ B) / P(B).

Random vector: the conditional density of X_1, …, X_q given X_{q+1} = x_{q+1}, …, X_p = x_p is

f(x_1, …, x_q | x_{q+1}, …, x_p) = h(x_1, …, x_p) / g(x_{q+1}, …, x_p),

where h is the joint density of X_1, …, X_p and g is the marginal density of X_{q+1}, …, X_p.

D. Independence

Random variables: let (X_1, X_2) ~ F(x_1, x_2). If F(x_1, x_2) = F_1(x_1) F_2(x_2) for all x_1, x_2, then X_1 and X_2 are said to be independent.

Random vector: let (X_1, …, X_p) ~ F(x_1, …, x_p). If F(x_1, …, x_p) = F_1(x_1) ⋯ F_p(x_p) for all x_1, …, x_p, then X_1, …, X_p are said to be mutually independent.

Random vectors: let x ~ F(x_1, …, x_p) and y ~ G(y_1, …, y_q). Then x and y are independent if their joint c.d.f. factorizes as H(x, y) = F(x) G(y) for all x, y.

E. Expectation

Random variable: E(X) = ∫ x f(x) dx (or Σ_x x p(x) in the discrete case).

Random vector: E(x) = (E(X_1), …, E(X_p))', the vector of componentwise expectations.

Some properties (A, B, C constant matrices of conformable dimensions):
E(AX) = A E(X)
E(AXB + C) = A E(X) B + C
E(AX + BY) = A E(X) + B E(Y)
E(tr(AX)) = tr(A E(X))
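These linearity properties hold exactly for sample means as well, which makes them easy to check numerically. A small sketch with NumPy (the dimensions and the matrices A, B are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 3
X = rng.normal(size=(n, p))            # n draws of a p-dimensional random vector
Y = rng.normal(size=(n, p))
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, -1.0]])       # 2 x 3 constant matrix
B = np.array([[0.5, 0.0, 1.0],
              [1.0, -1.0, 0.0]])

# E(AX) = A E(X): the sample mean of AX equals A times the sample mean of X
lhs = (X @ A.T).mean(axis=0)
rhs = A @ X.mean(axis=0)
assert np.allclose(lhs, rhs)

# E(AX + BY) = A E(X) + B E(Y)
lhs2 = (X @ A.T + Y @ B.T).mean(axis=0)
rhs2 = A @ X.mean(axis=0) + B @ Y.mean(axis=0)
assert np.allclose(lhs2, rhs2)
```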

F. Variance - Covariance

Random variable: Var(X) = E[(X - E(X))^2].

Random vector: Cov(x) = E[(x - E(x))(x - E(x))'] = Σ, a p × p matrix; more generally, Cov(x, y) = E[(x - E(x))(y - E(y))'].

Other properties (a, b constant vectors; A, B constant matrices):
Cov(x) = Cov(x, x)
Cov(Ax, By) = A Cov(x, y) B'
Cov(Ax) = A Cov(x) A'
Cov(x - a, y - b) = Cov(x, y)
Cov(x - a) = Cov(x)
E(xx') = Cov(x) + E(x) E(x)'
E[(x - a)(x - a)'] = Cov(x) + (E(x) - a)(E(x) - a)' for all a ∈ R^p
Assume that E(x) = μ and Cov(x) = Σ exist, and that A is a p × p constant matrix; then E(x'Ax) = tr(AΣ) + μ'Aμ.
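These matrix identities can be verified numerically, since they hold exactly when the sample mean and the (1/n) sample covariance play the roles of μ and Σ. A sketch with NumPy (the data and the matrix A are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 3
x = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))  # correlated sample
A = rng.normal(size=(p, p))

xbar = x.mean(axis=0)
S = (x - xbar).T @ (x - xbar) / n      # (1/n) sample covariance

# Cov(Ax) = A Cov(x) A'  (exact for the sample covariance as well)
y = x @ A.T
S_y = (y - y.mean(axis=0)).T @ (y - y.mean(axis=0)) / n
assert np.allclose(S_y, A @ S @ A.T)

# E(x'Ax) = tr(A Sigma) + mu' A mu, with S and xbar in place of Sigma and mu
quad_mean = np.einsum('ni,ij,nj->n', x, A, x).mean()
assert np.allclose(quad_mean, np.trace(A @ S) + xbar @ A @ xbar)
```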

G. Correlation

Random variables: Corr(X_i, X_j) = Cov(X_i, X_j) / sqrt(Var(X_i) Var(X_j)).

Random vector x = (X_1, …, X_p)': Corr(x) = (Corr(X_i, X_j)), a p × p matrix, which is called the correlation matrix of x.
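As a sketch of how the correlation matrix relates to the covariance matrix: dividing each entry of Σ by the product of the corresponding standard deviations reproduces NumPy's built-in correlation matrix (the data and dimensions below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(200, 4)) @ rng.normal(size=(4, 4))  # correlated sample

Sigma = np.cov(x, rowvar=False)        # sample covariance matrix
d = np.sqrt(np.diag(Sigma))            # componentwise standard deviations
R = Sigma / np.outer(d, d)             # R_ij = sigma_ij / (sigma_i * sigma_j)

assert np.allclose(R, np.corrcoef(x, rowvar=False))
assert np.allclose(np.diag(R), 1.0)    # diagonal of a correlation matrix is 1
```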

4.2 Multivariate Normal Distribution

Random variable: X ~ N(μ, σ²).

Definition of the Multivariate Normal Distribution

Standard normal vector: y = (Y_1, …, Y_q)', where Y_1, …, Y_q are i.i.d. N(0, 1); we write y ~ N_q(0, I_q).

Definition of the Multivariate Normal Distribution

A p × 1 random vector x has a multivariate normal distribution if x = μ + Ay for some constant p × 1 vector μ, some constant p × q matrix A, and y ~ N_q(0, I_q). Then E(x) = μ and Cov(x) = AA' = Σ, and we write x ~ N_p(μ, Σ).
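This construction, x = μ + Ay with Σ = AA' and y a standard normal vector, is also how multivariate normal samples are generated in practice, typically taking A to be the Cholesky factor of Σ. A sketch (the particular μ and Σ below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
L = np.linalg.cholesky(Sigma)          # Sigma = L L'

n = 200_000
y = rng.normal(size=(n, 2))            # rows are i.i.d. N_2(0, I_2)
x = mu + y @ L.T                       # each row is mu + L y ~ N_2(mu, Sigma)

# Sample mean and covariance should be close to mu and Sigma
assert np.allclose(x.mean(axis=0), mu, atol=0.02)
assert np.allclose(np.cov(x, rowvar=False), Sigma, atol=0.05)
```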

4.3 The bivariate normal distribution

The density function of x = (x_1, x_2)' is

p(x_1, x_2) = 1 / (2π σ_1 σ_2 sqrt(1 - ρ²)) · exp{ -1 / (2(1 - ρ²)) [ ((x_1 - μ_1)/σ_1)² - 2ρ ((x_1 - μ_1)/σ_1)((x_2 - μ_2)/σ_2) + ((x_2 - μ_2)/σ_2)² ] }.

Each contour { (x_1, x_2) : p(x_1, x_2) = c } is an ellipse.
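The bivariate normal density, with means μ_1, μ_2, standard deviations σ_1, σ_2 and correlation ρ, transcribes directly into code. A sketch (the function name and default arguments are my own):

```python
import numpy as np

def bvn_pdf(x1, x2, mu1=0.0, mu2=0.0, s1=1.0, s2=1.0, rho=0.0):
    """Bivariate normal density p(x1, x2) with correlation rho."""
    z1, z2 = (x1 - mu1) / s1, (x2 - mu2) / s2
    q = (z1**2 - 2 * rho * z1 * z2 + z2**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# Standard bivariate normal at the origin equals 1 / (2 pi)
assert np.isclose(bvn_pdf(0.0, 0.0), 1 / (2 * np.pi))

# With rho = 0 the density factors into two univariate normal densities
phi = lambda z: np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
assert np.isclose(bvn_pdf(1.0, -0.5), phi(1.0) * phi(-0.5))
```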

4.4 Marginal and Conditional Distributions

Theorem 4.4.1. Let x ~ N_p(μ, Σ) and let A be an s × p constant matrix. Then Ax ~ N_s(Aμ, AΣA').

Corollary 1. Partition x ~ N_p(μ, Σ) as x = (x_1', x_2')' with x_1: q × 1, and partition μ = (μ_1', μ_2')' and Σ into blocks Σ_11, Σ_12, Σ_21, Σ_22 conformably. Then x_1 ~ N_q(μ_1, Σ_11).

Corollary 2. All marginal distributions of x ~ N_p(μ, Σ) are still normal distributions.

The distribution of Ax is multivariate normal with mean Aμ and covariance matrix AΣA'.
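The mean and covariance of Ax are just matrix products, so they can be computed directly. A sketch with an illustrative μ, Σ and A (none of these numbers come from the slides):

```python
import numpy as np

mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 2.0, 0.5],
                  [0.0, 0.5, 1.0]])
A = np.array([[1.0, 1.0, 0.0],       # first row: X1 + X2
              [0.0, 1.0, -1.0]])     # second row: X2 - X3

# If x ~ N_3(mu, Sigma), then Ax ~ N_2(A mu, A Sigma A')
mean_Ax = A @ mu
cov_Ax = A @ Sigma @ A.T

assert np.allclose(mean_Ax, [1.0, 1.0])
assert np.allclose(cov_Ax, [[8.0, 2.5],
                            [2.5, 2.0]])
```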

Theorem. Let x be a p × 1 random vector. Then x has a multivariate normal distribution if and only if a'x follows a (univariate) normal distribution for every constant vector a ∈ R^p.

Theorem. Under the same partition as in Corollary 1 of Theorem 4.4.1, the conditional distribution of x_1 given x_2 = x_2 is

x_1 | x_2 ~ N(μ_{1·2}, Σ_{11·2}),

where μ_{1·2} = μ_1 + Σ_12 Σ_22^{-1} (x_2 - μ_2) and Σ_{11·2} = Σ_11 - Σ_12 Σ_22^{-1} Σ_21.

Example 4.4.2
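The conditional mean μ_1 + Σ_12 Σ_22^{-1}(x_2 - μ_2) and conditional covariance Σ_11 - Σ_12 Σ_22^{-1} Σ_21 translate into a few lines of NumPy. A sketch with an illustrative partitioned μ and Σ (not the slides' numbers):

```python
import numpy as np

# Partitioned mean and covariance; x1 is the first q = 2 coordinates
mu = np.array([0.0, 0.0, 1.0])
Sigma = np.array([[2.0, 0.5, 0.4],
                  [0.5, 1.0, 0.3],
                  [0.4, 0.3, 1.0]])
q = 2
mu1, mu2 = mu[:q], mu[q:]
S11, S12 = Sigma[:q, :q], Sigma[:q, q:]
S21, S22 = Sigma[q:, :q], Sigma[q:, q:]

x2 = np.array([2.0])                  # observed value of x2

# Conditional distribution of x1 given x2 = x2
cond_mean = mu1 + S12 @ np.linalg.solve(S22, x2 - mu2)
cond_cov = S11 - S12 @ np.linalg.solve(S22, S21)

assert np.allclose(cond_mean, [0.4, 0.3])
assert np.allclose(cond_cov, [[1.84, 0.38],
                              [0.38, 0.91]])
```

Using `np.linalg.solve` instead of explicitly inverting Σ_22 is the standard numerically stable choice.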

Example 1. Let x = (x_1, …, x_5)' be some body measurements of women, where
x_1: height
x_2: bust circumference
x_3: waist circumference
x_4: height below the neck
x_5: hip circumference

The correlation matrix R can be computed from the data. Take x^(1) = (x_1, x_2, x_3)', x^(2) = (x_4) and x^(3) = (x_5).

Homework 3.5. Please compute it directly, and also compute it by the recursion formula.

We see that

4.5 Independence

Theorem. Partition x ~ N_p(μ, Σ) as above into x_1 and x_2. Then x_1 and x_2 are independent if and only if Σ_12 = 0.

Corollary 1. In particular, two components X_i and X_j of a multivariate normal vector are independent if and only if Cov(X_i, X_j) = 0.