Chapter 2: Multivariate Distributions
Math 6203, Fall 2009
Instructor: Ayona Chatterjee
Random Vector
Given a random experiment with sample space C, consider two random variables X1 and X2 that assign to each element c of C one and only one ordered pair of numbers X1(c) = x1 and X2(c) = x2. Then we say that (X1, X2) is a random vector. The space of (X1, X2) is the set of ordered pairs D = {(x1, x2) : X1(c) = x1 and X2(c) = x2 for some c in C}.
Cumulative Distribution Function
The joint cumulative distribution function of (X1, X2) is denoted by F_{X1,X2}(x1, x2) and is given by
F_{X1,X2}(x1, x2) = P[X1 ≤ x1, X2 ≤ x2].
A random vector (X1, X2) is a discrete random vector if its space D is finite or countable. A random vector (X1, X2) with space D is continuous if its cdf F_{X1,X2}(x1, x2) is continuous.
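As a rough numerical illustration (a sketch assuming NumPy; the simulated pair below is an invented example, not one from the slides), the joint cdf at a point can be estimated by the fraction of sampled pairs falling in the corresponding quadrant:

```python
import numpy as np

# A minimal sketch: estimate the joint cdf F(x1, x2) = P[X1 <= x1, X2 <= x2]
# from simulated draws. The pair (X1, X2) below is an arbitrary
# correlated example, not one from the slides.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100_000)
x2 = 0.5 * x1 + rng.normal(size=100_000)

def empirical_joint_cdf(a, b):
    # Fraction of sample points with X1 <= a and X2 <= b.
    return np.mean((x1 <= a) & (x2 <= b))

print(empirical_joint_cdf(0.0, 0.0))  # estimates F(0, 0)
```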
Probability Mass Function
For discrete random variables X1 and X2, the joint pmf is defined as
p_{X1,X2}(x1, x2) = P[X1 = x1, X2 = x2],
for (x1, x2) in the space D, with the pmf summing to 1 over D.
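For concreteness, here is a small invented pmf table (values chosen only for illustration; NumPy assumed) representing a discrete random vector. The same table is reused as a running example in the sketches below.

```python
import numpy as np

# Illustrative joint pmf table (values invented for the example):
# rows index x1 in {0, 1}, columns index x2 in {0, 1, 2}.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0

assert np.isclose(p.sum(), 1.0)  # a pmf must sum to one
print(p[1, 2])                   # p_{X1,X2}(1, 2) = P[X1=1, X2=2] = 1/8
```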
Probability Density Function
For a continuous random vector (X1, X2), the joint pdf f_{X1,X2}(x1, x2) is a nonnegative function satisfying
F_{X1,X2}(x1, x2) = ∫_{-∞}^{x2} ∫_{-∞}^{x1} f_{X1,X2}(w1, w2) dw1 dw2.
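A quick sanity check (a sketch assuming SciPy) that a candidate joint pdf integrates to 1 over its support, using the density f(x1, x2) = 8·x1·x2 on 0 < x1 < x2 < 1 that reappears in Section 2.5:

```python
from scipy.integrate import dblquad

# Check that f(x1, x2) = 8*x1*x2 on 0 < x1 < x2 < 1 is a valid pdf:
# it should integrate to 1. dblquad's integrand takes the inner
# variable first, i.e. func(x2, x1).
total, _ = dblquad(lambda x2, x1: 8 * x1 * x2,
                   0, 1,            # outer variable x1 in (0, 1)
                   lambda x1: x1,   # inner variable x2 from x1 ...
                   lambda x1: 1)    # ... up to 1
print(total)  # approximately 1.0
```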
Marginals
The marginal distributions can be obtained from the joint distribution. For discrete and continuous random vectors, respectively, the marginals are
p_{X1}(x1) = Σ_{x2} p_{X1,X2}(x1, x2)  and  f_{X1}(x1) = ∫_{-∞}^{∞} f_{X1,X2}(x1, x2) dx2,
and similarly for X2.
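In code, summing the running-example joint pmf table along each axis gives the marginals (a sketch assuming NumPy):

```python
import numpy as np

# Marginals of the illustrative joint pmf table: sum out the other
# variable along the corresponding axis.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0

p_x1 = p.sum(axis=1)  # marginal pmf of X1: [0.5, 0.5]
p_x2 = p.sum(axis=0)  # marginal pmf of X2: [0.375, 0.375, 0.25]
print(p_x1, p_x2)
```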
Expectation
Suppose (X1, X2) is of the continuous type and Y = g(X1, X2) for some real-valued function g. Then E(Y) exists if
∫∫ |g(x1, x2)| f_{X1,X2}(x1, x2) dx1 dx2 < ∞,
in which case E(Y) = ∫∫ g(x1, x2) f_{X1,X2}(x1, x2) dx1 dx2.
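In the discrete analogue, E[g(X1, X2)] is the g-weighted sum over the support; a sketch on the running example table (NumPy assumed):

```python
import numpy as np

# E[g(X1, X2)] for the discrete running example: weight g by the
# joint pmf and sum over the support.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x1_vals = np.array([0, 1])
x2_vals = np.array([0, 1, 2])

g = np.outer(x1_vals, x2_vals)  # here g(x1, x2) = x1 * x2
print(np.sum(g * p))            # E[X1 * X2] = 0.375
```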
Theorem
Let (X1, X2) be a random vector. Let Y1 = g1(X1, X2) and Y2 = g2(X1, X2) be random variables whose expectations exist. Then for any real numbers k1 and k2,
E(k1 Y1 + k2 Y2) = k1 E(Y1) + k2 E(Y2).
Moment Generating Function
Let X = (X1, X2)' be a random vector. If E(e^{t1·X1 + t2·X2}) exists for |t1| < h1 and |t2| < h2, where h1 and h2 are positive, the mgf is given by
M_{X1,X2}(t1, t2) = E(e^{t1·X1 + t2·X2}).
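A sketch of the joint mgf for the discrete running example, evaluated pointwise (NumPy assumed; for a finite support the mgf exists for all t1, t2):

```python
import numpy as np

# Joint mgf of the discrete running example:
# M(t1, t2) = E[exp(t1*X1 + t2*X2)], a sum over the support.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x1_vals = np.array([0, 1])
x2_vals = np.array([0, 1, 2])

def mgf(t1, t2):
    weights = np.exp(t1 * x1_vals[:, None] + t2 * x2_vals[None, :])
    return np.sum(weights * p)

print(mgf(0.0, 0.0))  # equals 1 at the origin, as any mgf must
```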
2.3 Conditional Distributions and Expectations
So far we know how to find marginals given the joint distribution. Now we look at conditional distributions: the distribution of one of the random variables when the other takes a specific value.
Conditional pmf
Let S_{X2} be the support of X2, and assume p_{X1}(x1) > 0. The conditional pmf of X2 given X1 = x1 is
p_{X2|X1}(x2 | x1) = p_{X1,X2}(x1, x2) / p_{X1}(x1),  x2 ∈ S_{X2}.
Thus the conditional probability is the joint divided by the marginal.
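On the running example table, each conditional pmf is simply a renormalized row (a sketch assuming NumPy):

```python
import numpy as np

# Conditional pmf of X2 given X1 = x1 for the running example:
# each joint row divided by the marginal p_{X1}(x1) (assumed > 0).
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
p_x1 = p.sum(axis=1)

cond = p / p_x1[:, None]  # row i is p_{X2|X1}(. | x1 = i)
print(cond[0])            # [0.25, 0.5, 0.25], sums to 1
```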
Conditional pdf
Let f_{X1,X2}(x1, x2) be the joint pdf and f_{X1}(x1) and f_{X2}(x2) the marginals of X1 and X2, respectively. Then the conditional pdf of X2 given X1 = x1 is
f_{X2|X1}(x2 | x1) = f_{X1,X2}(x1, x2) / f_{X1}(x1),  provided f_{X1}(x1) > 0.
Conditional Expectation and Variance
The conditional expectation of X2 given X1 = x1 is
E(X2 | x1) = ∫_{-∞}^{∞} x2 f_{X2|X1}(x2 | x1) dx2,
and the conditional variance is
Var(X2 | x1) = E{[X2 − E(X2 | x1)]² | x1} = E(X2² | x1) − [E(X2 | x1)]².
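For the discrete running example, the conditional mean and variance of X2 given each value of X1 can be read off the conditional pmf rows (a sketch assuming NumPy):

```python
import numpy as np

# Conditional mean and variance of X2 given X1 = x1, from the
# conditional pmf rows of the running example.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x2_vals = np.array([0, 1, 2])
cond = p / p.sum(axis=1, keepdims=True)

cond_mean = cond @ x2_vals                   # E(X2 | X1 = x1)
cond_var = cond @ x2_vals**2 - cond_mean**2  # Var(X2 | X1 = x1)
print(cond_mean)  # [1.0, 0.75]
print(cond_var)   # [0.5, 0.6875]
```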
Theorem
Let (X1, X2) be a random vector such that the variance of X2 is finite. Then
– E[E(X2 | X1)] = E(X2)
– Var[E(X2 | X1)] ≤ Var(X2)
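Both parts of the theorem can be verified numerically on the running example (a sketch assuming NumPy):

```python
import numpy as np

# Numerical check of the theorem on the running example:
# E[E(X2|X1)] = E(X2) and Var[E(X2|X1)] <= Var(X2).
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x2_vals = np.array([0, 1, 2])
p_x1, p_x2 = p.sum(axis=1), p.sum(axis=0)

cond_mean = (p / p_x1[:, None]) @ x2_vals
print(p_x1 @ cond_mean, p_x2 @ x2_vals)             # both 0.875
print(p_x1 @ cond_mean**2 - (p_x1 @ cond_mean)**2)  # 0.015625
print(p_x2 @ x2_vals**2 - (p_x2 @ x2_vals)**2)      # 0.609375
```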
2.4 The Correlation Coefficient
The correlation coefficient of X and Y is defined as
ρ = Cov(X, Y) / (σ1 σ2),
where Cov(X, Y) = E[(X − μ1)(Y − μ2)] is the covariance between X and Y, and σ1, σ2 are the standard deviations of X and Y.
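A sketch computing the covariance and ρ for the running example table (NumPy assumed):

```python
import numpy as np

# Covariance and correlation coefficient of the running example:
# rho = Cov(X1, X2) / (sigma_1 * sigma_2).
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x1_vals = np.array([0, 1])
x2_vals = np.array([0, 1, 2])
p_x1, p_x2 = p.sum(axis=1), p.sum(axis=0)

mu1, mu2 = p_x1 @ x1_vals, p_x2 @ x2_vals
cov = np.sum(np.outer(x1_vals, x2_vals) * p) - mu1 * mu2
var1 = p_x1 @ x1_vals**2 - mu1**2
var2 = p_x2 @ x2_vals**2 - mu2**2
print(cov, cov / np.sqrt(var1 * var2))  # -0.0625, about -0.16
```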
The Correlation Coefficient
Note that −1 ≤ ρ ≤ 1. For the bivariate case:
– If ρ = 1, the graph of the line y = a + bx (b > 0) contains all the probability of the distribution of X and Y.
– If ρ = −1, the same is true for a line y = a + bx with b < 0.
– In the non-extreme cases, ρ can be viewed as a measure of the intensity of the concentration of the probability of X and Y about a line y = a + bx.
Theorem
Suppose (X, Y) have a joint distribution with the variances of X and Y finite and positive. Denote the means and variances of X and Y by μ1, μ2 and σ1², σ2², respectively, and let ρ be the correlation coefficient between X and Y. If E(Y|X) is linear in X, then
E(Y|X) = μ2 + ρ(σ2/σ1)(X − μ1)  and  E[Var(Y|X)] = σ2²(1 − ρ²).
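A simulation sketch of this result for a bivariate normal pair, where E(Y|X) is known to be linear (all parameter values below are arbitrary illustration choices, and the least-squares slope is used as a stand-in for the slope of E(Y|X)):

```python
import numpy as np

# Simulation check: for a bivariate normal pair, the least-squares
# slope of Y on X should approach rho * sigma2 / sigma1.
rng = np.random.default_rng(1)
mu1, mu2, s1, s2, rho = 1.0, 2.0, 1.5, 2.0, 0.6
cov = [[s1**2, rho * s1 * s2],
       [rho * s1 * s2, s2**2]]
x, y = rng.multivariate_normal([mu1, mu2], cov, size=200_000).T

slope, intercept = np.polyfit(x, y, 1)  # least-squares line
print(slope, rho * s2 / s1)             # both close to 0.8
```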
2.5 Independent Random Variables
If the conditional pdf f_{2|1}(x2|x1) does not depend upon x1, then the marginal pdf of X2 equals the conditional pdf f_{2|1}(x2|x1).
Let the random variables X and Y have joint pdf f(x, y) and marginals f_X(x) and f_Y(y), respectively. The random variables X and Y are said to be independent if and only if
– f(x, y) = f_X(x) f_Y(y).
– A similar definition can be written for discrete random variables.
– Random variables that are not independent are said to be dependent.
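For the discrete case, this factorization criterion is easy to check on the running example table (a sketch assuming NumPy):

```python
import numpy as np

# Independence check for the discrete running example: X1 and X2
# are independent iff the joint pmf equals the outer product of
# its marginals.
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
product = np.outer(p.sum(axis=1), p.sum(axis=0))
print(np.allclose(p, product))  # False: this pair is dependent
```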
Theorem
Let the random variables X and Y have supports S1 and S2, respectively, and joint pdf f(x, y). Then X and Y are independent if and only if f(x, y) can be written as a product of a nonnegative function of x and a nonnegative function of y; that is, f(x, y) = g(x)h(y), where g(x) > 0 for x ∈ S1 and h(y) > 0 for y ∈ S2.
Note
In general, X and Y must be dependent if the space of positive probability density of X and Y is bounded by a curve that is neither a horizontal nor a vertical line.
Example: f(x, y) = 8xy, 0 < x < y < 1.
– S = {(x, y) : 0 < x < y < 1}, which is not a product space.
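Working the example out (a standard computation; the integrals below are not shown on the slide):

```latex
f_X(x) = \int_x^1 8xy \, dy = 4x(1 - x^2), \quad 0 < x < 1, \\
f_Y(y) = \int_0^y 8xy \, dx = 4y^3, \quad 0 < y < 1, \\
f_X(x)\, f_Y(y) = 16xy^3(1 - x^2) \neq 8xy = f(x, y).
```

So X and Y are dependent: although 8xy factors pointwise as (8x)(y), the support constraint 0 < x < y < 1 cannot be written as a product of an x-set and a y-set, which is exactly why the factorization theorem does not apply.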
Theorems
Let (X, Y) have joint cdf F(x, y), and let X and Y have marginal cdfs F_X(x) and F_Y(y), respectively. Then X and Y are independent if and only if
– F(x, y) = F_X(x)F_Y(y).
The random variables X and Y are independent if and only if the following condition holds:
– P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b)P(c < Y ≤ d)
– for all constants a < b and c < d.
Theorems
Suppose X and Y are independent and that E[u(X)] and E[v(Y)] exist. Then
– E[u(X)v(Y)] = E[u(X)]E[v(Y)].
Suppose the joint mgf M(t1, t2) exists for the random variables X and Y. Then X and Y are independent if and only if
– M(t1, t2) = M(t1, 0)M(0, t2);
– that is, the joint mgf is the product of the marginal mgfs.
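The mgf criterion can also be checked numerically; on the (dependent) running example the factorization fails, as expected (a sketch assuming NumPy):

```python
import numpy as np

# Checking the mgf factorization criterion on the dependent running
# example: M(t1, t2) should NOT equal M(t1, 0) * M(0, t2).
p = np.array([[1, 2, 1],
              [2, 1, 1]]) / 8.0
x1_vals = np.array([0, 1])
x2_vals = np.array([0, 1, 2])

def mgf(t1, t2):
    return np.sum(np.exp(t1 * x1_vals[:, None] + t2 * x2_vals[None, :]) * p)

print(mgf(1.0, 1.0), mgf(1.0, 0.0) * mgf(0.0, 1.0))  # about 5.84 vs 6.03
```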
Note
If X and Y are independent, then the correlation coefficient is zero. However, a zero correlation coefficient does not imply independence.
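A classic counterexample illustrating the last point (a sketch assuming NumPy; the construction Y = X² is standard but not from the slides):

```python
import numpy as np

# X symmetric about 0 and Y = X^2 have zero correlation
# (Cov(X, X^2) = E[X^3] = 0) but are clearly dependent.
rng = np.random.default_rng(2)
x = rng.normal(size=500_000)
y = x**2

print(np.corrcoef(x, y)[0, 1])  # near 0: uncorrelated
# Yet knowing X determines Y exactly, so X and Y are dependent.
```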