Independence of random variables

1 Independence of random variables
Definition: Random variables X and Y are independent if their joint distribution function factors into the product of their marginal distribution functions, i.e. FX,Y(x,y) = FX(x)FY(y) for all x, y.
Theorem: Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if the product of marginal densities for X and Y is a joint density for the pair (X,Y), i.e. fX,Y(x,y) = fX(x)fY(y).
Proof idea: differentiating FX,Y(x,y) = FX(x)FY(y) in x and y gives fX,Y(x,y) = fX(x)fY(y); conversely, integrating the factored density recovers the factored distribution function.
Also, if X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.
week 9
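As a numerical illustration of the factorization criterion, the sketch below assumes a hypothetical joint density f(x,y) = e^(-x-y) for x, y > 0 (two independent Exp(1) variables); the density, grid step, and cutoff are illustrative choices, not from the slides.

```python
import math

# Hypothetical joint density, assumed for illustration:
# f(x, y) = e^(-x-y) for x, y > 0 (two independent Exp(1) variables).
def f_joint(x, y):
    return math.exp(-x - y) if x > 0 and y > 0 else 0.0

def marginal_x(x, h=0.01, upper=20.0):
    # Midpoint-rule approximation of f_X(x) = integral of f(x, y) dy
    return sum(f_joint(x, (k + 0.5) * h) * h for k in range(int(upper / h)))

def marginal_y(y, h=0.01, upper=20.0):
    return sum(f_joint((k + 0.5) * h, y) * h for k in range(int(upper / h)))

# Independence: the joint density equals the product of the marginals.
x, y = 0.7, 1.3
assert abs(marginal_x(x) * marginal_y(y) - f_joint(x, y)) < 1e-3
```

The marginals here come out (numerically) as Exp(1) densities, so the product recovers the joint density, as the theorem requires.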

2 Example
Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is given on the slide. Are X and Y independent? What are their marginal distributions? Factorization is enough to establish independence, but we need to be careful about constant terms if the factors are to be the marginal probability functions.
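The caution about constants can be made concrete with a hypothetical joint pmf (assumed here, not the one on the slide): p(x, y) = (1/6)(1/2)^x (2/3)^y for non-negative integers x, y. It factors, so X and Y are independent, but the marginal pmfs carry the normalizing constants, not the raw factors.

```python
# Hypothetical joint pmf, assumed for illustration:
# p(x, y) = (1/6) * (1/2)^x * (2/3)^y for x, y = 0, 1, 2, ...
def p(x, y):
    return (1 / 6) * 0.5 ** x * (2 / 3) ** y

N = 200  # truncation point; the geometric tails beyond N are negligible

def p_x(x):
    return sum(p(x, y) for y in range(N))  # marginal of X

def p_y(y):
    return sum(p(x, y) for x in range(N))  # marginal of Y

# The factorization guarantees independence, but the marginals absorb
# the normalizing constants: p_X(x) = (1/2)^(x+1), p_Y(y) = (1/3)(2/3)^y.
assert abs(p_x(0) - 1 / 2) < 1e-9
assert abs(p_y(0) - 1 / 3) < 1e-9
assert abs(p(2, 3) - p_x(2) * p_y(3)) < 1e-12
```

Note that the raw factor (1/2)^x alone is not a pmf; summing the joint pmf over the other variable distributes the constant 1/6 correctly between the two marginals.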

3 Example and Important Comment
The joint density for X, Y is given on the slide. Are X, Y independent? Independence requires that the set of points where the joint density is positive be the Cartesian product of the set where fX(x) > 0 and the set where fY(y) > 0, i.e. the set where fX,Y(x,y) > 0 must be a (possibly infinite) rectangle.

4 Conditional densities
If X, Y are jointly continuous random variables, the conditional density function of Y given X is defined to be
fY|X(y|x) = fX,Y(x,y) / fX(x) if fX(x) > 0, and 0 otherwise.
If X, Y are independent then fY|X(y|x) = fY(y).
Also, fX,Y(x,y) = fY|X(y|x) fX(x). Integrating both sides over x we get
fY(y) = ∫ fY|X(y|x) fX(x) dx.
This is a useful application of the law of total probability in the continuous case.

5 Example
Consider the joint density given on the slide. Find the conditional density of X given Y and the conditional density of Y given X.

6 Properties of Expectations Involving Joint Distributions
For random variables X, Y and constants a, b:
E(aX + bY) = aE(X) + bE(Y).
Proof idea: this follows from the linearity of summation (discrete case) or integration (continuous case) against the joint distribution.
For independent random variables X, Y:
E(XY) = E(X)E(Y), whenever these expectations exist.

7 Covariance
Recall: Var(X+Y) = Var(X) + Var(Y) + 2E[(X - E(X))(Y - E(Y))].
Definition: For random variables X, Y with E(X), E(Y) < ∞, the covariance of X and Y is
Cov(X,Y) = E[(X - E(X))(Y - E(Y))].
Covariance measures whether X - E(X) and Y - E(Y) tend to have the same sign.
Claim: Cov(X,Y) = E(XY) - E(X)E(Y).
Proof: expanding the product, E[XY - XE(Y) - YE(X) + E(X)E(Y)] = E(XY) - E(X)E(Y).
Note: If X, Y are independent then E(XY) = E(X)E(Y), and hence Cov(X,Y) = 0.
week 9

8 Example
Suppose X, Y are discrete random variables with the joint probability function given by the table on the slide (the values shown include -1 and 1, entries such as 1/8, and the marginals pX(x), pY(y)). Find Cov(X,Y). Are X, Y independent?

9 Important Facts
- Independence of X, Y implies Cov(X,Y) = 0, but NOT vice versa.
- If X, Y are independent then Var(X+Y) = Var(X) + Var(Y).
- In general, Var(X+Y) = Var(X) + Var(Y) + 2Cov(X,Y).
- Cov(X,X) = Var(X).
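The "NOT vice versa" point is usually shown with the classic counterexample X uniform on {-1, 0, 1} and Y = X^2 (assumed here; the slides do not specify it). Exact rational arithmetic confirms Cov(X,Y) = 0 even though X and Y are clearly dependent, and that the variances of X and Y then add.

```python
from fractions import Fraction as F

# Classic counterexample: X uniform on {-1, 0, 1}, Y = X^2.
pX = {-1: F(1, 3), 0: F(1, 3), 1: F(1, 3)}
joint = {(x, x * x): q for x, q in pX.items()}

EX = sum(x * q for (x, y), q in joint.items())        # 0
EY = sum(y * q for (x, y), q in joint.items())        # 2/3
EXY = sum(x * y * q for (x, y), q in joint.items())   # E(X^3) = 0
assert EXY - EX * EY == 0                # Cov(X, Y) = 0 ...

# ... yet X and Y are dependent:
# P(X=1, Y=0) = 0, while P(X=1) * P(Y=0) = (1/3)(1/3).
pY0 = sum(q for (x, y), q in joint.items() if y == 0)
assert joint.get((1, 0), F(0)) != pX[1] * pY0

# Since Cov(X, Y) = 0, variances still add: Var(X+Y) = Var(X) + Var(Y).
VarX = sum(x * x * q for (x, y), q in joint.items()) - EX ** 2
VarY = sum(y * y * q for (x, y), q in joint.items()) - EY ** 2
ES = sum((x + y) * q for (x, y), q in joint.items())
ES2 = sum((x + y) ** 2 * q for (x, y), q in joint.items())
assert ES2 - ES ** 2 == VarX + VarY
```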

10 Example
Suppose Y ~ Binomial(n, p). Find Var(Y).
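The standard route here writes Y = X_1 + ... + X_n as a sum of n independent Bernoulli(p) indicators, so the variances add: Var(Y) = n Var(X_1) = np(1-p). The check below computes Var(Y) exactly from the binomial pmf for one assumed choice of n and p.

```python
from math import comb

# Var(Y) for Y ~ Binomial(n, p): Y is a sum of n independent
# Bernoulli(p) variables, so Var(Y) = n * p * (1 - p).
n, p = 10, 0.3  # illustrative parameter choice
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

EY = sum(k * q for k, q in enumerate(pmf))
EY2 = sum(k * k * q for k, q in enumerate(pmf))

assert abs(EY - n * p) < 1e-12                      # E(Y) = np
assert abs((EY2 - EY ** 2) - n * p * (1 - p)) < 1e-12  # Var(Y) = np(1-p)
```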

11 Properties of Covariance
For random variables X, Y, Z and constants a, b, c, d:
- Cov(aX + b, cY + d) = ac Cov(X,Y)
- Cov(X + Y, Z) = Cov(X,Z) + Cov(Y,Z)
- Cov(X,Y) = Cov(Y,X)

12 Correlation
Definition: For random variables X, Y, the correlation of X and Y is
ρ(X,Y) = Cov(X,Y) / sqrt(Var(X) Var(Y)),
whenever Var(X), Var(Y) ≠ 0 and all these quantities exist.
Claim: ρ(aX + b, cY + d) = ρ(X,Y) for constants a, c with ac > 0.
Proof: Cov(aX + b, cY + d) = ac Cov(X,Y), while Var(aX + b) = a²Var(X) and Var(cY + d) = c²Var(Y), so the factor ac / |ac| = 1 cancels; if ac < 0 the correlation changes sign.
This claim means that correlation is scale invariant.
week 9

13 Theorem
For random variables X, Y, whenever the correlation ρ(X,Y) exists it must satisfy -1 ≤ ρ(X,Y) ≤ 1.
Proof: Write σX = sqrt(Var(X)) and σY = sqrt(Var(Y)). Then
0 ≤ Var(X/σX + Y/σY) = 2 + 2ρ(X,Y) and 0 ≤ Var(X/σX - Y/σY) = 2 - 2ρ(X,Y),
which together give -1 ≤ ρ(X,Y) ≤ 1.
week 9

14 Interpretation of Correlation ρ
ρ(X,Y) is a measure of the strength and direction of the linear relationship between X and Y. If X, Y have non-zero variance, then ρ(X,Y) is well defined.
- If X, Y are independent, then ρ(X,Y) = 0. Note that this is not the only case in which ρ(X,Y) = 0!
- Y is a linearly increasing function of X if and only if ρ(X,Y) = 1.
- Y is a linearly decreasing function of X if and only if ρ(X,Y) = -1.

15 Example
Find Var(X - Y) and ρ(X,Y) if X, Y have the joint density given on the slide.

