Joint Probability Distribution

The joint probability distribution function of X and Y is denoted by f_XY(x,y). The marginal probability distribution function of X, f_X(x), is obtained by

(i) summing the probabilities corresponding to each y value at the given x (discrete case):

f_X(x) = Σ_y f_XY(x,y)
(ii) integrating y out of the joint pdf (continuous case):

f_X(x) = ∫_y f_XY(x,y) dy

The conditional probability distribution function of X given Y is denoted by f_X|Y(x|y):

f_X|Y(x|y) = f_XY(x,y) / f_Y(y)

[We similarly define f_Y|X(y|x).] Random variables X and Y are independent if and only if f_X|Y(x|y) = f_X(x) for all x and y.
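The discrete-case marginal and conditional formulas can be sketched in Python. The joint pmf below is a made-up illustration, not one from the slides:

```python
# Marginal and conditional pmfs from a discrete joint pmf.
# The joint pmf values here are illustrative assumptions.
f_XY = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_X(x):
    """f_X(x) = sum over y of f_XY(x, y)."""
    return sum(p for (xi, yi), p in f_XY.items() if xi == x)

def marginal_Y(y):
    """f_Y(y) = sum over x of f_XY(x, y)."""
    return sum(p for (xi, yi), p in f_XY.items() if yi == y)

def conditional_X_given_Y(x, y):
    """f_{X|Y}(x|y) = f_XY(x, y) / f_Y(y)."""
    return f_XY[(x, y)] / marginal_Y(y)

print(marginal_X(0))                 # ≈ 0.30
print(conditional_X_given_Y(0, 1))   # 0.20 / 0.60 ≈ 0.333
```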
Joint Probability Distribution

Covariance of two random variables X and Y:

Cov(X,Y) ≡ E[(X − μ_X)(Y − μ_Y)]
= E(XY − μ_X Y − μ_Y X + μ_X μ_Y)
= E(XY) − μ_X E(Y) − μ_Y E(X) + μ_X μ_Y
= E(XY) − μ_X μ_Y − μ_X μ_Y + μ_X μ_Y
= E(XY) − μ_X μ_Y
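The chain of equalities above can be checked numerically. The joint pmf used here is a made-up example:

```python
# Check the identity Cov(X,Y) = E(XY) - mu_X * mu_Y
# on a small, made-up discrete joint distribution.
joint = {(-1, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (2, 1): 0.25}

mu_X = sum(x * p for (x, y), p in joint.items())
mu_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

# Definition: expected product of deviations from the means.
cov_def = sum((x - mu_X) * (y - mu_Y) * p for (x, y), p in joint.items())
# Shortcut derived on the slide.
cov_short = E_XY - mu_X * mu_Y

print(cov_def, cov_short)  # the two agree
```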
The Coefficient of Correlation between two random variables X and Y:

ρ(X,Y) ≡ Cov(X,Y) / (σ_X σ_Y)

Two random variables X and Y are uncorrelated if ρ(X,Y) = 0, or equivalently if E(XY) = μ_X μ_Y.
An important result: Suppose that X and Y are two random variables with means μ_X, μ_Y and standard deviations σ_X, σ_Y respectively. Then for Z ≡ aX + bY:

E(Z) = a μ_X + b μ_Y
Var(Z) = a² Var(X) + b² Var(Y) + 2ab Cov(X,Y)
       = a² σ_X² + b² σ_Y² + 2ab ρ_XY σ_X σ_Y
Var(Z) = a² Var(X) + b² Var(Y) + 2ab Cov(X,Y) = a² σ_X² + b² σ_Y² + 2ab ρ_XY σ_X σ_Y

where ρ_XY is the coefficient of correlation between X and Y. If ρ_XY = −1, then Var(Z) = (a σ_X − b σ_Y)².
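A small exact check of the variance formula, including the ρ_XY = −1 collapse, on a made-up two-point joint distribution (Y = 1 − X gives ρ_XY = −1; a and b are arbitrary):

```python
import math

# Made-up joint pmf with Y = 1 - X, so rho_XY = -1.
joint = {(0, 1): 0.5, (1, 0): 0.5}
a, b = 2.0, 3.0

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_X, mu_Y = E(lambda x, y: x), E(lambda x, y: y)
var_X = E(lambda x, y: (x - mu_X) ** 2)
var_Y = E(lambda x, y: (y - mu_Y) ** 2)
cov = E(lambda x, y: (x - mu_X) * (y - mu_Y))

# Variance of Z = aX + bY computed directly from its definition...
var_Z_direct = E(lambda x, y: (a * x + b * y - (a * mu_X + b * mu_Y)) ** 2)
# ...and from the formula on the slide.
var_Z_formula = a**2 * var_X + b**2 * var_Y + 2 * a * b * cov

# With rho_XY = -1 the formula collapses to (a*sigma_X - b*sigma_Y)^2.
sigma_X, sigma_Y = math.sqrt(var_X), math.sqrt(var_Y)
collapsed = (a * sigma_X - b * sigma_Y) ** 2
```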
Joint Probability Distribution

Recall: Two random variables X and Y are independent if their joint p.d.f. f_XY(x,y) is the product of the respective marginal p.d.f.s f_X(x) and f_Y(y). That is,

f_XY(x,y) = f_X(x) · f_Y(y)
Theorem: Independence of two random variables X and Y implies that they are uncorrelated (but the converse is not always true).
Proof: Write f_XY(x,y) = g(x)h(y), where g and h are the marginal pdfs of X and Y. Then

E(XY) = ∫∫ xy f_XY(x,y) dx dy
      = ∫∫ xy g(x)h(y) dx dy
      = (∫ x g(x) dx) ∫ y h(y) dy
      = μ_X ∫ y h(y) dy
      = μ_X μ_Y

Hence Cov(X,Y) = E(XY) − μ_X μ_Y = 0.
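The failure of the converse can be shown with a classic counterexample (not one from the slides): X uniform on {−1, 0, 1} with Y = X² is uncorrelated with X yet clearly dependent on it:

```python
# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated but dependent.
pmf_X = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

mu_X = sum(x * p for x, p in pmf_X.items())          # 0
mu_Y = sum(x**2 * p for x, p in pmf_X.items())       # 2/3
E_XY = sum(x * x**2 * p for x, p in pmf_X.items())   # E[X^3] = 0

cov = E_XY - mu_X * mu_Y   # 0, so X and Y are uncorrelated.
# Yet knowing X = 0 fixes Y = 0, so f_{Y|X} != f_Y: not independent.
```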
The Normal Distribution

A continuous distribution with the pdf:

f(x) = [1/(σ_x √(2π))] e^{−(1/2)[(x − μ_x)/σ_x]²}

For the standard normal variable Z,

f(z) = [1/√(2π)] e^{−(1/2) z²}
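A direct transcription of the pdf into Python, useful for sanity-checking values:

```python
import math

def normal_pdf(x, mu, sigma):
    """f(x) = 1/(sigma*sqrt(2*pi)) * exp(-0.5*((x - mu)/sigma)^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def std_normal_pdf(z):
    """Standard normal: mu = 0, sigma = 1."""
    return normal_pdf(z, 0.0, 1.0)

print(std_normal_pdf(0))  # 1/sqrt(2*pi) ≈ 0.3989
```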
Suppose that X and Y are two random variables with means μ_X, μ_Y and standard deviations σ_X, σ_Y respectively. Also assume that both X and Y are normally distributed. Then if W ≡ aX + bY,
W ~ Normal(μ_W, σ_W²) with

μ_W = a μ_X + b μ_Y
σ_W² = a² σ_X² + b² σ_Y² + 2ab ρ_XY σ_X σ_Y

where ρ_XY is the relevant correlation coefficient.
Message: A linear combination of two or more jointly normally distributed random variables (independent or not) has a normal distribution as well.
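A Monte Carlo sanity check of the mean and variance of W = aX + bY for correlated standard normals. The construction Y = ρX + √(1 − ρ²)ε and all constants below are illustrative choices, not from the slides:

```python
import math
import random

random.seed(42)
a, b, rho = 2.0, -1.0, 0.6
n = 200_000

# Correlated standard normals via Y = rho*X + sqrt(1 - rho^2)*eps,
# with eps independent of X (a standard construction).
ws = []
for _ in range(n):
    x = random.gauss(0, 1)
    y = rho * x + math.sqrt(1 - rho**2) * random.gauss(0, 1)
    ws.append(a * x + b * y)

mean_w = sum(ws) / n
var_w = sum((w - mean_w) ** 2 for w in ws) / n

# Theory, with sigma_X = sigma_Y = 1 and mu_X = mu_Y = 0:
var_theory = a**2 + b**2 + 2 * a * b * rho  # = 2.6 here
```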
The χ² distribution: Consider Z ~ Normal(0,1) and let Y = Z². Then Y has a χ² distribution with 1 degree of freedom (d.o.f.). We write it as Y ~ χ²(1).
Consider Z_1, Z_2, …, Z_n independent random variables, each ~ Normal(0,1). Then their sum of squares Σ Z_i² has a χ² distribution with d.o.f. = n. That is, Σ Z_i² ~ χ²(n).
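A simulation sketch: χ²(n) variates built as sums of squared standard normals should have mean n (and variance 2n). The seed and sample sizes are arbitrary choices:

```python
import random

random.seed(7)
n_dof, reps = 5, 100_000

# chi-square(n_dof) variates as sums of n_dof squared N(0,1) draws.
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(n_dof))
           for _ in range(reps)]

mean_hat = sum(samples) / reps                            # theory: n_dof
var_hat = sum((s - mean_hat) ** 2 for s in samples) / reps  # theory: 2*n_dof
```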
Consider two independent random variables X ~ Normal(0,1) and Y ~ χ²(n). Then the variable W ≡ X/√(Y/n) has a t-distribution with d.o.f. = n. That is, W ~ t(n).
An Application

Consider X ~ Normal(μ, σ²). Then X̄ ~ Normal(μ, σ²/n). More generally, if X has mean μ and variance σ², then X̄ ~ Normal(μ, σ²/n) approximately if n is 'large' (CLT).

Then the variable w ≡ (X̄ − μ)/(s/√n) has a t-distribution with d.o.f. = n − 1, where s² is an unbiased estimator of σ².
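A sketch of the t-statistic computation on a small sample; the data values and hypothesized mean below are made up for illustration:

```python
import math

# Made-up sample and hypothesized mean mu_0 (illustrative only).
data = [4.8, 5.1, 5.4, 4.9, 5.3, 5.0]
mu_0 = 5.0
n = len(data)

x_bar = sum(data) / n
# s^2 uses the n-1 divisor, making it an unbiased estimator of sigma^2.
s = math.sqrt(sum((x - x_bar) ** 2 for x in data) / (n - 1))

# t-statistic: ~ t(n-1) under the null hypothesis that mu = mu_0.
t_stat = (x_bar - mu_0) / (s / math.sqrt(n))
```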
Suppose that X ~ χ²(m) and Y ~ χ²(n), and the variables X and Y are independent. Then the variable V ≡ (X/m)/(Y/n) has an F distribution with numerator d.o.f. = m and denominator d.o.f. = n: V ~ F(m,n).
Suppose that X ~ χ²(1) and Y ~ χ²(n), and the variables X and Y are independent. Then the variable V ≡ X/(Y/n) has an F distribution with numerator d.o.f. = 1 and denominator d.o.f. = n: V ~ F(1,n). Clearly, V = T² where T ~ t(n); that is, F(1,n) is the distribution of a squared t(n) variable.
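The relationship is an exact algebraic identity for the variates themselves, which a quick sketch confirms (the seed and d.o.f. are arbitrary):

```python
import math
import random

random.seed(1)
n = 8
x = random.gauss(0, 1)                               # N(0,1), so x^2 ~ chi-square(1)
y = sum(random.gauss(0, 1) ** 2 for _ in range(n))   # chi-square(n)

t = x / math.sqrt(y / n)   # the t(n) construction from the earlier slide
v = x**2 / (y / n)         # the F(1,n) construction from this slide

print(abs(t**2 - v))  # essentially 0: the F(1,n) variate is the squared t(n) variate
```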