5.4 Joint Distributions and Independence
A joint probability function for discrete random variables X and Y is a nonnegative function f(x,y) giving the probability that (simultaneously) X takes the value x and Y takes the value y. That is,

P(X=x, Y=y) = f(x,y).

The function f(x,y) is a joint probability distribution, or joint probability mass function, of the discrete random variables X and Y if
1. f(x,y) ≥ 0 for all (x,y),
2. Σx Σy f(x,y) = 1,
3. P(X=x, Y=y) = f(x,y).
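As an illustrative check (not part of the original notes), a small Python sketch can verify conditions 1 and 2 for a joint pmf stored as a dictionary; the table values here are hypothetical.

```python
# Hypothetical joint pmf of (X, Y), stored as {(x, y): probability}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Condition 1: every f(x, y) is nonnegative.
assert all(p >= 0 for p in joint.values())

# Condition 2: the probabilities sum to 1 (allow for floating-point error).
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Condition 3 is the interpretation P(X=x, Y=y) = f(x, y),
# e.g. P(X=1, Y=0) is read directly from the table:
print(joint[(1, 0)])  # 0.3
```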
Example
A large insurance agency services a number of customers who have purchased both a homeowner's policy and an automobile policy from the agency. For each type of policy, a deductible amount must be specified. For an automobile policy, the choices are $100 and $250, whereas for a homeowner's policy, the choices are $0, $100, and $200. Suppose an individual with both types of policy is selected at random from the agency's files. Let X = the deductible amount on the auto policy and Y = the deductible amount on the homeowner's policy.
The joint probability table is

p(x,y)    y=0     y=100   y=200
x=100     0.20    0.10    0.20
x=250     0.05    0.15    0.30

Then p(100, 100) = P(X=100 and Y=100) = P($100 deductible on both policies) = 0.10.
The probability P(Y ≥ 100) is computed by summing the probabilities of all (x,y) pairs for which y ≥ 100:

P(Y ≥ 100) = p(100,100) + p(100,200) + p(250,100) + p(250,200)
           = 0.10 + 0.20 + 0.15 + 0.30 = 0.75.
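The same event probability can be computed mechanically from the table. Below is a minimal Python sketch using the deductible table above; the variable names are my own.

```python
# Joint pmf from the insurance example: keys are (auto deductible, home deductible).
p = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

# P(Y >= 100): sum f(x, y) over all pairs with y >= 100.
prob = sum(prob_xy for (x, y), prob_xy in p.items() if y >= 100)
print(round(prob, 2))  # 0.75
```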
From joint probabilities to marginal distributions
The joint probability function f(x,y) of X and Y contains more information than the individual probability functions of X and Y. The individual probability functions of X and Y can be obtained from their joint probability function; we call them the marginal distributions of X and Y.
Marginal Distributions
Definition: The individual probability functions for discrete random variables X and Y with joint probability function f(x,y) are called marginal probability functions. They are obtained by summing f(x,y) values over all possible values of the other variable. The marginal probability function for X is

fX(x) = Σy f(x,y),

and the marginal probability function for Y is

fY(y) = Σx f(x,y).
Marginal probabilities

p(x,y)    y=0     y=100   y=200   fX(x)
x=100     0.20    0.10    0.20    0.50
x=250     0.05    0.15    0.30    0.50
fY(y)     0.25    0.25    0.50    1.00
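A short sketch of the summation that produces the marginal row and column of the table above, again storing the joint pmf as a plain dictionary:

```python
from collections import defaultdict

p = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

# fX(x) = sum over y of f(x, y);  fY(y) = sum over x of f(x, y).
fX, fY = defaultdict(float), defaultdict(float)
for (x, y), prob in p.items():
    fX[x] += prob
    fY[y] += prob

print({x: round(v, 2) for x, v in fX.items()})  # {100: 0.5, 250: 0.5}
print({y: round(v, 2) for y, v in fY.items()})  # {0: 0.25, 100: 0.25, 200: 0.5}
```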
Conditional Distributions
For discrete random variables X and Y with joint probability function f(x,y), the conditional probability function of Y given X = x is

fY|X(y|x) = f(x,y) / fX(x),   provided fX(x) > 0.

Similarly, the conditional probability function of X given Y = y is

fX|Y(x|y) = f(x,y) / fY(y),   provided fY(y) > 0.
Example
Given the joint probabilities and marginal probabilities below, find the conditional distribution of Y given X = 0.

f(x,y)    x=0     x=1     x=2     fY(y)
y=0       1/6     2/9     1/36    15/36
y=1       1/3     1/6     0       1/2
y=2       1/12    0       0       1/12
fX(x)     7/12    7/18    1/36    1
Solution
fX(0) = 7/12 and f(0,0) = 1/6, f(0,1) = 1/3, f(0,2) = 1/12. Then
fY|X(0|0) = (1/6)/(7/12) = 2/7,
fY|X(1|0) = (1/3)/(7/12) = 4/7,
fY|X(2|0) = (1/12)/(7/12) = 1/7.
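The same division can be scripted; this sketch reproduces the conditional distribution of Y given X = 0 from the table above, using exact fractions.

```python
from fractions import Fraction as F

# Joint pmf from the example, keyed by (x, y).
f = {
    (0, 0): F(1, 6),  (1, 0): F(2, 9),  (2, 0): F(1, 36),
    (0, 1): F(1, 3),  (1, 1): F(1, 6),  (2, 1): F(0),
    (0, 2): F(1, 12), (1, 2): F(0),     (2, 2): F(0),
}

x = 0
fX_x = sum(prob for (xi, _), prob in f.items() if xi == x)   # fX(0) = 7/12

# fY|X(y | x) = f(x, y) / fX(x) for each y.
cond = {y: f[(x, y)] / fX_x for y in (0, 1, 2)}
print(cond)  # {0: Fraction(2, 7), 1: Fraction(4, 7), 2: Fraction(1, 7)}
```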
Statistical Independence
Suppose f(x|y) does not depend on y and f(y|x) does not depend on x. Then f(x|y) = fX(x) and f(y|x) = fY(y). Since f(x,y) = f(x|y) fY(y), it follows that f(x,y) = fX(x) fY(y), and X and Y are independent.
Definition
Discrete random variables X and Y are called independent if their joint probability function f(x,y) is the product of their marginal probability functions. That is, X and Y are statistically independent if f(x,y) = fX(x) fY(y) for all (x,y).
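One way to apply the definition numerically is to compute the marginals and check that every cell of the table factorizes. Below is a minimal sketch; the joint table used here is hypothetical, built as a product of marginals so that the check passes.

```python
from collections import defaultdict

def is_independent(joint, tol=1e-9):
    """Return True if f(x, y) == fX(x) * fY(y) for every cell of the table."""
    fX, fY = defaultdict(float), defaultdict(float)
    for (x, y), prob in joint.items():
        fX[x] += prob
        fY[y] += prob
    return all(abs(prob - fX[x] * fY[y]) < tol for (x, y), prob in joint.items())

# Hypothetical table constructed as a product of marginals, so the check should pass.
joint = {(x, y): px * py
         for x, px in {1: 0.4, 2: 0.6}.items()
         for y, py in {1: 0.3, 2: 0.7}.items()}
print(is_independent(joint))  # True
```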
Example
X and Y have the following joint probability function. Verify that X and Y are independent.

f(x,y)    x=1     x=2     x=3
y=1       0.16    0.08
y=2       0.24    0.12
Example
X and Y have the following joint probability function. Verify that X and Y are dependent.

f(x,y)    x=1     x=2     x=3
y=1       0.16    0.08
y=2       0.24    0.12
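For the dependence case, the cell-by-cell check fails as soon as one cell differs from the product of its marginals. The table in the sketch below is hypothetical, chosen only to illustrate such a failure.

```python
from collections import defaultdict

# Hypothetical joint table (not from the original example), chosen to be dependent.
joint = {(1, 1): 0.30, (1, 2): 0.10,
         (2, 1): 0.20, (2, 2): 0.40}

fX, fY = defaultdict(float), defaultdict(float)
for (x, y), prob in joint.items():
    fX[x] += prob
    fY[y] += prob

# To show dependence, it is enough to find one cell where f(x,y) != fX(x)*fY(y).
print(joint[(1, 1)], fX[1] * fY[1])  # 0.3 vs 0.2, so X and Y are dependent
```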