Conditional Probability on a joint discrete distribution


1 Conditional Probability on a joint discrete distribution
Given the joint pmf of X and Y, we want to find P(X = x | Y = y) and P(Y = y | X = x). These are the basis for defining conditional distributions. STA347

2 Definition Let X, Y be discrete random variables with joint pmf p_{X,Y}(x, y) and marginal pmfs p_X(x) and p_Y(y). If x is a number such that p_X(x) > 0, then the conditional pmf of Y given X = x is
p_{Y|X}(y | x) = p_{X,Y}(x, y) / p_X(x).
Is this a valid pmf? Similarly, the conditional pmf of X given Y = y is
p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y).
Note that from the above conditional pmf we get p_{X,Y}(x, y) = p_{X|Y}(x | y) p_Y(y). Summing both sides over all possible values of y we get
p_X(x) = Σ_y p_{X|Y}(x | y) p_Y(y).
This is an extremely useful application of the law of total probability. Note: if X, Y are independent random variables then p_{X|Y}(x | y) = p_X(x).
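As a concrete check of the definition, a conditional pmf can be computed directly from a joint pmf table. The joint pmf below is a made-up illustration, not one from the slides:

```python
# Hypothetical joint pmf of (X, Y), keyed as (x, y): probability.
# Values are illustrative only; they sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_x(x):
    # p_X(x) = sum over y of p_{X,Y}(x, y)
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def cond_y_given_x(y, x):
    # p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x), defined when p_X(x) > 0
    return joint.get((x, y), 0.0) / marginal_x(x)

# A conditional pmf is a valid pmf: it sums to 1 over y.
total = sum(cond_y_given_x(y, 1) for y in (0, 1))
print(total)  # ≈ 1.0
```

This answers the slide's "Is this a valid pmf?" question numerically: dividing the row p_{X,Y}(x, ·) by p_X(x) renormalizes it to sum to 1.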

3 Example Suppose we roll a fair die; whatever number comes up, we toss a coin that many times. What is the distribution of the number of heads? Let X = number of heads, Y = number on the die. We know that
p_Y(y) = 1/6 for y = 1, 2, ..., 6,
and we want to find p_X(x). The conditional probability function of X given Y = y is
p_{X|Y}(x | y) = C(y, x) (1/2)^y for x = 0, 1, ..., y.
By the Law of Total Probability we have
p_X(x) = Σ_{y=1}^{6} C(y, x) (1/2)^y (1/6).
Possible values of x: 0, 1, 2, ..., 6.
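The sum above can be evaluated exactly with rational arithmetic; the code below follows the slide's formula p_X(x) = Σ_{y=1}^{6} C(y, x) (1/2)^y (1/6):

```python
from fractions import Fraction
from math import comb

# Exact pmf of X (number of heads) via the law of total probability.
def p_X(x):
    return sum(Fraction(comb(y, x), 2 ** y) * Fraction(1, 6)
               for y in range(1, 7))

pmf = {x: p_X(x) for x in range(7)}
print(pmf[0])             # 21/128
print(sum(pmf.values()))  # 1
```

The probabilities sum to 1 because for each fixed y, Σ_x C(y, x)(1/2)^y = 1.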

4 Conditional densities
If X, Y are jointly distributed continuous random variables, the conditional density function of Y given X = x is defined to be
f_{Y|X}(y | x) = f_{X,Y}(x, y) / f_X(x)
if f_X(x) > 0, and 0 otherwise. If X, Y are independent then f_{Y|X}(y | x) = f_Y(y). Also,
f_{X,Y}(x, y) = f_{Y|X}(y | x) f_X(x).
Integrating both sides over x we get
f_Y(y) = ∫ f_{Y|X}(y | x) f_X(x) dx.
This is a useful application of the law of total probability for the continuous case.
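One way to sanity-check the continuous law of total probability is to do the integral numerically for a model where f_Y is known in closed form. The model below is an assumed example, not one from the slides: X ~ Uniform(0, 1) and, given X = x, Y ~ Uniform(0, x), so f_{Y|X}(y | x) = 1/x for 0 < y < x, and the integral gives f_Y(y) = −ln y on (0, 1).

```python
import math

def f_Y(y, n=200000):
    # Midpoint Riemann sum for ∫_y^1 (1/x) · 1 dx,
    # i.e. ∫ f_{Y|X}(y|x) f_X(x) dx with f_X ≡ 1 on (0, 1).
    h = (1.0 - y) / n
    return sum(1.0 / (y + (k + 0.5) * h) for k in range(n)) * h

y = 0.3
print(f_Y(y), -math.log(y))  # both ≈ 1.20397
```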

5 Example Consider the joint density
Find the conditional density of X given Y and the conditional density of Y given X.

6 Conditional Expectation
For X, Y discrete random variables, the conditional expectation of Y given X = x is
E(Y | X = x) = Σ_y y p_{Y|X}(y | x)
and the conditional variance of Y given X = x is
V(Y | X = x) = E[(Y − E(Y | X = x))^2 | X = x] = E(Y^2 | X = x) − [E(Y | X = x)]^2,
where these are defined only if the sums converge absolutely. In general,
E(g(Y) | X = x) = Σ_y g(y) p_{Y|X}(y | x). STA347
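A minimal sketch of computing E(Y | X = x) and V(Y | X = x) from a joint pmf table; the table values are hypothetical, chosen only for illustration:

```python
# Hypothetical joint pmf, keyed as (x, y): probability.
joint = {(1, 0): 0.1, (1, 1): 0.3, (2, 0): 0.2, (2, 1): 0.4}

def cond_pmf_y(x):
    # p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x)
    px = sum(p for (xi, _), p in joint.items() if xi == x)
    return {yi: p / px for (xi, yi), p in joint.items() if xi == x}

def cond_mean(x):
    # E(Y | X = x) = Σ_y y p_{Y|X}(y|x)
    return sum(y * p for y, p in cond_pmf_y(x).items())

def cond_var(x):
    # V(Y | X = x) = Σ_y (y − E(Y|X=x))^2 p_{Y|X}(y|x)
    m = cond_mean(x)
    return sum((y - m) ** 2 * p for y, p in cond_pmf_y(x).items())

print(cond_mean(1))  # ≈ 0.75, since p_{Y|X}(1|1) = 0.3/0.4
print(cond_var(1))   # ≈ 0.1875
```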

7 For X, Y continuous random variables, the conditional expectation of Y given X = x is
E(Y | X = x) = ∫ y f_{Y|X}(y | x) dy
and the conditional variance of Y given X = x is
V(Y | X = x) = E[(Y − E(Y | X = x))^2 | X = x] = E(Y^2 | X = x) − [E(Y | X = x)]^2.
In general,
E(g(Y) | X = x) = ∫ g(y) f_{Y|X}(y | x) dy.

8 Example Suppose X, Y are continuous random variables with joint density function
Find E(X | Y = 2).

9 More on Conditional Expectation
Assume that E(Y | X = x) exists for every x in the range of X. Then E(Y | X) is a random variable: it takes the value E(Y | X = x) when X = x. The expectation of this random variable is E[E(Y | X)].
Theorem E[E(Y | X)] = E(Y). This is called the "Law of Total Expectation".
Proof (discrete case):
E[E(Y | X)] = Σ_x E(Y | X = x) p_X(x) = Σ_x Σ_y y p_{Y|X}(y | x) p_X(x) = Σ_y y Σ_x p_{X,Y}(x, y) = Σ_y y p_Y(y) = E(Y).
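The theorem can be verified numerically on any small joint pmf; the one below is a made-up example:

```python
# Check E[E(Y|X)] = E(Y) on a hypothetical joint pmf, keyed (x, y): p.
joint = {(1, 0): 0.1, (1, 1): 0.3, (2, 0): 0.2, (2, 1): 0.4}

p_X = {}
for (x, y), p in joint.items():
    p_X[x] = p_X.get(x, 0.0) + p

def cond_mean_y(x):
    # E(Y | X = x)
    return sum(y * p / p_X[x] for (xi, y), p in joint.items() if xi == x)

lhs = sum(cond_mean_y(x) * px for x, px in p_X.items())  # E[E(Y|X)]
rhs = sum(y * p for (_, y), p in joint.items())          # E(Y)
print(lhs, rhs)  # both ≈ 0.7
```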

10 Example Suppose we roll a fair die; whatever number comes up, we toss a coin that many times. What is the expected number of heads?
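A worked answer in the setup of the earlier die-and-coin example: since X | Y = y is Binomial(y, 1/2), E(X | Y = y) = y/2, and the Law of Total Expectation gives E(X) = E[E(X | Y)] = E(Y)/2 with Y uniform on {1, ..., 6}. The fraction arithmetic below carries this out exactly:

```python
from fractions import Fraction

# E(Y) for a fair die, then E(X) = E(Y)/2 by the Law of Total Expectation.
E_Y = sum(Fraction(y, 6) for y in range(1, 7))  # 7/2
E_X = E_Y / 2
print(E_X)  # 7/4
```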

11 Theorem For random variables X, Y,
V(Y) = V[E(Y | X)] + E[V(Y | X)].
Proof:
E[V(Y | X)] = E[E(Y^2 | X) − (E(Y | X))^2] = E(Y^2) − E[(E(Y | X))^2],
V[E(Y | X)] = E[(E(Y | X))^2] − (E[E(Y | X)])^2 = E[(E(Y | X))^2] − (E(Y))^2.
Adding the two equations gives E(Y^2) − (E(Y))^2 = V(Y).
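The decomposition can be checked numerically on a small joint pmf (hypothetical values again):

```python
# Verify V(Y) = V[E(Y|X)] + E[V(Y|X)] on a hypothetical joint pmf.
joint = {(1, 0): 0.1, (1, 1): 0.3, (2, 0): 0.2, (2, 1): 0.4}

p_X = {}
for (x, _), p in joint.items():
    p_X[x] = p_X.get(x, 0.0) + p

def cond_moments(x):
    # Returns (E(Y|X=x), V(Y|X=x)).
    pmf = {y: p / p_X[x] for (xi, y), p in joint.items() if xi == x}
    m = sum(y * q for y, q in pmf.items())
    v = sum((y - m) ** 2 * q for y, q in pmf.items())
    return m, v

means = {x: cond_moments(x)[0] for x in p_X}
e_of_var = sum(cond_moments(x)[1] * px for x, px in p_X.items())
e_mean = sum(means[x] * px for x, px in p_X.items())
var_of_mean = sum((means[x] - e_mean) ** 2 * px for x, px in p_X.items())

E_Y = sum(y * p for (_, y), p in joint.items())
V_Y = sum((y - E_Y) ** 2 * p for (_, y), p in joint.items())
print(abs(V_Y - (var_of_mean + e_of_var)) < 1e-9)  # True
```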

12 Example Let X ~ Geometric(p).
Given X = x, let Y conditionally have the Binomial(x, p) distribution. Scenario: we perform Bernoulli trials with success probability p until the first success, so X is the number of trials. We then perform x more trials and count the number of successes, which is Y. Find E(Y) and V(Y).
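The two theorems above give E(Y) = E[E(Y | X)] = E(pX) = p · (1/p) = 1 and V(Y) = V(pX) + E[Xp(1 − p)] = (1 − p) + (1 − p) = 2(1 − p), using E(X) = 1/p and V(X) = (1 − p)/p² when X counts the trials. The code below checks this numerically for one concrete p; the truncation point N is an implementation detail, not part of the problem:

```python
from fractions import Fraction

# X counts trials to first success: P(X = x) = (1-p)^(x-1) p, x = 1, 2, ...
# Given X = x: E(Y|X=x) = xp and E(Y^2|X=x) = xp(1-p) + (xp)^2.
p = Fraction(1, 2)
N = 60  # truncation point; the tail mass beyond N is negligible

E_Y = sum((1 - p) ** (x - 1) * p * (p * x) for x in range(1, N + 1))
E_Y2 = sum((1 - p) ** (x - 1) * p * (x * p * (1 - p) + (x * p) ** 2)
           for x in range(1, N + 1))
V_Y = E_Y2 - E_Y ** 2
print(float(E_Y), float(V_Y))  # ≈ 1.0 and ≈ 2(1-p) = 1.0
```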

