Chapter 5: Joint Probability Distributions
Joint, n. 1. a cheap, sordid place. 2. the movable place where two bones join. 3. one of the portions in which a carcass is divided by a butcher. 4. adj., shared or common to two or more.
Chapter 5A: Discrete Random Variables
Joint Probability Distributions
It is often useful (or necessary) to have more than one RV defined in a random experiment. Examples:
Polyethylene specs: X = melt point, Y = density
Dimensions of a part: X = length, Y = width
If X and Y are two RVs, the probability distribution that defines their simultaneous behavior is a joint probability distribution.
Two Discrete Random Variables
Let X = a discrete random variable, the number of orders placed per day for a high-cost item.
Let Y = a discrete random variable, the number of items in stock.
Joint probability mass function: f_XY(x, y)
5-1 Two Discrete Random Variables Joint Probability Distributions
Two Discrete Random Variables
Let X = a discrete random variable, the number of orders placed per day for a high-cost item.
Let Y = a discrete random variable, the number of items in stock.
Pr{X = 0, Y = 1} = f_XY(0, 1) = .15
5-1 Two Discrete Random Variables Marginal Probability Distributions The individual probability distribution of a random variable is referred to as its marginal probability distribution. The marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables. To determine P(X = x), we sum P(X = x, Y = y) over all points in the range of (X, Y ) for which X = x. Subscripts on the probability mass functions distinguish between the random variables.
5-1 Two Discrete Random Variables Definition: Marginal Probability Mass Functions
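The marginal definition can be sketched in code: sum the joint pmf over the other variable. The joint table below is a made-up placeholder (the slide's own table did not survive extraction), but the summing logic is exactly the one described above:

```python
# Sketch: marginal pmfs from a joint pmf by summing over the other variable.
# The joint table here is a hypothetical example, NOT the slide's table.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.20,
}

def marginal(joint, axis):
    """Sum the joint pmf over the other coordinate (axis=0 -> f_X, axis=1 -> f_Y)."""
    pmf = {}
    for point, p in joint.items():
        pmf[point[axis]] = pmf.get(point[axis], 0.0) + p
    return pmf

f_x = marginal(joint, 0)  # {0: 0.30, 1: 0.45, 2: 0.25}
f_y = marginal(joint, 1)  # {0: 0.45, 1: 0.55}
```

Each marginal sums to 1 because every joint cell lands in exactly one bucket.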
Two Discrete Random Variables
Let X = a discrete random variable, the number of orders placed per day for a high-cost item.
Let Y = a discrete random variable, the number of items in stock.
Pr{X = 1} = Pr{X=1, Y=0} + Pr{X=1, Y=1} + Pr{X=1, Y=2} + Pr{X=1, Y=3} = f_X(1) = .33
Pr{Y ≥ 2} = f_Y(2) + f_Y(3) = .16 + .05 = .21
Marginal Mean & Variance
If the marginal probability distribution of X has the probability mass function f_X(x), then
E[X] = μ_X = Σ_R x f_XY(x, y) = Σ_x x f_X(x)
Var[X] = σ_X² = Σ_R (x − μ_X)² f_XY(x, y) = Σ_x (x − μ_X)² f_X(x)
where R_x denotes all points of (X, Y) for which X = x, and R denotes all points in the range of (X, Y).
Using the Marginal Distributions
E[X] = μ_x = 0(.39) + 1(.33) + 2(.28) = .89
E[Y] = μ_y = 0(.38) + 1(.41) + 2(.16) + 3(.05) = .88
Var[X] = σ_x² = 0²(.39) + 1²(.33) + 2²(.28) − .89² = 1.45 − .7921 = .6579
Var[Y] = σ_y² = 0²(.38) + 1²(.41) + 2²(.16) + 3²(.05) − .88² = 1.50 − .7744 = .7256
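The marginal arithmetic above can be checked in a few lines, using the shortcut Var[X] = E[X²] − E[X]² with the marginals given on the slide:

```python
# Verify the slide's marginal means and variances from f_X and f_Y.
f_x = {0: 0.39, 1: 0.33, 2: 0.28}
f_y = {0: 0.38, 1: 0.41, 2: 0.16, 3: 0.05}

def mean_var(pmf):
    """Mean and variance of a pmf via Var = E[X^2] - E[X]^2."""
    mu = sum(x * p for x, p in pmf.items())
    var = sum(x**2 * p for x, p in pmf.items()) - mu**2
    return mu, var

mu_x, var_x = mean_var(f_x)  # 0.89, 0.6579
mu_y, var_y = mean_var(f_y)  # 0.88, 0.7256
```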
5-1.3 Conditional Probability Distributions
Conditional Distribution of Y
f_{Y|x}(y) = f_XY(x, y) / f_X(x)
f_{Y|x=1}(2) = f_XY(1, 2) / f_X(1) = .05 / .33 ≈ .1515
Conditional Distribution of X
f_{X|y}(x) = f_XY(x, y) / f_Y(y)
f_{X|y=2}(1) = f_XY(1, 2) / f_Y(2) = .05 / .16 = .3125
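Both conditionals above divide the same joint cell f_XY(1, 2) = .05 by a different marginal, which a quick sketch makes concrete:

```python
# Conditional pmf values: divide the joint cell by the conditioning marginal.
f_xy_12 = 0.05   # f_XY(1, 2), from the slide
f_x_1 = 0.33     # marginal P(X = 1)
f_y_2 = 0.16     # marginal P(Y = 2)

f_y2_given_x1 = f_xy_12 / f_x_1   # f_{Y|x=1}(2) ≈ 0.1515
f_x1_given_y2 = f_xy_12 / f_y_2   # f_{X|y=2}(1) = 0.3125
```

Same numerator, different denominator: conditioning on X = 1 rescales by the X-marginal; conditioning on Y = 2 rescales by the Y-marginal.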
Conditional Mean and Variance
Conditional Mean & Var of Y
f_{Y|x}(y) = f_XY(x, y) / f_X(x); here f_{Y|x=1}(y) = .30303, .51515, .15152, .03030 for y = 0, 1, 2, 3 (only the first value survived on the slide; the other three are recovered to match the stated mean and variance).
E[Y|x=1] = 0(.30303) + 1(.51515) + 2(.15152) + 3(.03030) = .9091
Var[Y|x=1] = 0²(.30303) + 1²(.51515) + 4(.15152) + 9(.03030) − .9091² = 1.3939 − .8264 = .5675
Conditional Mean & Var of X
f_{X|y=2}(x) = .1875, .3125, .5 for x = 0, 1, 2
E[X|y=2] = 0(.1875) + 1(.3125) + 2(.5) = 1.3125
Var[X|y=2] = 0²(.1875) + 1²(.3125) + 4(.5) − 1.3125² = 2.3125 − 1.7227 = .5898
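The two conditional-moment calculations can be verified the same way as the marginal ones. For f_{Y|x=1}, the exact fractions 10/33, 17/33, 5/33, 1/33 reproduce the decimals on the slide (the last three conditional probabilities are recovered, not read directly from the slide):

```python
# Conditional mean and variance via Var = E[.^2] - E[.]^2.
# f_{Y|x=1}: 10/33 = .30303 is from the slide; 17/33, 5/33, 1/33 are
# reconstructed to match the stated mean (.9091) and variance (.5675).
f_y_given_x1 = {0: 10/33, 1: 17/33, 2: 5/33, 3: 1/33}
f_x_given_y2 = {0: 0.1875, 1: 0.3125, 2: 0.5}   # given on the slide

def mean_var(pmf):
    mu = sum(v * p for v, p in pmf.items())
    return mu, sum(v**2 * p for v, p in pmf.items()) - mu**2

mu_y, var_y = mean_var(f_y_given_x1)  # ≈ 0.9091, 0.5675
mu_x, var_x = mean_var(f_x_given_y2)  # 1.3125, ≈ 0.5898
```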
5-1.4 Independence
Are X and Y Independent?
f_X(1) f_Y(2) = (.33)(.16) = .0528 ≠ f_XY(1, 2) = .05
f_X(2) f_Y(0) = (.28)(.38) = .1064 ≠ f_XY(2, 0) = .08
No Chuck, they are not independent.
More on Independence Many evaluations of independence are based on knowledge of the physical situation. If we are reasoning based on data, we will need statistical tools to help us. It is very, very unlikely that counts and estimated probabilities will yield exact equalities as in the conditions for establishing independence.
The Search for Independence
Let X = a discrete random variable, the number of defects in a lot of size 3, where the probability of a defect is a constant .1.
Let Y = a discrete random variable, the demand in a given day for the number of units from the above lot.
The Search Continues
Assuming independence: f_XY(x, y) = f_X(x) f_Y(y)
f_XY(1, 2) = f_X(1) f_Y(2) = (.243)(.4) = .0972
Remember: P(A ∩ B) = P(A) P(B) if A and B are independent.
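The two uses of independence above run in opposite directions: the order/stock table fails the product check at a single point, while the defect/demand example *assumes* independence and builds the joint pmf as a product, with f_X(1) coming from a Binomial(3, .1) pmf:

```python
# Direction 1: test independence by comparing joint cells to marginal products.
checks = [
    (0.05, 0.33 * 0.16),   # f_XY(1,2) vs f_X(1) f_Y(2) = .0528 -> unequal
    (0.08, 0.28 * 0.38),   # f_XY(2,0) vs f_X(2) f_Y(0) = .1064 -> unequal
]
independent = all(abs(joint - prod) < 1e-9 for joint, prod in checks)  # False

# Direction 2: assume independence and build the joint pmf as a product.
# f_X(1) is Binomial(3, .1) at x=1; f_Y(2) = .4 is given on the slide.
from math import comb
f_x_1 = comb(3, 1) * 0.1 * 0.9**2      # 0.243
f_xy_12 = f_x_1 * 0.4                  # 0.0972
```

One failing point is enough to reject independence; establishing it requires equality at every point.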
Recap - Sample Problem
Assume X & Y are jointly distributed with a joint probability mass function over x ∈ {−1, −0.5, 0.5, 1} (the table was garbled in extraction; among its surviving cell values are 1/8, 1/16, 3/16, and 1/4, and the worked solutions below use f(0.5, 1) = 1/4, f(1, 1) = 1/16, and f(1, 2) = 1/8).
Sample Problem Cont’d Determine the marginal probability distribution of X P(X = -1) = 1/8 + 1/8 = 1/4 P(X = -0.5) = 1/16 + 1/16 = 1/8 P(X = 0.5) = 3/16 + 1/4 = 7/16 P(X = 1) = 1/16 + 1/8 = 3/16
Sample Problem Cont’d Determine the conditional probability distribution of Y given that X = 1. P(Y = 1 | X = 1) = P(X = 1, Y = 1)/P(X = 1) = (1/16)/(3/16) = 1/3 P(Y = 2 | X = 1) = P(X = 1, Y = 2)/P(X = 1) = (1/8)/(3/16) = 2/3
Sample Problem Cont’d Determine the conditional probability distribution of X given that Y = 1. P(X = 0.5 | Y = 1) = P(X = 0.5, Y = 1)/P(Y = 1) = (1/4)/(5/16) = 4/5 P(X = 1 | Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = (1/16)/(5/16) = 1/5
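The sample problem's conditionals are exact fractions, so they can be checked without floating point using `fractions.Fraction`:

```python
# The sample problem's conditional distributions, computed exactly.
from fractions import Fraction as F

# Joint pmf cells used by the worked answers (the full table was lost):
f_05_1, f_1_1, f_1_2 = F(1, 4), F(1, 16), F(1, 8)

p_x1 = f_1_1 + f_1_2     # P(X = 1)  = 3/16
p_y1 = f_05_1 + f_1_1    # P(Y = 1)  = 5/16

assert f_1_1 / p_x1 == F(1, 3)    # P(Y=1 | X=1)
assert f_1_2 / p_x1 == F(2, 3)    # P(Y=2 | X=1)
assert f_05_1 / p_y1 == F(4, 5)   # P(X=0.5 | Y=1)
assert f_1_1 / p_y1 == F(1, 5)    # P(X=1 | Y=1)
```

Each conditional pmf sums to 1 over its support, as it must.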
5-1.5 Multiple Discrete Random Variables Definition: Joint Probability Mass Function
5-1.5 Multiple Discrete Random Variables Definition: Marginal Probability Mass Function
5-1.5 Multiple Discrete Random Variables Mean and Variance from Joint Probability
5-1.6 Multinomial Probability Distribution
The Necessary Example
Final inspection of products coming off the assembly line categorizes every item as either acceptable, needing rework, or rejected. Historically, 90 percent have been acceptable, 7 percent needed rework, and 3 percent have been rejected. For the next 10 items produced, what is the probability that there will be 8 acceptable, 2 reworks, and no rejects?
Let X1 = number acceptable, X2 = number reworked, X3 = number rejected.
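The multinomial pmf P(X1=x1, …, Xk=xk) = n!/(x1!⋯xk!) · p1^x1 ⋯ pk^xk answers the question directly; a short sketch of the computation:

```python
# Multinomial probability for the inspection example:
# P(X1=8, X2=2, X3=0) with n=10 and p = (.90, .07, .03).
from math import factorial, prod

def multinomial_pmf(counts, probs):
    """n!/(x1!...xk!) * p1^x1 * ... * pk^xk."""
    coef = factorial(sum(counts))
    for x in counts:
        coef //= factorial(x)
    return coef * prod(p**x for p, x in zip(probs, counts))

p = multinomial_pmf([8, 2, 0], [0.90, 0.07, 0.03])
# coefficient is 10!/(8! 2! 0!) = 45, so p = 45 * .9^8 * .07^2 ≈ .0949
```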
More of the Necessary Example
The production process is assumed to be out of control (i.e., the probability of an acceptable item is less than .9) if fewer than 8 acceptable items are produced from a lot of size 10. What is the probability that the process will be judged out of control when the probability of an acceptable item is in fact still .9?
Let X1 = number acceptable.
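Since only X1 matters here, the three categories collapse to acceptable vs. not, so X1 alone is Binomial(10, .9) and the answer is the lower tail P(X1 < 8):

```python
# P(process judged out of control) = P(X1 < 8), X1 ~ Binomial(10, .9).
from math import comb

def binom_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_out = sum(binom_pmf(10, k, 0.9) for k in range(8))
# equivalently 1 - P(8) - P(9) - P(10) ≈ 1 - .1937 - .3874 - .3487 ≈ .0702
```

So a perfectly in-control process still trips the rule about 7 percent of the time.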