Published by Godwin Morris; modified over 9 years ago.
Slide 1: Lecture 14 - Jointly Distributed Random Variables (Devore, Ch. 5.1 and 5.2)
Slide 2: Topics
I. Jointly Distributed Variables
II. Joint Distributions of Uncorrelated Variables - Two Independent Random Variables
III. Expected Values of Joint Distributions
IV. Joint Distributions of Correlated Variables - Covariance and Correlation as Measures of "Degree of Association"
Slide 3: I. Jointly Distributed Variables
Many problems in statistics and probability involve more than one random variable, so it is often necessary to study several random variables simultaneously.
Example: for a machined part, let X = height (1 in spec, 0 out of spec), Y = width (1 in spec, 0 out of spec), and Z = depth (1 in spec, 0 out of spec). Each inspected part yields one observation of the triple (X, Y, Z), e.g.:
X Y Z
1 0 0
0 1 1
0 1 0
1 0 1
Slide 4: Types of Jointly Distributed Variables
- Two discrete RVs: joint pmf; marginal pmf (one for each of the two joint variables)
- Two continuous RVs: joint pdf; marginal pdf (one for each of the two joint variables)
- Independent variables
- More than two variables
Slide 5: Joint Distribution - Two Discrete RVs
Joint pmf: let X and Y be two discrete rvs defined on a sample space S. The joint pmf is p_XY(x, y) = P(X = x, Y = y), with p_XY(x, y) >= 0 for all (x, y) and the sum over all pairs equal to 1.
Marginal pmf: to obtain the marginal pmf of X at a value x, sum the joint pmf over all possible y values: p_X(x) = sum over y of p_XY(x, y). For example, p_X(100) = sum over y of p_XY(100, y).
Slide 6: Example - Joint Probability
TV brand example (repeated from the conditional-probability lecture notes):
- Event A: buy a TV brand; event B: repair the TV sold
- Selling mix: A1 = 50%, A2 = 30%, A3 = 20%
- Likelihood of repair given model A1 = 25%
- Likelihood of repair given model A2 = 20%
- Likelihood of repair given model A3 = 10%
Typical questions: What is the probability of a repair? What is the probability of a non-A1 model?
First convert the information to a joint probability table. Example: p(x = A1, y = repair) = 0.5 * 0.25 = 0.125.
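The conversion on this slide follows the multiplication rule, p(x, y) = P(X = x) * P(Y = y | X = x). A minimal sketch of that conversion (the dictionary names are illustrative, not from the source):

```python
# Sketch: converting the selling mix P(X = brand) and conditional repair
# rates P(Y = 1 | X = brand) into a joint probability table.
mix = {"A1": 0.50, "A2": 0.30, "A3": 0.20}           # P(X = brand)
repair_given = {"A1": 0.25, "A2": 0.20, "A3": 0.10}  # P(Y = 1 | X = brand)

# Multiplication rule: p(x, y) = P(X = x) * P(Y = y | X = x)
joint = {}
for brand, p_brand in mix.items():
    joint[(brand, 1)] = p_brand * repair_given[brand]        # repair
    joint[(brand, 0)] = p_brand * (1 - repair_given[brand])  # no repair

print(joint[("A1", 1)])  # 0.125, matching the slide
```

All six entries together reproduce the joint probability table on the next slide, and they sum to 1 as a valid joint pmf must.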
Slide 7: Joint Probability Table
What are the requirements of a joint probability table? All entries >= 0, and the sum over all pairs = 1.
What is the marginal pmf of Y at y = repair? p_Y(repair) = sum over x of p(x, repair).
What is the probability of having a non-A1 model? P(X != A1) = p(A2, 0) + p(A2, 1) + p(A3, 0) + p(A3, 1).
            y = 0 (no repair)   y = 1 (repair)
x = A1      0.375               0.125
x = A2      0.24                0.06
x = A3      0.18                0.02
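Both questions on this slide reduce to summing selected entries of the table. A minimal sketch, with the table values copied in directly:

```python
# Sketch: a marginal pmf value and an event probability from the joint table.
joint = {
    ("A1", 0): 0.375, ("A1", 1): 0.125,
    ("A2", 0): 0.24,  ("A2", 1): 0.06,
    ("A3", 0): 0.18,  ("A3", 1): 0.02,
}

# Marginal pmf of Y at y = 1 (repair): sum the repair column over all x.
p_repair = sum(p for (x, y), p in joint.items() if y == 1)

# P(non-A1): sum every entry whose brand is not A1.
p_not_a1 = sum(p for (x, y), p in joint.items() if x != "A1")

print(p_repair, p_not_a1)  # about 0.205 and 0.5
```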
Slide 8: Joint Distribution - Two Continuous RVs
Joint pdf: let X and Y be two continuous rvs. The joint pdf satisfies f_XY(x, y) >= 0 for all (x, y), and its integral over the whole plane equals 1.
Marginal pdf: to obtain the marginal pdf of X at a value x, integrate the joint pdf over all possible y values: f_X(x) = integral of f_XY(x, y) dy.
Slide 9: Two Continuous RVs - Example
Suppose you are producing a 1.7 L engine with a bore of 81 +/- 0.2 mm and a stroke of 83.5 +/- 0.2 mm.
f(x, y) = K(x^2 + y^2), where K = 0.000462
- 80.8 <= x <= 81.2; 83.3 <= y <= 83.7
- nominal bore = 81; nominal stroke = 83.5
a) What is the probability that the bore and the stroke are both under their nominal values (X <= 81 and Y <= 83.5)?
b) What is the marginal distribution of bore size alone, f_X(x)? What is the probability that the bore is between 80.8 and 80.9?
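Part (a) asks for a double integral of f over the lower-left quarter of the support. A numerical sketch (a midpoint Riemann sum with an arbitrary grid resolution, not the closed-form solution):

```python
# Sketch: approximate P(X <= 81, Y <= 83.5) for f(x, y) = K(x^2 + y^2)
# on 80.8 <= x <= 81.2, 83.3 <= y <= 83.7, by a midpoint Riemann sum.
K = 0.000462

def f(x, y):
    return K * (x * x + y * y)

n = 400  # grid resolution per axis (an arbitrary choice)
dx = (81.0 - 80.8) / n
dy = (83.5 - 83.3) / n
p = 0.0
for i in range(n):
    for j in range(n):
        x = 80.8 + (i + 0.5) * dx  # midpoint rule
        y = 83.3 + (j + 0.5) * dy
        p += f(x, y) * dx * dy
print(round(p, 3))  # close to 0.25: f is nearly uniform over this small range
```

The result is just under 0.25, which makes intuitive sense: over such a narrow range, x^2 + y^2 barely varies, so f is almost uniform and each quarter of the rectangle carries almost a quarter of the probability.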
Slide 10: Mixture Experiments and Joint Distributions
A useful application relates to mixture experiments:
- Let X = proportion of the mix that is component A
- Let Y = proportion of the mix that is component B
- Together X and Y make up at most 100% of the mixture, so the support is 0 <= x <= 1, 0 <= y <= 1, x + y <= 1
f(x, y) = k(xy) on this region. What must k be for f(x, y) to be a valid pdf?
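A numerical sketch of finding k, assuming the support is the triangle x >= 0, y >= 0, x + y <= 1 (the region that makes the later slides' k = 24 consistent): integrate x*y over the triangle and take the reciprocal.

```python
# Sketch: normalizing constant k for f(x, y) = k*x*y on the triangle
# x >= 0, y >= 0, x + y <= 1, via a midpoint Riemann sum.
n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:            # keep only the triangular support
            total += x * y * h * h  # integral of x*y over the triangle
k = 1.0 / total
print(round(k))  # 24, since the integral equals 1/24
```

The exact integral of xy over the triangle is 1/24, so k = 24, matching f(x, y) = 24xy used on slides 15 and 19.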
Slide 11: II. Joint Distributions of Independent Random Variables
Independent events: P(A ∩ B) = P(A) * P(B). If two variables are (fully, or statistically) independent, then:
- DISCRETE: p(x, y) = p_X(x) * p_Y(y) for ALL possible (x, y) pairs
- CONTINUOUS: f(x, y) = f_X(x) * f_Y(y) for ALL possible (x, y) pairs
If two variables do not satisfy the above for all (x, y), they are said to be dependent. Therefore, finding even ONE pair that violates the equality is enough to prove dependence.
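The "one violating pair" test can be automated: compute both marginals, then compare p(x, y) with p_X(x) * p_Y(y) at every pair. A sketch using the TV-brand table from slide 7:

```python
# Sketch: independence check for a discrete joint pmf.
joint = {
    ("A1", 0): 0.375, ("A1", 1): 0.125,
    ("A2", 0): 0.24,  ("A2", 1): 0.06,
    ("A3", 0): 0.18,  ("A3", 1): 0.02,
}
xs = {x for x, _ in joint}
ys = {y for _, y in joint}
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}  # marginal of X
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}  # marginal of Y

# One violating pair is enough to prove dependence.
dependent = any(abs(joint[(x, y)] - p_x[x] * p_y[y]) > 1e-9
                for x in xs for y in ys)
print(dependent)  # True: e.g. p(A1,1) = 0.125 but p_X(A1)*p_Y(1) = 0.1025
```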
Slide 12: TV Example, Continued
Are X and Y independent? Does p(A1, 1) = p_X(A1) * p_Y(1)?
- p_X(A1) = p(A1, 0) + p(A1, 1) = 0.375 + 0.125 = 0.5
- p_Y(1) = p(A1, 1) + p(A2, 1) + p(A3, 1) = 0.205
- p(A1, 1) = 0.125, but p_X(A1) * p_Y(1) = 0.5 * 0.205 = 0.1025
So X and Y are dependent: repair rates are not the same for all brands.
            y = 0 (no repair)   y = 1 (repair)
x = A1      0.375               0.125
x = A2      0.24                0.06
x = A3      0.18                0.02
Slide 13: Reliability Example - Jointly Independent Components
Suppose two components in an engine have independent exponential lifetime distributions with failure rates lambda_1 = lambda_2 = 0.00067 per hour (mean lifetime 1/lambda, roughly 1500 hours). What is the probability that both components last at least 2500 hours?
- P(X1 >= 2500, X2 >= 2500)
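Because the lifetimes are independent, the joint survival probability factors into the product of the two individual exponential survival probabilities. A sketch of that calculation:

```python
# Sketch: P(both components last >= 2500 h) for independent exponential
# lifetimes with rate lambda = 0.00067 per hour.
import math

lam = 0.00067
t = 2500.0
r_one = math.exp(-lam * t)  # survival probability of one component
r_both = r_one ** 2         # independence: multiply the two survivals
print(round(r_one, 3), round(r_both, 3))  # about 0.187 and 0.035
```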
Slide 14: III. Expected Values Using Joint Distributions
Let X and Y be jointly distributed rvs and h(x, y) a function of them.
- X, Y DISCRETE: E[h(X, Y)] = sum over x, sum over y of h(x, y) * p(x, y)
- X, Y CONTINUOUS: E[h(X, Y)] = double integral of h(x, y) * f(x, y) dx dy
Slide 15: Mixture Example
- Let X = proportion of the mix that is component A, and Y = proportion that is component B, with x + y <= 1
- Let the cost of A = $1 per unit and the cost of B = $1.50 per unit
f(x, y) = 24(xy), where 0 <= x <= 1, 0 <= y <= 1, x + y <= 1
h(x, y) = 1X + 1.5Y (the total cost)
What is the equation for E[h(X, Y)]?
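A numerical sketch of E[h(X, Y)] = double integral of (x + 1.5y) * 24xy over the triangle (again assuming the support x + y <= 1, with an arbitrary grid resolution):

```python
# Sketch: E[X + 1.5Y] for f(x, y) = 24*x*y on the triangle
# x >= 0, y >= 0, x + y <= 1, via a midpoint Riemann sum.
n = 1000
h = 1.0 / n
expected = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:
            expected += (1.0 * x + 1.5 * y) * 24.0 * x * y * h * h
print(expected)  # close to 1.0, since E[X] = E[Y] = 0.4 for this pdf
```

By linearity this agrees with E[h] = 1*E[X] + 1.5*E[Y] = 0.4 + 0.6 = $1.00 expected cost.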
Slide 16: IV. Joint Distributions of Related Variables
Suppose X and Y are not independent but dependent. A useful question: what is the degree of association between X and Y?
Measures of the degree of association: covariance and correlation.
Slide 17: Covariance of X and Y
The covariance between two rvs X and Y is Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)].
Covariance results:
- If X and Y tend to be above their respective means at the same time (and below them at the same time), the covariance is a large positive number.
- If X tends to be above its mean when Y is below its mean, and vice versa, the covariance is a large negative number.
- If high and low values of X and Y show no consistent pattern together, the covariance is near 0.
Slide 18: Joint Patterns for X and Y
Shortcut: Cov(X, Y) = E(XY) - mu_X * mu_Y
Concern with covariance: the computed value depends critically on the units of measure.
[Figure: three scatterplots of Y vs. X illustrating positive covariance, negative covariance, and covariance near zero]
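Both the shortcut formula and the units concern can be seen on a small made-up sample (the data below are illustrative, not from the source):

```python
# Sketch: the shortcut Cov(X, Y) = E(XY) - E(X)E(Y) on a small sample,
# and the units concern: rescaling X rescales the covariance.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 1.0, 4.0, 3.0, 5.0]

def cov(a, b):
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    e_ab = sum(ai * bi for ai, bi in zip(a, b)) / len(a)
    return e_ab - mean_a * mean_b  # E(XY) - mu_X * mu_Y

c = cov(xs, ys)
c_scaled = cov([100 * x for x in xs], ys)  # same data, X in other units
print(c, c_scaled)  # the covariance grows by the same factor of 100
```

The relationship between X and Y has not changed at all, yet the covariance is 100 times larger; this is exactly why the correlation coefficient on slide 20 normalizes it.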
Slide 19: Covariance Example
Compute the covariance for the mixture example: f(x, y) = 24xy, where 0 <= x <= 1, 0 <= y <= 1, x + y <= 1.
What is Cov(X, Y)?
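A numerical sketch using the shortcut Cov = E(XY) - E(X)E(Y), assuming the triangular support x + y <= 1; a negative result is expected, since a larger share of component A forces a smaller share of B:

```python
# Sketch: Cov(X, Y) for f(x, y) = 24*x*y on the triangle x + y <= 1,
# via the shortcut formula and a midpoint Riemann sum.
n = 1000
h = 1.0 / n
e_x = e_y = e_xy = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:
            w = 24.0 * x * y * h * h  # f(x, y) dA
            e_x += x * w
            e_y += y * w
            e_xy += x * y * w
cov = e_xy - e_x * e_y
print(cov)  # roughly -2/75, about -0.0267
```

The exact values are E[X] = E[Y] = 2/5 and E[XY] = 2/15, so Cov(X, Y) = 2/15 - 4/25 = -2/75.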
Slide 20: Correlation Coefficient
The correlation coefficient simply provides a scaling (normalization) of the covariance:
rho_XY = Cov(X, Y) / (sigma_X * sigma_Y), with -1 <= rho_XY <= +1.
If X and Y are independent, then rho = 0. Note: rho = 0 does not imply full (statistical) independence, only the weaker condition that X and Y are uncorrelated, i.e. have no linear relationship.
If Y is completely predictable from X (Y = aX + b):
- rho = +1 (when a > 0) indicates a perfect positive linear relationship between X and Y
- rho = -1 (when a < 0) indicates a perfect negative linear relationship between X and Y
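The normalization can be seen on the same illustrative sample used for the covariance sketch: rescaling X changes the covariance but leaves rho untouched.

```python
# Sketch: correlation coefficient as a scaled covariance; unlike the
# covariance, it does not change when X is converted to different units.
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 1.0, 4.0, 3.0, 5.0]

def corr(a, b):
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)
    sa = math.sqrt(sum((ai - ma) ** 2 for ai in a) / len(a))
    sb = math.sqrt(sum((bi - mb) ** 2 for bi in b) / len(b))
    return cov / (sa * sb)  # rho = Cov(X, Y) / (sigma_X * sigma_Y)

r = corr(xs, ys)
r_scaled = corr([100 * x for x in xs], ys)  # same rho after rescaling X
print(r, r_scaled)  # both 0.8: the scaling removes the units problem
```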
Slide 21: Answers
Slide 7:
- p_Y(repair) = 0.125 + 0.06 + 0.02 = 0.205
- p(A2, 0) + p(A2, 1) + p(A3, 0) + p(A3, 1) = 0.24 + 0.06 + 0.18 + 0.02 = 0.5
Slide 9:
- part a:
- part b: f_X(x) = 0.0001848x^2 + 1.288, for 80.8 <= x <= 81.2
Slide 13:
- R(2500) = e^(-lambda*t) = e^(-0.00067*2500) = 0.187 for each component
- P(X1 >= 2500, X2 >= 2500) = 0.187 * 0.187 = 0.035
Slide 15:
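The part (b) answer above can be sanity-checked numerically: integrating f(x, y) = K(x^2 + y^2) over y in [83.3, 83.7] at a fixed x should reproduce roughly 0.0001848x^2 + 1.288 (the slide's constant 1.288 is a rounded value, so the two agree only to about three decimals).

```python
# Sketch: numerical check of the marginal pdf f_X(x) for slide 9, part b.
K = 0.000462

def f_x(x, n=2000):
    # midpoint-rule integral of K*(x^2 + y^2) over y in [83.3, 83.7]
    dy = (83.7 - 83.3) / n
    return sum(K * (x * x + (83.3 + (i + 0.5) * dy) ** 2) * dy
               for i in range(n))

x = 81.0
print(f_x(x), 0.0001848 * x * x + 1.288)  # both near 2.50
```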
Slide 22: Answers, Continued
Slide 19: