1 Lecture 14: Jointly Distributed Random Variables Devore, Ch. 5.1 and 5.2.


Lecture 14: Jointly Distributed Random Variables. Devore, Ch. 5.1 and 5.2

Topics
I. Jointly Distributed Variables
II. Joint Distributions of Uncorrelated Variables – Two Independent Random Variables
III. Expected Values of Joint Distributions
IV. Joint Distributions of Correlated Variables – Covariance and correlation as measures of "degree of association"

I. Jointly Distributed Variables
Many problems in statistics and probability involve more than a single random variable, so it is often necessary to study several random variables simultaneously. For example, a machined part may be checked on three dimensions at once:
– X = height (1 in spec, 0 out of spec)
– Y = width (1 in spec, 0 out of spec)
– Z = depth (1 in spec, 0 out of spec)

Types of Jointly Distributed Variables
Two Discrete RVs
– Joint PMF
– Marginal PMF (for each of the two joint variables)
Two Continuous RVs
– Joint PDF
– Marginal PDF (for each of the two joint variables)
Independent variables
More than two variables

Joint Distribution – Two Discrete RV's
Joint PMF: Let X and Y be two discrete rv's defined on a space S.
– p_XY(x,y) = P(X = x, Y = y), with p_XY(x,y) >= 0 and the masses summing to 1 over all (x,y) pairs
Marginal PMF: to obtain the marginal pmf of X at a particular value, say x = 100, sum the joint pmf over all possible y values: p_X(x) = Σ_y p_XY(x,y).

Example – Joint Probability
TV Brand Example (repeated from the conditional probability lecture notes):
– Event A: buy a TV brand; Event B: repair the TV sold
– Suppose the selling mix is A1 = 50%, A2 = 30%, and A3 = 20%
– Likelihood of repair given model A1 = 25%
– Likelihood of repair given model A2 = 20%
– Likelihood of repair given model A3 = 10%
Types of questions: What is the probability of a repair? What is the probability that you have a non-A1 model?
First convert the information to a joint probability table. Example: p(x = A1, y = repair) = 0.5 * 0.25 = 0.125.

Joint Probability Table
What are some requirements of a joint probability table?
– The probabilities over all pairs sum to 1, and every value is >= 0.

          y = 0 (no repair)   y = 1 (repair)
x = A1    0.375               0.125
x = A2    0.240               0.060
x = A3    0.180               0.020

What is the marginal pmf of Y at y = repair (y = 1)?
– p_Y(1) = p(A1,1) + p(A2,1) + p(A3,1)
What is the probability of having a non-A1 model?
– p(x=A2, y=0) + p(x=A2, y=1) + p(x=A3, y=0) + p(x=A3, y=1)
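A minimal Python sketch of these checks and sums; the dictionary holds the table values derived above, and the variable names are illustrative only:

```python
# Joint pmf of (brand, repair) stored as a dictionary keyed by (x, y) pairs.
joint = {
    ("A1", 0): 0.375, ("A1", 1): 0.125,
    ("A2", 0): 0.240, ("A2", 1): 0.060,
    ("A3", 0): 0.180, ("A3", 1): 0.020,
}

# Requirements of a joint pmf: every mass is >= 0 and the masses sum to 1.
assert all(p >= 0 for p in joint.values())
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginal pmf of Y at y = 1 (repair): sum the joint pmf over all brands x.
p_repair = sum(p for (x, y), p in joint.items() if y == 1)
print(p_repair)   # 0.205

# Probability of a non-A1 model: sum over every pair whose brand is not A1.
p_not_a1 = sum(p for (x, y), p in joint.items() if x != "A1")
print(p_not_a1)   # 0.5
```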

Joint Dist. – Two Continuous RV's
Joint PDF: Let X and Y be two continuous rv's.
– f_XY(x,y) >= 0 for all x, y, and the density integrates to 1 over the whole (x,y) plane
Marginal PDF: to obtain the marginal pdf of X at a particular value, say x = x1, integrate the joint pdf f_XY(x1, y) over all possible y values: f_X(x) = ∫ f_XY(x,y) dy.

Two Continuous Dist RV's – Example
Suppose you are producing a 1.7 L engine with a bore range of 81 +/- 0.2 mm and a stroke range of 83.5 +/- 0.2 mm.
f(x,y) = K(x^2 + y^2), where K is the constant that makes the density integrate to 1 (K ≈ 4.618 × 10^-4)
– 80.8 <= x <= 81.2; 83.3 <= y <= 83.7
– nominal bore = 81; nominal stroke = 83.5
a) What is the probability that the bore and the stroke will both be under their nominal values (x <= 81 and y <= 83.5)?
b) What is the marginal distribution of bore size alone, f_X(x)? What is P(80.8 <= X <= 80.9)?
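One way to check these numbers is numerical integration; here is a sketch assuming scipy is available, with K recovered from the normalization requirement rather than taken as given:

```python
from scipy.integrate import dblquad, quad

# f(x, y) = K*(x^2 + y^2) on 80.8 <= x <= 81.2, 83.3 <= y <= 83.7.
# Recover K from the requirement that the density integrate to 1.
# Note: dblquad integrates func(y, x), inner variable listed first.
total, _ = dblquad(lambda y, x: x**2 + y**2, 80.8, 81.2, 83.3, 83.7)
K = 1.0 / total
print(K)            # ~4.618e-04

# (a) P(X <= 81 and Y <= 83.5): integrate f over the lower corner of the box.
p_a, _ = dblquad(lambda y, x: K * (x**2 + y**2), 80.8, 81.0, 83.3, 83.5)
print(p_a)          # ~0.249

# (b) Marginal pdf of bore size: f_X(x) = integral of f(x, y) dy.
def f_X(x):
    val, _ = quad(lambda y: K * (x**2 + y**2), 83.3, 83.7)
    return val

p_b, _ = quad(f_X, 80.8, 80.9)
print(p_b)          # ~0.250
```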

Mixture Experiments and Joint Distributions
A useful application relates to mixture experiments:
– Let X = proportion of the mix that is component A
– Let Y = proportion of the mix that is a second component, B
– Together the two proportions cannot exceed the whole mixture, so the support is x + y <= 1
f(x,y) = k(xy) where 0 <= x <= 1, 0 <= y <= 1, x + y <= 1
– What is k if f(x,y) is to be a pdf? (See the sketch below.)
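A small symbolic sketch (assuming sympy) of solving for k; the triangular support x + y <= 1 is what makes the inner integral run from 0 to 1 − x:

```python
import sympy as sp

x, y, k = sp.symbols("x y k", positive=True)

# Total probability: integrate k*x*y over the triangle
# 0 <= y <= 1 - x, 0 <= x <= 1, then solve for the k that makes it 1.
total = sp.integrate(sp.integrate(k * x * y, (y, 0, 1 - x)), (x, 0, 1))
print(sp.solve(sp.Eq(total, 1), k))   # [24]
```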

II. Joint Distributions of Independent Random Variables
Independent events: P(A ∩ B) = P(A) · P(B)
If two variables are (fully, or statistically) independent, then:
– DISCRETE: p(x,y) = p_X(x) · p_Y(y), for ALL possible (x,y) pairs!
– CONTINUOUS: f(x,y) = f_X(x) · f_Y(y), for ALL possible (x,y) pairs as well!
If two variables do not satisfy the above for all (x,y), then they are said to be dependent. Therefore, if you can find even ONE pair not satisfying the above, you have proved dependence!

TV Example, continued. Are X and Y independent?
– Does p(A1, 1) = p_X(A1) · p_Y(1)?
p_X(A1) = p(A1,0) + p(A1,1) = 0.375 + 0.125 = 0.5
p_Y(1) = p(A1,1) + p(A2,1) + p(A3,1) = 0.125 + 0.060 + 0.020 = 0.205
p(A1,1) = 0.125, but p_X(A1) · p_Y(1) = 0.5 * 0.205 = 0.1025
So the variables are dependent
– (repair rates are not the same for all brands)
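The same check can be automated over every (x,y) pair; a sketch reusing the `joint` dictionary from the earlier block (the helper name is illustrative):

```python
def is_independent(joint, tol=1e-12):
    """True iff p(x, y) == pX(x) * pY(y) for every pair in the table."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    # Marginals: sum the joint pmf over the other variable.
    pX = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    pY = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0.0) - pX[x] * pY[y]) <= tol
               for x in xs for y in ys)

print(is_independent(joint))   # False: 0.125 != 0.5 * 0.205
```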

Reliability Example – Jointly Independent Components
Suppose two components in an engine have independent exponential lifetime distributions with failure rates λ1 = λ2 = 0.00067 per hour (mean lifetime 1/λ ≈ 1,493 hours). What is the probability that both components will last at least 2,500 hours?
– P(X1 >= 2500, X2 >= 2500)
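A quick numerical check, taking λ = 0.00067 per hour as on the answers slide; independence is what lets the joint survival probability factor into a product:

```python
import math

lam = 0.00067                 # failure rate per hour (same for both parts)
t = 2500.0

# Exponential survival function: P(X >= t) = exp(-lam * t).
r_one = math.exp(-lam * t)
print(round(r_one, 3))        # 0.187 for each component

# Independent components: P(X1 >= t, X2 >= t) = P(X1 >= t) * P(X2 >= t).
print(round(r_one ** 2, 3))   # 0.035
```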

III. Expected Values using Joint Distributions
Let X and Y be jointly distributed rv's, and let h(X,Y) be a function of them. If the two variables are:
– X, Y DISCRETE: E[h(X,Y)] = Σ_x Σ_y h(x,y) · p(x,y)
– X, Y CONTINUOUS: E[h(X,Y)] = ∫∫ h(x,y) · f(x,y) dx dy

Mixture Example
Mixture experiments, continued:
– Let X = proportion of the mix that is component A
– Let Y = proportion of the mix that is component B (x + y <= 1)
– Let the cost of A = $1 and the cost of B = $1.5
f(x,y) = 24(xy) where 0 <= x <= 1, 0 <= y <= 1, x + y <= 1
h(x,y) = 1X + 1.5Y
– What is the equation for E[h(X,Y)]? (See the sketch below.)
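A numerical sketch of that expectation (assuming scipy); the integrand is h(x,y) · f(x,y), integrated over the triangular support:

```python
from scipy.integrate import dblquad

# E[h(X, Y)] = double integral of h(x, y) * f(x, y) over x + y <= 1,
# with h(x, y) = 1*x + 1.5*y and f(x, y) = 24*x*y.
expected_cost, _ = dblquad(
    lambda y, x: (x + 1.5 * y) * 24 * x * y,
    0.0, 1.0,               # outer limits for x
    0.0, lambda x: 1 - x,   # inner limits for y: the triangle
)
print(expected_cost)        # 1.0, i.e. an expected cost of $1.00
```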

IV. Joint Distributions of Related Variables
Suppose X and Y are not independent but dependent. A useful question: what is the degree of association between X and Y?
– Measures of degree of association: covariance and correlation

Covariance of X and Y
The covariance between two rv's X and Y is:
– Cov(X,Y) = E[(X − μ_X)(Y − μ_Y)]
Covariance results:
– If X and Y tend to both be greater than (or both less than) their respective means at the same time, the covariance is a large positive number.
– If X tends to be above its mean when Y is below its mean (and vice versa), the covariance is a large negative number.
– If the positive and negative products tend to cancel one another, the covariance is near 0.

Joint Patterns for X and Y
Shortcut: Cov(X,Y) = E(XY) − μ_X · μ_Y
A concern with covariance: the computed value depends critically on the units of measure of X and Y.
(Figure: three scatter plots of Y versus X illustrating positive covariance, negative covariance, and covariance near zero.)

Covariance Example
Compute the covariance for the mixture example:
– f(x,y) = 24xy, where 0 <= x <= 1, 0 <= y <= 1, x + y <= 1
What is Cov(X,Y)? (A sketch follows.)
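A symbolic sketch (sympy again) using the shortcut Cov(X,Y) = E[XY] − μ_X · μ_Y:

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = 24 * x * y   # joint pdf on the triangle x + y <= 1

def expect(g):
    # E[g(X, Y)]: integrate g * f over 0 <= y <= 1 - x, 0 <= x <= 1.
    return sp.integrate(sp.integrate(g * f, (y, 0, 1 - x)), (x, 0, 1))

EX, EY, EXY = expect(x), expect(y), expect(x * y)
cov = EXY - EX * EY
print(EX, EY, EXY, cov)   # 2/5 2/5 2/15 -2/75
```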

Correlation Coefficient
The correlation coefficient simply provides a scaling (normalization) of the covariance:
– ρ_XY = Cov(X,Y) / (σ_X · σ_Y), with −1 <= ρ_XY <= +1
If X and Y are independent, then ρ = 0.
– Note: ρ = 0 does not imply full (statistical) independence, but only a weaker form, sometimes called "linear" independence.
If Y is completely predictable from X (Y = aX + b):
– ρ = +1 implies a perfect positive linear relationship between X and Y (a > 0)
– ρ = −1 implies a perfect negative linear relationship between X and Y (a < 0)
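Continuing the covariance sketch above: dividing by the two standard deviations yields the unit-free correlation coefficient.

```python
# Variances via E[X^2] - (E[X])^2, reusing expect(), EX, EY, cov from above.
VX = expect(x**2) - EX**2   # 1/25
VY = expect(y**2) - EY**2   # 1/25
rho = cov / sp.sqrt(VX * VY)
print(rho)                  # -2/3: a fairly strong negative linear association
```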

Answers
Slide 7 (Joint Probability Table)
– p_Y(1) = 0.125 + 0.060 + 0.020 = 0.205
– p(x=A2, y=0) + p(x=A2, y=1) + p(x=A3, y=0) + p(x=A3, y=1) = 0.240 + 0.060 + 0.180 + 0.020 = 0.5
Slide 9 (Continuous Example)
– part a: P(X <= 81, Y <= 83.5) ≈ 0.249
– part b: f_X(x) = K(0.4x^2 + 2788.91) for 80.8 <= x <= 81.2; P(80.8 <= X <= 80.9) ≈ 0.250
Slide 13 (Reliability Example)
– R(2500) = e^(−λt) = e^(−0.00067 · 2500) = 0.187 for each component
– P(X1 >= 2500, X2 >= 2500) = 0.187 · 0.187 = 0.035
Slide 15 (Mixture Example)
– E[h(X,Y)] = ∫ from 0 to 1 ∫ from 0 to 1−x of (x + 1.5y) · 24xy dy dx = $1.00

Answers, continued
Slide 19 (Covariance Example)
– E[X] = E[Y] = 2/5, E[XY] = 2/15
– Cov(X,Y) = E[XY] − μ_X · μ_Y = 2/15 − (2/5)(2/5) = −2/75 ≈ −0.0267