Independence of random variables


Independence of random variables

Definition: Random variables X and Y are independent if their joint distribution function factors into the product of their marginal distribution functions:
FX,Y(x, y) = FX(x) FY(y) for all x, y.

Theorem: Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if the product of their marginal densities is a joint density for the pair (X, Y), i.e.
fX,Y(x, y) = fX(x) fY(y) for all x, y.

Theorem: If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.

week 9
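For discrete random variables the same factorization criterion applies to the joint probability function, and it can be checked mechanically. A minimal sketch, using a made-up joint pmf (not one from the slides), stored as a dict mapping (x, y) to probability:

```python
def marginals(joint):
    """Compute marginal pmfs pX, pY from a joint pmf {(x, y): p}."""
    pX, pY = {}, {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
        pY[y] = pY.get(y, 0.0) + p
    return pX, pY

def is_independent(joint, tol=1e-12):
    """True iff p(x, y) = pX(x) * pY(y) at every listed support point."""
    pX, pY = marginals(joint)
    return all(abs(p - pX[x] * pY[y]) < tol for (x, y), p in joint.items())

# Independent example: the joint pmf is built as a product of marginals.
indep = {(x, y): 0.5 * (0.25 if y == 0 else 0.75) for x in (0, 1) for y in (0, 1)}
# Dependent example: X = Y with equal probability on {0, 1}.
dep = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}

print(is_independent(indep))  # True
print(is_independent(dep))    # False
```

For the dependent example, pX(0) = pY(0) = 1/2, so the product 1/4 disagrees with the joint value 1/2 at (0, 0), and the factorization fails.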

Example: Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is given. Are X and Y independent? What are their marginal distributions? Factorization is enough to establish independence, but we must be careful with constant terms for the factors to be genuine marginal probability functions.

Example and Important Comment: The joint density of X and Y is given. Are X and Y independent? Independence requires that the set of points where the joint density is positive be the Cartesian product of the sets of points where the marginal densities are positive, i.e. the set of points where fX,Y(x, y) > 0 must be a (possibly infinite) rectangle.

Conditional densities

If X, Y are jointly distributed continuous random variables, the conditional density function of Y given X is defined to be
fY|X(y | x) = fX,Y(x, y) / fX(x)
if fX(x) > 0, and 0 otherwise.

If X, Y are independent then fY|X(y | x) = fY(y).

Also, fX,Y(x, y) = fY|X(y | x) fX(x). Integrating both sides over x we get
fY(y) = ∫ fY|X(y | x) fX(x) dx.
This is a useful application of the law of total probability in the continuous case.
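The continuous law of total probability can be checked numerically. A sketch using the hypothetical joint density f(x, y) = x + y on the unit square (my choice, not a density from the slides); there fX(x) = x + 1/2, fY|X(y | x) = (x + y)/(x + 1/2), and the integral should recover fY(y) = y + 1/2:

```python
def f_joint(x, y):
    return x + y            # assumed joint density on (0,1) x (0,1)

def f_X(x):
    return x + 0.5          # marginal of X: integral of x + y over y in (0,1)

def f_cond_Y_given_X(y, x):
    return f_joint(x, y) / f_X(x)

def f_Y_via_total_probability(y, n=20_000):
    # Midpoint Riemann sum of fY|X(y|x) fX(x) over x in (0, 1).
    h = 1.0 / n
    return sum(f_cond_Y_given_X(y, (i + 0.5) * h) * f_X((i + 0.5) * h)
               for i in range(n)) * h

for y in (0.2, 0.5, 0.9):
    print(round(f_Y_via_total_probability(y), 4), round(y + 0.5, 4))
```

Each printed pair agrees: the integrand collapses to x + y, whose integral over x in (0, 1) is y + 1/2, exactly the marginal fY(y).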

Example: Consider the given joint density. Find the conditional density of X given Y and the conditional density of Y given X.

Properties of Expectations Involving Joint Distributions

For random variables X, Y and constants a, b:
E(aX + bY) = aE(X) + bE(Y).

For independent random variables X, Y:
E(XY) = E(X)E(Y)
whenever these expectations exist.
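Both identities can be verified directly on a small joint pmf. A sketch using a made-up pmf under which X and Y happen to be independent (so the product rule should hold too):

```python
# Joint pmf of (X, Y): uniform over {0,1} x {2,3}, so X, Y are independent.
joint = {(x, y): 0.25 for x in (0, 1) for y in (2, 3)}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

a, b = 3.0, -2.0
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
print(E(lambda x, y: a * x + b * y), a * EX + b * EY)  # -3.5 -3.5
print(E(lambda x, y: x * y), EX * EY)                  # 1.25 1.25
```

Linearity needs no independence at all; the product rule E(XY) = E(X)E(Y) used the independence built into this pmf.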

Covariance

Recall: Var(X + Y) = Var(X) + Var(Y) + 2E[(X - E(X))(Y - E(Y))].

Definition: For random variables X, Y with E(X), E(Y) < ∞, the covariance of X and Y is
Cov(X, Y) = E[(X - E(X))(Y - E(Y))].
Covariance measures whether or not X - E(X) and Y - E(Y) tend to have the same sign.

Claim: Cov(X, Y) = E(XY) - E(X)E(Y).
Proof: Expand the product and use linearity of expectation:
E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y) - E(X)E(Y) + E(X)E(Y) = E(XY) - E(X)E(Y).

Note: If X, Y are independent then E(XY) = E(X)E(Y), and so Cov(X, Y) = 0.
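The definition and the computational formula can be compared numerically on a hypothetical joint pmf (a dependent one, chosen here so that the covariance is non-zero):

```python
# Made-up dependent joint pmf: X and Y tend to agree.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov_def = E(lambda x, y: (x - EX) * (y - EY))       # definition
cov_formula = E(lambda x, y: x * y) - EX * EY       # computational formula
print(round(cov_def, 6), round(cov_formula, 6))     # 0.15 0.15
```

Here EX = EY = 1/2 and E(XY) = 0.4, so both routes give Cov(X, Y) = 0.4 - 0.25 = 0.15, positive because X - E(X) and Y - E(Y) usually share a sign under this pmf.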

Example: Suppose X, Y are discrete random variables whose joint probability function, with marginals pX(x) and pY(y), is given in a table over the values -1 and 1. Find Cov(X, Y). Are X, Y independent?

Important Facts

Independence of X, Y implies Cov(X, Y) = 0, but NOT vice versa.
If X, Y are independent then Var(X + Y) = Var(X) + Var(Y).
In general (whether or not X, Y are independent), Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).
Cov(X, X) = Var(X).
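The general variance-of-a-sum formula can be checked on a dependent pmf, where the covariance term actually matters. A sketch using the same style of made-up joint pmf as above:

```python
# Made-up dependent joint pmf (Cov(X, Y) is non-zero here).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

def var(g):
    m = E(g)
    return E(lambda x, y: (g(x, y) - m) ** 2)

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
lhs = var(lambda x, y: x + y)                                  # Var(X + Y) directly
rhs = var(lambda x, y: x) + var(lambda x, y: y) + 2 * cov      # formula
print(abs(lhs - rhs) < 1e-12)  # True
```

With this pmf, Var(X) = Var(Y) = 0.25 and Cov(X, Y) = 0.15, so both sides equal 0.8; dropping the covariance term would wrongly give 0.5.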

Example: Suppose Y ~ Binomial(n, p). Find Var(Y). (Hint: write Y as a sum of n independent Bernoulli(p) indicators and use the variance-of-a-sum facts above.)
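The standard answer to this exercise is Var(Y) = np(1 - p), and it can be checked against a direct computation from the binomial pmf:

```python
from math import comb

def binom_var(n, p):
    """Variance of Binomial(n, p) computed directly from its pmf."""
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean = sum(k * q for k, q in enumerate(pmf))
    return sum((k - mean) ** 2 * q for k, q in enumerate(pmf))

n, p = 10, 0.3
print(abs(binom_var(n, p) - n * p * (1 - p)) < 1e-9)  # True
```

For n = 10, p = 0.3 both routes give 2.1, consistent with summing the variance p(1 - p) of each of the n independent Bernoulli indicators.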

Properties of Covariance

For random variables X, Y, Z and constants a, b, c, d:
Cov(aX + b, cY + d) = ac Cov(X, Y)
Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
Cov(X, Y) = Cov(Y, X)

Correlation

Definition: For random variables X, Y, the correlation of X and Y is
ρ(X, Y) = Cov(X, Y) / (SD(X) SD(Y))
whenever Var(X), Var(Y) ≠ 0 and all these quantities exist.

Claim: ρ(aX + b, cY + d) = ρ(X, Y) for constants a, c with ac > 0.
Proof: Cov(aX + b, cY + d) = ac Cov(X, Y), while SD(aX + b) = |a| SD(X) and SD(cY + d) = |c| SD(Y); since ac > 0, the factor ac / (|a||c|) = 1 cancels.
This claim means that correlation is scale invariant.
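Scale invariance can be verified numerically: transforming X and Y by positive-slope affine maps should leave the correlation unchanged. A sketch on a made-up dependent joint pmf:

```python
from math import sqrt

# Made-up dependent joint pmf.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

def corr(gx, gy):
    """Correlation of gx(X, Y) and gy(X, Y) under the joint pmf."""
    mx, my = E(gx), E(gy)
    cov = E(lambda x, y: (gx(x, y) - mx) * (gy(x, y) - my))
    vx = E(lambda x, y: (gx(x, y) - mx) ** 2)
    vy = E(lambda x, y: (gy(x, y) - my) ** 2)
    return cov / sqrt(vx * vy)

r1 = corr(lambda x, y: x, lambda x, y: y)
r2 = corr(lambda x, y: 3 * x + 7, lambda x, y: 0.5 * y - 2)  # a = 3, c = 0.5 > 0
print(abs(r1 - r2) < 1e-12)  # True
```

Here ρ(X, Y) = 0.15 / 0.25 = 0.6, and the transformed pair gives the same value: the covariance picks up a factor ac = 1.5 that the two standard deviations exactly cancel.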

Theorem: For random variables X, Y, whenever the correlation ρ(X, Y) exists it must satisfy -1 ≤ ρ(X, Y) ≤ 1.
Proof sketch: Apply the Cauchy–Schwarz inequality to X - E(X) and Y - E(Y), which gives |Cov(X, Y)| ≤ SD(X) SD(Y); dividing through by SD(X) SD(Y) yields |ρ(X, Y)| ≤ 1.

Interpretation of Correlation ρ

ρ(X, Y) is a measure of the strength and direction of the linear relationship between X and Y.
If X, Y have non-zero variance, then ρ(X, Y) = 0 if and only if Cov(X, Y) = 0.
If X, Y are independent, then ρ(X, Y) = 0. Note that this is not the only case in which ρ(X, Y) = 0!
Y is a linearly increasing function of X if and only if ρ(X, Y) = 1.
Y is a linearly decreasing function of X if and only if ρ(X, Y) = -1.
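The extreme cases can be illustrated directly: if Y = aX + b exactly, then Cov(X, Y) = a Var(X) and Var(Y) = a²Var(X), so ρ(X, Y) = a/|a| = ±1. A sketch with X a made-up discrete uniform variable:

```python
from math import sqrt

xs = [0, 1, 2, 3]      # X uniform on these values (hypothetical choice)
p = 1 / len(xs)

def corr_with_linear(a, b):
    """Correlation of X with Y = a*X + b."""
    EX = sum(x * p for x in xs)
    EY = a * EX + b
    cov = sum((x - EX) * (a * x + b - EY) * p for x in xs)
    vx = sum((x - EX) ** 2 * p for x in xs)
    vy = sum((a * x + b - EY) ** 2 * p for x in xs)
    return cov / sqrt(vx * vy)

print(round(corr_with_linear(2, 5), 6))   # 1.0  (increasing: a > 0)
print(round(corr_with_linear(-3, 1), 6))  # -1.0 (decreasing: a < 0)
```

The intercept b never matters, and the magnitude of a cancels; only the sign of the slope survives in ρ.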

Example: Find Var(X - Y) and ρ(X, Y) if X, Y have the given joint density.