4.3 Covariance & Correlation



1. Covariance

Definition 4.3 The covariance of X and Y is defined as

Cov(X, Y) = E[(X - EX)(Y - EY)],

that is to say,

Cov(X, Y) = E(XY) - E(X)E(Y).

If X and Y are discrete random variables with joint probability mass function p(x_i, y_j), then

Cov(X, Y) = Σ_i Σ_j (x_i - EX)(y_j - EY) p(x_i, y_j);

if X and Y are continuous random variables with joint density f(x, y), then

Cov(X, Y) = ∫∫ (x - EX)(y - EY) f(x, y) dx dy.

Proof
Cov(X, Y) = E[(X - EX)(Y - EY)]
= E[XY - X·EY - Y·EX + EX·EY]
= E(XY) - EY·E(X) - EX·E(Y) + EX·EY
= E(XY) - E(X)E(Y).
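As a quick numerical illustration of the definition and the computational formula (not part of the original slides; the joint pmf below is a made-up example), one can tabulate a small discrete distribution and evaluate Cov(X, Y) both ways:

```python
import numpy as np

# Hypothetical joint pmf of (X, Y): rows index x-values, columns index y-values.
x_vals = np.array([0.0, 1.0, 2.0])
y_vals = np.array([-1.0, 1.0])
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])   # entries sum to 1

EX = np.sum(x_vals[:, None] * p)                     # E(X)
EY = np.sum(y_vals[None, :] * p)                     # E(Y)
EXY = np.sum(x_vals[:, None] * y_vals[None, :] * p)  # E(XY)

cov_def = np.sum((x_vals[:, None] - EX) * (y_vals[None, :] - EY) * p)  # E[(X-EX)(Y-EY)]
cov_formula = EXY - EX * EY                                            # E(XY) - E(X)E(Y)
print(cov_def, cov_formula)   # the two values agree
```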

Example Suppose that (X, Y) is uniformly distributed on the unit disk D = {(x, y): x² + y² ≤ 1}. Prove that X and Y are uncorrelated but not independent.
Proof

The joint density of (X, Y) is f(x, y) = 1/π for (x, y) in D and 0 otherwise. By symmetry, E(X) = E(Y) = 0 and E(XY) = (1/π)∫∫_D xy dx dy = 0, so Cov(X, Y) = E(XY) - E(X)E(Y) = 0. Thus X and Y are uncorrelated. Since the marginal densities are f_X(x) = (2/π)√(1 - x²) for |x| ≤ 1 and f_Y(y) = (2/π)√(1 - y²) for |y| ≤ 1, we have f(x, y) ≠ f_X(x) f_Y(y) on D. Thus, X is not independent of Y.
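A small Monte Carlo check (an addition for illustration, not part of the original proof) makes the conclusion concrete: the sample correlation of points drawn uniformly from the disk is near 0, while X² and Y² are visibly dependent, so X and Y cannot be independent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample uniformly from the unit disk by rejection from the square [-1, 1]^2.
pts = rng.uniform(-1.0, 1.0, size=(400_000, 2))
pts = pts[pts[:, 0]**2 + pts[:, 1]**2 <= 1.0]
x, y = pts[:, 0], pts[:, 1]

print(np.corrcoef(x, y)[0, 1])        # close to 0: uncorrelated
print(np.corrcoef(x**2, y**2)[0, 1])  # clearly negative: X and Y are dependent
```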

2. Properties of covariance (P82)
(1) Cov(X, Y) = Cov(Y, X);
(2) Cov(aX, bY) = ab Cov(X, Y), where a, b are constants.
Proof
Cov(aX, bY) = E(aX·bY) - E(aX)E(bY)
= ab E(XY) - aE(X)·bE(Y)
= ab[E(XY) - E(X)E(Y)]
= ab Cov(X, Y).

(3) Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z);
Proof
Cov(X + Y, Z) = E[(X + Y)Z] - E(X + Y)E(Z)
= E(XZ) + E(YZ) - E(X)E(Z) - E(Y)E(Z)
= Cov(X, Z) + Cov(Y, Z).
(4) D(X + Y) = D(X) + D(Y) + 2Cov(X, Y).
Remark D(X - Y) = D[X + (-Y)] = D(X) + D(Y) - 2Cov(X, Y).
Example 4.15 (P84)
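Properties (2)–(4) can be spot-checked on simulated data; the sketch below is only an illustration, with X, Y, Z built from independent standard normals (my own construction, not from the textbook).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
u, v = rng.standard_normal(n), rng.standard_normal(n)
x, y = u, 0.6 * u + 0.8 * v          # y is correlated with x by construction
z = rng.standard_normal(n)

def cov(a, b):
    return np.mean(a * b) - np.mean(a) * np.mean(b)

a, b = 2.0, -3.0
print(cov(a * x, b * y), a * b * cov(x, y))                    # property (2)
print(cov(x + y, z), cov(x, z) + cov(y, z))                    # property (3)
print(np.var(x + y), np.var(x) + np.var(y) + 2 * cov(x, y))    # property (4)
```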

3. Correlation Coefficients
Definition 4.4 Suppose that the random variables X and Y have finite variances DX > 0 and DY > 0. Then

ρ_XY = Cov(X, Y) / √(DX · DY)

is called the correlation coefficient of X and Y.
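In practice ρ_XY is estimated by plugging the sample covariance and sample standard deviations into this definition; the short sketch below (an illustration with simulated data, not from the textbook) does so directly and compares the result with numpy's built-in estimator.

```python
import numpy as np

rng = np.random.default_rng(2)
u, v = rng.standard_normal(100_000), rng.standard_normal(100_000)
x, y = u, 0.5 * u + v                    # a pair with moderate positive correlation

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
rho = cov_xy / (np.std(x) * np.std(y))   # Cov(X, Y) / sqrt(DX * DY)
print(rho, np.corrcoef(x, y)[0, 1])      # the two estimates agree
```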

Properties of the correlation coefficient
(1) |ρ_XY| ≤ 1;
(2) |ρ_XY| = 1 ⇔ there exist constants a, b such that P{Y = aX + b} = 1;
(3) X and Y are uncorrelated ⇔ ρ_XY = 0.

Exercise 1. Suppose that (X, Y) is uniformly distributed on D: 0 < x < 1, 0 < y < x. Determine the correlation coefficient of X and Y.

Answer [Figure: the triangular region D below the line y = x in the unit square.]
The joint density is f(x, y) = 2 on D. Then E(X) = 2/3, E(Y) = 1/3, DX = DY = 1/18, and E(XY) = 1/4, so Cov(X, Y) = 1/4 - 2/9 = 1/36 and ρ_XY = (1/36) / (1/18) = 1/2.

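A simulation check of this answer (an addition, not part of the original slides) uses the fact that the max and min of two independent Uniform(0, 1) variables are uniformly distributed on the triangle D:

```python
import numpy as np

rng = np.random.default_rng(3)
u = rng.uniform(size=(500_000, 2))

# (max, min) of two independent Uniform(0,1) draws has joint density 2
# on the triangle D: 0 < y < x < 1, i.e. it is uniform on D.
x, y = u.max(axis=1), u.min(axis=1)

print(np.corrcoef(x, y)[0, 1])   # close to the exact value 1/2
```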

What does Example 2 indicate?
Answer:
1)
2)

Proof
Let X* = (X - EX)/√DX and Y* = (Y - EY)/√DY. Then Cov(X*, Y*) = ρ_XY and
D(X* ± Y*) = D(X*) + D(Y*) ± 2Cov(X*, Y*) = 2(1 ± ρ_XY) ≥ 0,
so -1 ≤ ρ_XY ≤ 1, which gives (1). Moreover, |ρ_XY| = 1 holds exactly when D(X* - Y*) = 0 or D(X* + Y*) = 0, i.e. when X* ∓ Y* equals a constant with probability 1; rewriting this as a relation between X and Y gives P{Y = aX + b} = 1, which is (2). Property (3) is immediate from the definition, since ρ_XY = 0 if and only if Cov(X, Y) = 0.

Note (P86) If (X, Y) follows a two-dimensional normal distribution, then “X and Y are independent” is equivalent to “X and Y are uncorrelated”.
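As an informal illustration of this note (my own addition, under the stated normality assumption), the sketch below samples a bivariate normal with zero correlation and checks one consequence of independence: a joint tail probability factors into the product of the marginal tail probabilities.

```python
import numpy as np

rng = np.random.default_rng(4)
xy = rng.multivariate_normal(mean=[0.0, 0.0],
                             cov=[[1.0, 0.0], [0.0, 1.0]],   # Cov(X, Y) = 0
                             size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

joint = np.mean((x > 1.0) & (y > 1.0))
product = np.mean(x > 1.0) * np.mean(y > 1.0)
print(joint, product)   # approximately equal, consistent with independence
```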

Examples 4.16–4.18 (P86)
Exercise: P90, Problems 11 (find Cov(X, Y)) and 12.
Homework: P91, Problems 16 and 17.