Random Variable X, with pmf p(x) or pdf f(x)

Presentation transcript:

POPULATION: Random Variable X, with pmf p(x) or pdf f(x)
Recall… PARAMETERS ("population characteristics"):
Mean: μ = E[X]
Variance: σ_X² = E[(X - μ)²] = E[X²] - μ²
σ_X² measures how much X varies about its mean μ.
Proof (of the shortcut formula): see PowerPoint section 3.1-3.3, slides 56-57, for discrete X.
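The following is a minimal Python sketch, not part of the slides, showing how these two parameters are computed for a discrete X; the pmf values are made up purely for illustration.

```python
# Minimal sketch: mean and variance of a discrete random variable X.
# The pmf below is hypothetical, chosen only to illustrate the formulas.

pmf = {1: 0.1, 2: 0.2, 3: 0.4, 4: 0.2, 5: 0.1}   # p(x); the probabilities sum to 1

mu = sum(x * p for x, p in pmf.items())                            # mu = E[X]
var = sum((x - mu) ** 2 * p for x, p in pmf.items())               # sigma^2 = E[(X - mu)^2]
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2   # E[X^2] - mu^2

print(mu, var, var_shortcut)   # the two variance formulas agree (up to rounding)
```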

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance: σ_XY = Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)] = E[XY] - μ_X μ_Y
Proof (of the shortcut formula): E[(X - μ_X)(Y - μ_Y)] = E[XY] - μ_X E[Y] - μ_Y E[X] + μ_X μ_Y = E[XY] - μ_X μ_Y.
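To make the definition concrete, here is a minimal Python sketch, not taken from the slides, that computes Cov(X, Y) from a small discrete joint pmf via the shortcut E[XY] - μ_X μ_Y; the joint probabilities are hypothetical.

```python
# Minimal sketch: covariance of discrete X, Y from a joint pmf p(x, y).
# The joint probabilities are hypothetical illustration values (they sum to 1).

joint = {(1, 1): 0.2, (1, 2): 0.1,
         (2, 1): 0.1, (2, 2): 0.2,
         (3, 1): 0.1, (3, 2): 0.3}

mu_x = sum(x * p for (x, y), p in joint.items())       # E[X]
mu_y = sum(y * p for (x, y), p in joint.items())       # E[Y]
e_xy = sum(x * y * p for (x, y), p in joint.items())   # E[XY]

cov = e_xy - mu_x * mu_y                                # Cov(X, Y) = E[XY] - mu_X mu_Y
print(mu_x, mu_y, cov)
```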

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance: σ_XY = Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
Properties:
- The covariance of two random variables measures how they vary together about their respective means.
- Variance is ≥ 0, but covariance is unrestricted in sign.
- Cov(X, X) = Var(X).
- Other properties follow from properties of expected value (below).

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
Covariance:
Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
  = Σ_x Σ_y (x - μ_X)(y - μ_Y) p(x, y)    if X and Y are discrete
  = ∫∫ (x - μ_X)(y - μ_Y) f(x, y) dx dy   if X and Y are continuous
  = E[XY] - μ_X μ_Y                        (computational shortcut)
… but what does it mean?

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
General form of a discrete joint pmf table (rows = values of X, columns = values of Y; the right margin gives pX, the bottom margin gives pY):

          y1         y2         y3         y4         y5       |  pX(x)
x1    p(x1,y1)   p(x1,y2)   p(x1,y3)   p(x1,y4)   p(x1,y5)   |  pX(x1)
x2    p(x2,y1)   p(x2,y2)   p(x2,y3)   p(x2,y4)   p(x2,y5)   |  pX(x2)
x3    p(x3,y1)   p(x3,y2)   p(x3,y3)   p(x3,y4)   p(x3,y5)   |  pX(x3)
x4    p(x4,y1)   p(x4,y2)   p(x4,y3)   p(x4,y4)   p(x4,y5)   |  pX(x4)
x5    p(x5,y1)   p(x5,y2)   p(x5,y3)   p(x5,y4)   p(x5,y5)   |  pX(x5)
pY(y)  pY(y1)     pY(y2)     pY(y3)     pY(y4)     pY(y5)    |    1

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
Example: X and Y each take the values 1, 2, 3, 4, 5, and in this uniform population each of the 25 points {(1, 1), (1, 2), …, (5, 5)} has the same probability p(x, y) = .04, so every marginal probability is pX(x) = pY(y) = .20.
A scatterplot would reveal no particular association between X and Y. In fact, p(x, y) = pX(x) pY(y) for every cell, i.e., X and Y are statistically independent! It is easy to see that Cov(X, Y) = 0.

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
Exercise: X takes the values 1, 2, 3, 4, 5 with marginal probabilities .04, .12, .20, .28, .36; Y takes the values 1, 2, 3, 4, 5 with marginal probabilities including .10, .15, .25, .30 (and summing to 1). Fill in the joint table so that X and Y are statistically independent, i.e., p(x, y) = pX(x) pY(y) in every cell. Then show that Cov(X, Y) = 0. (See the sketch below.)
THEOREM. If X and Y are statistically independent, then Cov(X, Y) = 0. However, the converse does not necessarily hold!
Exception: the Bivariate Normal Distribution (see the final slide).
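A Python sketch of how this exercise can be checked numerically. The X marginal is the one given on the slide; the Y marginal shown is a hypothetical completion chosen only so the probabilities sum to 1, so treat the specific numbers as an illustration rather than the slide's answer.

```python
# Sketch of the exercise: if X and Y are independent, p(x, y) = pX(x) * pY(y),
# and then Cov(X, Y) = 0.  The X marginal comes from the slide; the Y marginal
# below is a hypothetical completion that sums to 1.

p_x = {1: 0.04, 2: 0.12, 3: 0.20, 4: 0.28, 5: 0.36}
p_y = {1: 0.10, 2: 0.15, 3: 0.20, 4: 0.25, 5: 0.30}   # hypothetical completion

# Fill the joint table using independence.
joint = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

mu_x = sum(x * p for x, p in p_x.items())
mu_y = sum(y * p for y, p in p_y.items())
cov = sum(x * y * p for (x, y), p in joint.items()) - mu_x * mu_y

print(round(cov, 12))   # 0.0, up to floating-point rounding
```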

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
Example: X and Y again take the values 1, 2, 3, 4, 5, but now the joint probabilities (entries such as .08, .04, .03, .02, .01, with marginals such as .18, .21, .22) place most of the mass along the diagonal of the table:
- when X is small, there is high probability that Y is small;
- when X is large, there is high probability that Y is large.
As X increases, Y also has a tendency to increase; thus, X and Y are said to be positively correlated. Likewise, two negatively correlated variables have a tendency for Y to decrease as X increases. The simplest mathematical object to have this property is a straight line.

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance: σ_XY = Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
Linear Correlation Coefficient ("rho"): ρ = Cov(X, Y) / (σ_X σ_Y)
Always between -1 and +1.
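A minimal Python sketch of this definition: compute Cov(X, Y), σ_X, and σ_Y from a joint pmf and form ρ. The 2×2 joint pmf is hypothetical, chosen only to show that the resulting ρ lands in [-1, +1].

```python
# Minimal sketch: rho = Cov(X, Y) / (sigma_X * sigma_Y) for a discrete joint pmf.
# The joint probabilities are hypothetical illustration values.
from math import sqrt

joint = {(1, 1): 0.3, (1, 2): 0.1,
         (2, 1): 0.1, (2, 2): 0.5}

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())
var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

rho = cov / (sqrt(var_x) * sqrt(var_y))   # always lies in [-1, +1]
print(rho)
```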

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Linear Correlation Coefficient: ρ = Cov(X, Y) / (σ_X σ_Y)
ρ measures the strength of linear association between X and Y. Always between -1 and +1.
[Scatterplot from JAMA. 2003;290:1486-1493]

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Linear Correlation Coefficient: ρ = Cov(X, Y) / (σ_X σ_Y)
[Scatterplot: IQ vs. Head circumference]
Strength scale for ρ: values in [-1, 0) indicate negative linear correlation, values in (0, +1] positive linear correlation; |ρ| ≥ 0.75 is strong, 0.5 ≤ |ρ| < 0.75 is moderate, and |ρ| < 0.5 is weak.

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Linear Correlation Coefficient: ρ = Cov(X, Y) / (σ_X σ_Y)
[Scatterplot: Body Temp vs. Age, judged against the same strength scale for ρ]

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Linear Correlation Coefficient: ρ = Cov(X, Y) / (σ_X σ_Y)
[Scatterplot: Linear Profit vs. Price, judged against the same strength scale for ρ]
A strong positive correlation exists between ice cream sales and drownings. Cause and effect? NOT LIKELY… "Temp (°F)" is a confounding variable.

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance:
Definition: Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
Theorem (computational shortcut): Cov(X, Y) = E[XY] - μ_X μ_Y
Special case, Y = constant c: Cov(X, c) = 0
Theorem (proved on the next slide): Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y)

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance: Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)] = E[XY] - μ_X μ_Y
Theorem: Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y)
Proof (WLOG take μ_X = μ_Y = 0, so that Var(X) = E[X²], Var(Y) = E[Y²], and Cov(X, Y) = E[XY]):
Var(X ± Y) = E[(X ± Y)²] = E[X² ± 2XY + Y²] = E[X²] + E[Y²] ± 2 E[XY] = Var(X) + Var(Y) ± 2 Cov(X, Y).

POPULATION(S): Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?
PARAMETERS
Means: μ_X = E[X], μ_Y = E[Y]
Variances: σ_X² = E[(X - μ_X)²], σ_Y² = E[(Y - μ_Y)²]
Covariance: Cov(X, Y) = E[XY] - μ_X μ_Y
Theorem (proved above, WLOG): Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y)
Theorem: If X and Y are independent, then Cov(X, Y) = 0. Proof: Exercise… Hint: see slide 4 above.
The converse is not necessarily true!!!
Corollary: If X and Y are independent, then Var(X ± Y) = Var(X) + Var(Y).
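As a numeric sanity check of the variance-of-a-sum identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), here is a short Python sketch on a hypothetical 2×2 joint pmf; the probabilities are made up, and the check does not depend on them.

```python
# Numeric check of Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
# on a hypothetical discrete joint pmf.

joint = {(0, 0): 0.25, (0, 1): 0.15,
         (1, 0): 0.15, (1, 1): 0.45}

def E(g):
    """Expected value of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mu_x) ** 2)
var_y = E(lambda x, y: (y - mu_y) ** 2)
cov   = E(lambda x, y: (x - mu_x) * (y - mu_y))

mu_s  = E(lambda x, y: x + y)
var_s = E(lambda x, y: (x + y - mu_s) ** 2)   # Var(X + Y) computed directly

print(var_s, var_x + var_y + 2 * cov)         # the two sides agree
```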

Important Exception: THE BIVARIATE NORMAL DISTRIBUTION
For the bivariate normal distribution, ρ = 0 (equivalently, Cov(X, Y) = 0) does imply that X and Y are independent.
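A sketch, using NumPy, of drawing from a bivariate normal with a chosen correlation and confirming the sample correlation; the mean vector, unit variances, ρ = 0.7, seed, and sample size are all arbitrary illustration choices, not values from the slides.

```python
# Sketch: sample a bivariate normal with a chosen correlation rho and
# check the sample correlation.  All parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
mu = [0.0, 0.0]
rho = 0.7
cov = [[1.0, rho],
       [rho, 1.0]]                      # covariance matrix with unit variances

xy = rng.multivariate_normal(mu, cov, size=100_000)
sample_rho = np.corrcoef(xy[:, 0], xy[:, 1])[0, 1]
print(sample_rho)                       # close to 0.7

# For the bivariate normal, rho = 0 is equivalent to independence of X and Y,
# which is why it is the "important exception" to the failed converse above.
```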