Chapter 5 Joint Probability Distributions and Random Samples


Chapter 5 Joint Probability Distributions and Random Samples

5.1 - Jointly Distributed Random Variables
5.2 - Expected Values, Covariance, and Correlation
5.3 - Statistics and Their Distributions
5.4 - The Distribution of the Sample Mean
5.5 - The Distribution of a Linear Combination

POPULATION
Random Variable X, with pmf p(x) or pdf f(x)

REVIEW POWERPOINT SECTION 3.1-3.3 FOR BASIC PROPERTIES OF EXPECTED VALUE

PARAMETERS
Mean: μ = E(X)
Variance: σ² = E[(X − μ)²] = E(X²) − μ²; σ² measures how much X varies about its mean μ.
Proof: See PowerPoint section 3.1-3.3, slides 41 and 42, for discrete X.
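The variance shortcut above is easy to check numerically. A minimal sketch, using a small hypothetical pmf (the values below are illustrative, not from the slides):

```python
# Verify the variance shortcut sigma^2 = E(X^2) - mu^2
# for a small hypothetical discrete pmf.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}   # p(x) for x = 0, 1, 2 (illustrative values)

mu = sum(x * p for x, p in pmf.items())                          # E(X)
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())         # E[(X - mu)^2]
var_shortcut = sum(x * x * p for x, p in pmf.items()) - mu ** 2  # E(X^2) - mu^2

print(mu, var_def, var_shortcut)   # the two variance values agree
```

Both formulas give the same number, up to floating-point rounding.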

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means: μX = E(X), μY = E(Y)
Variances: σX² = E[(X − μX)²], σY² = E[(Y − μY)²]

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means: μX = E(X), μY = E(Y)
Variances: σX² = E[(X − μX)²], σY² = E[(Y − μY)²]
Covariance: Cov(X, Y) = E[(X − μX)(Y − μY)] = E(XY) − μX μY
Proof: Expand the product and use linearity of expectation: E[XY − μX Y − μY X + μX μY] = E(XY) − μX μY − μY μX + μX μY = E(XY) − μX μY.
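The covariance shortcut can likewise be verified numerically. A sketch with a small hypothetical joint pmf (values are illustrative, not from the slides):

```python
# Verify the covariance shortcut Cov(X, Y) = E(XY) - mu_X * mu_Y
# for a small hypothetical joint pmf.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # p(x, y), illustrative

mu_x = sum(x * p for (x, y), p in joint.items())   # E(X)
mu_y = sum(y * p for (x, y), p in joint.items())   # E(Y)

cov_def = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())
cov_shortcut = sum(x * y * p for (x, y), p in joint.items()) - mu_x * mu_y

print(cov_def, cov_shortcut)   # the two covariance values agree
```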

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means, Variances, Covariance (as defined above)

Properties:
The covariance of two random variables measures how they vary together about their respective means.
Variance is ≥ 0, but covariance is unrestricted in sign.
Cov(X, X) = Var(X).
Other properties follow from the properties of expected value…

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

Covariance: Cov(X, Y) = E[(X − μX)(Y − μY)] = E(XY) − μX μY

… but what does it mean?

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

General joint pmf table (marginals in the last row and column):

Y →         y1         y2         y3         y4         y5      pX(x)
x1     p(x1,y1)   p(x1,y2)   p(x1,y3)   p(x1,y4)   p(x1,y5)    pX(x1)
x2     p(x2,y1)   p(x2,y2)   p(x2,y3)   p(x2,y4)   p(x2,y5)    pX(x2)
x3     p(x3,y1)   p(x3,y2)   p(x3,y3)   p(x3,y4)   p(x3,y5)    pX(x3)
x4     p(x4,y1)   p(x4,y2)   p(x4,y3)   p(x4,y4)   p(x4,y5)    pX(x4)
x5     p(x5,y1)   p(x5,y2)   p(x5,y3)   p(x5,y4)   p(x5,y5)    pX(x5)
pY(y)    pY(y1)     pY(y2)     pY(y3)     pY(y4)     pY(y5)         1

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

Example: X and Y each take the values 1, 2, 3, 4, 5, with p(x, y) = .04 in every cell, so every marginal probability is .20.

In a uniform population, each of the points {(1, 1), (1, 2), …, (5, 5)} has the same density. A scatterplot would reveal no particular association between X and Y. In fact, p(x, y) = pX(x) pY(y) for all x and y, i.e., X and Y are statistically independent! It is easy to see that Cov(X, Y) = 0.
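The claim that Cov(X, Y) = 0 for this uniform table can be checked directly:

```python
# The uniform joint pmf from the example: p(x, y) = 0.04 on {1,...,5} x {1,...,5}.
# Independence holds (p(x, y) = pX(x) * pY(y) = 0.2 * 0.2), so Cov(X, Y) = 0.
joint = {(x, y): 0.04 for x in range(1, 6) for y in range(1, 6)}

mu_x = sum(x * p for (x, y), p in joint.items())   # E(X) = 3
mu_y = sum(y * p for (x, y), p in joint.items())   # E(Y) = 3
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

print(cov)   # 0, up to floating-point rounding
```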

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

Exercise: X and Y take the values 1, …, 5, with the marginal probabilities shown on the slide (.04, .12, .20, .28, .36 for one variable; .10, .15, .25, .30, … for the other). Fill in the table so that X and Y are statistically independent. Then show that Cov(X, Y) = 0.

THEOREM. If X and Y are statistically independent, then Cov(X, Y) = 0. However, the converse does not necessarily hold! An exception is the bivariate normal distribution, for which zero covariance does imply independence.
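One way to carry out the exercise is to set each cell to the product of its marginals, p(x, y) = pX(x) pY(y). A sketch using hypothetical marginals (the slide's table is only partially legible here, so these values are assumptions, not the slide's):

```python
# Build an independent joint pmf as the product of (hypothetical) marginals,
# then verify that Cov(X, Y) = 0, illustrating the theorem.
p_x = {1: 0.1, 2: 0.15, 3: 0.2, 4: 0.25, 5: 0.3}    # assumed marginal for X
p_y = {1: 0.04, 2: 0.12, 3: 0.2, 4: 0.28, 5: 0.36}  # assumed marginal for Y

joint = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

mu_x = sum(x * px for x, px in p_x.items())
mu_y = sum(y * py for y, py in p_y.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

print(cov)   # 0, up to floating-point rounding
```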

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

Example: X and Y take the values 1, …, 5, with most of the probability mass on cells where x and y are both small or both large (cell values such as .08, .04, .03, .02, .01; marginals .18, .21, .22, …).

When X is small, there is high probability that Y is small; when X is large, there is high probability that Y is large. As X increases, Y also has a tendency to increase; thus, X and Y are said to be positively correlated. Likewise, two negatively correlated variables have a tendency for Y to decrease as X increases. The simplest mathematical object to have this property is a straight line.

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means, Variances, Covariance (as defined above)
Linear Correlation Coefficient: ρ ("rho") = Cov(X, Y) / (σX σY); always between −1 and +1.
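A sketch of computing ρ from a joint pmf (the table values here are hypothetical, not from the slides):

```python
# Compute rho = Cov(X, Y) / (sigma_X * sigma_Y) for a small hypothetical joint pmf.
import math

joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # illustrative values

def moment(f):
    """Expected value of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = moment(lambda x, y: x), moment(lambda x, y: y)
var_x = moment(lambda x, y: (x - mu_x) ** 2)
var_y = moment(lambda x, y: (y - mu_y) ** 2)
cov = moment(lambda x, y: (x - mu_x) * (y - mu_y))

rho = cov / math.sqrt(var_x * var_y)
print(rho)                    # about 0.408 for these values
assert -1.0 <= rho <= 1.0     # rho is always between -1 and +1
```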

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Linear Correlation Coefficient: ρ measures the strength of linear association between X and Y; it is always between −1 and +1. (Example source: JAMA. 2003;290:1486-1493.)

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Linear Correlation Coefficient ρ; example: IQ vs. Head circumference.

Rough strength scale, from −1 to +1 (ρ < 0: negative linear correlation; ρ > 0: positive linear correlation):
|ρ| ≥ 0.75: strong; 0.5 ≤ |ρ| < 0.75: moderate; |ρ| < 0.5: weak.
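The slide's rough strength scale can be written as a small helper. The cutoffs ±0.5 and ±0.75 are the ones shown on the scale; how to label the boundary values themselves is a judgment call, not stated on the slide:

```python
def strength(rho):
    """Classify |rho| using the slides' rough scale:
    weak below 0.5, moderate from 0.5 to 0.75, strong at 0.75 and above."""
    a = abs(rho)
    if a >= 0.75:
        return "strong"
    if a >= 0.5:
        return "moderate"
    return "weak"

print(strength(-0.9), strength(0.6), strength(0.2))  # strong moderate weak
```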

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Linear Correlation Coefficient ρ; example: Body Temp vs. Age (same strength scale: strong / moderate / weak at ±0.75 and ±0.5).

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Linear Correlation Coefficient ρ; example: Profit vs. Price (same strength scale: strong / moderate / weak at ±0.75 and ±0.5).

A strong positive correlation exists between ice cream sales and drowning. Cause and effect? NOT LIKELY… "Temp (F)" is a confounding variable.

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means, Variances, Covariance (as defined above)

Definition: Cov(X, Y) = E[(X − μX)(Y − μY)]
Theorem (computing formula): Cov(X, Y) = E(XY) − μX μY. Proof: See text, p. 240.
Special case: if Y = constant c, then Cov(X, c) = 0.
Theorem: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means, Variances, Covariance (as defined above)

Theorem: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
Proof: Assume (WLOG) that μX = μY = 0. Then Var(X + Y) = E[(X + Y)²] = E(X²) + 2 E(XY) + E(Y²) = Var(X) + 2 Cov(X, Y) + Var(Y).
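The identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) can be checked numerically. A sketch with a small hypothetical joint pmf (values are illustrative):

```python
# Numeric check of Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # illustrative values

def moment(f):
    """Expected value of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = moment(lambda x, y: x), moment(lambda x, y: y)
var_x = moment(lambda x, y: (x - mu_x) ** 2)
var_y = moment(lambda x, y: (y - mu_y) ** 2)
cov = moment(lambda x, y: (x - mu_x) * (y - mu_y))

mu_s = moment(lambda x, y: x + y)
var_sum = moment(lambda x, y: (x + y - mu_s) ** 2)   # Var(X + Y), computed directly

print(var_sum, var_x + var_y + 2 * cov)   # the two values agree
```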

POPULATION(S)
Random Variables X, Y with joint pmf p(x, y) or pdf f(x, y)
Is there an association between X and Y, and if so, how is it measured?

PARAMETERS
Means, Variances, Covariance (as defined above)

Theorem: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) (proved above, WLOG μX = μY = 0)
Theorem: If X and Y are independent, then Cov(X, Y) = 0.
Proof: Exercise (HW problem)… Hint: See slide 4 above.
Corollary: If X and Y are independent, then Var(X + Y) = Var(X) + Var(Y).
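The corollary can be illustrated by building an independent joint pmf as a product of marginals and checking both conclusions at once. A sketch with illustrative marginals (not from the slides):

```python
# Corollary check: for independent X and Y (joint pmf = product of marginals),
# Cov(X, Y) = 0, and hence Var(X + Y) = Var(X) + Var(Y).
p_x = {0: 0.4, 1: 0.6}   # illustrative marginal for X
p_y = {0: 0.7, 1: 0.3}   # illustrative marginal for Y
joint = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

def moment(f):
    """Expected value of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = moment(lambda x, y: x), moment(lambda x, y: y)
cov = moment(lambda x, y: (x - mu_x) * (y - mu_y))
var_x = moment(lambda x, y: (x - mu_x) ** 2)
var_y = moment(lambda x, y: (y - mu_y) ** 2)
var_sum = moment(lambda x, y: (x + y - mu_x - mu_y) ** 2)   # Var(X + Y)

print(cov, var_sum, var_x + var_y)   # cov is 0; the variances add
```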