Jointly Distributed Random Variables


CHAPTER 5: Jointly Distributed Random Variables

Joint Probability Mass Function. Let X and Y be two discrete rv's defined on the sample space of an experiment. The joint probability mass function p(x, y) is defined for each pair of numbers (x, y) by

p(x, y) = P(X = x and Y = y).

If A is any set consisting of pairs of (x, y) values, then

P[(X, Y) ∈ A] = Σ over (x, y) ∈ A of p(x, y).

Marginal Probability Mass Functions. The marginal probability mass functions of X and Y, denoted pX(x) and pY(y), are given by

pX(x) = Σy p(x, y) for each possible value x,   pY(y) = Σx p(x, y) for each possible value y.
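For illustration, here is a minimal Python sketch (the pmf values and the event A are made up, not from the slides) that stores a joint pmf as a dictionary, computes both marginals, and evaluates P[(X, Y) ∈ A]:

```python
# Joint pmf of two discrete rv's stored as {(x, y): p(x, y)}.
# The numbers below are illustrative only.
p = {(0, 0): 0.10, (0, 1): 0.30,
     (1, 0): 0.20, (1, 1): 0.40}

# Marginal pmf's: pX(x) = sum over y of p(x, y), and symmetrically for Y.
p_X, p_Y = {}, {}
for (x, y), prob in p.items():
    p_X[x] = p_X.get(x, 0.0) + prob
    p_Y[y] = p_Y.get(y, 0.0) + prob

# P[(X, Y) in A] is the sum of p(x, y) over the pairs in A.
A = {(0, 1), (1, 1)}                         # e.g. the event {Y = 1}
prob_A = sum(p[pair] for pair in A if pair in p)

print(p_X, p_Y, prob_A)                      # marginals and P(Y = 1) = 0.70
```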

Some examples of jointly distributed random variables:
– the height and weight of a person;
– the temperature and rainfall of a day;
– the two coordinates of a needle randomly dropped on a table;
– the number of 1s and the number of 6s in 10 rolls of a die.

Example. We are interested in the effect of seat belt use on saving lives. Consider the random variables X1 and X2 defined as follows:
X1 = 0 if the child survived, X1 = 1 if the child did not survive;
X2 = 0 if no belt was used, X2 = 1 if an adult belt was used, X2 = 2 if a child seat was used.

The following table represents the joint probability distribution of X1 and X2. In general we write P(X1 = x1, X2 = x2) = p(x1, x2) and call p(x1, x2) the joint probability function of (X1, X2).

                  X1
              0       1     | pX2(x2)
        0   0.38    0.17    |  0.55
  X2    1   0.14    0.02    |  0.16
        2   0.24    0.05    |  0.29
      ---------------------------------
  pX1(x1)   0.76    0.24    |  1.00

The probability that a child both survives and is in a child seat when involved in an accident is
P(X1 = 0, X2 = 2) = 0.24.
The probability that a child is in a child seat is
P(X2 = 2) = P(X1 = 0, X2 = 2) + P(X1 = 1, X2 = 2) = 0.24 + 0.05 = 0.29.
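These calculations are easy to verify programmatically; a short Python sketch using the joint pmf from the table:

```python
# Joint pmf from the seat belt table: keys are (x1, x2) pairs.
p = {(0, 0): 0.38, (1, 0): 0.17,
     (0, 1): 0.14, (1, 1): 0.02,
     (0, 2): 0.24, (1, 2): 0.05}

# P(X1 = 0, X2 = 2): a single cell of the table.
print(p[(0, 2)])                                              # 0.24

# P(X2 = 2): marginal of X2 at 2, summing over x1.
print(sum(prob for (x1, x2), prob in p.items() if x2 == 2))   # 0.29

# Marginals of X1, matching the column totals 0.76 and 0.24.
for v in (0, 1):
    print(v, sum(prob for (x1, _), prob in p.items() if x1 == v))
```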

Joint Probability Density Function. Let X and Y be continuous rv's. Then f(x, y) is a joint probability density function for X and Y if for any two-dimensional set A

P[(X, Y) ∈ A] = ∫∫ over A of f(x, y) dx dy.

In particular, if A is the two-dimensional rectangle {(x, y): a ≤ x ≤ b, c ≤ y ≤ d}, then

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫ from a to b ∫ from c to d f(x, y) dy dx.
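As a sketch of how such a rectangle probability can be checked numerically, assume the illustrative density f(x, y) = 4xy on the unit square (not one of the slide examples) and integrate it with SciPy:

```python
# Rectangle probability for an assumed joint density f(x, y) = 4xy
# on the unit square (zero elsewhere).
from scipy.integrate import dblquad

f = lambda y, x: 4 * x * y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

# dblquad integrates the inner variable (y) first, then the outer (x):
# P(0 <= X <= 0.5, 0 <= Y <= 0.5).
prob, err = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)
print(prob)   # 0.0625, since here X and Y happen to be independent
```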

Marginal Probability Density Functions. The marginal probability density functions of X and Y, denoted fX(x) and fY(y), are given by

fX(x) = ∫ from -∞ to ∞ f(x, y) dy for -∞ < x < ∞,   fY(y) = ∫ from -∞ to ∞ f(x, y) dx for -∞ < y < ∞.

Independent Random Variables. Two random variables X and Y are said to be independent if for every pair of x and y values

p(x, y) = pX(x) · pY(y) when X and Y are discrete, or
f(x, y) = fX(x) · fY(y) when X and Y are continuous.

If these conditions are not satisfied for all (x, y), then X and Y are said to be dependent.
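For a discrete table such as the seat belt example, independence can be checked cell by cell; a Python sketch:

```python
# Check whether X1 and X2 in the seat belt table are independent by
# comparing each cell p(x1, x2) with the product of the marginals.
p = {(0, 0): 0.38, (1, 0): 0.17,
     (0, 1): 0.14, (1, 1): 0.02,
     (0, 2): 0.24, (1, 2): 0.05}

p1 = {x1: sum(v for (a, _), v in p.items() if a == x1) for x1 in (0, 1)}
p2 = {x2: sum(v for (_, b), v in p.items() if b == x2) for x2 in (0, 1, 2)}

independent = all(abs(p[(x1, x2)] - p1[x1] * p2[x2]) < 1e-9
                  for (x1, x2) in p)
print(independent)   # False: e.g. p(0,0)=0.38 but p1(0)*p2(0)=0.76*0.55=0.418
```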

Conditional Probability Density Function. Let X and Y be two continuous rv's with joint pdf f(x, y) and marginal pdf fX(x). Then for any x value for which fX(x) > 0, the conditional probability density function of Y given that X = x is

fY|X(y | x) = f(x, y) / fX(x), for -∞ < y < ∞.

If X and Y are discrete, replacing pdf's by pmf's in this definition gives the conditional probability mass function of Y when X = x.

Example. Let X and Y denote the proportions of two different chemicals in a sample mixture of chemicals used as an insecticide. Suppose X and Y have the joint probability density

f(x, y) = 2 for 0 ≤ x, 0 ≤ y, x + y ≤ 1, and f(x, y) = 0 otherwise.

(Note that X + Y must be at most unity, since the random variables denote proportions within the same sample.)
1) Find the marginal density functions for X and Y.
2) Are X and Y independent?
3) Find P(X > 1/2 | Y = 1/4).

1) f1(x) = ∫ from 0 to 1-x of 2 dy = 2(1 - x) for 0 ≤ x ≤ 1, and by symmetry f2(y) = 2(1 - y) for 0 ≤ y ≤ 1.
2) f1(x) f2(y) = 2(1 - x) · 2(1 - y) ≠ 2 = f(x, y) on the region 0 ≤ x ≤ 1 - y. Therefore X and Y are not independent.
3) f(x | y = 1/4) = f(x, 1/4) / f2(1/4) = 2 / (2 · 3/4) = 4/3 for 0 ≤ x ≤ 3/4, so
P(X > 1/2 | Y = 1/4) = ∫ from 1/2 to 3/4 of (4/3) dx = 1/3.
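A numerical sanity check of these answers, assuming the triangular density f(x, y) = 2 stated above (the band-around-y approximation of the conditional probability is a sketch, not an exact method):

```python
# Numerical check of the insecticide example: f(x, y) = 2 on the
# triangle 0 <= x, 0 <= y, x + y <= 1.
from scipy.integrate import quad, dblquad

f = lambda y, x: 2.0 if (x >= 0 and y >= 0 and x + y <= 1) else 0.0

# Marginal of X at a few points: should equal 2*(1 - x).
for x in (0.0, 0.25, 0.5):
    fx, _ = quad(lambda y: f(y, x), 0, 1)
    print(x, fx, 2 * (1 - x))

# P(X > 1/2 | Y = 1/4), approximated by conditioning on a narrow band
# of y values around 1/4; the exact answer is 1/3.
eps = 1e-3
num, _ = dblquad(f, 0.5, 1, lambda x: 0.25 - eps, lambda x: 0.25 + eps)
den, _ = dblquad(f, 0.0, 1, lambda x: 0.25 - eps, lambda x: 0.25 + eps)
print(num / den)   # approximately 0.333
```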

5.2 Expected Values, Covariance, and Correlation. Let X and Y be jointly distributed rv's with pmf p(x, y) or pdf f(x, y), according to whether the variables are discrete or continuous. Then the expected value of a function h(X, Y), denoted E[h(X, Y)] or μh(X,Y), is

E[h(X, Y)] = Σx Σy h(x, y) · p(x, y)   (discrete)
E[h(X, Y)] = ∫∫ h(x, y) · f(x, y) dx dy   (continuous).

Covariance. The covariance between two rv's X and Y is

Cov(X, Y) = E[(X - μX)(Y - μY)]
          = Σx Σy (x - μX)(y - μY) p(x, y)   (discrete)
          = ∫∫ (x - μX)(y - μY) f(x, y) dx dy   (continuous).

Short-cut Formula for Covariance:

Cov(X, Y) = E(XY) - μX · μY.
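Applying the short-cut formula to the seat belt table from earlier, and comparing it with the defining formula, as a Python sketch:

```python
# Covariance via the shortcut formula and via the definition,
# using the seat belt joint pmf.
p = {(0, 0): 0.38, (1, 0): 0.17,
     (0, 1): 0.14, (1, 1): 0.02,
     (0, 2): 0.24, (1, 2): 0.05}

mu1 = sum(x1 * prob for (x1, _), prob in p.items())          # E(X1)
mu2 = sum(x2 * prob for (_, x2), prob in p.items())          # E(X2)
e12 = sum(x1 * x2 * prob for (x1, x2), prob in p.items())    # E(X1*X2)

cov_shortcut = e12 - mu1 * mu2
cov_direct = sum((x1 - mu1) * (x2 - mu2) * prob
                 for (x1, x2), prob in p.items())
print(cov_shortcut, cov_direct)   # both about -0.0576
```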

Cov(X, Y) = 0 does not imply X and Y are independent!!
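A standard textbook counterexample: let X be uniform on {-1, 0, 1} and Y = X². Then Cov(X, Y) = 0 even though Y is completely determined by X. A short check using exact fractions:

```python
# Counterexample: X uniform on {-1, 0, 1}, Y = X**2.
from fractions import Fraction

support = [-1, 0, 1]
p = Fraction(1, 3)

EX  = sum(x * p for x in support)            # 0
EY  = sum(x**2 * p for x in support)         # 2/3
EXY = sum(x * x**2 * p for x in support)     # 0

print(EXY - EX * EY)     # Cov(X, Y) = 0
# Yet X and Y are clearly dependent: P(Y = 1 | X = 1) = 1, while P(Y = 1) = 2/3.
```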

Correlation. The correlation coefficient of X and Y is Corr(X, Y) = ρX,Y = Cov(X, Y) / (σX · σY).

Proposition. If a and c are either both positive or both negative, then
Corr(aX + b, cY + d) = Corr(X, Y).
For any two rv's X and Y, -1 ≤ Corr(X, Y) ≤ 1.

Proposition. If X and Y are independent, then ρ = 0, but ρ = 0 does not imply independence. Moreover, ρ = 1 or ρ = -1 if and only if Y = aX + b for some numbers a and b with a ≠ 0.
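Both propositions can be illustrated with simulated data; a sketch assuming NumPy (the coefficients and sample size are arbitrary choices):

```python
# Illustration of the correlation propositions with simulated data.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = rng.normal(size=100_000) + 0.5 * x        # correlated with x

corr = lambda a, b: np.corrcoef(a, b)[0, 1]

# Corr is unchanged by linear transforms aX + b, cY + d with a, c > 0.
print(corr(x, y), corr(3 * x + 1, 2 * y - 5))

# A perfect linear relation Y = aX + b (a != 0) gives correlation +/- 1.
print(corr(x, 2 * x + 3), corr(x, -2 * x + 3))   # about 1.0 and -1.0
```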

Example. The proportions X and Y of two chemicals found in samples of an insecticide have the joint probability density function f(x, y) = 2 for 0 ≤ x, 0 ≤ y, x + y ≤ 1 (and 0 otherwise), as before. The random variable Z = X + Y denotes the proportion of the insecticide due to both chemicals combined.
1) Find E(Z) and V(Z).
2) Find the correlation between X and Y and interpret its meaning.
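The required moments can be obtained by integrating over the triangle; a numerical sketch with SciPy (the helper E below is hypothetical, introduced only for this check):

```python
# E(Z), V(Z), and Corr(X, Y) for Z = X + Y under f(x, y) = 2 on the
# triangle x, y >= 0, x + y <= 1.
from scipy.integrate import dblquad
from math import sqrt

def E(h):
    # E[h(X, Y)]: double integral of h(x, y) * 2 over the triangle.
    return dblquad(lambda y, x: h(x, y) * 2.0, 0, 1,
                   lambda x: 0, lambda x: 1 - x)[0]

EX, EY = E(lambda x, y: x), E(lambda x, y: y)                # 1/3 each
VX = E(lambda x, y: x**2) - EX**2                            # 1/18
VY = E(lambda x, y: y**2) - EY**2                            # 1/18
cov = E(lambda x, y: x * y) - EX * EY                        # -1/36

EZ = EX + EY                                                 # 2/3
VZ = E(lambda x, y: (x + y)**2) - EZ**2                      # 1/18
rho = cov / sqrt(VX * VY)                                    # -1/2
print(EZ, VZ, rho)
```

The negative correlation makes sense: the two proportions compete for the same total, so a large value of one chemical forces a small value of the other.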

5.3 Statistics and Their Distributions. A statistic is any quantity whose value can be calculated from sample data. Prior to obtaining data, there is uncertainty as to what value of any particular statistic will result. A statistic is a random variable, denoted by an uppercase letter; a lowercase letter is used to represent the calculated or observed value of the statistic.

Random Samples. The rv's X1, …, Xn are said to form a (simple) random sample of size n if
1. the Xi's are independent rv's, and
2. every Xi has the same probability distribution.

Simulation Experiments. The following characteristics of a simulation experiment must be specified:
1. the statistic of interest;
2. the population distribution;
3. the sample size n;
4. the number of replications k.
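A minimal sketch of such an experiment in Python, with hypothetical choices for all four characteristics (statistic = sample mean, population = exponential with mean 1, n = 10, k = 1000):

```python
# Simulation experiment for the sampling distribution of the sample mean.
import numpy as np

rng = np.random.default_rng(42)

n = 10          # sample size
k = 1000        # number of replications
# Population distribution: exponential with mean 1 (mu = 1, sigma = 1).
samples = rng.exponential(scale=1.0, size=(k, n))

xbars = samples.mean(axis=1)       # one value of the statistic per replication

# Compare with the next proposition: E(Xbar) = mu, SD(Xbar) = sigma / sqrt(n).
print(xbars.mean(), xbars.std(ddof=1), 1 / np.sqrt(n))
```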

Using the Sample Mean. Let X1, …, Xn be a random sample from a distribution with mean value μ and standard deviation σ. Then
1. E(X̄) = μX̄ = μ
2. V(X̄) = σ²X̄ = σ²/n, so σX̄ = σ/√n.
In addition, with To = X1 + … + Xn (the sample total), E(To) = nμ, V(To) = nσ², and σTo = √n · σ.

Normal Population Distribution. Let X1, …, Xn be a random sample from a normal distribution with mean value μ and standard deviation σ. Then for any n, X̄ is normally distributed (with mean μ and standard deviation σ/√n), as is To (with mean nμ and standard deviation √n · σ).

The Central Limit Theorem. Let X1, …, Xn be a random sample from a distribution with mean value μ and variance σ². Then, if n is sufficiently large, X̄ has approximately a normal distribution with μX̄ = μ and σ²X̄ = σ²/n, and To also has approximately a normal distribution with μTo = nμ and σ²To = nσ². The larger the value of n, the better the approximation.
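A brief simulation sketch of the theorem (the exponential population, the sample sizes, and the replication count are illustrative choices, not from the slides):

```python
# CLT illustration: standardized sample means from a skewed (exponential)
# population behave like a standard normal once n is large.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma = 1.0, 1.0                 # exponential(scale=1) has mean 1, sd 1

for n in (2, 5, 30, 100):
    xbars = rng.exponential(scale=1.0, size=(20_000, n)).mean(axis=1)
    z = (xbars - mu) / (sigma / np.sqrt(n))      # standardize the sample means
    # P(Z <= 1) should approach the standard normal value ~0.8413 as n grows.
    print(n, np.mean(z <= 1.0), stats.norm.cdf(1.0))
```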

[Figure: the Central Limit Theorem. Sampling distributions of X̄ for small to moderate n and for large n, compared with the population distribution.]

Rule of Thumb. If n > 30, the Central Limit Theorem can be used.