Chapter 2 Multivariate Distributions Math 6203 Fall 2009 Instructor: Ayona Chatterjee.

Random Vector Consider a random experiment with sample space C, and two random variables X1 and X2 that assign to each element c of C one and only one ordered pair of numbers X1(c) = x1 and X2(c) = x2. Then (X1, X2) is called a random vector. The space of (X1, X2) is the set of ordered pairs D = {(x1, x2) : X1(c) = x1, X2(c) = x2 for some c in C}.

Cumulative Distribution Function The joint cumulative distribution function of (X1, X2) is denoted by F(x1, x2) and is given by F(x1, x2) = P[X1 ≤ x1, X2 ≤ x2]. A random vector (X1, X2) is discrete if its space D is finite or countable. A random vector (X1, X2) with space D is continuous if its cdf F(x1, x2) is continuous.

Probability Mass Function For discrete random variables X1 and X2, the joint pmf is defined as p(x1, x2) = P[X1 = x1, X2 = x2] for (x1, x2) in the space D.
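The definition can be checked numerically. A minimal sketch, using a hypothetical example not from the slides (X1 = the value of the first of two fair dice, X2 = their sum):

```python
# Hypothetical example: X1 = value of the first die, X2 = sum of two fair dice.
# Joint pmf p(x1, x2) = P[X1 = x1, X2 = x2]: the second die must show x2 - x1.
def joint_pmf(x1, x2):
    if 1 <= x1 <= 6 and 1 <= x2 - x1 <= 6:
        return 1 / 36
    return 0.0

# A pmf must sum to 1 over the whole space D.
total = sum(joint_pmf(a, b) for a in range(1, 7) for b in range(2, 13))
```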

Probability Density Function For a continuous random vector (X1, X2), the joint pdf f(x1, x2) is the function satisfying P[(X1, X2) ∈ A] = ∫∫_A f(x1, x2) dx1 dx2 for every event A in the space D.

Marginals The marginal distributions can be obtained from the joint distribution. For a discrete random vector the marginal pmf of X1 is p_X1(x1) = Σ_x2 p(x1, x2); for a continuous random vector the marginal pdf of X1 is f_X1(x1) = ∫ f(x1, x2) dx2, and similarly for X2.
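For the discrete case, summing the joint pmf over one coordinate yields the marginal of the other. A sketch on a small assumed joint pmf (hypothetical, not from the slides):

```python
# Assumed joint pmf on {0, 1} x {0, 1, 2}; the six probabilities sum to 1.
p = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
     (1, 0): 0.2, (1, 1): 0.2, (1, 2): 0.2}

# Marginal pmf of X1: sum the joint pmf over all values of x2 (and vice versa).
p_x1 = {a: sum(v for (x1, _), v in p.items() if x1 == a) for a in (0, 1)}
p_x2 = {b: sum(v for (_, x2), v in p.items() if x2 == b) for b in (0, 1, 2)}
```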

Expectation Let Y = g(X1, X2) for some function g. Suppose (X1, X2) is of the continuous type. Then E(Y) exists if ∫∫ |g(x1, x2)| f(x1, x2) dx1 dx2 < ∞, in which case E(Y) = ∫∫ g(x1, x2) f(x1, x2) dx1 dx2.

Theorem Let (X1, X2) be a random vector. Let Y1 = g1(X1, X2) and Y2 = g2(X1, X2) be random variables whose expectations exist. Then for any real numbers k1 and k2, E(k1Y1 + k2Y2) = k1E(Y1) + k2E(Y2).
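The linearity in this theorem can be verified numerically on any joint pmf. A sketch with an assumed pmf and the arbitrary (hypothetical) choices Y1 = X1 + X2, Y2 = X1·X2, k1 = 3, k2 = −2:

```python
# Assumed joint pmf (hypothetical example, not from the slides).
p = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
     (1, 0): 0.2, (1, 1): 0.2, (1, 2): 0.2}
k1, k2 = 3.0, -2.0

def expect(g):
    # E[g(X1, X2)] = sum of g(x1, x2) * p(x1, x2) over the support.
    return sum(g(x1, x2) * v for (x1, x2), v in p.items())

lhs = expect(lambda a, b: k1 * (a + b) + k2 * (a * b))
rhs = k1 * expect(lambda a, b: a + b) + k2 * expect(lambda a, b: a * b)
```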

Note

Moment Generating Function Let X = (X1, X2)' be a random vector. If E[e^(t1X1 + t2X2)] exists for |t1| < h1 and |t2| < h2, where h1 and h2 are positive, the mgf is given by M(t1, t2) = E[e^(t1X1 + t2X2)].

2.3 CONDITIONAL DISTRIBUTIONS AND EXPECTATIONS So far we know how to find the marginals given the joint distribution. Now we look at conditional distributions: the distribution of one of the random variables when the other takes a specific value.

Conditional pmf The conditional pmf of X2, given X1 = x1, is defined as p_2|1(x2|x1) = p(x1, x2) / p_X1(x1) for x2 in S_X2, where S_X2 is the support of X2. Here we assume p_X1(x1) > 0. Thus the conditional pmf is the joint pmf divided by the marginal.
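A quick sketch on an assumed joint pmf (hypothetical, not from the slides) showing that the conditional pmf is the joint pmf divided by the marginal, and that it sums to 1:

```python
# Assumed joint pmf on {0, 1} x {0, 1, 2}.
p = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
     (1, 0): 0.2, (1, 1): 0.2, (1, 2): 0.2}

def p_x1(a):
    # Marginal pmf of X1.
    return sum(v for (x1, _), v in p.items() if x1 == a)

def cond_pmf(x2, x1):
    # p(x2 | x1) = p(x1, x2) / p_X1(x1), assuming p_X1(x1) > 0.
    return p.get((x1, x2), 0.0) / p_x1(x1)

# A conditional pmf is itself a pmf: it sums to 1 over x2.
row_sum = sum(cond_pmf(b, 0) for b in (0, 1, 2))
```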

Conditional pdf Let f(x1, x2) be the joint pdf and f_X1(x1) and f_X2(x2) be the marginals for X1 and X2, respectively. Then the conditional pdf of X2, given X1 = x1, is f_2|1(x2|x1) = f(x1, x2) / f_X1(x1), provided f_X1(x1) > 0.

Note

Conditional Expectation and Variance The conditional expectation of X2, given X1 = x1, is E(X2|x1) = ∫ x2 f_2|1(x2|x1) dx2, and the conditional variance is Var(X2|x1) = E{[X2 − E(X2|x1)]² | x1} = E(X2²|x1) − [E(X2|x1)]².

Theorem Let (X1, X2) be a random vector such that the variance of X2 is finite. Then – E[E(X2|X1)] = E(X2) – Var[E(X2|X1)] ≤ Var(X2)
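Both parts can be checked numerically. A sketch on an assumed joint pmf (hypothetical, not from the slides):

```python
# Assumed joint pmf on {0, 1} x {0, 1, 2}.
p = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
     (1, 0): 0.2, (1, 1): 0.2, (1, 2): 0.2}
p_x1 = {a: sum(v for (x1, _), v in p.items() if x1 == a) for a in (0, 1)}

def cond_mean(a):
    # E(X2 | X1 = a): average x2 against the conditional pmf p(x2 | a).
    return sum(x2 * v for (x1, x2), v in p.items() if x1 == a) / p_x1[a]

e_x2 = sum(x2 * v for (_, x2), v in p.items())
tower = sum(cond_mean(a) * p_x1[a] for a in p_x1)             # E[E(X2|X1)]
var_x2 = sum((x2 - e_x2) ** 2 * v for (_, x2), v in p.items())
var_cm = sum((cond_mean(a) - tower) ** 2 * p_x1[a] for a in p_x1)
```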

2.4 The Correlation Coefficient Let (X, Y) have means µ1 and µ2 and positive variances σ1² and σ2². The covariance between X and Y is Cov(X,Y) = E[(X − µ1)(Y − µ2)], and ρ = Cov(X,Y)/(σ1σ2) is called the correlation coefficient of X and Y.

The Correlation Coefficient Note that −1 ≤ ρ ≤ 1. For the bivariate case – If ρ = 1, the graph of the line y = a + bx (b > 0) contains all the probability of the distribution of X and Y. – For ρ = −1, the same is true for a line y = a + bx with b < 0. – For the non-extreme cases, ρ can be viewed as a measure of how strongly the probability of X and Y concentrates about a line y = a + bx.
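A sketch computing ρ from the covariance and the two standard deviations, on an assumed joint pmf (hypothetical, not from the slides) with most mass on the diagonal, so ρ should be positive:

```python
# Assumed joint pmf of (X, Y) on {0, 1} x {0, 1}.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def expect(g):
    return sum(g(x, y) * v for (x, y), v in p.items())

mx, my = expect(lambda x, y: x), expect(lambda x, y: y)
cov = expect(lambda x, y: (x - mx) * (y - my))     # Cov(X, Y)
sx = expect(lambda x, y: (x - mx) ** 2) ** 0.5     # sigma_X
sy = expect(lambda x, y: (y - my) ** 2) ** 0.5     # sigma_Y
rho = cov / (sx * sy)
```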

Theorem Suppose (X, Y) have a joint distribution with the variances of X and Y finite and positive. Denote the means and variances of X and Y by µ1, µ2 and σ1², σ2², respectively, and let ρ be the correlation coefficient between X and Y. If E(Y|X) is linear in X, then E(Y|X) = µ2 + ρ(σ2/σ1)(X − µ1) and E[Var(Y|X)] = σ2²(1 − ρ²).
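A sketch with a hypothetical example (not from the slides): for X = the first of two fair dice and Y = their sum, E(Y|X) = X + 3.5 is linear in X, and the line µ2 + ρ(σ2/σ1)(x − µ1) reproduces the directly computed conditional mean:

```python
# X = first die, Y = sum of two fair dice; each of the 36 points has mass 1/36.
pts = [(x, x + d) for x in range(1, 7) for d in range(1, 7)]
w = 1 / 36

def expect(g):
    return sum(g(x, y) * w for x, y in pts)

m1, m2 = expect(lambda x, y: x), expect(lambda x, y: y)
s1 = expect(lambda x, y: (x - m1) ** 2) ** 0.5
s2 = expect(lambda x, y: (y - m2) ** 2) ** 0.5
rho = expect(lambda x, y: (x - m1) * (y - m2)) / (s1 * s2)

# Conditional mean E(Y | X = 3) computed directly, versus the theorem's line.
direct = sum(y * w for x, y in pts if x == 3) / sum(w for x, y in pts if x == 3)
line = m2 + rho * (s2 / s1) * (3 - m1)
```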

2.5 Independent Random Variables If the conditional pdf f_2|1(x2|x1) does not depend on x1, then the marginal pdf of X2 equals the conditional pdf f_2|1(x2|x1). Let the random variables X and Y have joint pdf f(x,y) and marginals f_X(x) and f_Y(y), respectively. The random variables X and Y are said to be independent if and only if – f(x,y) = f_X(x) f_Y(y) – A similar definition can be written for discrete random variables. – Random variables that are not independent are said to be dependent.

Theorem Let the random variables X and Y have supports S1 and S2, respectively, and have the joint pdf f(x,y). Then X and Y are independent if and only if f(x,y) can be written as a product of a nonnegative function of x and a nonnegative function of y; that is, f(x,y) = g(x)h(y), where g(x) > 0 for x in S1 and h(y) > 0 for y in S2.

Note In general, X and Y must be dependent if the space of positive probability density of X and Y is bounded by a curve that is neither a horizontal nor a vertical line. Example: f(x,y) = 8xy, 0 < x < y < 1 – S = {(x,y) : 0 < x < y < 1} This is not a product space, so X and Y are dependent.
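The dependence in this example can be seen numerically: the marginals work out by hand to f_X(x) = 4x(1 − x²) and f_Y(y) = 4y³, and their product does not reproduce the joint pdf. A sketch:

```python
# f(x, y) = 8xy on the triangle 0 < x < y < 1, zero elsewhere.
def f(x, y):
    return 8 * x * y if 0 < x < y < 1 else 0.0

# Riemann midpoint sum for the marginal f_X(0.5) = integral of f(0.5, y) dy;
# the hand-computed value is 4 * 0.5 * (1 - 0.25) = 1.5.
n = 20_000
fx_num = sum(f(0.5, (k + 0.5) / n) / n for k in range(n))
fx_exact = 4 * 0.5 * (1 - 0.5 ** 2)

# At (0.3, 0.6) the product of the marginals differs from the joint pdf.
prod = (4 * 0.3 * (1 - 0.3 ** 2)) * (4 * 0.6 ** 3)
joint = f(0.3, 0.6)
```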

Theorems Let (X, Y) have the joint cdf F(x,y), and let X and Y have the marginal cdfs F_X(x) and F_Y(y), respectively. Then X and Y are independent if and only if – F(x,y) = F_X(x)F_Y(y) The random variables X and Y are independent if and only if the following condition holds – P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b)P(c < Y ≤ d) – for every choice of constants a < b and c < d.

Theorems Suppose X and Y are independent and that E[u(X)] and E[v(Y)] exist. Then – E[u(X)v(Y)] = E[u(X)]E[v(Y)] Suppose the joint mgf M(t1, t2) exists for the random variables X and Y. Then X and Y are independent if and only if – M(t1, t2) = M(t1, 0)M(0, t2) That is, the joint mgf is the product of the marginal mgfs.
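The mgf factorization can be checked numerically for a pair that is independent by construction. A sketch with assumed Bernoulli marginals (a hypothetical example, not from the slides):

```python
from math import exp

# X ~ Bernoulli(0.5) and Y ~ Bernoulli(0.3), independent by construction,
# so the joint pmf is the product of the marginals.
px = {0: 0.5, 1: 0.5}
py = {0: 0.7, 1: 0.3}

def mgf(t1, t2):
    # Joint mgf M(t1, t2) = E[exp(t1*X + t2*Y)].
    return sum(exp(t1 * x + t2 * y) * px[x] * py[y] for x in px for y in py)

# Check M(t1, t2) = M(t1, 0) * M(0, t2) at an arbitrary point.
lhs = mgf(0.4, -0.2)
rhs = mgf(0.4, 0.0) * mgf(0.0, -0.2)
```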

Note If X and Y are independent, then the correlation coefficient is zero. However, a zero correlation coefficient does not imply independence.
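The classic counterexample (not in the slides): X uniform on {−1, 0, 1} and Y = X² have zero covariance, hence zero correlation, yet are clearly dependent. A sketch:

```python
# X takes -1, 0, 1 with probability 1/3 each; Y = X**2.
support = [-1, 0, 1]
w = 1 / 3

ex = sum(x * w for x in support)              # E(X) = 0
ey = sum(x ** 2 * w for x in support)         # E(Y) = 2/3
exy = sum(x * x ** 2 * w for x in support)    # E(XY) = E(X**3) = 0
cov = exy - ex * ey                           # zero covariance, so rho = 0

# Yet X and Y are dependent: P[X = 1, Y = 0] = 0 while P[X = 1]P[Y = 0] = 1/9.
p_joint = 0.0
p_prod = w * w
```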