UNIT-2 Multiple Random Variables


UNIT-2 Multiple Random Variables
2.1 Vector Random Variables
2.2 Pairs of Random Variables
2.3 Independence of Two Random Variables
2.4 Conditional Probability and Conditional Expectation
2.5 Multiple Random Variables
2.6 Functions of Several Random Variables
2.7 Expected Value of Functions of Random Variables
2.8 Jointly Gaussian Random Variables

2.1 Vector Random Variables A vector random variable X is a function that assigns a vector of real numbers to each outcome ζ in S, the sample space of the random experiment. EXAMPLE 4.1 Let a random experiment consist of selecting a student's name from an urn. Let ζ denote the outcome of this experiment, and define the following three functions:

Events and Probabilities EXAMPLE 2.4 Consider the two-dimensional random variable X = (X, Y). Find the region of the plane corresponding to the events. The regions corresponding to events A and C are straightforward to find and are shown in Fig. 4.1.

For the n-dimensional random variable X = (X1,…, Xn), we are particularly interested in events that have the product form {X1 ∈ A1} ∩ {X2 ∈ A2} ∩ … ∩ {Xn ∈ An}, where Ak is a one-dimensional event (i.e., a subset of the real line) that involves Xk only. A fundamental problem in modeling a system with a vector random variable X = (X1,…, Xn) involves specifying the probability of product-form events: P[X1 ∈ A1, X2 ∈ A2, …, Xn ∈ An]. In principle, the probability in Eq. (4.2) is obtained by finding the probability of the equivalent event in the underlying sample space.

EXAMPLE 4.5 None of the events in Example 4.4 are of product form. Event B is the union of two product-form events :

The probability of a non-product-form event B is found as follows: first, B is approximated by the union of disjoint product-form events, say, B1, B2,…, Bn; the probability of B is then approximated by the sum of the probabilities of the Bk's. The approximation becomes exact in the limit as the Bk's become arbitrarily fine. Independence The one-dimensional random variables X and Y are "independent" if, for any event A1 that involves X only and any event A2 that involves Y only, P[X ∈ A1, Y ∈ A2] = P[X ∈ A1] P[Y ∈ A2].

In the general case of n random variables, we say that the random variables X1, X2,…, Xn are independent if P[X1 ∈ A1, X2 ∈ A2, …, Xn ∈ An] = P[X1 ∈ A1] P[X2 ∈ A2] ⋯ P[Xn ∈ An], where Ak is an event that involves Xk only.

2.2 PAIRS OF RANDOM VARIABLES Pairs of Discrete Random Variables Let the vector random variable X = (X, Y) assume values from some countable set. The joint probability mass function of X specifies the probabilities of the product-form events {X = xj} ∩ {Y = yk}: pX,Y(xj, yk) = P[X = xj, Y = yk]. The probability of any event A is the sum of the pmf over the outcomes in A.

The fact that the probability of the sample space S is 1 gives Σj Σk pX,Y(xj, yk) = 1. The marginal probability mass functions are obtained by summing over the other variable: pX(xj) = Σk pX,Y(xj, yk), and similarly pY(yk) = Σj pX,Y(xj, yk).

EXAMPLE 2.7 The number of bytes N in a message has a geometric distribution with parameter 1 − p and range SN = {0, 1, 2, …}. Suppose that messages are broken into packets of maximum length M bytes. Let Q be the number of full packets in a message and let R be the number of bytes left over. Find the joint pmf and the marginal pmf's of Q and R. Here SQ = {0, 1, 2, …} and SR = {0, 1, 2, …, M − 1}. The probability of the elementary event {(q, r)} is given by pQ,R(q, r) = P[N = qM + r] = (1 − p) p^(qM + r). The marginal pmf of Q is pQ(q) = Σ(r = 0 to M − 1) (1 − p) p^(qM + r) = p^(qM) (1 − p^M), q = 0, 1, 2, …

The marginal pmf of R is pR(r) = Σ(q = 0 to ∞) (1 − p) p^(qM + r) = (1 − p) p^r / (1 − p^M), r = 0, 1, …, M − 1.
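The formulas above can be spot-checked numerically. The sketch below is mine (the values p = 0.9 and M = 4 are arbitrary illustrative choices, not part of the example); it builds the pmf of N, derives the joint behavior of (Q, R), and compares the summed marginals against the closed forms.

```python
import numpy as np

p, M = 0.9, 4            # assumed illustrative values, not from the example
N_MAX = 5000             # truncate the geometric range for the numerical check

n = np.arange(N_MAX)
pmf_N = (1 - p) * p**n   # P[N = n] = (1 - p) p^n
q, r = n // M, n % M     # Q = number of full packets, R = leftover bytes

# Marginal of Q: p_Q(q) = p^(qM) (1 - p^M)
q_vals = np.arange(20)
pQ_formula = p**(q_vals * M) * (1 - p**M)
pQ_numeric = np.array([pmf_N[q == qq].sum() for qq in q_vals])
print(np.allclose(pQ_numeric, pQ_formula))               # True

# Marginal of R: p_R(r) = (1 - p) p^r / (1 - p^M)
r_vals = np.arange(M)
pR_formula = (1 - p) * p**r_vals / (1 - p**M)
pR_numeric = np.array([pmf_N[r == rr].sum() for rr in r_vals])
print(np.allclose(pR_numeric, pR_formula, atol=1e-6))    # True up to truncation
```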

The Joint cdf of X and Y The joint cumulative distribution function of X and Y is defined as the probability of the product-form event {X ≤ x} ∩ {Y ≤ y}: FX,Y(x, y) = P[X ≤ x, Y ≤ y]. The joint cdf is nondecreasing in the "northeast" direction. It is impossible for either X or Y to assume a value less than −∞, therefore FX,Y(−∞, y) = FX,Y(x, −∞) = 0. It is certain that X and Y will assume values less than infinity, therefore FX,Y(∞, ∞) = 1.

If we let one of the variables approach infinity while keeping the other fixed, we obtain the marginal cumulative distribution functions FX(x) = FX,Y(x, ∞) and FY(y) = FX,Y(∞, y). Recall that the cdf for a single random variable is continuous from the right. It can be shown that the joint cdf is continuous from the "north" and from the "east": lim(h→0+) FX,Y(x, y + h) = FX,Y(x, y) and lim(h→0+) FX,Y(x + h, y) = FX,Y(x, y).

EXAMPLE 2.8 The joint cdf for the vector random variable X = (X, Y) is given. Find the marginal cdf's. The marginal cdf's are obtained by letting one of the variables approach infinity:

The cdf can be used to find the probability of events that can be expressed as the union and intersection of semi-infinite rectangles. Consider the strip {x1 < X ≤ x2, Y ≤ y}, denoted by the region B in Fig. 4.6(a). By the third axiom of probability we have that FX,Y(x2, y) = FX,Y(x1, y) + P[x1 < X ≤ x2, Y ≤ y]. The probability of the semi-infinite strip is therefore P[x1 < X ≤ x2, Y ≤ y] = FX,Y(x2, y) − FX,Y(x1, y). Consider next the rectangle {x1 < X ≤ x2, y1 < Y ≤ y2}, denoted by the region A in Fig. 4.6(b).

The probability of the rectangle is thus P[x1 < X ≤ x2, y1 < Y ≤ y2] = FX,Y(x2, y2) − FX,Y(x2, y1) − FX,Y(x1, y2) + FX,Y(x1, y1). EXAMPLE 2.9 Find the probability of the events A, B, and D (where x > 0 and y > 0) in Example 2.8. The probability of A is given directly by the cdf. The probability of B requires more work; consider the complement Bc.

Using the formula for the probability of a union of two events, P[A1 ∪ A2] = P[A1] + P[A2] − P[A1 ∩ A2], the probability of B follows. The probability of event D is found by applying property vi of the joint cdf, i.e., the rectangle formula above.

The Joint pdf of Two Jointly Continuous Random Variables We say that the random variables X and Y are jointly continuous if the probabilities of events involving (X, Y) can be expressed as an integral of a pdf. There is a nonnegative function fX,Y(x, y), called the joint probability density function, that is defined on the real plane such that for every event A, a subset of the plane, P[X ∈ A] = ∫∫ over A of fX,Y(x′, y′) dx′ dy′, as shown in Fig. 4.7. When A is the entire plane, the integral must equal one: ∫∫ fX,Y(x′, y′) dx′ dy′ = 1. The joint cdf can be obtained in terms of the joint pdf of jointly continuous random variables by integrating over the semi-infinite rectangle defined by (x, y): FX,Y(x, y) = ∫(−∞ to x) ∫(−∞ to y) fX,Y(x′, y′) dy′ dx′. It then follows that if X and Y are jointly continuous random variables, then the pdf can be obtained from the cdf by differentiation: fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y. The probability of a rectangular region follows from Eq. (4.9): P[x1 < X ≤ x2, y1 < Y ≤ y2] = ∫(x1 to x2) ∫(y1 to y2) fX,Y(x′, y′) dy′ dx′.

The marginal pdf’s fX(x) and fY(y) are obtained by taking the derivative of the corresponding marginal cdf’s , and Similarly,

EXAMPLE 2.10 Jointly Uniform Random Variables A randomly selected point (X, Y) in the unit square has the uniform joint pdf fX,Y(x, y) = 1 for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 elsewhere. Find the joint cdf. There are five cases in this problem, corresponding to the five regions shown in Fig. 4.9.
1. If x < 0 or y < 0, the pdf is zero and Eq. (4.12) implies FX,Y(x, y) = 0.
2. If (x, y) is inside the unit square, FX,Y(x, y) = ∫(0 to x) ∫(0 to y) 1 dy′ dx′ = xy.
3. If 0 ≤ x ≤ 1 and y > 1, FX,Y(x, y) = x.
4. Similarly, if x > 1 and 0 ≤ y ≤ 1, FX,Y(x, y) = y.
5. Finally, if x > 1 and y > 1, FX,Y(x, y) = 1.

EXAMPLE 2.11 Find the normalization constant c and the marginal pdf's for the following joint pdf: The constant c is found from the normalization condition specified by Eq. (4.10). Therefore c = 2. The marginal pdf's are found by evaluating Eq. (4.15a) and (4.15b).

EXAMPLE 2.13 Jointly Gaussian Random Variables The joint pdf of X and Y is shown in Fig. 4.11; we say that X and Y are jointly Gaussian. Find the marginal pdf's. The marginal pdf of X is found by integrating fX,Y(x, y) over y. We complete the square of the argument of the exponent by adding and subtracting ρ²x²; that is, x² − 2ρxy + y² = (y − ρx)² + (1 − ρ²)x².
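For reference, the completing-the-square step can be written out explicitly. The block below assumes the zero-mean, unit-variance form of the jointly Gaussian pdf with correlation coefficient ρ (this normalization is consistent with how the example is reused later, e.g. in Example 4.32, but it is an assumption since the original formula is not reproduced above).

```latex
\[
f_{X,Y}(x,y)=\frac{1}{2\pi\sqrt{1-\rho^{2}}}
  \exp\!\Big\{-\frac{x^{2}-2\rho xy+y^{2}}{2(1-\rho^{2})}\Big\},
\qquad
x^{2}-2\rho xy+y^{2}=(y-\rho x)^{2}+(1-\rho^{2})x^{2}.
\]
\[
f_X(x)=\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy
      =\frac{e^{-x^{2}/2}}{\sqrt{2\pi}}
       \int_{-\infty}^{\infty}
       \frac{e^{-(y-\rho x)^{2}/(2(1-\rho^{2}))}}{\sqrt{2\pi(1-\rho^{2})}}\,dy
      =\frac{e^{-x^{2}/2}}{\sqrt{2\pi}}.
\]
```

The remaining integral is that of a Gaussian pdf in y and equals one, so the marginal pdf of X (and, by symmetry, of Y) is a standard Gaussian.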

Random Variables That Differ in Type EXAMPLE 2.14 A Communication Channel with Discrete Input and Continuous Output Let X be the input, Y the output, and N the noise of a communication channel. Find probabilities of the form P[X = +1, Y ≤ y] = P[Y ≤ y | X = +1] P[X = +1], where P[X = +1] = 1/2. When the input X = 1, the output Y is uniformly distributed in the interval [−1, 3]; therefore P[Y ≤ y | X = +1] = (y + 1)/4 for −1 ≤ y ≤ 3.

2.3 INDEPENDENCE OF TWO RANDOM VARIABLES X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y; that is, P[X ∈ A1, Y ∈ A2] = P[X ∈ A1] P[Y ∈ A2]. Suppose that X and Y are a pair of discrete random variables. If we let A1 = {X = xj} and A2 = {Y = yk}, then the independence of X and Y implies that pX,Y(xj, yk) = P[X = xj, Y = yk] = P[X = xj] P[Y = yk] = pX(xj) pY(yk). Therefore, if X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf's.

Let A = A1 × A2 be a product-form event as above; then P[A] = P[X ∈ A1] P[Y ∈ A2]. We say that the discrete random variables X and Y are independent if and only if the joint pmf is equal to the product of the marginal pmf's for all xj, yk.
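A small numerical illustration of this criterion (the joint pmf table below is made up for the sketch; it is not an example from the text):

```python
import numpy as np

# Joint pmf stored as a matrix: P[j, k] = P[X = xj, Y = yk]
P = np.array([[0.10, 0.20, 0.10],
              [0.15, 0.30, 0.15]])
assert np.isclose(P.sum(), 1.0)

pX = P.sum(axis=1)    # marginal pmf of X
pY = P.sum(axis=0)    # marginal pmf of Y

# X and Y are independent iff P[j, k] = pX[j] * pY[k] for every (j, k)
print(np.allclose(P, np.outer(pX, pY)))   # True for this particular table
```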

EXAMPLE 4.16 Are Q and R in Example 2.7 independent? From Example 2.7 we have pQ,R(q, r) = (1 − p) p^(qM + r) = [p^(qM)(1 − p^M)] × [(1 − p) p^r / (1 − p^M)] = pQ(q) pR(r). Therefore Q and R are independent.

It can be shown that the random variables X and Y are independent if and only if their joint cdf is equal to the product of their marginal cdf's: FX,Y(x, y) = FX(x) FY(y) for all x and y. Similarly, if X and Y are jointly continuous, then X and Y are independent if and only if their joint pdf is equal to the product of the marginal pdf's: fX,Y(x, y) = fX(x) fY(y). EXAMPLE 4.18 Are the random variables X and Y in Example 2.13 independent? The product of the marginal pdf's of X and Y in Example 2.13 equals the joint pdf only when ρ = 0; the jointly Gaussian r.v's X and Y are independent if and only if ρ = 0.

EXAMPLE 4.19 Are the random variables X and Y independent in Example 2.8? If we multiply the marginal cdf's found in Example 2.8, we recover the joint cdf, so X and Y are independent. If X and Y are independent random variables, then the random variables defined by any pair of functions g(X) and h(Y) are also independent. 1. Consider the one-dimensional events A and B. 2. Let A′ be the set of all values of x such that if x is in A′ then g(x) is in A.

3. Similarly, let B′ be the set of all values of y such that if y is in B′ then h(y) is in B. Then P[g(X) ∈ A, h(Y) ∈ B] = P[X ∈ A′, Y ∈ B′] = P[X ∈ A′] P[Y ∈ B′] = P[g(X) ∈ A] P[h(Y) ∈ B].

2.4 CONDITIONAL PROBABILITY AND CONDITIONAL EXPECTATION Conditional Probability From the definition of conditional probability, we know that P[A | B] = P[A ∩ B] / P[B] for P[B] > 0. If X is discrete, then Eq. (4.22) can be used to obtain the conditional cdf of Y given X = xk: FY(y | xk) = P[Y ≤ y, X = xk] / P[X = xk], for P[X = xk] > 0. The conditional pdf of Y given X = xk, if the derivative exists, is given by fY(y | xk) = d/dy FY(y | xk).

Integrating the conditional pdf gives FY(y | xk) = ∫(−∞ to y) fY(y′ | xk) dy′. Note that if X and Y are independent, then FY(y | xk) = FY(y) and fY(y | xk) = fY(y). If X and Y are discrete, then the conditional pmf of Y given X = xk is pY(yj | xk) = pX,Y(xk, yj) / pX(xk) for xk such that pX(xk) > 0; we define pY(yj | xk) = 0 for xk such that pX(xk) = 0. The probability of any event A given X = xk is found by summing the conditional pmf over the outcomes in A.

Note that if X and Y are independent, then pY(yj | xk) = pY(yj). EXAMPLE 4.20 Let X be the input and Y the output of the communication channel discussed in Example 2.14. Find the probability that Y is negative given that X is +1. If X = +1, then Y is uniformly distributed in the interval [−1, 3], that is, fY(y | +1) = 1/4 for −1 ≤ y ≤ 3 and 0 elsewhere.

Thus P[Y < 0 | X = +1] = ∫(−1 to 0) (1/4) dy = 1/4. If X is a continuous random variable, then P[X = x] = 0, so Eq. (4.22) is undefined. We define the conditional cdf of Y given X = x by the following limiting procedure: FY(y | x) = lim(h→0) FY(y | x < X ≤ x + h). The conditional cdf on the right side of Eq. (4.28) is FY(y | x < X ≤ x + h) = P[Y ≤ y, x < X ≤ x + h] / P[x < X ≤ x + h].

As we let h approach zero, FY(y | x) = ∫(−∞ to y) fX,Y(x, y′) dy′ / fX(x). The conditional pdf of Y given X = x is obtained by differentiating with respect to y: fY(y | x) = fX,Y(x, y) / fX(x). Note that if X and Y are independent, then fY(y | x) = fY(y).

EXAMPLE 2.21 Let X and Y be the random variables introduced in Example 2.11. Find the conditional pdf's fX(x | y) and fY(y | x), using the marginal pdf's found there.

If we multiply Eq. (4.26) by P[X = xk], then pX,Y(xk, yj) = pY(yj | xk) P[X = xk]. Suppose we are interested in the probability that Y is in A: by the theorem on total probability, P[Y ∈ A] = Σk P[Y ∈ A | X = xk] P[X = xk].

If X and Y are continuous, we multiply Eq. (4.31) by fX(x) and replace summations with integrals and pmf's with pdf's: P[Y ∈ A] = ∫ P[Y ∈ A | X = x] fX(x) dx. EXAMPLE 2.22 Number of Defects in a Region; Random Splitting of Poisson Counts The total number of defects X on a chip is a Poisson random variable with mean α. Suppose that each defect has a probability p of falling in a specific region R and that the location of each defect is independent of the locations of all other defects. Find the pmf of the number of defects Y that fall in the region R. From Eq. (4.33), P[Y = j] = Σk P[Y = j | X = k] P[X = k].

Given that the total number of defects is X = k, the number of defects that fall in the region R is a binomial random variable with parameters k and p: P[Y = j | X = k] = C(k, j) p^j (1 − p)^(k − j) for 0 ≤ j ≤ k. Substituting into the total-probability sum and noting that the resulting series is the power series of the exponential function, we find that Y is a Poisson r.v with mean αp.
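The omitted algebra is the standard Poisson-splitting calculation; written out, it fills in the step the slide refers to with "noting that":

```latex
\[
P[Y=j]=\sum_{k=j}^{\infty}\binom{k}{j}p^{j}(1-p)^{k-j}\,
        \frac{\alpha^{k}}{k!}e^{-\alpha}
      =\frac{(\alpha p)^{j}}{j!}e^{-\alpha}
        \sum_{m=0}^{\infty}\frac{\big(\alpha(1-p)\big)^{m}}{m!}
      =\frac{(\alpha p)^{j}}{j!}e^{-\alpha}e^{\alpha(1-p)}
      =\frac{(\alpha p)^{j}}{j!}e^{-\alpha p}.
\]
```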

EXAMPLE 2.23 Number of Arrivals During a Customer’s Service Time The number of customers that arrive at a service station during a time t is a Poisson random variable with parameter βt. The time required to service each customer is an exponential random variable with parameter α. Find the pmf for the number of customers N that arrive during the service time T of a specific customer. Assume that the customer arrivals are independent of the customer service time.

Let r = (α + β)t; then P[N = k] = ∫(0 to ∞) [(βt)^k e^(−βt) / k!] α e^(−αt) dt = [α β^k / (α + β)^(k+1)] (1/k!) ∫(0 to ∞) r^k e^(−r) dr = [α / (α + β)] [β / (α + β)]^k, k = 0, 1, 2, …, where we have used the fact that the last integral is a gamma function and is equal to k!. Conditional Expectation The conditional expectation of Y given X = x is defined by E[Y | X = x] = ∫(−∞ to ∞) y fY(y | x) dy. If X and Y are both discrete random variables, we have E[Y | X = xk] = Σj yj pY(yj | xk).

We now show that E[Y] = E[E[Y | X]], where the right-hand side is E[E[Y | X]] = ∫(−∞ to ∞) E[Y | X = x] fX(x) dx if X is continuous, and E[E[Y | X]] = Σk E[Y | X = xk] pX(xk) if X is discrete. We prove Eq. (4.37) for the case where X and Y are jointly continuous random variables: E[E[Y | X]] = ∫ { ∫ y fY(y | x) dy } fX(x) dx = ∫∫ y fX,Y(x, y) dy dx = E[Y].

The above result also holds for the expected value of a function of Y: E[h(Y)] = E[E[h(Y) | X]]. In particular, the kth moment of Y is given by E[Y^k] = E[E[Y^k | X]]. EXAMPLE 4.25 Average Number of Defects in a Region Find the mean of Y in Example 2.22 using conditional expectation. Since Y given X = k is binomial with parameters k and p, E[Y | X] = pX, and therefore E[Y] = E[E[Y | X]] = p E[X] = pα.

EXAMPLE 2.26 Average Number of Arrivals in a Service Time Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer in Example 2.23. We will need the first two conditional moments of N given T = t: E[N | T = t] = βt and E[N² | T = t] = βt + (βt)². The first two moments of N are E[N] = E[E[N | T]] = β E[T] and E[N²] = E[E[N² | T]] = β E[T] + β² E[T²].

The variance of N is then VAR[N] = E[N²] − (E[N])² = β E[T] + β² E[T²] − β² (E[T])² = β E[T] + β² VAR[T]. If T is exponential with parameter α, then E[T] = 1/α and VAR[T] = 1/α², so VAR[N] = β/α + β²/α².

2.5 MULTIPLE RANDOM VARIABLES Joint Distributions The joint cumulative distribution function of X1, X2,…, Xn is defined as the probability of the n-dimensional semi-infinite rectangle associated with the point (x1,…, xn): FX1,…,Xn(x1,…, xn) = P[X1 ≤ x1, X2 ≤ x2, …, Xn ≤ xn]. The joint cdf is defined for discrete, continuous, and random variables of mixed type.

EXAMPLE 4.27 Let the event A be defined as A = {max(X1, X2, X3) ≤ 5}. Find the probability of A. The maximum of three numbers is at most 5 if and only if each of the three numbers is at most 5; therefore P[A] = P[X1 ≤ 5, X2 ≤ 5, X3 ≤ 5] = FX1,X2,X3(5, 5, 5).

The joint probability mass function of n discrete random variables is defined by pX1,…,Xn(x1,…, xn) = P[X1 = x1, X2 = x2, …, Xn = xn]. The probability of any n-dimensional event A is found by summing the pmf over the points in the event. The one-dimensional pmf of Xj is found by adding the joint pmf over all variables other than xj. The marginal pmf for X1,…, Xn−1 is given by pX1,…,Xn−1(x1,…, xn−1) = Σ(xn) pX1,…,Xn(x1,…, xn).

A family of conditional pmf's is obtained from the joint pmf by conditioning on different subsets of the random variables; for example, pXn(xn | x1,…, xn−1) = pX1,…,Xn(x1,…, xn) / pX1,…,Xn−1(x1,…, xn−1) if pX1,…,Xn−1(x1,…, xn−1) > 0. Repeated applications of Eq. (4.43a) yield pX1,…,Xn(x1,…, xn) = pXn(xn | x1,…, xn−1) pXn−1(xn−1 | x1,…, xn−2) ⋯ pX2(x2 | x1) pX1(x1).

EXAMPLE 4.28 A computer system receives messages over three communication lines. Let Xj be the number of messages received on line j in one hour. Suppose that the joint pmf of X1, X2, and X3 is as given. Find p(x1, x2) and p(x1), given that 0 < ai < 1.

If the r.v's X1, X2,…, Xn are jointly continuous random variables, then the probability of any n-dimensional event A is P[X ∈ A] = ∫…∫ over A of fX1,…,Xn(x1,…, xn) dx1 ⋯ dxn, where fX1,…,Xn(x1,…, xn) is the joint probability density function. The joint cdf of X is obtained from the joint pdf by integration: FX1,…,Xn(x1,…, xn) = ∫(−∞ to x1) ⋯ ∫(−∞ to xn) fX1,…,Xn(x1′,…, xn′) dx1′ ⋯ dxn′. The joint pdf (if the derivative exists) is given by fX1,…,Xn(x1,…, xn) = ∂ⁿ FX1,…,Xn(x1,…, xn) / ∂x1 ⋯ ∂xn. The marginal pdf for a subset of the random variables is obtained by integrating the other variables out. The marginal of X1 is fX1(x1) = ∫ ⋯ ∫ fX1,…,Xn(x1, x2′,…, xn′) dx2′ ⋯ dxn′.

The marginal pdf for X1,…, Xn−1 is given by fX1,…,Xn−1(x1,…, xn−1) = ∫ fX1,…,Xn(x1,…, xn−1, xn′) dxn′. The pdf of Xn given the values of X1,…, Xn−1 is given by fXn(xn | x1,…, xn−1) = fX1,…,Xn(x1,…, xn) / fX1,…,Xn−1(x1,…, xn−1) if fX1,…,Xn−1(x1,…, xn−1) > 0. Repeated applications of Eq. (4.49a) yield fX1,…,Xn(x1,…, xn) = fXn(xn | x1,…, xn−1) fXn−1(xn−1 | x1,…, xn−2) ⋯ fX2(x2 | x1) fX1(x1).

EXAMPLE 4.29 The r.v's X1, X2, and X3 have a joint Gaussian pdf. Find the marginal pdf of X1 and X3. The above integral was carried out in Example 2.13.

Independence X1,…, Xn are independent if P[X1 ∈ A1, X2 ∈ A2, …, Xn ∈ An] = P[X1 ∈ A1] P[X2 ∈ A2] ⋯ P[Xn ∈ An] for any one-dimensional events A1,…, An. It can be shown that X1,…, Xn are independent if and only if FX1,…,Xn(x1,…, xn) = FX1(x1) ⋯ FXn(xn) for all x1,…, xn. If the random variables are discrete, this is equivalent to pX1,…,Xn(x1,…, xn) = pX1(x1) ⋯ pXn(xn); if the random variables are jointly continuous, it is equivalent to fX1,…,Xn(x1,…, xn) = fX1(x1) ⋯ fXn(xn).

2.6 FUNCTIONS OF SEVERAL RANDOM VARIABLES One Function of Several Random Variables Let the random variable Z be defined as a function of several random variables: Z = g(X1, X2,…, Xn). The cdf of Z is found by first finding the equivalent event of {Z ≤ z}, that is, the set Rz = {x = (x1,…, xn) : g(x) ≤ z}; then FZ(z) = P[X ∈ Rz].

EXAMPLE 4.31 Sum of Two Random Variables Let Z = X + Y. Find FZ(z) and fZ(z) in terms of the joint pdf of X and Y. The cdf of Z is FZ(z) = P[X + Y ≤ z] = ∫(−∞ to ∞) ∫(−∞ to z − x′) fX,Y(x′, y′) dy′ dx′. The pdf of Z is fZ(z) = d/dz FZ(z) = ∫(−∞ to ∞) fX,Y(x′, z − x′) dx′. Thus the pdf for the sum of two random variables is given by a superposition integral. If X and Y are independent random variables, then by Eq. (4.21) the pdf is given by the convolution integral of the marginal pdf's of X and Y: fZ(z) = ∫(−∞ to ∞) fX(x′) fY(z − x′) dx′.
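The convolution result can be checked numerically. The sketch below uses two independent Exp(1) random variables as a test case (my choice of distributions, not the text's); the sum of two independent Exp(1) variables has the known pdf z e^(−z), z ≥ 0.

```python
import numpy as np

dz = 0.001
z = np.arange(0.0, 20.0, dz)
fX = np.exp(-z)                 # Exp(1) pdf on z >= 0
fY = np.exp(-z)

# f_Z(z) = integral of f_X(x) f_Y(z - x) dx, approximated by a discrete convolution
fZ = np.convolve(fX, fY)[:len(z)] * dz

# Compare with the closed form z * exp(-z) for the sum of two Exp(1) variables
print(np.max(np.abs(fZ - z * np.exp(-z))))   # small discretization error
```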

EXAMPLE 4.32 Sum of Nonindependent Gaussian Random Variables Find the pdf of the sum Z = X + Y of two zero-mean, unit-variance Gaussian random variables with correlation coefficient ρ = −1/2. After completing the square of the argument in the exponent we obtain a Gaussian pdf with mean zero and variance VAR[Z] = 2 + 2ρ = 1; that is, fZ(z) = (1/√(2π)) e^(−z²/2).

Let Z = g(X, Y), and suppose we are given that Y = y; then Z = g(X, y) is a function of the single random variable X, and we can find the pdf of Z given Y = y, fZ(z | Y = y), using the methods for one random variable. The pdf of Z is then found from fZ(z) = ∫(−∞ to ∞) fZ(z | Y = y) fY(y) dy.

EXAMPLE 4.34 Let Z = X / Y. Find the pdf of Z if X and Y are independent and both exponentially distributed with mean one. Assume Y = y; then Z = X / y is a scaled version of X, so fZ(z | y) = y fX(yz) = y e^(−yz) for z ≥ 0. The pdf of Z is fZ(z) = ∫(0 to ∞) y e^(−yz) e^(−y) dy = 1 / (1 + z)², z ≥ 0.
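A quick Monte Carlo check of this result (a sketch; the sample size and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(1.0, size=1_000_000)
y = rng.exponential(1.0, size=1_000_000)
z = x / y

# Empirical cdf of Z versus the analytical cdf F_Z(z) = 1 - 1/(1 + z)
for z0 in (0.5, 1.0, 2.0, 5.0):
    print(z0, (z <= z0).mean(), 1 - 1 / (1 + z0))
```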

Transformations of Random Vectors Let X1,…, Xn be random variables associated with some experiment, and let the random variables Z1,…, Zn be defined by n functions of X = (X1,…, Xn): Zk = gk(X), k = 1,…, n. The joint cdf of Z1,…, Zn at the point z = (z1,…, zn) is equal to the probability of the region of x where gk(x) ≤ zk for k = 1,…, n: FZ1,…,Zn(z1,…, zn) = P[g1(X) ≤ z1, …, gn(X) ≤ zn].

If X1,…, Xn have a joint pdf, then this probability is obtained by integrating the joint pdf over that region. EXAMPLE 4.35 Let the random variables W and Z be defined by W = min(X, Y) and Z = max(X, Y). Find the joint cdf of W and Z in terms of the joint cdf of X and Y. If z > w, the above probability is the probability of the semi-infinite rectangle defined by the point (z, z) minus the square region denoted by A: FW,Z(w, z) = FX,Y(z, z) − P[w < X ≤ z, w < Y ≤ z].

If z < w, then every outcome with Z ≤ z automatically has W ≤ Z ≤ z < w, so FW,Z(w, z) = P[Z ≤ z] = FX,Y(z, z).

pdf of Linear Transformations We consider first the linear transformation of two random variables: V = aX + bY and W = cX + dY, or, in matrix form, (V, W)ᵀ = A (X, Y)ᵀ. Denote the above matrix by A. We will assume A has an inverse, so each point (v, w) has a unique corresponding point (x, y) obtained from (x, y)ᵀ = A⁻¹ (v, w)ᵀ. In Fig. 4.15, the infinitesimal rectangle and the parallelogram are equivalent events, so their probabilities must be equal. Thus fX,Y(x, y) dx dy = fV,W(v, w) dP,

where dP is the area of the parallelogram. The joint pdf of V and W is thus given by fV,W(v, w) = fX,Y(x, y) dx dy / dP, where x and y are related to (v, w) by Eq. (4.56). It can be shown that dP = |A| dx dy, so the "stretch factor" is |A|, where |A| is the determinant of A, and therefore fV,W(v, w) = fX,Y(x, y) / |A|. Let the n-dimensional vector Z be Z = AX, where A is an invertible n × n matrix. The joint pdf of Z is then fZ(z) = fX(A⁻¹z) / |A|.
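The linear-transformation formula can be sanity-checked by simulation. The sketch below uses independent standard Gaussian X and Y and an invertible matrix A of my own choosing (none of these specifics come from the text); it compares fX,Y(x, y)/|A| at one point with an empirical density estimate of (V, W).

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])              # arbitrary invertible matrix, det(A) = 1
X = rng.standard_normal((2, 500_000))   # rows: X and Y, independent N(0, 1)
Z = A @ X                               # rows: V and W

# Predicted joint pdf of (V, W) at a test point (v, w)
v, w = 1.0, 0.5
x, y = np.linalg.solve(A, [v, w])       # pre-image (x, y) = A^{-1} (v, w)
fXY = np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)
predicted = fXY / abs(np.linalg.det(A))

# Empirical density: fraction of samples in a small box around (v, w)
h = 0.05
hits = (np.abs(Z[0] - v) < h) & (np.abs(Z[1] - w) < h)
print(predicted, hits.mean() / (2 * h) ** 2)   # the two numbers should be close
```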

EXAMPLE 4.36 Linear Transformation of Jointly Gaussian Random Variables Let X and Y be the jointly Gaussian random variables introduced in Example 2.13. Let V and W be obtained from (X, Y) by a linear transformation with matrix A, where |A| = 1. Find the joint pdf of V and W.

Since |A| = 1, fV,W(v, w) = fX,Y(x, y), where (x, y) is obtained from (v, w) by the inverse transformation. By substituting for x and y, the argument of the exponent becomes a quadratic form in v and w; thus V and W are also jointly Gaussian.

pdf of General Transformations Let the r.v's V and W be defined by two nonlinear functions of X and Y: V = g1(X, Y) and W = g2(X, Y). Assume that the functions v(x, y) and w(x, y) are invertible, so that x and y can be expressed uniquely in terms of (v, w). In Fig. 4.17(b), make the approximation v(x + dx, y) ≈ v(x, y) + (∂v/∂x) dx, and similarly for the y variable. The probabilities of the infinitesimal rectangle and the parallelogram are approximately equal; therefore fX,Y(x, y) dx dy ≈ fV,W(v, w) dP,

where dP is the area of the parallelogram. The "stretch factor" at the point (v, w) is given by the determinant of a matrix of partial derivatives: J(x, y) = (∂v/∂x)(∂w/∂y) − (∂v/∂y)(∂w/∂x). The determinant J(x, y) is called the Jacobian of the transformation.

The Jacobian of the inverse transformation is given by J(v, w) = (∂x/∂v)(∂y/∂w) − (∂x/∂w)(∂y/∂v). It can be shown that |J(v, w)| = 1 / |J(x, y)|. We therefore conclude that the joint pdf of V and W can be found using either of the following expressions: fV,W(v, w) = fX,Y(x, y) / |J(x, y)| = fX,Y(x, y) |J(v, w)|, where x and y are the solutions of v = v(x, y) and w = w(x, y).

EXAMPLE 4.37 Radius and Angle of Independent Gaussian Random Variables Let X and Y be zero-mean, unit-variance independent Gaussian random variables. Find the joint pdf of V and W defined by V = (X² + Y²)^(1/2) and W = ∠(X, Y), where ∠(x, y) denotes the angle in the range (0, 2π) that is defined by the point (x, y). The inverse transformation is given by x = v cos w, y = v sin w. The Jacobian of the inverse transformation is (∂x/∂v)(∂y/∂w) − (∂x/∂w)(∂y/∂v) = v cos²w + v sin²w = v.

Thus fV,W(v, w) = fX,Y(x, y) |J(v, w)| = (v / 2π) e^(−v²/2) for v ≥ 0 and 0 ≤ w < 2π, which factors as [v e^(−v²/2)] × [1/(2π)]. The pdf of a Rayleigh random variable is given by fV(v) = v e^(−v²/2), v ≥ 0, and the angle W is uniformly distributed on (0, 2π). We therefore conclude that the radius V and the angle W are independent random variables.
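A simulation sketch of this conclusion (my own check, with an arbitrary seed and sample size):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

v = np.hypot(x, y)                        # radius
w = np.mod(np.arctan2(y, x), 2 * np.pi)   # angle mapped into [0, 2*pi)

print(v.mean(), np.sqrt(np.pi / 2))   # Rayleigh mean, about 1.2533
print((v**2).mean())                  # Rayleigh second moment, about 2
print(w.mean(), np.pi)                # uniform angle has mean pi
print(np.corrcoef(v, w)[0, 1])        # near 0, consistent with independence
```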

EXAMPLE 4.38 Student's t-distribution Let X be a zero-mean, unit-variance Gaussian random variable and let Y be a chi-square random variable with n degrees of freedom. Assume that X and Y are independent. Find the pdf of V = X / √(Y/n). Define the auxiliary variable W = Y. Then the inverse transformation is x = v √(w/n), y = w, and the Jacobian of the inverse transformation is √(w/n).

The pdf of V is obtained by integrating the joint pdf of V and W over w; after a change of variable in the integral we finally obtain the Student's t-distribution fV(v) = [Γ((n + 1)/2) / (√(nπ) Γ(n/2))] (1 + v²/n)^(−(n+1)/2), −∞ < v < ∞.


Consider the problem of finding the joint pdf for n functions of n random variables X = (X1,…, Xn): Z1 = g1(X), Z2 = g2(X), …, Zn = gn(X). We assume as before that the set of equations has a unique solution given by x1 = h1(z1,…, zn), …, xn = hn(z1,…, zn). The joint pdf of Z is then given by fZ1,…,Zn(z1,…, zn) = fX1,…,Xn(x1,…, xn) / |J(x1,…, xn)| = fX1,…,Xn(x1,…, xn) |J(z1,…, zn)|, where J(x1,…, xn) and J(z1,…, zn) are the determinants of the Jacobian matrices of the transformation and the inverse transformation, respectively.

2.7 EXPECTED VALUE OF FUNCTIONS OF RANDOM VARIABLES The expected value of Z = g(X, Y) can be found using the following expressions: E[Z] = ∫∫ g(x, y) fX,Y(x, y) dx dy if X and Y are jointly continuous, and E[Z] = Σi Σj g(xi, yj) pX,Y(xi, yj) if X and Y are discrete.

EXAMPLE 2.39 Sum of Random Variables Let Z = X + Y. Find E[Z]. E[Z] = E[X + Y] = ∫∫ (x′ + y′) fX,Y(x′, y′) dx′ dy′ = ∫ x′ fX(x′) dx′ + ∫ y′ fY(y′) dy′ = E[X] + E[Y]. Thus, the result shows that the expected value of a sum of n random variables is equal to the sum of the expected values: E[X1 + X2 + ⋯ + Xn] = E[X1] + E[X2] + ⋯ + E[Xn].

In general, if X1,…, Xn are independent random variables, then E[g1(X1) g2(X2) ⋯ gn(Xn)] = E[g1(X1)] E[g2(X2)] ⋯ E[gn(Xn)]. The Correlation and Covariance of Two Random Variables The jkth joint moment of X and Y is defined by E[X^j Y^k] = ∫∫ x^j y^k fX,Y(x, y) dx dy (with a double sum over the pmf in the discrete case). If j = 0, we obtain the moments of Y; if k = 0, we obtain the moments of X. When j = 1 and k = 1, E[XY] is called the correlation of X and Y. If E[XY] = 0, we say that X and Y are orthogonal.

The jkth central moment of X and Y is defined as the joint moment of the centered random variables X − E[X] and Y − E[Y]: E[(X − E[X])^j (Y − E[Y])^k]. Note: j = 2, k = 0 gives VAR(X); j = 0, k = 2 gives VAR(Y); and j = k = 1 gives E[(X − E[X])(Y − E[Y])], which is defined as the covariance of X and Y, COV(X, Y).

EXAMPLE 2.41 Covariance of Independent Random Variables Let X and Y be independent random variables. Find their covariance. COV(X, Y) = E[(X − E[X])(Y − E[Y])] = E[X − E[X]] E[Y − E[Y]] = 0, where the second equality follows from independence. Therefore pairs of independent random variables have covariance zero. The correlation coefficient of X and Y is defined by ρX,Y = COV(X, Y) / (σX σY), where σX and σY are the standard deviations of X and Y, respectively.

The correlation coefficient is a number that is at most 1 in magnitude: −1 ≤ ρX,Y ≤ 1. Proof: 0 ≤ E[{(X − E[X])/σX ± (Y − E[Y])/σY}²] = 2 ± 2ρX,Y, which implies −1 ≤ ρX,Y ≤ 1. The extreme values of ρX,Y are achieved when X and Y are related linearly, Y = aX + b; ρX,Y = 1 if a > 0 and ρX,Y = −1 if a < 0. X and Y are said to be uncorrelated if ρX,Y = 0. If X and Y are independent, then X and Y are uncorrelated. In Example 4.18, we saw that if X and Y are jointly Gaussian and ρX,Y = 0, then X and Y are independent Gaussian random variables.

EXAMPLE 2.42 Uncorrelated but Dependent Random Variables Let Θ be uniformly distributed in the interval (0, 2π). Let X = cos Θ and Y = sin Θ. The point (X, Y) then corresponds to the point on the unit circle specified by the angle Θ, as shown in Fig. 4.18. Since X² + Y² = 1, knowing X restricts Y to at most two values, so X and Y are dependent. We now show that X and Y are uncorrelated: E[X] = E[Y] = 0 and E[XY] = E[cos Θ sin Θ] = (1/2) E[sin 2Θ] = 0, so COV(X, Y) = 0.
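The same point can be made by simulation (a sketch; the seed and sample size are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 1_000_000)
x, y = np.cos(theta), np.sin(theta)

print(np.cov(x, y)[0, 1])              # sample covariance, close to 0 (uncorrelated)
print(np.allclose(x**2 + y**2, 1.0))   # True: Y is functionally tied to X (dependent)
```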

*Joint Characteristic Function The joint characteristic function of n random variables is defined as ΦX1,…,Xn(ω1,…, ωn) = E[exp(j(ω1X1 + ⋯ + ωnXn))]. Consider the two-dimensional case ΦX,Y(ω1, ω2) = E[exp(j(ω1X + ω2Y))]. If X and Y are jointly continuous random variables, then ΦX,Y(ω1, ω2) = ∫∫ fX,Y(x, y) exp(j(ω1x + ω2y)) dx dy. The inversion formula for the Fourier transform implies that the joint pdf is given by fX,Y(x, y) = (1/(2π)²) ∫∫ ΦX,Y(ω1, ω2) exp(−j(ω1x + ω2y)) dω1 dω2.

The marginal characteristic functions can be obtained from the joint characteristic function: ΦX(ω) = ΦX,Y(ω, 0) and ΦY(ω) = ΦX,Y(0, ω). If X and Y are independent random variables, then ΦX,Y(ω1, ω2) = ΦX(ω1) ΦY(ω2). The characteristic function of the sum Z = aX + bY can be obtained from the joint characteristic function of X and Y as follows: ΦZ(ω) = E[exp(jω(aX + bY))] = ΦX,Y(aω, bω). If X and Y are independent random variables, the characteristic function of Z = aX + bY is then ΦZ(ω) = ΦX(aω) ΦY(bω).

The joint moments of X and Y can be obtained by taking derivatives of the joint characteristic function: E[X^i Y^k] = (−j)^(i+k) ∂^(i+k) ΦX,Y(ω1, ω2) / ∂ω1^i ∂ω2^k, evaluated at ω1 = ω2 = 0.

EXAMPLE 2.44 Suppose U and V are independent zero-mean, unit-variance Gaussian random variables, and let X and Y be defined as linear combinations of U and V. Find the joint characteristic function of X and Y, and find E[XY]. The joint characteristic function of X and Y is ΦX,Y(ω1, ω2) = E[exp(j(ω1X + ω2Y))]; since U and V are independent random variables, the expectation factors once the exponent is grouped into its U and V terms.

The correlation E[XY] is found from Eq. (4.78) with i = 1 and k = 1.

2.8 JOINTLY GAUSSIAN RANDOM VARIABLES The random variables X and Y are said to be jointly Gaussian if their joint pdf has the form fX,Y(x, y) = {1 / (2π σ1 σ2 √(1 − ρ²))} exp{ −[ ((x − m1)/σ1)² − 2ρ((x − m1)/σ1)((y − m2)/σ2) + ((y − m2)/σ2)² ] / (2(1 − ρ²)) } for −∞ < x < ∞ and −∞ < y < ∞, where m1, m2 are the means, σ1, σ2 the standard deviations, and ρ the correlation coefficient of X and Y. The pdf is constant for values x and y for which the argument of the exponent is constant: ((x − m1)/σ1)² − 2ρ((x − m1)/σ1)((y − m2)/σ2) + ((y − m2)/σ2)² = constant, which is the equation of an ellipse.

When ρX,Y = 0, X and Y are independent; when ρX,Y ≠ 0, the major axis of the ellipse is oriented along the angle θ = (1/2) arctan[ 2ρσ1σ2 / (σ1² − σ2²) ]. Note that the angle is 45° when the variances are equal. The marginal pdf of X is found by integrating fX,Y(x, y) over all y: fX(x) = (1/(√(2π) σ1)) exp{−(x − m1)²/(2σ1²)}; that is, X is a Gaussian random variable with mean m1 and variance σ1².

The conditional pdf of X given Y = y is fX(x | y) = {1 / (√(2π(1 − ρ²)) σ1)} exp{ −[x − m1 − ρ(σ1/σ2)(y − m2)]² / (2σ1²(1 − ρ²)) }; that is, X given Y = y is Gaussian with mean m1 + ρ(σ1/σ2)(y − m2) and variance σ1²(1 − ρ²). We now show that the ρX,Y in Eq. (4.79) is indeed the correlation coefficient between X and Y. The covariance between X and Y is defined by COV(X, Y) = E[(X − m1)(Y − m2)]. Now the conditional expectation of (X − m1)(Y − m2) given Y = y is E[(X − m1)(Y − m2) | Y = y] = (y − m2) E[X − m1 | Y = y] = ρ(σ1/σ2)(y − m2)²,

where we have used the fact that the conditional mean of X given Y = y is m1 + ρ(σ1/σ2)(y − m2). Therefore COV(X, Y) = E[E[(X − m1)(Y − m2) | Y]] = ρ(σ1/σ2) E[(Y − m2)²] = ρ σ1 σ2, and hence ρX,Y = COV(X, Y)/(σ1 σ2) = ρ.

EXAMPLE 2.45 The amount of yearly rainfall in city 1 and in city 2 is modeled by a pair of jointly Gaussian random variables, X and Y, with pdf given by Eq. (4.79). Find the most likely value of X given that we know Y = y. The conditional pdf of X given Y = y is given by Eq. (4.82), which is maximum at the conditional mean E[X | Y = y] = m1 + ρ(σ1/σ2)(y − m2).

n Jointly Gaussian Random Variables The random variables X1, X2,…, Xn are said to be jointly Gaussian if their joint pdf is given by fX(x) = exp{ −(1/2)(x − m)ᵀ K⁻¹ (x − m) } / ( (2π)^(n/2) |K|^(1/2) ), where x and m are column vectors defined by x = (x1,…, xn)ᵀ and m = (E[X1],…, E[Xn])ᵀ, and K is the covariance matrix with elements K(i, j) = COV(Xi, Xj).

Equation (4.83) shows that the pdf of jointly Gaussian random variables is completely specified by the individual means and variances and the pairwise covariances.

EXAMPLE 2.46 Verify that the two-dimensional Gaussian pdf given in Eq. (4.79) has the form of Eq. (4.83). The covariance matrix for the two-dimensional case is K = [σ1², ρσ1σ2; ρσ1σ2, σ2²]. The inverse of the covariance matrix is K⁻¹ = {1 / (σ1²σ2²(1 − ρ²))} [σ2², −ρσ1σ2; −ρσ1σ2, σ1²]. The term in the exponent is therefore −(1/2)(x − m)ᵀ K⁻¹ (x − m) = −[ ((x1 − m1)/σ1)² − 2ρ((x1 − m1)/σ1)((x2 − m2)/σ2) + ((x2 − m2)/σ2)² ] / (2(1 − ρ²)).

EXAMPLE 4.48 Independence of Uncorrelated Jointly Gaussian Random Variables Suppose X1, X2,…, Xn are jointly Gaussian random variables with COV(Xi, Xj) = 0 for i ≠ j. Show that X1, X2,…, Xn are independent random variables. In this case the covariance matrix is diagonal, K = diag(σ1², σ2²,…, σn²). Therefore K⁻¹ = diag(1/σ1², 1/σ2²,…, 1/σn²) and |K| = σ1² σ2² ⋯ σn².

Thus, from Eq. (4.83), fX(x) = Π(k = 1 to n) { (1/(√(2π) σk)) exp[−(xk − mk)²/(2σk²)] } = fX1(x1) fX2(x2) ⋯ fXn(xn), so the Xk are independent Gaussian random variables.

2-dimensional Gaussian pdf, n=2
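The n = 2 case can be evaluated numerically to confirm that the matrix form reproduces the explicit bivariate formula. The parameter values below are illustrative choices of mine, not values from the slides.

```python
import numpy as np

m1, m2, s1, s2, rho = 1.0, -0.5, 2.0, 1.0, 0.6   # assumed illustrative parameters
m = np.array([m1, m2])
K = np.array([[s1**2,         rho * s1 * s2],
              [rho * s1 * s2, s2**2        ]])

def pdf_matrix_form(x1, x2):
    # n = 2 case of exp(-0.5 d^T K^{-1} d) / ((2 pi)^{n/2} |K|^{1/2})
    d = np.array([x1, x2]) - m
    return np.exp(-0.5 * d @ np.linalg.solve(K, d)) / (2 * np.pi * np.sqrt(np.linalg.det(K)))

def pdf_bivariate_form(x1, x2):
    # Explicit two-dimensional Gaussian pdf (the form of Eq. (4.79))
    u, v = (x1 - m1) / s1, (x2 - m2) / s2
    q = (u**2 - 2 * rho * u * v + v**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

print(pdf_matrix_form(0.3, 0.2), pdf_bivariate_form(0.3, 0.2))  # identical values
```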

OPERATIONS ON MULTIPLE RANDOM VARIABLES
Prepared by A RAJASEKHAR YADAV

5.1 Expected Value of a Function of R.Vs If g(x, y) is a function of two r.v.s X and Y, then the expected value of g is E[g(X, Y)] = ∫∫ g(x, y) fXY(x, y) dx dy. Note that the expected value of a sum of functions is equal to the sum of the expected values of the functions: E[g1(X, Y) + g2(X, Y) + ⋯] = E[g1(X, Y)] + E[g2(X, Y)] + ⋯.


5.1.1 Joint Moments about the origin: The joint moments about the origin are defined by mnk = E[Xⁿ Yᵏ] = ∫∫ xⁿ yᵏ fXY(x, y) dx dy. Notes:
1. mn0 = E[Xⁿ] are the moments of X, and m0k = E[Yᵏ] are the moments of Y.
2. The sum n + k is called the order of the moments; thus m02, m20, and m11 are second-order moments of X and Y.
3. m10 = E[X] and m01 = E[Y] are the expected values of X and Y, respectively, and are the coordinates of the "center of gravity" of the function fXY(x, y).

5.1.1 Joint Moments about the origin: Correlation: The second-order moment m11 = E[XY] is called the correlation of X and Y. In fact, it is a very important statistic and is denoted by RXY.
- If RXY = E[X] E[Y], then X and Y are said to be uncorrelated.
- If X and Y are independent, then fXY(x, y) = fX(x) fY(y) and RXY = E[XY] = E[X] E[Y].

5.1.1 Joint Moments about the origin: Therefore, if X and Y are independent, then they are uncorrelated. However, if X and Y are uncorrelated, it is not necessarily true that they are independent. If RXY = 0, then X and Y are called orthogonal.

5.1.2 Joint Central Moments: Covariance: The joint central moments are defined by μnk = E[(X − E[X])ⁿ (Y − E[Y])ᵏ]. The second-order central moment μ11 is called the covariance of X and Y and is denoted by CXY: CXY = E[(X − E[X])(Y − E[Y])] = RXY − E[X] E[Y].


5.1.2 Joint Central Moments: The normalized second-order central moment ρ = CXY / (σX σY) = μ11 / √(μ20 μ02) is known as the correlation coefficient of X and Y.

5.2 Joint Characteristic Functions: The joint characteristic function of two r.v.s X and Y is defined by ΦXY(ω1, ω2) = E[exp(j(ω1X + ω2Y))], where ω1 and ω2 are real numbers. By setting ω1 = 0 or ω2 = 0, we obtain the marginal characteristic functions: ΦX(ω1) = ΦXY(ω1, 0) and ΦY(ω2) = ΦXY(0, ω2).

Joint moments mnk can be found from the joint characteristic function as follows: mnk = E[Xⁿ Yᵏ] = (−j)^(n+k) ∂^(n+k) ΦXY(ω1, ω2) / ∂ω1ⁿ ∂ω2ᵏ, evaluated at ω1 = ω2 = 0.
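The differentiation formula can be exercised symbolically. The sketch below uses the joint characteristic function of two independent zero-mean, unit-variance Gaussian random variables, Φ(ω1, ω2) = exp(−(ω1² + ω2²)/2); that particular Φ is my illustrative choice, not an example from the slides.

```python
import sympy as sp

w1, w2 = sp.symbols('w1 w2', real=True)
Phi = sp.exp(-(w1**2 + w2**2) / 2)   # assumed: two independent N(0, 1) variables

def moment(n, k):
    # m_nk = (-j)^(n+k) * d^(n+k) Phi / (dw1^n dw2^k), evaluated at w1 = w2 = 0
    d = sp.diff(Phi, w1, n, w2, k)
    return sp.simplify((-sp.I)**(n + k) * d.subs({w1: 0, w2: 0}))

print(moment(1, 0))   # E[X]       = 0
print(moment(2, 0))   # E[X^2]     = 1
print(moment(1, 1))   # E[XY]      = 0
print(moment(2, 2))   # E[X^2 Y^2] = 1
```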

5.3 Jointly Gaussian Random Variables Two random variables are jointly Gaussian if their joint density function is of the form (sometimes called bivariate Gaussian)

Jointly Gaussian Random Variables Its maximum is located at the point (E[X], E[Y]), the pair of mean values.

Jointly Gaussian Random Variables The locus of constant values of fXY(x,y) will be an ellipse. This is equivalent to saying that the line of intersection formed by slicing the function fXY(x,y) with a plane parallel to the xy plane is an ellipse.

Jointly Gaussian Random Variables Where fX(x) and fY(y) are the marginal density functions of X and Y.

Jointly Gaussian Random Variables Consider r.v.s Y1 and Y2 related to arbitrary r.v.s X and Y by the coordinate rotation Y1 = X cos θ + Y sin θ, Y2 = −X sin θ + Y cos θ.

Jointly Gaussian Random Variables Expressing CY1Y2 in terms of CXY, σX², and σY²: if we require Y1 and Y2 to be uncorrelated, we must have CY1Y2 = 0. Equating that expression to zero gives the required rotation angle θ = (1/2) tan⁻¹[ 2CXY / (σX² − σY²) ].

N Random variables N random variables are jointly Gaussian if their joint density function is of the form (sometimes called multivariate Gaussian)

N Random variables Here [CX] is the covariance matrix, with elements Cij = COV(Xi, Xj). When N = 2, [CX] = [σX1², C12; C21, σX2²], with C12 = C21 = CX1X2.

Jointly Gaussian Random Variables Uncorrelated Gaussian random variables are also statistically independent. Other properties of Gaussian r.v.s include:
• Gaussian r.v.s are completely defined through their 1st- and 2nd-order moments, i.e., their means, variances, and covariances.
• Random variables produced by a linear transformation of jointly Gaussian r.v.s are also Gaussian.
• The conditional density functions defined over jointly Gaussian r.v.s are also Gaussian.
Therefore we conclude that any uncorrelated Gaussian random variables are also statistically independent. It follows that a coordinate rotation through the angle θ = (1/2) tan⁻¹[ 2CXY / (σX² − σY²) ] transforms jointly Gaussian X and Y into uncorrelated, and hence independent, Gaussian random variables.

Transformations of Multiple Random Variables

Transformations of Multiple Random Variables Example 1: find the density function for

Transformations of Multiple Random Variables Multiple functions: Yi = Ti(X1, X2, X3, …, XN), i = 1, 2, 3, …, N.


5.5 Linear Transformations of Gaussian Random Variables
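The standard result here is that a linear transformation Y = A X of a jointly Gaussian vector X is again jointly Gaussian, with mean A E[X] and covariance A CX Aᵀ. Since the slide content itself is not reproduced above, the following is only a sampling sketch of that standard fact, with A and CX chosen by me for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

C_X = np.array([[2.0, 0.6],
                [0.6, 1.0]])      # covariance of the Gaussian input vector X (illustrative)
A = np.array([[1.0, -1.0],
              [0.5,  2.0]])       # linear transformation Y = A X (illustrative)

# Draw samples of X ~ N(0, C_X) and transform them
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C_X, size=200_000).T
Y = A @ X

print(A @ C_X @ A.T)   # theoretical covariance of Y
print(np.cov(Y))       # sample covariance, close to the theoretical one
```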