One Function of Two Random Variables

X and Y: two random variables; g(x, y): a function. We form a new random variable Z as
$$Z = g(X, Y).$$
Given the joint p.d.f $f_{XY}(x, y)$, how does one obtain the p.d.f $f_Z(z)$ of Z?
Practical viewpoint: a receiver output signal usually consists of the desired signal buried in noise, and the above formulation in that case reduces to Z = X + Y.

We have
$$F_Z(z) = P(Z \le z) = P\bigl(g(X, Y) \le z\bigr) = \iint_{D_z} f_{XY}(x, y)\,dx\,dy,$$
where $D_z$ in the xy-plane represents the region where $g(x, y) \le z$; $D_z$ need not be simply connected. First find the region $D_z$ for every z, then evaluate the integral there.

Example: Z = X + Y. Find $f_Z(z)$.
$$F_Z(z) = \iint_{x + y \le z} f_{XY}(x, y)\,dx\,dy = \int_{-\infty}^{\infty}\int_{-\infty}^{z - y} f_{XY}(x, y)\,dx\,dy,$$
integrating over all horizontal strips (like the one in the figure) along the x-axis. We can find $f_Z(z)$ by differentiating $F_Z(z)$ directly.

Recall the Leibnitz differentiation rule. Suppose
$$H(z) = \int_{a(z)}^{b(z)} h(x, z)\,dx.$$
Then
$$\frac{dH(z)}{dz} = \frac{db(z)}{dz}\,h\bigl(b(z), z\bigr) - \frac{da(z)}{dz}\,h\bigl(a(z), z\bigr) + \int_{a(z)}^{b(z)} \frac{\partial h(x, z)}{\partial z}\,dx.$$
Using the above,
$$f_Z(z) = \int_{-\infty}^{\infty}\left(\frac{\partial}{\partial z}\int_{-\infty}^{z - y} f_{XY}(x, y)\,dx\right)dy = \int_{-\infty}^{\infty} f_{XY}(z - y,\, y)\,dy.$$

Alternate method of integration: if X and Y are independent, then $f_{XY}(x, y) = f_X(x)\,f_Y(y)$ and
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\,f_Y(y)\,dy = \int_{-\infty}^{\infty} f_X(x)\,f_Y(z - x)\,dx.$$
This integral is the standard convolution of the functions $f_X(z)$ and $f_Y(z)$, expressed in two different ways.
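As a quick numerical check of the convolution result (an addition, not part of the original slides), the sketch below discretizes two standard normal densities on an assumed grid and verifies that their convolution matches the known N(0, 2) density of the sum.

```python
import numpy as np

dx = 0.01
x = np.arange(-10, 10 + dx, dx)                 # assumed grid for f_X and f_Y

f_X = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)    # N(0,1) density
f_Y = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)    # N(0,1) density

# Discrete convolution approximates f_Z(z) = integral of f_X(z - y) f_Y(y) dy
f_Z = np.convolve(f_X, f_Y) * dx
z = 2 * x[0] + dx * np.arange(f_Z.size)         # grid on which the convolution lives

# Known result: the sum of two independent N(0,1) r.vs is N(0, 2)
f_Z_exact = np.exp(-z**2 / 4) / np.sqrt(4 * np.pi)
print("max abs error:", np.abs(f_Z - f_Z_exact).max())
```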

Conclusion: if two r.vs are independent, then the density of their sum equals the convolution of their density functions.
As a special case, suppose that $f_X(x) = 0$ for $x < 0$ and $f_Y(y) = 0$ for $y < 0$; then, using the figure, we can determine the new limits of integration.

In that case
$$F_Z(z) = \int_{0}^{z}\int_{0}^{z - y} f_{XY}(x, y)\,dx\,dy, \qquad z > 0,$$
or
$$f_Z(z) = \int_{0}^{z} f_{XY}(z - y,\, y)\,dy, \qquad z > 0,$$
and $f_Z(z) = 0$ for $z \le 0$. On the other hand, by considering vertical strips first, we get
$$f_Z(z) = \int_{0}^{z} f_X(x)\,f_Y(z - x)\,dx, \qquad z > 0,$$
if X and Y are independent random variables.

Example: X and Y are independent exponential r.vs with common parameter $\lambda$; let Z = X + Y. Determine $f_Z(z)$.
Solution: with $f_X(x) = \lambda e^{-\lambda x}$, $x \ge 0$, and $f_Y(y) = \lambda e^{-\lambda y}$, $y \ge 0$,
$$f_Z(z) = \int_{0}^{z} \lambda e^{-\lambda x}\,\lambda e^{-\lambda (z - x)}\,dx = \lambda^2 e^{-\lambda z}\int_{0}^{z} dx = \lambda^2 z\, e^{-\lambda z}, \qquad z > 0,$$
a gamma (Erlang-2) density.
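A minimal Monte Carlo sketch (not from the slides) that checks the $\lambda^2 z e^{-\lambda z}$ result; the value $\lambda = 1.5$, the sample size, and the bin edges are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 1.5, 1_000_000                   # assumed parameter and sample size

z = rng.exponential(1 / lam, n) + rng.exponential(1 / lam, n)

# Compare a normalized histogram of Z with the derived density lam^2 * z * exp(-lam * z)
edges = np.linspace(0, 6, 61)
hist, _ = np.histogram(z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
f_z = lam**2 * centers * np.exp(-lam * centers)
print("max abs deviation:", np.abs(hist - f_z).max())
```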

Example: X and Y are independent uniform r.vs in the common interval (0, 1); let Z = X + Y. Determine $f_Z(z)$. This example shows that care should be taken in using the convolution formula for r.vs with finite range.
Solution: clearly, Z takes values in (0, 2), so $f_Z(z) = 0$ outside that interval. There are two cases, $0 \le z \le 1$ and $1 \le z \le 2$, for which the shaded areas are quite different in shape, and they should be considered separately.

Example – continued: for $0 \le z \le 1$,
$$F_Z(z) = \int_{0}^{z}\int_{0}^{z - y} dx\,dy = \frac{z^2}{2},$$
and for $1 \le z \le 2$ it is easier to work with the complementary (unshaded) region, which gives
$$F_Z(z) = 1 - \int_{z - 1}^{1}\int_{z - y}^{1} dx\,dy = 1 - \frac{(2 - z)^2}{2}.$$

Example – continued: so we obtain
$$f_Z(z) = \begin{cases} z, & 0 \le z \le 1,\\ 2 - z, & 1 \le z \le 2,\\ 0, & \text{otherwise}.\end{cases}$$
By direct convolution of $f_X(x)$ and $f_Y(y)$ we obtain the same result. In fact, for $0 \le z \le 1$,
$$f_Z(z) = \int_{0}^{z} f_X(z - y)\,f_Y(y)\,dy = \int_{0}^{z} dy = z,$$

and for $1 \le z \le 2$,
$$f_Z(z) = \int_{z - 1}^{1} f_X(z - y)\,f_Y(y)\,dy = \int_{z - 1}^{1} dy = 2 - z.$$
The figure shows the resulting triangular density, which agrees with the convolution of two rectangular waveforms as well.
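A small simulation sketch (an addition, not from the slides) that checks the triangular density; the sample size and bin edges are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.uniform(0, 1, 500_000) + rng.uniform(0, 1, 500_000)

# The derived density is z on [0, 1] and 2 - z on [1, 2] (triangular)
edges = np.linspace(0, 2, 41)
hist, _ = np.histogram(z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
triangular = np.where(centers <= 1, centers, 2 - centers)
print("max abs deviation:", np.abs(hist - triangular).max())
```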

Example: let Z = X − Y. Determine its p.d.f $f_Z(z)$.
Solution: proceeding as before, with the region now $\{x - y \le z\}$,
$$F_Z(z) = \int_{-\infty}^{\infty}\int_{-\infty}^{z + y} f_{XY}(x, y)\,dx\,dy,$$
and hence
$$f_Z(z) = \int_{-\infty}^{\infty} f_{XY}(z + y,\, y)\,dy.$$
If X and Y are independent, this reduces to
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z + y)\,f_Y(y)\,dy,$$
which represents the convolution of $f_X(z)$ with $f_Y(-z)$.

A special case: suppose $f_X(x) = 0$ for $x < 0$ and $f_Y(y) = 0$ for $y < 0$. For $z \ge 0$,
$$F_Z(z) = \int_{0}^{\infty}\int_{0}^{z + y} f_{XY}(x, y)\,dx\,dy,$$
and for $z < 0$,
$$F_Z(z) = \int_{-z}^{\infty}\int_{0}^{z + y} f_{XY}(x, y)\,dx\,dy.$$
After differentiation, this gives
$$f_Z(z) = \begin{cases}\displaystyle\int_{0}^{\infty} f_{XY}(z + y,\, y)\,dy, & z \ge 0,\\[2mm] \displaystyle\int_{-z}^{\infty} f_{XY}(z + y,\, y)\,dy, & z < 0.\end{cases}$$

Example: given Z = X / Y, obtain its density function.
Solution: the inequality $X/Y \le z$ is equivalent to $X \le Yz$ if $Y > 0$, and to $X \ge Yz$ if $Y < 0$. Since $\{Y > 0\}$ and $\{Y < 0\}$ form a partition (for continuous Y the event $\{Y = 0\}$ has zero probability), we have
$$\{X/Y \le z\} = \{X \le Yz,\, Y > 0\} \cup \{X \ge Yz,\, Y < 0\},$$
and hence, by the mutually exclusive property of the latter two events,
$$F_Z(z) = P(X \le Yz,\, Y > 0) + P(X \ge Yz,\, Y < 0).$$

Example – continued: integrating over these two regions, we get
$$F_Z(z) = \int_{0}^{\infty}\int_{-\infty}^{yz} f_{XY}(x, y)\,dx\,dy + \int_{-\infty}^{0}\int_{yz}^{\infty} f_{XY}(x, y)\,dx\,dy.$$
Differentiation with respect to z gives
$$f_Z(z) = \int_{0}^{\infty} y\, f_{XY}(yz,\, y)\,dy - \int_{-\infty}^{0} y\, f_{XY}(yz,\, y)\,dy = \int_{-\infty}^{\infty} |y|\, f_{XY}(yz,\, y)\,dy.$$

If X and Y are nonnegative random variables, then the area of integration reduces to that shown in the figure. This gives
$$F_Z(z) = \int_{0}^{\infty}\int_{0}^{yz} f_{XY}(x, y)\,dx\,dy,$$
or
$$f_Z(z) = \int_{0}^{\infty} y\, f_{XY}(yz,\, y)\,dy, \qquad z > 0.$$

Example: X and Y are jointly normal random variables with zero mean, so that
$$f_{XY}(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho^2}}\exp\!\left(-\frac{1}{2(1 - \rho^2)}\left(\frac{x^2}{\sigma_1^2} - \frac{2\rho x y}{\sigma_1\sigma_2} + \frac{y^2}{\sigma_2^2}\right)\right).$$
Show that the ratio Z = X / Y has a Cauchy density function centered at $\rho\sigma_1/\sigma_2$.
Solution: using $f_Z(z) = \int_{-\infty}^{\infty}|y|\,f_{XY}(yz, y)\,dy$ and the fact that $f_{XY}(yz, y) = f_{XY}(-yz, -y)$, we obtain
$$f_Z(z) = 2\int_{0}^{\infty} y\, f_{XY}(yz,\, y)\,dy.$$

Example – continued: carrying out the integration,
$$f_Z(z) = \frac{\sigma_1\sigma_2\sqrt{1 - \rho^2}/\pi}{\sigma_2^2 z^2 - 2\rho\sigma_1\sigma_2 z + \sigma_1^2} = \frac{1}{\pi}\,\frac{\gamma}{(z - z_0)^2 + \gamma^2},$$
where $z_0 = \rho\sigma_1/\sigma_2$ and $\gamma = \sigma_1\sqrt{1 - \rho^2}/\sigma_2$. Thus Z represents a Cauchy r.v centered at $\rho\sigma_1/\sigma_2$. Integrating the above from $-\infty$ to z, we obtain the corresponding distribution function to be
$$F_Z(z) = \frac{1}{2} + \frac{1}{\pi}\arctan\!\left(\frac{\sigma_2 z - \rho\sigma_1}{\sigma_1\sqrt{1 - \rho^2}}\right).$$
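A simulation sketch (an addition to the slides) that compares the empirical distribution of X/Y for correlated zero-mean normals with the Cauchy c.d.f derived above; the values of $\sigma_1$, $\sigma_2$, $\rho$ and the sample size are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
s1, s2, rho, n = 2.0, 1.0, 0.5, 1_000_000        # assumed sigma_1, sigma_2, rho, sample size

cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
z = x / y

# Derived Cauchy parameters: center z0 and half-width gamma
z0 = rho * s1 / s2
gamma = s1 * np.sqrt(1 - rho**2) / s2

# Compare the empirical c.d.f with F_Z(z) = 1/2 + arctan((z - z0)/gamma)/pi
grid = np.linspace(-10, 10, 81)
emp_cdf = np.array([(z <= g).mean() for g in grid])
cauchy_cdf = 0.5 + np.arctan((grid - z0) / gamma) / np.pi
print("max abs c.d.f error:", np.abs(emp_cdf - cauchy_cdf).max())
```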

Example: let $Z = X^2 + Y^2$. Obtain $f_Z(z)$.
Solution: we have
$$F_Z(z) = P(X^2 + Y^2 \le z) = \iint_{x^2 + y^2 \le z} f_{XY}(x, y)\,dx\,dy.$$
But $x^2 + y^2 \le z$ represents the area of a circle with radius $\sqrt{z}$, and hence
$$F_Z(z) = \int_{-\sqrt{z}}^{\sqrt{z}}\int_{-\sqrt{z - y^2}}^{\sqrt{z - y^2}} f_{XY}(x, y)\,dx\,dy.$$
This gives, after repeated differentiation,
$$f_Z(z) = \int_{-\sqrt{z}}^{\sqrt{z}} \frac{1}{2\sqrt{z - y^2}}\Bigl(f_{XY}\bigl(\sqrt{z - y^2},\, y\bigr) + f_{XY}\bigl(-\sqrt{z - y^2},\, y\bigr)\Bigr)dy, \qquad z > 0.$$

Example: X and Y are independent normal r.vs with zero mean and common variance $\sigma^2$. Determine $f_Z(z)$ for $Z = X^2 + Y^2$.
Solution: direct substitution of $f_{XY}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-(x^2 + y^2)/2\sigma^2}$ into the previous formula gives
$$f_Z(z) = \int_{-\sqrt{z}}^{\sqrt{z}} \frac{1}{\sqrt{z - y^2}}\,\frac{1}{2\pi\sigma^2}\, e^{-z/2\sigma^2}\,dy = \frac{1}{2\sigma^2}\, e^{-z/2\sigma^2}, \qquad z \ge 0,$$
where we have used the substitution $y = \sqrt{z}\sin\theta$.
Result: if X and Y are independent zero-mean Gaussian r.vs with common variance $\sigma^2$, then $X^2 + Y^2$ is an exponential r.v with parameter $2\sigma^2$.

Example: let $Z = \sqrt{X^2 + Y^2}$. Find $f_Z(z)$.
Solution: the event $\{\sqrt{X^2 + Y^2} \le z\}$ corresponds to a circle with radius z. Thus
$$F_Z(z) = \int_{-z}^{z}\int_{-\sqrt{z^2 - y^2}}^{\sqrt{z^2 - y^2}} f_{XY}(x, y)\,dx\,dy,$$
and, proceeding as in the previous example,
$$f_Z(z) = \int_{-z}^{z} \frac{z}{\sqrt{z^2 - y^2}}\Bigl(f_{XY}\bigl(\sqrt{z^2 - y^2},\, y\bigr) + f_{XY}\bigl(-\sqrt{z^2 - y^2},\, y\bigr)\Bigr)dy. \qquad (*)$$
If X and Y are independent Gaussian as in the previous example, this reduces to
$$f_Z(z) = \frac{z}{\sigma^2}\, e^{-z^2/2\sigma^2}, \qquad z \ge 0,$$
the Rayleigh distribution.
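A short Monte Carlo sketch (added here, not in the slides) checking the Rayleigh density of $\sqrt{X^2 + Y^2}$ for independent zero-mean Gaussians; $\sigma = 2$, the sample size, and the bins are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, n = 2.0, 1_000_000                        # assumed sigma and sample size
z = np.hypot(rng.normal(0, sigma, n), rng.normal(0, sigma, n))

# Derived Rayleigh density: (z / sigma^2) * exp(-z^2 / (2 sigma^2)) for z >= 0
edges = np.linspace(0, 10, 101)
hist, _ = np.histogram(z, bins=edges, density=True)
c = 0.5 * (edges[:-1] + edges[1:])
rayleigh = c / sigma**2 * np.exp(-c**2 / (2 * sigma**2))
print("max abs deviation:", np.abs(hist - rayleigh).max())
```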

Conclusion: if $W = X + iY$, where X and Y are real, independent normal r.vs with zero mean and equal variance, then the r.v $|W| = \sqrt{X^2 + Y^2}$ has a Rayleigh density. W is said to be a complex Gaussian r.v with zero mean, whose real and imaginary parts are independent r.vs. As we saw, its magnitude has a Rayleigh distribution. What about its phase
$$\theta = \tan^{-1}\!\left(\frac{Y}{X}\right)?$$

Let $U = Y/X$. From the ratio example with zero means, $\rho = 0$ and equal variances, U has a Cauchy distribution with
$$f_U(u) = \frac{1/\pi}{u^2 + 1}.$$
As a result, $\theta = \tan^{-1} U$ has density
$$f_\theta(\theta) = f_U(\tan\theta)\,\sec^2\theta = \frac{1}{\pi}, \qquad -\frac{\pi}{2} < \theta < \frac{\pi}{2}.$$
The magnitude and phase of a zero-mean complex Gaussian r.v have Rayleigh and uniform distributions respectively. We will show later that these two derived r.vs are also independent of each other!

Example: what if X and Y have nonzero means $\mu_1$ and $\mu_2$ respectively?
Solution: since
$$f_{XY}(x, y) = \frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{(x - \mu_1)^2 + (y - \mu_2)^2}{2\sigma^2}\right),$$
substituting this into (*), and letting $y = z\sin\theta$ and $\mu = \sqrt{\mu_1^2 + \mu_2^2}$, we get the Rician p.d.f
$$f_Z(z) = \frac{z}{\sigma^2}\, e^{-(z^2 + \mu^2)/2\sigma^2}\, I_0\!\left(\frac{z\mu}{\sigma^2}\right), \qquad z \ge 0,$$
where
$$I_0(\eta) = \frac{1}{2\pi}\int_{0}^{2\pi} e^{\eta\cos\theta}\,d\theta$$
is the modified Bessel function of the first kind and zeroth order.
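A simulation sketch (an addition) that compares the envelope of a nonzero-mean complex Gaussian with the Rician p.d.f above; it assumes SciPy is available for $I_0$, and the chosen means, $\sigma$, and sample size are arbitrary.

```python
import numpy as np
from scipy.special import i0        # modified Bessel function of the first kind, order 0

rng = np.random.default_rng(4)
mu1, mu2, sigma, n = 1.0, 2.0, 1.0, 1_000_000    # assumed means, sigma, sample size
z = np.hypot(rng.normal(mu1, sigma, n), rng.normal(mu2, sigma, n))
mu = np.hypot(mu1, mu2)

# Rician density: (z/sigma^2) exp(-(z^2 + mu^2)/(2 sigma^2)) I0(z mu / sigma^2)
edges = np.linspace(0, 8, 81)
hist, _ = np.histogram(z, bins=edges, density=True)
c = 0.5 * (edges[:-1] + edges[1:])
rician = c / sigma**2 * np.exp(-(c**2 + mu**2) / (2 * sigma**2)) * i0(c * mu / sigma**2)
print("max abs deviation:", np.abs(hist - rician).max())
```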

Application: a fading multipath situation where there is a dominant constant component (mean) plus a zero-mean Gaussian r.v. The constant component may be the line-of-sight signal, and the zero-mean Gaussian part could be due to random multipath components adding up incoherently. The envelope of such a signal is said to have a Rician p.d.f.

Example: determine $f_Z(z)$ and $f_W(w)$ for $Z = \max(X, Y)$ and $W = \min(X, Y)$.
Solution: $\max(\cdot)$ and $\min(\cdot)$ are nonlinear operators, and these are special cases of the more general order statistics. We can arrange any n-tuple $x_1, x_2, \dots, x_n$ in increasing order of magnitude such that
$$x_{(1)} \le x_{(2)} \le \cdots \le x_{(n)}.$$

Example – continued: let $X_1, X_2, \dots, X_n$ represent r.vs. The function $X_{(k)}$ that takes on the value $x_{(k)}$ in each possible sequence is known as the k-th order statistic, and $\{X_{(1)}, X_{(2)}, \dots, X_{(n)}\}$ represents the set of order statistics among the n random variables. $R = X_{(n)} - X_{(1)}$ represents the range, and when n = 2, we have the max and min statistics.

Example – continued: since
$$Z = \max(X, Y) = \begin{cases} X, & X > Y,\\ Y, & X \le Y,\end{cases}$$
we have
$$F_Z(z) = P\bigl(\max(X, Y) \le z\bigr) = P(X \le z,\, X > Y) + P(Y \le z,\, X \le Y),$$
since $\{X > Y\}$ and $\{X \le Y\}$ form a partition.

Example – continued: from the rightmost figure, the two regions together make up the quadrant $\{x \le z,\, y \le z\}$, so
$$F_Z(z) = P(X \le z,\, Y \le z) = F_{XY}(z, z).$$
If X and Y are independent, then
$$F_Z(z) = F_X(z)\,F_Y(z),$$
and hence
$$f_Z(z) = f_X(z)\,F_Y(z) + F_X(z)\,f_Y(z).$$
Similarly, for $W = \min(X, Y)$,
$$F_W(w) = P\bigl(\min(X, Y) \le w\bigr) = F_X(w) + F_Y(w) - F_{XY}(w, w).$$
Thus, when X and Y are independent,
$$f_W(w) = f_X(w)\bigl(1 - F_Y(w)\bigr) + f_Y(w)\bigl(1 - F_X(w)\bigr).$$
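A quick simulation sketch (an addition) checking the max formula in a simple case: for independent U(0, 1) r.vs the formula gives $f_Z(z) = 2z$ on (0, 1). The sample size and bins are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
z = np.maximum(rng.uniform(0, 1, 1_000_000), rng.uniform(0, 1, 1_000_000))

# For independent U(0,1) r.vs: f_Z(z) = f_X(z) F_Y(z) + F_X(z) f_Y(z) = 2z on (0, 1)
edges = np.linspace(0, 1, 21)
hist, _ = np.histogram(z, bins=edges, density=True)
c = 0.5 * (edges[:-1] + edges[1:])
print("max abs deviation:", np.abs(hist - 2 * c).max())
```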


Example: X and Y are independent exponential r.vs with common parameter $\lambda$. Determine $f_W(w)$ for $W = \min(X, Y)$.
Solution: we have
$$F_W(w) = F_X(w) + F_Y(w) - F_X(w)\,F_Y(w),$$
and hence
$$f_W(w) = f_X(w) + f_Y(w) - \bigl(f_X(w)\,F_Y(w) + F_X(w)\,f_Y(w)\bigr).$$
But $f_X(w) = f_Y(w) = \lambda e^{-\lambda w}$ and $F_X(w) = F_Y(w) = 1 - e^{-\lambda w}$ for $w \ge 0$, so that
$$f_W(w) = 2\lambda e^{-\lambda w} - 2\lambda e^{-\lambda w}\bigl(1 - e^{-\lambda w}\bigr) = 2\lambda e^{-2\lambda w}, \qquad w \ge 0.$$
Thus $\min(X, Y)$ is also exponential, with parameter $2\lambda$.
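A minimal simulation sketch (an addition) verifying that the minimum of two independent exponentials with parameter $\lambda$ is exponential with parameter $2\lambda$; $\lambda = 0.7$ and the sample size are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, n = 0.7, 1_000_000                          # assumed lambda and sample size
w = np.minimum(rng.exponential(1 / lam, n), rng.exponential(1 / lam, n))

# min(X, Y) should be exponential with parameter 2*lam, i.e. mean 1/(2*lam)
print("sample mean:", w.mean(), "  expected:", 1 / (2 * lam))
edges = np.linspace(0, 4, 41)
hist, _ = np.histogram(w, bins=edges, density=True)
c = 0.5 * (edges[:-1] + edges[1:])
print("max abs deviation:", np.abs(hist - 2 * lam * np.exp(-2 * lam * c)).max())
```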

Example: X and Y are independent exponential r.vs with common parameter $\lambda$. Define a new r.v Z as a function of X and Y and determine its density.
Solution: we solve it by partitioning the whole space. Since X and Y are both positive random variables in this case, the required probabilities are obtained by integrating the joint density over the corresponding regions of the first quadrant.


Example (discrete case): let X and Y be independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$ respectively, and let Z = X + Y. Determine the p.m.f of Z.
Solution: Z takes the integer values $0, 1, 2, \dots$. For any n, the event $\{Z = n\}$ allows only a finite number of options for X and Y: it is the union of the (n + 1) mutually exclusive events
$$\{X = k,\, Y = n - k\}, \qquad k = 0, 1, \dots, n.$$

Example – continued: as a result,
$$P(Z = n) = \sum_{k=0}^{n} P(X = k,\, Y = n - k).$$
If X and Y are also independent, then
$$P(Z = n) = \sum_{k=0}^{n} P(X = k)\,P(Y = n - k) = \sum_{k=0}^{n} e^{-\lambda_1}\frac{\lambda_1^{k}}{k!}\; e^{-\lambda_2}\frac{\lambda_2^{\,n-k}}{(n - k)!},$$
and hence
$$P(Z = n) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}\sum_{k=0}^{n}\binom{n}{k}\lambda_1^{k}\lambda_2^{\,n-k} = e^{-(\lambda_1 + \lambda_2)}\,\frac{(\lambda_1 + \lambda_2)^{n}}{n!}, \qquad n = 0, 1, 2, \dots$$

Conclusion: Z represents a Poisson random variable with parameter $\lambda_1 + \lambda_2$. The sum of independent Poisson random variables is also a Poisson random variable, whose parameter is the sum of the parameters of the original random variables. Such a procedure for determining the p.m.f of functions of discrete random variables is somewhat tedious. As we shall see, the joint characteristic function can be used in this context to solve problems of this type in an easier fashion.
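A small simulation sketch (an addition, not in the slides) checking the Poisson-sum result; the parameters $\lambda_1 = 2$, $\lambda_2 = 3.5$ and the sample size are arbitrary assumptions.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(7)
lam1, lam2, n = 2.0, 3.5, 1_000_000              # assumed parameters and sample size
z = rng.poisson(lam1, n) + rng.poisson(lam2, n)

# Z should be Poisson(lam1 + lam2): compare empirical and theoretical p.m.f values
lam = lam1 + lam2
for k in range(10):
    emp = (z == k).mean()
    pmf = exp(-lam) * lam**k / factorial(k)
    print(f"P(Z={k}): empirical {emp:.4f}, Poisson({lam}) {pmf:.4f}")
```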