7. Two Random Variables

In many experiments, the observations are expressible not as a single quantity, but as a family of quantities. For example, to record the height and weight of each person in a community, or the number of people and the total income in a family, we need two numbers.

Let X and Y denote two random variables (r.vs) based on a probability model $(\Omega, F, P)$. Then

$P(x_1 < X(\xi) \le x_2) = F_X(x_2) - F_X(x_1) = \int_{x_1}^{x_2} f_X(x)\,dx$

and

$P(y_1 < Y(\xi) \le y_2) = F_Y(y_2) - F_Y(y_1) = \int_{y_1}^{y_2} f_Y(y)\,dy.$

What about the probability that the pair of r.vs (X, Y) belongs to an arbitrary region D? In other words, how does one estimate, for example, $P((x_1 < X(\xi) \le x_2) \cap (y_1 < Y(\xi) \le y_2))$? Towards this, we define the joint probability distribution function of X and Y to be

$F_{XY}(x, y) = P((X(\xi) \le x) \cap (Y(\xi) \le y)) \ge 0,$  (7-1)

where x and y are arbitrary real numbers.

Properties

(i) $F_{XY}(-\infty, y) = F_{XY}(x, -\infty) = 0, \qquad F_{XY}(+\infty, +\infty) = 1.$  (7-2)

Since $(X(\xi) \le -\infty,\ Y(\xi) \le y) \subset (X(\xi) \le -\infty)$, we get $F_{XY}(-\infty, y) \le P(X(\xi) \le -\infty) = 0$; and since $(X(\xi) \le +\infty,\ Y(\xi) \le +\infty) = \Omega$, we get $F_{XY}(+\infty, +\infty) = P(\Omega) = 1$.
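As an aside (not part of the original slides), the defining probability in (7-1) is easy to approximate by simulation. A minimal Python sketch, taking X and Y to be independent standard normal r.vs purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    X = rng.standard_normal(n)   # illustrative choice of X
    Y = rng.standard_normal(n)   # illustrative choice of Y

    def F_XY(x, y):
        # empirical version of (7-1): the fraction of sample pairs
        # falling in the quadrant (-inf, x] x (-inf, y]
        return np.mean((X <= x) & (Y <= y))

    print(F_XY(0.0, 0.0))        # ~0.25, since P(X <= 0) = P(Y <= 0) = 1/2 here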

(ii) $P(x_1 < X(\xi) \le x_2,\ Y(\xi) \le y) = F_{XY}(x_2, y) - F_{XY}(x_1, y),$  (7-3)

$P(X(\xi) \le x,\ y_1 < Y(\xi) \le y_2) = F_{XY}(x, y_2) - F_{XY}(x, y_1).$  (7-4)

To prove (7-3), we note that for $x_2 > x_1$,

$(X(\xi) \le x_2,\ Y(\xi) \le y) = (X(\xi) \le x_1,\ Y(\xi) \le y) \cup (x_1 < X(\xi) \le x_2,\ Y(\xi) \le y),$

and the mutually exclusive property of the events on the right side gives

$F_{XY}(x_2, y) = F_{XY}(x_1, y) + P(x_1 < X(\xi) \le x_2,\ Y(\xi) \le y),$

which proves (7-3). Similarly (7-4) follows.

(iii) $P(x_1 < X(\xi) \le x_2,\ y_1 < Y(\xi) \le y_2) = F_{XY}(x_2, y_2) - F_{XY}(x_2, y_1) - F_{XY}(x_1, y_2) + F_{XY}(x_1, y_1).$  (7-5)

This is the probability that (X, Y) belongs to the rectangle in Fig. 7.1. To prove (7-5), we can make use of the following identity involving mutually exclusive events on the right side:

$(x_1 < X(\xi) \le x_2,\ Y(\xi) \le y_2) = (x_1 < X(\xi) \le x_2,\ Y(\xi) \le y_1) \cup (x_1 < X(\xi) \le x_2,\ y_1 < Y(\xi) \le y_2).$

[Fig. 7.1: the rectangle $(x_1, x_2] \times (y_1, y_2]$ in the x-y plane.]

This gives

$P(x_1 < X(\xi) \le x_2,\ Y(\xi) \le y_2) = P(x_1 < X(\xi) \le x_2,\ Y(\xi) \le y_1) + P(x_1 < X(\xi) \le x_2,\ y_1 < Y(\xi) \le y_2),$

and the desired result in (7-5) follows by making use of (7-3) with $y = y_2$ and $y = y_1$ respectively.

Joint probability density function (joint p.d.f)

By definition, the joint p.d.f of X and Y is given by

$f_{XY}(x, y) = \dfrac{\partial^2 F_{XY}(x, y)}{\partial x\,\partial y},$  (7-6)

and hence we obtain the useful formula

$F_{XY}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{XY}(u, v)\,dv\,du.$  (7-7)

Using (7-2), we also get

$\int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dx\,dy = 1.$  (7-8)
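As a quick illustration (added here, not on the slide), the normalization property (7-8) can be checked numerically for any candidate density. The sketch below uses scipy's dblquad on the triangular density of Example 7.1 further below (the constant value 2 on the region $0 < x \le y < 1$):

    from scipy.integrate import dblquad

    # dblquad integrates func(y, x), with y running between the two callables;
    # here y goes from x to 1 for each x in (0, 1), covering the triangle
    # 0 < x <= y < 1 on which the density equals 2
    total, err = dblquad(lambda y, x: 2.0, 0, 1, lambda x: x, lambda x: 1)
    print(total)   # ~1.0, as (7-8) requires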

To find the probability that (X, Y) belongs to an arbitrary region D (Fig. 7.2), we can make use of (7-5) and (7-7). From (7-5) and (7-7),

$P(x < X(\xi) \le x + \Delta x,\ y < Y(\xi) \le y + \Delta y) = \int_{x}^{x+\Delta x} \int_{y}^{y+\Delta y} f_{XY}(u, v)\,dv\,du \approx f_{XY}(x, y)\,\Delta x\,\Delta y.$

Thus the probability that (X, Y) belongs to a differential rectangle $\Delta x\,\Delta y$ equals $f_{XY}(x, y)\,\Delta x\,\Delta y$, and repeating this procedure over the union of non-overlapping differential rectangles in D, we get the useful result

$P((X, Y) \in D) = \iint_{(x, y) \in D} f_{XY}(x, y)\,dx\,dy.$  (7-9)
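To make (7-9) concrete, here is an added sketch (not from the slide) with $f_{XY}(x, y) = 1$ on the unit square (two independent uniform r.vs, an illustrative assumption) and $D = \{(x, y): x + y \le 1\}$; the exact answer is the area of the triangle, 1/2:

    from scipy.integrate import dblquad

    # P((X, Y) in D) = integral of f_XY over D, per (7-9); the region D
    # is carved out by the y-limits: y from 0 to 1 - x for each x in (0, 1)
    prob, err = dblquad(lambda y, x: 1.0, 0, 1, lambda x: 0, lambda x: 1 - x)
    print(prob)    # ~0.5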

(iv) Marginal statistics. In the context of several r.vs, the statistics of each individual one are called marginal statistics. Thus $F_X(x)$ is the marginal probability distribution function of X, and $f_X(x)$ is the marginal p.d.f of X. It is interesting to note that all marginals can be obtained from the joint p.d.f. In fact

$F_X(x) = F_{XY}(x, +\infty), \qquad F_Y(y) = F_{XY}(+\infty, y).$  (7-11)

Also

$f_X(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dx.$  (7-12)

To prove (7-11), we can make use of the identity

$(X(\xi) \le x) = (X(\xi) \le x) \cap (Y(\xi) \le +\infty),$  (7-10)

so that $F_X(x) = P(X(\xi) \le x) = P(X(\xi) \le x,\ Y(\xi) \le +\infty) = F_{XY}(x, +\infty)$.

To prove (7-12), we can make use of (7-7) and (7-11), which gives

$F_X(x) = F_{XY}(x, +\infty) = \int_{-\infty}^{x} \int_{-\infty}^{+\infty} f_{XY}(u, y)\,dy\,du,$  (7-13)

and taking the derivative with respect to x in (7-13), we get

$f_X(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dy.$  (7-14)

At this point, it is useful to know the formula for differentiation under integrals. Let

$H(x) = \int_{a(x)}^{b(x)} h(x, y)\,dy.$  (7-15)

Then its derivative with respect to x is given by

$\dfrac{dH(x)}{dx} = \dfrac{db(x)}{dx}\,h(x, b(x)) - \dfrac{da(x)}{dx}\,h(x, a(x)) + \int_{a(x)}^{b(x)} \dfrac{\partial h(x, y)}{\partial x}\,dy.$  (7-16)

Obvious use of (7-16) in (7-13) gives (7-14).
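To spell out that last step (an added remark, not on the original slide): write (7-13) as $F_X(x) = \int_{-\infty}^{x} g(u)\,du$ with $g(u) = \int_{-\infty}^{+\infty} f_{XY}(u, y)\,dy$. In the notation of (7-15)-(7-16) this corresponds to $a(x) = -\infty$, $b(x) = x$, and an integrand that does not depend on x explicitly, so that $da(x)/dx = 0$, $db(x)/dx = 1$, and the last term in (7-16) vanishes. Hence

$\dfrac{dF_X(x)}{dx} = g(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dy = f_X(x),$

which is exactly (7-14).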

If X and Y are discrete r.vs, then $p_{ij} = P(X = x_i,\ Y = y_j)$ represents their joint p.d.f, and their respective marginal p.d.fs are given by

$P(X = x_i) = \sum_j P(X = x_i,\ Y = y_j) = \sum_j p_{ij}$  (7-17)

and

$P(Y = y_j) = \sum_i P(X = x_i,\ Y = y_j) = \sum_i p_{ij}.$  (7-18)

Assuming that $P(X = x_i,\ Y = y_j)$ is written out in the form of a rectangular array, to obtain $P(X = x_i)$ from (7-17), one needs to add up all entries in the i-th row (Fig. 7.3). It used to be a practice for insurance companies routinely to scribble out these sum values in the left and top margins, thus suggesting the name marginal densities!
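The row-sum/column-sum picture translates directly into code. A small added sketch (the 2-by-3 joint p.m.f below is made up for illustration):

    import numpy as np

    # joint p.m.f p_ij as a rectangular array: rows index x_i, columns index y_j
    p = np.array([[0.10, 0.20, 0.10],
                  [0.15, 0.30, 0.15]])
    assert np.isclose(p.sum(), 1.0)   # total probability is 1

    p_X = p.sum(axis=1)   # row sums    -> P(X = x_i), as in (7-17)
    p_Y = p.sum(axis=0)   # column sums -> P(Y = y_j), as in (7-18)
    print(p_X)            # [0.4 0.6]
    print(p_Y)            # [0.25 0.5 0.25]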

From (7-11) and (7-12), the joint P.D.F and/or the joint p.d.f represent complete information about the r.vs, and their marginal p.d.fs can be evaluated from the joint p.d.f. However, given marginals, (most often) it will not be possible to compute the joint p.d.f. Consider the following example.

Example 7.1: Given

$f_{XY}(x, y) = \begin{cases} c, & 0 < x \le y < 1, \\ 0, & \text{otherwise}, \end{cases}$  (7-19)

obtain the marginal p.d.fs $f_X(x)$ and $f_Y(y)$.

Solution: It is given that the joint p.d.f $f_{XY}(x, y)$ is a constant in the shaded (triangular) region $0 < x \le y < 1$ in Fig. 7.4. We can use (7-8) to determine that constant c. From (7-8),

$\int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dx\,dy = \int_{y=0}^{1} \int_{x=0}^{y} c\,dx\,dy = \int_{0}^{1} c\,y\,dy = \dfrac{c}{2} = 1.$  (7-20)

Thus c = 2. Moreover, from (7-14),

$f_X(x) = \int_{y=x}^{1} 2\,dy = 2(1 - x), \qquad 0 < x < 1,$  (7-21)

and similarly

$f_Y(y) = \int_{x=0}^{y} 2\,dx = 2y, \qquad 0 < y < 1.$  (7-22)

Clearly, in this case, given $f_X(x)$ and $f_Y(y)$ as in (7-21)-(7-22), it will not be possible to obtain the original joint p.d.f in (7-19).

Example 7.2: X and Y are said to be jointly normal (Gaussian) distributed if their joint p.d.f has the following form:

$f_{XY}(x, y) = \dfrac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\dfrac{1}{2(1 - \rho^2)} \left[ \dfrac{(x - \mu_X)^2}{\sigma_X^2} - \dfrac{2\rho (x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \dfrac{(y - \mu_Y)^2}{\sigma_Y^2} \right] \right\},$

$-\infty < x < +\infty,\ -\infty < y < +\infty,\ |\rho| < 1.$  (7-23)
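For the record, the computations in (7-20)-(7-22) can be reproduced symbolically; a sketch using sympy (an added check, not on the slide):

    import sympy as sp

    x, y, c = sp.symbols('x y c', positive=True)

    # (7-20): integrate the constant c over the triangle 0 < x <= y < 1
    total = sp.integrate(sp.integrate(c, (x, 0, y)), (y, 0, 1))
    print(sp.solve(sp.Eq(total, 1), c))   # [2], so c = 2

    f_X = sp.integrate(2, (y, x, 1))      # (7-21): 2 - 2*x, i.e. 2*(1 - x)
    f_Y = sp.integrate(2, (x, 0, y))      # (7-22): 2*y
    print(f_X, f_Y)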

By direct integration, using (7-14) and completing the square in (7-23), it can be shown that

$f_X(x) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dy = \dfrac{1}{\sqrt{2\pi \sigma_X^2}}\, e^{-(x - \mu_X)^2 / 2\sigma_X^2} \sim N(\mu_X, \sigma_X^2),$  (7-24)

and similarly

$f_Y(y) = \int_{-\infty}^{+\infty} f_{XY}(x, y)\,dx \sim N(\mu_Y, \sigma_Y^2).$  (7-25)

Following the above notation, we will denote (7-23) as $N(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. Once again, knowing the marginals in (7-24) and (7-25) alone doesn't tell us everything about the joint p.d.f in (7-23). As we show below, the only situation where the marginal p.d.fs can be used to recover the joint p.d.f is when the random variables are statistically independent.
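A numerical version of this claim (an added sketch; the parameter values are arbitrary): integrating the jointly normal p.d.f (7-23) over y at a fixed x recovers the $N(\mu_X, \sigma_X^2)$ density of (7-24).

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import multivariate_normal, norm

    mu_X, mu_Y, sig_X, sig_Y, rho = 1.0, -2.0, 2.0, 1.0, 0.6   # arbitrary values
    cov = [[sig_X**2, rho*sig_X*sig_Y],
           [rho*sig_X*sig_Y, sig_Y**2]]
    joint = multivariate_normal(mean=[mu_X, mu_Y], cov=cov)    # the p.d.f (7-23)

    x0 = 0.5
    marginal, _ = quad(lambda y: joint.pdf([x0, y]), -np.inf, np.inf)
    print(marginal, norm.pdf(x0, loc=mu_X, scale=sig_X))       # the two agree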

Independence of r.vs

Definition: The random variables X and Y are said to be statistically independent if the events $\{X(\xi) \in A\}$ and $\{Y(\xi) \in B\}$ are independent events for any two Borel sets A and B on the x and y axes respectively. Applying the above definition to the events $\{X(\xi) \le x\}$ and $\{Y(\xi) \le y\}$, we conclude that, if the r.vs X and Y are independent, then

$P((X(\xi) \le x) \cap (Y(\xi) \le y)) = P(X(\xi) \le x)\,P(Y(\xi) \le y),$  (7-26)

i.e.,

$F_{XY}(x, y) = F_X(x)\,F_Y(y),$  (7-27)

or equivalently, if X and Y are independent, then we must have

$f_{XY}(x, y) = f_X(x)\,f_Y(y).$  (7-28)
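A small empirical illustration of (7-27) (added; independent exponential r.vs chosen arbitrarily): for independent samples, the joint empirical c.d.f factors into the product of the marginal empirical c.d.fs, up to sampling error.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    X = rng.exponential(1.0, n)   # independent by construction
    Y = rng.exponential(2.0, n)

    x0, y0 = 1.0, 2.0
    joint = np.mean((X <= x0) & (Y <= y0))        # estimates F_XY(x0, y0)
    prod = np.mean(X <= x0) * np.mean(Y <= y0)    # estimates F_X(x0) * F_Y(y0)
    print(joint, prod)                            # nearly equal, as (7-27) predicts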

If X and Y are discrete-type r.vs, then their independence implies

$P(X = x_i,\ Y = y_j) = P(X = x_i)\,P(Y = y_j) \quad \text{for all } i, j.$  (7-29)

Equations (7-26)-(7-29) give us the procedure to test for independence. Given $f_{XY}(x, y)$, obtain the marginal p.d.fs $f_X(x)$ and $f_Y(y)$ and examine whether (7-28) or (7-29) is valid. If so, the r.vs are independent; otherwise they are dependent.

Returning to Example 7.1, from (7-19)-(7-22) we observe by direct verification that $f_{XY}(x, y) \ne f_X(x)\,f_Y(y)$. Hence X and Y are dependent r.vs in that case. It is easy to see that the same holds in Example 7.2, unless $\rho = 0$. In other words, two jointly Gaussian r.vs as in (7-23) are independent if and only if the fifth parameter $\rho = 0$.
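Applying this test to Example 7.1 at a single point (a small added sketch):

    # pick a point inside the triangle 0 < x <= y < 1 of Example 7.1
    x0, y0 = 0.2, 0.5
    f_joint = 2.0                         # (7-19) with c = 2
    f_product = (2*(1 - x0)) * (2*y0)     # f_X(x0) * f_Y(y0) from (7-21)-(7-22)
    print(f_joint, f_product)             # 2.0 vs 1.6: unequal, so X, Y dependent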

Example 7.3: Given

$f_{XY}(x, y) = \begin{cases} x y^2 e^{-y}, & 0 < y < \infty,\ 0 < x < 1, \\ 0, & \text{otherwise}, \end{cases}$  (7-30)

determine whether X and Y are independent.

Solution:

$f_X(x) = \int_{0}^{\infty} f_{XY}(x, y)\,dy = x \int_{0}^{\infty} y^2 e^{-y}\,dy = x \left[ -y^2 e^{-y} \Big|_0^{\infty} + 2 \int_{0}^{\infty} y\, e^{-y}\,dy \right] = 2x, \qquad 0 < x < 1.$  (7-31)

Similarly,

$f_Y(y) = \int_{0}^{1} f_{XY}(x, y)\,dx = \dfrac{y^2}{2}\, e^{-y}, \qquad 0 < y < \infty.$  (7-32)

In this case, $f_{XY}(x, y) = f_X(x)\,f_Y(y)$, and hence X and Y are independent random variables.
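An added symbolic check of Example 7.3 using sympy: the marginals come out as in (7-31)-(7-32), and their product reproduces (7-30) on its support.

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f_XY = x * y**2 * sp.exp(-y)                # (7-30) on 0 < x < 1, y > 0

    f_X = sp.integrate(f_XY, (y, 0, sp.oo))     # (7-31): 2*x
    f_Y = sp.integrate(f_XY, (x, 0, 1))         # (7-32): y**2*exp(-y)/2
    print(f_X, f_Y)
    print(sp.simplify(f_X * f_Y - f_XY))        # 0 -> X and Y are independent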