1
Joint and marginal distribution functions

For any two random variables X and Y defined on the same sample space, the joint c.d.f. is

F(x, y) = P(X ≤ x, Y ≤ y).

For an example, see next slide. The marginal distributions can be obtained from the joint distribution as follows:

F_X(x) = lim_{y→∞} F(x, y),   F_Y(y) = lim_{x→∞} F(x, y).

When X and Y are both discrete, the joint probability mass function is given by

p(x, y) = P(X = x, Y = y).

The probability mass function of X, p_X(x), is obtained by "summing over y":

p_X(x) = Σ_y p(x, y).

Similarly, p_Y(y) = Σ_x p(x, y).
2
C.D.F. for a Bivariate Normal (density shown later)
3
Example for joint probability mass function

Consider the following table of joint probabilities p(x, y):

            Y=0    Y=3    Y=4  |  p_X(x)
    X=5     1/7    1/7    1/7  |   3/7
    X=8     3/7     0     1/7  |   4/7
    ------------------------------------
    p_Y(y)  4/7    1/7    2/7  |    1

Using the table, we have, for example, p_X(5) = 1/7 + 1/7 + 1/7 = 3/7 and p_Y(0) = 1/7 + 3/7 = 4/7.
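As a quick sanity check, the marginals can be recomputed with a few lines of NumPy; a minimal sketch, where the array layout is my own encoding of the table:

```python
import numpy as np

# Joint pmf from the table: rows are X = 5, 8; columns are Y = 0, 3, 4.
p = np.array([[1/7, 1/7, 1/7],
              [3/7, 0,   1/7]])

p_X = p.sum(axis=1)   # sum over y: [3/7, 4/7]
p_Y = p.sum(axis=0)   # sum over x: [4/7, 1/7, 2/7]

print(p_X)            # [0.4286 0.5714]
print(p_Y)            # [0.5714 0.1429 0.2857]
print(p.sum())        # 1.0, so p is a valid joint pmf
```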
4
Expected Values for Jointly Distributed Random Variables

Let X and Y be discrete random variables with joint probability mass function p(x, y). Let the sets of possible values of X and Y be A and B, resp. We define E(X) and E(Y) as

E(X) = Σ_{x∈A} x p_X(x),   E(Y) = Σ_{y∈B} y p_Y(y).

Example. For the random variables X and Y from the previous slide,

E(X) = 5(3/7) + 8(4/7) = 47/7,   E(Y) = 0(4/7) + 3(1/7) + 4(2/7) = 11/7.
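The same arithmetic, done numerically (a sketch, reusing the array encoding from the previous snippet):

```python
import numpy as np

x_vals = np.array([5, 8])
y_vals = np.array([0, 3, 4])
p = np.array([[1/7, 1/7, 1/7],
              [3/7, 0,   1/7]])

# E(X) = sum_x x * p_X(x), E(Y) = sum_y y * p_Y(y)
E_X = x_vals @ p.sum(axis=1)   # 5*(3/7) + 8*(4/7) = 47/7
E_Y = y_vals @ p.sum(axis=0)   # 0*(4/7) + 3*(1/7) + 4*(2/7) = 11/7
print(E_X, E_Y)                # ≈ 6.714 and 1.571
```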
5
Law of the Unconscious Statistician Revisited

Theorem. Let p(x, y) be the joint probability mass function of discrete random variables X and Y. Let A and B be the sets of possible values of X and Y, resp. If h is a function of two variables from ℝ² to ℝ, then h(X, Y) is a discrete random variable with expected value given by

E[h(X, Y)] = Σ_{x∈A} Σ_{y∈B} h(x, y) p(x, y),

provided that the sum is absolutely convergent.

Corollary. For discrete random variables X and Y,

E(X + Y) = E(X) + E(Y).

Problem. Verify the corollary for X and Y from two slides previous.
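One way to work the stated problem is to evaluate the LOTUS double sum directly and compare it with E(X) + E(Y); a sketch, not part of the original slides:

```python
import numpy as np

x_vals = np.array([5, 8])
y_vals = np.array([0, 3, 4])
p = np.array([[1/7, 1/7, 1/7],
              [3/7, 0,   1/7]])

# LOTUS with h(x, y) = x + y: sum of (x + y) * p(x, y) over all cells
X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
E_sum = ((X + Y) * p).sum()

E_X = x_vals @ p.sum(axis=1)
E_Y = y_vals @ p.sum(axis=0)
print(E_sum, E_X + E_Y)   # both 58/7 ≈ 8.286, verifying the corollary
```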
6
Joint and marginal distribution functions for continuous r.v.'s

Random variables X and Y are jointly continuous if there exists a nonnegative function f(x, y) such that

P{(X, Y) ∈ C} = ∬_C f(x, y) dx dy

for every well-behaved subset C of ℝ². The function f(x, y) is called the joint probability density function of X and Y. It follows that

F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv,   so   f(x, y) = ∂²F(x, y)/∂x∂y.

Also, the marginal densities are

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,   f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.
7
Density for a Bivariate Normal (see page 449 for formula)
8
Example of joint density for continuous r.v.'s

Let the joint density of X and Y be

f(x, y) = 2e^{−x}e^{−2y} for x > 0 and y > 0, and 0 otherwise.

Prove that (1) P{X > 1, Y < 1} = e^{−1}(1 − e^{−2}) and (2) P{X < Y} = 1/3.
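Both probabilities can be checked numerically with SciPy; a sketch, assuming the density stated above:

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density f(x, y) = 2 e^{-x} e^{-2y} on x, y > 0.
# Note: dblquad expects the integrand as func(y, x).
f = lambda y, x: 2 * np.exp(-x) * np.exp(-2 * y)

# (1) P{X > 1, Y < 1}: x from 1 to inf, y from 0 to 1
p1, _ = dblquad(f, 1, np.inf, lambda x: 0, lambda x: 1)
print(p1, np.exp(-1) * (1 - np.exp(-2)))     # both ≈ 0.3181

# (2) P{X < Y}: x from 0 to inf, y from x to inf
p2, _ = dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)
print(p2, 1 / 3)                             # both ≈ 0.3333
```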
9
Expected Values for Jointly Distributed Continuous R.V.s

Let X and Y be continuous random variables with joint probability density function f(x, y). We define E(X) and E(Y) as

E(X) = ∫_{−∞}^{∞} x f_X(x) dx,   E(Y) = ∫_{−∞}^{∞} y f_Y(y) dy.

Example. For the random variables X and Y from the previous slide,

f_X(x) = ∫_0^∞ 2e^{−x}e^{−2y} dy = e^{−x} (x > 0),   f_Y(y) = 2e^{−2y} (y > 0).

That is, X and Y are exponential random variables (with rates 1 and 2). It follows that

E(X) = 1,   E(Y) = 1/2.
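A quick numerical confirmation of the two expectations (a sketch using the marginal densities found above):

```python
import numpy as np
from scipy.integrate import quad

f_X = lambda x: np.exp(-x)           # exponential with rate 1
f_Y = lambda y: 2 * np.exp(-2 * y)   # exponential with rate 2

E_X, _ = quad(lambda x: x * f_X(x), 0, np.inf)
E_Y, _ = quad(lambda y: y * f_Y(y), 0, np.inf)
print(E_X, E_Y)   # ≈ 1.0 and 0.5
```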
10
Law of the Unconscious Statistician Again

Theorem. Let f(x, y) be the joint density function of random variables X and Y. If h is a function of two variables from ℝ² to ℝ, then h(X, Y) is a random variable with expected value given by

E[h(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) f(x, y) dx dy,

provided the integral is absolutely convergent.

Corollary. For random variables X and Y as in the above theorem,

E(X + Y) = E(X) + E(Y).

Example. For X and Y defined two slides previous,

E(X + Y) = 1 + 1/2 = 3/2.
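The example can likewise be verified by evaluating the LOTUS double integral numerically (a sketch, assuming the same density as before):

```python
import numpy as np
from scipy.integrate import dblquad

# E(X + Y) by LOTUS, with h(x, y) = x + y and f(x, y) = 2 e^{-x} e^{-2y}
integrand = lambda y, x: (x + y) * 2 * np.exp(-x) * np.exp(-2 * y)
E_sum, _ = dblquad(integrand, 0, np.inf, lambda x: 0, lambda x: np.inf)
print(E_sum)   # ≈ 1.5 = E(X) + E(Y)
```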
11
Random Selection of a Point from a Planar Region

Let S be a subset of the plane with area A(S). A point is said to be randomly selected from S if, for any subset R of S with area A(R), the probability that R contains the point is A(R)/A(S).

Problem. Two people arrive at a restaurant at random times between 11:30 am and 12:00 noon. What is the probability that their arrival times differ by ten minutes or less?

Solution. Let X and Y be the minutes past 11:30 am that the two people arrive. Let

S = {(x, y) : 0 ≤ x ≤ 30, 0 ≤ y ≤ 30},   R = {(x, y) ∈ S : |x − y| ≤ 10}.

The desired probability is

A(R)/A(S) = (30² − 20²)/30² = 500/900 = 5/9.
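A Monte Carlo check of the geometric answer (a sketch; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0, 30, n)   # first arrival, minutes past 11:30
y = rng.uniform(0, 30, n)   # second arrival
print(np.mean(np.abs(x - y) <= 10), 5 / 9)   # both ≈ 0.5556
```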
12
Independent random variables

Random variables X and Y are independent if, for any two sets of real numbers A and B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}.

That is, the events E_A = {X ∈ A} and E_B = {Y ∈ B} are independent. In terms of F, X and Y are independent if and only if

F(x, y) = F_X(x) F_Y(y) for all x, y.

When X and Y are discrete, they are independent if and only if

p(x, y) = p_X(x) p_Y(y) for all x, y.

In the jointly continuous case, X and Y are independent if and only if

f(x, y) = f_X(x) f_Y(y) for all x, y.
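As a concrete illustration, the discrete pair from the earlier table is not independent, and a cell-by-cell comparison shows it (a sketch, using the same array encoding as before):

```python
import numpy as np

p = np.array([[1/7, 1/7, 1/7],
              [3/7, 0,   1/7]])
p_X = p.sum(axis=1)
p_Y = p.sum(axis=0)

# Independent iff p(x, y) = p_X(x) * p_Y(y) in every cell
print(np.allclose(p, np.outer(p_X, p_Y)))
# False: e.g. p(8, 3) = 0 but p_X(8) * p_Y(3) = 4/49
```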
13
Example for independent jointly distributed r.v.'s

A man and a woman decide to meet at a certain location. If each person independently arrives at a time uniformly distributed between 12 noon and 1 pm, find the probability that the first to arrive has to wait longer than 10 minutes.

Solution. Let X and Y denote, resp., the minutes past noon that the man and the woman arrive. X and Y are independent and uniform on (0, 60). By symmetry, the desired probability is

P{X + 10 < Y} + P{Y + 10 < X} = 2 P{X + 10 < Y} = 2 · (1/2)(50/60)² = 25/36.
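A simulation check of the answer, along the same lines as the restaurant example (a sketch; seed and sample size arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0, 60, n)   # man's arrival, minutes past noon
y = rng.uniform(0, 60, n)   # woman's arrival
print(np.mean(np.abs(x - y) > 10), 25 / 36)   # both ≈ 0.6944
```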
14
Sums of independent random variables

Suppose that X and Y are independent continuous random variables having probability density functions f_X and f_Y. Then

F_{X+Y}(a) = P{X + Y ≤ a} = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy.

We obtain the density of the sum by differentiating:

f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy.

The right-hand side of the latter equation defines the convolution of f_X and f_Y. The next slide works a concrete example.
15
Example for sum of two independent random variables

Suppose X and Y are independent random variables, both uniformly distributed on (0, 1). The density of X + Y is computed as follows:

f_{X+Y}(a) = ∫_0^1 f_X(a − y) dy = ∫_0^1 1{0 < a − y < 1} dy
           = a       for 0 ≤ a ≤ 1,
             2 − a   for 1 < a < 2,
             0       otherwise.

Because of the shape of its density function, X + Y is said to have a triangular distribution.
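The triangular shape also falls out of a discretized numerical convolution of the two uniform densities (a sketch; the grid step of 0.001 is arbitrary):

```python
import numpy as np

# Discretize the Uniform(0, 1) density and convolve numerically
dx = 0.001
x = np.arange(0, 1, dx)
f = np.ones_like(x)                    # f_X = f_Y = 1 on (0, 1)
g = np.convolve(f, f) * dx             # approximates f_{X+Y}

a = np.arange(len(g)) * dx             # support of the sum: (0, 2)
triangle = np.where(a <= 1, a, 2 - a)  # the closed-form triangular density
print(np.max(np.abs(g - triangle)))    # ≈ 0.001, pure discretization error
```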
16
Functions of Independent Random Variables

Theorem. Let X and Y be independent random variables and let g and h be real-valued functions of a single real variable. Then

(i) g(X) and h(Y) are also independent random variables;
(ii) E[g(X) h(Y)] = E[g(X)] E[h(Y)], provided the expectations exist.

Example. If X and Y are independent, then E(XY) = E(X) E(Y).
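A simulation illustration of the example, using one arbitrary choice of independent X and Y (a sketch; the distributions are mine, any independent pair would do):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.exponential(1.0, n)    # E(X) = 1
y = rng.uniform(0.0, 1.0, n)   # E(Y) = 1/2, drawn independently of x

print(np.mean(x * y), np.mean(x) * np.mean(y))   # both ≈ 0.5
```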