PROBABILITY AND STATISTICS FOR ENGINEERING Hossein Sameti Department of Computer Engineering Sharif University of Technology Two Random Variables

Introduction  In many experiments, the observations are expressible not as a single quantity but as a family of quantities, for example: the height and weight of each person, or the number of people and the total income in a family.

Joint Probability Distribution Function  Let X and Y denote two r.vs defined on the same probability model (Ω, F, P). Then P(X ≤ x) = F_X(x) and P(Y ≤ y) = F_Y(y) describe each of them separately.  What about the joint event {X ≤ x, Y ≤ y}?  The joint probability distribution function of X and Y is defined to be F_XY(x, y) = P(X ≤ x, Y ≤ y), where x and y are arbitrary real numbers.
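To make the definition concrete, the sketch below estimates F_XY(x, y) empirically as the fraction of sample pairs with X ≤ x and Y ≤ y. The correlated Gaussian data driving it is an assumed example for illustration, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: a correlated Gaussian pair (any (X, Y) sample would do).
n = 100_000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)

def empirical_joint_cdf(x_samples, y_samples, x0, y0):
    """Estimate F_XY(x0, y0) = P(X <= x0, Y <= y0) as the fraction of
    sample pairs falling in the quadrant (-inf, x0] x (-inf, y0]."""
    return np.mean((x_samples <= x0) & (y_samples <= y0))

print(empirical_joint_cdf(x, y, 0.0, 0.0))   # empirical estimate of F_XY(0, 0)
```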

Properties (1)  F_XY(-∞, y) = 0, F_XY(x, -∞) = 0, and F_XY(+∞, +∞) = 1. Proof  Since {X ≤ -∞, Y ≤ y} ⊂ {X ≤ -∞} and P(X ≤ -∞) = 0, it follows that F_XY(-∞, y) = 0; the same argument gives F_XY(x, -∞) = 0.  Since {X ≤ +∞, Y ≤ +∞} is the whole sample space Ω, we get F_XY(+∞, +∞) = P(Ω) = 1.

Properties (2)  P(x1 < X ≤ x2, Y ≤ y) = F_XY(x2, y) - F_XY(x1, y) and P(X ≤ x, y1 < Y ≤ y2) = F_XY(x, y2) - F_XY(x, y1). Proof  For x1 < x2, the event {X ≤ x2, Y ≤ y} can be written as {X ≤ x1, Y ≤ y} ∪ {x1 < X ≤ x2, Y ≤ y}.  Since this is a union of ME (mutually exclusive) events, F_XY(x2, y) = F_XY(x1, y) + P(x1 < X ≤ x2, Y ≤ y), which gives the first result.  Proof of the second part is similar.

Properties (3)  P(x1 < X ≤ x2, y1 < Y ≤ y2) = F_XY(x2, y2) - F_XY(x2, y1) - F_XY(x1, y2) + F_XY(x1, y1). Proof  The event {x1 < X ≤ x2, Y ≤ y2} is the union of the ME events {x1 < X ≤ x2, Y ≤ y1} and {x1 < X ≤ x2, y1 < Y ≤ y2}.  So, using property 2 on each term, we come to the desired result.
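Property 3 is easy to verify numerically: the rectangle probability computed from the four corner values of F_XY should match a direct Monte Carlo estimate. A minimal sketch, assuming (purely for illustration) that X and Y are independent Exp(1) variables so that F_XY is available in closed form:

```python
import numpy as np

def F_XY(x, y):
    """Joint CDF of two independent Exp(1) variables (assumed example)."""
    return (1 - np.exp(-np.maximum(x, 0.0))) * (1 - np.exp(-np.maximum(y, 0.0)))

x1, x2, y1, y2 = 0.5, 1.5, 0.2, 1.0

# Property 3: P(x1 < X <= x2, y1 < Y <= y2) from the four corners of F_XY.
p_cdf = F_XY(x2, y2) - F_XY(x2, y1) - F_XY(x1, y2) + F_XY(x1, y1)

# Monte Carlo check of the same rectangle probability.
rng = np.random.default_rng(1)
X = rng.exponential(size=1_000_000)
Y = rng.exponential(size=1_000_000)
p_mc = np.mean((X > x1) & (X <= x2) & (Y > y1) & (Y <= y2))

print(p_cdf, p_mc)   # the two estimates closely agree
```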

Joint Probability Density Function (Joint p.d.f)  By definition, the joint p.d.f of X and Y is given by f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y).  Hence, F_XY(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f_XY(u, v) dv du.  So, using the first property, ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} f_XY(x, y) dx dy = 1.
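The defining relation f_XY = ∂²F_XY/(∂x ∂y) can be checked with a finite-difference approximation of the mixed partial derivative; the independent-exponential pair below is again only an assumed example where both F_XY and f_XY are known in closed form.

```python
import numpy as np

def F_XY(x, y):
    """Joint CDF of two independent Exp(1) variables (assumed example)."""
    return (1 - np.exp(-x)) * (1 - np.exp(-y))

def f_XY(x, y):
    """The corresponding joint p.d.f., known in closed form here."""
    return np.exp(-x) * np.exp(-y)

def mixed_partial(F, x, y, h=1e-4):
    """Central-difference approximation of d^2 F / (dx dy)."""
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

print(mixed_partial(F_XY, 0.7, 1.3), f_XY(0.7, 1.3))  # nearly equal
```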

Calculating Probability  How do we find the probability that (X, Y) belongs to an arbitrary region D?  Using the third property, P((X, Y) ∈ D) = ∫∫_D f_XY(x, y) dx dy.
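Numerically, P((X, Y) ∈ D) is just a double integral of the joint p.d.f over D. The sketch below uses scipy's dblquad with an assumed joint p.d.f (independent Exp(1) variables) and D = {(x, y): x ≥ 0, y ≥ 0, x + y ≤ 1}; both choices are illustrative.

```python
import numpy as np
from scipy.integrate import dblquad

def f_XY(x, y):
    """Assumed joint p.d.f.: two independent Exp(1) variables."""
    return np.exp(-x) * np.exp(-y) if x >= 0 and y >= 0 else 0.0

# D = {(x, y): x >= 0, y >= 0, x + y <= 1}.
# dblquad integrates func(y, x) with y as the inner variable.
prob, err = dblquad(lambda y, x: f_XY(x, y), 0, 1, lambda x: 0, lambda x: 1 - x)
print(prob)   # exact value is 1 - 2/e ≈ 0.2642
```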

Marginal Statistics  The statistics of each random variable individually are called marginal statistics: F_X(x) is the marginal distribution function of X, and f_X(x) is the marginal p.d.f of X.  Both can be obtained from the joint quantities: F_X(x) = F_XY(x, +∞) and f_X(x) = ∫_{-∞}^{+∞} f_XY(x, y) dy. Proof  Since {X ≤ x} = {X ≤ x, Y ≤ +∞}, we have F_X(x) = P(X ≤ x) = P(X ≤ x, Y ≤ +∞) = F_XY(x, +∞).

Marginal Statistics  Proof (continued)  Writing F_X(x) = F_XY(x, +∞) = ∫_{-∞}^{x} ( ∫_{-∞}^{+∞} f_XY(u, y) dy ) du and using the formula for differentiation under integrals, taking the derivative with respect to x we get f_X(x) = ∫_{-∞}^{+∞} f_XY(x, y) dy.
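When no closed form is handy, the marginal p.d.f can be computed by numerical integration of the joint p.d.f over the other variable. A minimal sketch, using a zero-mean jointly Gaussian pair with correlation 0.6 as an assumed example (its marginal for X is known to be standard normal, which gives a check):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import multivariate_normal, norm

# Assumed example: zero-mean, unit-variance jointly Gaussian pair, rho = 0.6.
rho = 0.6
joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])

def f_X(x):
    """Marginal p.d.f of X: integrate the joint p.d.f over y."""
    val, _ = quad(lambda y: joint.pdf([x, y]), -np.inf, np.inf)
    return val

print(f_X(0.5), norm.pdf(0.5))   # numerical marginal matches N(0, 1)
```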

Differentiation Under Integrals  Let H(x) = ∫_{a(x)}^{b(x)} h(x, y) dy, where a(x) and b(x) are differentiable and h(x, y) has a continuous partial derivative with respect to x.  Then: dH(x)/dx = h(x, b(x)) b'(x) - h(x, a(x)) a'(x) + ∫_{a(x)}^{b(x)} ∂h(x, y)/∂x dy.
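The rule can be verified symbolically for a concrete integrand; the function e^(-xy) with upper limit x below is chosen only for illustration.

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Illustrative integrand in which x appears both inside the integral
# and in the upper limit of integration.
h = sp.exp(-x * y)

# Direct route: integrate first, then differentiate.
H = sp.integrate(h, (y, 0, x))
direct = sp.diff(H, x)

# Leibniz rule: d/dx  int_0^x h(x, y) dy = h(x, x) + int_0^x dh/dx dy.
leibniz = h.subs(y, x) + sp.integrate(sp.diff(h, x), (y, 0, x))

print(sp.simplify(direct - leibniz))   # should print 0
```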

Marginal p.d.fs for Discrete r.vs  Suppose X and Y are discrete r.vs taking values x_i and y_j.  Joint p.d.f: p_ij = P(X = x_i, Y = y_j).  Marginal p.d.fs: P(X = x_i) = Σ_j p_ij and P(Y = y_j) = Σ_i p_ij.  When the joint p.d.f is written in a tabular fashion, to obtain P(X = x_i) one needs to add up all entries in the i-th row (and all entries in the j-th column for P(Y = y_j)), i.e., the sums written in the margins of the table.  This suggests the name marginal densities.
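In code, a tabulated joint p.m.f is simply a matrix whose row and column sums give the marginals; the numbers below form an arbitrary illustrative table.

```python
import numpy as np

# Rows index x_i, columns index y_j; entries are P(X = x_i, Y = y_j).
# Arbitrary illustrative table (entries sum to 1).
p = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.25, 0.25]])

p_X = p.sum(axis=1)   # add up the i-th row    -> P(X = x_i)
p_Y = p.sum(axis=0)   # add up the j-th column -> P(Y = y_j)

print(p_X)   # [0.35 0.65]
print(p_Y)   # [0.25 0.45 0.30]
```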

Example  Given marginals, it may not be possible to compute the joint p.d.f.  Obtain the marginal p.d.fs and for: Example

Solution  f_XY(x, y) is constant in the shaded region, so the requirement that it integrate to 1 forces c to be the reciprocal of the region's area.  Thus c = 2.  Moreover, f_X(x) is obtained by integrating f_XY(x, y) over y for each fixed x.  Similarly, f_Y(y) follows by integrating over x.  Clearly, in this case, given f_X(x) and f_Y(y) alone it will not be possible to obtain the original joint p.d.f f_XY(x, y).
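The shaded region itself is not reproduced in this transcript. Purely to illustrate the computation, the sketch below assumes it is the triangle 0 < y < x < 1 (area 1/2), recovers c = 2 symbolically, and computes the two marginals.

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)

# Assumed region: the triangle 0 < y < x < 1, on which f_XY(x, y) = c.
total = sp.integrate(sp.integrate(c, (y, 0, x)), (x, 0, 1))   # c/2
c_val = sp.solve(sp.Eq(total, 1), c)[0]                        # c = 2

f_X = sp.integrate(c_val, (y, 0, x))    # marginal of X: 2*x     on (0, 1)
f_Y = sp.integrate(c_val, (x, y, 1))    # marginal of Y: 2 - 2*y on (0, 1)
print(c_val, f_X, f_Y)
```

Under that assumption the product f_X(x) f_Y(y) = 2x(2 - 2y) differs from the constant 2 on the triangle, which is exactly the point: the marginals alone do not determine f_XY.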

Example  X and Y are said to be jointly normal (Gaussian) distributed, if their joint p.d.f has the following form:  By direct integration,  So the above distribution is denoted by Example

Example - continued  The marginals depend only on μ_X, μ_Y, σ_X, σ_Y and not on ρ, so, once again, knowing the marginals alone doesn't tell us everything about the joint p.d.f.  We will show that the only situation where the marginal p.d.fs can be used to recover the joint p.d.f is when the random variables are statistically independent.
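This is easy to see numerically: two jointly Gaussian pairs with different ρ produce identical marginal statistics while their joint behavior differs. A small sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def sample_bvn(rho, n):
    """Sample a zero-mean, unit-variance jointly Gaussian pair with correlation rho."""
    z1 = rng.normal(size=n)
    z2 = rng.normal(size=n)
    return z1, rho * z1 + np.sqrt(1 - rho**2) * z2

x_a, y_a = sample_bvn(0.0, n)   # independent case
x_b, y_b = sample_bvn(0.9, n)   # strongly correlated case

# Marginal statistics agree even though the joint distributions differ.
print(np.mean(y_a), np.std(y_a))   # ≈ 0, 1
print(np.mean(y_b), np.std(y_b))   # ≈ 0, 1
print(np.corrcoef(x_a, y_a)[0, 1], np.corrcoef(x_b, y_b)[0, 1])  # ≈ 0.0 vs ≈ 0.9
```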

Independence of r.vs  Definition: The random variables X and Y are said to be statistically independent if the events {X ∈ A} and {Y ∈ B} are independent events for any two Borel sets A and B on the x and y axes respectively.  For the events {X ≤ x} and {Y ≤ y}, if the r.vs X and Y are independent, then P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y), i.e., F_XY(x, y) = F_X(x) F_Y(y), or equivalently f_XY(x, y) = f_X(x) f_Y(y).  If X and Y are discrete-type r.vs, then their independence implies P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j) for all i, j.

Independence of r.vs  Procedure to test for independence: Given f_XY(x, y), obtain the marginal p.d.fs f_X(x) and f_Y(y), and examine whether the independence condition for discrete/continuous type r.vs is satisfied.  Example: Two jointly Gaussian r.vs as in (7-23) are independent if and only if the fifth parameter ρ = 0.
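For the Gaussian case the procedure reduces to comparing f_XY(x, y) with f_X(x) f_Y(y): they coincide exactly when ρ = 0. A minimal numerical sketch with illustrative parameters:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def factorization_gap(rho, x=0.8, y=-0.3):
    """Compare f_XY(x, y) with f_X(x) * f_Y(y) for a zero-mean,
    unit-variance jointly Gaussian pair with correlation rho."""
    joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).pdf([x, y])
    product = norm.pdf(x) * norm.pdf(y)
    return joint - product

print(factorization_gap(0.0))   # ≈ 0: the joint p.d.f factors (independent)
print(factorization_gap(0.7))   # nonzero: not independent
```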

Example  Given a joint p.d.f f_XY(x, y), determine whether X and Y are independent. Solution  Obtain f_X(x) by integrating f_XY(x, y) over y; similarly, obtain f_Y(y) by integrating over x.  In this case f_XY(x, y) = f_X(x) f_Y(y), and hence X and Y are independent random variables.
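The same test can be wrapped into a reusable routine that compares the joint p.d.f with the product of numerically computed marginals on a grid of test points. Since the transcript does not reproduce the specific f_XY of this example, the densities used below are only stand-ins; one of them factors and the other does not.

```python
import numpy as np
from scipy.integrate import quad

def check_independence(f_xy, xs, ys, lo=0.0, hi=np.inf, tol=1e-6):
    """Numerically test whether f_XY(x, y) == f_X(x) * f_Y(y) on a grid of
    test points; the marginals are obtained by integrating out the other
    variable over the support [lo, hi]."""
    f_X = lambda x: quad(lambda y: f_xy(x, y), lo, hi)[0]
    f_Y = lambda y: quad(lambda x: f_xy(x, y), lo, hi)[0]
    gaps = [abs(f_xy(x, y) - f_X(x) * f_Y(y)) for x in xs for y in ys]
    return max(gaps) < tol

# Stand-in densities on the positive quadrant (not the slide's example).
indep = lambda x, y: np.exp(-x - y) if x > 0 and y > 0 else 0.0            # factors
coupled = lambda x, y: x * np.exp(-x * (1 + y)) if x > 0 and y > 0 else 0.0  # does not

grid = [0.5, 1.0, 2.0]
print(check_independence(indep, grid, grid))    # True
print(check_independence(coupled, grid, grid))  # False
```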