5.4 Joint Distributions and Independence

A joint probability function for discrete random variables X and Y is a nonnegative function f(x,y) giving the probability that (simultaneously) X takes the value x and Y takes the value y. That is,

f(x, y) = P(X = x, Y = y).

The function f(x,y) is a joint probability distribution, or joint probability mass function, of the discrete random variables X and Y if

1. f(x, y) ≥ 0 for all (x, y),
2. Σx Σy f(x, y) = 1,
3. P(X = x, Y = y) = f(x, y).
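The two defining conditions can be checked mechanically. A minimal sketch, not from the slides; the pmf values here are made up for illustration:

```python
from math import isclose

# Hypothetical joint pmf stored as a dict mapping (x, y) -> f(x, y).
f = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Condition 1: f(x, y) >= 0 for every pair.
assert all(v >= 0 for v in f.values())

# Condition 2: the values sum to 1 over all pairs.
assert isclose(sum(f.values()), 1.0)
```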

Example A large insurance agency services a number of customers who have purchased both a homeowner's policy and an automobile policy from the agency. For each type of policy, a deductible amount must be specified. For an automobile policy the choices are $100 and $250, whereas for a homeowner's policy the choices are $0, $100, and $200. Suppose an individual with both types of policy is selected at random from the agency's files. Let X = the deductible amount on the auto policy and Y = the deductible amount on the homeowner's policy.

The joint probability table is

p(x,y)     y=0    y=100  y=200
x=100      0.20   0.10   0.20
x=250      0.05   0.15   0.30

Then p(100, 100) = P(X=100 and Y=100) = P($100 deductible on both policies) = .10

p(x,y)     y=0    y=100  y=200
x=100      0.20   0.10   0.20
x=250      0.05   0.15   0.30

The probability P(Y ≥ 100) is computed by summing the probabilities of all (x, y) pairs for which y ≥ 100:

P(Y ≥ 100) = p(100,100) + p(100,200) + p(250,100) + p(250,200) = .10 + .20 + .15 + .30 = .75
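These table lookups and sums are easy to reproduce in code. A sketch, assuming the deductible example's six table entries are encoded as a Python dict keyed by (x, y):

```python
# Joint pmf of the deductible example: keys are (auto, homeowner) amounts.
p = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

# p(100, 100) = P(X = 100 and Y = 100)
print(p[(100, 100)])                                  # 0.1

# P(Y >= 100): sum f(x, y) over every pair with y >= 100.
prob = sum(v for (x, y), v in p.items() if y >= 100)
print(round(prob, 2))                                 # 0.75
```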

From the joint probability function to individual distributions: marginal distributions The joint probability function f(x,y) of X and Y contains more information than the individual probability functions of X and Y. The individual probability functions of X and Y can be obtained from their joint probability function; we call them the marginal distributions of X and Y.

Marginal Distributions Definition: The individual probability functions for discrete random variables X and Y with joint probability function f(x,y) are called marginal probability functions. They are obtained by summing f(x,y) over all possible values of the other variable. The marginal probability function for X is

fX(x) = Σy f(x, y),

and the marginal probability function for Y is

fY(y) = Σx f(x, y).

Marginal probabilities

p(x,y)     y=0    y=100  y=200  fX(x)
x=100      0.20   0.10   0.20   0.50
x=250      0.05   0.15   0.30   0.50
fY(y)      0.25   0.25   0.50   1
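Summing over the other variable is a one-pass accumulation over the table. A sketch, using the same deductible table:

```python
from collections import defaultdict

# Joint pmf of the deductible example.
p = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

fX = defaultdict(float)  # f_X(x) = sum over y of f(x, y)
fY = defaultdict(float)  # f_Y(y) = sum over x of f(x, y)
for (x, y), v in p.items():
    fX[x] += v
    fY[y] += v

print({x: round(v, 2) for x, v in fX.items()})  # {100: 0.5, 250: 0.5}
print({y: round(v, 2) for y, v in fY.items()})  # {0: 0.25, 100: 0.25, 200: 0.5}
```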

Conditional Distributions For discrete random variables X and Y with joint probability function f(x,y), the conditional probability function of Y given X = x is

fY|X(y|x) = f(x, y) / fX(x)   (for fX(x) > 0).

Similarly, the conditional probability function of X given Y = y is

fX|Y(x|y) = f(x, y) / fY(y)   (for fY(y) > 0).

Example Given the joint probabilities and marginal probabilities below, find the conditional distribution of Y given X = 0.

f(x,y)     x=0    x=1    x=2    fY(y)
y=0        1/6    2/9    1/36   15/36
y=1        1/3    1/6    0      1/2
y=2        1/12   0      0      1/12
fX(x)      7/12   7/18   1/36   1

Solution fX(0) = 7/12 and f(0,0) = 1/6, f(0,1) = 1/3, f(0,2) = 1/12. Then

fY|X(0|0) = (1/6)/(7/12) = 2/7, fY|X(1|0) = (1/3)/(7/12) = 4/7, fY|X(2|0) = (1/12)/(7/12) = 1/7.
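The same division can be carried out exactly with rational arithmetic. A sketch of the computation above:

```python
from fractions import Fraction as F

# Column x = 0 of the example's joint table: f(0, y) for y = 0, 1, 2.
f0 = {0: F(1, 6), 1: F(1, 3), 2: F(1, 12)}

fX0 = sum(f0.values())                      # marginal f_X(0) = 7/12
cond = {y: v / fX0 for y, v in f0.items()}  # f_{Y|X}(y | 0)
print(fX0)    # 7/12
print(cond)   # {0: Fraction(2, 7), 1: Fraction(4, 7), 2: Fraction(1, 7)}
```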

Statistical Independence Suppose f(x|y) does not depend on y and f(y|x) does not depend on x. One can verify that then f(x|y) = fX(x) and f(y|x) = fY(y). Since f(x,y) = f(x|y) fY(y), it follows that f(x,y) = fX(x) fY(y). In this case X and Y are independent.

Definition Discrete random variables X and Y are called independent if their joint probability function f(x,y) is the product of their respective marginal probability functions; that is, X and Y are statistically independent if

f(x, y) = fX(x) fY(y)   for all (x, y).

Example X and Y have the following joint probability function. Verify that X and Y are independent.

f(x,y)     x=1    x=2    x=3
y=1        0.16   0.08   0.16
y=2        0.24   0.12   0.24
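Independence can be verified cell by cell against the product of the marginals. A sketch; the two x = 3 entries (0.16 and 0.24) are filled in from the pattern of the other columns and should be treated as an assumption about the table:

```python
from math import isclose

# Joint pmf of the example; the (3, 1) and (3, 2) entries are assumed.
f = {
    (1, 1): 0.16, (2, 1): 0.08, (3, 1): 0.16,
    (1, 2): 0.24, (2, 2): 0.12, (3, 2): 0.24,
}
xs, ys = {x for x, _ in f}, {y for _, y in f}

fX = {x: sum(f[(x, y)] for y in ys) for x in xs}  # marginal of X
fY = {y: sum(f[(x, y)] for x in xs) for y in ys}  # marginal of Y

# X and Y are independent iff f(x, y) = f_X(x) * f_Y(y) in every cell.
independent = all(isclose(f[(x, y)], fX[x] * fY[y]) for (x, y) in f)
print(independent)  # True
```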

Example X and Y have the following joint probability function. Verify that X and Y are dependent.

f(x,y)     x=1    x=2    x=3
y=1        0.16   0.08   0.24
y=2        0.24   0.12   0.16