The Erik Jonsson School of Engineering and Computer Science
Chapter 4, pp. 153-210
William J. Pervin
The University of Texas at Dallas, Richardson, Texas



Chapter 4: Pairs of Random Variables

Joint CDF: The joint CDF F_X,Y of RVs X and Y is F_X,Y(x,y) = P[X ≤ x, Y ≤ y]

Properties of the joint CDF:
0 ≤ F_X,Y(x,y) ≤ 1
If x1 ≤ x2 and y1 ≤ y2, then F_X,Y(x1,y1) ≤ F_X,Y(x2,y2)
F_X,Y(∞,∞) = 1
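These properties can be checked numerically. A minimal Python sketch, using an invented four-point joint PMF (not from the slides) to build the joint CDF:

```python
# Invented joint PMF for illustration only.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def joint_cdf(x, y):
    """F_X,Y(x,y) = P[X <= x, Y <= y]: sum the PMF over the lower-left quadrant."""
    return sum(p for (a, b), p in pmf.items() if a <= x and b <= y)

# Monotone in each coordinate: (0,0) <= (0,1) <= (1,1) componentwise.
assert joint_cdf(0, 0) <= joint_cdf(0, 1) <= joint_cdf(1, 1)
# F_X,Y(inf, inf) = 1 (up to floating-point rounding).
assert abs(joint_cdf(float("inf"), float("inf")) - 1.0) < 1e-12
```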

Joint PMF: The joint PMF of discrete RVs X and Y is P_X,Y(x,y) = P[X = x, Y = y]. The range S_X,Y is a subset of S_X × S_Y.

For discrete RVs X and Y and any B ⊆ S_X × S_Y, the probability of the event {(X,Y) ∈ B} is P[B] = Σ_(x,y)∈B P_X,Y(x,y)

Marginal PMF: For discrete RVs X and Y with joint PMF P_X,Y(x,y),
P_X(x) = Σ_y∈S_Y P_X,Y(x,y)
P_Y(y) = Σ_x∈S_X P_X,Y(x,y)
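The marginal sums above can be sketched in Python. The joint PMF values here are invented for illustration; each marginal is just a row or column sum of the joint table:

```python
from collections import defaultdict

# Invented joint PMF for illustration only.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

px = defaultdict(float)
py = defaultdict(float)
for (x, y), p in pmf.items():
    px[x] += p   # P_X(x): sum over y
    py[y] += p   # P_Y(y): sum over x

assert abs(px[0] - 0.3) < 1e-12 and abs(px[1] - 0.7) < 1e-12
# Each marginal is itself a PMF, so it sums to 1.
assert abs(sum(py.values()) - 1.0) < 1e-12
```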

Joint PDF: The joint PDF of continuous RVs X and Y is the function f_X,Y such that
F_X,Y(x,y) = ∫_-∞^y ∫_-∞^x f_X,Y(u,v) du dv
f_X,Y(x,y) = ∂²F_X,Y(x,y)/∂x∂y

f_X,Y(x,y) ≥ 0 for all (x,y)
F_X,Y(∞,∞) = 1

Marginal PDF: If X and Y are RVs with joint PDF f_X,Y, the marginal PDFs are
f_X(x) = ∫_-∞^∞ f_X,Y(x,y) dy
f_Y(y) = ∫_-∞^∞ f_X,Y(x,y) dx
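Integrating out the other variable can be done numerically. A sketch with an invented density f(x,y) = x + y on the unit square (a valid joint PDF whose marginal is f_X(x) = x + 1/2), using a midpoint-rule sum for the integral over y:

```python
# Invented joint PDF: f(x,y) = x + y on [0,1] x [0,1], zero elsewhere.
def f(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x, n=10_000):
    """f_X(x) = integral of f(x,y) dy, approximated by the midpoint rule."""
    h = 1.0 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

# Analytically f_X(0.25) = 0.25 + 0.5 = 0.75.
assert abs(marginal_x(0.25) - 0.75) < 1e-6
```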

Functions of Two RVs: A derived RV is W = g(X,Y)
X, Y discrete: P_W(w) = Σ_(x,y):g(x,y)=w P_X,Y(x,y)

X, Y continuous: F_W(w) = P[W ≤ w] = ∫∫_g(x,y)≤w f_X,Y(x,y) dx dy
Example: If W = max(X,Y), then F_W(w) = F_X,Y(w,w) = ∫_y≤w ∫_x≤w f_X,Y(x,y) dx dy
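For the discrete case, the derived PMF is literally a sum over the level set {(x,y) : g(x,y) = w}. A sketch for W = max(X,Y) with an invented joint PMF:

```python
from collections import defaultdict

# Invented joint PMF for illustration only.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# P_W(w) = sum of P_X,Y(x,y) over all (x,y) with g(x,y) = w, here g = max.
pw = defaultdict(float)
for (x, y), p in pmf.items():
    pw[max(x, y)] += p

assert abs(pw[0] - 0.1) < 1e-12   # only (0,0) has max 0
assert abs(pw[1] - 0.9) < 1e-12   # the other three outcomes
# Consistent with the CDF identity F_W(w) = F_X,Y(w,w) at w = 0.
assert abs(pw[0] - sum(p for (a, b), p in pmf.items() if a <= 0 and b <= 0)) < 1e-12
```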

Expected Values: For RVs X and Y, if W = g(X,Y), then
Discrete: E[W] = Σ_x Σ_y g(x,y) P_X,Y(x,y)
Continuous: E[W] = ∫∫ g(x,y) f_X,Y(x,y) dx dy

Theorem: E[Σ_i g_i(X,Y)] = Σ_i E[g_i(X,Y)]
In particular: E[X + Y] = E[X] + E[Y]
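Linearity of expectation needs no independence; it can be verified directly on a joint PMF. A sketch with invented values (the same kind of four-point table as above):

```python
# Invented joint PMF for illustration only.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# E[X + Y] computed two ways: directly, and as E[X] + E[Y].
e_sum = sum((x + y) * p for (x, y), p in pmf.items())
e_x = sum(x * p for (x, y), p in pmf.items())
e_y = sum(y * p for (x, y), p in pmf.items())

assert abs(e_sum - (e_x + e_y)) < 1e-12
```

Note that X and Y in this table are not independent (P_X,Y(0,0) = 0.1 but P_X(0)P_Y(0) = 0.3 * 0.4 = 0.12), yet the identity still holds.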

The covariance of two RVs X and Y is Cov[X,Y] = σ_X,Y = E[(X - μ_X)(Y - μ_Y)]
Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X,Y]

The correlation of two RVs X and Y is r_X,Y = E[XY]
Cov[X,Y] = r_X,Y - μ_X μ_Y
Cov[X,X] = Var[X] and r_X,X = E[X²]
Correlation coefficient: ρ_X,Y = Cov[X,Y]/(σ_X σ_Y)
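A sketch computing the correlation r_X,Y = E[XY], the covariance via Cov[X,Y] = r_X,Y - μ_X μ_Y, and the correlation coefficient, again on an invented joint PMF:

```python
import math

# Invented joint PMF for illustration only.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

ex = sum(x * p for (x, y), p in pmf.items())         # mu_X = 0.7
ey = sum(y * p for (x, y), p in pmf.items())         # mu_Y = 0.6
r = sum(x * y * p for (x, y), p in pmf.items())      # r_X,Y = E[XY] = 0.4
cov = r - ex * ey                                    # 0.4 - 0.42 = -0.02
var_x = sum(x * x * p for (x, y), p in pmf.items()) - ex ** 2
var_y = sum(y * y * p for (x, y), p in pmf.items()) - ey ** 2
rho = cov / math.sqrt(var_x * var_y)                 # correlation coefficient

assert abs(cov - (-0.02)) < 1e-12
assert -1.0 <= rho <= 1.0   # always holds for a correlation coefficient
```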

Independent RVs: X and Y are independent if and only if, for all x and y,
Discrete: P_X,Y(x,y) = P_X(x) P_Y(y)
Continuous: f_X,Y(x,y) = f_X(x) f_Y(y)

For independent RVs X and Y:
E[g(X)h(Y)] = E[g(X)] E[h(Y)]
r_X,Y = E[XY] = E[X] E[Y]
Cov[X,Y] = σ_X,Y = 0
Var[X + Y] = Var[X] + Var[Y]
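These consequences can be verified on a joint PMF built as a product of marginals, which makes X and Y independent by construction. The Bernoulli parameters here are invented:

```python
# Invented marginals; the product construction forces independence.
px = {0: 0.4, 1: 0.6}
py = {0: 0.7, 1: 0.3}
pmf = {(x, y): px[x] * py[y] for x in px for y in py}

ex = sum(x * p for (x, y), p in pmf.items())
ey = sum(y * p for (x, y), p in pmf.items())
exy = sum(x * y * p for (x, y), p in pmf.items())
vx = sum(x * x * p for (x, y), p in pmf.items()) - ex ** 2
vy = sum(y * y * p for (x, y), p in pmf.items()) - ey ** 2
vsum = sum((x + y) ** 2 * p for (x, y), p in pmf.items()) - (ex + ey) ** 2

assert abs(exy - ex * ey) < 1e-12   # E[XY] = E[X]E[Y], so Cov[X,Y] = 0
assert abs(vsum - (vx + vy)) < 1e-12  # Var[X + Y] = Var[X] + Var[Y]
```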