ENGG 2040C: Probability Models and Applications. Andrej Bogdanov, Spring 2014. 6. Jointly Distributed Random Variables.


Cards. There is a box with 4 cards, with face values 1, 2, 3, and 4. You draw two cards without replacement. What is the p.m.f. of the sum of the face values?

Cards. Probability model: S = ordered pairs of cards, equally likely outcomes (12 in all). X = face value on first card, Y = face value on second card. We want the p.m.f. of X + Y. For example, P(X + Y = 4) = P(X = 1, Y = 3) + P(X = 2, Y = 2) + P(X = 3, Y = 1) = 1/12 + 0 + 1/12 = 1/6. (The middle term is 0 because, drawing without replacement, both cards cannot show 2.)

Joint distribution function. In general, P(X + Y = z) = ∑_{(x, y): x + y = z} P(X = x, Y = y), so to calculate P(X + Y = z) we need to know f(x, y) = P(X = x, Y = y) for every pair of values x, y. This is the joint p.m.f. of X and Y.

Cards. The joint p.m.f. of X and Y (rows x, columns y; drawing without replacement makes X = Y impossible):

         y = 1   y = 2   y = 3   y = 4
x = 1      0     1/12    1/12    1/12
x = 2    1/12      0     1/12    1/12
x = 3    1/12    1/12      0     1/12
x = 4    1/12    1/12    1/12      0

The p.m.f. of X + Y:

z              2     3     4     5     6     7     8
P(X + Y = z)   0    1/6   1/6   1/3   1/6   1/6    0
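
These tables can be checked by enumerating the 12 equally likely outcomes; a minimal Python sketch (the variable names are mine, not from the course materials):

    from fractions import Fraction
    from itertools import permutations

    outcomes = list(permutations([1, 2, 3, 4], 2))  # 12 ordered pairs of distinct cards
    p = Fraction(1, len(outcomes))                  # each outcome has probability 1/12

    joint = {}     # joint p.m.f. of (X, Y)
    sum_pmf = {}   # p.m.f. of X + Y
    for x, y in outcomes:
        joint[(x, y)] = joint.get((x, y), 0) + p
        sum_pmf[x + y] = sum_pmf.get(x + y, 0) + p

    print(sum_pmf)  # sums 3..7 with probabilities 1/6, 1/6, 1/3, 1/6, 1/6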

Question for you. There is a box with 4 cards, with face values 1, 2, 3, and 4. You draw two cards without replacement. What is the p.m.f. of the larger face value? What if you draw the cards with replacement?

Marginal probabilities. P(X = x) = ∑_y P(X = x, Y = y) and P(Y = y) = ∑_x P(X = x, Y = y). In the cards example, summing each row or column of the joint p.m.f. gives P(X = x) = 1/4 and P(Y = y) = 1/4 for every face value, and each marginal sums to 1.

Red and blue balls. You have 3 red balls and 2 blue balls. Draw 2 balls at random, and let X be the number of blue balls drawn. Replace the 2 balls and draw one ball; let Y be the number of blue balls drawn this time. The marginals are P(X = 0, 1, 2) = 3/10, 6/10, 1/10 and P(Y = 0, 1) = 3/5, 2/5. Because the balls are replaced, X and Y are independent, so the joint p.m.f. is the product of the marginals:

            x = 0    x = 1    x = 2   | P(Y = y)
y = 0       9/50    18/50     3/50    |   3/5
y = 1       6/50    12/50     2/50    |   2/5
P(X = x)    3/10     6/10     1/10    |    1

Independent random variables. Let X and Y be discrete random variables. X and Y are independent if P(X = x, Y = y) = P(X = x) P(Y = y) for all possible values of x and y.

Example. Alice tosses 3 coins and so does Bob. What is the probability they get the same number of heads? Probability model: let A and B be Alice's and Bob's numbers of heads. Each of A and B is Binomial(3, ½), A and B are independent, and we want to know P(A = B).

Example, Solution 1. The joint p.m.f. of A and B is the product of the two Binomial(3, ½) marginals (1/8, 3/8, 3/8, 1/8):

         b = 0   b = 1   b = 2   b = 3
a = 0    1/64    3/64    3/64    1/64
a = 1    3/64    9/64    9/64    3/64
a = 2    3/64    9/64    9/64    3/64
a = 3    1/64    3/64    3/64    1/64

P(A = B) is the sum of the diagonal entries: (1 + 9 + 9 + 1)/64 = 20/64 = 31.25%.

Example, Solution 2. P(A = B) = ∑_h P(A = h, B = h) = ∑_h P(A = h) P(B = h) = ∑_h (C(3, h) ⋅ 1/8)(C(3, h) ⋅ 1/8) = (1/64)(C(3, 0)² + C(3, 1)² + C(3, 2)² + C(3, 3)²) = 20/64 = 31.25%.
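
Either solution is quick to verify numerically; a sketch using the binomial p.m.f. (the helper binom_pmf is mine):

    from math import comb

    def binom_pmf(k, n=3, p=0.5):
        # P(k heads in n fair coin tosses)
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # By independence, P(A = B) = sum over h of P(A = h) * P(B = h)
    print(sum(binom_pmf(h)**2 for h in range(4)))  # 0.3125, i.e. 20/64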

Independent Poisson. Let X be Poisson(μ) and Y be Poisson(λ). If X and Y are independent, what is the p.m.f. of X + Y? Intuition: X is the number of blue raindrops in 1 sec, Y is the number of red raindrops in 1 sec, and X + Y is the total number of raindrops, so E[X + Y] = E[X] + E[Y] = μ + λ.

Independent Poisson. The p.m.f. of X + Y is

P(X + Y = z) = ∑_{(x, y): x + y = z} P(X = x, Y = y)
             = ∑_{(x, y): x + y = z} P(X = x) P(Y = y)
             = ∑_{(x, y): x + y = z} (e^(-μ) μ^x / x!)(e^(-λ) λ^y / y!)
             = e^(-(μ + λ)) ∑_{(x, y): x + y = z} μ^x λ^y / (x! y!)
             = (e^(-(μ + λ)) / z!) ∑_{x = 0}^{z} (z! / (x! (z - x)!)) μ^x λ^(z - x)
             = (e^(-(μ + λ)) / z!) (μ + λ)^z,

where the last step is the binomial theorem. The p.m.f. of a Poisson(μ + λ) random variable Z is P(Z = z) = (e^(-(μ + λ)) / z!) (μ + λ)^z, so X + Y is a Poisson(μ + λ) random variable.
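
The algebra can be sanity-checked by convolving the two p.m.f.s numerically; a sketch with the arbitrary choice μ = 2, λ = 3:

    from math import exp, factorial

    def poisson_pmf(k, lam):
        return exp(-lam) * lam**k / factorial(k)

    mu, lam = 2.0, 3.0
    for z in range(10):
        # p.m.f. of X + Y at z: sum over all splits x + y = z
        conv = sum(poisson_pmf(x, mu) * poisson_pmf(z - x, lam) for x in range(z + 1))
        assert abs(conv - poisson_pmf(z, mu + lam)) < 1e-12
    print("convolution matches Poisson(mu + lam) at z = 0..9")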

Barista jam. On average a barista sells 2 espressos at $15 each and 3 lattes at $30 each per hour. (a) What is the probability she sells fewer than five coffees in the next hour? (b) What is her expected hourly income? (c) What is the probability her income falls short of expectation in the next hour?

Barista jam. Probability model: X and Y are the numbers of espressos and lattes sold in the next hour; X is Poisson(2), Y is Poisson(3), and X, Y are independent. Solution (a): X + Y is Poisson(5), so P(X + Y < 5) = ∑_{z = 0}^{4} e^(-5) 5^z / z! ≈ 0.440.

Barista jam. (b) Hourly income (in dollars) is 15X + 30Y, and E[15X + 30Y] = 15 E[X] + 30 E[Y] = 15×2 + 30×3 = 120. (c) A tempting attempt: treat 15X + 30Y as Poisson with mean 120 and compute P(15X + 30Y < 120) = ∑_{z = 0}^{119} e^(-120) 120^z / z!. This is wrong: sums of independent Poissons are Poisson, but 15X + 30Y is not Poisson, since it only takes values that are multiples of 15.

Barista jam. (c) P(15X + 30Y < 120) = ∑_{(x, y): 15x + 30y < 120} P(X = x, Y = y) = ∑_{(x, y): 15x + 30y < 120} P(X = x) P(Y = y) = ∑_{(x, y): 15x + 30y < 120} (e^(-2) 2^x / x!)(e^(-3) 3^y / y!) ≈ 0.480, computed using the program 14L09.py.
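
The slides reference 14L09.py without reproducing it; a minimal sketch of the same computations for parts (a) and (c), assuming nothing about the original program, might look like this:

    from math import exp, factorial

    def poisson_pmf(k, lam):
        return exp(-lam) * lam**k / factorial(k)

    # (a) X + Y is Poisson(5)
    print(sum(poisson_pmf(z, 5) for z in range(5)))   # ≈ 0.440

    # (c) sum the joint p.m.f. over all (x, y) with 15x + 30y < 120
    total = sum(poisson_pmf(x, 2) * poisson_pmf(y, 3)
                for x in range(8)    # 15x < 120 forces x <= 7
                for y in range(4)    # 30y < 120 forces y <= 3
                if 15 * x + 30 * y < 120)
    print(total)                                      # ≈ 0.480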

Expectation. E[X, Y] doesn't make sense, so we look at E[g(X, Y)], for example E[X + Y] or E[min(X, Y)]. There are two ways to calculate it. Method 1: first obtain the p.m.f. f_Z of Z = g(X, Y), then calculate E[Z] = ∑_z z f_Z(z). Method 2: calculate directly using the formula E[g(X, Y)] = ∑_{x, y} g(x, y) f_XY(x, y).

Method 1: Example. From the joint p.m.f. of A and B above, the p.m.f. of min(A, B) is:

z                   0       1       2       3
P(min(A, B) = z)  15/64   33/64   15/64    1/64

E[min(A, B)] = 0 ⋅ 15/64 + 1 ⋅ 33/64 + 2 ⋅ 15/64 + 3 ⋅ 1/64 = 66/64 = 33/32.

Method 2: Example. Using the joint p.m.f. of A and B directly, E[min(A, B)] = ∑_{a, b} min(a, b) f_AB(a, b) = 1 ⋅ 9/64 + 1 ⋅ 9/64 + 1 ⋅ 3/64 + 1 ⋅ 9/64 + 2 ⋅ 9/64 + 2 ⋅ 3/64 + 1 ⋅ 3/64 + 2 ⋅ 3/64 + 3 ⋅ 1/64 = 66/64 = 33/32 (the terms with a = 0 or b = 0 vanish).
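
Both methods are easy to run by machine; an exact-arithmetic sketch (the names are mine):

    from fractions import Fraction
    from math import comb

    # Binomial(3, 1/2) p.m.f. as exact fractions: 1/8, 3/8, 3/8, 1/8
    pmf = {k: Fraction(comb(3, k), 8) for k in range(4)}

    # Method 1: build the p.m.f. of Z = min(A, B), then take E[Z]
    fz = {}
    for a, pa in pmf.items():
        for b, pb in pmf.items():
            fz[min(a, b)] = fz.get(min(a, b), 0) + pa * pb
    method1 = sum(z * p for z, p in fz.items())

    # Method 2: average min(a, b) directly against the joint p.m.f.
    method2 = sum(min(a, b) * pa * pb
                  for a, pa in pmf.items() for b, pb in pmf.items())

    print(method1, method2)  # 33/32 both ways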

X, Y discrete with joint p.m.f. f_XY(x, y) = P(X = x, Y = y).
Probability of an event A (determined by X, Y): P(A) = ∑_{(x, y) in A} f_XY(x, y).
Marginal p.m.f.s: f_X(x) = ∑_y f_XY(x, y).
Independence: f_XY(x, y) = f_X(x) f_Y(y) for all x, y.
Derived random variables Z = g(X, Y): f_Z(z) = ∑_{(x, y): g(x, y) = z} f_XY(x, y).
Expectation of Z = g(X, Y): E[Z] = ∑_{x, y} g(x, y) f_XY(x, y).

Continuous random variables. A pair of continuous random variables X, Y can be specified either by their joint c.d.f. F_XY(x, y) = P(X ≤ x, Y ≤ y), or by their joint p.d.f. f_XY(x, y) = ∂²/∂x∂y F_XY(x, y) = lim_{ε, δ → 0} P(x < X ≤ x + ε, y < Y ≤ y + δ) / (εδ).

An example. Rain falls at a rate of 1 drop/sec. Let X and Y be the arrival times of the first and second raindrop. The joint c.d.f. is F(x, y) = P(X ≤ x, Y ≤ y), and the joint p.d.f. is f(x, y) = ∂²/∂x∂y F(x, y). [Figure: plot over the (x, y) plane.]

Continuous marginals. Given the joint c.d.f. F_XY(x, y) = P(X ≤ x, Y ≤ y), we can calculate the marginal c.d.f.s: F_X(x) = P(X ≤ x) = lim_{y → ∞} F_XY(x, y) and F_Y(y) = P(Y ≤ y) = lim_{x → ∞} F_XY(x, y). In the raindrop example, the marginal c.d.f. P(X ≤ x) is that of an Exponential(1) random variable.

X, Y continuous with joint p.d.f. f_XY(x, y).
Probability of an event A (determined by X, Y): P(A) = ∫∫_A f_XY(x, y) dx dy.
Marginal p.d.f.s: f_X(x) = ∫_{-∞}^{∞} f_XY(x, y) dy.
Independence: f_XY(x, y) = f_X(x) f_Y(y) for all x, y.
Derived random variables Z = g(X, Y): F_Z(z) = ∫∫_{(x, y): g(x, y) ≤ z} f_XY(x, y) dx dy.
Expectation of Z = g(X, Y): E[Z] = ∫∫ g(x, y) f_XY(x, y) dx dy.

Independent uniform random variables. Let X, Y be independent Uniform(0, 1). Then f_X(x) = 1 if 0 < x < 1 and 0 if not, f_Y(y) = 1 if 0 < y < 1 and 0 if not, so f_XY(x, y) = f_X(x) f_Y(y) = 1 if 0 < x, y < 1 and 0 if not.

Meeting time. Alice and Bob arrive in Shatin between 12 and 1pm. How likely are they to arrive within 15 minutes of one another? Probability model: the arrival times X, Y are independent Uniform(0, 1). The event is A: |X – Y| ≤ ¼, and P(A) = ∫∫_A f_XY(x, y) dx dy = ∫∫_A 1 dx dy = area(A) in [0, 1]².

Meeting time. The event A: |X – Y| ≤ ¼ is the band between the lines y = x + ¼ and y = x – ¼ inside the unit square; its complement consists of two right triangles with legs 3/4. So P(A) = area(A) = 1 – (3/4)² = 7/16.
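
The area calculation can be checked by simulation; a Monte Carlo sketch (the sample size is an arbitrary choice):

    import random

    trials = 10**6
    hits = sum(abs(random.random() - random.random()) <= 0.25 for _ in range(trials))
    print(hits / trials)  # ≈ 0.4375 = 7/16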

Buffon’s needle A needle of length l is randomly dropped on a ruled sheet. What is the probability that the needle hits one of the lines?

Buffon's needle. Probability model: the lines are 1 unit apart; X is the distance from the needle's midpoint to the nearest line, and Θ is the needle's angle with the horizontal. X is Uniform(0, ½), Θ is Uniform(0, π), and X, Θ are independent.

Buffon's needle. The joint p.d.f. is f_XΘ(x, θ) = f_X(x) f_Θ(θ) = 2/π for 0 < x < ½ and 0 < θ < π. The event H = "needle hits a line" happens when X < (l/2) sin Θ. [Figure: the region H under the curve x = (l/2) sin θ inside the rectangle 0 ≤ θ ≤ π, 0 ≤ x ≤ ½.]

Buffon's needle. P(H) = ∫∫_H f_XΘ(x, θ) dx dθ = ∫_0^π ∫_0^((l/2) sin θ) (2/π) dx dθ. If l ≤ 1 (short needle), then (l/2) sin θ is always ≤ ½, so P(H) = ∫_0^π (l/π) sin θ dθ = (l/π) ∫_0^π sin θ dθ = 2l/π.
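
A simulation of the short-needle case agrees with 2l/π; a sketch with l = 1, so the answer should be 2/π ≈ 0.637:

    import math
    import random

    l, trials = 1.0, 10**6   # short needle: l <= 1
    hits = 0
    for _ in range(trials):
        x = random.uniform(0, 0.5)          # midpoint distance to nearest line
        theta = random.uniform(0, math.pi)  # angle with the horizontal
        if x < (l / 2) * math.sin(theta):   # the event H
            hits += 1
    print(hits / trials, 2 * l / math.pi)   # both ≈ 0.637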

Many random variables: discrete case. Random variables X_1, X_2, …, X_k are specified by their joint p.m.f. P(X_1 = x_1, X_2 = x_2, …, X_k = x_k). We can calculate marginal p.m.f.s, e.g. P(X_1 = x_1, X_3 = x_3) = ∑_{x_2} P(X_1 = x_1, X_2 = x_2, X_3 = x_3) and P(X_3 = x_3) = ∑_{x_1, x_2} P(X_1 = x_1, X_2 = x_2, X_3 = x_3), and so on.
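
With the joint p.m.f. stored as an array, each marginal is a single sum over the unwanted axes. A NumPy sketch (the uniform joint array below is just a placeholder of mine, not from the slides):

    import numpy as np

    # Placeholder joint p.m.f. of (X1, X2, X3): uniform over 6 x 6 x 6 outcomes
    joint = np.full((6, 6, 6), 1 / 216)

    pmf_13 = joint.sum(axis=1)       # P(X1 = x1, X3 = x3): sum out x2
    pmf_3 = joint.sum(axis=(0, 1))   # P(X3 = x3): sum out x1 and x2
    print(pmf_13.shape, pmf_3)       # (6, 6), six entries of 1/6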

Independence for many random variables. Discrete X_1, X_2, …, X_k are independent if P(X_1 = x_1, X_2 = x_2, …, X_k = x_k) = P(X_1 = x_1) P(X_2 = x_2) … P(X_k = x_k) for all possible values x_1, …, x_k. For continuous random variables, we look at p.d.f.s instead of p.m.f.s.

Dice. Three dice are tossed. What is the probability that their face values are non-decreasing? Solution: let X, Y, Z be the face values of the first, second, and third die. X, Y, Z are independent with p.m.f. p(1) = … = p(6) = 1/6, and we want the probability of the event X ≤ Y ≤ Z.

Dice.

P(X ≤ Y ≤ Z) = ∑_{(x, y, z): x ≤ y ≤ z} P(X = x, Y = y, Z = z)
             = ∑_{(x, y, z): x ≤ y ≤ z} (1/6)³
             = ∑_{z = 1}^{6} ∑_{y = 1}^{z} ∑_{x = 1}^{y} (1/6)³
             = ∑_{z = 1}^{6} ∑_{y = 1}^{z} (1/6)³ y
             = ∑_{z = 1}^{6} (1/6)³ z(z + 1)/2
             = (1/6)³ (1∙2 + 2∙3 + 3∙4 + 4∙5 + 5∙6 + 6∙7)/2
             = 56/216 ≈ 0.259
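
Brute-force enumeration of all 216 outcomes confirms the count; a sketch:

    from itertools import product

    favorable = sum(1 for x, y, z in product(range(1, 7), repeat=3) if x <= y <= z)
    print(favorable, favorable / 6**3)  # 56, 56/216 ≈ 0.259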

Many-sided dice. Now you toss an "infinite-sided die" 3 times. What is the probability that the values are increasing?