Lectures prepared by: Elchanan Mossel, Yelena Shvets
Introduction to Probability, Stat 134, Fall 2005, Berkeley
Follows Jim Pitman's book: Probability, Section 6.3

Conditional Density
Example: (X,Y) uniform in the unit disk centered at 0.
[The questions, answers, and worked formulas on this slide were given as images.]

Infinitesimal Conditioning Formula
Solution: Replace conditioning on the event X = x (which has probability 0) by the infinitesimal ratio:
P(A | X ∈ dx) = P(A, X ∈ dx) / P(X ∈ dx),
where dx denotes an infinitesimal interval (x, x + dx).


Conditional Density
Claim: Suppose (X,Y) has a joint density f(x,y). For any x with fX(x) > 0, the conditional density of Y given X = x is the probability density defined by
fY(y | X = x) = f(x,y) / fX(x).
"Proof": P(Y ∈ dy | X = x) = P(Y ∈ dy | X ∈ dx) = P(X ∈ dx, Y ∈ dy) / P(X ∈ dx) = f(x,y) dx dy / (fX(x) dx) = fY(y | X = x) dy.
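As a quick numerical sanity check of this division rule, the sketch below uses a hypothetical joint density f(x,y) = x + y on the unit square (not an example from the lecture) and verifies that f(x,y)/fX(x) integrates to 1 in y, as a conditional density must:

```python
# Numerical check that the conditional density f_Y(y | X = x) = f(x, y) / f_X(x)
# integrates to 1 in y. The joint density f(x, y) = x + y on the unit square is
# a hypothetical example, not one from the lecture.

def f(x, y):
    return x + y  # joint density on [0, 1]^2

def integrate(g, a, b, n=100_000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

x = 0.3
f_X = integrate(lambda y: f(x, y), 0.0, 1.0)          # marginal density at x
total = integrate(lambda y: f(x, y) / f_X, 0.0, 1.0)  # total conditional mass
print(round(f_X, 4), round(total, 6))
```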

Conditional Density
Joint density of (X,Y): f(x,y). For a fixed y, the renormalized slice through the density is the conditional density of X given Y = y; each renormalized slice has area 1. The area of the unrenormalized slice at y equals the height of the marginal density of Y at y.
[Diagram: slices through the joint density for fixed values of y.]

Conditioning Formulae (discrete and continuous)
Multiplication rule: f(x,y) = fX(x) fY(y | X = x).
Division rule: fY(y | X = x) = f(x,y) / fX(x).
Bayes' rule: fX(x | Y = y) = fY(y | X = x) fX(x) / fY(y).

Example: Uniform on a Triangle
Suppose a point (X,Y) is chosen uniformly at random from the triangle A = {(x,y): 0 ≤ x, 0 ≤ y, x + y ≤ 2}.
Question: Find P(Y ≥ 1 | X = x).
Solution 1: Since Area(A) = 2, for every region S we have P(S) = Area(S)/2. So for 0 ≤ x < 1:
P(X ∈ Δx) = ½ Δx (2 − x − ½ Δx);
P(Y ≥ 1, X ∈ Δx) = ½ Δx (1 − x − ½ Δx);
P(Y ≥ 1 | X ∈ Δx) = P(Y ≥ 1, X ∈ Δx) / P(X ∈ Δx) = (1 − x − ½ Δx) / (2 − x − ½ Δx) → (1 − x)/(2 − x) as Δx → 0.
So P(Y ≥ 1 | X = x) = (1 − x)/(2 − x), and for x ≥ 1: P(Y ≥ 1 | X = x) = 0.
[Diagram: the triangle A bounded by X + Y = 2, with the line Y = 1 marking the event (Y ≥ 1).]
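The limiting ratio above can be checked by simulation: sample points uniformly from the triangle by rejection, condition on X falling in a thin band around a point x0, and compare with (1 − x)/(2 − x). The values x0 = 0.5 and band width dx = 0.01 are arbitrary choices for this sketch:

```python
# Monte Carlo sanity check of P(Y >= 1 | X = x) = (1 - x)/(2 - x): sample
# uniformly from the triangle by rejection, and condition on X falling in a
# thin band around x0. The values x0 = 0.5 and dx = 0.01 are arbitrary choices.
import random

random.seed(0)
x0, dx, n = 0.5, 0.01, 1_000_000
hits = band = 0
for _ in range(n):
    x, y = random.uniform(0, 2), random.uniform(0, 2)
    if x + y > 2:
        continue  # reject points outside the triangle
    if x0 <= x < x0 + dx:
        band += 1
        if y >= 1:
            hits += 1

estimate = hits / band
exact = (1 - x0) / (2 - x0)  # = 1/3 for x0 = 0.5
print(round(estimate, 2), round(exact, 2))
```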

Example: Uniform on a Triangle (continued)
Suppose a point (X,Y) is chosen uniformly at random from the triangle A = {(x,y): 0 ≤ x, 0 ≤ y, x + y ≤ 2}.
Question: Find P(Y ≥ 1 | X = x).
Solution 2: Use the conditional density fY(y | X = x). For the uniform distribution on a triangle of area 2, f(x,y) = ½ on A. So for 0 ≤ x ≤ 2, fX(x) = (2 − x)/2, and
fY(y | X = x) = f(x,y) / fX(x) = 1/(2 − x) for 0 ≤ y ≤ 2 − x.
Given X = x, the distribution of Y is uniform on (0, 2 − x), so P(Y ≥ 1 | X = x) = (1 − x)/(2 − x) for 0 ≤ x < 1, and 0 for x ≥ 1.

Example: Gamma and Uniform
Suppose X has a Gamma(2, λ) distribution, and that given X = x, Y has a Uniform(0, x) distribution.
Question: Find the joint density of X and Y.
Solution: By definition of the Gamma(2, λ) distribution, fX(x) = λ² x e^(−λx) for x > 0. By definition of Uniform(0, x), fY(y | X = x) = 1/x for 0 < y < x. By the multiplication rule,
f(x,y) = fX(x) fY(y | X = x) = λ² e^(−λx) for 0 < y < x.
Question: Find the marginal density of Y.
Solution: fY(y) = ∫ from y to ∞ of λ² e^(−λx) dx = λ e^(−λy) for y > 0. So Y ~ Exp(λ).
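A short simulation sketch of this example (λ = 2 is an arbitrary choice): since Gamma(2, λ) is the sum of two independent Exp(λ) variables, we can generate X that way, draw Y uniform on (0, X), and check that the sample mean of Y is near E(Y) = 1/λ, consistent with Y ~ Exp(λ):

```python
# Simulation sketch of the Gamma-Uniform example (lam = 2.0 is an arbitrary
# choice): Gamma(2, lam) is the sum of two independent Exp(lam) variables,
# and the marginal of Y should be Exp(lam), with mean 1/lam.
import random

random.seed(1)
lam, n = 2.0, 500_000
total = 0.0
for _ in range(n):
    x = random.expovariate(lam) + random.expovariate(lam)  # X ~ Gamma(2, lam)
    y = random.uniform(0.0, x)                             # Y | X = x ~ Uniform(0, x)
    total += y

mean_y = total / n
print(round(mean_y, 2))  # should be near 1/lam = 0.5
```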

Conditional Distributions & Expectations (discrete and continuous)
Conditional distribution of Y given X = x:
Discrete: P(Y = y | X = x) = P(X = x, Y = y) / P(X = x).
Continuous: fY(y | X = x) = f(x,y) / fX(x).
Conditional expectation of a function g(Y) given X = x:
Discrete: E(g(Y) | X = x) = Σ over y of g(y) P(Y = y | X = x).
Continuous: E(g(Y) | X = x) = ∫ g(y) fY(y | X = x) dy.

Independence
The conditional distribution of Y given X = x does not depend on x: P(Y ∈ B | X = x) = P(Y ∈ B).
The conditional distribution of X given Y = y does not depend on y: P(X ∈ A | Y = y) = P(X ∈ A).
The multiplication rule reduces to: f(x,y) = fX(x) fY(y).
The expectation of the product is the product of expectations:
E(XY) = ∫∫ xy fX(x) fY(y) dx dy = E(X) E(Y).
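The product rule E(XY) = E(X) E(Y) can be illustrated by simulation with two arbitrarily chosen independent variables (X uniform on (0,1) and Y exponential with rate 1; these are illustrative choices, not from the lecture):

```python
# Simulation check that E(XY) = E(X) E(Y) for independent X and Y. The choices
# X ~ Uniform(0, 1) and Y ~ Exp(1) are arbitrary, for illustration only.
import random

random.seed(2)
n = 1_000_000
sx = sy = sxy = 0.0
for _ in range(n):
    x = random.random()          # Uniform(0, 1), E(X) = 1/2
    y = random.expovariate(1.0)  # Exp(1), E(Y) = 1
    sx += x
    sy += y
    sxy += x * y

ex, ey, exy = sx / n, sy / n, sxy / n
print(round(exy, 2), round(ex * ey, 2))  # the two should agree, near 0.5
```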

Example: (X,Y) Uniform on the Unit Disk
Question: Are X and Y independent?
Solution: Given Y = y, X is distributed uniformly on (−√(1 − y²), √(1 − y²)). So the conditional distribution of X given Y = y depends on y, and hence X and Y cannot be independent. However, note that for all x and y,
E(X | Y = y) = E(X) = 0 and E(Y | X = x) = E(Y) = 0.
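A small simulation illustrates the dependence: the range of X given Y near a value y shrinks as |y| grows, while the conditional mean of X stays near 0. The band widths below are arbitrary choices for this sketch:

```python
# Simulation sketch for (X, Y) uniform on the unit disk: the range of X given
# Y near a value y shrinks as |y| grows (so X and Y are dependent), while the
# conditional mean of X stays near 0. Band widths here are arbitrary choices.
import random

random.seed(3)
near_0, near_09 = [], []
for _ in range(400_000):
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y > 1:
        continue  # rejection sampling: keep only points inside the unit disk
    if abs(y) < 0.05:
        near_0.append(x)
    elif abs(abs(y) - 0.9) < 0.05:
        near_09.append(x)

spread_0 = max(near_0) - min(near_0)     # close to 2, since |x| < sqrt(1 - y^2) ~ 1
spread_09 = max(near_09) - min(near_09)  # noticeably smaller: |x| < sqrt(1 - y^2)
mean_0 = sum(near_0) / len(near_0)       # close to E(X | Y ~ 0) = 0
print(round(spread_0, 2), round(spread_09, 2), round(mean_0, 2))
```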

Average Conditional Probabilities & Expectations (discrete and continuous)
Average conditional probability:
Discrete: P(A) = Σ over x of P(A | X = x) P(X = x).
Continuous: P(A) = ∫ P(A | X = x) fX(x) dx.
Average conditional expectation:
Discrete: E(Y) = Σ over x of E(Y | X = x) P(X = x).
Continuous: E(Y) = ∫ E(Y | X = x) fX(x) dx.
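The discrete averaging rule can be checked directly on a small joint p.m.f. table (the numbers below are a hypothetical example, not from the lecture):

```python
# Discrete check of the averaging rule E(Y) = sum_x E(Y | X = x) P(X = x),
# on a small hypothetical joint p.m.f. (the numbers are illustrative only).
pmf = {
    (0, 0): 0.1, (0, 1): 0.2,  # (x, y): P(X = x, Y = y)
    (1, 0): 0.3, (1, 1): 0.4,
}

# direct expectation of Y
ey_direct = sum(p * y for (x, y), p in pmf.items())

# expectation of Y via conditioning on X
ey_avg = 0.0
for x0 in {x for (x, _) in pmf}:
    px = sum(p for (x, _), p in pmf.items() if x == x0)                # P(X = x0)
    ey_cond = sum(p * y for (x, y), p in pmf.items() if x == x0) / px  # E(Y | X = x0)
    ey_avg += ey_cond * px

print(round(ey_direct, 6), round(ey_avg, 6))  # the two should agree
```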

Lectures prepared by: Elchanan Mossel, Yelena Shvets
Introduction to Probability, Stat 134, Fall 2005, Berkeley
Follows Jim Pitman's book: Probability, Sections 6.1-6.2