Conditional Probability on a joint discrete distribution


Conditional Probability on a joint discrete distribution

Given the joint pmf of X and Y, we want to find $P(X = x \mid Y = y)$ and $P(Y = y \mid X = x)$. These are the basis for defining conditional distributions.

Definition

For X, Y discrete random variables with joint pmf $p_{X,Y}(x,y)$ and marginal pmfs $p_X(x)$ and $p_Y(y)$: if $x$ is a number such that $p_X(x) > 0$, then the conditional pmf of Y given X = x is

$$p_{Y\mid X}(y \mid x) = \frac{p_{X,Y}(x,y)}{p_X(x)}.$$

Is this a valid pmf? (It is: it is non-negative, and for fixed $x$ it sums to 1 over $y$.) Similarly, the conditional pmf of X given Y = y is

$$p_{X\mid Y}(x \mid y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}.$$

From this conditional pmf we get

$$p_{X,Y}(x,y) = p_{X\mid Y}(x \mid y)\, p_Y(y).$$

Summing both sides over all possible values of $y$, we get

$$p_X(x) = \sum_{y} p_{X\mid Y}(x \mid y)\, p_Y(y).$$

This is an extremely useful application of the law of total probability. Note: if X, Y are independent random variables, then $p_{X\mid Y}(x \mid y) = p_X(x)$.
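To make the definition concrete, here is a minimal Python sketch (not from the slides) using a small hypothetical joint pmf; it computes $p_{Y\mid X}$, checks that it is a valid pmf, and verifies the total-probability identity above. Exact arithmetic with fractions avoids rounding noise.

```python
from fractions import Fraction

# Hypothetical joint pmf p_{X,Y}(x, y) on a small grid (illustration only).
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(3, 8),
}

def marginal_x(x):
    return sum(p for (xx, _), p in joint.items() if xx == x)

def marginal_y(y):
    return sum(p for (_, yy), p in joint.items() if yy == y)

def cond_y_given_x(y, x):
    # p_{Y|X}(y | x) = p_{X,Y}(x, y) / p_X(x), defined when p_X(x) > 0
    return joint[(x, y)] / marginal_x(x)

# A valid pmf: non-negative and sums to 1 over y for each fixed x.
assert sum(cond_y_given_x(y, 1) for y in (0, 1)) == 1

# Law of total probability: p_Y(y) = sum_x p_{Y|X}(y | x) p_X(x).
assert all(
    marginal_y(y) == sum(cond_y_given_x(y, x) * marginal_x(x) for x in (0, 1))
    for y in (0, 1)
)
```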

Example

Suppose we roll a fair die; whatever number comes up, we toss a coin that many times. What is the distribution of the number of heads? Let X = number of heads and Y = number on the die. We know that $p_Y(y) = \tfrac{1}{6}$ for $y = 1, \dots, 6$; we want to find $p_X(x)$. The conditional probability function of X given Y = y is

$$p_{X\mid Y}(x \mid y) = \binom{y}{x}\left(\frac{1}{2}\right)^{y} \quad \text{for } x = 0, 1, \dots, y.$$

By the law of total probability we have

$$p_X(x) = \sum_{y=1}^{6} p_{X\mid Y}(x \mid y)\, p_Y(y) = \frac{1}{6}\sum_{y=1}^{6} \binom{y}{x}\left(\frac{1}{2}\right)^{y}.$$

The possible values of $x$ are $0, 1, 2, \dots, 6$.
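A short Python sketch (my illustration of the slide's computation) evaluates this sum exactly for each $x$ and confirms the result is a valid pmf:

```python
from fractions import Fraction
from math import comb   # comb(y, x) returns 0 when x > y

# p_X(x) = sum_{y=1}^{6} P(die = y) * P(x heads in y fair-coin tosses)
def p_heads(x):
    return sum(
        Fraction(1, 6) * comb(y, x) * Fraction(1, 2) ** y
        for y in range(1, 7)
    )

pmf = {x: p_heads(x) for x in range(7)}
print(pmf[0])                  # p_X(0) = 21/128
assert sum(pmf.values()) == 1  # sanity check: a valid pmf
```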

Conditional densities

If X, Y are jointly distributed continuous random variables, the conditional density function of Y given X is defined to be

$$f_{Y\mid X}(y \mid x) = \frac{f_{X,Y}(x,y)}{f_X(x)}$$

if $f_X(x) > 0$, and 0 otherwise. If X, Y are independent, then $f_{Y\mid X}(y \mid x) = f_Y(y)$. Also,

$$f_{X,Y}(x,y) = f_{Y\mid X}(y \mid x)\, f_X(x).$$

Integrating both sides over $x$, we get

$$f_Y(y) = \int_{-\infty}^{\infty} f_{Y\mid X}(y \mid x)\, f_X(x)\, dx.$$

This is a useful application of the law of total probability for the continuous case.
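As a numerical sanity check (my own illustration with an assumed density $f(x,y) = x + y$ on the unit square, not an example from the slides), the identity above can be verified by quadrature:

```python
from scipy import integrate

# Assumed joint density, for illustration only: f(x, y) = x + y on [0, 1]^2.
def f(x, y):
    return x + y

def f_X(x):   # marginal of X: integrate y out; equals x + 1/2 here
    return integrate.quad(lambda y: f(x, y), 0, 1)[0]

def f_Y(y):   # marginal of Y: integrate x out; equals y + 1/2 here
    return integrate.quad(lambda x: f(x, y), 0, 1)[0]

def f_Y_given_X(y, x):
    return f(x, y) / f_X(x)

# Continuous law of total probability: f_Y(y) = ∫ f_{Y|X}(y|x) f_X(x) dx.
y0 = 0.3
lhs = f_Y(y0)
rhs = integrate.quad(lambda x: f_Y_given_X(y0, x) * f_X(x), 0, 1)[0]
assert abs(lhs - rhs) < 1e-9   # both equal 0.8
```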

Example

Consider a joint density $f_{X,Y}(x,y)$. Find the conditional density of X given Y and the conditional density of Y given X.
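The slide's specific density did not survive transcription, so as a stand-in here is a symbolic computation with sympy using the same assumed density $f(x,y) = x + y$ on the unit square; the pattern (divide the joint density by the appropriate marginal) is exactly what the example asks for:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# Assumed density for illustration: f(x, y) = x + y on [0, 1]^2.
f_xy = x + y

f_X = sp.integrate(f_xy, (y, 0, 1))   # marginal of X: x + 1/2
f_Y = sp.integrate(f_xy, (x, 0, 1))   # marginal of Y: y + 1/2

f_y_given_x = f_xy / f_X              # (x + y) / (x + 1/2)
f_x_given_y = f_xy / f_Y              # (x + y) / (y + 1/2)

# Each conditional density integrates to 1 over its own argument.
assert sp.integrate(f_y_given_x, (y, 0, 1)).equals(1)
assert sp.integrate(f_x_given_y, (x, 0, 1)).equals(1)
```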

Conditional Expectation

For X, Y discrete random variables, the conditional expectation of Y given X = x is

$$E(Y \mid X = x) = \sum_{y} y\, p_{Y\mid X}(y \mid x)$$

and the conditional variance of Y given X = x is

$$V(Y \mid X = x) = \sum_{y} \big(y - E(Y \mid X = x)\big)^2\, p_{Y\mid X}(y \mid x) = E(Y^2 \mid X = x) - \big(E(Y \mid X = x)\big)^2,$$

where these are defined only if the sums converge absolutely. In general,

$$E\big(g(Y) \mid X = x\big) = \sum_{y} g(y)\, p_{Y\mid X}(y \mid x).$$
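Continuing the running die-coin example (a sketch, not from the slides): given Y = y, the number of heads X is Binomial(y, 1/2), so the conditional mean and variance computed from the definitions above should come out to y/2 and y/4.

```python
from fractions import Fraction
from math import comb

# Die-coin example: given Y = y, X | Y = y ~ Binomial(y, 1/2).
def cond_pmf(x, y):   # p_{X|Y}(x | y)
    return comb(y, x) * Fraction(1, 2) ** y

def cond_mean(y):     # E(X | Y = y) = sum_x x * p_{X|Y}(x | y)
    return sum(x * cond_pmf(x, y) for x in range(y + 1))

def cond_var(y):      # V(X | Y = y), by the definitional formula
    m = cond_mean(y)
    return sum((x - m) ** 2 * cond_pmf(x, y) for x in range(y + 1))

# Binomial(y, 1/2) has mean y/2 and variance y/4.
assert cond_mean(4) == 2 and cond_var(4) == 1
```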

For X, Y continuous random variables, the conditional expectation of Y given X = x is

$$E(Y \mid X = x) = \int_{-\infty}^{\infty} y\, f_{Y\mid X}(y \mid x)\, dy$$

and the conditional variance of Y given X = x is

$$V(Y \mid X = x) = \int_{-\infty}^{\infty} \big(y - E(Y \mid X = x)\big)^2\, f_{Y\mid X}(y \mid x)\, dy.$$

In general,

$$E\big(g(Y) \mid X = x\big) = \int_{-\infty}^{\infty} g(y)\, f_{Y\mid X}(y \mid x)\, dy.$$
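For the continuous case, a short sympy continuation of the assumed density from above (illustration only) computes E(Y | X = x) and V(Y | X = x) directly from these definitions:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# Conditional density from the assumed f(x, y) = x + y on the unit square.
f_y_given_x = (x + y) / (x + sp.Rational(1, 2))

cond_mean = sp.integrate(y * f_y_given_x, (y, 0, 1))        # E(Y | X = x)
cond_var = sp.simplify(
    sp.integrate((y - cond_mean) ** 2 * f_y_given_x, (y, 0, 1))
)                                                           # V(Y | X = x)

print(sp.simplify(cond_mean))   # equals (3*x + 2)/(6*x + 3)
```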

Example

Suppose X, Y are continuous random variables with joint density function $f_{X,Y}(x,y)$. Find $E(X \mid Y = 2)$.

More on Conditional Expectation

Assume that E(Y | X = x) exists for every x in the range of X. Then E(Y | X) is a random variable: the function of X that takes the value E(Y | X = x) when X = x. The expectation of this random variable is E[E(Y | X)].

Theorem (Law of Total Expectation). E[E(Y | X)] = E(Y).

Proof (discrete case):

$$E[E(Y \mid X)] = \sum_{x} E(Y \mid X = x)\, p_X(x) = \sum_{x} \sum_{y} y\, p_{Y\mid X}(y \mid x)\, p_X(x) = \sum_{y} y \sum_{x} p_{X,Y}(x,y) = \sum_{y} y\, p_Y(y) = E(Y).$$

Example

Suppose we roll a fair die; whatever number comes up, we toss a coin that many times. What is the expected number of heads? With X = number of heads and Y = number on the die as before, $E(X \mid Y = y) = y/2$, so by the law of total expectation

$$E(X) = E[E(X \mid Y)] = E(Y/2) = \frac{1}{2}\cdot\frac{7}{2} = \frac{7}{4}.$$
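A quick check of this computation (my sketch, not from the slides): exactly via the law of total expectation, and approximately by simulation.

```python
import random
from fractions import Fraction

# Exact: E(X) = E[E(X | Y)] = sum_y (y/2) * (1/6) = 7/4.
exact = sum(Fraction(y, 2) * Fraction(1, 6) for y in range(1, 7))
assert exact == Fraction(7, 4)

# Monte Carlo check of the same quantity.
random.seed(0)
n = 200_000
total = 0
for _ in range(n):
    y = random.randint(1, 6)                                # roll the die
    total += sum(random.random() < 0.5 for _ in range(y))   # toss y coins
print(total / n)   # ≈ 1.75
```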

Theorem (Law of Total Variance)

For random variables X, Y,

$$V(Y) = V[E(Y \mid X)] + E[V(Y \mid X)].$$

Proof:

$$\begin{aligned}
V(Y) &= E(Y^2) - [E(Y)]^2 \\
&= E\big[E(Y^2 \mid X)\big] - \big(E[E(Y \mid X)]\big)^2 \\
&= E\big[V(Y \mid X) + (E(Y \mid X))^2\big] - \big(E[E(Y \mid X)]\big)^2 \\
&= E[V(Y \mid X)] + \Big(E\big[(E(Y \mid X))^2\big] - \big(E[E(Y \mid X)]\big)^2\Big) \\
&= E[V(Y \mid X)] + V[E(Y \mid X)].
\end{aligned}$$

Example

Let X ~ Geometric(p). Given X = x, let Y conditionally have the Binomial(x, p) distribution. Scenario: we perform Bernoulli trials with success probability p until the first success, so X is the number of trials. We then perform x more trials and count the number of successes, which is Y. Find E(Y) and V(Y).
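The slide leaves the solution open; here is one derivation using the two theorems above (my working, taking X to count trials up to and including the first success, so $E(X) = 1/p$ and $V(X) = (1-p)/p^2$):

$$\begin{aligned}
E(Y) &= E[E(Y \mid X)] = E[pX] = p \cdot \frac{1}{p} = 1, \\
V(Y) &= E[V(Y \mid X)] + V[E(Y \mid X)] = E[Xp(1-p)] + V(pX) \\
     &= p(1-p)\cdot\frac{1}{p} + p^{2}\cdot\frac{1-p}{p^{2}} = (1-p) + (1-p) = 2(1-p).
\end{aligned}$$

A short Monte Carlo check of both values:

```python
import random

random.seed(1)
p, n = 0.3, 200_000
ys = []
for _ in range(n):
    x = 1
    while random.random() >= p:   # Geometric(p): trials until first success
        x += 1
    ys.append(sum(random.random() < p for _ in range(x)))   # Binomial(x, p)

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(mean, var)   # ≈ 1.0 and ≈ 2 * (1 - p) = 1.4
```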