Some Rules for Expectation


The linearity property: E[a1 X1 + a2 X2 + ... + an Xn] = a1 E[X1] + a2 E[X2] + ... + an E[Xn]. Thus you can calculate E[Xi] either from the joint distribution of X1, ..., Xn or from the marginal distribution of Xi.
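A quick numerical check of linearity, using a small discrete joint pmf (the probabilities are arbitrary illustrative values; note X1 and X2 are deliberately dependent here, since linearity does not require independence):

```python
# Joint pmf of (X1, X2); values chosen arbitrarily for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# E[X1 + X2] computed directly from the joint distribution.
e_sum = sum((x1 + x2) * p for (x1, x2), p in joint.items())

# E[X1] and E[X2] computed from the marginal distributions.
e_x1 = sum(x1 * p for (x1, x2), p in joint.items())
e_x2 = sum(x2 * p for (x1, x2), p in joint.items())

# Linearity: the two computations agree (up to floating-point rounding).
print(e_sum, e_x1 + e_x2)
```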

The multiplicative property: Suppose X1, ..., Xq are independent of Xq+1, ..., Xk. Then E[g(X1, ..., Xq) h(Xq+1, ..., Xk)] = E[g(X1, ..., Xq)] E[h(Xq+1, ..., Xk)]. In the simple case when k = 2: if X and Y are independent, then E[XY] = E[X] E[Y].
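A check of the k = 2 case, E[XY] = E[X] E[Y], for an independent discrete pair (the two marginal pmfs below are arbitrary illustrative choices; independence lets the joint pmf factorise as px[x] * py[y]):

```python
# Arbitrary marginal pmfs for independent X and Y.
px = {1: 0.5, 2: 0.5}
py = {0: 0.25, 3: 0.75}

# Because X and Y are independent, the joint pmf is the product of marginals.
e_xy = sum(x * y * px[x] * py[y] for x in px for y in py)
e_x = sum(x * p for x, p in px.items())
e_y = sum(y * p for y, p in py.items())

print(e_xy, e_x * e_y)  # the two agree for independent X, Y
```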

Some Rules for Variance

If X and Y are independent, then Cov(X, Y) = 0, and hence Var(X + Y) = Var(X) + Var(Y).
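A numerical check that Var(X + Y) = Var(X) + Var(Y) for independent X and Y (the two pmfs are arbitrary illustrative values):

```python
# Arbitrary marginal pmfs for independent X and Y.
px = {0: 0.5, 4: 0.5}   # Var(X) = 4
py = {1: 0.5, 3: 0.5}   # Var(Y) = 1

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Build the pmf of X + Y from the product joint pmf (independence).
sum_pmf = {}
for x, p in px.items():
    for y, q in py.items():
        sum_pmf[x + y] = sum_pmf.get(x + y, 0.0) + p * q

var_sum = var(sum_pmf)          # Var(X + Y) computed directly
var_parts = var(px) + var(py)   # Var(X) + Var(Y)
print(var_sum, var_parts)
```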

Multiplicative rule for independent random variables. Suppose that X and Y are independent random variables. Then
Var(XY) = Var(X) Var(Y) + μY^2 Var(X) + μX^2 Var(Y), where μX = E[X] and μY = E[Y].

Proof: Since X and Y are independent, E[XY] = E[X] E[Y] = μX μY. Also E[(XY)^2] = E[X^2 Y^2] = E[X^2] E[Y^2].

Now E[X^2] = Var(X) + μX^2 and E[Y^2] = Var(Y) + μY^2, so
Var(XY) = E[X^2] E[Y^2] - (E[X] E[Y])^2 = (Var(X) + μX^2)(Var(Y) + μY^2) - μX^2 μY^2 = Var(X) Var(Y) + μY^2 Var(X) + μX^2 Var(Y).
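The identity Var(XY) = Var(X) Var(Y) + μY^2 Var(X) + μX^2 Var(Y) for independent X, Y can be checked numerically on a small discrete example (the pmfs below are arbitrary illustrative values):

```python
# Arbitrary marginal pmfs for independent X and Y.
px = {0: 0.5, 2: 0.5}
py = {1: 0.5, 3: 0.5}

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

mx, my = mean(px), mean(py)
vx, vy = var(px), var(py)

# Var(XY) computed directly from the (product) joint pmf.
e_xy = sum(x * y * px[x] * py[y] for x in px for y in py)
e_xy2 = sum((x * y) ** 2 * px[x] * py[y] for x in px for y in py)
var_xy = e_xy2 - e_xy ** 2

# The right-hand side of the multiplicative rule.
formula = vx * vy + my ** 2 * vx + mx ** 2 * vy
print(var_xy, formula)
```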

Conditional Expectation:

Definition: Let X1, X2, ..., Xq, Xq+1, ..., Xk denote k continuous random variables with joint probability density function f(x1, x2, ..., xk). Then the conditional joint probability density function of X1, X2, ..., Xq given Xq+1 = xq+1, ..., Xk = xk is
f(x1, ..., xq | xq+1, ..., xk) = f(x1, ..., xk) / f(xq+1, ..., xk),
where f(xq+1, ..., xk) is the joint marginal density of Xq+1, ..., Xk.

Definition: Let U = h(X1, X2, ..., Xq, Xq+1, ..., Xk). Then the conditional expectation of U given Xq+1 = xq+1, ..., Xk = xk is
E[U | xq+1, ..., xk] = ∫ ... ∫ h(x1, ..., xk) f(x1, ..., xq | xq+1, ..., xk) dx1 ... dxq.
Note that this will be a function of xq+1, ..., xk.
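The discrete analogue of this definition (sums in place of integrals) can be sketched directly; the joint pmf below is an arbitrary illustrative choice:

```python
# Arbitrary joint pmf of (X1, X2) for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def cond_expect(h, x2):
    """E[h(X1, X2) | X2 = x2] = sum over x1 of h(x1, x2) p(x1 | x2)."""
    marginal = sum(p for (a, b), p in joint.items() if b == x2)
    return sum(h(a, b) * p / marginal for (a, b), p in joint.items() if b == x2)

# As the definition notes, the result is a function of the conditioning value.
e = cond_expect(lambda x1, x2: x1 + x2, 1)
print(e)
```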

Example: Let X, Y, Z denote three jointly distributed random variables with joint density function f(x, y, z). Determine the conditional expectation of U = X^2 + Y + Z given X = x, Y = y.

First find the marginal density of X and Y, f12(x, y), by integrating f(x, y, z) over z. The conditional distribution of Z given X = x, Y = y is then f(z | x, y) = f(x, y, z) / f12(x, y).

The conditional expectation of U = X^2 + Y + Z given X = x, Y = y is E[U | x, y] = x^2 + y + E[Z | x, y], where E[Z | x, y] = ∫ z f(z | x, y) dz.

Thus we obtain the conditional expectation of U = X^2 + Y + Z given X = x, Y = y.

A very useful rule: Let (X1, X2, ..., Xq, Y1, Y2, ..., Ym) = (X, Y) denote q + m random variables. Then
E[h(X, Y)] = E_Y[ E[h(X, Y) | Y] ],
where the inner expectation is over the conditional distribution of X given Y, and the outer expectation is over the marginal distribution of Y.

Proof (in the simple case of two variables X and Y):
E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dx dy, and f(x, y) = f(x | y) fY(y),

hence
E[h(X, Y)] = ∫ [ ∫ h(x, y) f(x | y) dx ] fY(y) dy.

Now the inner integral is E[h(X, Y) | Y = y], so
E[h(X, Y)] = ∫ E[h(X, Y) | Y = y] fY(y) dy = E_Y[ E[h(X, Y) | Y] ].
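This "very useful rule" (the tower rule) can be verified numerically on a discrete joint distribution, with sums playing the role of the integrals (the pmf and the function h below are arbitrary illustrative choices):

```python
# Arbitrary joint pmf of (X, Y) and an arbitrary function h for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
h = lambda x, y: x * y + x

# Direct computation of E[h(X, Y)] from the joint pmf.
direct = sum(h(x, y) * p for (x, y), p in joint.items())

# Tower rule: E[h(X, Y)] = sum over y of E[h(X, Y) | Y = y] * P(Y = y).
ys = {y for (_, y) in joint}
tower = 0.0
for y0 in ys:
    p_y = sum(p for (x, y), p in joint.items() if y == y0)
    cond = sum(h(x, y) * p / p_y for (x, y), p in joint.items() if y == y0)
    tower += cond * p_y

print(direct, tower)  # the two computations agree
```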

Example: Suppose that a rectangle is constructed by first choosing its length X and then choosing its width Y. Its length X is selected from an exponential distribution with mean μ = 1/λ = 5. Once the length has been chosen, its width Y is selected from a uniform distribution from 0 to half its length. Find the mean and variance of the area of the rectangle, A = XY.

Solution: We could compute the mean and variance of A = XY from the joint density f(x, y) = fX(x) f(y | x) = (1/5) e^(-x/5) (2/x) for x > 0, 0 < y < x/2.

Another approach: use E[A] = E[XY] = E_X[ E[XY | X] ] = E_X[ X E[Y | X] ]. Now E[Y | X] = X/4 and E[Y^2 | X] = X^2/12. This is because, given X, Y has a uniform distribution from 0 to X/2.

Thus E[A] = E_X[ X E[Y | X] ] = E[X^2/4] = E[X^2]/4, where E[X^2] = Var(X) + (E[X])^2 = 25 + 25 = 50. Note that for the exponential distribution Var(X) = 1/λ^2 = 25. Thus E[A] = 50/4 = 12.5, the same answer as previously calculated!

Now Var(A) = E[A^2] - (E[A])^2, and E[A^2] = E[X^2 Y^2] = E_X[ X^2 E[Y^2 | X] ] = E[X^4/12] = E[X^4]/12. Also, for the exponential distribution, E[X^n] = n!/λ^n, so E[X^4] = 4! · 5^4 = 15000 and E[A^2] = 15000/12 = 1250.

Thus Var(A) = 1250 - (12.5)^2 = 1250 - 156.25 = 1093.75, the same answer as previously calculated!
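The rectangle example lends itself to a Monte Carlo sanity check: simulate length X ~ Exponential(mean 5), then width Y ~ Uniform(0, X/2) given X, and compare the sample mean and variance of the area A = XY with the analytical values. The sample size and seed are arbitrary, and the estimates only approximate 12.5 and 1093.75 (the variance estimate in particular is noisy, since it involves fourth moments of X):

```python
import random

random.seed(0)
N = 200_000
areas = []
for _ in range(N):
    x = random.expovariate(1 / 5)   # length: exponential with mean 1/λ = 5
    y = random.uniform(0, x / 2)    # width: uniform on (0, X/2), given X
    areas.append(x * y)

mean_a = sum(areas) / N
var_a = sum((a - mean_a) ** 2 for a in areas) / N

# Analytically E[A] = 12.5 and Var(A) = 1093.75.
print(mean_a, var_a)
```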