Introduction to Probability & Statistics Joint Expectations

Properties of Expectations
Recall that:
1. E[c] = c
2. E[aX + b] = aE[X] + b
3. σ²(aX + b) = a²σ²(X)
4. E[g(X)] = ∫ g(x) dF(x)
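The linearity properties above can be checked numerically. A minimal sketch in Python, assuming an illustrative Uniform(0, 10) sample and arbitrary constants a = 3, b = 2 (both chosen only for demonstration):

```python
import random

random.seed(0)

# Draw a sample from an arbitrary distribution (here: Uniform(0, 10)).
xs = [random.uniform(0, 10) for _ in range(100_000)]

a, b = 3.0, 2.0  # illustrative constants

mean_x = sum(xs) / len(xs)
mean_ax_b = sum(a * x + b for x in xs) / len(xs)

# Property 2: E[aX + b] = a E[X] + b holds exactly for sample means
# (up to floating-point rounding), regardless of the distribution.
assert abs(mean_ax_b - (a * mean_x + b)) < 1e-8
```

The identity holds exactly for sample averages, not just in the limit, which is why the tolerance can be tiny.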

Joint Expectation
Let Z = X + Y. Then:
E[Z] = E[X] + E[Y]
σ²(Z) = σ²(X) + σ²(Y) + 2 cov(X, Y)
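The variance rule, including the covariance term, can be verified against sample moments. A sketch assuming hypothetical correlated variates built as Y = X + independent noise (so cov(X, Y) > 0):

```python
import random

random.seed(1)

n = 200_000
# Build correlated X and Y: Y = X + independent Gaussian noise.
xs = [random.gauss(0, 1) for _ in range(n)]
noise = [random.gauss(0, 1) for _ in range(n)]
ys = [x + e for x, e in zip(xs, noise)]
zs = [x + y for x, y in zip(xs, ys)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((p - mu) * (q - mv) for p, q in zip(u, v)) / len(u)

# sigma^2(Z) = sigma^2(X) + sigma^2(Y) + 2 cov(X, Y)
lhs = var(zs)
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
assert abs(lhs - rhs) < 1e-6  # identity is exact for sample moments
```

Dropping the covariance term here would give the wrong answer, since X and Y were deliberately constructed to be dependent.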

Derivation
Let Z = X + Y.
E[Z] = ∬ z f_XY(x, y) dx dy = ∬ (x + y) f_XY(x, y) dx dy

Derivation
Let Z = X + Y.
E[Z] = ∬ (x + y) f_XY(x, y) dx dy
= ∬ x f_XY(x, y) dy dx + ∬ y f_XY(x, y) dx dy
= ∫ x f_X(x) dx + ∫ y f_Y(y) dy
= E[X] + E[Y]
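The marginalization step in this derivation can be mirrored exactly on a small discrete joint pmf; the table values below are made up purely for illustration, and exact fractions keep the check free of rounding error:

```python
from fractions import Fraction as F

# A hypothetical joint pmf f_XY(x, y) over x in {0, 1}, y in {0, 1, 2}.
f = {(0, 0): F(1, 6), (0, 1): F(1, 6), (0, 2): F(1, 6),
     (1, 0): F(1, 12), (1, 1): F(1, 4), (1, 2): F(1, 6)}
assert sum(f.values()) == 1  # valid pmf

# E[Z] computed directly from the joint pmf...
E_Z = sum((x + y) * p for (x, y), p in f.items())

# ...and via the marginals, mirroring the derivation above.
E_X = sum(x * p for (x, y), p in f.items())
E_Y = sum(y * p for (x, y), p in f.items())
assert E_Z == E_X + E_Y
```

Note that no independence was assumed anywhere: linearity of expectation holds for any joint distribution.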

Derivation of σ²(Z)
σ²(Z) = ∬ (z − E[Z])² f_XY(x, y) dx dy
Expanding the square, (z − E[Z])² = ((x − E[X]) + (y − E[Y]))², and integrating term by term:
σ²(Z) = σ²(X) + σ²(Y) + 2 cov(X, Y)
For independent X and Y, cov(X, Y) = 0, so σ²(Z) = σ²(X) + σ²(Y).

In General
For Z the sum of n independent variates, Z = X1 + X2 + X3 + . . . + Xn:
E[Z] = E[X1] + E[X2] + . . . + E[Xn]
σ²(Z) = σ²(X1) + σ²(X2) + . . . + σ²(Xn)
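Both sums can be checked by simulation. A sketch assuming n = 5 i.i.d. Uniform(0, 1) variates, each with mean 1/2 and variance 1/12 (the distribution and trial count are illustrative choices):

```python
import random

random.seed(2)

n, trials = 5, 200_000
# Each Z is the sum of n i.i.d. Uniform(0, 1) variates.
zs = [sum(random.random() for _ in range(n)) for _ in range(trials)]

mean_z = sum(zs) / trials
var_z = sum((z - mean_z) ** 2 for z in zs) / trials

assert abs(mean_z - n * 0.5) < 0.01  # E[Z] = sum of means = 2.5
assert abs(var_z - n / 12) < 0.01    # sigma^2(Z) = sum of variances
```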

Class Problem
Suppose n customers enter a store. The ith customer spends amount Xi. Past data indicate that Xi is a uniform random variable with mean $100 and standard deviation $5. Determine the mean and variance of the total revenue for n = 5 customers.

Class Problem
Total revenue is given by TR = X1 + X2 + X3 + X4 + X5.
Using the property E[Z] = E[X] + E[Y]:
E[TR] = E[X1] + E[X2] + E[X3] + E[X4] + E[X5] = 100 + 100 + 100 + 100 + 100 = $500

Class Problem
TR = X1 + X2 + X3 + X4 + X5
Use the property σ²(Z) = σ²(X) + σ²(Y). Assuming Xi, i = 1, 2, 3, 4, 5, are all independent:
σ²(TR) = σ²(X1) + σ²(X2) + σ²(X3) + σ²(X4) + σ²(X5) = 5² + 5² + 5² + 5² + 5² = 125

Class Problem
TR = X1 + X2 + X3 + X4 + X5, with the Xi independent, each with mean $100 and standard deviation $5:
E[TR] = $500
σ²(TR) = 125
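As a check on these answers, a quick Monte Carlo simulation. A Uniform(a, b) distribution has mean (a + b)/2 and standard deviation (b − a)/√12, so mean $100 and standard deviation $5 require a half-width of 5√3 (the seed and trial count are arbitrary):

```python
import math
import random

random.seed(3)

# Uniform(a, b) with mean 100 and standard deviation 5:
# (a + b)/2 = 100 and (b - a)/sqrt(12) = 5  =>  half-width = 5*sqrt(3).
half = 5 * math.sqrt(3)
a, b = 100 - half, 100 + half

trials = 100_000
totals = [sum(random.uniform(a, b) for _ in range(5)) for _ in range(trials)]

mean_tr = sum(totals) / trials
var_tr = sum((t - mean_tr) ** 2 for t in totals) / trials

assert abs(mean_tr - 500) < 0.5  # E[TR] = $500
assert abs(var_tr - 125) < 2.0   # sigma^2(TR) = 5 * 5^2 = 125
```

The simulated mean and variance land on $500 and 125 to within Monte Carlo noise, matching the closed-form answers above.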