Assignment 2 (Chapter 2 problems), due March 1, 2004
Exam 1: April 1, 2004, 6:30-8:30 PM
Exam 2: May 13, 2004, 6:30-8:30 PM
Makeup session: April 29, 2004, 6:30-8:30 PM
Expectation of a function of a random variable
(1) If X is a discrete random variable with pmf p(x), then for any real-valued function g,
E[g(X)] = Σx g(x) p(x), the sum taken over all x with p(x) > 0.
(2) If X is a continuous random variable with pdf f(x), then for any real-valued function g,
E[g(X)] = ∫ g(x) f(x) dx, the integral taken over all x.
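As a quick illustration (my own sketch, not from the slides), here is the discrete formula in Python for a made-up pmf on {0, 1, 2}; the names pmf and expect are hypothetical:

# Minimal sketch of E[g(X)] = Σx g(x) p(x) for an assumed pmf.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}          # made-up distribution; probabilities sum to 1

def expect(g, pmf):
    """Return E[g(X)] for a discrete random variable with the given pmf."""
    return sum(g(x) * p for x, p in pmf.items())

print(expect(lambda x: x, pmf))         # E[X]   = 0*0.2 + 1*0.5 + 2*0.3 = 1.1
print(expect(lambda x: x**2, pmf))      # E[X^2] = 0*0.2 + 1*0.5 + 4*0.3 = 1.7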
If a and b are constants, then E[aX + b] = aE[X] + b.
The expected value E[X^n] is called the nth moment of the random variable X.
The expected value E[(X - E[X])^2] is called the variance of the random variable X and is denoted by Var(X):
Var(X) = E[X^2] - (E[X])^2.
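A quick numerical check of the variance identity (my own example, reusing the made-up pmf and the hypothetical expect helper from the sketch above):

# Check that E[(X - E[X])^2] equals E[X^2] - (E[X])^2 for the assumed pmf.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def expect(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, pmf)                      # E[X] = 1.1
var_def = expect(lambda x: (x - mean) ** 2, pmf)     # definition of Var(X)
var_alt = expect(lambda x: x ** 2, pmf) - mean ** 2  # E[X^2] - (E[X])^2
print(var_def, var_alt)                              # both ≈ 0.49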
Jointly distributed random variables
Let X and Y be two random variables. The joint cumulative probability distribution of X and Y is defined as
F(x, y) = P(X ≤ x, Y ≤ y), for all x and y.
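A small simulation sketch (my own, assuming X and Y are the results of two independent fair dice) that estimates one value of the joint cdf:

import random

# Estimate F(3, 2) = P(X <= 3, Y <= 2) for two fair dice X and Y.
random.seed(0)
n = 100_000
hits = sum(1 for _ in range(n)
           if random.randint(1, 6) <= 3 and random.randint(1, 6) <= 2)
print(hits / n)    # should be close to (3/6) * (2/6) = 1/6 ≈ 0.167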
If X and Y are both discrete random variables, the joint pmf of X and Y is defined as
p(x, y) = P(X = x, Y = y).
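A tiny made-up joint pmf in Python (hypothetical numbers, reused in the later sketches):

# Joint pmf p(x, y) = P(X = x, Y = y) stored as a dictionary keyed by (x, y).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}
print(sum(joint.values()))   # a valid joint pmf sums to 1
print(joint[(1, 0)])         # P(X = 1, Y = 0) = 0.3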
Continuous random variables X and Y are said to be jointly continuous if there exists a function f(x, y) such that, for every set C of pairs of real numbers,
P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy.
The function f(x, y) is called the joint pdf of X and Y.
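A numerical sketch of the continuous case, assuming the joint pdf f(x, y) = e^(-x-y) for x, y > 0 (my example, not from the slides); it uses scipy.integrate.dblquad and truncates the infinite range at 50, where the tail is negligible:

from math import exp
from scipy.integrate import dblquad

# Assumed joint pdf: f(x, y) = exp(-x - y) on x > 0, y > 0, and 0 elsewhere.
f = lambda y, x: exp(-x - y)            # dblquad expects func(y, x)

total, _ = dblquad(f, 0, 50, lambda x: 0, lambda x: 50)
print(total)                            # ≈ 1: a valid joint pdf integrates to 1

p, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(p)                                # P(X <= 1, Y <= 1) = (1 - e^-1)^2 ≈ 0.3996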
If X and Y are random variables and g is a function of two variables, then
E[g(X, Y)] = Σy Σx g(x, y) p(x, y) in the discrete case, and
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy in the jointly continuous case.
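The discrete formula applied to the made-up joint pmf from above (expect2 is a hypothetical helper name):

# E[g(X, Y)] = Σy Σx g(x, y) p(x, y) for the assumed joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def expect2(g, joint):
    return sum(g(x, y) * p for (x, y), p in joint.items())

print(expect2(lambda x, y: x + y, joint))   # E[X + Y] = 0.2 + 0.3 + 2*0.4 = 1.3
print(expect2(lambda x, y: x * y, joint))   # E[XY]    = 1*1*0.4 = 0.4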
Example: g(X, Y) = X + Y gives E[X + Y] = E[X] + E[Y]. The same result holds for discrete X and Y. More generally,
E[a1X1 + a2X2 + ... + anXn] = a1E[X1] + a2E[X2] + ... + anE[Xn].
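A quick check of linearity on the same made-up joint pmf (note that no independence assumption is needed):

# Verify E[X + Y] = E[X] + E[Y] for the assumed joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

e_sum = sum((x + y) * p for (x, y), p in joint.items())   # E[X + Y]
e_x   = sum(x * p for (x, y), p in joint.items())         # E[X] = 0.7
e_y   = sum(y * p for (x, y), p in joint.items())         # E[Y] = 0.6
print(e_sum, e_x + e_y)                                   # both ≈ 1.3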
Independent random variables
Two random variables X and Y are said to be independent if, for all x and y,
P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y).
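For discrete X and Y this is equivalent to requiring p(x, y) = pX(x) pY(y) for every pair (x, y), which the sketch below checks on the made-up joint pmf (the marginal pmfs pX and pY used here are the row and column sums of the joint table; they are not defined on these slides):

# Independence check for the assumed joint pmf: is p(x, y) = pX(x) * pY(y) everywhere?
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0) + p          # marginal pmf of X
    p_y[y] = p_y.get(y, 0) + p          # marginal pmf of Y

independent = all(abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12 for (x, y) in joint)
print(independent)   # False: p(0, 0) = 0.1 but pX(0) * pY(0) = 0.3 * 0.4 = 0.12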
Covariance
The covariance of any two random variables X and Y is given by
Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].
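Computing the covariance of the same made-up pair (my example), using the shortcut Cov(X, Y) = E[XY] - E[X]E[Y]:

# Covariance for the assumed joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

e_x  = sum(x * p for (x, y), p in joint.items())       # E[X]  = 0.7
e_y  = sum(y * p for (x, y), p in joint.items())       # E[Y]  = 0.6
e_xy = sum(x * y * p for (x, y), p in joint.items())   # E[XY] = 0.4
print(e_xy - e_x * e_y)                                # Cov(X, Y) ≈ -0.02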