The Binomial random variable
A random variable X that has the pmf
p(i) = P(X = i) = (n choose i) p^i (1 − p)^(n − i), i = 0, 1, …, n,
is said to be a binomial random variable with parameters n and p.
Example: A series of n independent trials, each having probability p of being a success and 1 − p of being a failure, is performed. Let X be the number of successes in the n trials; then X is a binomial random variable with parameters n and p.
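As a minimal sketch, the pmf above can be evaluated directly; the parameter values n = 10 and p = 0.3 below are illustrative choices, not from the slides.

```python
from math import comb

def binomial_pmf(i, n, p):
    """P(X = i) for a binomial random variable with parameters n and p."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# Illustrative values: n = 10 trials, success probability p = 0.3.
print(binomial_pmf(3, 10, 0.3))                            # P(X = 3)
print(sum(binomial_pmf(i, 10, 0.3) for i in range(11)))    # the pmf sums to 1
```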
The Poisson random variable
A random variable X taking values 0, 1, 2, … that has the pmf
p(i) = P(X = i) = e^(−λ) λ^i / i!, i = 0, 1, 2, …,
is said to be a Poisson random variable with parameter λ (λ > 0).
Example: The number of cars sold per day by a dealer is Poisson with parameter λ = 2. What is the probability of selling no cars today? What is the probability of selling exactly 2?
Solution: P(X = 0) = e^(−2) ≈ 0.135 and P(X = 2) = e^(−2) (2^2 / 2!) ≈ 0.271.
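A short check of the two numbers above, assuming only the Poisson pmf; the function name poisson_pmf is our own.

```python
from math import exp, factorial

def poisson_pmf(i, lam):
    """P(X = i) for a Poisson random variable with parameter lam."""
    return exp(-lam) * lam**i / factorial(i)

# Car-dealer example with lam = 2.
print(poisson_pmf(0, 2))   # ≈ 0.135
print(poisson_pmf(2, 2))   # ≈ 0.271
```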
Continuous random variables
We say that X is a continuous random variable if there exists a non-negative function f(x), defined for all real x, such that for any set B of real numbers
P(X ∈ B) = ∫_B f(x) dx.
The function f(x) is called the probability density function (pdf) of the random variable X.
Properties
f(x) ≥ 0 for all x
P(X ∈ (−∞, ∞)) = ∫_(−∞)^(∞) f(x) dx = 1
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
P(X = a) = 0
F(a) = P(X ≤ a) = ∫_(−∞)^a f(x) dx
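A rough numerical illustration of the properties above; the exponential density with rate 1.5 and the midpoint-rule helper are our own choices for this sketch, not from the slides.

```python
from math import exp

lam = 1.5                                # illustrative rate parameter
f = lambda x: lam * exp(-lam * x)        # an example pdf (exponential)

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

print(integrate(f, 0, 50))   # total probability, ≈ 1
print(integrate(f, 1, 2))    # P(1 <= X <= 2) = e^(-1.5) - e^(-3) ≈ 0.173
```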
The Uniform random variable
A random variable that has the pdf
f(x) = 1/(b − a) for a < x < b, and f(x) = 0 otherwise,
is said to be a uniform random variable over the interval (a, b).
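A minimal sketch of the uniform pdf; the interval (0, 10) is an arbitrary illustration.

```python
def uniform_pdf(x, a, b):
    """pdf of a uniform random variable over (a, b)."""
    return 1.0 / (b - a) if a < x < b else 0.0

# On (0, 10) the density is constant at 0.1 inside the interval and 0 outside,
# so P(2 <= X <= 5) = (5 - 2) / (10 - 0) = 0.3.
print(uniform_pdf(3, 0, 10))    # 0.1
print(uniform_pdf(12, 0, 10))   # 0.0
```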
The Exponential random variable
A random variable that has the pdf
f(x) = λ e^(−λx) for x ≥ 0, and f(x) = 0 for x < 0,
is said to be an exponential random variable with parameter λ (λ > 0).
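A small sketch of the exponential pdf together with its cdf F(x) = 1 − e^(−λx), a standard fact not shown on the slide; λ = 0.5 is an illustrative value.

```python
from math import exp

def exponential_pdf(x, lam):
    """pdf of an exponential random variable with parameter lam."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

def exponential_cdf(x, lam):
    """F(x) = P(X <= x) = 1 - e^(-lam * x) for x >= 0."""
    return 1 - exp(-lam * x) if x >= 0 else 0.0

# With lam = 0.5, P(X > 3) = e^(-1.5) ≈ 0.223.
print(1 - exponential_cdf(3, 0.5))
```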
The Gamma random variable
A random variable that has the pdf
f(x) = λ e^(−λx) (λx)^(α−1) / Γ(α) for x ≥ 0, and f(x) = 0 for x < 0,
is said to be a gamma random variable with parameters α and λ.
The Normal random variable
A random variable that has the pdf
f(x) = (1 / (σ√(2π))) e^(−(x − μ)^2 / (2σ^2)), −∞ < x < ∞,
is said to be a normal random variable with parameters μ and σ^2.
Note: The distribution with parameters μ = 0 and σ = 1 is called the standard normal distribution.
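A sketch of the normal pdf and of standardization (if X is normal with parameters μ and σ^2, then Z = (X − μ)/σ is standard normal); the values μ = 10, σ = 2, x = 12 are illustrative.

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """pdf of a normal random variable with parameters mu and sigma."""
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

mu, sigma, x = 10, 2, 12
z = (x - mu) / sigma                   # standardized value

print(normal_pdf(x, mu, sigma))        # density of X at x = 12
print(normal_pdf(z, 0, 1) / sigma)     # same value via the standard normal
```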
Expectation of a random variable
If X is a discrete random variable with pmf p(x), then the expected value of X is defined by
E[X] = Σ_x x p(x), summed over all x with p(x) > 0.
Example: p(1) = 0.2, p(3) = 0.3, p(5) = 0.2, p(7) = 0.3
E[X] = 0.2(1) + 0.3(3) + 0.2(5) + 0.3(7) = 0.2 + 0.9 + 1 + 2.1 = 4.2
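The same computation in a few lines, using the pmf from the example.

```python
pmf = {1: 0.2, 3: 0.3, 5: 0.2, 7: 0.3}

# E[X] = sum over x of x * p(x)
print(sum(x * p for x, p in pmf.items()))   # 4.2
```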
If X is a continuous random variable with pdf f(x), then the expected value of X is defined by
E[X] = ∫_(−∞)^(∞) x f(x) dx.
Expectation of a Bernoulli random variable: E[X] = 0(1 − p) + 1(p) = p
Expectation of a geometric random variable: E[X] = 1/p
Expectation of a binomial random variable: E[X] = np
Expectation of a Poisson random variable: E[X] = λ
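A quick numerical check of the values above, computing E[X] = Σ x p(x) directly from each pmf; the parameter choices p = 0.3, n = 10, λ = 2 are illustrative.

```python
from math import comb, exp, factorial

p, n, lam = 0.3, 10, 2.0   # illustrative parameters

# Geometric (number of trials until the first success): E[X] = 1/p.
geom = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 1000))
# Binomial: E[X] = n p.
binom = sum(k * comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1))
# Poisson: E[X] = lam.
pois = sum(k * exp(-lam) * lam**k / factorial(k) for k in range(100))

print(geom, 1 / p)      # both ≈ 3.333
print(binom, n * p)     # both 3.0
print(pois, lam)        # both 2.0
```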
Expectation of a uniform random variable over (a, b): E[X] = (a + b)/2
Expectation of an exponential random variable with parameter λ: E[X] = 1/λ
Expectation of a normal random variable with parameters μ, σ^2: E[X] = μ
Expectation of a function of a random variable
(1) If X is a discrete random variable with pmf p(x), then for any real-valued function g,
E[g(X)] = Σ_x g(x) p(x).
(2) If X is a continuous random variable with pdf f(x), then for any real-valued function g,
E[g(X)] = ∫_(−∞)^(∞) g(x) f(x) dx.
Note: this works because Y = g(X) takes the value g(x) whenever X takes the value x, i.e. P(Y = g(x)) = P(X = x).
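A tiny illustration of rule (1), reusing the pmf from the earlier expectation example and taking g(x) = x^2.

```python
pmf = {1: 0.2, 3: 0.3, 5: 0.2, 7: 0.3}
g = lambda x: x ** 2

# E[g(X)] = sum over x of g(x) * p(x), with no need to find the pmf of Y = g(X).
print(sum(g(x) * p for x, p in pmf.items()))   # E[X^2] = 0.2 + 2.7 + 5 + 14.7 = 22.6
```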
If a and b are constants, then E[aX + b] = aE[X] + b.
The expected value E[X^n] is called the nth moment of the random variable X.
The expected value E[(X − E[X])^2] is called the variance of the random variable X and is denoted by Var(X).
Var(X) = E[X^2] − (E[X])^2
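Continuing with the same example pmf, a short check that the two variance expressions agree and that E[aX + b] = aE[X] + b; the constants a = 2 and b = 5 are arbitrary.

```python
pmf = {1: 0.2, 3: 0.3, 5: 0.2, 7: 0.3}

e_x  = sum(x * p for x, p in pmf.items())       # E[X]   = 4.2
e_x2 = sum(x**2 * p for x, p in pmf.items())    # E[X^2] = 22.6

print(e_x2 - e_x**2)                                     # Var(X) = 22.6 - 17.64 = 4.96
print(sum((x - e_x)**2 * p for x, p in pmf.items()))     # same, via E[(X - E[X])^2]

a, b = 2, 5
print(sum((a*x + b) * p for x, p in pmf.items()), a*e_x + b)   # both 13.4
```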
Jointly distributed random variables
Let X and Y be two random variables. The joint cumulative probability distribution of X and Y is defined as
F(a, b) = P(X ≤ a, Y ≤ b), −∞ < a, b < ∞.
If X and Y are both discrete random variables, the joint pmf of X and Y is defined as
p(x, y) = P(X = x, Y = y).
If X and Y are continuous random variables, X and Y are said to be jointly continuous if there exists a function f(x, y) such that, for any sets A and B of real numbers,
P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy.
The function f(x, y) is called the joint probability density function of X and Y.
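A small discrete analogue to make the joint definitions concrete: a hypothetical joint pmf on a 2 × 2 support, its joint cdf, and the marginal pmf of X; all numbers are made up for illustration.

```python
# Hypothetical joint pmf p(x, y) = P(X = x, Y = y) on a small support.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def joint_cdf(a, b):
    """F(a, b) = P(X <= a, Y <= b)."""
    return sum(p for (x, y), p in joint_pmf.items() if x <= a and y <= b)

# Marginal pmf of X: p_X(x) = sum over y of p(x, y).
p_x = {}
for (x, y), p in joint_pmf.items():
    p_x[x] = p_x.get(x, 0.0) + p

print(joint_cdf(0, 1))   # P(X <= 0, Y <= 1) = 0.1 + 0.2 ≈ 0.3
print(p_x)               # marginal pmf of X: {0: 0.3, 1: 0.7} (up to rounding)
```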