Generating Functions
The Moments of Y We have referred to E(Y) and E(Y^2) as the first and second moments of Y, respectively. In general, E(Y^k) is the kth moment of Y. Consider the series in t where the moments of Y are incorporated into the coefficients:
m(t) = 1 + t E(Y) + (t^2/2!) E(Y^2) + (t^3/3!) E(Y^3) + …
Moment Generating Function
If the sum converges for all t in some interval |t| < b, this series is called the moment-generating function, m(t), for the random variable Y. And we may note that for each k, the coefficient of t^k is E(Y^k)/k!.
Moment Generating Function
Hence, the moment-generating function is given by
m(t) = E(e^{tY}) = Σ_y e^{ty} p(y) = Σ_y [1 + ty + (ty)^2/2! + …] p(y),
and we may rearrange the order of summation, since the sum is finite for |t| < b.
Moment Generating Function
That is,
m(t) = E(e^{tY}) = 1 + t E(Y) + (t^2/2!) E(Y^2) + … + (t^k/k!) E(Y^k) + …
is the series whose coefficients involve the moments of Y.
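The definition m(t) = E(e^{tY}) can be evaluated directly for any discrete distribution. As a minimal numeric sketch (a fair six-sided die is an assumed illustrative distribution, not from the slides):

```python
from math import exp

# MGF of a fair six-sided die (assumed illustrative distribution),
# computed directly from the definition m(t) = E(e^{tY}).
def mgf(t):
    return sum(exp(t * y) * (1 / 6) for y in range((1), 7))

# m(0) = E(1) = 1 for any MGF; a central-difference derivative at t = 0
# numerically approximates the first moment E(Y) = 3.5.
h = 1e-6
approx_mean = (mgf(h) - mgf(-h)) / (2 * h)
print(mgf(0.0), approx_mean)  # ≈ 1 and ≈ 3.5
```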
The kth moment To retrieve the kth moment from the MGF, evaluate the kth derivative at t = 0:
d^k m(t) / dt^k = E(Y^k) + t E(Y^{k+1}) + (t^2/2!) E(Y^{k+2}) + …
And so, letting t = 0: every term but the first vanishes, leaving m^(k)(0) = E(Y^k).
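This derivative recipe can be checked symbolically. A short sketch using SymPy (an assumed dependency) with the Poisson MGF as the example:

```python
import sympy as sp  # symbolic math; an assumed dependency for this sketch

t, lam = sp.symbols('t lam', positive=True)

# Poisson(lam) MGF: m(t) = exp(lam*(e^t - 1))
m = sp.exp(lam * (sp.exp(t) - 1))

# The kth derivative evaluated at t = 0 retrieves the kth moment.
EY = sp.simplify(sp.diff(m, t).subs(t, 0))      # E(Y)   = lam
EY2 = sp.simplify(sp.diff(m, t, 2).subs(t, 0))  # E(Y^2) = lam + lam^2
print(EY, EY2)
```

Note that these agree with the known Poisson results E(Y) = λ and V(Y) = E(Y^2) − [E(Y)]^2 = λ.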
Common MGFs The MGFs for some of the discrete distributions we’ve seen include:
Binomial(n, p): m(t) = (p e^t + q)^n, where q = 1 − p
Geometric(p): m(t) = p e^t / (1 − q e^t)
Poisson(λ): m(t) = e^{λ(e^t − 1)}
Geometric MGF Consider the MGF
m(t) = (1/3) e^t / (1 − (2/3) e^t).
Use derivatives to determine the first and second moments:
m'(t) = (1/3) e^t / (1 − (2/3) e^t)^2.
And so, E(Y) = m'(0) = (1/3) / (1/3)^2 = 3.
Geometric MGF Since
m'(t) = (1/3) e^t / (1 − (2/3) e^t)^2,
we have
m''(t) = (1/3) e^t (1 + (2/3) e^t) / (1 − (2/3) e^t)^3.
And so,
E(Y^2) = m''(0) = (1/3)(1 + 2/3) / (1/3)^3 = 15, and V(Y) = E(Y^2) − [E(Y)]^2 = 15 − 9 = 6.
Geometric MGF Since m(t) = (1/3) e^t / (1 − (2/3) e^t) is the MGF for a geometric random variable with p = 1/3, our prior results tell us E(Y) = 1/p = 3 and V(Y) = (1 − p)/p^2 = 6, which do agree with our current results.
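The same numbers can be recovered by summing the geometric pmf directly, without the MGF. A quick plain-Python sanity check (the truncation point 2000 is an arbitrary choice; the discarded tail is negligible):

```python
# Geometric pmf p(y) = p(1-p)^(y-1) for y = 1, 2, ..., with p = 1/3.
p = 1 / 3
ys = range(1, 2001)                      # truncated support; tail is negligible
pmf = [p * (1 - p) ** (y - 1) for y in ys]

mean = sum(y * q for y, q in zip(ys, pmf))
second = sum(y * y * q for y, q in zip(ys, pmf))
var = second - mean ** 2
print(mean, var)  # ≈ 3 and ≈ 6, matching 1/p and (1 - p)/p^2
```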
All the moments Although the mean and variance help to describe a distribution, they alone do not uniquely describe a distribution. All the moments are necessary to uniquely describe a probability distribution. That is, if two random variables have equal MGFs, (i.e., mY(t) = mZ(t) for |t| < b ), then they have the same probability distribution.
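To see that mean and variance alone are not enough, here is a small sketch of two distributions with identical first and second moments but different fourth moments (the supports and weights are an assumed illustrative choice, not from the slides):

```python
from math import sqrt

# Two distributions with the same mean and variance but different higher
# moments, and hence different MGFs and different distributions.
Y = [(-1.0, 0.5), (1.0, 0.5)]                        # uniform on {-1, +1}
Z = [(0.0, 0.5), (-sqrt(2), 0.25), (sqrt(2), 0.25)]  # three-point distribution

def moment(dist, k):
    # kth moment E(X^k) of a finite discrete distribution [(value, prob), ...]
    return sum(pr * y ** k for y, pr in dist)

# Both have mean 0 and variance 1, yet their fourth moments differ (1 vs 2),
# so mY(t) != mZ(t) and the two are not the same distribution.
print([moment(Y, k) for k in (1, 2, 4)], [moment(Z, k) for k in (1, 2, 4)])
```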
m(aY+b)? For the random variable Y with MGF m(t), consider W = aY + b. Its MGF is
m_W(t) = E(e^{tW}) = E(e^{t(aY+b)}) = e^{tb} E(e^{(at)Y}) = e^{tb} m(at).
E(aY+b) Now, based on the MGF, we could again consider E(W) = E(aY + b). Differentiating,
m_W'(t) = b e^{tb} m(at) + a e^{tb} m'(at).
And so, letting t = 0 (and using m(0) = 1):
E(W) = m_W'(0) = b + a m'(0) = a E(Y) + b, as expected.
V(aY+b) Now, based on the MGF, we can again consider V(W) = V(aY + b). Differentiating a second time and letting t = 0 gives E(W^2) = m_W''(0) = b^2 + 2ab E(Y) + a^2 E(Y^2), and so
V(W) = E(W^2) − [E(W)]^2 = a^2 [E(Y^2) − (E(Y))^2],
…and so V(W) = V(aY + b) = a^2 V(Y).
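Both identities can be verified symbolically from m_W(t) = e^{tb} m(at). A sketch using SymPy (an assumed dependency), with the geometric MGF as the example choice:

```python
import sympy as sp  # symbolic check; an assumed dependency for this sketch

t, p, a, b = sp.symbols('t p a b', positive=True)

# Example MGF: geometric, m(t) = p e^t / (1 - (1-p) e^t)
m = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))
mW = sp.exp(b * t) * m.subs(t, a * t)    # MGF of W = aY + b

EY = sp.simplify(sp.diff(m, t).subs(t, 0))
EW = sp.simplify(sp.diff(mW, t).subs(t, 0))
VY = sp.simplify(sp.diff(m, t, 2).subs(t, 0) - EY ** 2)
VW = sp.simplify(sp.diff(mW, t, 2).subs(t, 0) - EW ** 2)

print(sp.simplify(EW - (a * EY + b)))  # 0: E(aY + b) = a E(Y) + b
print(sp.simplify(VW - a ** 2 * VY))   # 0: V(aY + b) = a^2 V(Y)
```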
Tchebysheff’s Theorem
For “bell-shaped” distributions, the empirical rule gave us the approximate probabilities (68%, 95%, 99.7%) that a value falls within 1, 2, or 3 standard deviations of the mean, respectively. When the distribution is not so bell-shaped, Tchebysheff’s theorem tells us that the probability of being within k standard deviations of the mean is at least 1 − 1/k^2, for k > 0:
P(|Y − μ| < kσ) ≥ 1 − 1/k^2.
Remember, it’s just a lower bound.
A Skewed Distribution Consider a binomial experiment with n = 10 and p = 0.1.
A Skewed Distribution Verify Tchebysheff’s lower bound for k = 2: here μ = np = 1 and σ = √(np(1 − p)) = √0.9 ≈ 0.949, so the interval μ ± 2σ is (−0.897, 2.897), and
P(|Y − μ| < 2σ) = P(Y = 0) + P(Y = 1) + P(Y = 2) ≈ 0.3487 + 0.3874 + 0.1937 = 0.9298,
which is indeed at least 1 − 1/2^2 = 0.75.
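This verification is easy to carry out exactly with the binomial pmf:

```python
from math import comb, sqrt

# Exact check of Tchebysheff's bound for Binomial(n=10, p=0.1) with k = 2.
n, p, k = 10, 0.1, 2
mu = n * p                      # 1.0
sigma = sqrt(n * p * (1 - p))   # sqrt(0.9) ≈ 0.949

# P(|Y - mu| < k*sigma): sum the pmf over integers inside the interval.
lo, hi = mu - k * sigma, mu + k * sigma
prob = sum(comb(n, y) * p ** y * (1 - p) ** (n - y)
           for y in range(n + 1) if lo < y < hi)

bound = 1 - 1 / k ** 2          # Tchebysheff's lower bound: 0.75
print(prob, bound)              # ≈ 0.9298 >= 0.75, so the bound holds
```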