1
Mean & Variance of a Distribution
2
Mean & Variance of a Distribution
The mean μ and variance σ² of a random variable X and of its distribution are the theoretical counterparts of the mean x̄ and variance s² of a frequency distribution in the previous section, and they serve a similar purpose. Indeed, the mean characterizes the central location and the variance the spread (the variability) of the distribution.
3
Mean & Variance of a Distribution
The mean μ is defined by

(1a) μ = Σⱼ xⱼ f(xⱼ)   (discrete distribution)
(1b) μ = ∫ x f(x) dx, integrated from −∞ to ∞   (continuous distribution)
4
Mean & Variance of a Distribution
and the variance σ² by

(2a) σ² = Σⱼ (xⱼ − μ)² f(xⱼ)   (discrete distribution)
(2b) σ² = ∫ (x − μ)² f(x) dx, integrated from −∞ to ∞   (continuous distribution)
5
Mean & Variance of a Distribution
σ (the positive square root of σ²) is called the standard deviation of X and of its distribution. Here f is the probability function or the density, respectively, in (a) or (b). The mean μ is also denoted by E(X) and is called the expectation of X because it gives the average value of X to be expected in many trials. Quantities such as μ and σ² that measure certain properties of a distribution are called parameters. μ and σ² are the two most important ones.
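To make the definitions concrete, here is a minimal Python sketch of (1a) and (2a) for a discrete distribution; the probability function used is a made-up example, not one from the slides.

```python
# Mean and variance of a discrete distribution, following (1a) and (2a).
# The probability function below is a made-up example.
values = [0, 1, 2]
probs = [0.2, 0.5, 0.3]  # f(x_j); must sum to 1

mu = sum(x * p for x, p in zip(values, probs))               # (1a)
var = sum((x - mu) ** 2 * p for x, p in zip(values, probs))  # (2a)
sigma = var ** 0.5                                           # standard deviation

print(f"mu = {mu}, sigma^2 = {var}, sigma = {sigma:.4f}")
# approximately: mu = 1.1, sigma^2 = 0.49, sigma = 0.7000
```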
6
Mean & Variance of a Distribution
From (2) we see that σ² > 0 except for a discrete "distribution" with only one possible value, in which case σ² = 0. We assume that μ and σ² exist (are finite), as is the case for practically all distributions that are useful in applications.
7
Mean and Variance The random variable X = number of heads in a single toss of a fair coin has the possible values X = 0 and X = 1 with probabilities P(X=0) = 1/2 and P(X=1) = 1/2. From (1a) we thus obtain the mean μ = 0 · 1/2 + 1 · 1/2 = 1/2, and (2a) yields the variance σ² = (0 − 1/2)² · 1/2 + (1 − 1/2)² · 1/2 = 1/4.
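A minimal simulation sketch of this example (pure Python, assuming only the fair coin described above): the sample mean and sample variance of many simulated tosses should approach μ = 1/2 and σ² = 1/4.

```python
import random

# Simulate many tosses of a fair coin; the sample mean and variance
# should be close to the theoretical values mu = 1/2 and sigma^2 = 1/4.
n = 100_000
tosses = [random.randint(0, 1) for _ in range(n)]  # X = number of heads in one toss

sample_mean = sum(tosses) / n
sample_var = sum((x - sample_mean) ** 2 for x in tosses) / n

print(sample_mean, sample_var)  # close to 0.5 and 0.25
```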
8
Uniform Distribution. Variance Measures Spread
The distribution with the density

f(x) = 1/(b − a)   if a < x < b

and f = 0 otherwise is called the uniform distribution on the interval a < x < b.
9
Uniform Distribution. Variance Measures Spread
From (1b) we find that μ = (a + b)/2, and (2b) yields the variance

σ² = ∫ₐᵇ (x − (a + b)/2)² · 1/(b − a) dx = (b − a)²/12.
10
Uniform Distribution. Variance Measures Spread
The figures illustrate that the spread is large if and only if σ² is large.
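A minimal sketch, assuming only the uniform density defined above, that approximates (1b) and (2b) by a Riemann sum and compares the results with μ = (a + b)/2 and σ² = (b − a)²/12.

```python
# Approximate (1b) and (2b) for the uniform density on (a, b) by a Riemann sum
# and compare with the closed forms mu = (a + b)/2 and sigma^2 = (b - a)^2/12.
a, b = 2.0, 5.0        # any interval with a < b (assumed example values)
n = 100_000            # number of subintervals
dx = (b - a) / n

def f(x):
    return 1.0 / (b - a)  # uniform density on (a, b)

xs = [a + (i + 0.5) * dx for i in range(n)]       # midpoints of subintervals
mu = sum(x * f(x) * dx for x in xs)               # (1b)
var = sum((x - mu) ** 2 * f(x) * dx for x in xs)  # (2b)

print(mu, (a + b) / 2)         # both approximately 3.5
print(var, (b - a) ** 2 / 12)  # both approximately 0.75
```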
11
Uniform Distribution. Variance Measures Spread
12
Mean of a Symmetric Distribution
We can obtain the mean μ without calculation if a distribution is symmetric. Indeed, you may prove the following: if a distribution is symmetric with respect to x = c, that is, f(c − x) = f(c + x), then μ = c. For instance, the uniform distribution on a < x < b is symmetric about c = (a + b)/2, which is exactly the mean found above.
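A small numerical check of this statement; the triangular density used below is an assumed example that is symmetric about c = 3, so by the statement above its mean should be 3.

```python
# A density symmetric about c should have mean c, by the statement above.
# Triangular ("tent") density on (1, 5), symmetric about c = 3 (assumed example).
c = 3.0

def f(x):
    return max(0.0, 0.5 - 0.25 * abs(x - c))  # f(c - x) = f(c + x), total area 1

n = 200_000
lo, hi = 1.0, 5.0
dx = (hi - lo) / n
xs = [lo + (i + 0.5) * dx for i in range(n)]
mu = sum(x * f(x) * dx for x in xs)  # (1b), evaluated numerically

print(mu)  # approximately 3.0 = c
```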
13
Transformation of Mean and Variance
(a) If a random variable X has mean μ and variance σ², then the random variable

(4) X* = a₁ + a₂X   (a₂ > 0)

has the mean μ* and variance σ*², where

(5) μ* = a₁ + a₂μ   and   σ*² = a₂²σ².
14
Transformation of Mean and Variance
(b) In particular, the standardized random variable Z corresponding to X, given by

(6) Z = (X − μ)/σ

has the mean 0 and variance 1.
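A minimal sketch of (5) and (6); the discrete distribution and the constants a₁, a₂ below are assumed examples.

```python
# Check (5) for X* = a1 + a2*X and (6) for Z = (X - mu)/sigma on a
# discrete distribution (the distribution and a1, a2 are assumed examples).
values = [1, 2, 4]
probs = [0.3, 0.5, 0.2]

def mean_var(vals, ps):
    m = sum(x * p for x, p in zip(vals, ps))
    v = sum((x - m) ** 2 * p for x, p in zip(vals, ps))
    return m, v

mu, var = mean_var(values, probs)
sigma = var ** 0.5

# (4)/(5): X* = a1 + a2*X has mean a1 + a2*mu and variance a2^2 * sigma^2
a1, a2 = 3.0, 2.0
mu_star, var_star = mean_var([a1 + a2 * x for x in values], probs)
print(mu_star, a1 + a2 * mu)    # equal
print(var_star, a2 ** 2 * var)  # equal

# (6): Z = (X - mu)/sigma has mean 0 and variance 1
print(mean_var([(x - mu) / sigma for x in values], probs))  # (~0.0, ~1.0)
```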
15
Proof We prove (5) for a continuous distribution.
To a small interval I of length Δx on the x-axis there corresponds the probability f(x)Δx [approximately the area of a rectangle of base Δx and height f(x)]. Then the probability f(x)Δx must equal that for the corresponding interval on the x*-axis, that is, f*(x*)Δx*, where f* is the density of X* and Δx* is the length of the interval on the x*-axis corresponding to I.
16
Proof Hence for differentials we have f*(x*) dx* = f(x) dx.
Also x* = a₁ + a₂x by (4), so that (1b) applied to X* gives

μ* = ∫ x* f*(x*) dx* = ∫ (a₁ + a₂x) f(x) dx = a₁ ∫ f(x) dx + a₂ ∫ x f(x) dx.
17
Proof On the right, the first integral equals 1 by (10) in the previous section. The second integral is μ. This proves (5) for μ*: μ* = a₁ + a₂μ. It implies

x* − μ* = a₁ + a₂x − (a₁ + a₂μ) = a₂(x − μ).

From this and (2) applied to X*, again using f*(x*) dx* = f(x) dx, we obtain the second formula in (5),

σ*² = ∫ (x* − μ*)² f*(x*) dx* = a₂² ∫ (x − μ)² f(x) dx = a₂²σ².
18
Proof For a discrete distribution the proof of (5) is similar.
Choosing a₁ = −μ/σ and a₂ = 1/σ, we obtain (6) from (4), writing X* = Z. For these a₁, a₂, formula (5) gives μ* = 0 and σ*² = 1, as claimed in (b).
19
Expectation, Moments Recall that (1) defines the expectation (the mean) of X, the value of X to be expected on the average, written μ = E(X). More generally, if g(x) is nonconstant and continuous for all x, then g(X) is a random variable. Hence its mathematical expectation, or briefly its expectation, E(g(X)) is the value of g(X) to be expected on the average, defined by

(7) E(g(X)) = Σⱼ g(xⱼ) f(xⱼ)   or   E(g(X)) = ∫ g(x) f(x) dx.
20
Expectation, Moments In the first formula, f is the probability function of the discrete random variable X. In the second formula, f is the density of the continuous random variable X. Important special cases are the kth moment of X (where k = 1, 2, …)

(8) E(Xᵏ) = Σⱼ xⱼᵏ f(xⱼ)   or   E(Xᵏ) = ∫ xᵏ f(x) dx
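A minimal sketch of (7) and (8) for a discrete random variable; the probability function and the function g below are assumed examples.

```python
# Expectation E(g(X)) and k-th moments E(X^k) for a discrete X, as in (7) and (8).
# The probability function and the function g below are assumed examples.
values = [0, 1, 2, 3]
probs = [0.1, 0.4, 0.3, 0.2]

def expectation(g):
    return sum(g(x) * p for x, p in zip(values, probs))  # (7), discrete case

mu = expectation(lambda x: x)                  # k = 1 in (8): the mean
second_moment = expectation(lambda x: x ** 2)  # k = 2 in (8)
e_g = expectation(lambda x: 2 * x + 1)         # E(g(X)) for g(x) = 2x + 1

print(mu, second_moment, e_g)  # approximately 1.6 3.4 4.2
```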
21
Expectation, Moments and the kth central moment of X (k = 1, 2, …)

(9) E((X − μ)ᵏ) = Σⱼ (xⱼ − μ)ᵏ f(xⱼ)   or   E((X − μ)ᵏ) = ∫ (x − μ)ᵏ f(x) dx.

This includes the first moment, the mean of X: μ = E(X) [(8) with k = 1].
22
Expectation, Moments It also includes the second central moment, the variance of X: σ² = E((X − μ)²) [(9) with k = 2]. For later use you may prove

(10) E(1) = 1.
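A minimal sketch tying (7) and (9) together on an assumed discrete distribution: the second central moment reproduces the variance, and E(1) = 1 simply restates that the probabilities sum to 1.

```python
# The second central moment, (9) with k = 2, is the variance of X,
# and E(1) = 1 restates that the probabilities sum to 1.
# The distribution below is an assumed example.
values = [1, 2, 3]
probs = [0.25, 0.5, 0.25]

def expectation(g):
    return sum(g(x) * p for x, p in zip(values, probs))  # (7), discrete case

mu = expectation(lambda x: x)               # first moment, the mean
var = expectation(lambda x: (x - mu) ** 2)  # (9) with k = 2, the variance
print(mu, var)                              # 2.0 0.5
print(expectation(lambda x: 1))             # 1.0, i.e. E(1) = 1
```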