SUMS OF RANDOM VARIABLES Changfei Chen
Sums of Random Variables
Let X_1, X_2, ..., X_n be a sequence of random variables, and let S_n be their sum:
S_n = X_1 + X_2 + ... + X_n.
Mean and Variance of Sums of Random Variables
The expected value of a sum of n random variables is equal to the sum of the expected values:
E[S_n] = E[X_1 + X_2 + ... + X_n] = E[X_1] + E[X_2] + ... + E[X_n].
Note: this holds regardless of the statistical dependence of the random variables.
Mean and Variance of Sums of Random Variables
The variance of a sum of random variables:
Var(S_n) = E[(S_n − E[S_n])²] = Σ_{j=1}^{n} Var(X_j) + Σ_{j=1}^{n} Σ_{k≠j} Cov(X_j, X_k).
Mean and Variance of Sums of Random Variables
Since the covariances are not necessarily zero in general, the variance of the sum is not, in general, equal to the sum of the variances of the individual random variables. If all of the random variables are independent, the covariance terms vanish, and then
Var(S_n) = Var(X_1) + Var(X_2) + ... + Var(X_n).
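As a sanity check on the two results above, here is a minimal NumPy sketch (the correlated Gaussian pair and its parameters are arbitrary choices for illustration): the mean of the sum matches the sum of the means even though X1 and X2 are dependent, while matching the variance requires the covariance term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary correlated pair (X1, X2) for illustration.
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])
samples = rng.multivariate_normal(mean=[1.0, -2.0], cov=cov, size=200_000)
X1, X2 = samples[:, 0], samples[:, 1]
S = X1 + X2

# E[S] = E[X1] + E[X2] holds regardless of dependence.
print(S.mean(), X1.mean() + X2.mean())

# Var(S) = Var(X1) + Var(X2) + 2*Cov(X1, X2); dropping the covariance
# term would understate the variance here.
print(S.var(), X1.var() + X2.var() + 2 * np.cov(X1, X2)[0, 1])
```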
pdf of Sums of Independent R.V.
Here X_1, X_2, ..., X_n are n independent random variables. First look at the sum of two independent r.v., Z = X + Y.
The characteristic function of Z:
Φ_Z(ω) = E[e^{jωZ}] = E[e^{jω(X+Y)}] = E[e^{jωX}] E[e^{jωY}] = Φ_X(ω) Φ_Y(ω).   (1)
pdf of Sums of Independent R.V.
The cdf of Z:
F_Z(z) = P[X + Y ≤ z] = ∫_{−∞}^{∞} f_X(x) [ ∫_{−∞}^{z−x} f_Y(y) dy ] dx.
Then the pdf of Z, obtained by differentiating the cdf with respect to z:
f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx,
i.e., the pdf of Z is the convolution of the pdfs of X and Y.
(Note: differentiating under the integral sign uses Leibniz's rule from calculus.)
pdf of Sums of Independent R.V.
Φ_Z(ω) can be viewed as the Fourier transform of the pdf of Z, so:
f_Z(z) = (1/2π) ∫_{−∞}^{∞} Φ_Z(ω) e^{−jωz} dω = (1/2π) ∫_{−∞}^{∞} Φ_X(ω) Φ_Y(ω) e^{−jωz} dω   by equation (1).
The Fourier transform of a convolution of two functions is equal to the product of the individual Fourier transforms.
pdf of Sums of Independent R.V.
Now consider the sum of more random variables. Let
S_n = X_1 + X_2 + ... + X_n, so that Φ_{S_n}(ω) = Φ_{X_1}(ω) Φ_{X_2}(ω) ... Φ_{X_n}(ω).
Thus the pdf of the sum of independent r.v. can be found by taking the inverse Fourier transform of the product of the individual characteristic functions.
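As an illustration of the convolution result, the sketch below (assuming NumPy; the Uniform(0, 1) choice for X and Y is arbitrary) convolves the two pdfs numerically and checks the result against a histogram of simulated sums; the exact answer is the triangular pdf on [0, 2].

```python
import numpy as np

rng = np.random.default_rng(1)

# X, Y independent Uniform(0, 1); Z = X + Y has a triangular pdf on [0, 2].
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f_X = np.ones_like(x)          # pdf of Uniform(0, 1) on its support
f_Y = np.ones_like(x)

# Numerical convolution approximates f_Z(z) = integral of f_X(x) f_Y(z - x) dx.
f_Z = np.convolve(f_X, f_Y) * dx
z = np.arange(len(f_Z)) * dx

# Monte Carlo cross-check: histogram of simulated X + Y.
samples = rng.uniform(size=(1_000_000, 2)).sum(axis=1)
hist, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(np.interp(centers, z, f_Z) - hist)))  # small discrepancy
```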
The Sample Mean
Let X be a random variable for which the mean, E[X] = μ, is unknown. Let X_1, X_2, ..., X_n denote n independent, repeated measurements of X; i.e., the X_j are independent, identically distributed (iid) r.v. (each has the same probability distribution as the others and all are mutually independent) with the same pdf as X. Then the sample mean, M_n, of the sequence is used to estimate E[X]:
M_n = (1/n) Σ_{j=1}^{n} X_j.
The Sample Mean
The expected value of the sample mean:
E[M_n] = (1/n) Σ_{j=1}^{n} E[X_j] = (1/n) · nμ = μ   (where E[X_j] = μ for every j).
So the mean of the sample mean is equal to μ = E[X], and we say that the sample mean is an unbiased estimator for μ.
The Sample Mean
Since E[M_n] = μ, the mean square error of the sample mean about μ is equal to the variance of the sample mean:
E[(M_n − μ)²] = E[(M_n − E[M_n])²] = Var(M_n).
The Sample Mean
Let S_n = X_1 + X_2 + ... + X_n, so that M_n = S_n / n. Then
Var(M_n) = (1/n²) Var(S_n) = (1/n²) · nσ² = σ²/n,
where σ² is the variance of each X_j.
The Sample Mean
So as n, the number of samples, increases, the variance of the sample mean approaches zero; this means that the probability that the sample mean is close to the true mean approaches one as n becomes very large.
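The σ²/n behaviour is easy to see numerically; a minimal sketch (assuming NumPy, with exponential measurements chosen arbitrarily so that σ² = 1):

```python
import numpy as np

rng = np.random.default_rng(2)

# X ~ Exponential(1): mean 1, variance 1 (arbitrary choice for illustration).
sigma2 = 1.0
for n in (10, 100, 1000):
    # 10,000 independent sample means, each built from n measurements.
    M_n = rng.exponential(scale=1.0, size=(10_000, n)).mean(axis=1)
    print(n, M_n.var(), sigma2 / n)   # empirical Var(M_n) vs sigma^2 / n
```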
The Sample Mean
Use the Chebyshev inequality to formalize this probability:
P[|M_n − μ| ≥ ε] ≤ Var(M_n)/ε² = σ²/(nε²),
or equivalently P[|M_n − μ| < ε] ≥ 1 − σ²/(nε²).
Thus for any choice of error ε and probability 1 − δ, we can select the number of samples, n, so that the sample mean is within ε of the true mean with probability at least 1 − δ.
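Rearranging the Chebyshev bound, σ²/(nε²) ≤ δ gives n ≥ σ²/(δε²). A small helper (hypothetical, not from the slides) that does this calculation:

```python
import math

def samples_needed(sigma2: float, eps: float, delta: float) -> int:
    """Smallest n with sigma^2 / (n * eps^2) <= delta (Chebyshev bound)."""
    return math.ceil(sigma2 / (delta * eps ** 2))

# Numbers from the voltage example on the next slide:
# sigma^2 = 1, eps = 1 microvolt, delta = 0.01.
print(samples_needed(1.0, 1.0, 0.01))   # -> 100
```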
The Sample Mean
Example: A voltage of constant, but unknown, value is to be measured. Each measurement X_j is actually the sum of the desired voltage v and a noise voltage N_j of zero mean and standard deviation of 1 microvolt:
X_j = v + N_j.
Assume that the noise voltages are independent random variables. How many measurements are required so that the probability that M_n is within ε = 1 microvolt of the true mean is at least 0.99?
The Sample Mean
Example (continued): From the problem, we know that each measurement X_j has mean v and variance 1. Moreover, we require
P[|M_n − v| < 1] ≥ 1 − σ²/(nε²) = 1 − 1/n = 0.99.
Solving the above equation gives n = 100. Thus if we repeat the measurement 100 times, the sample mean of the measurement results will be within 1 microvolt of the true voltage with probability at least 0.99.
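A quick simulation of this measurement model (assuming NumPy; the Gaussian noise model is an illustrative assumption, since the slides only specify zero mean and unit standard deviation) confirms that n = 100 comfortably meets the 0.99 requirement; the Chebyshev bound is conservative.

```python
import numpy as np

rng = np.random.default_rng(3)

v = 5.0          # true voltage (arbitrary; unknown to the estimator)
n = 100          # number of measurements per experiment
trials = 50_000  # independent repetitions of the whole experiment

# X_j = v + N_j with N_j zero mean, unit standard deviation (Gaussian here).
X = v + rng.normal(loc=0.0, scale=1.0, size=(trials, n))
M_n = X.mean(axis=1)

# Empirical P[|M_n - v| < 1]; Chebyshev only guarantees >= 0.99.
print(np.mean(np.abs(M_n - v) < 1.0))
```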
Weak Law of Large Numbers
If we let the number of samples, n, approach infinity, then for any ε > 0,
lim_{n→∞} P[|M_n − μ| < ε] = 1.
The above is the expression of the weak law of large numbers, which states that for a large enough fixed value of n, the sample mean computed from n samples will be close to the true mean with high probability.
Strong Law of Large Numbers
Let X_1, X_2, ... be a sequence of iid r.v. with finite mean E[X] = μ and finite variance. Then
P[lim_{n→∞} M_n = μ] = 1,
which states that, with probability 1, every sequence of sample mean calculations will eventually approach and stay close to μ = E[X]. The strong law of large numbers demonstrates the consistency between the theory and the observed physical behavior.
Strong Law of Large Numbers
Example: Relative Frequency. Consider a sequence of independent repetitions of some random experiment, and let the r.v. I_j(A) be the indicator function for the occurrence of event A in the jth trial. The total number of occurrences of A in the first n trials is then
N_n(A) = Σ_{j=1}^{n} I_j(A).
The relative frequency of event A in the first n repetitions of the experiment is then
f_n(A) = N_n(A)/n.
Thus the relative frequency is simply the sample mean of the random variables I_j(A).
Strong Law of Large Numbers
Example (continued): Since E[I_j(A)] = P[A], applying the weak law of large numbers to the relative frequency gives
lim_{n→∞} P[|f_n(A) − P[A]| < ε] = 1,
and applying the strong law of large numbers to the relative frequency gives
P[lim_{n→∞} f_n(A) = P[A]] = 1.
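A short sketch of the relative-frequency interpretation (assuming NumPy; the event A is taken, arbitrarily, to be "heads" of a biased coin with P[A] = 0.3): the running sample mean of the indicator variables settles around P[A] as n grows.

```python
import numpy as np

rng = np.random.default_rng(4)

p_A = 0.3                      # P[A], chosen arbitrarily for illustration
n = 100_000
I = rng.random(n) < p_A        # indicator I_j(A) for each independent trial

# f_n(A) = N_n(A) / n, the running relative frequency (sample mean of I_j).
f_n = np.cumsum(I) / np.arange(1, n + 1)
for n_check in (100, 1_000, 10_000, 100_000):
    print(n_check, f_n[n_check - 1])   # approaches p_A as n grows
```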
The Central Limit Theorem
Let S_n be the sum of n iid random variables with finite mean E[X] = μ and finite variance σ², and let Z_n be the zero-mean, unit-variance r.v. defined by
Z_n = (S_n − nμ)/(σ√n)   (the normalized S_n).
As we know, the pdf of a Gaussian r.v. is
f_X(x) = (1/√(2πσ²)) e^{−(x−m)²/(2σ²)},
where m is the mean and σ² is the variance of the Gaussian r.v. Then
lim_{n→∞} P[Z_n ≤ z] = (1/√(2π)) ∫_{−∞}^{z} e^{−x²/2} dx,
which states that as n becomes large, the cdf of Z_n approaches the cdf of a zero-mean, unit-variance Gaussian r.v.
The Central Limit Theorem
In the central limit theorem, the X_j can have any distribution, as long as they have a finite mean and a finite variance; this is what gives the theorem its wide applicability. The central limit theorem explains why the Gaussian r.v. appears in so many applications.
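A minimal check of the theorem's statement (assuming NumPy; the exponential distribution for the X_j is an arbitrary, visibly non-Gaussian choice): the empirical cdf of Z_n is compared with the standard Gaussian cdf at a few points.

```python
import math
import numpy as np

rng = np.random.default_rng(5)

# X_j ~ Exponential(1): mu = 1, sigma = 1 (an arbitrary non-Gaussian choice).
n, mu, sigma = 50, 1.0, 1.0
S_n = rng.exponential(scale=1.0, size=(200_000, n)).sum(axis=1)
Z_n = (S_n - n * mu) / (sigma * math.sqrt(n))

def gaussian_cdf(z: float) -> float:
    """cdf of a zero-mean, unit-variance Gaussian r.v."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

for z in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(z, np.mean(Z_n <= z), gaussian_cdf(z))   # empirical vs limiting cdf
```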
The Central Limit Theorem
Example: Suppose that the orders at a restaurant are iid r.v. with mean μ = $8 and standard deviation σ = $2. Estimate the probability that the first 100 customers spend a total of (1) more than $840, (2) between $780 and $820.
Let X_j denote the expenditure of the jth customer; then the total spent by the first 100 customers is S_100 = X_1 + X_2 + ... + X_100.
The mean and variance of S_100 are
E[S_100] = 100μ = 800 and Var(S_100) = 100σ² = 400 (standard deviation 20).
Normalize S_100:
Z_100 = (S_100 − 800)/20.
The Central Limit Theorem
Example (continued): Thus,
(1) P[S_100 > 840] = P[Z_100 > (840 − 800)/20] = Q(2) ≈ 0.0228;
(2) P[780 ≤ S_100 ≤ 820] = P[−1 ≤ Z_100 ≤ 1] = 1 − 2Q(1) ≈ 0.6826.
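The Gaussian approximation above can be sanity-checked by simulation (assuming NumPy; since the slides do not specify the order distribution, a Uniform distribution with the example's mean and standard deviation is used purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

mu, sigma, n = 8.0, 2.0, 100
half_width = sigma * np.sqrt(3.0)   # Uniform(a, b) has std (b - a) / (2*sqrt(3))
orders = rng.uniform(mu - half_width, mu + half_width, size=(100_000, n))
S = orders.sum(axis=1)

# Compare the Monte Carlo estimates with the CLT answers Q(2) and 1 - 2*Q(1).
print(np.mean(S > 840.0))                    # approx 0.023
print(np.mean((S >= 780.0) & (S <= 820.0)))  # approx 0.68
```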
Questions? Thank you!
More: Q-function
The values of Q(x) in the previous example come from a table of the Q-function. The Q-function is defined by
Q(x) = 1 − Φ(x) = (1/√(2π)) ∫_{x}^{∞} e^{−t²/2} dt,
where Φ(x) is the cdf of a Gaussian r.v. with zero mean and unit variance.
Properties of the Q-function: Q(0) = 1/2 and Q(−x) = 1 − Q(x).
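Q(x) is also easy to evaluate without a table; a minimal sketch using the complementary error function (the identity Q(x) = ½ erfc(x/√2) follows directly from the definition):

```python
import math

def Q(x: float) -> float:
    """Q(x) = 1 - Phi(x) for a zero-mean, unit-variance Gaussian r.v."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(2.0))            # approx 0.0228, used in the restaurant example
print(Q(1.0))            # approx 0.1587
print(Q(0.0))            # 0.5
print(Q(-1.5) + Q(1.5))  # 1.0, since Q(-x) = 1 - Q(x)
```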
More: Proof of the central limit theorem
The characteristic function of Z_n is given by
Φ_{Z_n}(ω) = E[exp(jωZ_n)] = E[exp( (jω/(σ√n)) Σ_{k=1}^{n} (X_k − μ) )] = ( E[exp( jω(X − μ)/(σ√n) )] )^n,
since the X_k are iid. Expanding the exponential inside the expectation, we get
E[ 1 + (jω/(σ√n))(X − μ) + ((jω)²/(2σ²n))(X − μ)² + R_n(ω) ] = 1 − ω²/(2n) + E[R_n(ω)],
because E[X − μ] = 0 and E[(X − μ)²] = σ². The term E[R_n(ω)] can be neglected relative to ω²/(2n) as n becomes large. Thus,
Φ_{Z_n}(ω) ≈ (1 − ω²/(2n))^n → e^{−ω²/2} as n → ∞,
which is the characteristic function of a zero-mean, unit-variance Gaussian r.v.
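The convergence of Φ_{Z_n}(ω) toward e^{−ω²/2} can also be observed numerically; a small sketch (assuming NumPy; exponential X_k with μ = σ = 1 is an arbitrary illustrative choice) estimates the characteristic function of Z_n by averaging e^{jωZ_n} over simulated samples:

```python
import numpy as np

rng = np.random.default_rng(7)

# X_k ~ Exponential(1), so mu = sigma = 1 (arbitrary illustrative choice).
n, mu, sigma = 200, 1.0, 1.0
S_n = rng.exponential(scale=1.0, size=(50_000, n)).sum(axis=1)
Z_n = (S_n - n * mu) / (sigma * np.sqrt(n))

# Empirical characteristic function E[exp(j*w*Z_n)] vs the Gaussian limit.
for w in (0.5, 1.0, 2.0):
    phi_emp = np.mean(np.exp(1j * w * Z_n))
    print(w, phi_emp, np.exp(-w ** 2 / 2.0))
```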