Telecommunications Networking I
Topic 3: Quantifying the Performance of Communication Systems Carrying Analog Information
Dr. Stewart D. Personick, Drexel University
Copyright 2001, S.D. Personick. All Rights Reserved.
Signals in Noise
The basic model: received signal = (data or information signal) + noise
Analog Signals in Noise: Example
An engine's temperature a (degrees C) is read by a temperature sensor producing s = ca (volts), with c = 0.01 volt/degree-C. The received signal is r = s + n = ca + n, where the noise n has zero mean and var(n) = N (volts**2).
Analog Signals in Noise
Example (continued):
a = a number representing the information to be communicated; a priori, a is a Gaussian random variable with variance A and zero average value
s = a signal in the form of a voltage proportional to a, i.e., s = ca (volts), where c is a known constant
r = the received signal with additive noise = s + n (volts)
n = a Gaussian random variable with variance N (volts**2)
How can we estimate "a" if we receive "r"?
Signals in Noise
What is the criterion for determining whether we've done a good job of estimating "a" from the received signal "r"? It should depend on the difference between our estimated value of "a" and the true value of "a". Example: minimize E{(â − a)**2}, where â is the estimated value of "a", given r.
Signals in Noise
We will give a proof, on the blackboard, that â, the estimate of "a" that minimizes the average value of (â − a)**2, is â(r) = E(a|r) = the "expected value" of "a", given "r". The above is true for any probability distributions of "a" and "n"; for the specific cases given, â(r) = (r/c) [Ac**2/(Ac**2 + N)].
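The MMSE estimator above can be checked numerically. The sketch below uses illustrative values for A, N, and c (they are assumptions, not values from the slides) and compares the Monte Carlo mean squared error against the theoretical minimum AN/(Ac**2 + N):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (assumed, not from the slides)
A, N, c = 4.0, 1.0, 0.01   # prior variance, noise variance, volts/degree
trials = 200_000

a = rng.normal(0.0, np.sqrt(A), trials)   # information: a ~ N(0, A)
n = rng.normal(0.0, np.sqrt(N), trials)   # noise: n ~ N(0, N)
r = c * a + n                             # received signal r = ca + n

# MMSE estimate from the slide: a_hat(r) = (r/c) [A c**2 / (A c**2 + N)]
a_hat = (r / c) * (A * c**2 / (A * c**2 + N))

mse = np.mean((a_hat - a) ** 2)
theory = A * N / (A * c**2 + N)           # theoretical minimum MSE
print(mse, theory)                        # the two agree closely
```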
Signals in Noise
Harder example:
a = a Gaussian random variable with variance A, representing the information
s(t) = a signal of duration T (seconds), where s(t) = a c(t) and c(t) is a known waveform
r(t) = the received signal = s(t) + n(t), where n(t) is a "random process" having a set of known statistical characteristics
How do we estimate a, given r(t)?
Signals in Noise
What is the criterion for evaluating how good an estimate of "a" we have derived from r(t)? How do we describe the noise n(t) in a way that is useful in determining how to estimate "a" from r(t)?
Signals in Noise
Suppose n(t) = n c(t) + x(t), where: n is a Gaussian random variable of variance N; and where, in some loosely defined sense, x(t) is a random process that is statistically independent of the random variable "n", and where ... (continued on next slide)
Signals in Noise
...x(t) is also "orthogonal" to the known waveform c(t). Then we can construct a "hand-waving" argument that suggests we can ignore x(t) and concentrate on the noise n c(t) as we attempt to estimate the underlying information variable "a". Making this argument precise requires a deep understanding of the theory of random processes.
Signals in Noise
We will show (using the blackboard) that we can convert this problem into the earlier problem of estimating an information parameter "a" from a received signal of the form r = ca + n. While doing so, we will introduce the concept of a "matched filter".
Gaussian Random Processes
If we look at ("sample") the random process n(t) at times t1, t2, t3, ..., tj, then we get a set of random variables n(t1), n(t2), n(t3), ..., n(tj). If the set of random variables {n(tj)} has a joint probability density that is Gaussian, then n(t) is called a Gaussian random process.
Gaussian Random Processes (cont’d)
Any linear combination of samples of a Gaussian random process is a Gaussian random variable. Extending the above: the integral of the product n(t)c(t) over a time interval T is also a Gaussian random variable if n(t) is a Gaussian random process and c(t) is a known function.
Gaussian Random Processes (cont’d)
Let n(t) be a random process (not necessarily Gaussian). Define "n" as follows: n = the integral over T of n(t)c(t)/W, where W = the integral over T of c(t)c(t). Then we can write n(t) as follows: n(t) = n c(t) + "the rest of n(t)"
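The decomposition above makes "the rest of n(t)" orthogonal to c(t) by construction, which can be verified on a discrete grid. The waveform c(t) and noise path below are arbitrary stand-ins (assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete stand-ins for the slide's waveforms (illustrative assumptions)
T = 256
c_t = rng.normal(0.0, 1.0, T)            # any known waveform c(t) works here
W = np.sum(c_t * c_t)                    # W = integral over T of c(t)c(t)

n_t = rng.normal(0.0, 1.0, T)            # one sample path of the noise n(t)
n_scalar = np.sum(n_t * c_t) / W         # n = integral of n(t)c(t)/W
rest = n_t - n_scalar * c_t              # "the rest of n(t)"

print(np.sum(rest * c_t))                # ~0: the rest is orthogonal to c(t)
```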
Gaussian Random Processes (cont’d)
If n(t) is a "white, Gaussian random process", then:
- n is a Gaussian random variable, and
- "the rest of n(t)" is statistically independent of "n", i.e., "the rest of n(t)" contains no information that can help us estimate either "n" or "a"
Gaussian Random Processes (cont’d)
Furthermore, we can build a correlator that works as follows: it takes the received waveform r(t), multiplies it by the known waveform c(t), integrates over the time interval T, and finally divides by W: z = {the integral over T of r(t)c(t)}/W
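The correlator can be sketched in discrete time. The waveform c(t), the value of "a", and the noise level below are illustrative assumptions; the output z lands near the true "a", off by a small Gaussian error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete stand-ins for the slide's waveforms (illustrative assumptions)
T = 1000                                  # samples in the interval
t = np.arange(T)
c_t = np.sin(2 * np.pi * t / T)           # known waveform c(t)
W = np.sum(c_t * c_t)                     # W = integral of c(t)c(t)

a = 2.5                                   # information value to recover
n_t = rng.normal(0.0, 1.0, T)             # white Gaussian noise path
r_t = a * c_t + n_t                       # received r(t) = s(t) + n(t)

# Correlator from the slide: z = {integral over T of r(t)c(t)} / W
z = np.sum(r_t * c_t) / W
print(z)                                  # close to a, plus a small Gaussian error
```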
Gaussian Random Processes (cont’d)
Going back to the definitions and equations above, we find that z = a + n, where "a" is the original information variable we wish to estimate and n is a Gaussian random variable. Thus, by introducing the correlator, we convert the new problem to (continued)
Gaussian Random Processes (cont’d)
…the old problem of estimating a scalar ("a") from another scalar ("r"), where r = a + n and n is a Gaussian random variable. The correlation operation is also known as "matched filtering", because it can be accomplished by passing r(t) through a filter whose impulse response is c(−t).
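The equivalence between correlation and filtering with the time-reversed waveform can be checked on a small discrete example. The pulse and received values below are made up for illustration; sampling the convolution at the right instant reproduces the correlation sum exactly:

```python
import numpy as np

# Illustrative pulse and received samples (assumed values)
c_t = np.array([0.0, 1.0, 2.0, 0.5, 0.0])
r_t = 3.0 * c_t + np.array([0.1, -0.2, 0.05, 0.0, 0.3])

corr = np.sum(r_t * c_t)                 # correlator output (before dividing by W)
h = c_t[::-1]                            # impulse response c(-t): time-reversed pulse
conv = np.convolve(r_t, h)               # pass r(t) through the matched filter
# Sampling the convolution at lag len(c_t)-1 reproduces the correlation
print(corr, conv[len(c_t) - 1])
```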
Capturing analog waveforms
Example: information waveform (amplitude vs. time)
Example: pulse stream (amplitude vs. time)
Example: sampling
Example: samples
Example: pulse amplitude modulation (PAM)
Example: PAM stream
Example: the PAM stream is a representation of the information signal
Example: s(t) = transmitted signal
Example: r(t) = s(t) + n(t) = received signal
Matched Filtering
Input: a x(t). Matched filter: impulse response h(t) = x(−t). Output: y(t).
y(t) = integral of h(t−u) a x(u) du = integral of x(u−t) a x(u) du
y(0) = integral of a x(u)x(u) du = aE
Matched Filtering
Input: a x(t) + n(t). Matched filter: impulse response h(t) = x(−t). Output: y(t).
y(t) = integral of h(t−u) {a x(u) + n(u)} du = integral of x(u−t) {a x(u) + n(u)} du
y(0) = integral of x(u){a x(u) + n(u)} du = aE + n
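The identity y(0) = aE + n can be checked directly on a discrete grid. The pulse x(t), the value of "a", and the noise level are illustrative assumptions; the matched-filter sample equals aE plus the scalar Gaussian noise term by construction:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative pulse and parameters (assumed values)
x = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
E = np.sum(x * x)                        # pulse energy E = integral of x(u)x(u) du
a = 1.7
noise = rng.normal(0.0, 0.1, x.size)     # one noise sample path

received = a * x + noise                 # a x(t) + n(t)
y0 = np.sum(received * x)                # y(0) = integral of x(u){a x(u) + n(u)} du
n_scalar = np.sum(noise * x)             # the scalar Gaussian noise term "n"
print(y0, a * E + n_scalar)              # identical by construction
```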
Example
If each of the random variables "a" is a Gaussian random variable with variance A, and if n(t) is white Gaussian noise with "spectral density" N, then the optimal estimate of each of the variables "a" is given by "sampling" the output of the matched filter and multiplying it by (1/E)[AE/(N+AE)].
Example
If each of the random variables "a" is a Gaussian random variable with variance A, and if n(t) is "white" Gaussian noise with "spectral density" N, then the associated mean squared error of each estimate will be A[N/(N+AE)].
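Both the estimator gain (1/E)[AE/(N+AE)] and the error A[N/(N+AE)] can be checked by Monte Carlo. The values of A, N, and E below are illustrative assumptions; for white noise of spectral density N, the matched-filter noise sample has variance NE, which the sketch uses directly:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters (assumed, not from the slides)
A, N, E = 2.0, 0.5, 3.0
trials = 200_000

a = rng.normal(0.0, np.sqrt(A), trials)
# Matched-filter sample y(0) = a*E + n, where var(n) = N*E for white noise
n = rng.normal(0.0, np.sqrt(N * E), trials)
y0 = a * E + n

# Estimate from the slide: sample the matched filter, scale by (1/E)[AE/(N+AE)]
a_hat = y0 * (1.0 / E) * (A * E / (N + A * E))

mse = np.mean((a_hat - a) ** 2)
theory = A * N / (N + A * E)             # mean squared error from the slide
print(mse, theory)                       # the two agree closely
```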