Copyright 1998, S.D. Personick. All Rights Reserved. Telecommunications Networking I, Lectures 4 & 5: Quantifying the Performance of Communication Systems Carrying Analog Information


1. Telecommunications Networking I, Lectures 4 & 5: Quantifying the Performance of Communication Systems Carrying Analog Information

2. Signals in Noise — The basic problem: an information signal has noise added to it in transit; the receiver observes signal + noise and must recover the information.

3. Analog Signals in Noise: Example — A temperature sensor produces s = ca (volts), where a = temperature (C). With additive noise, the received signal is r = s + n = ca + n, where c = 0.01 volt/degree-C, E{n} = 0, and var(n) = 0.0001 volt**2.

4. Analog Signals in Noise: Example (continued) — a = a number representing the information to be communicated; a priori, a is a Gaussian random variable with variance A and zero average value. s = a signal in the form of a voltage proportional to a: s = ca (volts), where c is a known constant. r = the received signal with additive noise: r = s + n (volts). n = a Gaussian random variable with variance N (volts**2). How can we estimate a, given that we receive r?

5. Signals in Noise — What is the criterion for determining whether we have done a good job of estimating a from the received signal r? It should involve the difference between our estimated value of a and the true value of a. Example: minimize E{(â - a)**2}, where â is the estimated value of a, given r.

6. Signals in Noise — We will prove, on the blackboard, that the estimate â that minimizes the average value of (â - a)**2 is â(r) = E(a|r), the expected value of a given r. This holds for any probability distributions of a and n; for the specific Gaussian case above, â(r) = (r/c) [Ac**2/(Ac**2 + N)].
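This Gaussian case can be checked numerically. The sketch below (an illustration, not part of the original lecture) simulates r = ca + n using c and var(n) from the temperature-sensor example and an assumed prior variance A, then compares the empirical mean squared error of the estimator â(r) = (r/c)[Ac**2/(Ac**2 + N)] against the standard closed-form value AN/(Ac**2 + N) and against the naive inverse r/c:

```python
import random

# Monte Carlo sketch (illustrative): c and N match slide 3; A is an assumption.
random.seed(1)
c, A, N = 0.01, 25.0, 0.0001   # volt/degree-C, degC**2, volt**2
trials = 200_000
gain = A * c**2 / (A * c**2 + N)
mse_mmse = mse_naive = 0.0
for _ in range(trials):
    a = random.gauss(0.0, A ** 0.5)            # information, variance A
    r = c * a + random.gauss(0.0, N ** 0.5)    # received signal r = ca + n
    a_hat = (r / c) * gain                     # MMSE estimate from slide 6
    mse_mmse += (a_hat - a) ** 2
    mse_naive += (r / c - a) ** 2
mse_mmse /= trials
mse_naive /= trials
print(mse_mmse, A * N / (A * c**2 + N), mse_naive)
```

The empirical error tracks the closed form, and the MMSE estimate always does at least as well as the naive inverse; the gap grows as the noise variance N grows relative to Ac**2.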

7. Signals in Noise — A harder example: a = a Gaussian random variable with variance A, representing information. s(t) = a signal of duration T (seconds), where s(t) = a c(t) and c(t) is a known waveform. r(t) = the received signal = s(t) + n(t), where n(t) is a "random process" with a set of known statistical characteristics. How do we estimate a, given r(t)?

8. Signals in Noise — What is the criterion for evaluating how good an estimate of a we have derived from r(t)? How do we describe the noise n(t) in a way that is useful in determining how to estimate a from r(t)?

9. Signals in Noise — Suppose n(t) = n c(t) + x(t), where n is a Gaussian random variable of variance N, and where, in some loosely defined sense, x(t) is a random process that is statistically independent of the random variable n, and where … (continued on next slide)

10. Signals in Noise — … x(t) is also "orthogonal" to the known waveform c(t). Then we can construct a "hand-waving" argument suggesting that we can ignore x(t) and concentrate on the noise n c(t) as we estimate the underlying information variable a. Making this argument precise requires a deeper understanding of the theory of random processes.

11. Signals in Noise — We will show (using the blackboard) that we can convert this problem into the earlier problem of estimating an information parameter a from a received signal of the form r = ca + n. Along the way, we will introduce the concept of a "matched filter".

12. Gaussian Random Processes — If we look at ("sample") the random process n(t) at times t1, t2, t3, …, tj, we get a set of random variables n(t1), n(t2), n(t3), …, n(tj). If the set of random variables {n(tj)} has a joint probability density that is Gaussian, then n(t) is called a Gaussian random process.

13. Gaussian Random Processes (cont'd) — Any linear combination of samples of a Gaussian random process is a Gaussian random variable. Extending this, the integral of the product n(t)c(t) over a time interval T is also a Gaussian random variable if n(t) is a Gaussian random process and c(t) is a known function.

14. Gaussian Random Processes (cont'd) — Let n(t) be a random process (not necessarily Gaussian). Define n as follows: n = (the integral over T of n(t)c(t)) / W, where W = the integral over T of c(t)c(t). Then we can write n(t) as: n(t) = n c(t) + "the rest of n(t)".
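The decomposition on this slide is a projection, and it can be verified in discrete time. The sketch below (illustrative; the waveform c(t), time grid, and noise level are assumptions, not from the slides) computes the coefficient n = (∫ n(t)c(t) dt)/W and confirms that "the rest of n(t)" is orthogonal to c(t) by construction:

```python
import random

# Discrete-time sketch of n(t) = n*c(t) + rest(t), with n = (integral n*c)/W
# and W = integral c**2. Waveform and grid are illustrative assumptions.
random.seed(2)
dt, T = 0.001, 1.0
steps = int(T / dt)
c = [1.0 if k < steps // 2 else -1.0 for k in range(steps)]  # known waveform c(t)
noise = [random.gauss(0.0, 1.0) for _ in range(steps)]       # samples of n(t)
W = sum(ck * ck for ck in c) * dt
n = sum(nk * ck for nk, ck in zip(noise, c)) * dt / W        # projection coefficient
rest = [nk - n * ck for nk, ck in zip(noise, c)]             # "the rest of n(t)"
inner = sum(rk * ck for rk, ck in zip(rest, c)) * dt         # integral of rest*c
print(n, inner)  # inner is zero up to floating-point round-off
```

Because rest(t) = n(t) - n c(t), its inner product with c(t) is (∫ n c) - nW = 0, which is exactly what the printed value shows.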

15. Gaussian Random Processes (cont'd) — If n(t) is a white Gaussian random process, then: n is a Gaussian random variable, and "the rest of n(t)" is statistically independent of n; i.e., "the rest of n(t)" contains no information that can help us estimate either n or a.

16. Gaussian Random Processes (cont'd) — Furthermore, we can build a correlator that works as follows: it takes the received waveform r(t), multiplies it by the known waveform c(t), integrates over the time interval T, and finally divides by W: z = (the integral over T of r(t)c(t)) / W.
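The correlator can be sketched in a few lines of discrete-time code. In this illustration (the waveform, the value of a, and the noise level are assumptions, not from the slides), applying z = (∫ r(t)c(t) dt)/W to r(t) = a c(t) + n(t) returns a plus a small scalar noise term:

```python
import random

# Discrete-time correlator sketch: z = (integral of r*c)/W recovers a + noise.
random.seed(3)
dt, steps = 0.001, 1000
c = [1.0] * (steps // 2) + [-1.0] * (steps // 2)   # known waveform c(t)
a = 2.5                                            # information variable (assumed)
r = [a * ck + random.gauss(0.0, 0.2) for ck in c]  # received r(t) = a c(t) + n(t)
W = sum(ck * ck for ck in c) * dt
z = sum(rk * ck for rk, ck in zip(r, c)) * dt / W  # correlator output: z = a + n
print(z)  # close to a = 2.5
```

The division by W makes the signal term come out as exactly a, which is what lets the next slides reuse the earlier scalar estimation result.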

17. Gaussian Random Processes (cont'd) — Going back to the definitions and equations above, we find that z = a + n, where a is the original information variable we wish to estimate and n is a Gaussian random variable. Thus, by introducing the correlator, we convert the new problem into … (continued)

18. Gaussian Random Processes (cont'd) — … the old problem of estimating a scalar (a) from another scalar (r), where r = a + n and n is a Gaussian random variable. The correlation operation is also known as "matched filtering", because it can be accomplished by passing r(t) through a filter whose impulse response is c(-t).
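The equivalence between correlation and filtering with the time-reversed waveform can be seen directly in discrete time. The sketch below (with made-up sample values) computes the correlation of r with c two ways: as a direct inner product, and as one output sample of a filter whose impulse response is c reversed:

```python
# Correlation vs. matched filtering in discrete time (illustrative values).
c = [0.5, -1.0, 2.0, 1.5]       # known waveform samples c[k]
r = [1.0, 0.25, -0.5, 3.0]      # received waveform samples r[k]
corr = sum(ck * rk for ck, rk in zip(c, r))   # direct correlation
h = c[::-1]                     # impulse response h[k] = c[-k] (time-reversed)
# Discrete convolution y[n] = sum_k h[n-k] r[k]; sampling at n = len(c)-1
# (the shift introduced by the reversal) recovers the correlation.
n = len(c) - 1
y0 = sum(h[n - k] * r[k] for k in range(len(r)))
print(corr, y0)  # the two values are equal
```

Since h[n-k] = c[k] at that sample index, the convolution sum reduces term by term to the correlation sum.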

19. Example: information waveform vs. time [figure]

20. Example: pulse stream [figure]

21. Example: sampling [figure]

22. Example: samples [figure]

23. Example: pulse amplitude modulation (PAM) [figure]

24. Example: PAM stream [figure]

25. Example: the PAM stream is a representation of the information signal [figure]

26. Example: s(t) = transmitted signal [figure]

27. Example: r(t) = s(t) + n(t) = received signal [figure]
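The figure sequence above — sample an information waveform, then carry each sample on a pulse — can be sketched as follows. This is an illustration only: the sine-wave information signal, the sample period Ts, and the narrow rectangular pulse shape are all assumptions, not taken from the slides.

```python
import math

# PAM sketch: sample a waveform, then form s(t) = sum_k a_k * pulse(t - k*Ts).
Ts = 0.125                 # sample period in seconds (assumed)
samples = [math.sin(2 * math.pi * 1.0 * k * Ts) for k in range(8)]  # a_k

def pam(t, pulse_width=0.02):
    """PAM stream value at time t: each sample a_k scales a narrow
    rectangular pulse located at t = k*Ts."""
    total = 0.0
    for k, ak in enumerate(samples):
        if 0.0 <= t - k * Ts < pulse_width:
            total += ak
    return total

print(pam(Ts), samples[1])  # the pulse at t = Ts carries amplitude a_1
```

Between pulses the stream is zero; the receiver recovers the information by measuring each pulse amplitude, which is exactly the estimation problem the matched filter addresses next.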

28. Matched Filtering — Input a x(t); matched filter impulse response h(t) = x(-t); output y(t). y(t) = integral of h(t-u) a x(u) du = integral of x(u-t) a x(u) du. y(0) = integral of a x(u)x(u) du = aE, where E = the integral of x(u)x(u) du is the energy of x(t).

29. Matched Filtering — Input a x(t) + n(t); matched filter impulse response h(t) = x(-t); output y(t). y(t) = integral of h(t-u) {a x(u) + n(u)} du = integral of x(u-t) {a x(u) + n(u)} du. y(0) = integral of x(u) {a x(u) + n(u)} du = aE + n.
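Sampling the matched-filter output at t = 0 can be simulated directly. In this sketch (the pulse shape, the value of a, and the noise level are assumptions), the output sample equals aE plus a small Gaussian noise term, exactly as on the slide:

```python
import random

# Matched-filter sample: y(0) = integral of x(u)*(a*x(u) + n(u)) du = a*E + n.
random.seed(4)
dt, steps = 0.001, 1000
x = [1.0] * steps                                   # known pulse x(t), duration 1 s
E = sum(xk * xk for xk in x) * dt                   # pulse energy (here E = 1)
a = 0.8                                             # information variable (assumed)
r = [a * xk + random.gauss(0.0, 0.1) for xk in x]   # received a*x(t) + n(t)
y0 = sum(xk * rk for xk, rk in zip(x, r)) * dt      # matched-filter output at t = 0
print(y0, a * E)  # y0 = a*E plus a small Gaussian noise term
```

With the noise variance used here, the noise term in y0 has standard deviation of a few thousandths, so the sample lands very close to aE.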

30. Example — If each of the random variables a is a Gaussian random variable with variance A, and n(t) is white Gaussian noise with "spectral density" N, then the optimal estimate of each a is obtained by "sampling" the output of the matched filter and multiplying it by (1/E) [AE/(N + AE)].

31. Example — If each of the random variables a is a Gaussian random variable with variance A, and n(t) is white Gaussian noise with spectral density N, then the associated mean squared error of each estimate will be A [N/(N + AE)].
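Slides 30 and 31 can be checked together by Monte Carlo. For white noise of spectral density N, the noise term in the matched-filter sample y(0) = aE + n has variance NE, so the sample can be simulated without simulating the whole waveform. The values of A, N, and E below are assumptions for illustration:

```python
import random

# Monte Carlo sketch of slides 30-31: estimate a from y(0) = a*E + n,
# var(n) = N*E, using the gain (1/E)*A*E/(N + A*E); compare the empirical
# mean squared error with the slide's formula A*N/(N + A*E).
random.seed(5)
A, N, E = 4.0, 0.5, 1.0            # assumed prior variance, noise density, energy
gain = (1.0 / E) * (A * E / (N + A * E))
trials = 200_000
mse = 0.0
for _ in range(trials):
    a = random.gauss(0.0, A ** 0.5)
    y0 = a * E + random.gauss(0.0, (N * E) ** 0.5)  # matched-filter sample
    a_hat = y0 * gain                               # optimal estimate (slide 30)
    mse += (a_hat - a) ** 2
mse /= trials
print(mse, A * N / (N + A * E))  # empirical vs. theoretical mean squared error
```

The empirical error matches A[N/(N + AE)], the scalar result from slide 6 with Ac**2 replaced by AE, which is exactly the reduction the correlator/matched filter was introduced to achieve.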

