Copyright 1998, S.D. Personick. All Rights Reserved. Telecommunications Networking I Lectures 4&5 Quantifying the Performance of Communication Systems.


Telecommunications Networking I, Lectures 4 & 5: Quantifying the Performance of Communication Systems Carrying Analog Information

Signals in Noise
The basic problem (block diagram): information -> signal -> (+ noise) -> signal + noise -> recovered information

Analog Signals in Noise: Example
Temperature sensor: s = ca (volts), where a = temperature (°C)
r = s + n = ca + n
c = 0.01 volt/°C
E[n] = 0, var(n) = 0.0001 volt^2

Analog Signals in Noise: Example (continued)
a = a number representing the information to be communicated. A priori, a is a Gaussian random variable with variance A and zero mean
s = a signal in the form of a voltage proportional to a, i.e., s = ca (volts), where c is a known constant
r = the received signal with additive noise = s + n (volts)
n = a Gaussian random variable with variance N (volts^2)
How can we estimate "a" if we receive "r"?

Signals in Noise
What is the criterion for determining whether we've done a good job of estimating "a" from the received signal "r"? It should involve the difference between our estimated value of "a" and the true value of "a".
Example: minimize E{(â - a)^2}, where â is the estimated value of "a" given r

Signals in Noise
We will give a proof, on the blackboard, that â, the estimate of "a" that minimizes the average value of (â - a)^2, is
â(r) = E(a|r) = the "expected value" of "a", given "r"
The above is true for any probability distributions of "a" and "n"; for the specific Gaussian case given,
â(r) = (r/c) [Ac^2/(Ac^2 + N)]
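The conditional-mean estimate above can be checked numerically. Below is a minimal Monte Carlo sketch (not part of the original slides) using the temperature-sensor numbers c = 0.01 volt/°C and var(n) = 0.0001 volt^2; the prior variance A = 25 is an assumed value chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = 25.0        # assumed a-priori variance of the temperature "a" (illustrative)
c = 0.01        # volts per degree C, from the sensor example
N = 0.0001      # noise variance in volts^2, from the sensor example
trials = 200_000

a = rng.normal(0.0, np.sqrt(A), trials)    # information (temperature)
n = rng.normal(0.0, np.sqrt(N), trials)    # additive Gaussian noise
r = c * a + n                              # received signal r = ca + n

# Conditional-mean (MMSE) estimate from the slide: (r/c) * Ac^2/(Ac^2 + N)
a_hat = (r / c) * (A * c**2 / (A * c**2 + N))
a_naive = r / c                            # simply inverting the known gain

mse_hat = np.mean((a_hat - a) ** 2)
mse_naive = np.mean((a_naive - a) ** 2)
print(mse_hat, mse_naive)                  # the MMSE estimate has the smaller error
```

The empirical mean squared error of the conditional-mean estimate comes out close to the theoretical value A·N/(Ac^2 + N), and is smaller than that of the naive gain-inversion estimate.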

Signals in Noise
Harder example:
a = a Gaussian random variable with variance A, representing the information
s(t) = a signal of duration T (seconds), where s(t) = a c(t) and c(t) is a known waveform
r(t) = the received signal = s(t) + n(t), where n(t) is a "random process" with a set of known statistical characteristics
How do we estimate a, given r(t)?

Signals in Noise
What is the criterion for evaluating how good an estimate of "a" we have derived from r(t)? How do we describe the noise n(t) in a way that is useful in determining how to estimate "a" from r(t)?

Signals in Noise
Suppose n(t) = n c(t) + x(t), where: n is a Gaussian random variable of variance N; and where, in some loosely defined sense, x(t) is a random process that is statistically independent of the random variable "n", and where … (continued on next slide)

Signals in Noise
… x(t) is also "orthogonal" to the known waveform c(t). Then we can construct a "hand-waving" argument that suggests we can ignore x(t) and concentrate on the noise n c(t) as we attempt to estimate the underlying information variable "a". Making this argument precise requires a deep understanding of the theory of random processes.

Signals in Noise
We will show (using the blackboard) that we can convert this problem into the earlier problem of estimating an information parameter "a" from a received signal of the form r = ca + n. While doing so, we will introduce the concept of a "matched filter".

Gaussian Random Processes
If we look at ("sample") the random process n(t) at times t1, t2, t3, …, tj, we get a set of random variables: n(t1), n(t2), n(t3), …, n(tj). If the set of random variables {n(tj)} has a joint probability density that is Gaussian, then n(t) is called a Gaussian random process.

Gaussian Random Processes (cont'd)
Any linear combination of samples of a Gaussian random process is a Gaussian random variable.
Extending the above, the integral of the product n(t)c(t) over a time interval T is also a Gaussian random variable if n(t) is a Gaussian random process and c(t) is a known function.

Gaussian Random Processes (cont'd)
Let n(t) be a random process (not necessarily Gaussian).
Define "n" as follows: n = the integral over T of n(t)c(t)/W, where W = the integral over T of c(t)c(t).
Then we can write n(t) as follows: n(t) = n c(t) + "the rest of n(t)"
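This decomposition can be illustrated numerically. The sketch below is a discrete stand-in (the integrals become sums over samples, and the particular waveform c(t) is an arbitrary choice, not from the slides): it projects a noise record onto c(t) and confirms that "the rest of n(t)" is orthogonal to c(t) by construction.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1e-3
t = np.arange(0.0, 1.0, dt)                # the interval T = [0, 1)

c_t = np.sin(2 * np.pi * 5 * t)            # a known waveform c(t) (arbitrary choice)
n_t = rng.normal(0.0, 1.0, t.size)         # a discrete stand-in for the noise n(t)

W = np.sum(c_t * c_t) * dt                 # W = integral over T of c(t)c(t)
n = np.sum(n_t * c_t) * dt / W             # n = integral over T of n(t)c(t)/W
rest = n_t - n * c_t                       # "the rest of n(t)"

# By construction, "the rest of n(t)" is orthogonal to c(t):
print(np.sum(rest * c_t) * dt)             # essentially zero
```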

Gaussian Random Processes (cont'd)
If n(t) is a "white Gaussian random process", then:
- n is a Gaussian random variable, and
- "the rest of n(t)" is statistically independent of "n", i.e., "the rest of n(t)" contains no information that can help us estimate either "n" or "a"

Gaussian Random Processes (cont'd)
Furthermore, we can build a correlator that works as follows: it takes the received waveform r(t), multiplies it by the known waveform c(t), integrates over the time interval T, and finally divides by W:
z = {the integral over T of r(t)c(t)}/W

Gaussian Random Processes (cont'd)
Going back to the definitions and equations above, we find that z = a + n, where "a" is the original information variable we wish to estimate and n is a Gaussian random variable.
Thus, by introducing the correlator, we convert the new problem to … (continued)

Gaussian Random Processes (cont'd)
… the old problem of estimating a scalar ("a") from another scalar ("r"), where r = a + n and n is a Gaussian random variable.
The correlation operation is also known as "matched filtering", because it can be accomplished by passing r(t) through a filter whose impulse response is c(-t).
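The equivalence between the correlator and matched filtering can be demonstrated discretely. In the sketch below (illustrative only; the waveform samples and the value of a are arbitrary choices), convolving r with the time-reversed waveform and sampling the output at the instant corresponding to t = 0 reproduces the correlator output z = a + n exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
M = 1000
c_t = rng.normal(size=M)                   # known waveform samples (arbitrary choice)
a = 1.7                                    # hypothetical information value
r = a * c_t + rng.normal(size=M)           # received signal r(t) = a c(t) + n(t)

W = np.dot(c_t, c_t)                       # discrete version of integral of c(t)c(t)

# Correlator: z = {integral over T of r(t)c(t)}/W
z_corr = np.dot(r, c_t) / W

# Matched filter: convolve r with the time-reversed waveform c(-t), sample at t = 0
y = np.convolve(r, c_t[::-1], mode="full")
z_mf = y[M - 1] / W                        # index M-1 of the full convolution is t = 0

print(np.isclose(z_corr, z_mf))            # the two operations agree
```

With W large, the noise term in z = a + n has small variance, so z lands close to the true a; the correlator and the sampled matched filter are the same computation arranged differently.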

Example: information waveform vs. time (figure)

Example: pulse stream (figure)

Example: sampling (figure)

Example: samples (figure)

Example: pulse amplitude modulation (PAM) (figure)

Example: PAM stream (figure)

Example: the PAM stream is a representation of the information signal

Example: s(t) = transmitted signal (figure)

Example: r(t) = s(t) + n(t) = received signal (figure)

Matched Filtering
Input a x(t) -> matched filter with impulse response h(t) = x(-t) -> output y(t)
y(t) = integral of [h(t-u) a x(u) du] = integral of [x(u-t) a x(u) du]
y(0) = integral of [a x(u)x(u) du] = aE, where E is the energy of x(t)

Matched Filtering
Input a x(t) + n(t) -> matched filter with impulse response h(t) = x(-t) -> output y(t)
y(t) = integral of [h(t-u) {a x(u) + n(u)} du] = integral of [x(u-t) {a x(u) + n(u)} du]
y(0) = integral of [x(u) {a x(u) + n(u)} du] = aE + n
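The statistics of the sampled output y(0) = aE + n can be checked by simulation. The sketch below (illustrative; the pulse shape, a, and N are assumed values) uses the standard discrete model of white noise with spectral density N, namely samples of variance N/dt, and verifies that y(0) has mean aE and variance NE.

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 1e-2
t = np.arange(0.0, 1.0, dt)
x = np.where(t < 0.5, 1.0, -1.0)           # a known pulse shape x(t) (assumed)
E = np.sum(x * x) * dt                     # pulse energy E
a = 0.8                                    # hypothetical information value
N = 0.01                                   # assumed noise spectral density

trials = 20_000
# discrete white noise with spectral density N has per-sample variance N/dt
n_t = rng.normal(0.0, np.sqrt(N / dt), size=(trials, t.size))
r = a * x + n_t                            # r(t) = a x(t) + n(t), one row per trial

y0 = np.sum(r * x, axis=1) * dt            # matched-filter output sampled at t = 0

print(np.mean(y0))                         # close to a * E
print(np.var(y0))                          # close to N * E
```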

Example
If each of the random variables "a" is a Gaussian random variable with variance A, and n(t) is white Gaussian noise with spectral density N, then the optimal estimate of each of the variables "a" is given by sampling the output of the matched filter and multiplying it by:
(1/E) [AE/(N + AE)]

Example
If each of the random variables "a" is a Gaussian random variable with variance A, and n(t) is white Gaussian noise with spectral density N, then the associated mean squared error of each estimate will be:
A [N/(N + AE)]
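These last two slides can be verified together with a short Monte Carlo sketch (the values of A, E, and N are illustrative assumptions): scaling the sampled matched-filter output y(0) = aE + n, whose noise term has variance NE, by (1/E)·AE/(N + AE) yields a mean squared error close to A·N/(N + AE).

```python
import numpy as np

rng = np.random.default_rng(4)
A, E, N = 2.0, 1.5, 0.5                    # assumed prior variance, pulse energy, spectral density
trials = 500_000

a = rng.normal(0.0, np.sqrt(A), trials)    # Gaussian information variables
n = rng.normal(0.0, np.sqrt(N * E), trials)  # matched-filter noise: variance N*E
y0 = a * E + n                             # sampled matched-filter output

# optimal estimate from the slide: scale y(0) by (1/E) * AE/(N + AE)
a_hat = y0 * (1.0 / E) * (A * E / (N + A * E))

mse = np.mean((a_hat - a) ** 2)
print(mse)                                 # close to A * N / (N + A*E)
```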