Telecommunications Networking I


Telecommunications Networking I, Topic 3: Quantifying the Performance of Communication Systems Carrying Analog Information. Dr. Stewart D. Personick, Drexel University. Copyright 2001, S.D. Personick. All Rights Reserved.

Signals in Noise. The basic model: received signal = (data or information signal) + noise.

Analog Signals in Noise: Example. A temperature sensor on an engine produces the signal s = ca (volts), where a = temperature (degrees C) and c = .01 volt/degree-C. The received signal is r = s + n = ca + n, where the noise n has zero average and var(n) = .0001 volt**2.

Analog Signals in Noise: Example (continued). a = a number representing the information to be communicated; a priori, a is a Gaussian random variable with variance A and zero average value. s = a signal in the form of a voltage proportional to a, i.e., s = ca (volts), where c is a known constant. r = the received signal with additive noise, r = s + n (volts). n = a Gaussian random variable with variance N (volts**2). How can we estimate “a” if we receive “r”?

Signals in Noise. What is the criterion for determining whether we’ve done a good job of estimating “a” from the received signal “r”? It should involve the difference between our estimated value of “a” and the true value of “a”. Example: minimize E{(â-a)**2}, where â is the estimated value of “a”, given r.

Signals in Noise. We will give a proof, on the blackboard, that â, the estimate of “a” that minimizes the average value of (â-a)**2, is â(r) = E(a|r) = the “expected value” of “a”, given “r”. The above is true for any probability distributions of “a” and “n”; for the specific Gaussian case given, â(r) = (r/c) [Ac**2/(Ac**2 + N)].
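As a sanity check on this formula, the Monte Carlo sketch below compares the conditional-mean estimator with the naive inversion r/c under the Gaussian model above. The specific values of A, N, and c are illustrative assumptions, not numbers from the lecture:

```python
import numpy as np

# Monte Carlo check of the conditional-mean estimator for r = c*a + n.
# A, N, c below are illustrative values, not numbers from the lecture.
rng = np.random.default_rng(0)
A, N, c = 4.0, 1.0, 0.5       # prior variance of a, noise variance, known gain
M = 200_000                   # number of independent trials

a = rng.normal(0.0, np.sqrt(A), M)    # information: Gaussian, zero mean, var A
n = rng.normal(0.0, np.sqrt(N), M)    # noise: Gaussian, zero mean, var N
r = c * a + n                         # received signal

a_mmse = (r / c) * (A * c**2) / (A * c**2 + N)   # a_hat(r) = E(a | r)
a_naive = r / c                                  # simply invert the known gain

mse_mmse = np.mean((a_mmse - a) ** 2)
mse_naive = np.mean((a_naive - a) ** 2)
print(mse_mmse, mse_naive)    # the conditional mean gives the smaller error
```

The empirical error of the conditional-mean estimator should land near the theoretical value AN/(Ac**2 + N), below the naive estimator’s N/c**2.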

Signals in Noise. Harder example: a = a Gaussian random variable with variance A, representing information. s(t) = a signal of duration T (seconds), where s(t) = a c(t), and c(t) is a known waveform. r(t) = a received signal = s(t) + n(t), where n(t) is a “random process” having a set of known statistical characteristics. How do we estimate a, given r(t)?

Signals in Noise. What is the criterion for evaluating how good an estimate of “a” we have derived from r(t)? How do we describe the noise n(t) in a way that is useful in determining how to estimate “a” from r(t)?

Signals in Noise. Suppose n(t) = n c(t) + x(t), where: n is a Gaussian random variable of variance N; and where, in some loosely defined sense, x(t) is a random process that is statistically independent of the random variable “n”, and where …. (continued on next slide)

Signals in Noise. ...x(t) is also “orthogonal” to the known waveform c(t). Then we can construct a “hand-waving” argument suggesting that we can ignore x(t) and concentrate on the noise n c(t) as we attempt to estimate the underlying information variable “a”. Making this argument precise requires a deep understanding of the theory of random processes.

Signals in Noise. We will show (using the blackboard) that we can convert this problem into the earlier problem of estimating an information parameter “a” from a received signal of the form r = ca + n. While doing so, we will introduce the concept of a “matched filter”.

Gaussian Random Processes. If we look at (“sample”) the random process n(t) at times t1, t2, t3, …, tj, then we get a set of random variables: n(t1), n(t2), n(t3), …, n(tj). If the set of random variables {n(tj)} has a joint probability density that is Gaussian, then n(t) is called a Gaussian random process.

Gaussian Random Processes (cont’d). Any linear combination of samples of a Gaussian random process is a Gaussian random variable. Extending the above, the integral of the product n(t)c(t) over a time interval T is also a Gaussian random variable if n(t) is a Gaussian random process and c(t) is a known function.
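This closure property can be illustrated numerically: form a discrete stand-in for the integral of n(t)c(t) over T and check that its variance matches the linear-combination formula and that its empirical kurtosis sits near the Gaussian value of 3. The waveform c(t), sample spacing, and sample variance below are illustrative assumptions:

```python
import numpy as np

# A weighted sum of samples of a Gaussian random process is itself Gaussian.
rng = np.random.default_rng(1)
K, dt, paths = 400, 0.0025, 10_000    # samples per path, spacing, sample paths
t = np.arange(K) * dt
c = np.cos(2 * np.pi * t)             # known function c(t) (illustrative)

sigma2 = 2.0                          # variance of each process sample
n = rng.normal(0.0, np.sqrt(sigma2), (paths, K))   # i.i.d. Gaussian samples

# Discrete stand-in for the integral of n(t)c(t) over T:
g = (n * c).sum(axis=1) * dt

var_theory = sigma2 * dt**2 * (c**2).sum()   # variance of the weighted sum
kurtosis = np.mean(((g - g.mean()) / g.std()) ** 4)
print(g.var(), var_theory, kurtosis)  # kurtosis near 3, as for a Gaussian
```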

Gaussian Random Processes (cont’d). Let n(t) be a random process (not necessarily Gaussian). Define “n” as follows: n = the integral over T of n(t)c(t), divided by W, where W = the integral over T of c(t)c(t). Then we can write n(t) as follows: n(t) = n c(t) + “the rest of n(t)”.
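This decomposition can be verified on a discrete grid: with n defined as above, “the rest of n(t)” is orthogonal to c(t) by construction. The waveform and noise statistics below are illustrative assumptions:

```python
import numpy as np

# Decompose one sample path as n(t) = n*c(t) + rest, and check orthogonality.
rng = np.random.default_rng(2)
K, dt = 1000, 0.001
t = np.arange(K) * dt
c = np.sin(2 * np.pi * t)             # known waveform c(t) (illustrative)
W = (c**2).sum() * dt                 # W = integral of c(t)^2 over T

n_t = rng.normal(0.0, 1.0, K)         # one sample path of n(t)
n = (n_t * c).sum() * dt / W          # n = (integral of n(t)c(t)) / W
rest = n_t - n * c                    # "the rest of n(t)"

# By construction, the rest carries no component along c(t):
print((rest * c).sum() * dt)          # = 0 up to rounding
```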

Gaussian Random Processes (cont’d). If n(t) is a “white, Gaussian random process”, then: n is a Gaussian random variable, and “the rest of n(t)” is statistically independent of “n”, i.e., “the rest of n(t)” contains no information that can help us estimate either “n” or “a”.

Gaussian Random Processes (cont’d). Furthermore, we can build a correlator that works as follows. It takes the received waveform r(t), multiplies it by the known waveform c(t), integrates over the time interval T, and finally divides by W: z = {the integral over T of r(t)c(t)}/W.
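A discrete sketch of this correlator, under illustrative choices of c(t), the value of “a”, and the noise level:

```python
import numpy as np

# Apply the correlator to r(t) = a*c(t) + n(t) on a discrete grid.
rng = np.random.default_rng(3)
K, dt = 1000, 0.001
t = np.arange(K) * dt
c = np.sin(2 * np.pi * t)             # known waveform c(t) (illustrative)
W = (c**2).sum() * dt                 # W = integral of c(t)^2 over T

a = 1.7                               # the information value (illustrative)
r = a * c + rng.normal(0.0, 1.0, K)   # received waveform r(t) = a c(t) + n(t)

z = (r * c).sum() * dt / W            # z = {integral of r(t)c(t)} / W
print(z)                              # = a plus a zero-mean Gaussian term
```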

Gaussian Random Processes (cont’d). Going back to the definitions and equations above, we find that z = a + n, where “a” is the original information variable we wish to estimate and n is a Gaussian random variable. Thus, by introducing the correlator, we convert the new problem to … (continued)

Gaussian Random Processes (cont’d). …the old problem of estimating a scalar (“a”) from another scalar (“r”), where r = a + n and n is a Gaussian random variable. The correlation operation is also known as “matched filtering”, because it can be accomplished by passing r(t) through a filter whose impulse response is c(-t).
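The equivalence between correlation and filtering with impulse response c(-t) can be checked directly: convolve r(t) with the time-reversed waveform and sample the output at the instant corresponding to t = 0. All waveforms and values below are illustrative assumptions:

```python
import numpy as np

# Correlation vs. matched filtering: sampling the filter output at t = 0
# reproduces the correlator's integral.
rng = np.random.default_rng(4)
K, dt = 1000, 0.001
t = np.arange(K) * dt
c = np.sin(2 * np.pi * t)                  # known waveform (illustrative)
r = 1.7 * c + rng.normal(0.0, 1.0, K)      # received r(t) = a c(t) + n(t)

corr = (r * c).sum() * dt                  # correlate r(t) with c(t) over T

h = c[::-1]                                # matched filter: h(t) = c(-t)
y = np.convolve(r, h) * dt                 # filter output y(t)
y0 = y[K - 1]                              # the sample aligned with t = 0

print(corr, y0)                            # the two numbers agree
```

The index K-1 picks out the full-convolution sample where the reversed filter exactly overlays the received block, which is the discrete analogue of sampling at t = 0.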

Capturing analog waveforms

Example (figure): information waveform vs. time

Example (figure): pulse stream vs. time

Example (figure): sampling

Example (figure): samples

Example (figure): pulse amplitude modulation (PAM)

Example (figure): PAM stream

Example: The PAM stream is a representation of the information signal.

Example (figure): s(t) = transmitted signal

Example (figure): r(t) = s(t) + n(t) = received signal

Matched Filtering (block diagram: input a x(t) → filter with impulse response h(t) = x(-t) → output y(t)).
y(t) = integral of h(t-u) a x(u) du = integral of x(u-t) a x(u) du
y(0) = integral of a x(u)x(u) du = aE

Matched Filtering (block diagram: input a x(t) + n(t) → filter with impulse response h(t) = x(-t) → output y(t)).
y(t) = integral of h(t-u) {a x(u) + n(u)} du = integral of x(u-t) {a x(u) + n(u)} du
y(0) = integral of x(u){a x(u) + n(u)} du = aE + n
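A discrete check of the two matched-filter identities above, with an illustrative pulse shape x(t) and noise level: the noiseless output at t = 0 equals aE exactly, and the noisy output differs from aE by the Gaussian term n.

```python
import numpy as np

# Verify y(0) = aE (noiseless) and y(0) = aE + n (noisy) on a discrete grid.
rng = np.random.default_rng(5)
K, dt = 800, 0.00125
t = np.arange(K) * dt
x = np.exp(-5 * t) * np.sin(10 * np.pi * t)   # pulse x(t) (illustrative)
E = (x**2).sum() * dt                          # pulse energy E

a = 0.9                                        # information value (illustrative)
y0_clean = ((a * x) * x).sum() * dt            # integral of x(u) * a x(u) du
print(y0_clean, a * E)                         # noiseless case: y(0) = aE

r = a * x + rng.normal(0.0, 0.2, K)            # now add Gaussian noise
y0_noisy = (r * x).sum() * dt                  # y(0) = aE + n
print(y0_noisy - a * E)                        # the residual is the noise term n
```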

Example. If each of the random variables “a” is a Gaussian random variable with variance A, and n(t) is “white” Gaussian noise with “spectral density” N, then the optimal estimate of each of the variables “a” is obtained by “sampling” the output of the matched filter and multiplying the sample by (1/E) [AE/(N+AE)].

Example. If each of the random variables “a” is a Gaussian random variable with variance A, and n(t) is “white” Gaussian noise with “spectral density” N, then the associated mean squared error of each estimate will be A [N/(N+AE)].
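The optimal scaling (1/E)[AE/(N+AE)] and the resulting mean squared error A[N/(N+AE)] can be checked together by Monte Carlo. The pulse shape and the values of A and N below are illustrative assumptions; the discrete noise samples are scaled so that their sums mimic white noise of spectral density N:

```python
import numpy as np

# Monte Carlo check of the optimal matched-filter scaling and its MSE.
rng = np.random.default_rng(6)
K, dt, trials = 200, 0.005, 20_000
t = np.arange(K) * dt
x = np.sin(2 * np.pi * t)                  # pulse shape x(t) (illustrative)
E = (x**2).sum() * dt                      # pulse energy

A, N = 1.0, 0.25                           # variance of a; noise spectral density
a = rng.normal(0.0, np.sqrt(A), trials)    # one Gaussian "a" per trial
noise = rng.normal(0.0, np.sqrt(N / dt), (trials, K))   # discrete "white" noise

y0 = ((a[:, None] * x + noise) * x).sum(axis=1) * dt    # samples y(0) = aE + n
a_hat = y0 * (1.0 / E) * (A * E / (N + A * E))          # optimal scaling
mse = np.mean((a_hat - a) ** 2)
print(mse, A * N / (N + A * E))            # empirical vs. predicted error
```

With this scaling, the discrete noise term in y(0) has variance N*E, which is what makes the predicted error A N/(N+AE) come out of the earlier scalar result.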