Outline
- Introduction
- Signal, random variable, random process and spectra
- Analog modulation
- Analog to digital conversion
- Digital transmission through baseband channels
- Signal space representation
- Optimal receivers
- Digital modulation techniques
- Channel coding
- Synchronization
- Information theory
Signal, random variable, random process and spectra
Signal, random variable, random process and spectra
- Signals
- Review of probability and random variables
- Random processes: basic concepts
- Gaussian and white processes
Selected from Chapters 2.1-2.6 and 5.1-5.3
Signal In communication systems, a signal is any function that carries information. It is also called an information-bearing signal.
Signal
- Continuous-time signal vs. discrete-time signal
- Continuous-valued signal vs. discrete-valued signal
- Continuous-time and continuous-valued: analog signal
- Discrete-time and discrete-valued: digital signal
- Discrete-time and continuous-valued: sampled signal
- Continuous-time and discrete-valued: quantized signal
Signal Energy vs. power signal
- Energy: E = ∫ |x(t)|^2 dt (over all time)
- Power: P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|^2 dt
A signal is an energy signal iff its energy is finite (0 < E < ∞).
A signal is a power signal iff its power is finite and nonzero (0 < P < ∞).
Signal Fourier transform (a numeric sketch follows below):
X(f) = ∫ x(t) e^{−j2πft} dt (analysis)
x(t) = ∫ X(f) e^{j2πft} df (synthesis)
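A minimal numeric sketch, not from the notes: the continuous Fourier transform of a rectangular pulse approximated with the FFT. The sampling rate and pulse width are assumed values.

```python
import numpy as np

# Approximate X(f) = int x(t) e^{-j 2 pi f t} dt for a rectangular pulse.
fs = 1000.0                                  # sampling rate in Hz (assumed)
t = np.arange(-1.0, 1.0, 1/fs)               # time grid
x = np.where(np.abs(t) <= 0.1, 1.0, 0.0)     # rectangular pulse, width ~0.2 s

# Riemann-sum approximation of the integral -> FFT scaled by the sample spacing
X = np.fft.fftshift(np.fft.fft(x)) / fs
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=1/fs))

# |X(f)| should follow |0.2 sinc(0.2 f)|; the peak at f = 0 equals the pulse area
print(f[np.argmax(np.abs(X))], np.abs(X).max())   # ~0.0 Hz, ~0.2
```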
Random variable Review of probability and random variables
Two events A and B:
- Conditional probability P(A|B)
- Joint probability P(AB) = P(A)P(B|A) = P(B)P(A|B)
- A and B are independent iff P(AB) = P(A)P(B)
Total probability: let A_1, ..., A_n be mutually exclusive events with P(A_1) + ... + P(A_n) = 1. Then for any event B, we have
P(B) = Σ_i P(A_i) P(B|A_i)
Random variable Review of probability and random variables
Bayes' rule: let A_1, ..., A_n be mutually exclusive such that P(A_1) + ... + P(A_n) = 1. For any nonzero-probability event B, we have
P(A_i|B) = P(A_i) P(B|A_i) / Σ_j P(A_j) P(B|A_j)
Random variable Review of probability and random variables
Example: consider a binary communication system (see the sketch below).
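A minimal numeric sketch of this example with assumed numbers (equal priors and a binary symmetric channel with crossover probability 0.1; the values are illustrative, not taken from the slides), applying the total probability theorem and Bayes' rule:

```python
# Assumed priors and crossover probability for a binary symmetric channel
p0, p1 = 0.5, 0.5        # P(send 0), P(send 1)
eps = 0.1                # P(receive 1 | send 0) = P(receive 0 | send 1)

# Total probability theorem: P(receive 1)
p_rx1 = p0*eps + p1*(1 - eps)

# Bayes' rule: P(send 1 | receive 1)
p_tx1_given_rx1 = p1*(1 - eps) / p_rx1
print(p_rx1, p_tx1_given_rx1)    # 0.5 and 0.9
```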
Random variable Review of probability and random variables
A random variable is a mapping from the sample space to the set of real numbers.
- Discrete r.v.: range is finite (e.g., {0,1}) or countably infinite (e.g., {0,1,...})
- Continuous r.v.: range is uncountably infinite (e.g., the real numbers)
Random variable Review of probability and random variables
The cumulative distribution function (CDF) of a r.v. X is
F_X(x) = P(X ≤ x)
Key properties of the CDF:
- 0 ≤ F_X(x) ≤ 1, with F_X(−∞) = 0 and F_X(+∞) = 1
- F_X(x) is non-decreasing in x
- P(a < X ≤ b) = F_X(b) − F_X(a)
Random variable Review of probability and random variables
The probability density function (PDF) of a r.v. X is
f_X(x) = dF_X(x)/dx
Key properties of the PDF:
- f_X(x) ≥ 0
- ∫ f_X(x) dx = 1 (over all x)
- P(a < X ≤ b) = ∫_a^b f_X(x) dx
Random variable Review of probability and random variables
Common random variables: Bernoulli, binomial, uniform, and Gaussian.
Bernoulli distribution: P(X = 1) = p, P(X = 0) = 1 − p.
Binomial distribution: the sum of n independent Bernoulli r.v.s,
P(X = k) = C(n,k) p^k (1 − p)^(n−k), k = 0, 1, ..., n
Random variable Review of probability and random variables
Example: suppose that we transmit a 31-bit sequence with error-correction capability of up to 3 bit errors. If the probability of a bit error is p = 0.001, what is the probability that the received sequence is in error? The sequence fails only if more than 3 bits are wrong:
P(error) = 1 − Σ_{k=0}^{3} C(31,k) p^k (1 − p)^(31−k)
On the other hand, if no error correction is used, a single bit error is enough, and the error probability is
P(error) = 1 − (1 − p)^31
(Both values are computed in the sketch below.)
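A short sketch evaluating the two formulas above for the 31-bit example:

```python
from math import comb

n, t, p = 31, 3, 0.001     # block length, correctable errors, bit-error probability

# With correction: word fails only if more than t bits are in error
p_coded = 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))

# Without correction: any single bit error ruins the word
p_uncoded = 1 - (1 - p)**n

print(p_coded, p_uncoded)   # roughly 3e-8 vs roughly 3e-2
```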
Random variable Review of probability and random variables
Uniform distribution: f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise. The random phase of a sinusoid is often modeled as a uniform r.v. between 0 and 2π.
The mean or expected value of X is the first moment of X:
E[X] = ∫ x f_X(x) dx
Random variable Review of probability and random variables
The n-th moment of X is E[X^n]. If n = 2, we have the mean-squared value of X, E[X^2].
The n-th central moment is E[(X − E[X])^n]. If n = 2, we have the variance of X:
σ_X^2 = E[(X − E[X])^2] = E[X^2] − (E[X])^2
σ_X is called the standard deviation.
Random variable Review of probability and random variables
Gaussian distribution:
f_X(x) = (1/(√(2π) σ)) exp(−(x − μ)^2 / (2σ^2))
A Gaussian r.v. is completely determined by its mean μ and variance σ^2, and hence is usually denoted as N(μ, σ^2).
The most important distribution in communications!
Random variable Review of probability and random variables
The Q-function is a standard form for expressing error probabilities that have no closed form. It is the area under the tail of a Gaussian PDF with mean 0 and variance 1:
Q(x) = ∫_x^∞ (1/√(2π)) exp(−t^2/2) dt
Extremely important in error probability analysis!
Random variable Review of probability and random variables
Some key properties of the Q-function (a numeric sketch follows below):
- Q(x) is monotonically decreasing, with Q(0) = 1/2 and Q(−x) = 1 − Q(x)
- Craig's alternative form: Q(x) = (1/π) ∫_0^{π/2} exp(−x^2/(2 sin^2 θ)) dθ, x ≥ 0
- Upper bound: Q(x) ≤ (1/2) exp(−x^2/2), x ≥ 0
- For a Gaussian variable X ~ N(μ, σ^2): P(X > x) = Q((x − μ)/σ)
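A small sketch comparing three evaluations of Q(x): via the complementary error function, via Craig's integral form, and against the exponential upper bound. The test points are arbitrary.

```python
import numpy as np
from scipy.special import erfc
from scipy.integrate import quad

def Q(x):
    # Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * erfc(x / np.sqrt(2))

def Q_craig(x):
    # Craig's form: (1/pi) * int_0^{pi/2} exp(-x^2 / (2 sin^2 theta)) dtheta, x >= 0
    val, _ = quad(lambda th: np.exp(-x**2 / (2*np.sin(th)**2)), 0, np.pi/2)
    return val / np.pi

for x in (0.5, 1.0, 2.0, 3.0):
    # Last column is the upper bound 0.5*exp(-x^2/2); it should exceed Q(x)
    print(x, Q(x), Q_craig(x), 0.5*np.exp(-x**2/2))
```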
Random variable Review of probability and random variables
The joint distribution of two r.v.s X and Y is
F_{XY}(x, y) = P(X ≤ x, Y ≤ y)
and the joint PDF is
f_{XY}(x, y) = ∂^2 F_{XY}(x, y) / (∂x ∂y)
Random variable Review of probability and random variables
Marginal distribution: F_X(x) = F_{XY}(x, ∞)
Marginal density: f_X(x) = ∫ f_{XY}(x, y) dy
X and Y are said to be independent iff f_{XY}(x, y) = f_X(x) f_Y(y) for all x, y.
Random variable Review of probability and random variables
The correlation of the two r.v.s X and Y is E[XY].
The correlation of the two centered r.v.s X − E[X] and Y − E[Y] is called the covariance of X and Y:
cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
If cov(X, Y) = 0, then X and Y are called uncorrelated.
Random variable Review of probability and random variables
The covariance of X and Y normalized w.r.t. σ_X σ_Y is referred to as the correlation coefficient of X and Y:
ρ_{XY} = cov(X, Y) / (σ_X σ_Y)
If X and Y are independent, then they are uncorrelated. The converse is not true, except for Gaussian r.v.s.
Random variable Review of probability and random variables
Functions of a random variable: how do we obtain the PDF of the r.v. Y = g(X)? Two steps:
1. Calculate the CDF of Y through F_Y(y) = P(g(X) ≤ y)
2. Take the derivative of the CDF: f_Y(y) = dF_Y(y)/dy
For Y = g(X), in general
f_Y(y) = Σ_i f_X(x_i) / |g'(x_i)|, where the x_i are the solutions of g(x) = y
Random variable Review of probability and random variables
X_1, ..., X_n are jointly Gaussian iff their joint PDF is
f(x) = (1 / ((2π)^{n/2} |C|^{1/2})) exp(−(1/2)(x − m)^T C^{−1} (x − m))
where m is the mean vector and C is the covariance matrix.
Random variable Review of probability and random variables
In the case n = 2, with means μ_1, μ_2, variances σ_1^2, σ_2^2, and correlation coefficient ρ, we have
f(x_1, x_2) = (1 / (2π σ_1 σ_2 √(1 − ρ^2))) exp( −[ (x_1 − μ_1)^2/σ_1^2 − 2ρ(x_1 − μ_1)(x_2 − μ_2)/(σ_1 σ_2) + (x_2 − μ_2)^2/σ_2^2 ] / (2(1 − ρ^2)) )
Random variable Review of probability and random variables
And if X_1 and X_2 are uncorrelated, i.e., ρ = 0, the joint PDF factors into the product of the two marginal Gaussian PDFs. Hence, if X_1 and X_2 are Gaussian and uncorrelated, they are independent.
Random variable Review of probability and random variables If n random variables are jointly Gaussian, then any set of them is also jointly Gaussian. Jointly Gaussian r.v.s are completely characterized by mean vector and the covariance matrix. Any linear combination of the n r.v.s is Gaussian.
Random variable Review of probability and random variables
Law of large numbers: consider a sequence of uncorrelated r.v.s X_1, ..., X_n with the same mean m_X and variance σ^2, and let Y = (1/n) Σ_{i=1}^{n} X_i. Then
E[Y] = m_X and Var(Y) = σ^2/n → 0 as n → ∞
The average r.v. converges to the mean value.
Random variable Review of probability and random variables
Central limit theorem: if X_1, ..., X_n are i.i.d. r.v.s with the same mean m and variance σ^2, then as n → ∞
(1/√n) Σ_{i=1}^{n} (X_i − m)/σ → N(0, 1) in distribution
Thermal noise, which results from the random movements of many electrons, is therefore well modeled by a Gaussian distribution.
Random variable Review of probability and random variables
Consider for example a sequence of uniformly distributed r.v.s with mean 0 and variance 1/12: the PDF of the normalized sum approaches the Gaussian PDF as n grows (a simulation sketch follows below).
(Figure: PDF of the normalized sum for n = 1, 2, 4, 8, comparing the exact PDF with the Gaussian limit.)
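A minimal simulation of this example: sum n i.i.d. uniform r.v.s with mean 0 and variance 1/12, normalize, and compare the sample statistics with the standard Gaussian they converge to. The number of trials is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000
for n in (1, 2, 4, 8):
    s = rng.uniform(-0.5, 0.5, size=(trials, n)).sum(axis=1)   # sum of n uniforms
    z = s / np.sqrt(n / 12)                                     # normalized sum
    # P(Z <= 1) should approach the Gaussian value 1 - Q(1) ~ 0.8413 as n grows
    print(n, z.mean(), z.var(), np.mean(z <= 1.0))
```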
Random variable Review of probability and random variables
Rayleigh distribution:
f_R(r) = (r/σ^2) exp(−r^2/(2σ^2)), r ≥ 0
The Rayleigh distribution is usually used to model fading for non-line-of-sight (NLOS) signal transmissions. It is a very important distribution for analysis in mobile and wireless communications.
Random variable Review of probability and random variables
Consider for example h = a + jb, where a and b are i.i.d. Gaussian r.v.s with mean 0 and variance σ^2. Then the magnitude of h follows the Rayleigh distribution and the phase follows the uniform distribution (see the sketch below).
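A quick simulation of this example; σ = 1 and the sample size are assumed values:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0                                   # per-component standard deviation (assumed)
a = rng.normal(0, sigma, 100_000)
b = rng.normal(0, sigma, 100_000)
h = a + 1j*b
r, phi = np.abs(h), np.angle(h)

# Rayleigh mean is sigma*sqrt(pi/2); the phase should be uniform on (-pi, pi]
print(r.mean(), sigma*np.sqrt(np.pi/2))       # close to each other
print(phi.min(), phi.max())                   # spans roughly (-pi, pi]
```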
Random process and spectra Random processes: basic concepts
A random process (stochastic process, or random signal) is the evolution of random variables over time. A random process X(t, ω) is a function defined over time t and the sample space Ω, with its value at a specific time corresponding to a sample value.
- Given t = t_0, X(t_0, ω) is a random variable.
- Given ω = ω_0, X(t, ω_0) is a sample path function.
Random process and spectra Random processes: basic concepts
Let X(t) = A cos(2πf t + Θ), with Θ a random phase. For fixed amplitude and frequency, the random process has different sample path functions in different trials.
(Figure: sample paths from the first, second, and third trials.)
Random process and spectra Random processes: basic concepts
The statistics of the random variables at different times:
- CDF: F_X(x_1, ..., x_n; t_1, ..., t_n) = P(X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n)
- Expectation (mean) function: m_X(t) = E[X(t)]
- Auto-correlation function: R_X(t_1, t_2) = E[X(t_1) X(t_2)]
- Covariance: C_X(t_1, t_2) = R_X(t_1, t_2) − m_X(t_1) m_X(t_2)
Random process and spectra Random processes: basic concepts
Consider for example X(t) = A cos(2πf t + Θ) with Θ uniform on [0, 2π). We have
Expectation function: m_X(t) = E[A cos(2πf t + Θ)] = 0
Auto-correlation function (assuming A and f are fixed): R_X(t_1, t_2) = (A^2/2) cos(2πf (t_1 − t_2))
Random process and spectra Random processes: Stationary processes
A stochastic process is said to be (strictly) stationary if, for any n and any time shift Δ,
f_X(x_1, ..., x_n; t_1, ..., t_n) = f_X(x_1, ..., x_n; t_1 + Δ, ..., t_n + Δ)
Consequently, the expectation function is independent of time and the auto-correlation function only depends on the time difference.
Random process and spectra Random processes: Stationary processes
A random process is said to be wide-sense stationary (WSS) if
m_X(t) = m_X (a constant) and R_X(t_1, t_2) = R_X(t_1 − t_2)
A strictly stationary process satisfies the shift-invariance condition for any n and Δ, and is therefore also WSS.
Random process and spectra Random processes: Stationary processes
Some properties of a WSS process X(t):
- R_X(0) = E[X^2(t)] is the average power of the process
- R_X(τ) = R_X(−τ) (even function)
- |R_X(τ)| ≤ R_X(0)
If R_X(T) = R_X(0) for some T ≠ 0, then R_X(τ) is periodic with period T.
Random process and spectra Random processes: Stationary processes
Statistical (ensemble) averaging: E[X(t)], taken across realizations at a fixed time.
Time averaging: <x(t)> = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt, taken along a single sample path.
If the time averages equal the statistical averages, the process is said to be ergodic. (What does that mean? See the sketch below.)
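A small illustration of ergodicity for the random-phase sinusoid X(t) = A cos(2πf_0 t + Θ): the time average over one sample path should match the ensemble average over many realizations. A, f_0, the sampling rate, and the observation length are assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)
A, f0, fs, T = 1.0, 5.0, 1000.0, 10.0
t = np.arange(0, T, 1/fs)

# Time average over a single realization (one draw of Theta)
theta = rng.uniform(0, 2*np.pi)
time_avg = (A*np.cos(2*np.pi*f0*t + theta)).mean()

# Ensemble average at a fixed time over many realizations
thetas = rng.uniform(0, 2*np.pi, 10_000)
ensemble_avg = (A*np.cos(2*np.pi*f0*t[0] + thetas)).mean()

print(time_avg, ensemble_avg)   # both close to the true mean, 0
```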
Random process and spectra Random processes: Stationary processes Frequency domain characteristics
Random process and spectra Random processes: Stationary processes
Given a deterministic signal x(t), the power is
P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|^2 dt
Truncate x(t) to get an energy signal: x_T(t) = x(t) for |t| ≤ T/2 and 0 otherwise.
Parseval's theorem: ∫ |x_T(t)|^2 dt = ∫ |X_T(f)|^2 df
Hence P = lim_{T→∞} (1/T) ∫ |X_T(f)|^2 df.
Random process and spectra Random processes: Stationary processes
Considering x(t) as a sample path function of a random process X(t), the average power of X(t) is
P_X = lim_{T→∞} (1/T) E[ ∫ |X_T(t)|^2 dt ] = ∫ lim_{T→∞} (E[|X_T(f)|^2]/T) df
Power spectral density (PSD) of X(t):
S_X(f) = lim_{T→∞} E[|X_T(f)|^2] / T
Random process and spectra Random processes: Stationary processes
Example: consider a binary semi-random signal that takes the values ±A, equally likely and independently, in each symbol interval of duration T. Find its power spectral density.
Random process and spectra Random processes: Stationary processes
Wiener-Khinchin theorem: for a WSS process, the PSD is the Fourier transform of the auto-correlation function,
S_X(f) = ∫ R_X(τ) e^{−j2πfτ} dτ
Properties: S_X(f) is real, non-negative, and even, and the total power is P_X = R_X(0) = ∫ S_X(f) df.
Random process and spectra Random processes: Stationary processes
Consider for instance the random process X(t) = A cos(2πf_0 t + Θ), Θ uniform on [0, 2π). We have
R_X(τ) = (A^2/2) cos(2πf_0 τ)
Then,
S_X(f) = (A^2/4) [δ(f − f_0) + δ(f + f_0)]
(A Monte Carlo check of R_X(τ) is sketched below.)
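A Monte Carlo check of the auto-correlation above; A, f_0, and the reference time are assumed values:

```python
import numpy as np

rng = np.random.default_rng(3)
A, f0 = 2.0, 5.0
thetas = rng.uniform(0, 2*np.pi, 200_000)   # many realizations of the phase

t1 = 0.3                                    # arbitrary reference time
for tau in (0.0, 0.05, 0.1):
    x1 = A*np.cos(2*np.pi*f0*t1 + thetas)
    x2 = A*np.cos(2*np.pi*f0*(t1 - tau) + thetas)
    est = np.mean(x1*x2)                    # estimate of E[X(t1) X(t1 - tau)]
    print(tau, est, (A**2/2)*np.cos(2*np.pi*f0*tau))   # estimate vs theory
```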
Random process and spectra Random processes: Stationary processes
Given a binary random signal that takes the values ±A, equally likely and independently, in each interval of duration T, with a starting delay uniformly distributed over [0, T):
Then, we have
R_X(τ) = A^2 (1 − |τ|/T) for |τ| < T, and 0 otherwise
S_X(f) = A^2 T sinc^2(fT)
Random process and spectra Random processes: Stationary processes
Random process transmission through linear systems: Y(t) = ∫ h(s) X(t − s) ds, where h(t) is the impulse response.
Mean of the output: m_Y(t) = E[Y(t)] = ∫ h(s) m_X(t − s) ds; for a WSS input, m_Y = m_X ∫ h(s) ds = m_X H(0).
Random process and spectra Random processes: Stationary processes
Auto-correlation of the output: R_Y(τ) = R_X(τ) * h(τ) * h(−τ)
If the input is WSS, then the output is also WSS.
PSD of the output: S_Y(f) = S_X(f) |H(f)|^2
Random process and spectra Random processes: Stationary processes
Comparison between deterministic and random signals through a linear system: for a deterministic signal, Y(f) = H(f) X(f); for a WSS random signal, the spectra are related through the PSD, S_Y(f) = |H(f)|^2 S_X(f).
Random process and spectra Random processes: Stationary processes
Consider for example a WSS process passed through a linear filter; the input/output PSD relation above applies (see the sketch below).
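A simulation sketch of the relation S_Y(f) = S_X(f)|H(f)|^2 (the filter, noise level, and sampling rate are illustrative assumptions, not the example from the slides): pass approximately white Gaussian noise through a lowpass FIR filter and compare the measured output PSD with the prediction.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs = 1000.0
h = signal.firwin(numtaps=65, cutoff=100.0, fs=fs)   # lowpass FIR filter (assumed)

x = rng.normal(0, 1, 1_000_000)                      # white Gaussian input
y = signal.lfilter(h, 1.0, x)                        # filtered output

f, Sy = signal.welch(y, fs=fs, nperseg=4096)         # measured output PSD
_, Sx = signal.welch(x, fs=fs, nperseg=4096)         # measured input PSD
_, H = signal.freqz(h, worN=f, fs=fs)                # filter frequency response

pred = Sx * np.abs(H)**2                             # predicted S_Y(f)
print(Sy.max(), np.max(np.abs(Sy - pred)))           # deviation is small vs. the peak
```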
Random process and spectra Random processes: Gaussian and White processes
Gaussian process: a process X(t) is Gaussian if, for any n and any times t_1, ..., t_n, the r.v.s X(t_1), ..., X(t_n) are jointly Gaussian.
Some properties:
- If it is WSS, it is also strictly stationary.
- If the input to a linear system is a Gaussian process, the output is also a Gaussian process.
Random process and spectra Random processes: Gaussian and White processes
Consider for example thermal noise, which is often modeled as Gaussian and stationary with zero mean.
White noise: a process whose PSD is flat over all frequencies,
S_n(f) = N_0/2 for all f, so R_n(τ) = (N_0/2) δ(τ)
Random process and spectra Random processes: Gaussian and White processes
Band-limited noise: white noise passed through an ideal low-pass filter of bandwidth B,
S_n(f) = N_0/2 for |f| ≤ B (0 otherwise), so R_n(τ) = N_0 B sinc(2Bτ)
Think about how this relates to the white-noise case as B → ∞.
Random process and spectra Random processes: Gaussian and White processes
Band-pass noise: noise whose PSD is concentrated around a carrier frequency f_c.
Canonical form of a band-pass noise process:
n(t) = n_c(t) cos(2πf_c t) − n_s(t) sin(2πf_c t)
where n_c(t) is the in-phase component and n_s(t) is the quadrature component.
Random process and spectra Random processes: Gaussian and White processes
Consider the band-pass noise. If n(t) is a zero-mean, stationary Gaussian noise, then n_c(t) and n_s(t) satisfy the following properties:
- n_c(t) and n_s(t) are zero-mean, jointly stationary, and jointly Gaussian processes.
- n_c(t) and n_s(t) have the same variance (average power) as n(t).
Random process and spectra Random processes: Gaussian and White processes
Consider the band-pass noise. Angular (envelope-phase) representation of n(t):
n(t) = r(t) cos(2πf_c t + φ(t))
with r(t) = √(n_c^2(t) + n_s^2(t)) and φ(t) = arctan(n_s(t)/n_c(t)).
Random process and spectra Random processes: Gaussian and White processes
Find the statistics of the envelope and phase of the angular representation of n(t). (Hint: compare with the Rayleigh example earlier; a simulation sketch follows below.)
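A simulation sketch of this exercise (the sampling rate, carrier, and filter bandwidth are assumed values): generate band-pass Gaussian noise, extract its envelope and phase from the analytic signal, and check that the envelope behaves like a Rayleigh r.v. and the phase like a uniform r.v.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
fs, fc = 10_000.0, 1000.0
w = rng.normal(0, 1, 500_000)                                  # white Gaussian noise

# Band-pass filter around fc (assumed 200 Hz bandwidth), transient dropped
sos = signal.butter(4, [fc - 100, fc + 100], btype="bandpass", fs=fs, output="sos")
n = signal.sosfilt(sos, w)[10_000:]

analytic = signal.hilbert(n)                 # analytic signal = (n_c + j n_s) e^{j 2 pi fc t}
t = np.arange(len(n)) / fs
baseband = analytic * np.exp(-2j*np.pi*fc*t) # remove the carrier -> n_c + j n_s
r, phi = np.abs(baseband), np.angle(baseband)

sigma = n.std()                              # std of n(t) (= std of n_c and n_s)
print(r.mean(), sigma*np.sqrt(np.pi/2))      # Rayleigh mean check: close to each other
print(np.histogram(phi, bins=8, range=(-np.pi, np.pi))[0])   # roughly equal counts
```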
Random process and spectra Random processes: Gaussian and White processes Overview
Signal, random variable, random process and spectra Suggested reading & Homework Chapters 2.1-2.6 and 5.1-5.3 of the textbook