1
Outline
Introduction
Signal, random variable, random process and spectra
Analog modulation
Analog to digital conversion
Digital transmission through baseband channels
Signal space representation
Optimal receivers
Digital modulation techniques
Channel coding
Synchronization
Information theory
2
Signal, random variable, random process and spectra
3
Signal, random variable, random process and spectra
Signals
Review of probability and random variables
Random processes: basic concepts
Gaussian and White processes
Selected from Chapter ,
4
Signal In communication systems, a signal is any function that carries information. Also called an information-bearing signal
5
Signal Continuous-time signal vs. discrete-time signal
Continuous-valued signal vs. discrete-valued signal
Continuous-time and continuous-valued: analog signal
Discrete-time and discrete-valued: digital signal
Discrete-time and continuous-valued: sampled signal
Continuous-time and discrete-valued: quantized signal
6
Signal
7
Signal Energy vs. power signal
Energy: E = ∫ |x(t)|² dt (integrated over all time)
Power: P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
A signal is an energy signal iff its energy is finite (0 < E < ∞)
A signal is a power signal iff its power is finite and nonzero (0 < P < ∞)
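Two quick worked checks (not from the slide) make the distinction concrete, using a rectangular pulse and a sinusoid:

```latex
% Rectangular pulse of amplitude A on [0, T]: finite energy, zero power -> energy signal
E = \int_0^T A^2\,dt = A^2 T < \infty, \qquad
P = \lim_{T_0 \to \infty} \frac{A^2 T}{T_0} = 0.
% Sinusoid x(t) = A\cos(2\pi f_0 t): infinite energy, finite power -> power signal
P = \lim_{T_0 \to \infty} \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} A^2 \cos^2(2\pi f_0 t)\,dt = \frac{A^2}{2}.
```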
8
Signal Fourier Transform
X(f) = ∫ x(t) e^{−j2πft} dt, and the inverse transform x(t) = ∫ X(f) e^{j2πft} df
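As a rough numerical illustration (not from the slides), the transform of a rectangular pulse can be approximated with an FFT and compared against the known T·sinc(fT) result; the sample rate and pulse width below are assumed values:

```python
import numpy as np

fs = 1000.0                                   # sample rate in Hz (assumed)
T = 0.1                                       # pulse duration in s (assumed)
t = np.arange(-1.0, 1.0, 1.0 / fs)
x = np.where(np.abs(t) <= T / 2, 1.0, 0.0)    # rectangular pulse of unit amplitude

# Riemann-sum approximation of X(f) = integral of x(t) exp(-j 2 pi f t) dt
X = np.fft.fftshift(np.fft.fft(x)) / fs
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=1.0 / fs))

X_theory = T * np.sinc(f * T)                 # np.sinc(u) = sin(pi u)/(pi u)
err = np.max(np.abs(np.abs(X) - np.abs(X_theory)))
print(err)                                    # small compared with the peak value T
```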
9
Random variable Review of probability and random variables
Two events A and B
Conditional probability P(A|B)
Joint probability P(AB)=P(A)P(B|A)=P(B)P(A|B)
A and B are independent iff P(AB)=P(A)P(B)
Total probability: let A_1, …, A_n be mutually exclusive events with A_1 ∪ … ∪ A_n = S (the whole sample space). Then for any event B, we have P(B) = Σ_i P(A_i) P(B|A_i)
10
Random variable Review of probability and random variables
Bayes’ Rule: Let A_1, …, A_n be mutually exclusive such that A_1 ∪ … ∪ A_n = S. For any nonzero-probability event B, we have P(A_i|B) = P(A_i) P(B|A_i) / Σ_j P(A_j) P(B|A_j)
11
Random variable Review of probability and random variables Consider a binary communication system
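A minimal sketch (not from the slides) of how Bayes' rule applies to such a system, assuming a binary symmetric channel with an illustrative prior p0 and crossover probability eps:

```python
# Minimal sketch (assumed values): posterior probabilities in a binary
# symmetric channel, using total probability and Bayes' rule.
p0, eps = 0.6, 0.1                      # P(X=0) and crossover probability (assumed)
prior = {0: p0, 1: 1 - p0}

def likelihood(y, x):
    """P(Y=y | X=x) for a binary symmetric channel."""
    return 1 - eps if y == x else eps

# Total probability: P(Y=0) = sum_x P(X=x) P(Y=0 | X=x)
p_y0 = sum(prior[x] * likelihood(0, x) for x in (0, 1))

# Bayes' rule: P(X=0 | Y=0) = P(X=0) P(Y=0 | X=0) / P(Y=0)
post = prior[0] * likelihood(0, 0) / p_y0
print(p_y0, post)
```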
12
Random variable Review of probability and random variables
A random variable is a mapping from the sample space to the set of real numbers
Discrete r.v.: range is finite (e.g., {0,1}) or countably infinite (e.g., {0,1,…})
Continuous r.v.: range is uncountably infinite (e.g., the real line)
13
Random variable Review of probability and random variables
The cumulative distribution function (CDF) of a r.v. X is F_X(x) = P(X ≤ x)
The key properties of the CDF: 0 ≤ F_X(x) ≤ 1; F_X(x) is non-decreasing; F_X(−∞) = 0 and F_X(+∞) = 1
14
Random variable Review of probability and random variables
The probability density function (PDF) of a r.v. X is f_X(x) = dF_X(x)/dx
The key properties of the PDF: f_X(x) ≥ 0; ∫ f_X(x) dx = 1; P(a < X ≤ b) = ∫_a^b f_X(x) dx = F_X(b) − F_X(a)
15
Random variable Review of probability and random variables
Common random variables: Bernoulli, Binomial, Uniform, and Gaussian
Bernoulli distribution: P(X=1) = p, P(X=0) = 1−p
Binomial distribution: the sum of n independent Bernoulli r.v.s, P(X=k) = C(n,k) p^k (1−p)^(n−k), k = 0, 1, …, n
16
Random variable Review of probability and random variables
Suppose that we transmit a 31-bit sequence with error correction capability of up to 3 bit errors
If the probability of a bit error is p = 0.001, what is the probability that the received sequence is in error? It is the probability of more than 3 bit errors: P_e = Σ_{k=4}^{31} C(31,k) p^k (1−p)^(31−k) ≈ 3×10^−8
On the other hand, if no error correction is used, the error probability is 1 − (1−p)^31 ≈ 0.03
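A small script evaluating both probabilities directly from the binomial PMF (the structure of the check is assumed; the numbers are those of the example above):

```python
from math import comb

n, p, t = 31, 1e-3, 3        # block length, bit error probability, correctable errors

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# With correction: a block error requires more than t bit errors
p_block_coded = 1 - sum(binom_pmf(k, n, p) for k in range(t + 1))

# Without correction: any single bit error ruins the block
p_block_uncoded = 1 - (1 - p)**n

print(p_block_coded)    # roughly 3e-8
print(p_block_uncoded)  # roughly 0.03
```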
17
Random variable Review of probability and random variables
Uniform distribution: f_X(x) = 1/(b−a) for a ≤ x ≤ b, and 0 otherwise
The random phase of a sinusoid is often modeled as a uniform r.v. between 0 and 2π
The mean or expected value of X is E[X] = ∫ x f_X(x) dx, the first moment of X
18
Random variable Review of probability and random variables
The n-th moment of X: E[X^n] = ∫ x^n f_X(x) dx
If n=2, we have the mean-squared value of X
The n-th central moment is E[(X − E[X])^n]
If n=2, we have the variance of X: σ_X² = E[(X − E[X])²] = E[X²] − (E[X])²
σ_X is called the standard deviation
19
Random variable Review of probability and random variables
Gaussian distribution: f_X(x) = (1/√(2πσ²)) exp(−(x−m)²/(2σ²))
A Gaussian r.v. is completely determined by its mean and variance, and hence is usually denoted as N(m, σ²)
The most important distribution in communications!
20
Random variable Review of probability and random variables
The Q-function is a standard form to express error probabilities that have no closed form: Q(x) = (1/√(2π)) ∫_x^∞ exp(−t²/2) dt
The Q-function is the area under the tail of a Gaussian pdf with mean 0 and variance 1
Extremely important in error probability analysis!
21
Random variable Review of probability and random variables
Some key properties of the Q-function
The Q-function is monotonically decreasing, with Q(0) = 1/2 and Q(−x) = 1 − Q(x)
Craig’s alternative form of the Q-function: Q(x) = (1/π) ∫_0^{π/2} exp(−x²/(2 sin²θ)) dθ, x ≥ 0
Upper bound: Q(x) ≤ (1/2) exp(−x²/2), x ≥ 0
For a Gaussian variable X ~ N(m, σ²): P(X > x) = Q((x − m)/σ)
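A short numerical check (not from the slides) that evaluates Q(x) through the complementary error function, Q(x) = (1/2) erfc(x/√2), and compares it with the upper bound above:

```python
from math import erfc, sqrt, exp

def Q(x):
    """Gaussian tail probability via the complementary error function."""
    return 0.5 * erfc(x / sqrt(2))

for x in (0.0, 1.0, 2.0, 3.0, 4.0):
    bound = 0.5 * exp(-x**2 / 2)
    print(x, Q(x), bound)      # the bound is always >= Q(x) for x >= 0
```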
22
Random variable Review of probability and random variables
The joint distribution of two r.v.s X and Y is F_XY(x, y) = P(X ≤ x, Y ≤ y)
And the joint PDF is f_XY(x, y) = ∂²F_XY(x, y)/∂x∂y
23
Random variable Review of probability and random variables
Marginal distribution: F_X(x) = F_XY(x, ∞)
Marginal density: f_X(x) = ∫ f_XY(x, y) dy
X and Y are said to be independent iff f_XY(x, y) = f_X(x) f_Y(y) for all x and y
24
Random variable Review of probability and random variables
The correlation of the two r.v.s X and Y is E[XY]
The correlation of the two centered r.v.s X−E[X] and Y−E[Y] is called the covariance of X and Y: Cov(X,Y) = E[(X−E[X])(Y−E[Y])] = E[XY] − E[X]E[Y]
If Cov(X,Y) = 0, then X and Y are called uncorrelated.
25
Random variable Review of probability and random variables
The covariance of X and Y normalized w.r.t. σ_X σ_Y is referred to as the correlation coefficient of X and Y: ρ_XY = Cov(X,Y)/(σ_X σ_Y)
If X and Y are independent, then they are uncorrelated. The converse is not true, except for jointly Gaussian r.v.s
26
Random variable Review of probability and random variables
Functions of a random variable: how to obtain the PDF of the r.v. Y = g(X)?
Two steps:
Calculate the CDF of Y through F_Y(y) = P(g(X) ≤ y)
Take the derivative of the CDF: f_Y(y) = dF_Y(y)/dy
For Y = g(X), generally f_Y(y) = Σ_i f_X(x_i)/|g′(x_i)|, where the x_i are the real roots of g(x) = y
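As a quick illustration of the two-step method (an assumed example, not from the slide), take the affine transformation Y = aX + b with a > 0:

```latex
F_Y(y) = P(aX + b \le y) = P\!\left(X \le \frac{y-b}{a}\right) = F_X\!\left(\frac{y-b}{a}\right)
\quad\Longrightarrow\quad
f_Y(y) = \frac{d}{dy} F_Y(y) = \frac{1}{a}\, f_X\!\left(\frac{y-b}{a}\right).
```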
27
Random variable Review of probability and random variables
X_1, X_2, …, X_n are jointly Gaussian iff their joint PDF is
f(x) = (2π)^(−n/2) |C|^(−1/2) exp( −(1/2)(x − m)^T C^(−1) (x − m) )
where x = (x_1, …, x_n)^T, m is the mean vector and C is the covariance matrix
28
Random variable Review of probability and random variables
In the case of n=2, we have
f(x_1, x_2) = 1/(2π σ_1 σ_2 √(1−ρ²)) exp{ −1/(2(1−ρ²)) [ (x_1−m_1)²/σ_1² − 2ρ(x_1−m_1)(x_2−m_2)/(σ_1 σ_2) + (x_2−m_2)²/σ_2² ] }
where ρ is the correlation coefficient of X_1 and X_2
29
Random variable Review of probability and random variables
And if X1 and X2 are uncorrelated, i.e., ρ = 0, the joint PDF factors as f(x_1, x_2) = f(x_1) f(x_2)
If X1 and X2 are jointly Gaussian and uncorrelated, they are independent.
30
Random variable Review of probability and random variables
If n random variables are jointly Gaussian, then any set of them is also jointly Gaussian. Jointly Gaussian r.v.s are completely characterized by mean vector and the covariance matrix. Any linear combination of the n r.v.s is Gaussian.
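A sketch of how jointly Gaussian vectors can be generated in practice, using a Cholesky factor of an illustrative covariance matrix (mean, covariance and coefficients below are assumed values); it also checks the linear-combination property numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
m = np.array([1.0, -2.0])              # mean vector (assumed)
C = np.array([[2.0, 0.8],
              [0.8, 1.0]])             # covariance matrix (assumed)

L = np.linalg.cholesky(C)              # C = L @ L.T
z = rng.standard_normal((100_000, 2))  # i.i.d. N(0,1) samples
x = z @ L.T + m                        # jointly Gaussian with mean m and covariance C

a = np.array([3.0, -1.0])              # coefficients of a linear combination
y = x @ a                              # should be Gaussian with mean a@m, variance a@C@a
print(y.mean(), a @ m)
print(y.var(), a @ C @ a)
```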
31
Random variable Review of probability and random variables
Law of large numbers
Consider a sequence of r.v.s X_1, X_2, …, X_n and let Y = (1/n) Σ_i X_i. If the r.v.s are uncorrelated with the same mean m_X and variance σ², then E[Y] = m_X and Var(Y) = σ²/n → 0, so P(|Y − m_X| > ε) → 0 as n → ∞ for any ε > 0
The average r.v. converges to the mean value
32
Random variable Review of probability and random variables
Central limit theorem
If X_1, X_2, …, X_n are i.i.d. r.v.s with the same mean m and variance σ², then the normalized sum (1/√n) Σ_i (X_i − m)/σ converges in distribution to N(0, 1) as n → ∞
Thermal noise, which results from the random movements of many electrons, is well modeled by a Gaussian distribution.
33
Random variable Review of probability and random variables
Consider for example a sequence of uniformly distributed r.v.s with mean 0 and variance 1/12. [Figure: PDF of the normalized sum for n = 1, 2, 4, 8 compared with the limiting Gaussian PDF (one curve dashed, one solid); the curves become indistinguishable as n grows.]
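A simulation sketch reproducing this experiment (sample sizes and plotting details are assumed):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x_axis = np.linspace(-4, 4, 400)
gauss = np.exp(-x_axis**2 / 2) / np.sqrt(2 * np.pi)   # N(0,1) reference PDF

for n in (1, 2, 4, 8):
    u = rng.uniform(-0.5, 0.5, size=(200_000, n))     # mean 0, variance 1/12 each
    s = u.sum(axis=1) / np.sqrt(n / 12.0)             # normalized to unit variance
    plt.hist(s, bins=100, density=True, histtype='step', label=f'n={n}')

plt.plot(x_axis, gauss, 'k--', label='N(0,1)')
plt.legend(); plt.xlabel('x'); plt.ylabel('pdf'); plt.show()
```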
34
Random variable Review of probability and random variables
Rayleigh distribution: f_R(r) = (r/σ²) exp(−r²/(2σ²)), r ≥ 0
The Rayleigh distribution is usually used to model fading for non-line-of-sight (NLOS) signal transmissions.
A very important distribution for analysis in mobile and wireless communications.
35
Random variable Review of probability and random variables
Consider for example h = a + jb, where a and b are i.i.d. Gaussian r.v.s with mean 0 and variance σ². Then the magnitude of h follows a Rayleigh distribution and the phase follows a uniform distribution.
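A short simulation (σ value assumed) checking the Rayleigh magnitude and uniform phase numerically through their first two moments:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0                                # assumed value
a = rng.normal(0.0, sigma, 200_000)
b = rng.normal(0.0, sigma, 200_000)
h = a + 1j * b

r, phi = np.abs(h), np.angle(h)
# Rayleigh moments: E[r] = sigma*sqrt(pi/2), E[r^2] = 2*sigma^2
print(r.mean(), sigma * np.sqrt(np.pi / 2))
print((r**2).mean(), 2 * sigma**2)
print(phi.min(), phi.max())                # phase spreads over (-pi, pi]
```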
36
Random process and spectra
Random processes: basic concepts
A random process (stochastic process, or random signal) is the evolution of random variables over time.
A random process X(t, s) is a function defined over time t and the sample space S, with its value at a specific time t_0 and outcome s corresponding to a sample value X(t_0, s).
Given t = t_0, X(t_0, s) is a random variable. Given s = s_0, X(t, s_0) is a sample path function (a realization of the process).
37
Random process and spectra
Random processes: basic concepts
Let X(t) = A cos(2πf_c t + Θ), where Θ is a random phase. For fixed amplitude and frequency, the random process has different sample path functions on different trials. [Figure: sample paths observed on the first, second and third trials.]
38
Random process and spectra
Random processes: basic concepts
The statistics of the random variables at different times:
CDF: F_X(x_1, …, x_n; t_1, …, t_n) = P(X(t_1) ≤ x_1, …, X(t_n) ≤ x_n)
Expectation function: m_X(t) = E[X(t)]
Auto-correlation function: R_X(t_1, t_2) = E[X(t_1) X(t_2)]
Covariance: C_X(t_1, t_2) = R_X(t_1, t_2) − m_X(t_1) m_X(t_2)
39
Random process and spectra
Random processes: basic concepts
Consider for example X(t) = A cos(2πf_c t + Θ) with Θ uniform on [0, 2π). We have
Expectation function: m_X(t) = E[A cos(2πf_c t + Θ)] = 0
Auto-correlation function (assuming Θ uniform on [0, 2π)): R_X(t_1, t_2) = E[X(t_1) X(t_2)] = (A²/2) cos(2πf_c (t_1 − t_2))
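A Monte Carlo sketch (amplitude, frequency and time instants are assumed values) checking both results above:

```python
import numpy as np

rng = np.random.default_rng(0)
A, fc = 2.0, 5.0                      # assumed amplitude and frequency
t1, t2 = 0.13, 0.05                   # two arbitrary time instants

theta = rng.uniform(0.0, 2 * np.pi, 500_000)
x1 = A * np.cos(2 * np.pi * fc * t1 + theta)
x2 = A * np.cos(2 * np.pi * fc * t2 + theta)

print(x1.mean())                                        # close to 0
print((x1 * x2).mean())                                 # sample autocorrelation
print(0.5 * A**2 * np.cos(2 * np.pi * fc * (t1 - t2)))  # theoretical value
```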
40
Random process and spectra
Random processes: Stationary processes
A stochastic process is said to be stationary if, for any n and any time shift τ, the joint distribution of X(t_1), …, X(t_n) is the same as that of X(t_1 + τ), …, X(t_n + τ)
Consequently, the expectation function is independent of time and the auto-correlation function only depends on the time difference
41
Random process and spectra
Random processes: Stationary processes
A random process is said to be wide-sense stationary (WSS) if m_X(t) = m_X is constant and R_X(t + τ, t) = R_X(τ) depends only on the time difference τ
A strictly stationary process satisfies the distributional invariance above for any n and any time shift, and is therefore also WSS
42
Random process and spectra
Random processes: Stationary processes
Some properties of the WSS auto-correlation function:
R_X(τ) = R_X(−τ) (even symmetry)
|R_X(τ)| ≤ R_X(0), and R_X(0) = E[X²(t)] is the average power
If R_X(T_0) = R_X(0) for some T_0 ≠ 0, then R_X(τ) is periodic with period T_0
43
Random process and spectra
Random processes: Stationary processes
Statistical (ensemble) averaging: E[X(t)], averaging across the sample paths at a fixed time
Time averaging: <x(t)> = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) dt, averaging a single sample path over time
If time averaging = statistical averaging, then the process is said to be ergodic. (What does it mean? The statistics of the whole ensemble can be estimated from one sufficiently long sample path.)
44
Random process and spectra
Random processes: Stationary processes
45
Random process and spectra
Random processes: Stationary processes Frequency domain characteristics
46
Random process and spectra
Random processes: Stationary processes
Given a deterministic signal x(t), the power is P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
Truncate x(t) to x_T(t) = x(t) for |t| ≤ T/2 (and 0 otherwise) to get an energy signal
Parseval’s theorem: ∫ |x_T(t)|² dt = ∫ |X_T(f)|² df
Hence P = lim_{T→∞} ∫ (|X_T(f)|²/T) df
47
Random process and spectra
Random processes: Stationary processes
Considering x(t) as a sample path function of a random process X(t), the average power of X(t) is P_X = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} E[X²(t)] dt = lim_{T→∞} ∫ (E[|X_T(f)|²]/T) df
Power spectral density of X(t): S_X(f) = lim_{T→∞} E[|X_T(f)|²]/T, so that P_X = ∫ S_X(f) df
48
Random process and spectra
Random processes: Stationary processes
Consider a binary semi-random signal taking the values ±A with equal probability in each bit interval. Find its power spectral density.
49
Random process and spectra
Random processes: Stationary processes
Wiener-Khinchin theorem: for a WSS process, the PSD is the Fourier transform of the auto-correlation function, S_X(f) = ∫ R_X(τ) e^{−j2πfτ} dτ, and R_X(τ) = ∫ S_X(f) e^{j2πfτ} df
Properties: S_X(f) is real, non-negative and even; R_X(0) = ∫ S_X(f) df is the average power
50
Random process and spectra
Random processes: Stationary processes
Consider for instance the random process X(t) = A cos(2πf_c t + Θ) with Θ uniform on [0, 2π). We have R_X(τ) = (A²/2) cos(2πf_c τ)
Then, S_X(f) = (A²/4) [δ(f − f_c) + δ(f + f_c)]
51
Random process and spectra
Random processes: Stationary processes
Given a binary random signal taking the values ±A with equal probability in each bit interval of duration T, with a random timing offset uniform over one bit interval, the auto-correlation function is R_X(τ) = A²(1 − |τ|/T) for |τ| < T, and 0 otherwise
52
Random process and spectra
Random processes: Stationary processes
Then, we have S_X(f) = A²T sinc²(fT), where sinc(x) = sin(πx)/(πx)
53
Random process and spectra
Random processes: Stationary processes
Random process transmission through linear systems: Y(t) = X(t) * h(t) = ∫ h(u) X(t − u) du
Mean of the output: m_Y(t) = ∫ h(u) m_X(t − u) du; for a WSS input, m_Y = m_X H(0)
54
Random process and spectra
Random processes: Stationary processes
Auto-correlation of the output: R_Y(τ) = R_X(τ) * h(τ) * h(−τ)
If the input is WSS, then the output is also WSS.
PSD of the output: S_Y(f) = |H(f)|² S_X(f)
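A simulation sketch (filter and parameters assumed) that passes white Gaussian noise through an FIR low-pass filter and compares a Welch PSD estimate of the output with |H(f)|² S_X(f); with Welch's one-sided convention, the flat input PSD is 2σ²/fs:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, sigma2, N = 1000.0, 1.0, 400_000
x = rng.normal(0.0, np.sqrt(sigma2), N)              # white input samples

h = signal.firwin(numtaps=64, cutoff=100, fs=fs)     # assumed low-pass filter
y = signal.lfilter(h, 1.0, x)                        # WSS output

f, S_y = signal.welch(y, fs=fs, nperseg=4096)        # estimated output PSD
_, H = signal.freqz(h, worN=f, fs=fs)                # filter response at the same bins
S_y_theory = (2 * sigma2 / fs) * np.abs(H)**2        # |H(f)|^2 * S_X(f), one-sided

# Agreement away from the DC and Nyquist bins, up to estimation error
print(np.max(np.abs(S_y[1:-1] - S_y_theory[1:-1])))
```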
55
Random process and spectra
Random processes: Stationary processes Comparison between deterministic and random signal
56
Random process and spectra
Random processes: Stationary processes Consider for example
57
Random process and spectra
Random processes: Gaussian and White processes
Gaussian process: X(t) is a Gaussian process if, for any n and any times t_1, …, t_n, the r.v.s X(t_1), …, X(t_n) are jointly Gaussian
Some properties
If it is WSS, it is strictly stationary.
If the input to a linear system is a Gaussian process, the output is also a Gaussian process.
58
Random process and spectra
Random processes: Gaussian and White processes
Consider for example thermal noise, which is often modeled as a Gaussian and stationary process with zero mean
White noise: noise with a flat power spectral density, S_N(f) = N_0/2 for all f, so that R_N(τ) = (N_0/2) δ(τ)
59
Random process and spectra
Random processes: Gaussian and White processes
Band-limited noise: white noise passed through an ideal low-pass filter of bandwidth B, with S_N(f) = N_0/2 for |f| ≤ B (and 0 otherwise) and R_N(τ) = N_0 B sinc(2Bτ)
Think about what happens as B → ∞
60
Random process and spectra
Random processes: Gaussian and White processes
Band-pass noise
Canonical form of a band-pass noise process: n(t) = n_c(t) cos(2πf_c t) − n_s(t) sin(2πf_c t), where n_c(t) is the in-phase component and n_s(t) is the quadrature component
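A sketch (parameters assumed) that builds band-pass noise from low-pass in-phase and quadrature components according to the canonical form and checks its mean and variance:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, fc, B, N = 10_000.0, 2000.0, 200.0, 500_000      # assumed values
t = np.arange(N) / fs

lp = signal.firwin(numtaps=129, cutoff=B, fs=fs)        # low-pass filter of bandwidth B
nc = signal.lfilter(lp, 1.0, rng.normal(0.0, 1.0, N))   # in-phase component
ns = signal.lfilter(lp, 1.0, rng.normal(0.0, 1.0, N))   # quadrature component

n = nc * np.cos(2 * np.pi * fc * t) - ns * np.sin(2 * np.pi * fc * t)

print(n.mean())                        # close to 0
print(n.var(), nc.var(), ns.var())     # approximately equal variances
```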
61
Random process and spectra
Random processes: Gaussian and White processes
Consider the band-pass noise. If n(t) is a zero-mean, stationary and Gaussian noise, then nc(t) and ns(t) satisfy the following properties:
nc(t) and ns(t) are zero-mean, jointly stationary and jointly Gaussian processes.
nc(t) and ns(t) have the same variance (average power) as n(t).
62
Random process and spectra
Random processes: Gaussian and White processes
Consider the band-pass noise. Angular representation of n(t): n(t) = r(t) cos(2πf_c t + φ(t)), with envelope r(t) = √(n_c²(t) + n_s²(t)) and phase φ(t) = arctan(n_s(t)/n_c(t))
63
Random process and spectra
Random processes: Gaussian and White processes
Find the statistics of the envelope and phase of the angular representation of n(t). (As in the earlier example h = a + jb, the envelope turns out to be Rayleigh distributed and the phase uniform.)
64
Random process and spectra
Random processes: Gaussian and White processes Overview
65
Signal, random variable, random process and spectra
Suggested reading & Homework Chapter , Chapter of the textbook