1
Wireless Communication
Elec 534, Set I
September 9, 2007
Behnaam Aazhang
2
The Course
Light homework
Team project
– Individual paper presentations (mid October)
– Team project presentations (early December)
3
Multiuser Network
Multiple nodes, each with information to exchange
4
Outline
Transmission over simple channels
– Information theoretic approach
– Fundamental limits
– Approaching capacity
Fading channel models
– Multipath
– Rayleigh
– Rician
5
Outline
Transmission over fading channels
– Information theoretic approach
– Fundamental limits
– Approaching achievable rates
Communication with "additional" dimensions
– Multiple input multiple output (MIMO)
  – Achievable rates
  – Transmission techniques
– User cooperation
  – Achievable rates
  – Transmission techniques
6
Outline
Wireless networks
– Cellular radios
– Multiple access
  – Achievable rate region
  – Multiuser detection
– Random access
7
Why Information Theory?
Information is modeled as random
Information is quantified
Transmission of information
– Model driven
– Reliability measured
– Rate is established
8
Information
Entropy
– Higher entropy (more randomness) means higher information content
Random variable
– Discrete
– Continuous
9
Communication
[Diagram] Information is transmitted through a channel: mutual information measures the useful information delivered; noise contributes useless information; capacity is the maximum useful information.
10
Wireless
[Diagram] Wireless information transmission through a channel: in addition to noise (useless information), there is interference and randomness due to the channel itself; capacity is again the maximum useful information.
11
Multiuser Network
Multiple nodes, each with information to exchange
12
References
– C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, 1949.
– T. M. Cover and J. A. Thomas, Elements of Information Theory, 1991.
– R. Gallager, Information Theory and Reliable Communication, 1968.
– J. Proakis, Digital Communications, 4th edition.
– D. Tse and P. Viswanath, Fundamentals of Wireless Communication, 2005.
– A. Goldsmith, Wireless Communications, Cambridge University Press, 2005.
13
References
– E. Biglieri, J. Proakis, and S. Shamai, "Fading channels: Information-theoretic and communications aspects," IEEE Trans. Inform. Theory, 1999.
– A. Goldsmith and P. Varaiya, "Capacity of fading channels with channel side information," IEEE Trans. Inform. Theory, 1997.
– I. E. Telatar, "Capacity of multi-antenna Gaussian channels," European Trans. Telecommun., 1999.
– A. Sendonaris, E. Erkip, and B. Aazhang, "User cooperation diversity, Part I: System description," IEEE Trans. Commun., Nov. 2003.
– A. Sendonaris, E. Erkip, and B. Aazhang, "User cooperation diversity, Part II: Implementation aspects and performance analysis," IEEE Trans. Commun., Nov. 2003.
– J. N. Laneman, D. N. C. Tse, and G. W. Wornell, "Cooperative diversity in wireless networks: Efficient protocols and outage behavior," IEEE Trans. Inform. Theory, Dec. 2004.
– M. A. Khojastepour, A. Sabharwal, and B. Aazhang, "On capacity of Gaussian 'cheap' relay channel," GLOBECOM, Dec. 2003.
14
Reading for Set 1
Tse and Viswanath
– Chapters 5.1–5.3, 3.1
– Appendices A, B.1–B.5
Goldsmith
– Chapters 1, 4.1, 5
– Appendices A, B, C
15
Single Link AWGN Channel
Model: r(t) = b(t) + n(t), where r(t) is the baseband received signal, b(t) is the information-bearing signal, and n(t) is noise. The signal b(t) is assumed band-limited to W, and the transmission interval is T. The dimension of the signal space is N = 2WT.
16
Signal Dimensions
A signal with bandwidth W can be sampled at the Nyquist rate: W complex (independent) samples per second. Each complex sample is one dimension or degree of freedom. A signal of duration T and bandwidth W therefore has 2WT real degrees of freedom and can be represented in 2WT real dimensions.
17
Signals in Time Domain
Sampled at the Nyquist rate, with samples spaced 1/W apart. Example: three independent samples per second means three degrees of freedom. [Figure: voltage vs. time, samples 1/W apart over 1 second]
18
Signal in Frequency Domain
Bandwidth W centered at carrier frequency f_c. [Figure: power spectrum of width W around f_c]
19
Baseband Signal in Frequency Domain
The passband signal is down-converted to baseband; bandwidth W. [Figure: power spectrum of width W around zero frequency]
20
Sampling
The baseband signal is sampled at rate W:
b(t) = Σ_n b[n] sinc(Wt − n), where sinc(x) = sin(πx)/(πx).
The sinc function is one example of an expansion basis.
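The sinc expansion above can be checked numerically. A minimal sketch in NumPy; the bandwidth, test tone, and query point below are illustrative choices, not values from the slides:

```python
import numpy as np

def sinc_reconstruct(samples, W, t):
    """Reconstruct a band-limited baseband signal at time t from samples
    taken at rate W, via b(t) = sum_k b[k] sinc(W t - k)."""
    k = np.arange(len(samples))
    # np.sinc uses the normalized convention sinc(x) = sin(pi x) / (pi x)
    return np.sum(samples * np.sinc(W * t - k))

# Sample a tone whose frequency lies well inside the band, then reconstruct
# it at an off-grid time instant.
W = 8.0                       # sampling rate (Hz), equal to the bandwidth
f0 = 1.0                      # tone frequency, below W/2
ts = np.arange(64) / W        # sample instants spaced 1/W apart
samples = np.cos(2 * np.pi * f0 * ts)

t_query = 2.37                # a time instant between samples
approx = sinc_reconstruct(samples, W, t_query)
exact = np.cos(2 * np.pi * f0 * t_query)
print(abs(approx - exact))    # small residual from truncating the sum
```

The residual comes only from truncating the infinite sum to 64 terms; with more samples around the query point it shrinks further.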
21
Model
There are N orthonormal basis functions f_i(t) representing the information signal space; time-shifted sinc pulses are one example. The discrete-time version of the model is r_i = b_i + n_i, i = 1, …, N.
22
Noise
Assumed to be a Gaussian process:
– Zero mean
– Wide-sense stationary
– Flat power spectral density with height N0/2
Passed through a filter with bandwidth W:
– Samples at rate W are Gaussian
– Samples are independent
23
Noise
The projections n_i of the noise onto the orthonormal bases f_i(t) are
– zero mean
– Gaussian
– of variance N0/2
24
Noise
The samples of noise are Gaussian and independent. Given the information samples, the received signal samples are also Gaussian.
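A quick numerical check of this noise model. Under the assumptions above the projections are i.i.d. N(0, N0/2), so the sketch below simply draws them directly; N0 and the sample count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
N0 = 2.0
var = N0 / 2                          # variance of each noise projection
n = rng.normal(0.0, np.sqrt(var), size=100_000)

print(n.mean())                       # close to 0 (zero mean)
print(n.var())                        # close to N0/2 = 1.0
# Whiteness: adjacent samples are uncorrelated, so the lag-1 sample
# correlation should be near 0.
rho = np.corrcoef(n[:-1], n[1:])[0, 1]
print(rho)
```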
25
Model
The discrete-time formulation can come from sampling the received signal at the Nyquist rate W. The final model: r_i = b_i + n_i, i = 1, …, N. The discrete-time model can be obtained either by projection onto the basis or by simple sampling.
26
Statistical Model
The key part of the model is the conditional density of the received samples given the information samples. Since the noise is assumed white, the discrete-time received signals are conditionally independent.
27
Entropy
Differential entropy: h(X) = −∫ p(x) log p(x) dx.
Differential conditional entropy: h(X|Y) = −∫∫ p(x, y) log p(x|y) dx dy.
28
Example
For a Gaussian random variable X with mean μ and variance σ², the differential entropy is h(X) = ½ log(2πeσ²). If X is complex (circularly symmetric) with variance σ², it is h(X) = log(πeσ²). Among all random variables with fixed variance, the Gaussian has the largest differential entropy.
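The closed-form Gaussian entropy can be verified by Monte Carlo, using h(X) = E[−log p(X)]. A small sketch (entropies in nats; σ² and the sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 2.0

# Closed form: h(X) = 0.5 * ln(2*pi*e*sigma^2) nats for X ~ N(0, sigma^2)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Monte Carlo: h(X) = E[-ln p(X)], estimated from samples of X
x = rng.normal(0.0, np.sqrt(sigma2), size=200_000)
log_pdf = -0.5 * np.log(2 * np.pi * sigma2) - x**2 / (2 * sigma2)
h_mc = -log_pdf.mean()

print(h_closed, h_mc)   # the two estimates agree closely
```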
29
Proof
Consider two zero-mean random variables X and Y with the same variance σ². Assume X is Gaussian, so E[X²] = σ².
30
Proof
The Kullback–Leibler distance D(p_Y ‖ p_X) = ∫ p_Y(y) log(p_Y(y)/p_X(y)) dy is nonnegative, due to Gibbs' inequality! Expanding the −log p_X term using the Gaussian density (and E[Y²] = σ²) then yields h(Y) ≤ h(X).
31
Gibbs' Inequality
The KL distance is nonnegative: D(p ‖ q) = ∫ p(x) log(p(x)/q(x)) dx ≥ 0, with equality if and only if p = q.
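Gibbs' inequality is easy to observe numerically for discrete distributions. A sketch with two arbitrarily chosen probability vectors:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p || q) = sum_i p_i * ln(p_i / q_i) in nats.
    Terms with p_i = 0 contribute zero, by convention."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))   # strictly positive, since p != q
print(kl_divergence(p, p))   # zero: equality holds iff p == q
```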
32
Capacity
Formally defined by Shannon as C = max I(b; r) over input distributions, where the mutual information is I(b; r) = h(r) − h(r|b).
33
Capacity
The maximum reliable rate of information through the channel under this model. In our model, C = max I(b; r) over input distributions satisfying the power constraint.
34
Mutual Information
[Diagram] Information flow through the channel: mutual information is the useful information; noise is useless information; capacity is the maximum useful information.
35
Capacity
In this model the maximum is achieved when the information vector has mutually independent, Gaussian-distributed elements.
36
AWGN Channel Capacity
With average information-signal power P and noise variance σ² = N0/2 per dimension, the capacity per real dimension is C = ½ log2(1 + P/σ²) bits.
37
AWGN Capacity
The original Shannon formula per unit time: C = W log2(1 + P/(N0 W)) bits per second. An alternate form uses the energy per bit, C/W = log2(1 + (Eb/N0)(C/W)), which gives the minimum Eb/N0 = ln 2 ≈ −1.59 dB as C/W → 0.
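The Shannon formula per unit time is a one-liner to evaluate. A sketch with illustrative numbers (the bandwidth, noise density, and 15 dB SNR below are my choices, not from the slides):

```python
import numpy as np

def awgn_capacity(W, P, N0):
    """Shannon capacity of the AWGN channel in bits per second:
    C = W * log2(1 + P / (N0 * W)), with bandwidth W in Hz, average
    signal power P, and noise power spectral density N0."""
    snr = P / (N0 * W)
    return W * np.log2(1 + snr)

# Example: 1 MHz of bandwidth at 15 dB SNR
W = 1e6
N0 = 1e-9
P = N0 * W * 10**(15 / 10)    # choose power so that SNR = 15 dB

print(awgn_capacity(W, P, N0) / 1e6, "Mbit/s")   # roughly 5 Mbit/s
```

Doubling the SNR (in linear terms) adds at most one bit per dimension, while capacity grows only logarithmically in power, which is why bandwidth is so valuable at high SNR.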
38
Achievable Rate and Converse
– Construct a codebook in the N-dimensional signal space
– Law of large numbers
– Sphere packing
39
Sphere Packing
The number of spheres is bounded by a ratio of volumes. If the spheres do not overlap, then as N grows the probability of codeword error vanishes. Higher rates are not possible without overlap.
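The ratio-of-volumes count can be written out. A codeword of length N under power constraint P, hit by noise of variance σ² per dimension, lands with high probability inside a sphere of radius √(N(P+σ²)), while each noise sphere has radius √(Nσ²). A sketch of the resulting bound on the number of non-overlapping spheres M and the rate R:

```latex
M \le \frac{\bigl(\sqrt{N(P+\sigma^2)}\bigr)^{N}}{\bigl(\sqrt{N\sigma^2}\bigr)^{N}}
  = \left(1+\frac{P}{\sigma^2}\right)^{N/2},
\qquad
R = \frac{\log_2 M}{N} \le \frac{1}{2}\log_2\!\left(1+\frac{P}{\sigma^2}\right).
```

This matches the per-dimension AWGN capacity and is the geometric content of the converse.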
40
Achievable Rate and Converse
Construct a codebook conveying NR bits in N channel uses, i.e., 2^{NR} codewords.
41
Achieving Capacity
The information vector should have mutually independent, Gaussian-distributed elements.
The dimension N should be large
– Complexity
The source has information to transmit
– Full buffer
The channel is available
– No contention for access
– Point to point
42
Achieving Capacity
An accurate model is required:
– Statistical: noise
– Deterministic: linear channel
Signal model at the receiver:
– Timing
– Synchronization
43
Approaching Capacity
High SNR:
– Coded modulation with large constellation size
– Large constellations with binary codes
Low SNR:
– Binary modulation
– Turbo coding
– LDPC coding
44
Constellations and Coding