Postacademic Course on Telecommunications (20/4/00), Module-3: Transmission, Marc Moonen, Lecture-2: Limits of Communication, K.U.Leuven-ESAT/SISTA

Lecture-2: Limits of Communication
Problem statement: given a communication channel (bandwidth B) and an amount of transmit power, what is the maximum achievable transmission bit-rate (bits/sec) for which the bit-error-rate is (can be made) arbitrarily small?
- Shannon theory (1948)
- Recent topic: MIMO transmission (e.g. V-BLAST 1998, see also Lecture-1)

Overview
- `Just enough information about entropy' (Lee & Messerschmitt 1994): self-information, entropy, mutual information, ...
- Channel capacity (frequency-flat channel)
- Channel capacity (frequency-selective channel); example: multicarrier transmission
- MIMO channel capacity; example: wireless MIMO

`Just enough information about entropy' (I)
Consider a random variable X with sample space (`alphabet') A_X = {x_1, ..., x_M}.
The self-information in an outcome x_k is defined as I(x_k) = -log2 p(x_k), where p(x_k) is the probability of x_k (Hartley 1928).
`Rare events (low probability) carry more information than common events.'
`Self-information is the amount of uncertainty removed after observing x_k.'

`Just enough information about entropy' (II)
Consider a random variable X with sample space (`alphabet') A_X = {x_1, ..., x_M}.
The average information or entropy in X is defined as H(X) = -Σ_k p(x_k)·log2 p(x_k).
Because of the (base-2) log, information is measured in bits.

`Just enough information about entropy' (III)
Example: sample space (`alphabet') is {0,1}, with probability q for `1' and 1-q for `0', so that H(X) = -q·log2 q - (1-q)·log2(1-q):
- entropy = 1 bit if q = 1/2 (`equiprobable symbols')
- entropy = 0 bits if q = 0 or q = 1 (`no info in certain events')
[Figure: H(X) versus q, rising from 0 at q=0 to 1 bit at q=1/2 and back to 0 at q=1.]
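As a sanity check on the numbers above, the binary entropy can be evaluated directly (a minimal Python sketch, not part of the original slides; `binary_entropy` is our own name):

```python
import math

def binary_entropy(q):
    """Entropy H(X) in bits of a binary source with P(X=1) = q."""
    if q in (0.0, 1.0):
        return 0.0  # no information in certain events
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

print(binary_entropy(0.5))   # equiprobable symbols -> 1.0 bit
print(binary_entropy(0.0))   # certain event -> 0.0 bits
print(binary_entropy(0.11))  # a skewed source still carries about 0.5 bits
```

Note the symmetry binary_entropy(q) == binary_entropy(1-q): relabeling the symbols does not change the information content.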

`Just enough information about entropy' (IV)
`Bits' being a measure for entropy is slightly confusing (e.g. H(X) = 0.456 bits??), but the theory leads to results that agree with our intuition (and with a `bit' again being something that is either a `0' or a `1'), and to a spectacular theorem.
Example: an alphabet with M = 2^n equiprobable symbols has entropy = n bits, i.e. every symbol carries n bits.

`Just enough information about entropy' (V)
Consider a second random variable Y with sample space (`alphabet') A_Y = {y_1, ..., y_L}.
Y is viewed as a `channel output' when X is the `channel input'.
Observing Y tells something about X: p(x_k|y_l) is the probability for x_k after observing y_l.

`Just enough information about entropy' (VI)
Example-1: X plus noise, followed by a decision device, gives Y (finite alphabet for Y).
Example-2: X plus noise gives Y (infinitely large alphabet size for Y).
[Diagrams: X -> (+ noise) -> decision device -> Y; X -> (+ noise) -> Y.]

`Just enough information about entropy' (VII)
The average information or entropy in X is defined as H(X) = -Σ_k p(x_k)·log2 p(x_k).
The conditional entropy in X is defined as H(X|Y) = -Σ_k Σ_l p(x_k, y_l)·log2 p(x_k|y_l).
The conditional entropy is a measure of the average uncertainty about the channel input X after observing the output Y.

`Just enough information about entropy' (VIII)
The average information or entropy in X is defined as H(X) = -Σ_k p(x_k)·log2 p(x_k).
The conditional entropy in X is defined as H(X|Y) = -Σ_k Σ_l p(x_k, y_l)·log2 p(x_k|y_l).
The average mutual information is defined as I(X|Y) = H(X) - H(X|Y):
I(X|Y) is the uncertainty about X that is removed by observing Y.
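These definitions can be turned into a small numerical example (our own Python sketch, not from the course): for a binary symmetric channel with crossover probability e and equiprobable inputs, the mutual information works out to 1 - H(e) bits per channel use.

```python
import math

def mutual_information(p_x, p_y_given_x):
    """I(X|Y) = H(X) - H(X|Y) in bits, from input probabilities p_x[k]
    and channel transition probabilities p_y_given_x[k][l]."""
    K = len(p_x)
    L = len(p_y_given_x[0])
    # joint distribution p(x_k, y_l) and output distribution p(y_l)
    p_xy = [[p_x[k] * p_y_given_x[k][l] for l in range(L)] for k in range(K)]
    p_y = [sum(p_xy[k][l] for k in range(K)) for l in range(L)]
    hx = -sum(p * math.log2(p) for p in p_x if p > 0)
    # H(X|Y) = -sum_k sum_l p(x_k, y_l) * log2 p(x_k | y_l)
    hx_given_y = 0.0
    for k in range(K):
        for l in range(L):
            if p_xy[k][l] > 0:
                hx_given_y -= p_xy[k][l] * math.log2(p_xy[k][l] / p_y[l])
    return hx - hx_given_y

# Binary symmetric channel with crossover probability e, equiprobable inputs:
# I(X|Y) = 1 - H(e); for e = 0.1 this is about 0.531 bits per channel use.
e = 0.1
print(mutual_information([0.5, 0.5], [[1 - e, e], [e, 1 - e]]))
```

A perfect channel (e = 0) removes all 1 bit of uncertainty; a useless channel (e = 1/2) removes none.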

Channel Capacity (I)
The average mutual information is defined by:
- the channel, i.e. the transition probabilities p(y_l|x_k)
- but also by the input probabilities p(x_k)
Channel capacity (`per symbol' or `per channel use') is defined as the maximum of I(X|Y) over all possible choices of the input probabilities p(x_k).
A remarkably simple result: for a real-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is C = (1/2)·log2(1 + σ_x²/σ_n²), with σ_x² (σ_n²) the signal (noise) variance.

Channel Capacity (II)
A remarkable theorem (Shannon 1948): with R channel uses per second and channel capacity C (bits per channel use), a bit stream with bit-rate C·R (= capacity in bits/sec) can be transmitted with arbitrarily low probability of error.
= Upper bound for system performance!

Channel Capacity (II)
For a real-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is C = (1/2)·log2(1 + σ_x²/σ_n²).
For a complex-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is C = log2(1 + σ_x²/σ_n²).
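The two capacity formulas are easy to tabulate (a minimal Python sketch; the function names are ours, not from the slides):

```python
import math

def capacity_real(snr):
    """Capacity per channel use (bits) of a real-valued AWGN channel."""
    return 0.5 * math.log2(1 + snr)

def capacity_complex(snr):
    """Capacity per channel use (bits) of a complex-valued AWGN channel:
    twice the real-valued capacity (independent real/imaginary parts)."""
    return math.log2(1 + snr)

snr = 10 ** (20 / 10)  # 20 dB signal-to-noise ratio
print(capacity_real(snr))     # about 3.33 bits per channel use
print(capacity_complex(snr))  # about 6.66 bits per channel use
```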

Postacademic Course on Telecommunications 20/4/00 p. 14 Module-3 Transmission Marc Moonen Lecture-2 Limits of Communication K.U.Leuven-ESAT/SISTA Channel Capacity (III) Information I(X|Y) conveyed by a real-valued channel with additive white Gaussian noise, for different input alphabets, with all symbols in the alphabet equally likely (Ungerboeck 1982)

Postacademic Course on Telecommunications 20/4/00 p. 15 Module-3 Transmission Marc Moonen Lecture-2 Limits of Communication K.U.Leuven-ESAT/SISTA Channel Capacity (IV) Information I(X|Y) conveyed by a complex-valued channel with additive white Gaussian noise, for different input alphabets, with all symbols in the alphabet equally likely (Ungerboeck 1982)

Channel Capacity (V)
This shows that, as long as the alphabet is sufficiently large, there is no significant loss in capacity from choosing a discrete input alphabet, which justifies the usage of such alphabets!
The higher the SNR, the larger the alphabet required to approach channel capacity.

Channel Capacity (frequency-flat channels)
Up till now we considered capacity `per symbol' or `per channel use'.
A continuous-time channel with bandwidth B (Hz) allows 2B channel uses per second (*), i.e. 2B symbols being transmitted per second, hence the capacity is C = 2B·(1/2)·log2(1 + P_x/P_n) = B·log2(1 + P_x/P_n) bits/sec, with P_x (P_n) the received signal (noise) power.
(*) This is the Nyquist criterion `upside-down' (see also Lecture-3).

Channel Capacity (frequency-flat channels)
Example: AWGN baseband channel (additive white Gaussian noise channel):
r(t) = Ho·s(t) + n(t)
[Diagram: flat channel, H(f) = Ho for -B ≤ f ≤ B.]

Channel Capacity (frequency-flat channels)
Example: AWGN passband channel: a passband channel with bandwidth B accommodates a complex baseband signal with bandwidth B/2 (see Lecture-3):
r(t) = Ho·s(t) + n(t)
[Diagram: flat channel, H(f) = Ho for x ≤ f ≤ x+B.]
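The flat-channel capacity in bits/sec can be evaluated numerically (our own Python illustration, not part of the slides; `shannon_capacity` is a name we chose):

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley capacity C = B * log2(1 + P_signal/P_noise) in bits/sec."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# e.g. a 1 MHz flat channel at 30 dB received SNR: close to 10 Mbit/s
print(shannon_capacity(1e6, 1000.0, 1.0))
```

Doubling the bandwidth doubles the capacity, while doubling the SNR only adds B bits/sec (one extra bit per channel use pair): the log makes power a much weaker lever than bandwidth.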

Channel Capacity (frequency-selective channels)
Example: frequency-selective AWGN channel:
R(f) = H(f)·S(f) + N(f)
The received SNR is frequency-dependent!
[Diagram: non-flat |H(f)| over -B ≤ f ≤ B.]

Channel Capacity (frequency-selective channels)
Divide the bandwidth into small bins of width df, such that H(f) is approximately constant over each bin. The capacity is then C = ∫_0^B log2(1 + P(f)·|H(f)|²/N(f)) df, with P(f) the transmit power spectrum and N(f) the noise power spectrum.
What is the optimal transmit power spectrum?
[Figure: |H(f)| over -B..B, partitioned into bins of width df.]

Channel Capacity (frequency-selective channels)
Maximize ∫_0^B log2(1 + P(f)·|H(f)|²/N(f)) df subject to ∫_0^B P(f) df = available power.
The solution is the `water-pouring spectrum': P(f) = max(0, L - N(f)/|H(f)|²), where the constant `water level' L is chosen such that the allocated area equals the available power.
[Figure: water-pouring over the band B, transmit power filling the region between N(f)/|H(f)|² and the level L.]
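The water-pouring allocation can be computed numerically with a bisection search on the water level L, under the discrete-bin approximation of the previous slide (a sketch, not from the course; `water_filling` and its bin-based interface are our own):

```python
import math

def water_filling(inv_snr, total_power, bin_df=1.0, tol=1e-12):
    """Water-pouring power allocation over frequency bins.
    inv_snr[i] = N_i/|H_i|^2 (noise power over channel gain in bin i);
    returns (powers, capacity_bits) for bins of width bin_df Hz."""
    lo, hi = 0.0, max(inv_snr) + total_power / (len(inv_snr) * bin_df)
    while hi - lo > tol:
        level = 0.5 * (lo + hi)  # trial water level L
        used = sum(max(0.0, level - g) * bin_df for g in inv_snr)
        if used > total_power:
            hi = level  # too much water: lower the level
        else:
            lo = level
    powers = [max(0.0, lo - g) for g in inv_snr]
    capacity = sum(bin_df * math.log2(1 + p / g)
                   for p, g in zip(powers, inv_snr))
    return powers, capacity

# Two equal bins: the power splits evenly.
print(water_filling([1.0, 1.0], 2.0)[0])
# One very bad bin: all power goes to the good bin.
print(water_filling([1.0, 100.0], 2.0)[0])
```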

Channel Capacity (frequency-selective channels)
Example: multicarrier modulation. The available bandwidth is split up into different `tones'; every tone has a QAM-modulated carrier (modulation/demodulation by means of IFFT/FFT).
In ADSL, e.g., every tone is given (+/-) the same power, such that an upper bound for the capacity is obtained in the white-noise case (see Lecture-7/8).

MIMO Channel Capacity (I)
SISO = `single-input/single-output'
MIMO = `multiple-input/multiple-output'
Question: we usually think of channels with one transmitter and one receiver. Could there be any advantage in using multiple transmitters and/or receivers (e.g. multiple transmit/receive antennas in a wireless setting)???
Answer: You bet...

MIMO Channel Capacity (II)
2-input/2-output example:
Y1 = A·X1 + B·X2 + N1
Y2 = C·X1 + D·X2 + N2
[Diagram: 2x2 channel with gains A, B, C, D and additive noises N1, N2.]

MIMO Channel Capacity (III)
Rules of the game:
- P transmitters means that the same total power is distributed over the available transmitters (no cheating)
- Q receivers means every receive signal is corrupted by the same amount of noise (no cheating)
- Noises on different receivers are often assumed to be uncorrelated (`spatially white'), for simplicity

MIMO Channel Capacity (IV)
2-in/2-out example, frequency-flat channels (first example/attempt):
Y1 = Ho·X1 + N1
Y2 = Ho·X2 + N2
[Diagram: two parallel channels, each with gain Ho and additive noise.]

MIMO Channel Capacity (V)
2-in/2-out example, frequency-flat channels: this corresponds to two separate channels, each with input power σ_x²/2 and additive noise with variance σ_n², so the total capacity is 2·log2(1 + Ho²·σ_x²/(2·σ_n²)).
Room for improvement...

MIMO Channel Capacity (VI)
2-in/2-out example, frequency-flat channels (second example/attempt):
Y1 = Ho·X1 + Ho·X2 + N1
Y2 = Ho·X1 - Ho·X2 + N2
[Diagram: 2x2 channel with gains Ho and -Ho and additive noises N1, N2.]

MIMO Channel Capacity (VII)
A little linear algebra...: the channel matrix can be factored as
[[Ho, Ho], [Ho, -Ho]] = (√2·Ho)·V', with V' = (1/√2)·[[1, 1], [1, -1]].

MIMO Channel Capacity (VIII)
A little linear algebra... (continued): matrix V is `orthogonal' (V'·V = I), which means that it represents a transformation that conserves energy/power.
Use X = V·X^ as a transmitter pre-transformation; then (use V'·V = I) the channel decouples into two independent channels, each with gain √2·Ho.
Dig up your linear algebra course notes...

MIMO Channel Capacity (IX)
Then...
[Diagram: transmitter pre-transformation V (entries V11, V12, V21, V22), followed by the channel (gains A, B, C, D) with additive noises N1, N2; inputs X^1, X^2, outputs Y1, Y2.]

MIMO Channel Capacity (X)
This corresponds to two separate channels, each with input power σ_x²/2, output signal power Ho²·σ_x², and additive noise with variance σ_n².
The total capacity is 2·log2(1 + Ho²·σ_x²/σ_n²) = 2x the SISO capacity!

MIMO Channel Capacity (XI)
Conclusion: in general, with P transmitters and P receivers, capacity can be increased by a factor of up to P (!)
But: one has to be `lucky' with the channel (cfr. the two `attempts/examples').
Example: V-BLAST (Lucent 1998), up to 40 bits/sec/Hz in a `rich scattering environment' (reflectors, ...).

MIMO Channel Capacity (XII)
General I/O-model is Y = H·X + N. Every H may be decomposed as H = U·S·V', with S a diagonal matrix, U an orthogonal matrix (U'·U = I) and V an orthogonal matrix (V'·V = I).
This is called a `singular value decomposition', and works for every matrix (check your MatLab manuals).

MIMO Channel Capacity (XIII)
With H = U·S·V', V is used as a transmitter pre-transformation (preserves transmit energy) and U' is used as a receiver transformation (preserves the noise energy on every channel).
S = diagonal matrix, represents the resulting, effectively `decoupled' (SISO) channels.
The overall capacity is the sum of the SISO capacities.
Power allocation over the SISO channels (and as a function of frequency): water-pouring.
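The SVD-based decoupling can be illustrated numerically (our own NumPy sketch, not from the course; for simplicity the total power is split equally over the non-zero singular values instead of water-pouring):

```python
import numpy as np

def mimo_capacity_equal_power(H, total_power, noise_power=1.0):
    """Capacity (bits per channel use) of Y = H.X + N, using the SVD
    H = U.S.V' to decouple the MIMO channel into SISO channels and
    splitting the total power equally over the non-zero channels."""
    s = np.linalg.svd(H, compute_uv=False)  # singular values (diagonal of S)
    k = int(np.count_nonzero(s > 1e-12))    # number of decoupled channels
    p = total_power / k                     # equal split (no water-pouring)
    return float(sum(np.log2(1 + p * sv**2 / noise_power) for sv in s[:k]))

# A `lucky' channel (orthogonal columns) yields two decoupled channels,
# a rank-1 channel yields only one:
H_good = np.array([[1.0, 1.0],
                   [1.0, -1.0]])
H_bad = np.array([[1.0, 0.0],
                  [1.0, 0.0]])
print(mimo_capacity_equal_power(H_good, 2.0))  # 2*log2(3), about 3.17
print(mimo_capacity_equal_power(H_bad, 2.0))   # log2(5), about 2.32
```

With the same total transmit power and noise levels, the full-rank channel supports two parallel streams and a higher total capacity, matching the `factor up to P' conclusion above.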

MIMO Channel Capacity (XIV)
Reference: G.G. Raleigh & J.M. Cioffi, `Spatio-temporal coding for wireless communication', IEEE Trans. on Communications, March 1998.

Assignment 1 (I)
1. Self-study material
Dig up your favorite (?) signal processing textbook & refresh your knowledge on:
- discrete-time & continuous-time signals & systems
- signal transforms (s- and z-transforms, Fourier)
- convolution, correlation
- digital filters
...will need this in the next lectures

Assignment 1 (II)
2. Exercise (MIMO channel capacity)
Investigate the channel capacity for:
- a SIMO system with 1 transmitter, Q receivers
- a MISO system with P transmitters, 1 receiver
- a MIMO system with P transmitters, Q receivers: P=Q (see Lecture 2), P>Q, P<Q