Chapter 4: Information Theory
Learning Objectives
LO 4.1 – Understand discrete and continuous messages, message sources, the amount of information, and its measure.
LO 4.2 – Discuss the probabilistic behaviour of a source of information.
LO 4.3 – Illustrate the properties of a discrete memoryless channel and mutual information.
LO 4.4 – Analyze the intrinsic ability of the communication channel to convey information reliably.
4.1.1 Discrete and Continuous Messages
[Figure: An Analog Discrete-Time Signal, s(t) versus t]
[Figure: An Analog Continuous-Time Signal, s(t) versus t]
[Figure: A Digital Discrete-Time Signal, s(t) versus t]
[Figure: A Digital Continuous-Time Signal, s(t) versus t]
[Figure: A Digital Communication System with DMS — on the transmitter side, a Discrete Memoryless Source (DMS) feeds a binary source and channel encoder; the binary symbols (0, 1) pass through a binary symmetric channel subject to channel noise; on the receiver side, a binary source and channel decoder delivers the recovered symbols (0, 1) to the destination]
4.1.2 Amount of Information
The amount of information conveyed by a message x_k with probability of occurrence p_k is I(x_k) = log2(1/p_k).
Measure of Information:
Bit – the unit when the logarithm is taken to base 2
Nat – the unit for base e; 1 nat ≈ 1.44 bits
Decit or Hartley – the unit for base 10; 1 decit ≈ 3.32 bits
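A minimal sketch in Python of this measure, computing the same quantity in each unit (the probability value p below is assumed purely for illustration):

import math

def information_content(p, base=2):
    # Amount of information I = log(1/p); base 2 gives bits,
    # base e gives nats, base 10 gives decits (hartleys).
    return math.log(1.0 / p, base)

p = 0.25                                  # hypothetical message probability
print(information_content(p, 2))          # 2.0 bits
print(information_content(p, math.e))     # ~1.386 nats (1 nat ~ 1.44 bits)
print(information_content(p, 10))         # ~0.602 decits (1 decit ~ 3.32 bits)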
4.2 Average Information and Entropy
The statistical average of the information per individual message generated by a source is known as entropy, H(X) = −Σ p_k log2(p_k), expressed in bits per symbol.
Concept of Information and Entropy
The information contained in a message depends on its probability of occurrence: the higher the probability of a particular message, the less information it contains, and vice versa. The entropy of a source is a measure of the average amount of information per source symbol in a long message, usually expressed in bits per symbol.
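The entropy formula above translates directly into code. This is a minimal sketch; the four-symbol probability distribution is a hypothetical example, not taken from the text:

import math

def entropy(probs):
    # H(X) = -sum_k p_k log2(p_k), in bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]  # hypothetical source probabilities
print(entropy(probs))              # 1.75 bits/symbol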
Properties of Entropy
H(X) = 0 if and only if one symbol occurs with probability 1 (no uncertainty).
H(X) is maximum and equal to log2(M) when all M symbols of the source are equiprobable (maximum uncertainty).
Hence 0 ≤ H(X) ≤ log2(M).
Entropy of Binary Memoryless Source
A binary source is said to be memoryless when it generates statistically independent successive symbols 0 and 1. If symbol 1 occurs with probability p, the entropy is H(p) = −p log2(p) − (1 − p) log2(1 − p), which reaches its maximum of 1 bit/symbol at p = 0.5.
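A short sketch of the binary entropy function defined above, evaluated at a few assumed probability values:

import math

def binary_entropy(p):
    # H(p) = -p log2(p) - (1 - p) log2(1 - p), in bits/symbol
    if p in (0.0, 1.0):
        return 0.0  # a certain symbol carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.5, 0.9):
    print(p, binary_entropy(p))  # ~0.469, 1.0, ~0.469; peak at p = 0.5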
Differential Entropy
The differential entropy of a continuous random variable X with probability density function f(x) is h(X) = −∫ f(x) log2 f(x) dx.
Properties of Differential Entropy
Unlike the entropy of a discrete source, differential entropy can be negative; it is a relative rather than an absolute measure of uncertainty.
Joint Entropy
The joint entropy H(X, Y) = −ΣΣ p(x, y) log2 p(x, y) is the average uncertainty of the communication channel as a whole, accounting for the entropy due to the channel input as well as the channel output.
Conditional Entropy
The conditional entropies H(X|Y) and H(Y|X) measure the average uncertainty remaining about the channel input after the channel output has been observed, and about the channel output after the channel input has been observed, respectively. They relate to the joint entropy by H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y).
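A minimal sketch computing joint and conditional entropy from a joint probability matrix; the 2×2 joint pmf below is an assumed example:

import math

def joint_entropy(pxy):
    # H(X,Y) = -sum over x,y of p(x,y) log2 p(x,y)
    return -sum(p * math.log2(p) for row in pxy for p in row if p > 0)

def conditional_entropy(pxy):
    # H(Y|X) = H(X,Y) - H(X), using the marginal p(x) = sum_y p(x,y)
    px = [sum(row) for row in pxy]
    hx = -sum(p * math.log2(p) for p in px if p > 0)
    return joint_entropy(pxy) - hx

pxy = [[0.4, 0.1],   # hypothetical joint pmf p(x,y)
       [0.1, 0.4]]
print(joint_entropy(pxy))        # H(X,Y) ~ 1.722 bits
print(conditional_entropy(pxy))  # H(Y|X) ~ 0.722 bits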
Average Effective Entropy
It is the difference between the entropy of the source and the conditional entropy of the message. If a discrete memoryless source with entropy H(X) bits per symbol generates r messages per second, then the information rate, or average information per second, is defined as R = r H(X) bits per second.
Coding of Information
4.3 Characteristics of a Discrete Memoryless Channel
A discrete memoryless channel is completely described by its channel matrix, or probability transition matrix, whose element in row i and column j is the conditional probability p(y_j | x_i) of receiving output symbol y_j when input symbol x_i is transmitted. Each row of the channel matrix sums to 1.
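As a sketch of how the channel matrix is used, the output distribution follows from p(y_j) = Σ_i p(x_i) p(y_j|x_i); the matrix and input distribution below are assumed for illustration:

def output_distribution(px, P):
    # p(y_j) = sum_i p(x_i) * p(y_j | x_i)
    return [sum(px[i] * P[i][j] for i in range(len(px)))
            for j in range(len(P[0]))]

P = [[0.9, 0.1],    # hypothetical row p(y | x = 0); rows sum to 1
     [0.2, 0.8]]    # hypothetical row p(y | x = 1)
px = [0.5, 0.5]     # assumed input distribution
print(output_distribution(px, P))  # [0.55, 0.45]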
Binary Symmetric Channel (BSC)
A BSC is a binary channel which can transmit only one of two symbols (0 and 1). Transmission over a BSC is not perfect: with crossover probability p, the receiver occasionally gets the wrong bit.
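A minimal simulation sketch of a BSC, flipping each transmitted bit independently with an assumed crossover probability:

import random

def bsc_transmit(bits, p):
    # Each bit is flipped independently with crossover probability p
    return [b ^ 1 if random.random() < p else b for b in bits]

random.seed(1)                    # fixed seed for a reproducible run
tx = [0, 1, 1, 0, 1, 0, 0, 1]     # hypothetical transmitted bits
rx = bsc_transmit(tx, p=0.1)
print(tx)
print(rx)                         # a few positions may differ from tx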
Binary Erasure Channel (BEC)
In a BEC, the input symbols are x0 = 0 and x1 = 1, and the output symbols are y0 = 0, y1 = 1, and ye = e (erasure). A transmitted bit is either received correctly, with probability p(y0|x0) or p(y1|x1), or erased, with probability p(ye|x0) or p(ye|x1); a bit is never received as its complement.
[Figure: A General Model of a Binary Erasure Channel]
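A corresponding simulation sketch for the BEC, with an assumed erasure probability:

import random

def bec_transmit(bits, p_e):
    # Each bit is erased (output 'e') with probability p_e,
    # otherwise it is received correctly
    return ['e' if random.random() < p_e else b for b in bits]

random.seed(2)
print(bec_transmit([0, 1, 1, 0, 1], p_e=0.2))  # hypothetical bits
# The capacity of a BEC is C = 1 - p_e bits per channel use.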
4.3.1 Mutual Information
The mutual information I(X; Y) = H(X) − H(X|Y) measures the reduction in uncertainty about the channel input that results from observing the channel output.
Properties of Mutual Information
Symmetrical property: I(X; Y) = I(Y; X)
Non-negative property: I(X; Y) ≥ 0
Relation to the joint entropy of the input/output channel: I(X; Y) = H(X) + H(Y) − H(X, Y)
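A sketch that computes mutual information directly from a joint pmf and checks the non-negative property; the joint pmf is the same assumed example as before:

import math

def mutual_information(pxy):
    # I(X;Y) = sum over x,y of p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return sum(pxy[i][j] * math.log2(pxy[i][j] / (px[i] * py[j]))
               for i in range(len(px)) for j in range(len(py))
               if pxy[i][j] > 0)

pxy = [[0.4, 0.1],   # hypothetical joint pmf
       [0.1, 0.4]]
I = mutual_information(pxy)
print(I)        # ~0.278 bits; I(X;Y) = I(Y;X) by the symmetrical property
print(I >= 0)   # True, confirming the non-negative property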
4.4 Shannon's Channel Coding Theorem
If the information rate R of a discrete memoryless channel does not exceed the channel capacity C (that is, R ≤ C), then there exists a coding technique that makes the probability of error arbitrarily small; conversely, if R > C, it is not possible to transmit information reliably.
Implementation of Shannon's Channel Coding Theorem in BSC
For a BSC with crossover probability p, the channel capacity is C = 1 − H(p), where H(p) is the binary entropy function; reliable transmission is possible at any code rate R < C.
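A sketch of this capacity check for a BSC, with assumed values for the crossover probability and the code rate:

import math

def bsc_capacity(p):
    # C = 1 - H(p) bits per channel use for a BSC
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

p = 0.1                 # assumed crossover probability
R = 0.4                 # assumed code rate
C = bsc_capacity(p)
print(C)                # ~0.531 bits per channel use
print(R < C)            # True: reliable transmission is possible at this rate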
4.4.1 Channel Capacity
The channel capacity C is the maximum of the mutual information I(X; Y) over all possible input probability distributions, expressed in bits per channel use. A channel that possesses Gaussian noise characteristics is known as a Gaussian channel. If band-limited white Gaussian noise is linearly added to the input during transmission through a channel, the noise is called additive white Gaussian noise (AWGN) and the channel is called an AWGN channel.
Shannon Channel Capacity Theorem
For an AWGN channel of bandwidth B Hz with signal-to-noise power ratio S/N, the channel capacity is C = B log2(1 + S/N) bits per second.
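A minimal sketch of the Shannon–Hartley formula above; the bandwidth and SNR values are illustrative, not from the text:

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # C = B log2(1 + S/N) bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 3100.0                        # assumed channel bandwidth in Hz
snr = 10 ** (30 / 10)             # assumed 30 dB SNR as a linear ratio (1000)
print(shannon_capacity(B, snr))   # ~30,898 bits per second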
About the Author
T. L. Singal graduated from the National Institute of Technology, Kurukshetra, and post-graduated from Punjab Technical University in Electronics & Communication Engineering. He began his career with the Avionics Design Bureau, HAL, Hyderabad in 1981, where he worked on radar communication systems. He then led the R&D group in a telecom company and successfully developed multi-access VHF wireless communication systems. He visited Germany, and later executed an international assignment as Senior Network Consultant with Flextronics Network Services, Texas, USA. He was associated with Nokia, AT&T, Cingular Wireless, and Nortel Networks for the optimization of 2G/3G cellular networks in the USA. Since 2003, he has been teaching at reputed engineering colleges in India. He has published a number of technical research papers in IEEE proceedings, journals, and international and national conferences. He has authored three textbooks, `Wireless Communications' (2010), `Analog & Digital Communications' (2012), and `Digital Communication' (2015), with the internationally renowned publisher McGraw-Hill Education.