Some Common Binary Signaling Formats:
NRZ, RZ, NRZ-B, AMI, Manchester
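For reference, here is a minimal sketch of how a few of these formats map bits onto waveform levels, assuming the common textbook conventions (polar NRZ, unipolar RZ, IEEE 802.3 Manchester); the exact variants intended here may differ:

```python
# Sketch of a few line codes, sampled at samples_per_bit points per bit period.
# Assumed conventions: NRZ (1 -> +1, 0 -> -1 for the whole bit),
# RZ (1 -> +1 for the first half-bit, then 0; 0 -> 0),
# Manchester, IEEE 802.3 style (0 -> high-then-low, 1 -> low-then-high).

def nrz(bits, samples_per_bit=8):
    return [(+1 if b else -1) for b in bits for _ in range(samples_per_bit)]

def rz(bits, samples_per_bit=8):
    half = samples_per_bit // 2
    out = []
    for b in bits:
        out += ([1] * half) if b else ([0] * half)
        out += [0] * (samples_per_bit - half)
    return out

def manchester(bits, samples_per_bit=8):
    half = samples_per_bit // 2
    out = []
    for b in bits:
        first, second = (-1, +1) if b else (+1, -1)
        out += [first] * half + [second] * (samples_per_bit - half)
    return out

if __name__ == "__main__":
    bits = [1, 0, 1, 1, 0]
    print("NRZ:       ", nrz(bits, 2))
    print("RZ:        ", rz(bits, 2))
    print("Manchester:", manchester(bits, 2))
```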
Symbols, Words, Messages
A "Code" is a system of arranging symbols from a symbol alphabet into words (sequences) which have an agreed-upon meaning between the sender and receiver. Examples: written languages, spoken languages, the radix system of conveying value/quantity, Roman numerals, Morse code, ASCII codes, semaphores.
Codes may be hierarchical, or embedded: Binary > ASCII > Roman Letters > Words > Sentences. A "Symbol" in one code may be a "Word" or "Message" in another.
Quantity of Information
"I flipped a coin, and it came up... ?" One bit of information is contained in the answer to a question which has two equally likely outcomes.
"Is Dr. Lyall going to give everyone a gold bar after class today?" The two outcomes are not equally likely. You might guess that there is a 99% probability that the answer is "no", so when I tell you that the answer is "no", it contains very little information. But if I tell you the answer is "yes", that is a big deal, because it contains a great deal of information.
In general, the information content of outcome i is I_i = -log2(p_i) bits, where p_i is the probability of outcome i.
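A quick numerical check of this definition (a sketch; the probabilities are the ones guessed above):

```python
from math import log2

def info_bits(p):
    """Information content, in bits, of an outcome with probability p."""
    return -log2(p)

print(info_bits(0.5))   # fair coin flip: 1.0 bit
print(info_bits(0.99))  # "no" at 99% probability: ~0.014 bits, very little information
print(info_bits(0.01))  # "yes" at 1% probability: ~6.6 bits, a big deal
```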
Examples
The decimal number system uses 10 symbols (0..9). Assuming the occurrence of each symbol is equally likely (10% probability), the information content of each digit is -log2(0.1) = 3.32 bits/symbol = 1 "dit".
You wish to encode the 26 (lower-case) letters of the English alphabet using decimal digits.
Method 1: -log2(1/26) = 4.7 bits; (4.7 bits)/(3.32 bits/dit) = 1.415 dits.
Method 2: -log10(1/26) = 1.415 dits.
Since you can't send a fraction of a symbol, you need two decimal digits for the encoding, but each pair only carries 1.415 dits of information, so the Coding Efficiency is 1.415/2 = 0.7075 ≈ 71%.
For binary encoding (two symbols: 0, 1), you need 5 symbols to express 4.7 bits. The Coding Efficiency is 4.7/5 = 0.94 = 94%.
Suppose we wanted to use a three-symbol alphabet, {*, #, +}. Each symbol expresses -log2(1/3) = 1.585 bits/symbol. The number of symbols required to express 4.7 bits of information is (4.7 bits)/(1.585 bits/symbol) = 2.96 symbols, so three are required. Each group of three symbols carries 3 × 1.585 bits = 4.755 bits. The Coding Efficiency is 4.7/4.755 = 2.96/3 ≈ 99%.
Decode the following: #+###*####**++++#+
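The same arithmetic generalizes to any alphabet size; a short sketch reproducing the three cases above (the function name is mine):

```python
from math import log2, ceil

def coding_efficiency(num_values, radix):
    """Bits of information carried vs. bits of capacity used when encoding
    num_values equally likely values with whole symbols of the given radix."""
    bits_needed = log2(num_values)      # e.g. log2(26) = 4.7 bits per letter
    bits_per_symbol = log2(radix)       # e.g. log2(10) = 3.32 bits per decimal digit
    symbols = ceil(bits_needed / bits_per_symbol)
    return bits_needed / (symbols * bits_per_symbol)

for radix in (10, 2, 3):
    print(radix, round(coding_efficiency(26, radix), 3))
# radix 10: 0.707 (~71%), radix 2: 0.940 (94%), radix 3: 0.989 (~99%)
```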
In general, we attach importance to a message in relation to its 'unexpectedness.' An unlikely message (or symbol) carries more information than a likely message (or symbol).
Every weekend I ask Dad for $50 to go out partying. 90% of the time he says NO, 10% of the time he says YES. There are two symbols in the alphabet. The information content of YES is -log2(0.1) = 3.32 bits. The information content of NO is -log2(0.9) = 0.15 bits.
On average, how many bits of information are in his answer? 90% of the time I get 0.15 bits, 10% of the time I get 3.32 bits. On average, I get (0.9)(0.15) + (0.1)(3.32) ≈ 0.47 bits. What if YES and NO were equally likely (50% each)?
The average information per symbol is called Entropy: H = -Σ p_i log2(p_i). Entropy is maximum when all symbols are equally likely.
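The weighted average above is just the entropy formula evaluated for two symbols; a minimal sketch:

```python
from math import log2

def entropy(probs):
    """Average information per symbol: H = -sum(p * log2(p)) bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.9, 0.1]))  # Dad's answers: ~0.47 bits per answer
print(entropy([0.5, 0.5]))  # equally likely YES/NO: 1.0 bit per answer (the maximum)
```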
Entropy Study of English Text
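A minimal version of such a study, as a sketch: count letter frequencies in a sample of text and compute the first-order entropy per letter (actual results depend on the corpus used):

```python
from collections import Counter
from math import log2

def letter_entropy(text):
    """First-order entropy (bits/letter) of the letters in text."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * log2(k / n) for k in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
print(letter_entropy(sample))
# With equally likely letters the maximum would be log2(26) = 4.7 bits/letter;
# real English text comes in lower because letter frequencies are uneven.
```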
Definitions
Baud Rate / Signaling Rate: f_B = 1/T_B  [symbols/second]
Minimum One-Sided Channel Bandwidth: f_c(min) = 1/(2 T_B)  [Hz]
Average Information Transfer Rate: f_i = H · f_B  [bits/second]
System Capacity, the maximum information transfer rate (achieved at maximum H = log2(M)): C = f_B log2(M) = 2 f_c log2(M)  [bits/second]
Maximum Information Transferred in T seconds: I_T(max) = C · T  [bits]
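Collected into one short sketch (the numeric values are example inputs, not from the original):

```python
from math import log2

T_B = 1e-3   # symbol period, seconds (example value)
M = 4        # alphabet size, symbols (example value)
H = 1.8      # average information per symbol, bits (example; maximum is log2(M) = 2)

f_B = 1 / T_B             # baud / signaling rate, symbols/second
f_c_min = 1 / (2 * T_B)   # minimum one-sided channel bandwidth, Hz
f_i = H * f_B             # average information transfer rate, bits/second
C = f_B * log2(M)         # system capacity = 2 * f_c_min * log2(M), bits/second
T = 10                    # observation time, seconds
I_T_max = C * T           # maximum information transferred in T seconds, bits

print(f_B, f_c_min, f_i, C, I_T_max)
```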
Shannon Limit
System Capacity, the maximum information transfer rate (maximum H): C = f_B log2(M) = 2 f_c log2(M)
Shannon Limit for System Capacity: C = BW log2(SINAD), with BW ≥ f_c(min)
Example: For SINAD = 1000 (30 dB), with BW at its minimum f_c(min), requiring 2 f_c log2(M) ≤ f_c log2(SINAD) gives M ≤ √SINAD = √1000 ≈ 31.6, so at most 31 symbols (just under 5 bits/symbol).
Or: for a 32-symbol channel (5 bits/symbol) we must have SINAD ≥ 32² = 1024, i.e., SINAD > 30 dB.
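Both directions of the example as a sketch, obtained by equating C = 2·f_c·log2(M) with C = BW·log2(SINAD) at BW = f_c(min):

```python
from math import sqrt, floor, log10

def max_symbols(sinad_linear):
    """Largest M satisfying 2*log2(M) <= log2(SINAD), i.e. M <= sqrt(SINAD)."""
    return floor(sqrt(sinad_linear))

def required_sinad_db(M):
    """SINAD (in dB) needed to support an M-symbol alphabet: SINAD >= M**2."""
    return 10 * log10(M ** 2)

print(max_symbols(1000))      # SINAD = 1000 (30 dB) -> M = 31 (just under 5 bits/symbol)
print(required_sinad_db(32))  # 32 symbols (5 bits/symbol) -> ~30.1 dB
```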