Some Common Binary Signaling Formats: NRZ, RZ, NRZ-B, AMI, Manchester


Some Common Binary Signaling Formats:
[Figure: waveforms for the bit sequence 1 0 1 0 0 1 1 1 0 1 in each of the NRZ, RZ, NRZ-B, AMI, and Manchester line codes]
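A minimal sketch of how some of these line codes map bits to signal levels (Python). The polarity and level conventions here are assumed for illustration, real standards differ in detail, and NRZ-B is omitted; each bit is represented as a pair of half-bit levels so the mid-bit transitions are visible.

    bits = [1, 0, 1, 0, 0, 1, 1, 1, 0, 1]   # the sequence shown on the slide

    def nrz(bits):
        # Non-Return-to-Zero: hold +1 for the whole bit period for a 1, -1 for a 0
        return [(+1, +1) if b else (-1, -1) for b in bits]

    def rz(bits):
        # Return-to-Zero: a 1 is +1 for the first half-bit then returns to 0; a 0 stays at 0
        return [(+1, 0) if b else (0, 0) for b in bits]

    def manchester(bits):
        # Manchester: a transition in the middle of every bit; here 1 = high-to-low, 0 = low-to-high (one common convention)
        return [(+1, -1) if b else (-1, +1) for b in bits]

    def ami(bits):
        # Alternate Mark Inversion: 0 maps to the 0 level, successive 1s alternate polarity
        out, polarity = [], +1
        for b in bits:
            out.append((polarity, polarity) if b else (0, 0))
            if b:
                polarity = -polarity
        return out

    for name, encoder in [("NRZ", nrz), ("RZ", rz), ("Manchester", manchester), ("AMI", ami)]:
        print(f"{name:10s}", encoder(bits))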

Symbols, Words, Messages
A “Code” is a system of arranging symbols from a symbol alphabet into words (sequences) which have an agreed-upon meaning between the sender and receiver. Examples:
- Written languages
- Spoken languages
- The radix system of conveying value/quantity
- Roman numerals
- Morse code
- ASCII codes
- Semaphores
Codes may be hierarchical, or embedded: Binary > ASCII > Roman letters > Words > Sentences. A “Symbol” in one code may be a “Word” or “Message” in another.

Quantity of Information
“I flipped a coin, and it came up ... ?” One bit of information is contained in the answer to a question which has two equally likely outcomes.
“Is Dr. Lyall going to give everyone a gold bar after class today?” The two outcomes are not equally likely. You might guess that there is a 99% probability that the answer is “no”, so when I tell you that the answer is “no”, it contains very little information. But if I tell you the answer is “yes”, that is a big deal, because it carries a great deal of information.
In general, the information carried by outcome i is I_i = -log2(p_i) bits, where p_i is the probability of outcome i.
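A minimal sketch of this calculation (Python; the 99%/1% gold-bar probabilities are the ones assumed in the text above):

    import math

    def self_information(p):
        """Information, in bits, carried by an outcome of probability p."""
        return -math.log2(p)

    print(self_information(0.5))    # fair coin flip: 1.0 bit
    print(self_information(0.99))   # the expected "no": ~0.014 bits
    print(self_information(0.01))   # the surprising "yes": ~6.6 bits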

Examples
The decimal number system uses 10 symbols (0 .. 9). Assuming the occurrence of each symbol is equally likely (10% probability), the information content of each digit is -log2(0.1) = 3.32 bits/symbol = 1 “dit”.
Suppose you wish to encode the 26 (lower-case) letters of the English alphabet using decimal digits.
Method 1: -log2(1/26) = 4.7 bits = (4.7 bits)/(3.32 bits/dit) = 1.415 dits.
Method 2: -log10(1/26) = 1.415 dits.
Since you can’t send a fraction of a symbol, you need two decimal digits for the encoding, but each letter carries only 1.415 dits of information, so the Coding Efficiency is 1.415/2 = 0.707 ≈ 70.7%.
For binary encoding (two symbols: 0, 1), you need 5 symbols to express 4.7 bits. The Coding Efficiency is 4.7/5 = 0.94 = 94%.
Suppose we wanted to use a three-symbol alphabet, {*, #, +}. Each symbol expresses -log2(1/3) = 1.585 bits/symbol. The number of symbols required to express 4.7 bits of information is (4.7 bits)/(1.585 bits/symbol) = 2.96 symbols, so three are required. Each group of three symbols carries 3 × 1.585 bits = 4.755 bits. The Coding Efficiency is 4.7/4.755 = 2.96/3 ≈ 99%.
Decode the following: #+###*####**++++#+
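The efficiency arithmetic above can be reproduced with a short sketch (Python; only the 26-letter alphabet and the radices 10, 2 and 3 come from the example, the rest is illustration):

    import math

    def coding_efficiency(num_messages, radix):
        """Efficiency of a fixed-length code of the given radix for equally likely messages."""
        info_bits = math.log2(num_messages)          # bits of information per message
        bits_per_symbol = math.log2(radix)           # bits carried by one code symbol
        symbols_needed = math.ceil(info_bits / bits_per_symbol)
        return info_bits / (symbols_needed * bits_per_symbol)

    for radix in (10, 2, 3):
        print(radix, round(coding_efficiency(26, radix), 3))
    # 10 -> ~0.71 (two decimal digits), 2 -> 0.94 (five bits), 3 -> ~0.99 (three ternary symbols)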

In general, we attach importance to a message in relation to its ‘unexpectedness’: an unlikely message (or symbol) carries more information than a likely message (or symbol).
Every weekend I ask Dad for $50 to go out partying. 90% of the time he says NO, 10% of the time he says YES. There are two symbols in the alphabet. The information content of YES is -log2(0.1) = 3.32 bits. The information content of NO is -log2(0.9) = 0.15 bits.
On average, how many bits of information are in his answer? 90% of the time I get 0.15 bits, 10% of the time I get 3.32 bits. On average, I get (0.9)(0.15) + (0.1)(3.32) = 0.467 bits.
What if YES and NO were equally likely (50% each)? Then each answer would carry exactly 1 bit.
The average information per symbol is called Entropy: H = -Σ p_i log2(p_i) bits/symbol. Entropy is maximum when all symbols are equally likely.
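A sketch of the entropy calculation for the YES/NO source above (Python; the 90/10 split is from the slide):

    import math

    def entropy(probs):
        """Average information per symbol, H = -sum(p * log2(p)), in bits/symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.9, 0.1]))   # ~0.469 bits/symbol (the 0.467 above uses rounded per-symbol values)
    print(entropy([0.5, 0.5]))   # 1.0 bit/symbol: the maximum for a two-symbol alphabet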

Entropy Study of English Text

Definitions
Baud Rate / Signaling Rate: f_B = 1/T_B (symbols/second)
Minimum one-sided channel bandwidth: f_c(min) = 1/(2 T_B) (Hz)
Average information transfer rate: f_i = H · f_B (bits/second)
System Capacity, the maximum information transfer rate (maximum H): C = f_B log2(M) = 2 f_c log2(M)
Maximum information transfer in T seconds: I_T(max) = C · T
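A small sketch tying these definitions together (Python; the symbol period, alphabet size, entropy, and observation time are arbitrary illustrative values, not from the slides):

    import math

    T_B = 1e-3                       # symbol period, seconds (assumed)
    M = 4                            # size of the signaling alphabet (assumed)
    H = 1.8                          # source entropy, bits/symbol (assumed; must be <= log2(M))

    f_B = 1 / T_B                    # baud / signaling rate, symbols/second
    f_c_min = 1 / (2 * T_B)          # minimum one-sided channel bandwidth, Hz
    f_i = H * f_B                    # average information transfer rate, bits/second
    C = f_B * math.log2(M)           # system capacity: information rate when H is at its maximum, bits/second

    T = 10                           # observation interval, seconds (assumed)
    print(f_B, f_c_min, f_i, C, C * T)   # the last value is I_T(max) = C*T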

Shannon Limit
System Capacity, the maximum information transfer rate (maximum H): C = f_B log2(M) = 2 f_c log2(M)
Shannon limit on system capacity: C = BW · log2(SINAD), if BW > f_c(min)
Example: For SINAD = 1000 (30 dB) and BW = f_c(min), 2 f_c log2(M) ≤ f_c log2(1000) requires M ≤ √1000 ≈ 31.6.
Or: for a 32-symbol channel (5 bits/symbol) we must have SINAD > 30 dB.
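A sketch of the example above (Python; only SINAD = 1000 and the 32-symbol channel come from the slide, and BW is taken equal to f_c(min)):

    import math

    SINAD = 1000                              # 30 dB
    # With BW = f_c(min), 2*f_c*log2(M) <= BW*log2(SINAD) gives log2(M) <= log2(SINAD)/2:
    max_M = math.sqrt(SINAD)
    print(max_M)                              # ~31.6, so a 32-symbol alphabet just exceeds the limit

    # Conversely, the SINAD required for M = 32 (5 bits/symbol):
    M = 32
    required_SINAD = M ** 2                   # from log2(SINAD) >= 2*log2(M)
    print(10 * math.log10(required_SINAD))    # ~30.1 dB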