Source Coding Procedure:
1. Check whether the channel capacity* can cope with the source information rate; if yes, source coding can proceed. Understand why.
   (i) Calculate the source entropy from the symbol probabilities*.
   (ii) Source information rate = source entropy × symbol rate*.
2. Coder design:
   Step 1: Initially, code the symbols singly.
   Step 2: (i) Code by your chosen method; (ii) calculate the average length; (iii) coder rate = average length × symbol rate*; (iv) check whether this is within the channel capacity; if yes, go to Step 4.
   Step 3: Change to a higher level of group coding (from coding singly to coding in pairs, or from pairs to threes); then repeat Step 2.
   Step 4: Coding finishes. You may still be asked to calculate the probabilities of 0's and 1's in the output stream.
____________________________________________________________
*These will be given.

Example: An information source produces a long sequence of two independent symbols A and B with probabilities 0.8 and 0.2 respectively; 80 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 60 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced.

1. Check if the channel capacity is sufficient for the source rate.
Source entropy = -(0.8log0.8 + 0.2log0.2) = 0.722 bits/symbol.
Source information rate (bits/sec) = source rate (symbols/sec) × source entropy (bits/symbol) = 80 × 0.722 ≈ 57.8 bits/sec.
Because one binary digit can carry at most one bit of information, and 57.8 < 60 (the channel capacity), the answer is yes.
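A minimal Python sketch of this capacity check (illustrative only; the probabilities, symbol rate and channel capacity are the values given in the example):

```python
# Step 1: compare the source information rate with the channel capacity.
from math import log2

probs = {"A": 0.8, "B": 0.2}      # symbol probabilities (given)
symbol_rate = 80                   # symbols per second (given)
channel_capacity = 60              # binary digits per second (given)

entropy = -sum(p * log2(p) for p in probs.values())   # ~0.722 bits/symbol
info_rate = entropy * symbol_rate                      # ~57.8 bits/sec

print(f"entropy = {entropy:.3f} bits/symbol, info rate = {info_rate:.1f} bits/sec")
print("source coding can proceed" if info_rate < channel_capacity else "channel too slow")
```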

Coder design:
1. Coding singly: A=0, B=1, so L = 1 binary digit/symbol. The coder produces 80 binary digits/sec, which is too fast for the 60 binary digits/sec channel.
2. Coding in pairs (one compact instantaneous assignment shown; see the code sketch after step 3):
   AA: p=0.64, codeword 0
   AB: p=0.16, codeword 10
   BA: p=0.16, codeword 110
   BB: p=0.04, codeword 111
   L2 = 0.64*1 + 0.16*2 + 0.16*3 + 0.04*3 = 1.56 binary digits/pair
   L = 1.56/2 = 0.78 binary digits/symbol.
   The coder produces 80*0.78 = 62.4 binary digits/sec > 60. Still too fast.

3. Coding in threes (Huffman):
   AAA: p=0.512, codeword length 1
   AAB, ABA, BAA: p=0.128 each, codeword length 3
   ABB, BAB, BBA: p=0.032 each, codeword length 5
   BBB: p=0.008, codeword length 5
   L3 = 0.512*1 + 3*0.128*3 + 3*0.032*5 + 0.008*5 = 2.184 ≈ 2.19 binary digits/three
   L = 2.19/3 ≈ 0.73 binary digits/symbol.
   The coder produces 80*2.184/3 ≈ 58.2 binary digits/sec < 60. OK.
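The whole coder-design loop can be sketched in Python. This is an illustrative sketch, not part of the slides; the particular codewords the algorithm assigns may differ from those above, but the average lengths and coder rates come out the same.

```python
# Build a Huffman code for blocks of 1, 2 and 3 source symbols and check each
# coder rate against the 60 digits/sec channel capacity.
import heapq
from itertools import product
from math import prod

def huffman(probs):
    """probs: {symbol: probability} -> {symbol: codeword}."""
    heap = [(pr, i, {s: ""}) for i, (s, pr) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}        # prefix 0 to one branch
        merged.update({s: "1" + w for s, w in c1.items()})  # prefix 1 to the other
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

p = {"A": 0.8, "B": 0.2}
symbol_rate, capacity = 80, 60
for n in (1, 2, 3):                                        # singly, in pairs, in threes
    blocks = {"".join(t): prod(p[s] for s in t) for t in product(p, repeat=n)}
    code = huffman(blocks)
    L = sum(blocks[b] * len(code[b]) for b in blocks) / n  # binary digits per source symbol
    rate = symbol_rate * L
    print(f"n={n}: L = {L:.3f} digits/symbol, coder rate = {rate:.1f} digits/sec "
          f"({'OK' if rate <= capacity else 'too fast'})")
```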

Probabilities of 0 and 1 in the output sequence:
(i) Singly: p(0)=0.8; p(1)=0.2.
(ii) In pairs: with the codewords above, the expected number of 0's per pair is 0.64 + 0.16 + 0.16 = 0.96 out of 1.56 digits, so p(0) = 0.96/1.56 = 0.615 and p(1) = 0.385 (since p(0)+p(1)=1).
(iii) In threes: p(0)=0.52; p(1)=0.48.
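Assuming the pair codewords listed above (AA=0, AB=10, BA=110, BB=111), the digit probabilities follow directly; a small illustrative sketch:

```python
# Probability of 0 and 1 in the coded output stream for a given block code.
def digit_probs(block_probs, code):
    """block_probs: {block: probability}; code: {block: codeword string}."""
    avg_zeros = sum(p * code[b].count("0") for b, p in block_probs.items())
    avg_len = sum(p * len(code[b]) for b, p in block_probs.items())
    return avg_zeros / avg_len, 1 - avg_zeros / avg_len

pairs = {"AA": 0.64, "AB": 0.16, "BA": 0.16, "BB": 0.04}
pair_code = {"AA": "0", "AB": "10", "BA": "110", "BB": "111"}
print(digit_probs(pairs, pair_code))   # ~(0.615, 0.385)
```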

Noise
In communication theory, noise is any interference or distraction that disrupts the communication process. Noise is a random electrical signal on a communications channel that interferes with the desired signal or data. It can either be generated by the circuitry of the communications device itself or come from one of a variety of external sources, including radio and TV signals, lightning, and nearby transmission lines.

What are the effects of noise? Noise may reach a sufficiently high level to cause spurious errors, thereby corrupting some of the information contained in the signal being propagated. It also reduces the information conveyed: the presence of noise reduces the confidence in the information at the receiver, because some uncertainty about the validity of the transmission remains. For example, if numerical data are read over a telephone line, the listener may be unsure that he heard some of the values correctly; or, in the transmission of data by binary pulses, some of the pulses may be received in error.

Information in Noisy Channel
[channel diagram]

Transmitter —› Receiver (noiseless channel): transmitting an event of probability p(x) over a noiseless channel, the information gained over the transmission is -log2 p(x).
Transmitter —› Receiver (noisy channel): with noise, the received symbol no longer identifies the transmitted event with certainty.

Consider the transmission decomposed into 2 stages:
(1) Transmit the event of probability p(x) over the noisy channel —› at the receiver the event now has probability p(x|y).
(2) Retransmit the same event via a noiseless channel, starting with the probability p(x|y) (gained in the first transmission) and ending at probability 1.
Transmitter —(1)—› p(x|y) —(2)—› Receiver

Quantity of information in noisy channel
The information conveyed by the noisy channel is hence the information of the event minus the information still needed (stage 2) to make reception correct:
I = log2(1/p(x)) - log2(1/p(x|y)) = log2(p(x|y)/p(x))
where p(x) is the a priori probability: the probability at the receiver of the event before transmission, and p(x|y) is the a posteriori probability: the probability at the receiver of the event after transmission.
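As a quick sanity check, the per-event formula can be evaluated directly; the function below is an illustrative sketch (the name is mine, not from the slides).

```python
# Information conveyed per event over a noisy channel: I = log2(posterior / prior).
from math import log2

def event_information(prior, posterior):
    """prior: p(x) before transmission; posterior: p(x|y) after reception."""
    return log2(posterior / prior)

# Noiseless channel: the posterior is 1, so this reduces to -log2(prior).
print(event_information(0.5, 1.0))   # 1.0 bit
```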

Example: A binary system produces Marks and Spaces with equal probabilities, 1/8 of all pulses being received in error. Find the information received for all combinations of input and output.

There are four possibilities:
(i) Correct (M —› M, or S —› S), probability 7/8;
(ii) Wrong (S —› M, or M —› S), probability 1/8.

Probability of Marks or Spaces transmitted (prior probability): p(M) = p(S) = 1/2.
Probability of correct transmission (M —› M, or S —› S) (posterior probability): 7/8, so
I(M|M) = I(S|S) = log2((7/8)/(1/2)) = log2(1.75) = 0.807 bits.

Probability of Marks or Spaces transmitted (prior probability): 1/2.
Probability of incorrect transmission (S —› M, or M —› S) (posterior probability): 1/8, so
I(S|M) = I(M|S) = log2((1/8)/(1/2)) = log2(0.25) = -2 bits.
Average information received (per binary digit):
(7/8)(0.807) + (1/8)(-2) = 0.456 bits.
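The same figures can be reproduced with a short illustrative sketch, applying I = log2(posterior/prior) to each case and averaging:

```python
# Marks/Spaces example: error rate 1/8, equal priors.
from math import log2

prior, p_error = 0.5, 1 / 8
i_correct = log2((1 - p_error) / prior)                  # ~0.807 bits
i_wrong = log2(p_error / prior)                          # -2 bits
average = (1 - p_error) * i_correct + p_error * i_wrong  # ~0.456 bits per binary digit
print(i_correct, i_wrong, average)
```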

[Channel diagram: P(M) = P(S) = 0.5; P(M|M) = P(S|S) = 7/8; P(S|M) = P(M|S) = 1/8.]

Transmitted source symbols Xi, i = 1, …, n; received symbols Yj, j = 1, …, n.
P(Xi, Yj): joint probability of Xi having been transmitted and Yj being received.
P(Xi|Yj): conditional probability of Xi having been transmitted when Yj is known to have been received.
P(Yj|Xi): conditional probability of receiving Yj when Xi is known to have been transmitted.
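These quantities are linked by P(Xi, Yj) = P(Xi)·P(Yj|Xi), P(Yj) = Σi P(Xi, Yj) and P(Xi|Yj) = P(Xi, Yj)/P(Yj). An illustrative sketch using the Marks/Spaces channel above:

```python
# Joint, marginal and a posteriori probabilities from priors P(Xi) and
# channel transition probabilities P(Yj|Xi).
prior = {"M": 0.5, "S": 0.5}
transition = {("M", "M"): 7/8, ("M", "S"): 1/8,    # keyed by (Xi, Yj): P(Yj|Xi)
              ("S", "S"): 7/8, ("S", "M"): 1/8}

joint = {(x, y): prior[x] * transition[(x, y)] for (x, y) in transition}        # P(Xi, Yj)
p_y = {y: sum(joint[(x, yy)] for (x, yy) in joint if yy == y) for y in prior}   # P(Yj)
posterior = {(x, y): joint[(x, y)] / p_y[y] for (x, y) in joint}                # P(Xi|Yj)

print(posterior[("M", "M")])   # 7/8: symmetric channel with equal priors
```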

A binary system generates 0's and 1's with equal probability; 5% of all pulses received are in error (e.g. 0 —› 1, or 1 —› 0). Find the information received for the possible combinations of inputs and outputs, and the average information received.

pe = 0.05; P(1) = P(0) = 0.5; P(1|1) = P(0|0) = 0.95; P(0|1) = P(1|0) = 0.05.
I(1|1) = I(0|0) = log2(0.95/0.5) = 0.926 bits
I(0|1) = I(1|0) = log2(0.05/0.5) = -3.32 bits
Average information received = 0.95*log2(0.95/0.5) + 0.05*log2(0.05/0.5) = 0.714 bits per binary digit.
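A small parameterised sketch (illustrative only) reproduces both this result and the earlier Marks/Spaces example:

```python
# Information received over a binary symmetric channel with error probability pe.
from math import log2

def bsc_information(pe, prior=0.5):
    i_correct = log2((1 - pe) / prior)             # I(1|1) = I(0|0)
    i_wrong = log2(pe / prior)                     # I(0|1) = I(1|0)
    average = (1 - pe) * i_correct + pe * i_wrong  # bits per binary digit
    return i_correct, i_wrong, average

print(bsc_information(0.05))   # ~(0.926, -3.32, 0.714)
print(bsc_information(1/8))    # Marks/Spaces example: ~(0.807, -2.0, 0.456)
```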