Figure 1.1. A communication system: source and channel coding (source encoder -> channel encoder -> noisy channel -> channel decoder -> source decoder)

Figure 1.2. A discrete memoryless source
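For reference (a standard definition, not quoted from the slide): a discrete memoryless source emits symbols x_i independently, each with a fixed probability p_i, and its entropy is H(X) = -sum_i p_i log2(p_i) bits per symbol, which is largest, log2(M), when all M symbols are equiprobable.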

Figure 1.3. The inequality ln(x) <= x - 1 (curves y1 = x - 1 and y2 = ln(x))
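This elementary inequality, ln(x) <= x - 1 with equality only at x = 1, underlies many of the chapter's proofs. As a standard example (not taken from the slide): for two probability distributions p and q, sum_i p_i ln(q_i/p_i) <= sum_i p_i (q_i/p_i - 1) = 0, so the relative entropy sum_i p_i log(p_i/q_i) is never negative.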

Figure 1.4. Entropy function for the binary source
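The entropy function plotted here is presumably the standard binary entropy H(p) = -p log2(p) - (1-p) log2(1-p). It is zero at p = 0 and p = 1 (no uncertainty) and reaches its maximum of 1 bit at p = 1/2.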

Figure 1.5. A discrete transmission channel

Figure 1.6. Binary symmetric channel
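As a reminder of the standard model (not quoted from the slide): the binary symmetric channel with crossover probability p delivers the transmitted bit unchanged with probability 1 - p and inverted with probability p, for either input value. Its capacity, derived later in the chapter, is C = 1 - H(p) bits per channel use, where H(p) is the binary entropy function of Figure 1.4.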

Figure 1.7. Binary erasure channel
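Likewise for the binary erasure channel (standard facts, not quoted from the slide): each transmitted bit is received correctly with probability 1 - p or replaced by an erasure symbol with probability p, but is never inverted; its capacity is C = 1 - p bits per channel use.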

Figure 1.8. Example 1.7

Figure 1.9. Relationships among the different entropies
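The relationships this figure presumably summarizes are the standard identities H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) and I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y), with the mutual information I(X;Y) >= 0, so conditioning can only reduce entropy.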

Figure 1.10. BSC of Example 1.9

Figure 1.11. Channel capacity for the BSC
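A minimal numerical sketch of the curve this figure shows, assuming the usual BSC capacity formula C(p) = 1 - H(p); the function and variable names below are illustrative, not taken from the slides:

from math import log2

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p); by convention 0*log2(0) = 0
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p,
    # in bits per channel use: C = 1 - H(p)
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:.2f} -> C = {bsc_capacity(p):.4f} bit/use")

The capacity falls from 1 bit per use at p = 0 to 0 at p = 0.5, where the output is independent of the input.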

Figure 1.12. Example 1.11

Figure 1.13. Vector representation of signals

Figure 1.14. Vector addition of signals and noise

Figure 1.15. Gaussian channel
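Standard background for this figure (not quoted from the slide): in the discrete-time Gaussian channel the output is Y = X + Z, where Z is zero-mean Gaussian noise of power N and the input power is constrained to S. Its capacity is C = (1/2) log2(1 + S/N) bits per channel use; for a continuous channel of bandwidth B this becomes the Shannon-Hartley formula C = B log2(1 + S/N) bits per second.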

Figure 1.16. A representation of the output vector space

Figure 1.17. An encoder for the BSC

Figure 1.18. An ideal communication system
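The "ideal system" alludes to Shannon's channel coding theorem (stated here in its standard form, not quoted from the slide): as long as the information rate R is below the channel capacity C, there exist codes, generally requiring very long block lengths, whose error probability can be made arbitrarily small, whereas for R > C reliable transmission is impossible.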

Figure 1.19. Practical and non-practical operating regions

Figure 1.20. Practical and non-practical operating regions: the Shannon limit
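For reference (a standard result, not quoted from the slide): writing S/N = (Eb/N0)(R/B) in C = B log2(1 + S/N) and letting the spectral efficiency R/B tend to zero shows that reliable transmission requires Eb/N0 >= ln(2), about 0.693 or -1.59 dB. This value is the Shannon limit separating the practical from the non-practical operating region in the figure.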