A Mathematical Theory of Communication by C.E. Shannon — Paper Review by Jin Woo Shin and Sang Joon Kim.


Contents Introduction Summary of Paper Discussion

Introduction This paper founded the field of information theory. Before it, people believed the only way to make the error probability smaller was to reduce the data rate. This paper revealed that there is an achievable positive data rate at which the error probability can be made negligible.

Summary of Paper Preliminary Discrete Source & Discrete Channel Discrete Source & Cont. Channel Cont. Source & Cont. Channel

[Summary of Paper] Preliminary Entropy Ergodic source Irreducible, aperiodic property Capacity
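As an illustrative sketch (my addition, not part of the slides), the entropy of a discrete memoryless source can be computed directly from its symbol probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 4-symbol source carries the maximum 2 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed source carries less information per symbol.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```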

[Summary of Paper] Disc. Source & Disc. Channel Channel Capacity Theorem (Theorem 11, page 22), the most important result of this paper: if the entropy H of the discrete source is less than or equal to the channel capacity C, then there exists a code whose output can be transmitted over the channel with an arbitrarily small frequency of errors. If H > C, there is no method of encoding which gives equivocation less than H - C.
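The theorem compares the source entropy H against the channel capacity C. As a small sketch (the binary symmetric channel is a standard textbook example, not one worked in these slides), the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p):

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; p = 0.5 destroys all information.
print(bsc_capacity(0.0))  # 1.0
print(bsc_capacity(0.5))  # 0.0
```

By the theorem, any source with entropy H below `bsc_capacity(p)` bits per channel use admits a code with arbitrarily small error probability.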

[Summary of Paper] Disc. Source & Cont. Channel The input and output alphabets of the channel become continuous (infinite). The capacity of a continuous channel is C = max over input distributions P(x) of the mutual information I(X; Y); for a band-limited channel with average signal power P and white Gaussian noise power N this evaluates to C = W log2((P + N)/N). The transmission rate cannot exceed the channel capacity.
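A minimal sketch of the band-limited Gaussian capacity formula (the parameter names W, P, N and the telephone-line numbers are my illustrative choices, not values from the slides):

```python
import math

def gaussian_capacity(W, P, N):
    """Capacity of a band-limited white-Gaussian-noise channel,
    C = W * log2((P + N) / N) bits per second."""
    return W * math.log2((P + N) / N)

# At P == N (0 dB SNR), each hertz of bandwidth carries 1 bit/s.
print(gaussian_capacity(1.0, 1.0, 1.0))  # 1.0
# 3 kHz bandwidth at 30 dB SNR (P/N = 1000), roughly a telephone line.
print(round(gaussian_capacity(3000, 1000, 1)))  # 29902
```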

[Summary of Paper] Cont. Source & Cont. Channel A continuous source needs an infinite number of binary digits for exact specification. Fidelity is the measurement of how much distortion we allow. The rate of a continuous source P(x) with fidelity constraint D is R(D) = min I(X; Y), minimized over channels P(y|x) with average distortion E[d(X, Y)] <= D. For a given fidelity constraint D, transmission within distortion D is possible whenever R(D) <= C.

Discussion Ergodic source Practical approach Rate distortion

[Discussion] Ergodic source The ergodic-source assumption is the essential one in the paper: the source is ergodic -> the AEP holds -> the capacity theorem follows. Finding a source that is not ergodic but still satisfies the AEP would be meaningful work. One example:
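A small sketch of the AEP for a simple ergodic source (an i.i.d. Bernoulli source of my choosing, not one from the slides): the per-symbol log-probability of a long sample concentrates near the entropy H.

```python
import math
import random

random.seed(0)
p = 0.2        # Bernoulli(p) i.i.d. source (ergodic)
n = 100_000
sample = [1 if random.random() < p else 0 for _ in range(n)]

# Per-symbol log-probability of the observed sequence.
logp = sum(math.log2(p) if x else math.log2(1 - p) for x in sample) / n
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# AEP: -(1/n) log2 P(X^n) is close to H for a long typical sequence.
print(abs(-logp - H) < 0.05)  # True
```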

[Discussion] Practical approach -1 This paper provides the upper bound on the achievable data rate; finding a good encoding scheme is a separate problem. Turbo codes and LDPC codes are among the most efficient codes known. Block size, rate, BER, and decoding complexity are important factors when choosing a code for a specific system.
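Turbo and LDPC decoders are too involved for a few lines, but the encode/decode idea behind channel coding can be sketched with the classical (7,4) Hamming code (my illustrative choice, not a code worked in the slides), which corrects any single bit error:

```python
# Systematic generator and parity-check matrices for the (7,4) Hamming code.
G = [  # codeword = data . G (mod 2)
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [  # syndrome = H . received (mod 2)
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(data):
    """Map 4 data bits to a 7-bit codeword (rate 4/7)."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def decode(received):
    """Correct up to one bit error, then return the 4 data bits."""
    syndrome = [sum(h, r := 0) or sum(h * x for h, x in zip(row, received)) % 2
                for row in H]
    syndrome = [sum(h * x for h, x in zip(row, received)) % 2 for row in H]
    if any(syndrome):
        # A nonzero syndrome equals the column of H at the error position.
        for i in range(7):
            if [row[i] for row in H] == syndrome:
                received = received[:]
                received[i] ^= 1
                break
    return received[:4]  # systematic code: data bits come first

word = [1, 0, 1, 1]
code = encode(word)
code[2] ^= 1                 # the channel flips one bit
print(decode(code) == word)  # True: the single error is corrected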

[Discussion] Practical approach -2

Year | Rate-1/2 code        | SNR required for target BER
1948 | Shannon limit        | 0 dB
1967 | (255,123) BCH        | 5.4 dB
1977 | Convolutional code   | 4.5 dB
1993 | Iterative Turbo code | 0.7 dB
2001 | Iterative LDPC code  | 0.0245 dB

** This graph and chart are modified from the presentation data of Engling Yeo at Jan. C. Berrou and A. Glavieux, "Near Optimum Error Correcting Coding and Decoding: Turbo-Codes," IEEE Trans. Communications, vol. 44, no. 10, Oct. 1996.

[Discussion] Rate distortion The 'fidelity' concept motivates rate-distortion theory. The rate of a discrete source P(x) with distortion (fidelity) D is defined as R(D) = min over P(y|x) of I(X; Y), subject to E[d(X, Y)] <= D. The entropy H is the rate at zero distortion. (The Rate-Distortion Theorem) We can compress a discrete source P(x) down to R(D) bits per symbol when allowing distortion D.
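As a sketch, the rate-distortion function has a closed form for a Bernoulli(p) source under Hamming distortion, R(D) = h(p) - h(D); this standard special case is my illustrative choice and is not derived in the slides:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_binary(p, D):
    """R(D) for a Bernoulli(p) source under Hamming distortion:
    R(D) = h(p) - h(D) for 0 <= D < min(p, 1-p), and 0 beyond that."""
    if D >= min(p, 1 - p):
        return 0.0
    return h(p) - h(D)

# Zero distortion requires the full entropy (here 1 bit/symbol);
# allowing some bit-error distortion D reduces the required rate.
print(rate_distortion_binary(0.5, 0.0))  # 1.0
print(rate_distortion_binary(0.5, 0.5))  # 0.0
```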