Hybrid Codes and the Point-to-Point Channel
Paul Cuff, Curt Schieler

Source-Channel Coding
[Block diagram: source S, encoder f, channel p(y|x), decoder g, reconstruction Ŝ.]
Correlation between S and Ŝ is achieved with separately designed encoder and decoder.

Video Transmission (example)
[Diagram: video source encoded by f and sent over the channel p(y|x).]

Systematic Transmission (example)
[Diagram: digital, hybrid, and analog relay transmission.]

Copy-Robust Documents (example)
A document is printed with redundancy. Photocopy noise is removed by the photocopier.

Connection to the General Point-to-Point Channel Setting
[Block diagram: encoder f, channel p(y|x), decoder g.]

Digital Watermark (example)
Media is modified to include extra information. A scoundrel may try to delete the watermark.
– Channel input (X) is the modified media
– Ŝ is the additional information (digital tag)

Define Empirical Coordination
[Block diagram: source S, encoder f, channel p(y|x), decoder g, reconstruction Ŝ.]
A joint distribution p(s, x, y, ŝ) is achievable if the empirical distribution of the sequences (S^n, X^n, Y^n, Ŝ^n),
P_n(s, x, y, ŝ) = (1/n) Σ_i 1{(S_i, X_i, Y_i, Ŝ_i) = (s, x, y, ŝ)},
converges to p(s, x, y, ŝ) in total variation (in probability) as the block length n grows.
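A minimal numerical sketch of this definition (the alphabets, target distribution, and function names below are illustrative choices of ours, not from the talk): compute the joint type of the four sequences and its total-variation distance to a target distribution.

```python
import random
from collections import Counter

def empirical_distribution(s, x, y, s_hat):
    """Joint type of four length-n sequences: the fraction of positions i
    where (s_i, x_i, y_i, s_hat_i) equals each tuple."""
    n = len(s)
    return {tup: c / n for tup, c in Counter(zip(s, x, y, s_hat)).items()}

def total_variation(p, q):
    """Total-variation distance between two pmfs given as dicts."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(t, 0.0) - q.get(t, 0.0)) for t in support)

# Toy check: draw (S_i, X_i, Y_i, S_hat_i) i.i.d. from a target p(s, x, y, s_hat);
# the empirical distribution should approach the target as n grows.
random.seed(0)
target = {(0, 0, 0, 0): 0.4, (0, 1, 1, 0): 0.1,
          (1, 1, 1, 1): 0.4, (1, 0, 0, 1): 0.1}
draws = random.choices(list(target), weights=list(target.values()), k=10000)
s, x, y, s_hat = zip(*draws)
print(total_variation(empirical_distribution(s, x, y, s_hat), target))  # small
```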

Separation Method
Compress the source to an index; send the index over the channel with a channel code.
This achieves product distributions of the form p(s, ŝ) p(x) p(y|x), i.e., (S, Ŝ) independent of (X, Y), such that I(S; Ŝ) ≤ I(X; Y).
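A small sketch of this condition (the toy distributions and helper names are ours): compute I(S; Ŝ) from p(s, ŝ) and I(X; Y) from the channel with a chosen input, then check I(S; Ŝ) ≤ I(X; Y).

```python
import numpy as np

def mutual_information(p_joint):
    """I(A; B) in bits for a joint pmf given as a 2-D array p_joint[a, b]."""
    p_joint = np.asarray(p_joint, dtype=float)
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / (pa * pb)[mask])))

# Toy example: Bern(1/2) source with a 0.15-crossover "test channel" to S_hat,
# and a uniform input to a BSC(0.1) as the physical channel.
p_s_shat = np.array([[0.425, 0.075],
                     [0.075, 0.425]])
eps = 0.1
p_xy = 0.5 * np.array([[1 - eps, eps],
                       [eps, 1 - eps]])

I_source = mutual_information(p_s_shat)   # about 1 - h(0.15) = 0.39 bits
I_channel = mutual_information(p_xy)      # about 1 - h(0.10) = 0.53 bits
print(I_source, I_channel, I_source <= I_channel)  # condition satisfied
```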

A Better Idea (Hybrid Codes)
[Diagram: hybrid encoding and decoding over the channel.]

Hybrid Codes
– Digital: the source is compressed and coded in blocks.
– Analog: the channel input and the reconstruction depend letter by letter on the source and the channel output.
– Hybrid codes take advantage of correlation in network settings (e.g., the interference channel) [Minero, Lim, Kim – Allerton 2010, ISIT 2011].
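A schematic sketch of the two layers (the codebook, the maps, and the assumption that the digital index is decoded correctly are illustrative placeholders, not the construction from the talk): a block-level index selects a codeword u^n, while the channel input and the reconstruction are formed letter by letter from (u_i, s_i) and (u_i, y_i).

```python
import random

def hybrid_encode(s_seq, codebook, choose_index, x_map):
    """Digital layer: choose one index m for the whole block (in the real scheme,
    by joint typicality between the codeword and s_seq).
    Analog layer: channel input formed letter by letter, x_i = x_map(u_i, s_i)."""
    m = choose_index(s_seq, codebook)
    u_seq = codebook[m]
    return m, [x_map(u, s) for u, s in zip(u_seq, s_seq)]

def hybrid_decode(y_seq, u_seq, shat_map):
    """Analog layer of the decoder: once the digital index (hence u_seq) has been
    recovered from y_seq, reconstruct letter by letter, s_hat_i = shat_map(u_i, y_i)."""
    return [shat_map(u, y) for u, y in zip(u_seq, y_seq)]

# Toy run over binary alphabets with a noiseless channel, assuming the digital
# index is recovered correctly (that is what the digital decoding condition
# on the next slide is there to guarantee).
random.seed(1)
n, num_codewords = 8, 4
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(num_codewords)]
s_seq = [random.randint(0, 1) for _ in range(n)]

closest = lambda s, cb: min(range(len(cb)),  # stand-in for typicality encoding
                            key=lambda m: sum(a != b for a, b in zip(cb[m], s)))
x_map = lambda u, s: u ^ s                   # analog encoding x(u, s)
shat_map = lambda u, y: u ^ y                # analog decoding s_hat(u, y)

m, x_seq = hybrid_encode(s_seq, codebook, closest, x_map)
y_seq = x_seq  # noiseless toy channel
print(hybrid_decode(y_seq, codebook[m], shat_map) == s_seq)  # True
```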

Achievable Inner Bound
The distribution p(s, x, y, ŝ) is achievable if there exists an auxiliary random variable U such that:
– Source: S ~ p(s)
– Channel: p(y|x)
– Markov chain: (U, S) - X - Y
– Function (analog encoding): X = x(U, S)
– Function (analog decoding): Ŝ = ŝ(U, Y)
– Digital decoding: I(U; S) ≤ I(U; Y)
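A numerical sketch of the conditions as written above (the binary components are arbitrary toy choices of ours): build the induced joint distribution p(s, u, x, y, ŝ) from p(s), p(u|s), the functions x(u, s) and ŝ(u, y), and the channel p(y|x), then check the digital decoding condition I(U; S) ≤ I(U; Y).

```python
import numpy as np
from itertools import product

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_info(p_ab):
    """I(A; B) from a 2-D joint pmf, via H(A) + H(B) - H(A, B)."""
    return entropy(p_ab.sum(axis=1)) + entropy(p_ab.sum(axis=0)) - entropy(p_ab)

# Toy binary components.
p_s = np.array([0.5, 0.5])                       # source p(s)
p_u_given_s = np.array([[0.9, 0.1],              # p(u|s), rows indexed by s
                        [0.1, 0.9]])
eps = 0.05
p_y_given_x = np.array([[1 - eps, eps],          # channel p(y|x): BSC(eps)
                        [eps, 1 - eps]])
x_of = lambda u, s: u                            # analog encoding x(u, s)
shat_of = lambda u, y: y                         # analog decoding s_hat(u, y)

# Induced joint distribution:
# p(s, u, x, y, s_hat) = p(s) p(u|s) 1{x = x(u,s)} p(y|x) 1{s_hat = s_hat(u,y)}.
joint = np.zeros((2, 2, 2, 2, 2))
for s, u, y in product(range(2), repeat=3):
    x = x_of(u, s)
    joint[s, u, x, y, shat_of(u, y)] += p_s[s] * p_u_given_s[s, u] * p_y_given_x[x, y]

I_US = mutual_info(joint.sum(axis=(2, 3, 4)))    # I(U; S), about 1 - h(0.10) = 0.53
I_UY = mutual_info(joint.sum(axis=(0, 2, 4)))    # I(U; Y), about 1 - h(0.05) = 0.71
print(I_US, I_UY, I_US <= I_UY)                  # digital decoding condition holds
```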

Binary Example
– Source is Bern(p)
– Binary symmetric channel with crossover probability ε
– Require the reconstruction to equal the channel input, i.e., X = Ŝ (systematic transmission)
– Minimize Hamming distortion
– If p = 0.5: D = ε (optimal)
– If p > 0 and ε > 0: D > 0 (suboptimal)
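A short supporting calculation for the "p = 0.5: D = ε is optimal" claim, using the standard source-channel converse (notation ours): the rate-distortion function of the Bern(1/2) source under Hamming distortion cannot exceed the capacity of the BSC(ε).

```latex
% Converse for the Bern(1/2) source over the BSC(eps), one channel use per source symbol:
% R(D) <= C, with R(D) = 1 - h(D) and C = 1 - h(eps), h the binary entropy function.
\[
  1 - h(D) \;\le\; 1 - h(\epsilon)
  \quad\Longrightarrow\quad
  h(D) \;\ge\; h(\epsilon)
  \quad\Longrightarrow\quad
  D \;\ge\; \epsilon \qquad (D, \epsilon \le \tfrac{1}{2}),
\]
% so the distortion D = eps attained by the scheme on this slide is optimal for p = 1/2.
```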

State Amplification
The channel state is known to the encoder. Two objectives:
– Transmit a message
– Help the decoder estimate the state
[Block diagram: encoder f, state-dependent channel p(y|x,s), decoder g.]
[Kim, Sutivong, Cover – '08] [Choudhuri, Kim, Mitra – '10, '11]
No loss of generality.

Causal Achievable Region
The distribution is achievable if and only if there exists an auxiliary random variable U satisfying conditions of the same form as the inner bound above:
– Source
– Channel
– Markov chain
– Function (analog encoding)
– Function (analog decoding)
– Digital decoding

Strictly-Causal Achievable Region
The distribution is achievable if and only if there exists an auxiliary random variable U satisfying the same types of conditions:
– Source
– Channel
– Markov chain
– Function (analog encoding)
– Function (analog decoding)
– Digital decoding