Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources – D. Krithivasan and S. Sandeep Pradhan – University of Michigan, Ann Arbor

Presentation transcript:

Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources – D. Krithivasan and S. Sandeep Pradhan – University of Michigan, Ann Arbor

Presentation Overview: Problem Formulation; Motivation; Nested Linear Codes; Main Result; Applications and Examples; Conclusions.

Problem Formulation: Distributed source coding. Typical application: sensor networks. Example: lossless reconstruction of all sources – the sum rate needed is the joint entropy of the sources (Slepian-Wolf).

Problem Formulation: We ask: what if the decoder is interested only in a function $F(X,Y)$ of the sources? In general, a fidelity criterion of the form $\frac{1}{n}\sum_{i=1}^{n}\mathbb{E}\,d(F(X_i,Y_i),\hat{Z}_i)\le D$. Example: the average of the sensor measurements. Obvious strategy: reconstruct the sources and then compute the function. Are rate gains possible if we directly encode the function in a distributed setting?

Motivation: A Binary Example. Korner and Marton: reconstruction of $Z = X \oplus Y$. Centralized encoder: compute $Z^n = X^n \oplus Y^n$, then compress it using a good source encoder at rate $H(X \oplus Y)$. Suppose the encoder mapping $\mathcal{E}$ satisfies $\mathcal{E}(X^n \oplus Y^n) = \mathcal{E}(X^n) \oplus \mathcal{E}(Y^n)$: the centralized scheme becomes a distributed scheme. Are there good source codes with this property? – Linear codes.
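To make the potential gain concrete, here is a small numeric check (my illustration, not from the slides): assume $X$ is Bernoulli(1/2) and $Y = X \oplus N$ with $N$ Bernoulli($p$), so $H(X,Y) = 1 + h(p)$ while the Korner-Marton sum rate is $2h(p)$.

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sum rates for X ~ Bernoulli(1/2), Y = X xor N, N ~ Bernoulli(p):
#   Slepian-Wolf (recover both sources):    H(X, Y) = 1 + h(p)
#   Korner-Marton (recover only X xor Y):   2 * H(X xor Y) = 2 * h(p)
for p in (0.01, 0.05, 0.1, 0.25):
    print(f"p={p:.2f}  KM sum rate={2 * h(p):.3f}  SW sum rate={1 + h(p):.3f}")
```

For small $p$ the Korner-Marton sum rate $2h(p)$ is far below $H(X,Y) = 1 + h(p)$, which is the point of the example.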

The Korner-Marton Coding Scheme. Pick a $k \times n$ binary matrix $A$ such that: the decoder can recover $z^n$ from $A z^n$ with high probability; entropy achieving: $k/n \approx H(X \oplus Y)$. Encoders transmit $A x^n$ and $A y^n$. Decoder: computes $A x^n \oplus A y^n = A(x^n \oplus y^n) = A z^n$ and recovers $z^n$ with high probability. The rate pair $(H(X \oplus Y), H(X \oplus Y))$ is achievable. This can be lower than the Slepian-Wolf bound on the sum rate, $H(X,Y)$. The scheme works for addition in any finite field.
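Here is a minimal end-to-end sketch of the scheme (my toy construction, not from the slides): $A$ is the parity-check matrix of the [7,4] Hamming code, so every syndrome has a unique coset leader of weight at most 1, and the decoder recovers $z^7$ from $A z^7$ whenever $z = x \oplus y$ has weight at most 1.

```python
import itertools
import random

A = [  # 3x7 parity-check matrix of the [7,4] Hamming code
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(v):
    return tuple(sum(a * b for a, b in zip(row, v)) % 2 for row in A)

# Coset-leader table: the minimum-weight vector for each of the 8 syndromes.
leader = {}
for wt in range(8):
    for idx in itertools.combinations(range(7), wt):
        v = [1 if i in idx else 0 for i in range(7)]
        leader.setdefault(syndrome(v), v)

p, trials, errors = 0.05, 10_000, 0
rng = random.Random(0)
for _ in range(trials):
    x = [rng.randint(0, 1) for _ in range(7)]
    z = [1 if rng.random() < p else 0 for _ in range(7)]  # z = x xor y
    y = [xi ^ zi for xi, zi in zip(x, z)]
    sx, sy = syndrome(x), syndrome(y)          # each encoder sends 3 bits
    sz = tuple(a ^ b for a, b in zip(sx, sy))  # = A z over GF(2)
    if leader[sz] != z:
        errors += 1
print(f"block error rate: {errors / trials:.4f}")  # ~= P(weight(z) >= 2)
```

Each encoder sends 3 bits per 7 source bits without seeing the other source; the decoder never learns $x$ or $y$, only their modulo-2 sum.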

Properties of the Linear Code. The matrix $A$ puts different typical $z^n$ in different bins. Consider $C = \{c : Ac = 0\}$ – a coset code: a good channel code for an additive channel with noise $Z$. Both encoders use identical codebooks – the binning is completely “correlated”; independent binning is more prevalent in information theory.
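A quick check of the binning picture, reusing the same $A$ (again my illustration): the 8 syndromes partition all 128 vectors of $\mathbb{F}_2^7$ into 8 cosets of $C = \{c : Ac = 0\}$, each of size 16.

```python
import itertools

A = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(v):
    return tuple(sum(a * b for a, b in zip(row, v)) % 2 for row in A)

bins = {}
for v in itertools.product((0, 1), repeat=7):
    bins.setdefault(syndrome(v), []).append(v)  # one bin (coset) per syndrome
print(len(bins), sorted(len(b) for b in bins.values()))  # 8 bins of 16 vectors
```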

Slepian-Wolf Coding. Function to be reconstructed: $F(X,Y) = (X,Y)$. Treat the binary sources as $\mathbb{Z}_4$ sources. The function is then equivalent to addition in $\mathbb{Z}_4$: $Z = 2X + Y$ determines $(X,Y)$. Encode the vector function one digit at a time: the first digit of $Z$ is $Y$, the second digit of $Z$ is $X$.

Slepian-Wolf Coding contd. Use the Korner-Marton coding scheme on each digit plane. This sequential strategy achieves the Slepian-Wolf bound. General lossless strategy: “embed” the function in a digit plane field (DPF) – a direct sum of Galois fields of prime order – and encode the digits sequentially using the Korner-Marton strategy.
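A sanity check of the digit-plane claim under the $\mathbb{Z}_4$ embedding sketched above (my reconstruction of the embedding): writing $Z = 2X + Y$ in $\mathbb{Z}_4$, the low digit plane of $Z$ is exactly $Y$ and the high digit plane is exactly $X$, so one Korner-Marton pass per plane recovers both sources.

```python
# Z = 2x + y never wraps in Z_4 (2x is 0 or 2, y is 0 or 1), so its binary
# digits are (lsb, msb) = (y, x): recovering the two digit planes of Z
# recovers (x, y), i.e. the identity function embeds into Z_4 addition.
for x in (0, 1):
    for y in (0, 1):
        z = (2 * x + y) % 4
        lsb, msb = z & 1, (z >> 1) & 1
        assert (lsb, msb) == (y, x)
        print(f"x={x} y={y} -> z={z}, digit planes (lsb, msb)=({lsb}, {msb})")
```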

Lossy Coding. Quantize $X$ to $U$ and $Y$ to $V$, where $\hat{Z} = G(U,V)$ is the best estimate of $Z$ w.r.t. the distortion measure given $(U,V)$. Use lossless coding to encode $G(U,V)$. What we need: nested linear codes.

Nested Linear Codes. The codes used in the KM and SW schemes are good channel codes: their cosets bin the entire space – suitable for lossless coding. Lossy coding: need to quantize first, i.e., decrease the coset density – nested linear codes. Fine code: quantizes the source. Coarse code: bins only the fine code.

Nested Linear Codes. A linear code pair $(C_1, C_2)$ is nested if $C_1 \subset C_2$. We need: $C_2$ a “good” source code – for a typical source sequence we can find a jointly typical codeword in $C_2$; $C_1$ a “good” channel code – the decoder can find the unique typical codeword in a given coset of $C_1$.
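A toy sketch of the resulting quantize-then-bin encoder (my construction; the code choices are arbitrary illustrations, not the paper's): the fine code $C_2$ is the [7,4] Hamming code used as a minimum-Hamming-distance quantizer, and the coarse code $C_1$ is the subcode spanned by two of its generator rows, so the encoder transmits only the 2-bit index of the quantized word's coset of $C_1$ within $C_2$.

```python
import itertools

G = [  # systematic generator of the [7,4] Hamming code: the fine code C2
    (1, 0, 0, 0, 1, 1, 0),
    (0, 1, 0, 0, 1, 0, 1),
    (0, 0, 1, 0, 0, 1, 1),
    (0, 0, 0, 1, 1, 1, 1),
]

def span(rows):
    """All GF(2) linear combinations of the given rows."""
    words = set()
    for bits in itertools.product((0, 1), repeat=len(rows)):
        words.add(tuple(sum(b * r[i] for b, r in zip(bits, rows)) % 2
                        for i in range(7)))
    return words

fine = span(G)        # 16 codewords: the quantizer
coarse = span(G[:2])  # 4 codewords: bins only the fine code, C1 subset of C2

# Enumerate the 4 cosets of C1 inside C2.
cosets, seen = [], set()
for c in sorted(fine):
    if c not in seen:
        coset = {tuple(a ^ b for a, b in zip(c, w)) for w in coarse}
        cosets.append(coset)
        seen |= coset

def encode(x):
    # Quantize to the nearest fine codeword, then send only its coset index.
    u = min(fine, key=lambda c: sum(a ^ b for a, b in zip(c, x)))
    return next(i for i, cs in enumerate(cosets) if u in cs)  # 2 bits / 7 samples

print(encode((1, 0, 1, 1, 0, 0, 1)))
```

The decoder's job (not shown) is to pick the right codeword inside the signalled coset using its side information, which is exactly where the “good channel code” requirement on $C_1$ enters.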

Good Linear Source Codes. A good linear source code for a triple (source, reconstruction, distortion measure): assume the reconstruction alphabet has size $q$ for some prime $q$. Such a code exists for large block length $n$ provided its rate is sufficiently high. It is not a good source code in the Shannon sense, but it contains a subset that is a good Shannon source code. Linearity incurs a rate loss, measured in bits/sample.

Good Linear Channel Codes. A good linear channel code for a triple (input, output, channel): assume the alphabet size is $q$ for some prime $q$. Such a code exists for large block length $n$ provided its rate is sufficiently low. It is not a good channel code in the Shannon sense, but every coset contains a subset which is a good channel code. Linearity incurs a rate loss, measured in bits/sample.

Main Result. Fix a test channel $P(u,v \mid x,y)$ such that $U$ and $V$ determine the reconstruction $\hat{Z} = G(U,V)$. Embed $G(U,V)$ in a digit plane field. Need to encode the digits of $G(U,V)$. Fix an order of encoding of the digit planes. Idea: encode one digit at a time; at the $b$-th stage, use the previously reconstructed digits as side information.
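Why sequential digit-by-digit encoding with side information loses nothing in the lossless case comes down to the chain rule for entropy (my gloss, not a line from the slides):

```latex
H(Z) \;=\; \sum_{b=1}^{B} H\!\left(Z_b \,\middle|\, Z_1, \dots, Z_{b-1}\right)
```

so encoding digit plane $Z_b$ at its conditional rate, with the previously decoded planes $Z_1,\dots,Z_{b-1}$ serving as decoder side information, sums to the same total rate as encoding $Z$ in one shot.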

Coding Strategy. For each digit plane: nested linear codes built from good source codes (fine codes) and a good channel code (coarse code).

Cardinalities of the Linear Code. The cardinalities of the nested codes determine the rate of each encoder: the rate is set by the ratio of the fine and coarse code sizes. Conventional coding gives a corresponding rate for comparison.
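For reference, the standard nested-code rate accounting (a general fact about nested codes, not the slide's exact expression): an encoder that signals the coset of the coarse code containing the chosen fine codeword spends

```latex
R \;=\; \frac{1}{n}\,\log_2 \frac{|C_{\text{fine}}|}{|C_{\text{coarse}}|} \quad \text{bits per sample.}
```

In the toy sketch above this is $\frac{1}{7}\log_2\frac{16}{4} = 2/7$ bits per sample.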

Coding Theorem. An achievable rate region for the distributed reconstruction of the function, together with a corollary that is used in the examples below.

Nested Linear Codes Achieve the Rate-Distortion Bound. Choose one source as constant; it follows that the point-to-point rate-distortion function is achievable for any distortion level. Can also recover: the Berger-Tung inner bound; the Wyner-Ziv rate region; Wyner's source coding with side information; the Slepian-Wolf and Korner-Marton rate regions.

Lossy Coding of $Z = X \oplus Y$. Fix test channels $U = X \oplus Q_1$ and $V = Y \oplus Q_2$ with $Q_1, Q_2$ independent binary random variables. Reconstruct $\hat{Z} = U \oplus V$. Using the corollary to the rate region, a corresponding rate pair can be achieved. More rate points can be achieved by choosing more general test channels and by embedding the function in a larger alphabet.
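One concrete consequence of this choice of test channels (my arithmetic, under the reconstruction $\hat{Z} = U \oplus V$ assumed above): the estimation error is $\hat{Z} \oplus Z = Q_1 \oplus Q_2$, so the Hamming distortion is the binary convolution of the two test-channel parameters.

```python
def conv(a, b):
    """Binary convolution: P(Q1 xor Q2 = 1) for independent Bernoulli a, b."""
    return a * (1 - b) + b * (1 - a)

# Hamming distortion D = P(Zhat != Z) for symmetric test channels q1 = q2 = q.
for q in (0.01, 0.05, 0.10):
    print(f"q1 = q2 = {q:.2f}:  D = {conv(q, q):.4f}")
```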

Conclusions. Presented a unified approach to distributed source coding, based on nested linear codes. Coding: quantization followed by “correlated” binning. Recovers the known rate regions for many problems and presents new rate regions for others.