1 Network Source Coding Lee Center Workshop 2006 Wei-Hsin Gu (EE, with Prof. Effros)

2 Outline
- Introduction
- Previously Solved Problems
- Our Results
- Summary

3 Problem Formulation (1) General network:
- A directed graph G = (V, E).
- Source inputs and reproduction demands.
- All links are directed and error-free.
- Source sequences and distortion measures are given.

4 Problem Formulation (2) Rate Distortion Region
- Given a network.
- p : source pmf.
- D : distortion vector.
- A rate vector R is D-achievable if there exists a sequence of length-n codes of rate R whose reproductions satisfy the distortion constraints asymptotically.
- The closure of the set of all D-achievable rate vectors is called the rate distortion region, R(p, D).
- A rate vector is losslessly achievable if the error probability of the reproductions can be made arbitrarily small; the closure of the set of losslessly achievable rate vectors is the lossless rate region, R_L(p).
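A minimal formal restatement of the achievability definition in standard rate-distortion notation; the symbols R, D, p and the per-letter distortion measures d_k are our choices, since the slide's own symbols are not preserved in the transcript:

```latex
% D-achievability and the rate distortion region (assumed notation)
\[
R = (R_e)_{e \in E} \text{ is } D\text{-achievable if there exist length-}n
\text{ codes of rate at most } R_e + \epsilon_n \text{ on each link } e,
\]
\[
\text{with } \limsup_{n \to \infty}
\mathbb{E}\Bigl[\tfrac{1}{n} \textstyle\sum_{i=1}^{n}
d_k\bigl(X_{k,i}, \hat{X}_{k,i}\bigr)\Bigr] \le D_k
\quad \text{for every reproduction demand } k,
\]
\[
\mathcal{R}(p, D) = \operatorname{cl}\bigl\{ R : R \text{ is } D\text{-achievable} \bigr\},
\qquad
\mathcal{R}_L(p) = \operatorname{cl}\bigl\{ R : R \text{ is losslessly achievable} \bigr\}.
\]
```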

5 Example
[Figure: a small example network with one encoder and two decoders; black: sources, red: reproductions. The slide gives the exact condition under which a rate vector is achievable for this network.]

6 Outline
- Introduction
- Previously Solved Problems
- Our Results
- Summary

7 Known Results
- Point-to-point network (one encoder, one decoder; black: sources, red: reproductions). Lossless: Source Coding Theorem. Lossy: Rate-Distortion Theorem.
- Coded side information network (two encoders, one decoder). Lossless: solved [Ahlswede and Körner '75].
- Gray-Wyner network (Encoders 1, 2, 3; Decoders 1, 2). Lossy: [Gray and Wyner '74].
- Two separate encoders, one joint decoder. Lossless: Slepian-Wolf problem.
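As a concrete reference for the last of these, the Slepian-Wolf region: for finite-alphabet sources (X, Y) drawn i.i.d. from p, two separate encoders and one joint decoder achieve lossless reproduction exactly when the rate pair lies in (standard statement, not copied from the slide):

```latex
\[
\mathcal{R}_{\mathrm{SW}}(p) = \bigl\{ (R_1, R_2) :\;
R_1 \ge H(X \mid Y),\;
R_2 \ge H(Y \mid X),\;
R_1 + R_2 \ge H(X, Y) \bigr\}.
\]
```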

8 Outline
- Introduction
- Previously Solved Problems
- Our Results
  - Two Multi-hop Networks
  - Properties of Rate Distortion Regions
- Summary

9 Multi-Hop Network (1): Achievability Result
Source coding for the following multi-hop network (black: sources, red: reproductions): a three-node line network, Node 1 → Node 2 → Node 3, with an encoder and two decoders.

10 Multi-Hop Network (1): Converse Result
Source coding for the same multi-hop network (black: sources, red: reproductions): a three-node line network, Node 1 → Node 2 → Node 3, with an encoder and two decoders.

11 Multi-Hop Network (2): Diamond Network
[Figure: diamond network with Encoders 1, 2, 3 and Decoders 1, 2, 3.]

12 Diamond Network: Simpler Case
When the sources are independent:
- Lossless case: the achievable rates are optimal; the converse follows from Fano's inequality.
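For reference, the form of Fano's inequality typically used in such lossless converses (a standard fact, not quoted from the slide): if \hat{X}^n is the decoder's reproduction of X^n and P_e = \Pr(\hat{X}^n \ne X^n), then

```latex
\[
H(X^n \mid \hat{X}^n) \;\le\; 1 + n P_e \log |\mathcal{X}|,
\]
```

so driving P_e to zero forces (1/n) H(X^n | \hat{X}^n) to zero, which converts achievable code rates into entropy lower bounds.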

13 Diamond Network Achievability Result

14 Diamond Network Converse Result

15 Outline
- Introduction
- Previously Solved Problems
- Our Results
  - Two Multi-hop Networks
  - Properties of Rate Distortion Regions
- Summary

16 Properties of RD Regions
- R(p, D) : rate distortion region.
- R_L(p) : lossless rate region.
- R(p, D) is continuous in the distortion vector D for finite-alphabet sources.
- Conjecture: R(p, D) is continuous in the source pmf p.
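One standard way to make "continuity of a region" precise, anticipating the set-distance idea on the next slide (the metric d and the epsilon-delta phrasing are our reconstruction, not the slide's):

```latex
\[
d(\mathcal{A}, \mathcal{B}) = \max\Bigl\{
\sup_{a \in \mathcal{A}} \inf_{b \in \mathcal{B}} \|a - b\|,\;
\sup_{b \in \mathcal{B}} \inf_{a \in \mathcal{A}} \|a - b\| \Bigr\}
\quad \text{(Hausdorff distance between closed sets in } \mathbb{R}^{|E|}_{+}\text{)},
\]
\[
\mathcal{R}(p, \cdot) \text{ continuous at } D \;\Longleftrightarrow\;
\forall \epsilon > 0 \;\exists \delta > 0:\;
\|D' - D\| < \delta \;\Rightarrow\;
d\bigl(\mathcal{R}(p, D'), \mathcal{R}(p, D)\bigr) < \epsilon.
\]
```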

17 Continuity
- Continuity in the source pmf would allow small errors in estimating the source pmf.
- Trivial for point-to-point networks: the rate-distortion function is continuous in the probability distribution.
- Two convex subsets of the rate space are close if each lies within a small neighborhood of the other; continuity means that nearby pmfs (or distortion vectors) produce regions that are close in this sense.
- Continuity in the pmf has been proved for those networks whose one-letter characterizations have been found.
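A small numerical illustration of the point-to-point claim: the sketch below uses the Blahut-Arimoto iteration to trace (D, R) points of the rate-distortion curve for a binary source under Hamming distortion, and a small perturbation of the source pmf moves those points only slightly. This is our own illustration; the function names, the distortion choice, and the parameter values are assumptions, not material from the talk.

```python
import numpy as np

def blahut_arimoto_point(p, dist, s, n_iter=200):
    """One (D, R) point on the rate-distortion curve for source pmf p,
    distortion matrix dist, and Lagrange slope s > 0 (Blahut-Arimoto)."""
    m = dist.shape[1]
    q = np.full(m, 1.0 / m)                  # output marginal q(x_hat), start uniform
    for _ in range(n_iter):
        w = q * np.exp(-s * dist)            # unnormalized test channel q(x_hat | x)
        w /= w.sum(axis=1, keepdims=True)    # normalize each row over x_hat
        q = p @ w                            # re-estimate the output marginal
    D = float(np.sum(p[:, None] * w * dist))            # expected distortion
    R = float(np.sum(p[:, None] * w * np.log2(w / q)))  # mutual information (bits)
    return D, R

# Binary source under Hamming distortion; perturb the pmf slightly and
# observe that the computed (D, R) points move only slightly.
hamming = 1.0 - np.eye(2)
for p in (np.array([0.40, 0.60]), np.array([0.41, 0.59])):
    points = [blahut_arimoto_point(p, hamming, s) for s in (1.0, 2.0, 4.0)]
    print(p, [(round(D, 3), round(R, 3)) for D, R in points])
```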

18 Continuity - Example Slepian-Wolf problem
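The Slepian-Wolf example presumably rests on the fact that the region's defining bounds are entropies, which are continuous in the pmf on a finite alphabet; the restatement below is our reconstruction, since the slide's formulas are not preserved:

```latex
\[
\mathcal{R}_{\mathrm{SW}}(p) = \bigl\{ (R_1, R_2) :\;
R_1 \ge H_p(X \mid Y),\;
R_2 \ge H_p(Y \mid X),\;
R_1 + R_2 \ge H_p(X, Y) \bigr\},
\]
\[
p \mapsto H_p(X \mid Y),\quad p \mapsto H_p(Y \mid X),\quad p \mapsto H_p(X, Y)
\quad \text{are continuous on the probability simplex,}
\]
```

so a small perturbation of p shifts the three bounding half-planes, and hence the region, only slightly.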

19 Continuity - Example: Coded side information problem
Since the alphabet of the auxiliary random variable is bounded, the conditional entropy and mutual information terms that define the region are uniformly continuous in the joint pmf, uniformly over all choices of the auxiliary variable.
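For context, the standard single-letter characterization of the coded side information (Ahlswede-Körner) lossless region, which is the kind of auxiliary-variable expression the uniform-continuity argument above concerns (standard form, not copied from the slide; U is an auxiliary random variable whose alphabet can be taken finite and bounded):

```latex
\[
\mathcal{R}_{\mathrm{AK}}(p) = \bigcup_{U \,:\, U - Y - X}
\bigl\{ (R_1, R_2) :\; R_1 \ge H(X \mid U),\; R_2 \ge I(Y; U) \bigr\}.
\]
```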

20 Outline
- Introduction
- Previously Solved Problems
- Our Results
- Summary

21 Summary
- Studied the RD regions for two multi-hop networks.
  - Solved for independent sources.
  - Achievability and converse results are not yet known to be tight in general.
- Studied general properties of RD regions.
  - RD regions are continuous in the distortion vector for finite-alphabet sources.
  - Conjecture: RD regions are continuous in the pmf for finite-alphabet sources.