Universal Linked Multiple Access Source Codes, by Sidharth Jaggi and Prof. Michelle Effros.

Presentation transcript:

Universal Linked Multiple Access Source Codes
Sidharth Jaggi, Prof. Michelle Effros

Models for Source Coding
- Introduction
- Source Codes (SCs)

Models for Source Coding
- Introduction
- Source Codes (SCs)
- Universal SCs: memoryless distributions on …

Models for Source Coding
- Introduction
- Source Codes (SCs)
- Multiple Access SCs (Slepian-Wolf): memoryless distributions on …
[Figure: encoders Xavier (X) and Yvonne (Y), decoder Zorba; Z; rate-region axes R_X(Q), R_Y(Q)]

Models for Source Coding
- Source Codes (SCs)
- Multiple Access SCs
[Figure: X, Y, Z; rate-region axes R_X(Q), R_Y(Q)]

Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
[Figure: X, Y, Z; rate R; rate-region axes R_X(Q), R_Y(Q)]

Slepian-Wolf Rate Region
[Figure: rate region with axes R_X(Q) and R_Y(Q)]
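For reference, the Slepian-Wolf rate region plotted against the axes R_X(Q) and R_Y(Q) above is the standard set of rate pairs

\[
\mathcal{R}(Q) = \bigl\{ (R_X, R_Y) : R_X \ge H(X \mid Y),\ R_Y \ge H(Y \mid X),\ R_X + R_Y \ge H(X, Y) \bigr\},
\]

with all entropies computed under the joint pmf Q; the corner points are (H(X|Y), H(Y)) and (H(X), H(Y|X)).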

Models for Source Coding
- Source Codes (SCs)
- Multiple Access SCs
- Universal MASCs? memoryless distributions on …

Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Universal MASCs? memoryless distributions on …

Universal MASCs? Let …

Universal MASCs?

Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Missing Link: Linked MASCs

Linked MASC (LMASC) Model
[Figure: encoders Xavier (X) and Yvonne (Y), decoder Zorba; Z]

(0,0)-LMASCs
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

(0,0)-LMASC Rate Region
- (0,0)-LMASC rate region = Slepian-Wolf rate region
[Figure: rate region with axes R_X and R_Y]

- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Linked MASCs
- Universal LMASCs?

Universal (0,0)-LMASCs Code
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

Results for (0,0)-LMASCs
- If … then …
- Example: …
- Tradeoffs

… LMASCs
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

… LMASCs
- Achievable Region: …
- Universal Coding Possible
[Figure: X, Y, Z]

l-encoder LMASCs
- l-encoder LMASCs = …-encoder MASC
- Universal coding possible
[Figure: encoders Xavier, Yvonne, Algernon; decoder Zorba; X, Y, Z, A]

Feedback MASCs
- (0,0)-FMASCs = (0,0)-LMASCs
- (…,…)-FMASCs = (0,0)-LMASCs
- Universal coding possible
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

Proof Sketch - Universal LMASCs
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

Proof Sketch - Universal LMASCs
- Let … be the type of …
- Tell Zorba the value of … in … bits.
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

Proof Sketch - Universal LMASCs
- Let … be the type of …
[Figure: Xavier, Yvonne, Zorba; X, Y, Z]

Proof Sketch - Universal LMASCs

What could possibly go wrong?
- Estimate "far off"
- Probability of Error
- Rate Redundancy

What could possibly go wrong?
- Probability of Error: atypicality; code fails
- Redundancy: source mismatch

What could possibly go wrong?
- Probability of Error
- Redundancy

Conclusions
- MASC
- (0,0)-LMASC Universality
- …-LMASC Universality
[Figure: system diagrams with X, Y, Z]

Conclusions
- (0,0)-FMASC
- l-encoder …-LMASC Universality
- …-FMASC
[Figure: complicated diagrams with X, Y, Z]

The bottom line is… It WORKS!

Universal SCs
Let … be a class of memoryless distributions on ….
- Pre-designed codes ("Guess"): a code C such that … E.g.: Csiszár and Körner.
- Adaptive codes ("Estimate"): a code C such that … E.g.: Lempel-Ziv.
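A standard way to state the universality requirement behind both "such that" clauses (written here from memory rather than taken from the slides) is that the per-symbol redundancy vanishes for every distribution in the class:

\[
\lim_{n \to \infty} \left( \tfrac{1}{n} \, \mathbb{E}_Q \bigl[\ell(C(X^n))\bigr] - H(Q) \right) = 0 \quad \text{for every } Q \in \Lambda,
\]

where ℓ(·) denotes description length in bits and Λ is the class of memoryless distributions above; pre-designed codes achieve this with a single fixed code, adaptive codes by estimating Q on the fly.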

Multiple Sources: Individual Encoding
- Xavier and Yvonne encode using individually optimal strategies
[Figure: Xavier, Yvonne; X, Y, Z; individual-encoding rate region in R^2, marked with H(X) and H(Y)]

Multiple Sources: Joint Encoding
- Xavier and Yvonne encode together
[Figure: Xavier, Yvonne; X, Y, Z; joint-encoding rate region in R^2, marked with H(X,Y)]
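Since H(X,Y) ≤ H(X) + H(Y), with equality exactly when X and Y are independent, joint encoding never needs more total rate than individual encoding:

\[
\underbrace{R_X + R_Y \ge H(X, Y)}_{\text{joint encoding}} \qquad \text{vs.} \qquad \underbrace{R_X \ge H(X),\ R_Y \ge H(Y)}_{\text{individual encoding}} .
\]

The Slepian-Wolf result quoted earlier says that the joint-encoding sum rate H(X,Y) remains achievable even when Xavier and Yvonne must encode separately.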

Universal (0,0)-LMASCs “Guess-timate”… Let be the type of Tell Zorba value of in bits. Y X Z Xavier Yvonne Zorba

Universal (0,0)-LMASCs
- Choosing the following rates works: … (parameters of code: …)
- Choose a pre-designed Slepian-Wolf code matched to pmf ….

Sketch of Proof
By Sanov's Theorem, the probability of the estimated type being "far off" from the true pmf is "small".
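The form of Sanov's Theorem being used (a standard statement, written in the notation of the later slides) is: for m i.i.d. pairs drawn from a joint pmf P and any set S of joint pmfs,

\[
\Pr\bigl( \hat{P}_m \in S \bigr) \le (m+1)^{|X||Y|} \, 2^{-m \, D(P^* \Vert P)}, \qquad D(P^* \Vert P) = \min_{P' \in S} D(P' \Vert P),
\]

where \hat{P}_m is the empirical type of the m observed pairs; the polynomial prefactor is negligible on the exponential scale.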

Sketch of Proof
Assume …. Then ….

Sketch of Proof - Probability of Error
If …

Sketch of Proof - Probability of Error
… or if …

Sketch of Proof - Probability of Error
… or if the code fails for …

Sketch of Proof - Expected Rate Overhead
Rate overhead in code design for pmf …

Sketch of Proof - Expected Rate Overhead
… and source mismatch. If …

Sketch of Proof - Expected Rate Overhead
… and source mismatch. … and if …

Results
- Inter-encoder communication
- Probability of error
- Expected rate overhead

Other System Models
- … LMASC rate region: "transfer of rate"
[Figure: LMASC rate region with axes R_X(Q), R_Y(Q)]

Main Result
For any m(n) satisfying (1), and i.i.d. sources X and Y, there exists a sequence of encoders and a decoder (f_n, g_n, h_n) such that:
- E(r(X^n, Y^n)) differs from the boundary of the Slepian-Wolf region by at most -3|X||Y| ε(n) log ε(n) + n^{-1} m(n), for any ε(n) > m^{-1/2}(n).
- E(Prob(error)) = 2^{-o(m(n) ε^2(n))}.
Further, the rate region for UMASC under the above constraints is identical to that of Slepian-Wolf encoding for the same source.

Sketch of Proof
- Estimate of p(x,y): p'(x,y) = m^{-1}(n) Σ_{i=1}^{m(n)} 1(x_i = x, y_i = y).
- Define max_{(x,y)} |p(x,y) - p'(x,y)| = Δ_0.
- By Sanov's theorem, Pr(Δ_0 > ε) = 2^{-o(m(n) D(P*||P))}, where D(P*||P) = min_{P' ∈ S} D(P'||P) and S = {p'(x,y) : max_{(x,y)} |p(x,y) - p'(x,y)| > ε}.
[Figure: the set S with P outside it; P* is the element of S closest to P in divergence, at distance D(P*||P)]
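A minimal numerical sketch of the estimation step above, assuming small finite alphabets; the joint pmf, alphabet sizes, and sample size m are illustrative stand-ins, not values from the talk:

```python
import numpy as np

def empirical_joint_type(x, y, k_x, k_y):
    """Empirical joint type p'(a,b) = (1/m) * sum_i 1(x_i = a, y_i = b)."""
    m = len(x)
    p_hat = np.zeros((k_x, k_y))
    for a, b in zip(x, y):
        p_hat[a, b] += 1.0 / m
    return p_hat

# Hypothetical joint pmf on a 2 x 2 alphabet (illustrative values only).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

rng = np.random.default_rng(0)
m = 2000                                         # plays the role of m(n) in the slides
idx = rng.choice(p.size, size=m, p=p.ravel())    # draw (x_i, y_i) pairs i.i.d. from p
x, y = np.unravel_index(idx, p.shape)

p_hat = empirical_joint_type(x, y, *p.shape)
delta_0 = np.max(np.abs(p - p_hat))              # the L-infinity deviation Delta_0 above
print("estimated type:\n", p_hat)
print("max deviation Delta_0 =", delta_0)
```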

Lemma 1
D(P*||P) = d(p(x,y) + ε || p(x,y)) for some particular (x,y) (Lagrange optimization)
         = Θ(ε^2) for sufficiently small ε.
[Figure: D(P*||P) versus ε, lower-bounded by c_2 ε^2]
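The Θ(ε^2) step is the usual second-order expansion of the binary divergence d(· || ·); a worked version of that step (a standard fact, not copied from the talk):

\[
d(p + \epsilon \,\Vert\, p) = (p+\epsilon)\log\frac{p+\epsilon}{p} + (1-p-\epsilon)\log\frac{1-p-\epsilon}{1-p} = \frac{\epsilon^2}{2\,p(1-p)\ln 2} + O(\epsilon^3),
\]

so for small enough ε there is a constant c_2, depending only on P, with D(P*||P) ≥ c_2 ε^2, matching the bound sketched in the figure.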

Lemma 2
1. max_{(x,y)} |p(x,y) - p'(x,y)| < ε(n)  ⟹  |H_P(X,Y) - H_{P'}(X,Y)| < -|X||Y| ε(n) log(ε(n)).
2. |H_P(X,Y) - H_{P'}(X,Y)| > ε  ⟹  max_{(x,y)} |p(x,y) - p'(x,y)| > ε (|X||Y|)^{-1}.
[Figure: entropy H_P(X,Y) as a function of the pmf {p(x,y)}]

Choice of R(X^n, Y^{m(n)}), R(X^{m(n)}, Y^n)
1. Estimate p'(x,y).
2. Choose m(n) and ε(n) satisfying the Theorem statement.
3. Find p''(x,y) such that
   a. max_{(x,y)} |p'(x,y) - p''(x,y)| = ε(n), and
   b. p''(x,y) = argmax H_{P''}(X,Y) subject to the above.
4. Encode using a Slepian-Wolf-like codebook for p''(x,y) (see the sketch below).
[Figure: probability vs. entropy; pmfs {p(x,y)}, {p'(x,y)}, {p''(x,y)} with entropies H_P(X,Y), H_{P'}(X,Y), H_{P''}(X,Y); typical sets A^(n), A'^(n)]
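A toy sketch of steps 1-4, with two simplifications flagged explicitly: the entropy argmax in step 3 is replaced by a crude perturbation toward the uniform pmf (which only approximately respects the ε(n) constraint), and the Slepian-Wolf-like codebook of step 4 is replaced by simply reporting the corner rates such a codebook would be designed for. All numerical values are illustrative.

```python
import numpy as np

def entropies(p):
    """Return H(X,Y), H(X|Y), H(Y|X) in bits for a joint pmf matrix p."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    h_xy = -np.sum(nz * np.log2(nz))
    h_x = -np.sum(p.sum(axis=1) * np.log2(p.sum(axis=1)))
    h_y = -np.sum(p.sum(axis=0) * np.log2(p.sum(axis=0)))
    return h_xy, h_xy - h_y, h_xy - h_x   # H(X,Y), H(X|Y), H(Y|X)

def perturb_toward_uniform(p_hat, eps):
    """Crude stand-in for step 3: move each entry at most eps toward uniform,
    then renormalize.  (The talk asks for the entropy-maximizing pmf within
    L-infinity distance eps of the estimate.)"""
    uniform = np.full_like(p_hat, 1.0 / p_hat.size)
    step = np.clip(uniform - p_hat, -eps, eps)
    p2 = p_hat + step
    return p2 / p2.sum()

# Step 1: estimated type p'(x,y) from the linked samples (illustrative values).
p_prime = np.array([[0.38, 0.12],
                    [0.11, 0.39]])

# Step 2: slack eps(n) chosen to satisfy the theorem's conditions (illustrative).
eps_n = 0.02

# Step 3: design pmf p''(x,y) near p'.
p_dprime = perturb_toward_uniform(p_prime, eps_n)

# Step 4: rates a Slepian-Wolf codebook matched to p'' would target.
h_xy, h_x_given_y, h_y_given_x = entropies(p_dprime)
print("design rates: R_X >= %.3f, R_Y >= %.3f, R_X + R_Y >= %.3f"
      % (h_x_given_y, h_y_given_x, h_xy))
```

As the surrounding slides indicate, the point of designing for p'' rather than p' is that p'' has slightly larger entropy, which buys robustness against the estimation error allowed by ε(n).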

Excess Rate over Slepian-Wolf Encoding
1(a). With high probability, max_{(x,y)} |p(x,y) - p''(x,y)| < 2ε(n); contribution to excess rate at most -2|X||Y| ε(n) log(ε(n)).
1(b). If 1(a) is not satisfied, contribution to expected excess rate at most 2^{-o(m(n) ε^2(n))} log(|X||Y|); absorb into 1(a).
2. Rate communicated to Zorba to inform him of the choice of codebook = n^{-1} m(n).

Probability of Error
1. Probability of a catastrophically incorrect p'(x,y): at most exp(-O(m(n) ε^2(n))).
2. Probability of an atypical (x^n, y^n): at most exp(-O(n ε^2(n))).
3. Probability of distinct typical elements decoding to the same codewords: at most exp(-O(-n ε(n) log ε(n))).
Term 1 dominates terms 2 and 3.

THE END