Slide 1: Universal Linked Multiple Access Source Codes
Sidharth Jaggi, Prof. Michelle Effros
Slide 2: Introduction - Models for Source Coding
- Source Codes (SCs)
[Figure: a long binary source sequence and its compressed representation]
Slide 3: Introduction - Models for Source Coding
- Source Codes (SCs)
- Universal SCs: a class of memoryless distributions on the source alphabet
[Figure: a binary source sequence compressed at rate R]
Slide 4: Introduction - Models for Source Coding
- Source Codes (SCs)
- Multiple Access SCs (Slepian-Wolf): memoryless distributions on the pair (X, Y)
[Diagram: Xavier encodes X, Yvonne encodes Y, Zorba jointly decodes; rate axes R_X(Q), R_Y(Q)]
Slide 5: Models for Source Coding
- Source Codes (SCs)
- Multiple Access SCs
[Diagram: sources X and Y, joint decoder; rate axes R_X(Q), R_Y(Q)]
Slide 6: Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
[Diagram: sources X and Y compressed at rate R; rate axes R_X(Q), R_Y(Q)]
Slide 7: Slepian-Wolf Rate Region
[Plot: the Slepian-Wolf rate region in the (R_X(Q), R_Y(Q)) plane]
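For a joint pmf Q, the Slepian-Wolf region is R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X,Y). A minimal sketch, not from the talk, that tests membership of a rate pair for an assumed example pmf:

```python
import numpy as np

def entropies(p_xy):
    """Return H(X), H(Y), H(X,Y) in bits for a joint pmf given as a 2-D array."""
    h = lambda p: float(-np.sum(p[p > 0] * np.log2(p[p > 0])))
    return h(p_xy.sum(axis=1)), h(p_xy.sum(axis=0)), h(p_xy)

def in_slepian_wolf_region(rx, ry, p_xy):
    """Check R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X,Y)."""
    hx, hy, hxy = entropies(p_xy)
    return rx >= hxy - hy and ry >= hxy - hx and rx + ry >= hxy

# Doubly symmetric binary source with crossover probability 0.1 (assumed example).
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
print(in_slepian_wolf_region(0.8, 0.8, p))  # True: each rate > H(X|Y) ~ 0.47, sum > H(X,Y) ~ 1.47
print(in_slepian_wolf_region(0.5, 0.5, p))  # False: sum rate 1.0 falls below H(X,Y) ~ 1.47
```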
Slide 8: Models for Source Coding
- Source Codes (SCs)
- Multiple Access SCs
- Universal MASCs? (for a class of memoryless distributions on the pair (X, Y))
Slide 9: Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Universal MASCs? (for a class of memoryless distributions on the pair (X, Y))
Slide 10: Universal MASCs?
Let …
Slide 11: Universal MASCs?
Slide 12: Models for Source Coding
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Missing link: Linked MASCs
Slide 13: Linked MASC (LMASC) Model
[Diagram: Xavier encodes X, Yvonne encodes Y, with a link between the encoders; Zorba jointly decodes]
Slide 14: (0,0)-LMASCs
[Diagram: Xavier encodes X, Yvonne encodes Y, linked encoders; Zorba jointly decodes]
Slide 15: (0,0)-LMASC Rate Region
- (0,0)-LMASC rate region = Slepian-Wolf rate region
[Plot: the rate region in the (R_X, R_Y) plane]
Slide 16
- Source Codes (SCs)
- Universal SCs
- Multiple Access SCs
- Linked MASCs
- Universal LMASCs?
Slide 17: Universal (0,0)-LMASC Code
[Diagram: Xavier encodes X, Yvonne encodes Y, linked encoders; Zorba jointly decodes]
Slide 18: Universal (0,0)-LMASC Code
[Diagram: as on the previous slide]
Slide 19: Results for (0,0)-LMASCs
- If … then … Example: …
- Tradeoffs
Slide 20: LMASCs
[Diagram: Xavier encodes X, Yvonne encodes Y, linked encoders; Zorba jointly decodes]
Slide 21: LMASCs
- Achievable region
- Universal coding possible
[Diagram: sources X and Y, joint decoder]
Slide 22: LMASCs
- Achievable region
- Universal coding possible
[Diagram: sources X and Y, joint decoder]
Slide 23: l-encoder LMASCs
- l-encoder LMASCs = l-encoder MASCs
- l-encoder LMASCs: universal coding possible
[Diagram: encoders Xavier, Yvonne, and Algernon (source A); decoder Zorba]
Slide 24: Feedback MASCs
- (0,0)-FMASCs = (0,0)-LMASCs
- (·,·)-FMASCs = (0,0)-LMASCs
- Universal coding possible
[Diagram: Xavier encodes X, Yvonne encodes Y, Zorba decodes]
Slide 25: Proof Sketch - Universal LMASCs
[Diagram: Xavier encodes X, Yvonne encodes Y, linked encoders; Zorba jointly decodes]
Slide 26: Proof Sketch - Universal LMASCs
Let p'(x,y) be the type of the shared samples (X^{m(n)}, Y^{m(n)}); tell Zorba the value of p'(x,y) in … bits.
[Diagram: Xavier, Yvonne, Zorba]
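By the standard method-of-types count (a general bound, not a figure from the talk), there are at most (m(n)+1)^{|X||Y|} joint types of an m(n)-sample pair sequence, so roughly |X||Y| log2(m(n)+1) bits suffice to describe p'(x,y) to Zorba:

```python
import math

def bits_to_describe_type(m, ax, ay):
    """Upper bound on the bits needed to tell Zorba the joint type of an
    m-sample pair sequence over alphabets of sizes ax and ay: there are at
    most (m+1)**(ax*ay) joint types, so ceil(ax*ay*log2(m+1)) bits suffice."""
    return math.ceil(ax * ay * math.log2(m + 1))

print(bits_to_describe_type(m=1000, ax=2, ay=2))  # 40 bits for binary X and Y
```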
Slide 27: Proof Sketch - Universal LMASCs
Let p'(x,y) be the type of (X^{m(n)}, Y^{m(n)}).
[Diagram: Xavier, Yvonne, Zorba]
Slide 28: Proof Sketch - Universal LMASCs
Slide 29: What could possibly go wrong?
- Estimate "far off": affects both probability of error and rate redundancy
Slide 30: What could possibly go wrong?
- Probability of error: atypicality, code fails
- Redundancy: source mismatch
Slide 31: What could possibly go wrong?
- Probability of error
- Redundancy
Slide 32: Conclusions
- MASC
- (0,0)-LMASC: universality
- …-LMASC: universality
[Diagrams: MASC and LMASC configurations for sources X and Y]
Slide 33: Conclusions
- (0,0)-FMASC
- l-encoder LMASC: universality
- …-FMASC
[Diagrams: "Complicated diagrams" of the feedback and multi-encoder configurations]
Slide 34: The bottom line is… It WORKS!
Slide 35: Universal SCs
Let … be a class of memoryless distributions on the source alphabet.
- Pre-designed codes ("Guess"): a code C such that … (e.g., Csiszár and Körner)
- Adaptive codes ("Estimate"): a code C such that … (e.g., Lempel-Ziv)
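As a concrete instance of the adaptive ("Estimate") approach in the Lempel-Ziv family, here is a minimal LZ78-style phrase parser; it is a standard textbook illustration, not code from the talk:

```python
def lz78_parse(s):
    """Minimal LZ78-style parse: split s into phrases, each equal to a
    previously seen phrase plus one new symbol. The number of phrases
    (times the log of the dictionary size) governs the code length."""
    dictionary = {"": 0}          # phrase -> index
    phrases = []                  # (index of longest known prefix, new symbol)
    w = ""
    for c in s:
        if w + c in dictionary:
            w += c                # extend the current phrase
        else:
            phrases.append((dictionary[w], c))
            dictionary[w + c] = len(dictionary)
            w = ""
    if w:                         # flush the final (already-known) phrase
        phrases.append((dictionary[w], ""))
    return phrases

# Example: parse a binary string; fewer phrases per symbol means better compression.
print(lz78_parse("1011010100010"))
```

The dictionary is built from the data itself, so the code adapts to the unknown source statistics, in contrast with the pre-designed ("Guess") codes above.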
Slide 36: Multiple Sources - Individual Encoding
- Xavier and Yvonne encode using individually optimal strategies
[Plot: individual-encoding rate region in R^2, with corner point (H(X), H(Y))]
Slide 37: Multiple Sources - Joint Encoding
- Xavier and Yvonne encode together
[Plot: joint-encoding rate region in R^2, bounded by the sum-rate line R_X + R_Y = H(X,Y)]
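To make the comparison with individual encoding concrete, a small sketch (assumed example pmf, not from the slides) computing the two sum rates; joint encoding needs only H(X,Y) bits per symbol pair, versus H(X) + H(Y) for separate encoding:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Correlated binary pair: X ~ Bernoulli(1/2), Y = X with probability 0.9.
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
hx, hy, hxy = H(p_xy.sum(axis=1)), H(p_xy.sum(axis=0)), H(p_xy)
print(f"individual encoding: {hx + hy:.3f} bits/pair")  # 2.000
print(f"joint encoding:      {hxy:.3f} bits/pair")      # ~1.469
```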
Slide 38: Universal (0,0)-LMASCs - "Guess-timate"…
Let p'(x,y) be the type of (X^{m(n)}, Y^{m(n)}); tell Zorba the value of p'(x,y) in … bits.
[Diagram: Xavier, Yvonne, Zorba]
Slide 39: Universal (0,0)-LMASCs
Choosing the following rates works … (parameters of the code). Choose a pre-designed Slepian-Wolf code matched to the estimated pmf.
Slide 40: Sketch of Proof
By Sanov's theorem, the probability of the estimate being "far off" from the true pmf is "small".
Slide 41: Sketch of Proof
Assume … Then …
Slide 42: Sketch of Proof - Probability of Error
If …
Slide 43: Sketch of Proof - Probability of Error
… or if …
Slide 44: Sketch of Proof - Probability of Error
… or if the code fails for …
Slide 45: Sketch of Proof - Probability of Error
… or if the code fails for …
Slide 46: Sketch of Proof - Expected Rate Overhead
Rate overhead in code design for pmf …
Slide 47: Sketch of Proof - Expected Rate Overhead
… and source mismatch. If …
Slide 48: Sketch of Proof - Expected Rate Overhead
… and source mismatch. If …
Slide 49: Sketch of Proof - Expected Rate Overhead
… and source mismatch. … and if …
Slide 50: Results
- Inter-encoder communication
- Probability of error
- Expected rate overhead
Slide 51: Other System Models
- LMASC rate region: "transfer of rate"
[Plot: LMASC rate region in the (R_X(Q), R_Y(Q)) plane]
Slide 52: Main Result
For any $m(n)$ satisfying (1) and i.i.d. sources $X$ and $Y$, there exists a sequence of encoders and decoders $(f_n, g_n, h_n)$ such that:
- $E(r(X^n, Y^n))$ differs from the boundary of the Slepian-Wolf region by at most $-3|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n) + n^{-1}m(n)$, for any $\epsilon(n) > m^{-1/2}(n)$.
- $E(\mathrm{Prob}(\mathrm{error})) = 2^{-o(m(n)\,\epsilon^2(n))}$.
Further, the rate region for universal LMASCs under the above constraints is identical to that of Slepian-Wolf encoding for the same sources.
Slide 53: Sketch of Proof
Estimate of $p(x,y)$: $p'(x,y) = m^{-1}(n)\sum_{i=1}^{m(n)} \mathbf{1}(x_i = x,\; y_i = y)$.
Define $\delta_0 = \max_{(x,y)} |p(x,y) - p'(x,y)|$.
By Sanov's theorem, $\Pr(\delta_0 > \delta) = 2^{-o(m(n)\,D(P^*\|P))}$, where $D(P^*\|P) = \min_{P' \in S} D(P'\|P)$ and $S = \{p'(x,y) : \max_{(x,y)} |p(x,y) - p'(x,y)| > \delta\}$.
[Figure: the probability simplex with $P$, the set $S$, and the closest element $P^*$ at divergence $D(P^*\|P)$]
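A small Monte Carlo sketch of this estimate (assumed example pmf and parameters, not from the talk): draw m i.i.d. pairs, form the joint type p', and measure how often the maximum deviation exceeds a threshold; the frequency decays exponentially in m, as Sanov's theorem predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])            # true joint pmf p(x,y)

def prob_far_off(m, delta, trials=2000):
    """Monte Carlo estimate of Pr(delta_0 > delta), where delta_0 is the
    max deviation between p and the joint type p' of m i.i.d. pairs."""
    flat = p.ravel()
    far = 0
    for _ in range(trials):
        idx = rng.choice(flat.size, size=m, p=flat)       # sample (x,y) pairs
        p_hat = np.bincount(idx, minlength=flat.size).reshape(p.shape) / m
        far += np.max(np.abs(p - p_hat)) > delta
    return far / trials

for m in (50, 100, 200, 400):
    print(m, prob_far_off(m, delta=0.05))  # decays roughly like 2^{-c m delta^2}
```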
Slide 54: Lemma 1
$D(P^*\|P) = d(p(x,y) + \delta \,\|\, p(x,y))$ for some particular $(x,y)$ (Lagrange optimization), which is $\Theta(\delta^2)$ for sufficiently small $\delta$.
[Plot: $D(P^*\|P)$ versus $\delta$, lower-bounded by $c\delta^2$]
Slide 55: Lemma 2
1. $\max_{(x,y)} |p(x,y) - p'(x,y)| < \epsilon(n) \;\Rightarrow\; |H_P(X,Y) - H_{P'}(X,Y)| < -|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n)$
2. $|H_P(X,Y) - H_{P'}(X,Y)| > \lambda \;\Rightarrow\; \max_{(x,y)} |p(x,y) - p'(x,y)| > \lambda\,(|\mathcal{X}||\mathcal{Y}|)^{-1}$
[Plot: the entropy $H_P(X,Y)$ as a function of the pmf $\{p(x,y)\}$]
Slide 56: Choice of $R(X^n, Y^{m(n)})$, $R(X^{m(n)}, Y^n)$
1. Estimate $p'(x,y)$.
2. Choose $m(n)$ and $\epsilon(n)$ satisfying the theorem statement.
3. Find $p''(x,y)$ such that
   (a) $\max_{(x,y)} |p'(x,y) - p''(x,y)| = \epsilon(n)$, and
   (b) $p''(x,y) = \arg\max H_{P''}(X,Y)$ subject to (a).
4. Encode using a Slepian-Wolf-like codebook for $p''(x,y)$. (A code sketch of steps 1 and 3 follows.)
[Plot: probability vs. entropy, marking $H_{P'}(X,Y)$, $H_P(X,Y)$, $H_{P''}(X,Y)$ and the neighbourhoods $A_{\epsilon(n)}$, $A'_{\epsilon(n)}$ around $\{p'(x,y)\}$ and $\{p''(x,y)\}$]
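A rough numerical sketch of steps 1 and 3 (an assumed formulation using scipy: maximize entropy over the max-norm ball of radius eps around p'; the slides fix the deviation to exactly eps, which the maximizer typically attains anyway):

```python
import numpy as np
from scipy.optimize import minimize

def H(p):
    """Entropy in bits, with a small clip so the optimizer never sees log(0)."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-np.sum(p * np.log2(p)))

def design_pmf(p_prime, eps):
    """Step 3: find p'' maximizing H subject to |p'' - p'|_inf <= eps, sum = 1."""
    bounds = [(max(0.0, q - eps), min(1.0, q + eps)) for q in p_prime]
    cons = ({'type': 'eq', 'fun': lambda q: np.sum(q) - 1.0},)
    res = minimize(lambda q: -H(q), x0=p_prime, bounds=bounds,
                   constraints=cons, method='SLSQP')
    return res.x

# Step 1: p' is the joint type of the shared samples, flattened over (x,y) pairs.
p_prime = np.array([0.42, 0.08, 0.06, 0.44])
p_dd = design_pmf(p_prime, eps=0.02)
print(p_dd, H(p_dd), ">=", H(p_prime))   # p'' is (weakly) higher-entropy than p'
```

Designing the codebook for the higher-entropy p'' protects against estimation error in p', at the rate cost bounded on the next slide.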
Slide 57: Excess Rate over Slepian-Wolf Encoding
1(a) With high probability, $\max_{(x,y)} |p(x,y) - p''(x,y)| < 2\epsilon(n)$; contribution to excess rate at most $-2|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n)$.
1(b) If 1(a) is not satisfied, the contribution to expected excess rate is at most $2^{-o(m(n)\,\epsilon^2(n))}\log(|\mathcal{X}||\mathcal{Y}|)$; absorb into 1(a).
2. Rate communicated to Zorba to inform him of the choice of codebook: $n^{-1}m(n)$.
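Assembling the contributions recovers the bound in the main result; the remaining $-|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n)$ term is the code-design overhead of using $p''$ in place of $p'$ (Lemma 2, part 1):

$$
\underbrace{-|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n)}_{\text{design for }p''\text{ vs }p'}
\;+\;\underbrace{-2|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n)}_{\text{source mismatch, 1(a)}}
\;+\;\underbrace{n^{-1}m(n)}_{\text{link to Zorba}}
\;=\;-3|\mathcal{X}||\mathcal{Y}|\,\epsilon(n)\log\epsilon(n) + n^{-1}m(n).
$$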
Slide 58: Probability of Error
1. Probability of a catastrophically incorrect $p'(x,y)$: at most $\exp(-O(m(n)\,\epsilon^2(n)))$.
2. Probability of an atypical $(x^n, y^n)$: at most $\exp(-O(n\,\epsilon^2(n)))$.
3. Probability of distinct typical elements decoding to the same codewords: at most $\exp(-O(-n\,\epsilon(n)\log\epsilon(n)))$.
Term 1 dominates terms 2 and 3.
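By the union bound over the three events, and assuming the link rate vanishes (so $m(n) = o(n)$ and the first exponent is the smallest):

$$
\Pr(\text{error}) \;\le\; e^{-O(m(n)\,\epsilon^2(n))} + e^{-O(n\,\epsilon^2(n))} + e^{-O(-n\,\epsilon(n)\log\epsilon(n))} \;\le\; 3\,e^{-O(m(n)\,\epsilon^2(n))},
$$

matching the error bound in the main result on Slide 52.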
Slide 59: THE END