
1 of 12: Compression under uncertain priors. Madhu Sudan, Microsoft, New England. Based on joint works with Adam Kalai (MSR), Sanjeev Khanna (U. Penn), and Brendan Juba (Harvard). [CISS, 03/22/2012]

2 of 12: Motivation: Human Communication. Human communication (dictated by languages, grammars) is very different. Grammar: rules, often violated. Dictionary: often multiple meanings to a word. Redundant: but not in any predefined way (not an error-correcting code). Theory? Information theory? Linguistics (universal grammars etc.)?

3 of 12: Behavioral aspects of natural communication. (Vast) implicit context. The sender sends increasingly long messages to the receiver until the receiver “gets” (the meaning of) the message. The sender may use feedback from the receiver if available, or estimate the receiver’s knowledge if not. Language provides a sequence of (increasingly) long ways to represent a message. Question: what is the benefit of choosing short/long messages?

4 of 12: Model: Reason to choose short messages: compression; the channel is still a scarce resource, and we still want to use it optimally. Reason to choose long messages (when short ones are available): reducing ambiguity; the sender is unsure of the receiver’s prior (context), and wishes to ensure the receiver gets the message, no matter what its prior (within reason).
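A tiny illustrative sketch of the tension this slide describes (the example, symbol names, and numbers are mine, not the talk’s): if each side builds a standard Huffman code from its own prior, the two codebooks disagree, so the most aggressive compression for the sender’s prior P is ambiguous to a receiver who only knows its own prior Q.

# Illustrative sketch (not from the talk): two parties that build optimal
# prefix codes from slightly different priors end up with different
# codebooks, so a codeword that is optimal for the sender's prior P is
# ambiguous to a receiver who only knows its own prior Q.
import heapq

def huffman_code(prior):
    """Return a Huffman code {symbol: bitstring} for the given prior."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(sorted(prior.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c0.items()}
        merged.update({s: "1" + b for s, b in c1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

P = {"cat": 0.45, "car": 0.30, "cap": 0.15, "can": 0.10}  # sender's prior
Q = {"cat": 0.30, "car": 0.45, "cap": 0.10, "can": 0.15}  # receiver's prior

sender_code, receiver_code = huffman_code(P), huffman_code(Q)
print("sender  :", sender_code)
print("receiver:", receiver_code)
# The sender's shortest codeword names its most likely message, but the
# receiver would assign that same codeword to a different message.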

5 of 12: Model.
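The body of this slide is image-only in the transcript. As a hedged recollection of the setup from the accompanying paper (not slide text): the sender holds a prior P over messages and the receiver a prior Q; a message m ~ P is encoded knowing only P and decoded knowing only Q; the scheme must work for every pair of priors that is close in the sense defined two slides later, and the quantity to minimize is the expected encoding length under P:

\[
\min_{\mathrm{Enc},\,\mathrm{Dec}} \ \mathbb{E}_{m\sim P}\big[\,|\mathrm{Enc}_P(m)|\,\big]
\quad \text{subject to} \quad
\mathrm{Dec}_Q\big(\mathrm{Enc}_P(m)\big) = m \ \text{ for all close pairs } (P,Q).
\]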

6 of 12: Contrast with some previous models. Universal compression? Doesn’t apply: P, Q are not finitely specified, and we don’t have a sequence of samples from P, just one! K-L divergence? Measures the inefficiency of compressing for Q when the real distribution is P, but assumes encoding/decoding according to the same distribution Q. Semantic communication: uncertainty of sender/receiver, but no special goal.
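For reference, the textbook fact behind the K-L bullet (my addition, not slide text): if both parties code with the same prior Q while messages are actually drawn from P, the expected codeword length is

\[
\mathbb{E}_{m\sim P}\!\left[\log_2\frac{1}{Q(m)}\right] \;=\; H(P) + D(P\,\|\,Q),
\]

so the divergence measures the penalty for using a wrong but agreed-upon prior; in the present model the two parties do not even agree on the prior.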

7 of 12: Closeness of distributions.
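The formal definition on this slide did not survive as text. As recalled from the accompanying Juba–Kalai–Khanna–Sudan work (an assumption on my part), the relevant notion is multiplicative closeness:

\[
P \text{ and } Q \text{ are } \alpha\text{-close} \iff \frac{1}{\alpha}\,Q(m) \;\le\; P(m) \;\le\; \alpha\,Q(m) \quad \text{for every message } m.
\]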

8 of 12: Dictionary = Shared Randomness?

9 of 12: Solution (variant of arithmetic coding).
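The scheme itself is not in the transcript. Below is a minimal, hedged Python sketch of the shared-randomness variant as I recall it from the accompanying paper (the prefix-length rule, the priors, and all names are illustrative): shared randomness gives every message a long random bit string, playing the role of the “dictionary” of the previous slide; the sender transmits a prefix of that string whose length depends only on P(m) and the closeness parameter alpha, and the receiver outputs the Q-most-likely message whose string matches the received prefix. Decoding then succeeds with high probability over the shared randomness.

# Hedged sketch: compression with shared randomness and mismatched priors.
import hashlib, math

def shared_bits(msg, n, seed="shared-dictionary"):
    """Deterministic pseudo-random bits standing in for shared randomness."""
    bits = []
    block = 0
    while len(bits) < n:
        digest = hashlib.sha256(f"{seed}|{msg}|{block}".encode()).digest()
        bits.extend((byte >> k) & 1 for byte in digest for k in range(8))
        block += 1
    return bits[:n]

def encode(msg, P, alpha, slack=3):
    # Send enough bits to rule out, with probability >= 1 - 2**-slack, the
    # at most alpha**2 / P(msg) messages that a receiver with an
    # alpha-close prior might prefer over msg.
    length = math.ceil(math.log2(alpha**2 / P[msg])) + slack
    return shared_bits(msg, length)

def decode(prefix, Q):
    # Most likely message under the receiver's prior among those whose
    # shared-random string starts with the received prefix.
    matches = [m for m in Q if shared_bits(m, len(prefix)) == prefix]
    return max(matches, key=lambda m: Q[m]) if matches else None

P = {"cat": 0.45, "car": 0.30, "cap": 0.15, "can": 0.10}  # sender's prior
Q = {"cat": 0.30, "car": 0.45, "cap": 0.10, "can": 0.15}  # receiver's prior
alpha = 2.0  # here P and Q are within a factor of 2 of each other

for m in P:
    prefix = encode(m, P, alpha)
    print(f"{m}: {len(prefix)} bits -> decoded as {decode(prefix, Q)}")

When the priors agree (alpha = 1) the length rule collapses to roughly log2(1/P(m)) plus a constant, i.e. ordinary entropy coding; the extra 2 log2(alpha) bits are the price of uncertainty, matching the bound recalled on the next slide.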

10 of 12: Performance.
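The performance slide is image-only here. The headline bound from this line of work (recalled as an assumption, not quoted from the slide) is that with shared randomness the expected encoding length is within an additive term of the optimum:

\[
\mathbb{E}_{m\sim P}\big[\,\text{length of the encoding of } m\,\big] \;\le\; H(P) + 2\log_2\alpha + O(1),
\]

compared with H(P) + O(1) when the two priors coincide.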

11 of 12: Implications.

12 of 12: Future work? Upcoming: some partial derandomization [with E. Haramaty and G. Ranade], and neat connections to the fractional chromatic number of graphs and the Kneser conjecture / Lovász theorem. Needed: a better understanding of the forces on language: information-theoretic, computational, evolutionary.

13 of 12: Thank You.

