Information complexity and exact communication bounds
April 26, 2013
Mark Braverman, Princeton University
Based on joint work with Ankit Garg, Denis Pankratov, and Omri Weinstein
Overview: information complexity
Information complexity :: communication complexity
as
Shannon’s entropy :: transmission cost
Background – information theory
Shannon (1948) introduced information theory as a tool for studying the communication cost of transmission tasks.
[Figure: Alice and Bob connected by a communication channel]
Shannon’s entropy
[Figure: X sent across a communication channel]
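For reference, the standard definition (presumably what this slide displayed):
$$H(X) = \sum_x \Pr[X=x]\,\log_2 \frac{1}{\Pr[X=x]}.$$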
Shannon’s noiseless coding
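The statement, in its usual form: a single sample of $X$ admits a prefix-free code of expected length less than $H(X)+1$, no code can beat $H(X)$, and for $n$ i.i.d. samples the per-sample cost tends to $H(X)$:
$$H(X) \le \mathbb{E}[\text{code length}] < H(X)+1, \qquad \lim_{n\to\infty} \frac{\text{cost of transmitting } X_1,\dots,X_n}{n} = H(X).$$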
Shannon’s entropy – cont’d
[Figure: communication channel with X on one side and Y on the other]
A simple example
Easy and complete!
Communication complexity [Yao]
Meanwhile, in a galaxy far far away…
Focus on the two-party randomized setting.
[Figure: Alice with input X, Bob with input Y, shared randomness R; goal: compute F(X,Y)]
Communication complexity
A protocol: Alice sends $m_1(X,R)$, Bob sends $m_2(Y,m_1,R)$, Alice sends $m_3(X,m_1,m_2,R)$, …
Communication cost = # of bits exchanged.
Shared randomness $R$.
Communication complexity
Numerous applications/potential applications (streaming, data structures, circuit lower bounds, …).
Considerably more difficult to obtain lower bounds than for transmission (still much easier than for other models of computation).
Many lower-bound techniques exist.
Exact bounds??
Communication complexity
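The quantity of interest (standard definition): the randomized communication complexity of $F$ with error $\varepsilon$ is
$$R(F,\varepsilon) = \min_{\pi:\ \Pr[\pi(X,Y)=F(X,Y)]\,\ge\,1-\varepsilon\ \forall X,Y}\ \|\pi\|,$$
where $\|\pi\|$ is the worst-case number of bits the protocol $\pi$ exchanges.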
Set disjointness and intersection
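For reference, the two problems on inputs $X, Y \subseteq \{1,\dots,n\}$ (viewed as $n$-bit vectors):
$$\mathrm{Disj}_n(X,Y) = \mathbf{1}[X \cap Y = \emptyset], \qquad \mathrm{Int}_n(X,Y) = X \cap Y \ \text{(all $n$ bits $X_i \wedge Y_i$)}.$$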
Information complexity
Basic definition 1: The information cost of a protocol
[Figure: Alice (X) and Bob (Y) running protocol π]
what Alice learns about Y + what Bob learns about X
Mutual information
[Figure: Venn diagram of H(A) and H(B), overlapping in I(A;B)]
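The identities behind the diagram:
$$I(A;B) = H(A) - H(A \mid B) = H(B) - H(B \mid A) = H(A) + H(B) - H(A,B).$$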
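In symbols, for inputs $(X,Y) \sim \mu$ and protocol transcript $\Pi$ (including the public randomness):
$$\mathrm{IC}_\mu(\pi) = I(\Pi; Y \mid X) + I(\Pi; X \mid Y),$$
the first term being what Alice learns about $Y$, the second what Bob learns about $X$.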
Example
[Figure: Alice (X) and Bob (Y)]
Protocol: Alice sends MD5(X) [128 bits]; Bob replies with “X=Y?” [1 bit].
Information cost: what Alice learns about Y + what Bob learns about X.
Information complexity
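The standard definition, built on the above:
$$\mathrm{IC}(F,\mu,\varepsilon) = \inf_{\pi \text{ computes } F \text{ with error} \le \varepsilon} \mathrm{IC}_\mu(\pi).$$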
Prior-free information complexity
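The prior-free version (note the order of quantifiers: one protocol must be good for all priors simultaneously):
$$\mathrm{IC}(F,\varepsilon) = \inf_\pi \max_\mu \mathrm{IC}_\mu(\pi).$$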
Connection to privacy
Information equals amortized communication
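The theorem [Braverman–Rao’11], which gives information complexity its operational meaning: for all $F$, $\mu$, $\varepsilon$,
$$\mathrm{IC}(F,\mu,\varepsilon) = \lim_{n\to\infty} \frac{R(F^n,\mu^n,\varepsilon)}{n},$$
where $F^n$ is $n$ independent copies of $F$ on i.i.d. inputs and error $\varepsilon$ is allowed on each copy.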
Without priors
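Analogously, in the prior-free setting:
$$\mathrm{IC}(F,\varepsilon) = \lim_{n\to\infty} \frac{R(F^n,\varepsilon)}{n}.$$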
Intersection
The two-bit AND
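The plan, to make the connection explicit: $\mathrm{Int}_n$ is exactly $n$ independent copies of the two-bit AND, so by “information = amortized communication” the exact asymptotic cost of intersection is governed by the information complexity of AND.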
The optimal protocol for AND
X ∈ {0,1}, Y ∈ {0,1}.
Alice: if X=1, set A=1; if X=0, pick A ~ U[0,1].
Bob: if Y=1, set B=1; if Y=0, pick B ~ U[0,1].
A clock sweeps from 0 to 1: “Raise your hand when your number is reached.”
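A minimal Python sketch of this clock protocol (my illustration, not code from the talk; the function name is mine). It only checks correctness of the output, not the information analysis:

```python
import random

def and_clock_protocol(x: int, y: int) -> int:
    """One run of the continuous 'clock' protocol for AND(x, y)."""
    # Each party picks a number in [0, 1]: exactly 1 if their bit is 1,
    # uniform in [0, 1) if their bit is 0.
    a = 1.0 if x == 1 else random.random()
    b = 1.0 if y == 1 else random.random()
    # The clock sweeps from 0 to 1; the first hand goes up at time min(a, b).
    # A hand raised strictly before time 1 reveals a 0-bit, so AND = 0.
    # If no hand is raised before time 1, both bits are 1 and AND = 1.
    return 1 if min(a, b) == 1.0 else 0

# Sanity check: the output always equals AND(x, y).
for x in (0, 1):
    for y in (0, 1):
        assert all(and_clock_protocol(x, y) == (x & y) for _ in range(1000))
print("clock protocol agrees with AND on all input pairs")
```

The point of the continuous clock is that when a hand goes up at time t < 1, the other party’s number is only revealed to be at least t, which keeps the information leak small.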
Analysis
The analytical view
A message is just a mapping from the current prior to a distribution over posteriors (new priors).
Ex: Alice sends her bit.
Prior:                     Y=0    Y=1
                  X=0      2/5    1/5
                  X=1      3/10   1/10
After “0” (prob. 0.6):     Y=0    Y=1
                  X=0      2/3    1/3
                  X=1      0      0
After “1” (prob. 0.4):     Y=0    Y=1
                  X=0      0      0
                  X=1      3/4    1/4
The analytical view
Ex: Alice sends her bit w.p. ½ and a uniformly random bit w.p. ½.
Prior:                     Y=0    Y=1
                  X=0      2/5    1/5
                  X=1      3/10   1/10
After “0” (prob. 0.55):    Y=0    Y=1
                  X=0      6/11   3/11
                  X=1      3/22   1/22
After “1” (prob. 0.45):    Y=0    Y=1
                  X=0      2/9    1/9
                  X=1      1/2    1/6
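A quick Python check of these numbers (my sketch; the prior entries shown above are the ones forced by the posteriors and message probabilities on the slide):

```python
from fractions import Fraction as F

# Prior on (X, Y): rows are X = 0, 1; columns are Y = 0, 1.
prior = [[F(2, 5), F(1, 5)],
         [F(3, 10), F(1, 10)]]

# Alice sends her bit w.p. 1/2 and a uniform random bit w.p. 1/2,
# so Pr[M = m | X = x] = 3/4 if m == x, else 1/4.
def p_msg_given_x(m: int, x: int) -> F:
    return F(3, 4) if m == x else F(1, 4)

for m in (0, 1):
    p_m = sum(p_msg_given_x(m, x) * prior[x][y]
              for x in (0, 1) for y in (0, 1))
    post = [[p_msg_given_x(m, x) * prior[x][y] / p_m for y in (0, 1)]
            for x in (0, 1)]
    print(f"Pr[M={m}] = {p_m}; posterior:",
          [[str(q) for q in row] for row in post])
# Prints Pr[M=0] = 11/20 with posterior [[6/11, 3/11], [3/22, 1/22]],
# and Pr[M=1] = 9/20 with posterior [[2/9, 1/9], [1/2, 1/6]].
```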
Analytical view – cont’d
IC of AND
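The value obtained: $\mathrm{IC}(\mathrm{AND}) \approx 1.4923$ bits, attained in the limit by the clock protocol above.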
*Not a real protocol: the continuous clock has to be discretized, and one takes a limit of discretized protocols whose information cost converges to that of the continuous one.
Previous numerical evidence
[Ma, Ishwar’09] – numerical calculation results.
Applications: communication complexity of intersection
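The resulting bound: for any constant error,
$$R(\mathrm{Int}_n, \varepsilon) = (C_\wedge \pm o(1)) \cdot n, \qquad C_\wedge \approx 1.4923 \text{ bits},$$
i.e., the per-coordinate constant is exactly the information complexity of AND.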
Applications 2: set disjointness
A hard distribution?
        Y=0    Y=1
X=0     1/4    1/4
X=1     1/4    1/4
Very easy! (Each coordinate is (1,1) with probability 1/4, so the players can find an intersecting coordinate with O(1) communication in expectation.)
A hard distribution
        Y=0    Y=1
X=0     1/3    1/3
X=1     1/3    0
At most one (1,1) location!
Communication complexity of Disjointness
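The exact constant:
$$R(\mathrm{Disj}_n, \varepsilon) = (C_{\mathrm{DISJ}} \pm o(1)) \cdot n, \qquad C_{\mathrm{DISJ}} \approx 0.4827 \text{ bits},$$
where $C_{\mathrm{DISJ}}$ is the maximum information complexity of AND over priors that put no mass on $(1,1)$.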
Small-set Disjointness
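For sets of size at most $k$ in a universe of size $n \gg k$, the constant is also pinned down:
$$R(\mathrm{Disj}^k_n, \varepsilon) = \left(\frac{2}{\ln 2} \pm o(1)\right) k \approx 2.885\,k,$$
independent of $n$.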
Using information complexity
        Y=0       Y=1
X=0     1-2k/n    k/n
X=1     k/n       0
Overview: information complexity
Information complexity :: communication complexity
as
Shannon’s entropy :: transmission cost
Today: focused on exact bounds using IC.
Selected open problems 1
Interactive compression?
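The problem in one sentence: can a protocol that conveys only $I$ bits of information always be simulated with $O(I)$, or even $\mathrm{poly}(I)$, bits of communication? The general bounds known at the time: roughly $\sqrt{I \cdot C}$ [Barak–Braverman–Chen–Rao’10] and $2^{O(I)}$ [Braverman’12], where $C$ is the communication of the original protocol.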
Selected open problems 2
External information cost
[Figure: Alice (X) and Bob (Y) run protocol π, watched by an external observer C]
External information complexity
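The definition: what an external observer watching the conversation learns about the inputs,
$$\mathrm{IC}^{\mathrm{ext}}_\mu(\pi) = I(\Pi; XY) \;\ge\; \mathrm{IC}_\mu(\pi).$$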
Thank You!