1
Information Complexity Lower Bounds
Rotem Oshman, Princeton CCI. Based on: Bar-Yossef, Jayram, Kumar, Sivakumar '04; Barak, Braverman, Chen, Rao '10.
2
Communication Complexity
f(x, y) = ? Alice holds x, Bob holds y. [Yao '79, "Some complexity questions related to distributive computing"]
3
Communication Complexity
Applications: circuit complexity, streaming algorithms, data structures, distributed computing, property testing, …
4
Deterministic Protocols
A protocol Π specifies, at each point: which player speaks next; what that player should say; when to halt and what to output. Formally, Π : {0,1}* → {A, B, ⊥} × {0,1}*, mapping what we've said so far to who speaks next (A = Alice, B = Bob, ⊥ = halt) and what to say (or, on halting, what to output).
5
Randomized Protocols Can use randomness to decide what to say
Private randomness: each player has a separate source of random bits. Public randomness: both players can use the same random bits.
Goal: for every x, y, compute f(x, y) correctly with probability ≥ 1 − ε.
Communication complexity: the worst-case length of the transcript in any execution.
6
Randomness Can Help a Lot
Example: Equality(x, y). Input: x, y ∈ {0,1}^n. Output: is x = y?
Trivial protocol: Alice sends x to Bob. For deterministic protocols, this is optimal!
7
Equality Lower Bound. The communication matrix of Equality is the 2^n × 2^n identity matrix. A deterministic protocol with transcripts of length c partitions the matrix into at most 2^c monochromatic rectangles, and no rectangle can contain two of the diagonal 1-entries, so 2^c ≥ 2^n, i.e., c ≥ n.
8
Randomized Protocol Protocol with public randomness:
Select a random r ∈ {0,1}^n. Alice sends ⟨x, r⟩ = Σ_{i=1}^n x_i·r_i mod 2. Bob accepts iff ⟨x, r⟩ = ⟨y, r⟩.
If x = y: always accept.
If x ≠ y: ⟨x, r⟩ + ⟨y, r⟩ mod 2 = ⟨x + y, r⟩ mod 2, and x + y is a non-zero vector, so we reject with probability 1/2.
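This protocol is easy to simulate. Below is a minimal Python sketch (the function names are mine, not from the slides); each round costs one bit of communication and catches x ≠ y with probability 1/2, so k rounds drive the error down to 2^(-k):

```python
import random

def inner_product_round(x, y, rng):
    """One round: both players see the shared random vector r."""
    r = [rng.randrange(2) for _ in x]
    a = sum(xi * ri for xi, ri in zip(x, r)) % 2   # Alice sends <x,r> mod 2
    b = sum(yi * ri for yi, ri in zip(y, r)) % 2   # Bob computes <y,r> mod 2
    return a == b                                  # Bob accepts iff they agree

def equality_protocol(x, y, rounds=40, seed=0):
    """Always accepts x == y; accepts x != y with probability 2**-rounds."""
    rng = random.Random(seed)
    return all(inner_product_round(x, y, rng) for _ in range(rounds))
```

With shared randomness the communication is O(rounds) bits, independent of n.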
9
Set Disjointness. Input: S, T ⊆ {1, …, n}. Output: is S ∩ T = ∅?
Theorem [Kalyanasundaram, Schnitger '92; Razborov '92]: randomized CC = Ω(n). Easy to see for deterministic protocols. Today we'll see a proof by Bar-Yossef, Jayram, Kumar, Sivakumar '04.
10
Application: Streaming Lower Bounds
Streaming algorithm: the data streams past the algorithm, which keeps only a small state. How much space is required to approximate f(data)? Example: how many distinct items are in the data? Reduction from Disjointness [Alon, Matias, Szegedy '99].
11
Reduction from Disjointness:
Fix a streaming algorithm for Distinct Elements with space s and universe size n. Construct a protocol for Disjointness on n elements, where Alice holds X = {x_1, …, x_k} and Bob holds Y = {y_1, …, y_ℓ}: Alice runs the algorithm on her elements and sends its state together with |X| (#bits = s + log n); Bob continues the run on his elements.
Then X ∩ Y = ∅ ⟺ #distinct elements in X ∪ Y is |X| + |Y|.
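The reduction can be sketched in a few lines. The exact counter below is only a stand-in for a genuine small-space sketch (its state is not actually small), and all names are mine:

```python
class DistinctCounter:
    """Stand-in streaming algorithm; a real sketch keeps O(log n)-size state."""
    def __init__(self):
        self.state = set()
    def process(self, item):
        self.state.add(item)
    def estimate(self):
        return len(self.state)

def disjointness_protocol(X, Y):
    alg = DistinctCounter()
    for x in X:                         # Alice streams her set through the algorithm,
        alg.process(x)
    # ... then sends the algorithm's state and |X| to Bob (s + log n bits).
    for y in Y:                         # Bob resumes the same run on his set.
        alg.process(y)
    # Disjoint iff no element of Y was already counted for X.
    return alg.estimate() == len(X) + len(Y)
```

A real instantiation would replace DistinctCounter with a small-space sketch; since Disjointness needs Ω(n) communication, no such sketch can be too small and exact.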
12
Application 2: KW Games Circuit depth lower bounds:
How deep does a circuit of ∧ and ∨ gates computing f(x_1, …, x_n) need to be?
13
Application 2: KW Games. Karchmer-Wigderson '90, Karchmer-Raz-Wigderson '95: in the KW game for f, Alice receives x with f(x) = 0, Bob receives y with f(y) = 1, and their goal is to find an index i such that x_i ≠ y_i.
14
Application 2: KW Games. Claim: if KW_f has deterministic CC ≥ d, then f requires circuit depth ≥ d.
A circuit of depth d gives a protocol of length d: starting from the output gate (which evaluates to 0 on x and to 1 on y), at each ∨ gate Bob names a child that evaluates to 1 on y, and at each ∧ gate Alice names a child that evaluates to 0 on x; after at most d steps they reach an input literal, whose index i satisfies x_i ≠ y_i.
15
Information-Theoretic Lower Bound on Set Disjointness
16
Some Basic Concepts from Info Theory
Entropy of a random variable: H(X) = −Σ_x Pr[X = x] · log Pr[X = x].
Important properties: H(X) ≥ 0; H(X) = 0 ⟺ X is deterministic; H(X) = expected # bits needed to encode X.
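The definition translates directly into code; `entropy` below is a helper name of my choosing:

```python
from math import log2

def entropy(dist):
    """H(X) = -sum_x Pr[X=x] * log2(Pr[X=x]), with the 0 * log 0 = 0 convention."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)
```

For example, a fair coin has entropy 1 bit and a constant has entropy 0.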
17
Some Basic Concepts from Info Theory
Conditional entropy: H(X | Y) = E_y[ H(X | Y = y) ].
Important properties: H(X | Y) ≤ H(X); H(X | Y) = H(X) ⟺ X, Y are independent.
Example: let X, Z ~ Bernoulli(1/2) be independent; H(X) = −(1/2)·log(1/2) − (1/2)·log(1/2) = 1. Define Y: if Z = 0 then Y = X, and if Z = 1 then Y = 1 − X. Then H(X | Y, Z) = (1/2)·H(X | Y, Z=0) + (1/2)·H(X | Y, Z=1) = 0, while H(X | Y) = 1.
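The example can be checked numerically. Below, X and Z are independent fair bits and Y = X when Z = 0, Y = 1 − X when Z = 1; the helper and its name are mine, written for this sketch:

```python
from math import log2
from collections import defaultdict
from itertools import product

# Joint distribution as (x, y, z, probability) tuples.
points = []
for x, z in product([0, 1], repeat=2):
    y = x if z == 0 else 1 - x
    points.append((x, y, z, 0.25))

def cond_entropy(points, target, given):
    """H(target | given); target/given are lists of component indices."""
    marg, joint = defaultdict(float), defaultdict(float)
    for *vals, p in points:
        g = tuple(vals[i] for i in given)
        t = tuple(vals[i] for i in target)
        marg[g] += p
        joint[(t, g)] += p
    return -sum(p * log2(p / marg[g]) for (t, g), p in joint.items() if p > 0)

h_given_y  = cond_entropy(points, [0], [1])     # H(X | Y)    -> 1.0
h_given_yz = cond_entropy(points, [0], [1, 2])  # H(X | Y, Z) -> 0.0
```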
18
Some Basic Concepts from Info Theory
Mutual information: I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X).
Conditional mutual information: I(X; Y | Z) = H(X | Z) − H(X | Y, Z) = H(Y | Z) − H(Y | X, Z).
Important properties: I(X; Y) ≥ 0; I(X; Y) = 0 ⟺ X, Y are independent.
19
Some Basic Concepts from Info Theory
Chain rule for mutual information: I(X_1, X_2; Y) = I(X_1; Y) + I(X_2; Y | X_1).
More generally, I(X_1, …, X_n; Y) = Σ_{i=1}^n I(X_i; Y | X_1, …, X_{i−1}).
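The chain rule can be sanity-checked numerically on a random joint distribution of (X_1, X_2, Y); all helper names below are mine:

```python
from math import log2
from collections import defaultdict
import random

def cond_entropy(points, target, given):
    """H(target | given) from (outcome..., prob) tuples; indices pick components."""
    marg, joint = defaultdict(float), defaultdict(float)
    for *vals, p in points:
        g = tuple(vals[i] for i in given)
        t = tuple(vals[i] for i in target)
        marg[g] += p
        joint[(t, g)] += p
    return -sum(p * log2(p / marg[g]) for (t, g), p in joint.items() if p > 0)

def mutual_info(points, a, b, c=()):
    """I(A ; B | C) = H(A | C) - H(A | B, C)."""
    return cond_entropy(points, a, c) - cond_entropy(points, a, list(b) + list(c))

# Random joint distribution over (X1, X2, Y) in {0,1}^3.
rng = random.Random(1)
w = [rng.random() for _ in range(8)]
points = [((i >> 2) & 1, (i >> 1) & 1, i & 1, w[i] / sum(w)) for i in range(8)]

lhs = mutual_info(points, [0, 1], [2])                              # I(X1,X2 ; Y)
rhs = mutual_info(points, [0], [2]) + mutual_info(points, [1], [2], [0])
```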
20
Information Cost of Protocols
Fix an input distribution μ on (X, Y). Given a protocol Π, let Π also denote the distribution of Π's transcript.
Information cost of Π: IC(Π) = I(Π; X | Y) + I(Π; Y | X).
Information cost of a function f: IC_{μ,ε}(f) = inf over Π solving f with error ≤ ε of IC(Π).
21
Information Cost of Protocols
Important property: IC(Π) ≤ |Π|.
Proof: by induction. Let Π = Π_1 ⋯ Π_t. We show that ∀r ≤ t: I(Π_≤r; X | Y) + I(Π_≤r; Y | X) ≤ r. By the chain rule,
I(Π_≤r; X | Y) + I(Π_≤r; Y | X)   [what we know after r rounds]
= I(Π_<r; X | Y) + I(Π_<r; Y | X)   [what we knew after r−1 rounds]
+ I(Π_r; Y | X, Π_<r) + I(Π_r; X | Y, Π_<r)   [what we learn in round r, given what we already know].
22
Information vs. Communication
Want: I(Π_r; Y | X, Π_<r) + I(Π_r; X | Y, Π_<r) ≤ 1. Suppose Π_r is sent by Alice.
What does Alice learn? Π_r is a function of Π_<r and X (and Alice's private randomness), so I(Π_r; Y | X, Π_<r) = 0.
What does Bob learn? I(Π_r; X | Y, Π_<r) ≤ |Π_r| = 1.
23
Information vs. Communication
Important property: IC(Π) ≤ |Π|.
A lower bound on information cost therefore gives a lower bound on communication complexity. In fact, IC lower bounds are the most powerful technique we know.
24
Information Complexity of Disj.
Disjointness: is X ∩ Y = ∅? Equivalently (up to negation), Disj(X, Y) = ⋁_{i=1}^n (X_i ∧ Y_i).
Strategy: for some "hard distribution" μ:
Direct sum: IC_{μ^n}(Disj) ≥ n · IC_μ(And).
Prove that IC_μ(And) ≥ Ω(1).
25
Hard Distribution for Disjointness
For each coordinate i ∈ [n], (X_i, Y_i) is drawn independently according to:

            X_i = 0   X_i = 1
  Y_i = 0     1/3       1/3
  Y_i = 1     1/3        0
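Sampling from μ^n is straightforward; by construction no coordinate ever has X_i = Y_i = 1, so every sampled pair of sets is disjoint (function name mine):

```python
import random

def sample_mu_n(n, rng):
    """Each coordinate is uniform on {(0,0), (0,1), (1,0)}; (1,1) has probability 0."""
    pairs = [rng.choice([(0, 0), (0, 1), (1, 0)]) for _ in range(n)]
    X = [a for a, _ in pairs]
    Y = [b for _, b in pairs]
    return X, Y

X, Y = sample_mu_n(10, random.Random(0))
```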
26
IC_{μ^n}(Disj) ≥ n · IC_μ(And). Let Π be a protocol for Disj on X, Y ∈ {0,1}^n.
Construct Π′ for And as follows: Alice and Bob get inputs x, y ∈ {0,1}. Choose a random coordinate i ∈ [n]; set X_i = x, Y_i = y. Sample X_−i, Y_−i and run Π.
For each j ≠ i, X_j ∧ Y_j = 0 under μ, so Disj(X, Y) = And(X_i, Y_i).
27
IC_{μ^n}(Disj) ≥ n · IC_μ(And). Let Π be a protocol for Disj on X, Y ∈ {0,1}^n. Construct Π′ for And as follows: Alice and Bob get inputs x, y ∈ {0,1}; choose a random coordinate i ∈ [n], set X_i = x, Y_i = y.
Bad idea: publicly sample X_−i, Y_−i. Suppose in Π, Alice sends X_1 ⊕ … ⊕ X_n. In Π, Bob learns one bit, so in Π′ he should learn only 1/n of a bit; but if X_−i is public, Bob learns a full bit about x!
28
IC_{μ^n}(Disj) ≥ n · IC_μ(And). Let Π be a protocol for Disj on X, Y ∈ {0,1}^n. Construct Π′ for And as follows: Alice and Bob get inputs x, y ∈ {0,1}; choose a random coordinate i ∈ [n], set X_i = x, Y_i = y.
Another bad idea: publicly sample X_−i, and have Bob privately sample Y_−i given X_−i. But the players can't sample X_−i, Y_−i independently: under μ, each Y_j is correlated with X_j.
29
IC_{μ^n}(Disj) ≥ n · IC_μ(And). Let Π be a protocol for Disj on X, Y ∈ {0,1}^n. Construct Π′ for And as follows: Alice and Bob get inputs x, y ∈ {0,1}; choose a random coordinate i ∈ [n], set X_i = x, Y_i = y.
The fix: publicly sample X_1, …, X_{i−1}, and have Bob privately sample Y_1, …, Y_{i−1} conditioned on them; publicly sample Y_{i+1}, …, Y_n, and have Alice privately sample X_{i+1}, …, X_n conditioned on them.
30
Direct Sum Theorem. Transcript of Π′ = (i, X_<i, Y_>i, Π).
Need to show: I_μ(Π′; x | y) + I_μ(Π′; y | x) ≤ ( I_{μ^n}(Π; X | Y) + I_{μ^n}(Π; Y | X) ) / n. For the first direction:
I_μ(Π′; y | x) = I_{μ^n}(i, X_<i, Y_>i, Π ; Y_i | X_i)
= I(i, X_<i, Y_>i ; Y_i | X_i) + I(Π ; Y_i | X_≤i, Y_>i, i)   [first term = 0: Y_i is independent of (i, X_<i, Y_>i) given X_i]
≤ I(Π, X_>i ; Y_i | X_≤i, Y_>i, i)
= I(X_>i ; Y_i | X_≤i, Y_>i, i) + I(Π ; Y_i | X, Y_>i, i)   [first term = 0, again by independence of the coordinates]
= (1/n) Σ_{j=1}^n I(Π ; Y_j | X, Y_>j)
= I(Π; Y | X) / n   [chain rule].
The term I_μ(Π′; x | y) is bounded symmetrically.
31
Information Complexity of Disj.
Disjointness: is X ∩ Y = ∅? Equivalently (up to negation), Disj(X, Y) = ⋁_{i=1}^n (X_i ∧ Y_i).
Strategy: for some "hard distribution" μ:
Direct sum: IC_{μ^n}(Disj) ≥ n · IC_μ(And). ✓
Prove that IC_μ(And) ≥ Ω(1).
32
Hardness of And. IC_μ(And) = I(Π; x | y) + I(Π; y | x) ≥ Ω(1)?
I(Π; x | y) = (2/3)·I(Π; x | y=0) + (1/3)·I(Π; x | y=1), and I(Π; x | y=1) = 0, since under μ, y = 1 forces x = 0. Symmetrically, I(Π; y | x) = (2/3)·I(Π; y | x=0).
Intuition: the transcript on input 11 should be "very different" from the transcripts on 00, 01, 10.
33
Hellinger Distance. h²(P, Q) = 1 − Σ_t √(P(t)·Q(t)).
Examples: h²(P, P) = 1 − Σ_t √(P(t)²) = 1 − Σ_t P(t) = 0. If P, Q have disjoint support, h(P, Q) = 1.
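Both examples follow directly from the definition; `hellinger_sq` is a helper name of mine:

```python
from math import sqrt

def hellinger_sq(P, Q):
    """h^2(P, Q) = 1 - sum_t sqrt(P(t) * Q(t)) for distributions given as dicts."""
    return 1.0 - sum(sqrt(P.get(t, 0.0) * Q.get(t, 0.0)) for t in set(P) | set(Q))
```

hellinger_sq(P, P) is 0 for any distribution P, and distributions with disjoint supports are at squared distance 1.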
34
Hellinger Distance. Hellinger distance is a metric:
h(P, Q) ≥ 0, with equality iff P = Q;
h(P, Q) = h(Q, P);
triangle inequality: h(P, Q) ≤ h(P, R) + h(R, Q).
35
Hellinger Distance. If for some event T we have P(T) − Q(T) = δ, then h²(P, Q) ≥ δ²/2.
(This is what will make h(Π_00, Π_11) = Ω(1): a low-error protocol for And must produce output 1 with noticeably different probabilities on inputs 00 and 11.)
36
Hellinger Distance vs. Mutual Info
Let P_0, P_1 be two distributions. Select T by choosing β ~ Bernoulli(1/2), then drawing T ~ P_β. Then I(T; β) ≥ h²(P_0, P_1).
Applied to And under μ: I(Π; y | x=0) ≥ h²(Π_00, Π_01) and I(Π; x | y=0) ≥ h²(Π_00, Π_10).
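A quick numerical check of I(T; β) ≥ h²(P_0, P_1) on a concrete pair of distributions: for a fair coin β, I(T; β) = H(mixture) − ½H(P_0) − ½H(P_1). Helper names are mine:

```python
from math import log2, sqrt

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mi_vs_hellinger(P0, P1):
    """Return (I(T;beta), h^2(P0,P1)) for beta ~ Bernoulli(1/2), T ~ P_beta."""
    support = set(P0) | set(P1)
    mix = {t: 0.5 * P0.get(t, 0.0) + 0.5 * P1.get(t, 0.0) for t in support}
    mi = entropy(mix) - 0.5 * entropy(P0) - 0.5 * entropy(P1)
    h2 = 1.0 - sum(sqrt(P0.get(t, 0.0) * P1.get(t, 0.0)) for t in support)
    return mi, h2

mi, h2 = mi_vs_hellinger({'a': 1.0}, {'a': 0.5, 'b': 0.5})
```

On this pair, h² = 1 − √(1/2) ≈ 0.293 while I(T; β) ≈ 0.311, consistent with the lemma.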
37
Hardness of And. [Diagram: the four inputs 00, 01, 10, 11; each of 00, 01, 10 has probability 1/3.]
On inputs 01 and 11 Bob has the same input, so the transcript distributions agree until Alice acts differently; on inputs 10 and 11 Alice has the same input, so they agree until Bob acts differently. This symmetry is what the cut-and-paste lemma formalizes.
38
"Cut-n-Paste Lemma": h(Π_00, Π_11) = h(Π_01, Π_10).
Recall: h²(Π_xy, Π_x′y′) = 1 − Σ_t √(Π_xy(t)·Π_x′y′(t)).
Enough to show: we can write Π_xy(t) = q_A(t, x)·q_B(t, y). Then
h²(Π_00, Π_11) = 1 − Σ_t √( q_A(t,0) q_B(t,0) q_A(t,1) q_B(t,1) ) = 1 − Σ_t √( q_A(t,0) q_B(t,1) q_A(t,1) q_B(t,0) ) = h²(Π_01, Π_10).
39
"Cut-n-Paste Lemma": we can write Π_xy(t) = q_A(t, x)·q_B(t, y).
Proof: Π induces a distribution on "partial transcripts" of each length r: Π^r_xy(t) = probability that the first r bits are t.
By induction: Π^r_xy(t) = q_A^r(t, x)·q_B^r(t, y).
Base case: Π^0_xy(∅) = 1; set q_A^0(∅, x) = q_B^0(∅, y) = 1.
40
"Cut-n-Paste Lemma". Step: Π^{r+1}_xy(t) = Π^r_xy(t_≤r) · Pr[next bit = t_{r+1}].
Suppose after t_≤r it is Alice's turn to speak. What Alice says depends only on: her input, her private randomness, and the transcript so far, t_≤r. So Pr[next bit = t_{r+1}] = p(t_≤r, x, t_{r+1}) = p(t, x).
Set q_A^{r+1}(t, x) = q_A^r(t_≤r, x)·p(t, x) and q_B^{r+1}(t, y) = q_B^r(t_≤r, y).
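The product decomposition, and with it the cut-and-paste identity, can be checked on a toy two-bit protocol (my own example, not from the slides): Alice sends a = x OR coin, then Bob sends b = y if a = 0, and b = y XOR coin′ otherwise.

```python
from math import sqrt
from collections import defaultdict

def transcript_dist(x, y):
    """Transcript distribution Pi_xy of the toy protocol on inputs (x, y)."""
    dist = defaultdict(float)
    for coin_a in (0, 1):
        a = x | coin_a                       # Alice's bit: her input and her coin
        for coin_b in (0, 1):
            b = y if a == 0 else y ^ coin_b  # Bob's bit: his input, his coin, and a
            dist[(a, b)] += 0.25
    return dist

def hellinger_sq(P, Q):
    return 1.0 - sum(sqrt(P.get(t, 0.0) * Q.get(t, 0.0)) for t in set(P) | set(Q))

h_00_11 = hellinger_sq(transcript_dist(0, 0), transcript_dist(1, 1))
h_01_10 = hellinger_sq(transcript_dist(0, 1), transcript_dist(1, 0))
```

Here h_00_11 and h_01_10 come out equal, as the lemma predicts.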
41
Hardness of And.
IC_μ(And) = I(Π; x | y) + I(Π; y | x) = (2/3)·( I(Π; x | y=0) + I(Π; y | x=0) )
≥ const · ( h²(Π_00, Π_01) + h²(Π_00, Π_10) )
≥ const′ · ( h(Π_00, Π_01) + h(Π_00, Π_10) )²
≥ const′ · h²(Π_01, Π_10)   [triangle inequality]
= const′ · h²(Π_00, Π_11)   [cut-and-paste lemma]
≥ Ω(1),
where the last step holds because a low-error protocol must answer differently on 00 and 11, so h(Π_00, Π_11) = Ω(1).
42
Multi-Player Communication Complexity
43
The Coordinator Model. f(X_1, …, X_k) = ?
k sites hold inputs X_1, X_2, …, X_k; each site communicates with a central coordinator, and we count the total number of bits sent.
44
Multi-Party Set Disjointness
Input: X_1, …, X_k ⊆ [n]. Output: is ⋂_i X_i = ∅?
Braverman, Ellen, O., Pitassi, Vaikuntanathan '13: lower bound of Ω(nk) bits.
45
Reduction from Disj to graph connectivity
Given X_1, …, X_k, we want to: choose vertices V, and design inputs E_1, …, E_k such that the graph G = (V, E_1 ∪ … ∪ E_k) is connected iff ⋂_i X_i = ∅.
46
Reduction from Disj to graph connectivity
[Figure: a bipartite graph with the players X_1, X_2, …, X_k on one side and the elements 1, 2, 3, 4, 5, 6 on the other; player i is adjacent to element j iff j ∉ X_i. The input graph is connected ⟺ ⋂_i X_i = ∅.]
47
Other Stuff Distributed computing
48
Other Stuff Compressing down to information cost
Number-on-forehead lower bounds Open questions in communication complexity