On the Power of Randomness in Communication. Shengyu Zhang, The Chinese University of Hong Kong.

Communication complexity [Yao79]. Two parties, Alice and Bob, jointly compute a function f(x,y), with x known only to Alice and y known only to Bob. Communication complexity: how many bits need to be exchanged in the worst case? --- D(f).

Communication complexity: other models. One-way: Alice sends a single message to Bob, who outputs f(x,y). --- D1(f). SMP (Simultaneous Message Passing): Alice and Bob each send one message to a Referee, who outputs f(x,y). --- D∥(f). Relation: D(f) ≤ D1(f) ≤ D∥(f).

Applications of CC. Though defined in an information-theoretic setting, communication complexity turns out to provide lower bounds for many computational models: data structures, circuit complexity, streaming algorithms, decision tree complexity, VLSI, algorithmic game theory, optimization, pseudorandomness…

Rank lower bound. Lower bounds in communication complexity are crucial for those applications. Communication matrix: M_f = [f(x,y)]_{x,y}. A c-bit protocol partitions the matrix into 2^c rectangles (submatrices), and all entries in a rectangle are the same: either all 0 or all 1.

Rank lower bound. So M_f = Σ_{i=1}^{2^c} M_i, where M_i agrees with M_f on rectangle R_i and is 0 elsewhere. Each M_i is monochromatic, so rank(M_i) ≤ 1, giving rank(M_f) = rank(Σ_i M_i) ≤ Σ_i rank(M_i) ≤ 2^c. [Thm] (log-rank lower bound) D(f) ≥ log₂ rank(M_f). [Open] (log-rank conjecture) D(f) = poly(log₂ rank(M_f)).

Equality. EQ(x,y) = 1 if x = y, and 0 otherwise. Its communication matrix is the identity: M_EQ = [EQ(x,y)]_{x,y} = I_N, where N = 2^n. Hence D(EQ) ≥ log₂ rank(M_EQ) = log₂ rank(I_N) = n.

Communication complexity: randomized. Randomized communication complexity: the minimum number of bits exchanged s.t. the output is correct w.p. 0.99. Alice and Bob hold random strings r1 and r2. Private-coin: r1 and r2 are independent. --- Rpriv(f). Public-coin: r1 and r2 are the same shared random string. --- Rpub(f).

Randomized versions of the other models. One-way: private-coin R1,priv(f), public-coin R1,pub(f). SMP: private-coin R∥,priv(f), public-coin R∥,pub(f).

Randomized protocol for EQ. Recall that we've proved D(EQ) ≥ n. [RY] R1,priv(EQ) = O(log n). Fix a prime p ∊ [n², 2n²]. For each a = a₀…a_{n−1} ∊ {0,1}^n, define the polynomial a(t) = a₀ + a₁t + … + a_{n−1}t^{n−1} mod p. Protocol: Alice picks t ∊_R {0, …, p−1} with her private coins and sends (t, x(t)); Bob outputs 1 if x(t) = y(t), and 0 otherwise. Communication: 2 log₂ p = O(log n).

Correctness. Recall: p ∊ [n², 2n²] is prime, a(t) = a₀ + a₁t + … + a_{n−1}t^{n−1} mod p. Case x = y: x(t) = y(t) for all t, so Bob outputs 1. Case x ≠ y: x(t) and y(t) are distinct polynomials of degree ≤ n−1 over Z_p. [Thm] A nonzero polynomial of degree d over a field has ≤ d roots. A random t ∊ {0, …, p−1} is a root of x(t) − y(t) w.p. at most (n−1)/p < 1/n. So Bob wrongly outputs 1 w.p. < 1/n.
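A minimal sketch of this fingerprinting protocol in Python (illustrative only; the function names and the use of sympy's nextprime to pick a prime in [n², 2n²] are our choices, not from the slides):

```python
import random
from sympy import nextprime  # any source of primes works here

def poly_eval(bits, t, p):
    # Evaluate a(t) = a0 + a1*t + ... + a_{n-1}*t^{n-1} mod p via Horner's rule.
    acc = 0
    for b in reversed(bits):
        acc = (acc * t + b) % p
    return acc

def eq_protocol(x, y):
    # x, y: equal-length 0/1 lists. Alice sends (t, x(t)); Bob compares with y(t).
    n = len(x)
    p = nextprime(n * n)        # a prime in [n^2, 2n^2] by Bertrand's postulate
    t = random.randrange(p)     # Alice's private coin
    return poly_eval(x, t, p) == poly_eval(y, t, p)

# If x != y, the protocol wrongly answers True w.p. < 1/n.
x = [1, 0, 1, 1, 0, 0, 1, 0]
y = [1, 0, 1, 1, 0, 1, 1, 0]
print(eq_protocol(x, x), eq_protocol(x, y))
```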

Public coin. [Thm] R∥,pub(EQ) = O(1). Protocol: shared random r ∊_R {0,1}^n; Alice sends x·r and Bob sends y·r, where x·r = x₁r₁ + … + x_n r_n mod 2; the Referee outputs 1 if x·r = y·r, and 0 otherwise. Complexity: 1 bit from Alice and 1 bit from Bob. Case x = y: x·r = y·r for all r. Case x ≠ y: x·r = y·r ⇔ (x⊕y)·r = 0, and [Fact] if z ≠ 00…0, then z·r = 0 w.p. exactly ½ over random r. So the Referee wrongly outputs 1 w.p. ½. Repeating the protocol k times (with fresh r each time) decreases the error probability to 2^−k.
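A sketch of the k-fold repetition in Python (the interface and names are ours; the list r plays the role of the shared public coin):

```python
import random

def smp_eq(x, y, k=10):
    # Public coin: k fresh random vectors r, visible to Alice, Bob, and Referee.
    n = len(x)
    for _ in range(k):
        r = [random.randrange(2) for _ in range(n)]
        a = sum(xi * ri for xi, ri in zip(x, r)) % 2   # Alice's 1-bit message
        b = sum(yi * ri for yi, ri in zip(y, r)) % 2   # Bob's 1-bit message
        if a != b:
            return 0      # Referee is certain that x != y
    return 1              # wrong w.p. 2**-k when x != y

x = [0, 1, 1, 0]; y = [0, 1, 0, 0]
print(smp_eq(x, x), smp_eq(x, y))
```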

Approximate rank. Given a matrix M, rank_ε(M) = min {rank(M'): |M_{xy} − M'_{xy}| ≤ ε}. [Thm] Rpriv(f) = Ω(log rank_ε(M_f)). Note this bound cannot hold for public-coin protocols: Rpub(EQ) = O(1), yet I_N cannot be brought to O(1) rank by perturbing each entry by 0.01 (a 0.01-perturbed identity still has rank Θ(log N) = Θ(n) [Alon]). So Rpriv(EQ) = Θ(log n) while Rpub(EQ) = O(1).

Newman's result. We've just seen a gap between Rpub and Rpriv. [Newman91] Rpriv(f) ≤ Rpub(f) + O(log n), so the gap is never more than O(log n). Let's see whether we have time for the proof at the end of the talk.

Hamming Distance. HD(x,y) = |x⊕y| = |{i: x_i ≠ y_i}|. Ham_d(x,y) = 1 if HD(x,y) ≤ d, and 0 if HD(x,y) > d. [Yao03] R∥,pub(Ham_d) = O(d²). [GKdW04] R∥,pub(Ham_d) = O(d log n). [HSZZ06] R∥,pub(Ham_d) = O(d log d); also shows Rpub(Ham_d) = Ω(d).

[HSZZ06] R∥,pub(Ham_d) = O(d log d). Assume that HD(x,y) ≤ m = O(d²); this is w.l.o.g., since distinguishing HD(x,y) ≤ d from HD(x,y) > m is easy.

[HSZZ06] R∥,pub(Ham_d) = O(d log d). Protocol: using public coins, randomly partition [n] into m blocks B(1), …, B(m); set a_j = Parity(x_{B(j)}) and b_j = Parity(y_{B(j)}) for all j ∊ [m]; then run the GKdW04 protocol for Ham_d(a,b). Complexity: a and b have length m = O(d²), so the cost is O(d log(d²)) = O(d log d).

Correctness. [Fact] With high probability, each block contains at most one index i with x_i ≠ y_i. In that case HD(x_{B(j)}, y_{B(j)}) = Parity(x_{B(j)} ⊕ y_{B(j)}) = Parity(x_{B(j)}) ⊕ Parity(y_{B(j)}) = a_j ⊕ b_j for every j, hence HD(x,y) = HD(a,b) and Ham_d(x,y) = Ham_d(a,b).
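A sketch of the reduction step in Python (the GKdW04 subprotocol is abstracted away; the names are ours):

```python
import random

def block_parity_reduction(x, y, m):
    # Publicly shared random partition of [n] into m blocks.
    n = len(x)
    block_of = [random.randrange(m) for _ in range(n)]
    a = [0] * m
    b = [0] * m
    for i in range(n):
        a[block_of[i]] ^= x[i]   # a_j = parity of x restricted to block j
        b[block_of[i]] ^= y[i]   # b_j = parity of y restricted to block j
    # Whp HD(a, b) == HD(x, y) when m = Omega(HD(x,y)^2) (birthday bound).
    return a, b

x = [1, 0, 1, 1, 0, 0, 1, 0]
y = [1, 1, 1, 1, 0, 0, 0, 0]
a, b = block_parity_reduction(x, y, m=16)
print(sum(ai != bi for ai, bi in zip(a, b)),
      "vs true", sum(xi != yi for xi, yi in zip(x, y)))
```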

XOR functions. EQ and Ham_d are XOR functions, i.e., of the form f(x⊕y): for EQ, f = ¬OR (f(z) = 1 iff z = 00…0); for Ham_d, f is a threshold function of |z| at d. XOR functions are an important class for communication complexity: relation to decision tree complexity; position in strongly balanced composed functions. What can we say about the CC of XOR functions?

Symmetric XOR functions. f is symmetric: f(x) depends only on |x|, i.e., f(x) = f(π(x)) for all π ∊ S_n; equivalently f(x) = S(|x|) for some S: {0,…,n} → {0,1}. Let r = r0 + r1, where r0, r1 ≤ n/2 are the minimum integers s.t. S(k) = S(k+2) for all k ∊ [r0, n−r1). [SZ09] For symmetric XOR functions, R(f(x⊕y)) = Θ̃(r).

R(f(x⊕y)) = Õ(r). Protocol: (1) Use Ham_{r0} and Ham_{r1} (the latter applied to ¬x and y) to decide whether |x⊕y| ∊ [0, r0), [r0, n−r1), or [n−r1, n]. (2) If |x⊕y| ∊ [r0, n−r1): Alice sends Parity(x). (3) If |x⊕y| ∊ [0, r0) or [n−r1, n]: Alice and Bob use binary search with Ham_d subprotocols to find |x⊕y| exactly. Output S(|x⊕y|). Complexity: O(r log r) + 1 + (log₂ r)·O(r log r) = Õ(r). Correctness: in case (3) Alice and Bob find |x⊕y| exactly. In case (2), S depends only on the parity of |x⊕y| on [r0, n−r1), by the definition of r, and Parity(x⊕y) = Parity(x) ⊕ Parity(y), so Alice sending Parity(x) is enough. A sketch follows below.
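A sketch of the protocol's control flow in Python. Here ham_leq(x, y, d) stands in for a randomized Ham_d subprotocol and is computed directly for illustration; everything else (names, binary-search details) is our reconstruction, not the paper's code:

```python
def ham_leq(x, y, d):
    # Stand-in for the Ham_d subprotocol: test whether HD(x, y) <= d.
    return sum(xi != yi for xi, yi in zip(x, y)) <= d

def find_distance(x, y, lo, hi):
    # Binary search for HD(x, y) in [lo, hi] with O(log(hi - lo)) Ham_d calls.
    while lo < hi:
        mid = (lo + hi) // 2
        if ham_leq(x, y, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

def sym_xor_protocol(x, y, S, r0, r1):
    # S: list of length n+1 with S[k] = f on inputs of weight k.
    n = len(x)
    not_x = [1 - xi for xi in x]           # HD(not_x, y) = n - HD(x, y)
    if ham_leq(x, y, r0 - 1):              # |x^y| in [0, r0)
        return S[find_distance(x, y, 0, r0 - 1)]
    if ham_leq(not_x, y, r1):              # |x^y| in [n-r1, n]
        return S[n - find_distance(not_x, y, 0, r1)]
    # |x^y| in [r0, n-r1): S depends only on the parity of |x^y| there.
    parity = (sum(x) + sum(y)) % 2         # = Parity(x^y)
    k = r0 if r0 % 2 == parity else r0 + 1 # any weight in range with that parity
    return S[k]
```

Only O(log r) Ham_d calls are made in total, each with threshold at most max(r0, r1) ≤ r, which is where the Õ(r) bound comes from.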

Questions for you. [SZ09] shows R(f(x⊕y)) = Θ̃(r) and R1,pub(f(x⊕y)) = Õ(r²); in the one-way model no binary search is possible. Question: close the gap for R1(f(x⊕y))! Question: or even pin down R∥,pub(f(x⊕y))? Question: what can we say about general XOR functions?

Thanks

Proof of [Newman91]: Rpriv_{ε+δ}(f) ≤ Rpub_ε(f) + O(log n + log δ⁻¹). Step 1: an ε-error public-coin protocol P can be converted into an (ε+δ)-error public-coin protocol using only O(log n + log δ⁻¹) random bits. Indeed, there exist r1, …, rt s.t. for all (x,y): Pr_{i∊[t]} [P(x,y,ri) ≠ f(x,y)] < ε+δ. (*) Why do such ri exist? Choose them randomly! For each fixed (x,y), Pr_{r1…rt}[(*) fails at (x,y)] < exp(−δ²t) // Chernoff bound, which is < 2^{−2n} when t = O(n/δ²). A union bound over all 2^{2n} pairs gives Pr_{r1…rt}[∃(x,y) s.t. (*) fails] < 1, i.e., some choice of r1, …, rt satisfies (*) for all (x,y). Step 2: Alice samples i ∊_R [t] with her private coins and sends it to Bob (log₂ t = O(log n + log δ⁻¹) bits); they then run P with ri as the randomness.
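A small empirical check of the existence argument, using the k-repetition public-coin EQ protocol from earlier (the constant in t = O(n/δ²) and all names are our guesses for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 10                  # input length; k repetitions -> eps = 2**-k
delta = 0.05
t = int(2 * n / delta**2)     # O(n / delta^2) sampled public strings

# All 2^n inputs as rows of a 0/1 matrix.
N = 2 ** n
X = np.array([[(v >> j) & 1 for j in range(n)] for v in range(N)])
truth = np.eye(N, dtype=bool)              # EQ(x, y)
err_counts = np.zeros((N, N))
for _ in range(t):
    R = rng.integers(0, 2, size=(k, n))    # one public string r_i = k vectors
    sig = X @ R.T % 2                      # k-bit fingerprints of all inputs
    same = (sig[:, None, :] == sig[None, :, :]).all(axis=2)  # protocol output
    err_counts += (same != truth)

worst = err_counts.max() / t   # worst-case Pr_i[ P(x,y,r_i) != EQ(x,y) ]
print(f"worst pair error {worst:.4f} vs eps+delta = {2**-k + delta:.4f}")
```

With these parameters the worst per-pair error over the t sampled strings stays below ε+δ, as condition (*) predicts.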

Thanks Again