Shengyu Zhang The Chinese University of Hong Kong.



Quantum Computing and Communication Complexity
Question: what is the largest gap between classical and quantum communication complexities?
Communication complexity interacts with many areas: algorithms, information theory, cryptography, games, circuit lower bounds, streaming algorithms, VLSI, data structures, and more.

Communication complexity [Yao79]
Two parties, Alice and Bob, jointly compute a function f(x,y), with x known only to Alice and y known only to Bob.
Communication complexity: how many bits need to be exchanged?
(Figure: Alice holds x, Bob holds y; they exchange messages and output f(x,y).)

Various protocols
Deterministic: D(f).
Randomized: R(f).
–A bounded error probability is allowed.
–Private or public coins? The two differ by at most O(log n).
Quantum: Q(f).
–A bounded error probability is allowed.
–Assumption: no shared entanglement. (Does entanglement help? Open.)

Communication complexity: the one-way model
One-way: Alice sends a single message to Bob, who outputs the answer. The corresponding complexities are D1(f), R1(f), Q1(f).
(Figure: Alice holds x and sends one message to Bob, who holds y and outputs f(x,y).)

About the one-way model
Power:
–Efficient protocols for specific functions such as Equality and Hamming Distance, and in general for all symmetric XOR functions; for these, one-way communication is as efficient as the best two-way protocol.
Applications:
–Lower bounds for the space complexity of streaming algorithms.
Lower bounds? They can be quite hard to prove, especially in the quantum case.
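
As an illustration of the one-way model's power on Equality, here is a minimal sketch of the standard public-coin fingerprinting protocol (the function name and parameters are mine, not from the slides): Alice sends k random parities of her input, and Bob accepts iff they all match his own parities.

```python
import random

def equality_one_way(x, y, k=32, seed=None):
    """One-way public-coin protocol for Equality (illustrative sketch).
    Alice sends the k parity bits <x, r_i> mod 2 for shared random
    vectors r_i; Bob compares them with his own parities <y, r_i>.
    If x != y, each round detects the difference with probability 1/2,
    so the error probability is at most 2^-k. Communication: k bits,
    versus n bits for the trivial deterministic protocol."""
    rng = random.Random(seed)
    n = len(x)
    rs = [[rng.randrange(2) for _ in range(n)] for _ in range(k)]  # public coins
    msg = [sum(xi * ri for xi, ri in zip(x, r)) % 2 for r in rs]   # Alice's message
    bobs = [sum(yi * ri for yi, ri in zip(y, r)) % 2 for r in rs]  # Bob's parities
    return msg == bobs  # Bob outputs "equal" iff every parity matches
```

With k = O(log(1/ε)) this gives a one-way randomized protocol with error ε and O(log(1/ε)) bits of communication.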

Question
Question: what is the largest gap between classical and quantum communication complexities?
Partial functions and relations: exponential gaps are known.
Total functions, two-way:
–Largest known gap: Q(Disj) = Θ(√n) vs. R(Disj) = Θ(n).
–Best known bound: R(f) = exp(Q(f)).
–Conjecture: R(f) = poly(Q(f)).

Question
Question: what is the largest gap between classical and quantum communication complexities?
Partial functions and relations: exponential gaps are known.
Total functions, one-way:
–Largest known gap: R1(EQ) = 2·Q1(EQ).
–Best known bound: R1(f) = exp(Q1(f)).
–Conjecture: R1(f) = poly(Q1(f)), or even R1(f) = O(Q1(f)).

Approaches
Approach 1: directly simulate a quantum protocol by a classical one.
–[Aaronson] R1(f) = O(m·Q1(f)), where m is the length of Bob's input.
Approach 2: find a lower bound L(f) with L(f) ≤ Q1(f) ≤ R1(f) ≤ poly(L(f)).
–[Nayak'99; Jain, Z.'09] R1(f) = O(Iμ·VC(f)), where Iμ is the mutual information of any hard distribution μ.
Note: for Approach 2 to possibly succeed, the quantum lower bound L(f) has to be polynomially tight for Q1(f).

Main result
There are three lower bound techniques known for Q1(f):
–Nayak'99: Partition Tree;
–Aaronson'05: Trace Distance;
–the two-way complexity Q(f).
[Thm] All of these lower bounds can be arbitrarily weak. In fact, random functions have Q(f) = Ω(n), but the first two lower bounds only give O(1).

Next
–A closer look at the Partition Tree bound.
–A comparison of Q with the Partition Tree (PT) and Trace Distance (TD) bounds.

Nayak's information-theoretic argument
[Nayak'99] Q1(Index) = Ω(n), where Alice holds x ∈ {0,1}^n and sends the state ρ_x, Bob holds i ∈ [n], and Index(x,i) = x_i.
–ρ_x contains Ω(1) information about x_1, since i may be 1.
–Regardless of x_1, ρ_x contains Ω(1) information about x_2.
–And so on.

Nayak's information-theoretic argument
Let ρ = Σ_x p_x·ρ_x and ρ_b = 2·Σ_{x: x_1 = b} p_x·ρ_x. Then
S(ρ) = S(½ρ_0 + ½ρ_1)
 ≥ I(X_1; M_1) + ½S(ρ_0) + ½S(ρ_1)   // Holevo bound; M_1 is Bob's conclusion about X_1
 ≥ 1 − H(ε) + ½S(ρ_0) + ½S(ρ_1)      // Fano's inequality
 ≥ … ≥ n(1 − H(ε)).

Partition tree
ρ = Σ_x p_x·ρ_x
ρ_b = 2·Σ_{x: x_1 = b} p_x·ρ_x
ρ_{b_1 b_2} = 4·Σ_{x: x_1 = b_1, x_2 = b_2} p_x·ρ_x
(Figure: a binary tree with root ρ, children ρ_0 and ρ_1, and grandchildren ρ_00, ρ_01, ρ_10, ρ_11.)

Partition tree
ρ = Σ_x p_x·ρ_x. In general, the argument takes:
–a distribution p on {0,1}^n;
–a partition tree for {0,1}^n;
–a gain of H(δ) − H(ε) at each vertex v, where v is partitioned with weights (δ, 1−δ).

Issue
[Fano's inequality] I(X;Y) ≥ H(X) − H(ε), for X, Y over {0,1} and ε = Pr[X ≠ Y].
What if H(δ) < H(ε)?
–Idea 1: use success amplification to decrease ε to some ε*.
–Idea 2: give up the vertices v where H(X_v) is small.
Bound: max_{T,p,ε*} log(1/ε*)·Σ_v p(v)·[H(X_v) − H(ε*)]⁺.
Question: how can we calculate this?
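
The interplay between H(δ) and H(ε) here is plain binary-entropy arithmetic. The following small sketch (helper names are mine) computes the per-vertex gain [H(δ) − H(ε*)]⁺ and shows why a skewed split contributes nothing until the error is amplified down:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def vertex_gain(delta, eps):
    """Gain [H(delta) - H(eps)]^+ of a vertex split with bias delta at
    error rate eps: the vertex contributes only when its split entropy
    H(delta) exceeds the Fano loss H(eps)."""
    return max(binary_entropy(delta) - binary_entropy(eps), 0.0)

# A skewed split (delta = 0.05, H ≈ 0.29) gains nothing at eps = 1/3
# (H ≈ 0.92), but does contribute after amplification to eps* = 0.001.
```

This is why Idea 1 (amplification to ε*) and Idea 2 (discarding low-entropy vertices) both appear in the bound.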

Picture clear
The bound max_{T,p,ε*} log(1/ε*)·Σ_v p(v)·[H(X_v) − H(ε*)]⁺ looks very complicated.
Compare to Index, where the tree is a complete binary tree and each H(δ_v) = 1 (i.e., δ_v = 1/2).
[Thm] The maximization is achieved by a complete binary tree with δ_v = 1/2 everywhere.

Two interesting comparisons
Comparison to decision trees:
–Decision tree complexity: make the longest path short.
–Here: make the shortest path long.
Comparison to the VC-dimension lower bound:
[Thm] The value is exactly the extensive equivalence query complexity.
–A measure from learning theory.
–This strengthens Nayak's VC-dimension lower bound.

Trace distance bound
[Aaronson'05] Let
–μ be a distribution on the 1-inputs of f,
–D_1 be the distribution of (x,y) ← μ,
–D_2 be the distribution obtained by drawing y ← μ and then two independent copies x_1, x_2 ← μ_y.
Then Q1(f) = Ω(log(1/‖D_2 − D_1^2‖_1)).

Separation
[Thm] Take a random graph G(N,p) with ω(log^4 N / N) ≤ p ≤ 1 − Ω(1). Its adjacency matrix, viewed as a bivariate function f, satisfies Q(f) = Ω(log(pN)) with probability 1 − o(1).
Proof idea: Q(f) ≥ Q*(f) = Ω(log(1/disc(f))), and disc(f) is related to σ_2(D^{−1/2} A D^{−1/2}), which can be bounded by O(1/√(pN)) for a random graph.
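
The spectral quantity σ_2(D^{−1/2} A D^{−1/2}) is easy to probe numerically. This sketch (function name mine, assuming NumPy is available) samples G(N,p) and returns the second singular value of the normalized adjacency matrix; it is an illustrative check, not part of the proof:

```python
import numpy as np

def sigma2_normalized_adjacency(N, p, seed=0):
    """Sample G(N, p), form D^{-1/2} A D^{-1/2}, and return its second
    largest singular value, which for a random graph is O(1/sqrt(pN))."""
    rng = np.random.default_rng(seed)
    upper = rng.random((N, N)) < p            # Bernoulli(p) entries
    A = np.triu(upper, 1)                     # keep strict upper triangle
    A = (A | A.T).astype(float)               # symmetric adjacency, no loops
    d = A.sum(axis=1)
    d[d == 0] = 1.0                           # guard against isolated vertices
    dinv_sqrt = 1.0 / np.sqrt(d)
    M = A * np.outer(dinv_sqrt, dinv_sqrt)    # D^{-1/2} A D^{-1/2}
    s = np.linalg.svd(M, compute_uv=False)    # singular values, descending
    return s[1]                               # the largest is ~1; take the next
```

For example, with N = 300 and p = 0.2 the value comes out around 2/√(pN) ≈ 0.26, consistent with the O(1/√(pN)) bound quoted above.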

[Thm] For p = N^{−Ω(1)}, PT(f) = O(1) w.h.p.
–By our characterization, it suffices to consider a complete binary tree.
–For p = N^{−Ω(1)}, each layer of the tree shrinks the number of 1's by a factor of p: pN → p²N → p³N → … → 0 in only O(1) steps.
[Thm] For p = o(N^{−6/7}), TD(f) = O(1) w.h.p.
–The proof is quite technical and omitted here.
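
The pN → p²N → … shrinking argument is just exponent arithmetic. A toy calculation (helper name mine) counts the layers for p = N^{−α} exactly, using rational exponents to avoid floating-point edge cases:

```python
from fractions import Fraction

def pt_layers(alpha):
    """For p = N^(-alpha), the expected number of 1's surviving k layers
    is p^k * N = N^(1 - k*alpha). The exponent reaches zero or below
    after ceil(1/alpha) layers -- a constant independent of N, matching
    the PT(f) = O(1) claim above."""
    k, exponent = 0, Fraction(1)
    while exponent > 0:
        exponent -= Fraction(alpha)  # one layer multiplies the count by N^(-alpha)
        k += 1
    return k
```

For instance, α = 1/2 gives 2 layers and α = 1/7 gives 7, regardless of N.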

Putting together
[Thm] Take a random graph G(N,p) with ω(log^4 N / N) ≤ p ≤ 1 − Ω(1). Its adjacency matrix, viewed as a bivariate function f, satisfies Q(f) = Ω(log(pN)) with probability 1 − o(1).
[Thm] For p = o(N^{−6/7}), TD(f) = O(1) w.h.p.
[Thm] For p = N^{−Ω(1)}, PT(f) = O(1) w.h.p.
Taking p between ω(log^4 N / N) and o(N^{−6/7}) gives the separation.

Discussions
–These are negative results on the tightness of the known quantum lower bound methods; they call for new methods.
–Can the advantages of these methods somehow be combined?
–We hope the paper sheds some light on this by identifying their weaknesses.