Quantum One-Way Communication is Exponentially Stronger than Classical Communication


Quantum One-Way Communication is Exponentially Stronger than Classical Communication
Bo'az Klartag (Tel Aviv University) and Oded Regev (Tel Aviv University)

Communication Complexity
Alice is given input x and Bob is given y. Their goal is to compute some (possibly partial) function f(x,y) using the minimum amount of communication. Two central models:
1. Classical (randomized bounded-error) communication
2. Quantum communication
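To make the classical (randomized, bounded-error) model concrete, here is a minimal sketch, not from the slides, of the textbook one-way fingerprinting protocol for the Equality function: Alice sends her n-bit input reduced modulo a shared random prime, so only O(log n) bits cross the channel. The helper random_prime and all names are illustrative.

import random

def random_prime(rng: random.Random, m: int) -> int:
    # Pick a uniformly random prime in [2, m^2] by rejection sampling.
    def is_prime(k: int) -> bool:
        return k > 1 and all(k % d for d in range(2, int(k ** 0.5) + 1))
    while True:
        candidate = rng.randrange(2, m * m + 1)
        if is_prime(candidate):
            return candidate

def equality_protocol(x: int, y: int, n: int, seed: int) -> bool:
    # One-way randomized protocol for Equality on n-bit inputs x and y.
    rng = random.Random(seed)        # shared public randomness
    p = random_prime(rng, 2 * n)     # a prime with O(log n) bits
    alice_message = x % p            # Alice -> Bob: O(log n) bits instead of n
    return alice_message == y % p    # errs with probability O(log n / n) when x != y

Since x != y implies x - y has at most n prime divisors, while there are about 4n^2 / ln(4n^2) primes below (2n)^2, the protocol errs only with small probability, illustrating how shared randomness beats the deterministic n-bit cost.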

Relation Between Models

Is One-way Communication Enough?
Raz's quantum protocol, however, requires two rounds of communication. This naturally leads to the following fundamental question: Is quantum one-way communication exponentially stronger than classical communication?

Previous Work
[Bar-Yossef-Jayram-Kerenidis'04] showed a relational problem for which quantum one-way communication is exponentially stronger than classical one-way communication.
This was improved to a (partial) function by [Gavinsky-Kempe-Kerenidis-Raz-deWolf'07].
[Gavinsky'08] showed a relational problem for which quantum one-way communication is exponentially stronger than classical communication.

Our Result

Vector in Subspace Problem [Kremer'95, Raz'99]
Alice is given a unit vector v ∈ ℝ^n and Bob is given an n/2-dimensional subspace W ⊆ ℝ^n.
They are promised that either v is in W or v is in W^⊥.
Their goal is to decide which is the case using the minimum amount of communication.
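As a concrete illustration, not from the slides, here is a minimal numpy sketch of how one might sample a random instance of the problem; the function name random_vis_instance is made up for this example.

import numpy as np

def random_vis_instance(n: int, in_subspace: bool, seed: int = 0):
    # Returns (v, W): W is an n x (n/2) orthonormal basis of Bob's subspace,
    # and v is a unit vector lying in W (if in_subspace) or in its orthogonal
    # complement (otherwise), as the promise requires.
    rng = np.random.default_rng(seed)
    # Orthonormalizing a Gaussian matrix gives a uniformly random subspace.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    W, W_perp = Q[:, : n // 2], Q[:, n // 2:]
    basis = W if in_subspace else W_perp
    v = basis @ rng.standard_normal(n // 2)
    return v / np.linalg.norm(v), W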

Vector in Subspace Problem
There is an easy log n qubit one-way protocol: Alice sends a log n qubit state corresponding to her input and Bob performs the projective measurement specified by his input.
No classical lower bound was known. We settle the open question by proving R(VIS) = Ω(n^{1/3}).
This is nearly tight, as there is an O(n^{1/2}) classical protocol.
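For intuition, the log n qubit protocol can be simulated classically on small instances (a sketch under the assumptions of the sampler above, not from the slides): Alice's message is the quantum state whose amplitude vector is v, and Bob measures with the projector onto W, so the acceptance probability is the squared length of the projection of v onto W.

import numpy as np

def quantum_protocol_outcome(v: np.ndarray, W: np.ndarray) -> bool:
    # Simulate Bob's projective measurement {P_W, I - P_W} on Alice's state v.
    P_W = W @ W.T                                   # projector onto Bob's subspace
    prob_in_W = float(np.linalg.norm(P_W @ v) ** 2)
    # Under the promise this probability is 1 (v in W) or 0 (v in W_perp),
    # so a single measurement of a log2(n)-qubit message decides with certainty.
    return prob_in_W > 0.5

# Hypothetical usage with the sampler sketched earlier:
# v, W = random_vis_instance(16, in_subspace=True)
# assert quantum_protocol_outcome(v, W)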

Open Questions
Improve the lower bound to a tight n^{1/2}. Should be possible using the "smooth rectangle bound".
Improve to a functional separation between quantum SMP and classical. Seems very challenging, and maybe even impossible.

The Proof

The Main Sampling Statement
A routine application of the rectangle bound (which we omit) shows that the following implies the Ω(n^{1/3}) lower bound:
Thm 1: Let A ⊆ S^{n-1} be an arbitrary set of measure at least exp(-n^{1/3}). Let H be a uniform n/2-dimensional subspace. Then the measure of A ∩ H is within a factor of 1 ± 0.1 of the measure of A, except with probability at most exp(-n^{1/3}).
Remark: this is tight.
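A quick Monte Carlo illustration of the flavor of Thm 1 (not from the talk; the spherical cap, dimension, and sample sizes are arbitrary choices, far from the asymptotic regime): we compare the measure of a set A on the whole sphere with its measure inside a random n/2-dimensional subspace H.

import numpy as np

def cap_measure_on_sphere(n, threshold, samples, rng):
    # Measure of the cap A = {x in S^{n-1} : x_1 >= threshold}.
    x = rng.standard_normal((samples, n))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    return float(np.mean(x[:, 0] >= threshold))

def cap_measure_on_random_subspace(n, threshold, samples, rng):
    # Measure of the same cap among unit vectors of a uniform n/2-dim subspace H.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    H = Q[:, : n // 2]                              # orthonormal basis of H
    x = rng.standard_normal((samples, n // 2)) @ H.T
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    return float(np.mean(x[:, 0] >= threshold))

rng = np.random.default_rng(0)
n = 200
print(cap_measure_on_sphere(n, 0.1, 200_000, rng),
      cap_measure_on_random_subspace(n, 0.1, 200_000, rng))
# The two estimates are typically close, as the sampling statement predicts.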

Sampling Statement for Equators
Thm 1 is proven by a recursive application of the following:
Thm 2: Let A ⊆ S^{n-1} be an arbitrary set of measure at least exp(-n^{1/3}). Let H be a uniform (n-1)-dimensional subspace. Then the measure of A ∩ H is within a factor of 1 ± t of the measure of A, except with probability at most exp(-t·n^{2/3}).
So the error is typically 1 ± n^{-2/3} and has an exponential tail.

Thm 1 from Thm 2
Here is an equivalent way to choose a uniform n/2-dimensional subspace: first choose a uniform (n-1)-dimensional subspace, then choose inside it a uniform (n-2)-dimensional subspace, etc.
Thm 2 shows that at each step we get an extra multiplicative error of 1 ± n^{-2/3}. Hence, after n/2 steps, the error becomes 1 ± n^{1/2}·n^{-2/3} = 1 ± n^{-1/6}.
Assuming normal behavior, this means the probability of deviating by more than 1 ± 0.1 is at most exp(-n^{1/3}).
(Actually proving all of this requires a very delicate martingale argument…)
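The heuristic behind the last two steps, written out explicitly (my reconstruction; the slide only states the conclusion): treating the n/2 per-step errors as independent mean-zero fluctuations of size n^{-2/3}, they add in quadrature,
\[
\text{total error} \;\approx\; \sqrt{n/2}\cdot n^{-2/3} \;\approx\; n^{1/2}\cdot n^{-2/3} \;=\; n^{-1/6},
\]
and a Gaussian tail bound then gives
\[
\Pr\bigl[\,|\text{total error}| > 0.1\,\bigr] \;\approx\; \exp\!\Bigl(-\Omega\bigl((0.1/n^{-1/6})^{2}\bigr)\Bigr) \;=\; \exp\bigl(-\Omega(n^{1/3})\bigr).
\]
Making this rigorous is where the delicate martingale argument mentioned above comes in, since the successive steps are not literally independent.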

Proof of Theorem 2
The proof of Theorem 2 is based on:
– the hypercontractive inequality on the sphere,
– the Radon transform, and
– spherical harmonics.
We now present a proof of an analogous statement in the possibly more familiar setting of the hypercube {0,1}^n.

Hypercube Sampling
For y ∈ {0,1}^n, let y^⊥ be the set of all w ∈ {0,1}^n at Hamming distance n/2 from y.
Let A ⊆ {0,1}^n be of measure μ(A) := |A|/2^n at least exp(-n^{1/3}).
Assume we choose a uniform y ∈ {0,1}^n. Then our goal is to show that the fraction of points of y^⊥ contained in A is (1 ± n^{-1/3})·μ(A), except with probability at most exp(-n^{1/3}).
(This is actually false for a minor technical reason…)
Equivalently, our goal is to prove that for all A, B ⊆ {0,1}^n of measure at least exp(-n^{1/3}), the average over a uniform y ∈ B of the fraction of y^⊥ lying in A is (1 ± n^{-1/3})·μ(A).
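A toy numerical check of the equator-sampling statement for small n (not from the talk; n and the density of A are arbitrary and far from the asymptotic regime):

import itertools
import numpy as np

n = 12
rng = np.random.default_rng(1)
points = np.array(list(itertools.product([0, 1], repeat=n)))  # all of {0,1}^n
in_A = rng.random(len(points)) < 0.2       # a random set A of measure about 0.2
mu_A = in_A.mean()

y = rng.integers(0, 2, size=n)             # uniform y in {0,1}^n
dist = (points ^ y).sum(axis=1)            # Hamming distance from each point to y
on_equator = dist == n // 2                # the set y_perp
print(mu_A, in_A[on_equator].mean())       # the two fractions should be close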

Radon Transform
For a function f : {0,1}^n → ℝ, define its Radon transform R(f) : {0,1}^n → ℝ by averaging over equators: R(f)(y) is the average of f over y^⊥.
Define f = 1_A/μ(A) and g = 1_B/μ(B).
Then our goal is to prove that ⟨g, R(f)⟩ = 1 ± n^{-1/3}.
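Written out in full (my reconstruction of the displayed formulas, which did not survive in this transcript), with d_H denoting Hamming distance and expectations over the uniform measure:
\[
R(f)(y) \;:=\; \mathop{\mathbb{E}}_{w \in y^{\perp}} f(w) \;=\; \binom{n}{n/2}^{-1} \sum_{w :\, d_H(w,y)=n/2} f(w),
\qquad
\langle g, R(f)\rangle \;:=\; \mathop{\mathbb{E}}_{y}\bigl[g(y)\,R(f)(y)\bigr] \;=\; 1 \pm n^{-1/3}.
\]
With f = 1_A/\mu(A) and g = 1_B/\mu(B), the quantity \langle g, R(f)\rangle is exactly the expected fraction of the equator y^{\perp} of a uniform y \in B that falls inside A, normalized by \mu(A), so this is the same goal as on the previous slide.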

Fourier Transform
So our goal is to prove that ⟨g, R(f)⟩ = 1 ± n^{-1/3}. This is equivalent to showing that the nonconstant Fourier levels contribute at most n^{-1/3} in absolute value.
It is easy to check that R is diagonal in the Fourier basis, and that its eigenvalues (by level) are 1, 0, -1/n, 0, 1/n², … Hence the sum above is dominated by its level-2 term, the contribution of levels 4 and higher being negligible.
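My reconstruction of the displayed sum (the formulas were lost in the transcript), using that the empty-set Fourier coefficients of f and g equal 1 and that the odd levels have eigenvalue 0:
\[
\langle g, R(f)\rangle \;=\; \sum_{S\subseteq[n]} \lambda_{|S|}\,\hat f(S)\,\hat g(S)
\;=\; 1 \;-\; \frac{1}{n}\sum_{|S|=2} \hat f(S)\,\hat g(S) \;+\; \underbrace{\sum_{|S|\ge 4} \lambda_{|S|}\,\hat f(S)\,\hat g(S)}_{\text{negligible}} .
\]
By Cauchy–Schwarz, the level-2 term is at most
\[
\frac{1}{n}\Bigl(\sum_{|S|=2}\hat f(S)^{2}\Bigr)^{1/2}\Bigl(\sum_{|S|=2}\hat g(S)^{2}\Bigr)^{1/2},
\]
which is where the Average Bias lemma on the next slides enters: each factor is O(log(1/\mu)) = O(n^{1/3}), giving the desired O(n^{-1/3}) bound.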

Average Bias
Lemma [Gavinsky-Kempe-Kerenidis-Raz-deWolf'07]: Let A ⊆ {0,1}^n be of measure μ, and let f = 1_A/μ(A) be its normalized indicator function. Then the level-2 Fourier weight of f is at most O((log(1/μ))²).
Equivalently, this lemma says the following: if (x_1,…,x_n) is uniformly chosen from A, then the sum over all pairs {i,j} of the bias squared of x_i ⊕ x_j is at most (log(1/μ))².
This is tight: take, e.g., the subcube fixing the first log(1/μ) coordinates to 0 (see the calculation below).
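Written out (my reconstruction of the missing formula and of the tightness example): since \hat f(\{i,j\}) = \mathbb{E}_{x\in A}[(-1)^{x_i\oplus x_j}] is exactly the bias of x_i \oplus x_j for a uniform x \in A, the lemma states
\[
\sum_{|S|=2}\hat f(S)^{2} \;=\; \sum_{\{i,j\}} \Bigl(\mathop{\mathbb{E}}_{x\in A}\bigl[(-1)^{x_i\oplus x_j}\bigr]\Bigr)^{2} \;\le\; O\bigl(\log(1/\mu)\bigr)^{2}.
\]
For tightness, take the subcube A = \{x : x_1 = \cdots = x_k = 0\} with k = \log_2(1/\mu): every pair \{i,j\} \subseteq \{1,\dots,k\} has bias 1, so the sum is \binom{k}{2} \approx (\log(1/\mu))^{2}/2.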

Average Bias
Lemma [Gavinsky-Kempe-Kerenidis-Raz-deWolf'07]: Let A ⊆ {0,1}^n be of measure μ, and let f = 1_A/μ(A) be its normalized indicator function. Then the level-2 Fourier weight of f is at most O((log(1/μ))²).
Proof: By the hypercontractive inequality, for all 1 ≤ p ≤ 2, ‖T_{√(p-1)} f‖₂ ≤ ‖f‖_p, where T_ρ denotes the noise operator.
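The rest of the computation, as I would reconstruct it (the displayed inequalities did not survive in the transcript): expanding the noise operator in the Fourier basis,
\[
\sum_{S}(p-1)^{|S|}\,\hat f(S)^{2} \;=\; \|T_{\sqrt{p-1}}\,f\|_2^{2} \;\le\; \|f\|_p^{2} \;=\; \bigl(\mu\cdot\mu^{-p}\bigr)^{2/p} \;=\; \mu^{2/p-2},
\]
so keeping only the |S| = 2 terms and choosing p = 1 + 1/\ln(1/\mu) (valid since \mu \le e^{-1}),
\[
\sum_{|S|=2}\hat f(S)^{2} \;\le\; (p-1)^{-2}\,\mu^{2/p-2} \;\le\; e^{2}\,\bigl(\ln(1/\mu)\bigr)^{2} \;=\; O\bigl(\log(1/\mu)\bigr)^{2}.
\]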