1
The Equivalence of Sampling and Searching
Scott Aaronson (MIT)
2
In complexity theory, we love at least four types of problems. Given an input x ∈ {0,1}^n:
Languages / Decision Problems. Decide if x ∈ L or x ∉ L.
Promise Problems. Decide if x ∈ YES or x ∈ NO.
Search Problems. Output an element of a (nonempty) set A_x ⊆ {0,1}^m, with probability 1−δ, in poly(n,1/δ) time.
Sampling Problems. Sample from a probability distribution D_x over m-bit strings, with error ε in variation distance, in poly(n,1/ε) time.
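To make the last two concrete, here is a toy pair of problems (a hypothetical example, not from the talk): take A_x to be the 8-bit strings whose last bit equals the parity of x, and D_x the uniform distribution over A_x. Any single fixed element solves the search problem, while the sampler must spread its output over all of A_x; a minimal sketch:

```python
import random

# Toy instance (hypothetical, for illustration only): A_x = the 8-bit strings
# whose last bit is the parity of x; D_x = the uniform distribution over A_x.
def parity(x: str) -> str:
    return str(x.count("1") % 2)

def solve_search(x: str, m: int = 8) -> str:
    """Search problem: output ANY element of the (nonempty) set A_x."""
    return "0" * (m - 1) + parity(x)  # a single fixed witness suffices

def solve_sampling(x: str, m: int = 8) -> str:
    """Sampling problem: output a draw from D_x, i.e. uniform over A_x."""
    prefix = "".join(random.choice("01") for _ in range(m - 1))
    return prefix + parity(x)

print(solve_search("1011"), solve_sampling("1011"))
```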
3
Suppose we want to know whether quantum computers are stronger than classical computers (to pick a random example of a complexity question). Then which formal question do we really mean to ask?
BPP vs. BQP?
PromiseBPP vs. PromiseBQP?
FBPP vs. FBQP?
SampBPP vs. SampBQP?
4
Easy Implications:
SampBPP = SampBQP ⟹ FBPP = FBQP ⟹ PromiseBPP = PromiseBQP ⟹ BPP = BQP
Crucial question: Can these implications be reversed? We show that at least one of them can:
FBPP = FBQP ⟺ SampBPP = SampBQP
5
Application to Linear Optics
[A.-Arkhipov, STOC'11] study a rudimentary type of quantum computer based entirely on linear optics: identical, non-interacting photons passing through a network of beamsplitters.
Our model doesn't seem to be universal for quantum computing (or even classical computing), but it can solve sampling problems that we give evidence are hard classically.
Using today's result, we automatically also get a search problem solvable with linear optics that ought to be hard classically.
6
But the QC stuff is just one application of a much more general result…
Informal Statement: Let S = {D_x}_x be any sampling problem. Then there exists a search problem R_S = {A_x}_x that's equivalent to S, in the following sense:
For any reasonable complexity class C (BPP, BQP, BPPSPACE, etc.),
R_S ∈ FC ⟺ S ∈ SampC
7
Intuition
Suppose our sampling problem is to sample uniformly from a set A ⊆ {0,1}^n.
First stab at an equivalent search problem: output any element of A.
That clearly doesn't work: finding an A element could be much easier than sampling a random element!
Better idea: output an element y ∈ A whose Kolmogorov complexity K(y) is close to log₂|A|.
8
Clearly, if we can sample a random y ∈ A, then with high probability K(y) ≈ log₂|A|.
But conversely, if a randomized machine M outputs a y with K(y) ≈ log₂|A|, it can only do so by sampling y almost-uniformly from A. For otherwise, M would yield a succinct description of y, contrary to assumption!
Technical part: Generalize to nonuniform distributions. Requires the notion of a universal randomness test from algorithmic information theory.
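K(y) itself is uncomputable, but the intuition can be made tangible using a compressor as a crude stand-in: any off-the-shelf compressed length upper-bounds K(y) up to an additive constant. A toy sketch (zlib as proxy; illustration only, not part of the talk):

```python
import random
import zlib

def k_upper_bound_bits(y: bytes) -> int:
    """Crude upper bound on K(y), in bits: the length of a zlib compression
    of y. (True Kolmogorov complexity is uncomputable; this is a proxy.)"""
    return 8 * len(zlib.compress(y, 9))

structured = b"01" * 512                                        # highly patterned: small K
random_ish = bytes(random.getrandbits(8) for _ in range(1024))  # near-incompressible
print(k_upper_bound_bits(structured), k_upper_bound_bits(random_ish))
```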
9
Comments
Our reduction from sampling to search is non-black-box: it requires the assumption that we have a Turing machine to solve R_S!
Our result provides an extremely natural application of Kolmogorov complexity to standard complexity: one that doesn't just amount to a counting argument.
If we just wanted a search problem at least as hard as S, that would be easy: Kolmogorov complexity only comes in because we need R_S to be equivalent to S.
10
Kolmogorov Review
K(y | x): prefix-free Kolmogorov complexity of y, conditioned on x.
Kolmogorentropy Lemma: Let D = {p_y} be a distribution, and let y be in its support. Then
K(y) ≤ log₂(1/p_y) + K(D) + O(1),
where K(D) is the length of the shortest program to sample from D. The same holds if we replace K(y) by K(y|x) and K(D) by K(D|x).
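As a sanity check (worked out here, not on the slide), specializing the lemma to a uniform distribution recovers the earlier intuition:

```latex
% Special case: D uniform over a finite set A, so p_y = 1/|A| for every y in A.
% The Kolmogorentropy Lemma then reads
\[
  K(y) \;\le\; \log_2 |A| \;+\; K(A) \;+\; O(1),
\]
% i.e. no element of A is much more complex than \log_2|A| bits plus a
% description of A itself, matching the "K(y) close to \log_2|A|" intuition.
```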
11
Constructing the Search Problem
We're given a sampling problem S = {D_x}_x, where on input x ∈ {0,1}^n and ε > 0, the goal is to sample an m-bit string from a distribution C that is ε-close to D = D_x, in poly(n,1/ε) time.
Let A_x be the set of N-tuples Y = ⟨y_1,…,y_N⟩ of m-bit strings whose conditional Kolmogorov complexity K(Y|x) is sufficiently large.
Then the search problem R_S is this: on input x ∈ {0,1}^n and δ > 0, output an N-tuple Y = ⟨y_1,…,y_N⟩ ∈ A_x, with probability 1−δ, in poly(n,1/δ) time.
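Algorithmically, the sampling-to-search direction is then almost trivial; all the work is in the analysis. A minimal sketch of its shape, assuming a sampler oracle sample(x, eps) and an accuracy schedule chosen purely for illustration:

```python
from typing import Callable, List

def solve_search_via_sampler(
    sample: Callable[[str, float], str],  # assumed oracle: returns a draw eps-close to D_x
    x: str,
    delta: float,
    N: int,
) -> List[str]:
    """Shape of the sampling -> search reduction: draw N independent samples
    and output the tuple. That the tuple lands in A_x with probability 1-delta
    is exactly what the counting argument on the next slide establishes;
    the algorithm itself never computes any Kolmogorov complexity."""
    eps = delta / (2 * N)  # illustrative choice: keep the N draws jointly close to D_x^N
    return [sample(x, eps) for _ in range(N)]
```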
12
Equivalence Proof
Lemma: Let C be any distribution over {0,1}^m such that |C − D_x| ≤ ε. Then an N-tuple Y = ⟨y_1,…,y_N⟩ of independent samples from C lies in A_x with high probability.
In other words, any algorithm that solves the sampling problem also solves the search problem w.h.p.
Proof: Counting argument.
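A sketch of that counting argument, reconstructed here under the assumption that A_x asks for K(Y|x) ≥ log₂(1/p_Y) − α, where p_Y is the probability of Y under D_x^N (the slide's exact threshold may differ):

```latex
% Probability that an honest sample from D^N falls OUTSIDE A_x:
\begin{align*}
  \Pr_{Y \sim D^N}\!\Big[\, K(Y \mid x) < \log_2 \tfrac{1}{p_Y} - \alpha \,\Big]
    &= \sum_{Y \,:\, p_Y < 2^{-K(Y \mid x) - \alpha}} p_Y \\
    &< 2^{-\alpha} \sum_{Y} 2^{-K(Y \mid x)} \;\le\; 2^{-\alpha},
\end{align*}
% using the Kraft inequality \sum_Y 2^{-K(Y \mid x)} \le 1 for prefix-free K.
% Since |C - D_x| \le \varepsilon implies |C^N - D_x^N| \le N\varepsilon,
% the same bound holds for samples from C^N up to an additive N\varepsilon.
```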
13
Lemma: Given a probabilistic Turing machine B, suppose that B(x, δ) outputs a tuple Y = ⟨y_1,…,y_N⟩ ∈ A_x with probability at least 1−δ. Let C be the distribution over m-bit strings obtained by running B(x, δ), then picking one of its N outputs y_1,…,y_N uniformly at random. Then there exists a constant Q_B such that |C − D_x| is small, with a bound depending only on δ and Q_B.
Proof Sketch: Use the Kolmogorentropy Lemma to show that B(x, δ)'s output distribution has small KL-divergence from D^N. As with the Parallel Repetition Theorem, this implies that C has small KL-divergence from D. By Pinsker's Inequality, this implies that |C − D_x| is small.
In other words: if B solves the search problem w.h.p., then it also solves the sampling problem.
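The last two steps of the sketch, written out with standard inequalities (the slide's exact constants are not reproduced). Here P_Y denotes B's output distribution over N-tuples and C the uniform mixture of its N coordinates:

```latex
\begin{align*}
  D_{\mathrm{KL}}\!\left(C \,\middle\|\, D_x\right)
    &\;\le\; \frac{1}{N}\, D_{\mathrm{KL}}\!\left(P_Y \,\middle\|\, D_x^{N}\right)
      && \text{(chain rule and convexity of KL)} \\[2pt]
  \lVert C - D_x \rVert
    &\;\le\; \sqrt{\tfrac{\ln 2}{2}\; D_{\mathrm{KL}}\!\left(C \,\middle\|\, D_x\right)}
      && \text{(Pinsker's inequality; KL measured in bits)}
\end{align*}
```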
14
Wrapping Up
Theorem: Let O be any oracle that, given x, 0^{1/ε}, and a random string r, outputs a sample from a distribution C such that |C − D_x| ≤ ε. Then R_S ∈ FBPP^O.
Let B be any probabilistic Turing machine that, given x and 0^{1/δ}, outputs a Y ∈ A_x with probability at least 1−δ. Then S ∈ SampBPP^B.
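Dually, the search-to-sampling direction also has a one-line algorithmic shape, mirroring the previous lemma (the machine B and its tuple length N are assumed given; a sketch only):

```python
import random
from typing import Callable, List

def sample_via_search_solver(
    B: Callable[[str, float], List[str]],  # assumed machine: outputs a tuple in A_x w.p. >= 1-delta
    x: str,
    delta: float,
) -> str:
    """Shape of the search -> sampling reduction: run B once and return a
    uniformly random coordinate of its output tuple. The Kolmogorentropy /
    Pinsker analysis shows this mixture is close to D_x in variation distance."""
    y = B(x, delta)
    return random.choice(y)
```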
15
Application to Quantum Complexity
Suppose FBPP = FBQP. Let S ∈ SampBQP. Then:
R_S ∈ FBQP [reduction from R_S to S]
R_S ∈ FBPP [by hypothesis]
S ∈ SampBPP [reduction from S to R_S]
Therefore SampBPP = SampBQP.
16
Open Problems
The converse direction: Given a search problem, can we construct an equivalent sampling problem?
Can we show there's no black-box equivalence between search and sampling problems? (I.e., that our use of Kolmogorov complexity was necessary?)
What if we want the search problem to be checkable? We can redo the proof with space-bounded Kolmogorov complexity to put the search problem in PSPACE, but it seems hard to do better.
More equivalence theorems, ideally involving decision and promise problems?