Presentation on theme: "When Qubits Go Analog A Relatively Easy Problem in Quantum Information Theory Scott Aaronson (MIT)"— Presentation transcript:

1 When Qubits Go Analog A Relatively Easy Problem in Quantum Information Theory Scott Aaronson (MIT)

2 Erik Demaine (motivated by a computational genetics problem): Suppose a PSPACE machine can flip a coin with bias p an unlimited number of times. Can it extract an exponential amount of information (or even more) about p? Me: I'm sure whatever the answer is, it's obvious... Didn't seem too likely there could be superpowerful Advice Coins

3 Indeed, Hellman-Cover (1970) proved the following... Suppose a probabilistic finite automaton is trying to decide whether a coin has bias ½ or ½+ε. Then even if it can flip the coin an unlimited number of times, the automaton needs Ω(1/ε) states to succeed with probability (say) 2/3. Implies PSPACE/coin = PSPACE/poly. [Figure: the coin's bias written out in binary, 0.000000000000000110101111101..., with one block labeled "poly(n) advice bits" and the next labeled "another poly(n) advice bits".]
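
Not a proof of the Ω(1/ε) bound, but a quick numerical illustration of its flavor: the sketch below (Python/numpy; the bias ε, the state counts, and the particular automaton are arbitrary choices, not from the slide) computes the limiting accept probability of one natural finite automaton, a saturating counter that steps up on heads, down on tails, and accepts when it sits in its upper half. The fair and biased coins only separate cleanly once the number of states approaches 1/ε.

```python
import numpy as np

def limiting_accept_probability(p, n_states):
    """Stationary probability that a saturating counter automaton
    (step up on heads, step down on tails, accept in the upper half of the states)
    is in an accepting state after flipping a bias-p coin forever."""
    r = p / (1 - p)
    weights = r ** np.arange(n_states)   # detailed balance: pi_i is proportional to r^i
    pi = weights / weights.sum()
    return pi[n_states // 2:].sum()

eps = 0.01
for S in (8, 32, 128, 512):
    fair = limiting_accept_probability(0.5, S)
    biased = limiting_accept_probability(0.5 + eps, S)
    print(f"{S:4d} states: fair coin {fair:.3f}   biased coin {biased:.3f}")
```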

4 Yet quantum mechanics nullifies the Hellman-Cover Theorem! Theorem: For any ε>0, one can distinguish a coin with bias p=½ from a coin with bias p=½+ε (with bounded error) using a single qutrit of memory. Keep flipping the coin. Whenever the coin lands heads, rotate ε/100 radians counterclockwise. Whenever it lands tails, rotate ε/100 radians clockwise. Halt with probability ~ε²/100 at each time step, by measuring along the third dimension. Expected difference in final angle after halting, in the p=½ vs. p=½+ε cases: ~1 radian. Standard deviation in angle: small enough that the two cases can be told apart with bounded error.
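
The heuristic above is easy to sanity-check with a purely classical Monte Carlo of the angle walk. The sketch below (Python/numpy) tracks only the net rotation angle and a geometric halting time, not the actual qutrit amplitudes; the value of ε, the number of trials, and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def final_angles(p, eps, trials=100_000):
    """Sample the final angle of the walk on the slide: halt after a
    Geometric(eps^2/100) number of flips, rotating +eps/100 per head
    and -eps/100 per tail."""
    halt_prob = eps**2 / 100
    flips = rng.geometric(halt_prob, size=trials)   # flips before halting
    heads = rng.binomial(flips, p)                  # heads among those flips
    return (2 * heads - flips) * (eps / 100)        # net rotation in radians

eps = 0.05
fair = final_angles(0.5, eps)
biased = final_angles(0.5 + eps, eps)
print(f"fair coin:   mean angle {fair.mean():+.2f}, std {fair.std():.2f}")
print(f"biased coin: mean angle {biased.mean():+.2f}, std {biased.std():.2f}")
```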

5 So, could BQPSPACE/coin = ALL? Theorem: No. Proof: Let's even let the machine run infinitely long; it only has to get the right answer in the limit. Let Φ_0 = the superoperator applied to our memory qubits whenever the coin lands heads, and Φ_1 = the superoperator applied when it lands tails. Then the induced superoperator at each time step (with p the coin's bias) is Φ_p = p·Φ_0 + (1-p)·Φ_1. We're interested in a fixed point of Φ_p: a mixed state ρ_p such that Φ_p(ρ_p) = ρ_p.
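
Here is a minimal numerical sketch of that fixed-point object. The two channels are toy stand-ins, not the machine's actual superoperators: Φ_0 damps the qubit toward |0⟩ on heads, Φ_1 damps it toward |1⟩ on tails, and γ and the values of p are arbitrary. The code builds the 4×4 matrix of Φ_p acting on vectorized density matrices and reads off the eigenvalue-1 eigenvector.

```python
import numpy as np

def channel_matrix(kraus):
    """4x4 matrix of a single-qubit channel acting on row-vectorized density matrices."""
    return sum(np.kron(K, K.conj()) for K in kraus)

gamma = 0.3   # damping strength (arbitrary)
damp_to_0 = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
             np.array([[0, np.sqrt(gamma)], [0, 0]])]          # Phi_0, applied on heads
damp_to_1 = [np.array([[np.sqrt(1 - gamma), 0], [0, 1]]),
             np.array([[0, 0], [np.sqrt(gamma), 0]])]          # Phi_1, applied on tails

def fixed_point(p):
    """Fixed point of Phi_p = p*Phi_0 + (1-p)*Phi_1, from the eigenvalue-1 eigenvector."""
    M = p * channel_matrix(damp_to_0) + (1 - p) * channel_matrix(damp_to_1)
    vals, vecs = np.linalg.eig(M)
    v = vecs[:, np.argmin(np.abs(vals - 1))]
    rho = v.reshape(2, 2)
    return (rho / np.trace(rho)).real   # normalize to trace 1

for p in (0.5, 0.6, 0.75):
    print(f"p = {p}: fixed-point diagonal = {np.diag(fixed_point(p)).round(4)}")
```

For this toy choice the fixed point comes out as diag(p, 1-p): the memory's stationary state literally records the coin's bias.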

6 Fixed-Points of Superoperators: studied by [A.-Watrous 2008] in the context of quantum computing with closed timelike curves. Our result there: BQP_CTC = P_CTC = PSPACE. Quantum computers with CTCs have exactly the same power as classical computers with CTCs, namely PSPACE (or: CTCs make time and space equivalent as computational resources)

7 Key Point: The fixed point ρ_p of the superoperator Φ_p can be expressed in terms of degree-2^s rational functions of p, where s is the number of qubits (Proof: use Cramer's Rule on 2^s × 2^s matrices). Let a_x(p) be the probability that the PSPACE machine accepts, on input x ∈ {1,...,N} and an advice coin with bias p. Then a_x(p) is a degree-2^s rational function of p. By calculus, a degree-2^s rational function can cross the origin (or the line y=½) at most 2·2^s times.
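
The rational-function claim can be seen concretely by redoing the fixed-point computation symbolically. The sketch below (Python/sympy; the toy channels, amplitude damping on heads and a bit flip on tails, are arbitrary stand-ins) solves (Φ_p - I)·vec(ρ) = 0 together with Tr ρ = 1, the same linear system to which Cramer's Rule is applied, and prints each entry of ρ_p as a rational function of p.

```python
import sympy as sp

p, g = sp.symbols('p gamma', positive=True)

def kron(A, B):
    """Kronecker product of two sympy matrices (kept explicit for self-containment)."""
    K = sp.zeros(A.rows * B.rows, A.cols * B.cols)
    for i in range(A.rows):
        for j in range(A.cols):
            K[i*B.rows:(i+1)*B.rows, j*B.cols:(j+1)*B.cols] = A[i, j] * B
    return K

def channel_matrix(kraus):
    """4x4 matrix of rho -> sum_K K rho K^dagger on row-vectorized rho."""
    return sum((kron(K, K.conjugate()) for K in kraus), sp.zeros(4, 4))

# Toy channels: Phi_0 = amplitude damping (heads), Phi_1 = bit flip (tails)
Phi0 = channel_matrix([sp.Matrix([[1, 0], [0, sp.sqrt(1 - g)]]),
                       sp.Matrix([[0, sp.sqrt(g)], [0, 0]])])
Phi1 = channel_matrix([sp.Matrix([[0, 1], [1, 0]])])
Phi_p = p * Phi0 + (1 - p) * Phi1

# Fixed point: (Phi_p - I) vec(rho) = 0, plus the trace-1 constraint
r = sp.Matrix(sp.symbols('r00 r01 r10 r11'))
eqs = list((Phi_p - sp.eye(4)) * r) + [r[0] + r[3] - 1]
(solution,) = sp.linsolve(eqs, list(r))
for name, expr in zip(r, solution):
    print(name, '=', sp.simplify(expr))   # each entry is a rational function of p
```

Here s = 1, and the diagonal entries come out as ratios of low-degree polynomials in p (the off-diagonal entries vanish), consistent with the degree-2^s bound quoted on the slide.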

8 To specify p well enough to decide whether a_x(p) ≥ ½ for any x: it suffices to say how many reals 0<q<p there are at which some a_x crosses the line y=½ at q. This takes log₂(2·2^s·N) = s+1+log₂(N) bits. So the coin can specify at most 2·2^s·N + 1 distinct acceptance patterns over the inputs x.
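
The arithmetic in the bit count is easy to check numerically; the specific n and s below are arbitrary illustrative choices, not values from the talk.

```python
import math

n = 20          # input length (illustrative)
s = 3 * n       # qubits of memory, some poly(n) (illustrative)
N = 2 ** n      # number of inputs x

max_crossings = 2 * 2**s * N            # at most 2*2^s crossings of y = 1/2 per input x
bits_needed = math.log2(max_crossings)  # = s + 1 + log2(N)
print(bits_needed, s + 1 + n)           # both equal 81: only poly(n) bits in total
```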

9 OK, how about a harder problem? Is there an oracle relative to which BQP ⊄ PH? New candidate problem we should use for this: Fourier Checking. Given oracle access to two Boolean functions f,g: {-1,1}^n → {-1,1}. Promised that either (1) f and g are both uniformly random, or (2) f and g were chosen by picking a random unit vector v = (v_x) and letting f(x) = sgn(v_x), g(x) = sgn((H^⊗n v)_x). Problem: Decide which.
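
For concreteness, here is one way to sample the two cases classically (Python/numpy). The choice of n, the seed, and the Gaussian model for the random unit vector v are assumptions of this sketch; normalizing v does not affect the signs but matches the definition.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
N = 2 ** n

# Normalized n-qubit Hadamard transform, built by repeated Kronecker products
H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H = H1
for _ in range(n - 1):
    H = np.kron(H, H1)

def random_instance():
    """Case (1): f and g are independent, uniformly random +/-1 functions."""
    return rng.choice([-1, 1], size=N), rng.choice([-1, 1], size=N)

def forrelated_instance():
    """Case (2): f(x) = sgn(v_x) and g(x) = sgn((Hv)_x) for a random unit vector v."""
    v = rng.standard_normal(N)
    v /= np.linalg.norm(v)
    return np.sign(v).astype(int), np.sign(H @ v).astype(int)

f, g = forrelated_instance()
print(f[:8])
print(g[:8])
```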

10 I claim this problem is in BQP: prepare a uniform superposition over x, query f into the phase, apply Hadamards, query g into the phase, then measure in the Hadamard basis (checking for the all-zeros outcome). On the other hand, I conjecture the Fourier Checking problem is not in PH. I can show that any poly(n) bits of f(x) and g(x) are close to uniformly random. I conjecture that this suffices to put the problem outside PH (Generalized Linial-Nisan Conjecture).
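
Since the slide compresses the circuit into a phrase, here is a classical simulation of it for small n (Python/numpy; n and the seed are arbitrary). It applies Hadamards, an f-phase, Hadamards, a g-phase, and a final round of Hadamards to |0...0⟩, then reports the probability of the all-zeros outcome, which is the statistic that separates the two cases.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
N = 2 ** n

# Normalized n-qubit Hadamard transform
H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H = H1
for _ in range(n - 1):
    H = np.kron(H, H1)

def accept_probability(f, g):
    """Probability that the circuit H, f-phase, H, g-phase, H, measure
    returns the all-zeros outcome, simulated with explicit linear algebra."""
    state = np.full(N, 1 / np.sqrt(N))      # H^(tensor n) |0...0>
    state = H @ (g * (H @ (f * state)))     # f-phase, H, g-phase, H
    return state[0] ** 2                    # squared amplitude of |0...0>

# Case (1): uniformly random f, g
f_rand, g_rand = rng.choice([-1, 1], N), rng.choice([-1, 1], N)
# Case (2): "forrelated" f, g built from one random vector v
v = rng.standard_normal(N)
f_for, g_for = np.sign(v).astype(int), np.sign(H @ v).astype(int)

print("random pair:    ", accept_probability(f_rand, g_rand))  # typically ~ 1/2^n
print("forrelated pair:", accept_probability(f_for, g_for))    # typically a constant >> 1/2^n
```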

11 NO, WE CAN'T PROVE QUANTUM LOWER BOUNDS

