Quantum Computing and the Limits of the Efficiently Computable

Presentation transcript:

Quantum Computing and the Limits of the Efficiently Computable Scott Aaronson (UT Austin) Rice University, Oct. 20, 2016

Things we never see… Warp drive. Perpetuum mobile. Übercomputer ("GOLDBACH CONJECTURE: TRUE. NEXT QUESTION"). The (seeming) impossibility of the first two machines reflects fundamental principles of physics, Special Relativity and the Second Law respectively. So what about the third one? What are the fundamental physical limits on computation? The starting point for this talk is that there are certain technologies we never see that would be REALLY cool if we had them. The first is warp drive. For this crowd especially, I can say: what's taking you so long? The second is perpetual-motion machines. The third is what I like to call the Übercomputer. This is a machine where you feed it any well-posed mathematical question and it instantly tells you the answer. Currently, even with the fastest computers today, if you ask them to prove a hard theorem, they could do it eventually, but it might take longer than the age of the universe. That's why there are still human mathematicians. In this talk, I want to convince you that the impossibility of übercomputers is also something physicists should think about, and also something that may have implications for physics.

Moore's Law. I'm sure you've all seen this before: the number of transistors per computer has doubled pretty much every two years. This is arguably the main thing that's driven the progress of human civilization since World War II.

Extrapolating: Robot uprising? First and most obvious is the robot uprising.

But even a killer robot would still be "merely" a Turing machine, operating on principles laid down in the 1930s… At a fundamental level, all computers we talk about today (even killer robots!) are still just Turing machines, the theoretical device that was invented in the 1930s and that we teach about in our undergraduate courses. Macs, PCs, killer robots: on the inside, they're all the same stuff. So is there anything else beyond that, that's more interesting? And Turing machines have limitations, on what they can compute at all and certainly on what they can compute efficiently.

[Diagram: a map of complexity classes. P ("efficiently solvable"): graph connectivity, primality testing, matrix determinant, linear programming, … NP ("efficiently verifiable"): contains P, plus problems like factoring. NP-complete: satisfiability, maximum clique, maximum cut, Steiner tree, coin balancing, … NP-hard (all NP problems are efficiently reducible to these): the NP-complete problems, plus the matrix permanent, the halting problem, …] Here's a rough map of the world. At the bottom is P, which includes everything we know how to solve quickly with today's computers. Containing it is NP, the class of problems where we could recognize an answer if we saw it, and at the top of NP is this huge family of NP-complete problems. There are plenty of problems that are even harder than NP-complete; one famous example is the halting problem, to determine whether a given computer program will ever stop running. Very interestingly, there are also problems believed to be intermediate between P and NP-complete. One example is factoring. These intermediate problems are extremely important for quantum computing, as we'll see later, and they're also important for cryptography.

Even if we believe PNP, there’s still a further question: is there any way to solve NP-complete problems in polynomial time, consistent with the laws of physics?

Old proposal: Dip two glass plates with pegs between them into soapy water. Let the soap bubbles form a minimum Steiner tree connecting the pegs—thereby solving a known NP-hard problem “instantaneously”

Relativity Computer. But while we're waiting for scalable quantum computers, we can also base computers on that other great theory of the 20th century, relativity! The idea here is simple: you start your computer working on some really hard problem, and leave it on Earth. Then you get on a spaceship and accelerate to close to the speed of light. When you get back, billions of years have passed on Earth and all your friends are long dead, but at least you've got the answer to your computational problem. I don't know why more people don't try it!

Zeno's Computer [Figure: steps 1, 2, 3, … plotted against time in seconds, each step taking half as long as the one before.] Another of my favorites is Zeno's computer. The idea here is also simple: this is a computer that would execute the first step in one second, the next step in half a second, the next in a quarter second, and so on, so that after two seconds it's done an infinite amount of computation. Incidentally, do any of you know why that WOULDN'T work? The problem is that, once you get down to the Planck time of 10^-43 seconds, you'd need so much energy to run your computer that fast that, according to our best current theories, you'd exceed what's called the Schwarzschild radius, and your computer would collapse to a black hole. You don't want that to happen.

Ah, but what about quantum computing? (you knew it was coming) Quantum mechanics: “Probability theory with minus signs” (Nature seems to prefer it that way)

The Famous Double-Slit Experiment. Probability of landing in the "dark patch" = |amplitude|² = |amplitude_Slit1 + amplitude_Slit2|² = 0. Yet if you close one of the slits, the photon can appear in that previously dark patch!!

A bit more precisely: the key claim of quantum mechanics is that, if an object can be in two distinguishable states, call them |0⟩ or |1⟩, then it can also be in a superposition a|0⟩ + b|1⟩. Here a and b are complex numbers called amplitudes, satisfying |a|² + |b|² = 1. If we observe, we see |0⟩ with probability |a|² and |1⟩ with probability |b|². Also, the object collapses to whichever outcome we see.
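To make the rule concrete, here is a minimal sketch in Python with NumPy (my own illustration, not part of the talk; the particular state is arbitrary): the qubit is stored as a length-2 vector of complex amplitudes, and observing samples an outcome with probability equal to each amplitude's squared magnitude.

```python
import numpy as np

rng = np.random.default_rng(0)

# The state a|0> + b|1>, with |a|^2 + |b|^2 = 1 (here an equal superposition)
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([a, b])
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)

# Observing: outcome k appears with probability |amplitude_k|^2,
# and the state collapses to whichever outcome we saw
probs = np.abs(state) ** 2
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2, dtype=complex)
collapsed[outcome] = 1.0
print(f"saw |{outcome}>, state collapsed to {collapsed}")
```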

To modify a state we can multiply the vector of amplitudes by a unitary matrix: one that preserves the rule that the squared magnitudes of the amplitudes sum to 1.
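For example, the 2x2 Hadamard matrix is unitary, so it preserves the sum of squared magnitudes. In this sketch (again my own, not from the slides), applying it twice to |0⟩ returns |0⟩ because the two contributions to the |1⟩ amplitude have opposite signs and cancel, which is exactly the interference the next slide refers to.

```python
import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Unitarity: H† H = I, so the total squared amplitude is preserved
assert np.allclose(H.conj().T @ H, np.eye(2))

ket0 = np.array([1.0, 0.0])
once = H @ ket0      # (|0> + |1>)/sqrt(2): an equal superposition
twice = H @ once     # back to |0>: the two paths to |1> cancel
print(once, twice)
assert np.isclose(np.sum(np.abs(once) ** 2), 1.0)
assert np.allclose(twice, ket0)
```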

We’re seeing interference of amplitudes—the source of “quantum weirdness”

Quantum Computing. A general state of n qubits requires ~2^n amplitudes to specify, which presents an obvious practical problem when using conventional computers to simulate quantum mechanics. Feynman 1981: So then why not turn things around, and build computers that themselves exploit superposition? Shor 1994: Such a computer could do more than simulate QM; e.g., it could factor integers in polynomial time. Where we are: a QC has factored 21 into 3×7, with high probability (Martín-López et al. 2012). Scaling up is hard, because of decoherence! But unless QM is wrong, there doesn't seem to be any fundamental obstacle. As zany as this sounds, Deutsch's speculations are part of what gave rise to the modern field of quantum computing. So, what's the idea of quantum computing? Well, a general entangled state of n qubits requires 2^n amplitudes to specify, since you need to give an amplitude for every configuration of all n of the bits. That's a staggering amount of information! It suggests that Nature, off to the side somewhere, needs to write down 2^1000 numbers just to keep track of 1000 particles. And that presents an obvious practical problem when people try to use conventional computers to SIMULATE quantum mechanics: they have all sorts of approximate techniques, but even then, something like 10% of supercomputer cycles today are used, basically, for simulating quantum mechanics. In 1981, Richard Feynman said, if Nature is going to all this work, then why not turn it around, and build computers that THEMSELVES exploit superposition? What would such computers be useful for? Well, at least one thing: simulating quantum physics! As tautological as that sounds, I predict that if QCs ever become practical, simulating quantum physics will actually be the main thing that they're used for. That actually has *tremendous* applications to materials science, drug design, understanding high-temperature superconductivity, etc. But of course, what got everyone excited about this field was Peter Shor's discovery, in 1994, that a quantum computer would be good for MORE than just simulating quantum physics. It could also be used to factor integers in polynomial time, and thereby break almost all of the public-key cryptography currently used on the Internet. (Interesting!) Where we are: After 18 years and more than a billion dollars, I'm proud to say that a quantum computer recently factored 21 into 3×7, with high probability. (For a long time, it was only 15.) Scaling up is incredibly hard because of decoherence: the external environment, as it were, constantly trying to measure the QC's state and collapse it down to classical. With classical computers, it took more than 100 years from Charles Babbage until the invention of the transistor. Who knows how long it will take in this case? But unless quantum mechanics itself is wrong, there doesn't seem to be any fundamental obstacle to scaling this up. On the contrary, we now know that, IF the decoherence can be kept below some finite but nonzero level, then there are very clever error-correcting codes that can render its remaining effects insignificant. So, I'm optimistic that if civilization lasts long enough, we'll eventually have practical quantum computers.
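To see what ~2^n amplitudes means in practice, here is a back-of-the-envelope sketch (my own, assuming one double-precision complex number per amplitude) of the classical memory needed just to store the state vector of n qubits.

```python
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 30, 50, 1000):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"n = {n:4d} qubits -> 2^{n} amplitudes -> {gib:.3e} GiB")
```

Around n = 50 the vector already runs to petabytes; by n = 1000, the figure from the notes, it dwarfs any conceivable classical storage.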

[Diagram: BQP, Bounded-Error Quantum Polynomial-Time, the class of problems efficiently solvable by a quantum computer, drawn containing P and factoring, with NP and the NP-complete problems above it.]

Factoring is in BQP, but not believed to be NP-complete! Today, we don't believe quantum computers can solve NP-complete problems in polynomial time in general (though not surprisingly, we can't prove it). Bennett et al. 1997: "Quantum magic" won't be enough. If you throw away the problem structure, and just consider an abstract "landscape" of 2^n possible solutions, then even a quantum computer needs ~2^(n/2) steps to find the correct one. (That bound is actually achievable, using Grover's algorithm!) If there's a fast quantum algorithm for NP-complete problems, it will have to exploit their structure somehow.
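To make that 2^(n/2) scaling concrete, here is a tiny classical simulation of Grover's algorithm (a sketch of my own; simulating the state vector classically of course takes ~2^n memory, which is fine for small n). After roughly (π/4)·2^(n/2) iterations of the oracle-plus-diffusion step, measuring yields the marked item with probability close to 1.

```python
import numpy as np

n = 10                      # number of qubits
N = 2 ** n                  # size of the abstract "landscape" of candidate solutions
marked = 837                # index of the one correct solution (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))       # uniform superposition over all N candidates

iterations = int(np.round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the sign of the marked item's amplitude
    state = 2 * state.mean() - state     # diffusion: invert every amplitude about the mean

print(f"{iterations} iterations (vs. ~{N} classical steps); "
      f"P(measure marked item) = {abs(state[marked]) ** 2:.3f}")
```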

The "Adiabatic Optimization" Approach to Solving NP-Hard Problems with a Quantum Computer. [Figure: slowly interpolate from H_i, an operation with an easily-prepared lowest-energy state, to H_f, an operation whose lowest-energy state encodes the solution to an NP-hard problem.]

Problem: "Eigenvalue gap" can be exponentially small. Hope: "Quantum tunneling" could give speedups over classical optimization methods for finding local optima. Remains unclear whether you can get a practical speedup this way over the best classical algorithms. We might just have to build QCs and test it!
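As a toy illustration of the gap issue (my own sketch, not from the talk; the random diagonal cost function is just a stand-in for a real NP-hard instance), the following builds the interpolated Hamiltonian H(s) = (1-s)·H_i + s·H_f for a few qubits and tracks the gap between its two lowest eigenvalues. Since the adiabatic run time grows roughly like 1/gap², an exponentially small minimum gap means an exponentially long computation.

```python
import numpy as np
from functools import reduce

n = 4                      # tiny: the interpolated Hamiltonian is a 2^n x 2^n matrix
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

def x_on_qubit(k):
    """Pauli-X acting on qubit k out of n (identity on the others)."""
    return reduce(np.kron, [X if j == k else I2 for j in range(n)])

# H_i: transverse field; its ground state (the uniform superposition) is easy to prepare
H_i = -sum(x_on_qubit(k) for k in range(n))

# H_f: a diagonal "cost" Hamiltonian; a random cost stands in for an NP-hard instance
rng = np.random.default_rng(0)
H_f = np.diag(rng.random(2 ** n))

# Track the gap between the two lowest eigenvalues along the interpolation
gaps = []
for s in np.linspace(0.0, 1.0, 101):
    evals = np.linalg.eigvalsh((1 - s) * H_i + s * H_f)
    gaps.append(evals[1] - evals[0])

print(f"minimum gap along the interpolation: {min(gaps):.4f}")
```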

Some Examples of My Research… BosonSampling (with Alex Arkhipov): A proposal for a rudimentary photonic quantum computer, which doesn’t seem useful for anything (e.g. breaking codes), but does seem hard to simulate using classical computers (We showed that a fast, exact classical simulation would “collapse the polynomial hierarchy to the third level”) Experimentally demonstrated with 6 photons by a group in Bristol, UK
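One piece of context the slide leaves implicit, though it is central to the Aaronson-Arkhipov paper: the output amplitudes of a BosonSampling device are permanents of submatrices of the interferometer's unitary, and computing the permanent is #P-hard. The sketch below (my own, for illustration only) evaluates a permanent with Ryser's inclusion-exclusion formula; even the best known exact methods take time exponential in n.

```python
import itertools
import numpy as np

def permanent(A):
    """Matrix permanent via Ryser's formula (inclusion-exclusion over column subsets)."""
    n = A.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            total += (-1) ** r * np.prod(A[:, list(cols)].sum(axis=1))
    return (-1) ** n * total

# Sanity check against the brute-force definition (sum over all permutations)
A = np.arange(1, 10, dtype=float).reshape(3, 3)
brute = sum(np.prod(A[range(3), list(p)]) for p in itertools.permutations(range(3)))
assert np.isclose(permanent(A), brute)
print(permanent(A))
```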

Quantum Computing and Black Holes. Hawking 1970s: Black holes radiate. The radiation seems thermal (uncorrelated with whatever fell in). But if quantum mechanics is true, then it can't be! Susskind, 't Hooft 1990s: "Black-hole complementarity," the idea that quantum states emerging from the black hole are somehow "the same states" as the ones trapped inside, just measured in a different way.

The Firewall Paradox [Almheiri et al. 2012]: If the black hole interior is "built" out of the same qubits coming out as Hawking radiation, then why can't we do something to those Hawking qubits (after waiting ~10^67 years for enough to come out), then dive into the black hole, and see that we've completely destroyed the spacetime geometry in the interior? "Entanglement among Hawking photons detected!"

Harlow-Hayden 2013: Argued that doing the experiment on the Hawking radiation that would produce a "firewall" in the interior would require an amount of processing time exponential in the number of qubits, which for a black hole the mass of our sun is astronomically long. In which case, long before one had made a dent in the problem, the black hole would've already evaporated… Their evidence used a theorem I proved as a grad student in 2002: given a "black box" function with N outputs and >>N inputs, any quantum algorithm needs at least ~N^(1/5) steps to find two inputs that both map to the same output (improved to ~N^(1/3) by Yaoyun Shi, which is optimal). Read their paper: there's strings and branes flying all over the place, but when you finally get to why the problem is hard, it's the collision lower bound!
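For intuition about what the collision problem asks, here is a purely classical sketch (my own; not part of the Harlow-Hayden argument): for a random 2-to-1 function with N outputs, naive random querying finds a collision after roughly √N queries by the birthday paradox, whereas the quoted theorem says even a quantum algorithm still needs ~N^(1/3) queries.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000                               # number of distinct outputs
f = np.repeat(np.arange(N), 2)           # a 2-to-1 function: every output has exactly two preimages
rng.shuffle(f)                           # f[x] is the output on input x, for 2N inputs

seen = {}
for queries, x in enumerate(rng.permutation(2 * N), start=1):
    y = f[x]
    if y in seen:
        print(f"collision f({seen[y]}) = f({x}) = {y} after {queries} queries "
              f"(sqrt(N) is about {int(N ** 0.5)})")
        break
    seen[y] = x
```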

Summary Quantum computers are the most powerful kind of computer allowed by the currently-known laws of physics There’s a realistic prospect of building them Contrary to what you read, even quantum computers would have limits But those limits might help protect the geometry of spacetime!