Quantum Speed-ups for Semidefinite Programming
Fernando G.S.L. Brandão (MSR → Caltech), Faculty Summit 2016
Based on joint work with Krysta Svore (MSR)
Quantum Algorithms
Exponential speed-ups: simulating quantum physics, factoring big numbers (Shor's algorithm), ...
Polynomial speed-ups: searching (Grover's algorithm: O(N^{1/2}) vs O(N)), ...
Heuristics: quantum annealing (adiabatic algorithm), machine learning, ...
This talk: solving semidefinite programs belongs here.
Semidefinite Programming
... is an important class of convex optimization problems.
Input: n x n, r-sparse matrices C, A_1, ..., A_m and numbers b_1, ..., b_m.
Output: X maximizing tr(CX) subject to tr(A_j X) ≤ b_j (j = 1, ..., m) and X ⪰ 0.
Some applications: operations research (location problems, scheduling, ...), bioengineering (flux balance analysis, ...), approximating NP-hard problems (max-cut, ...), field theory (conformal bootstrapping), ...
Algorithms:
Interior point: O((m^2 nr + mn^2) log(1/ε))
Multiplicative weights: O(mnr (ωR)/ε^2), with ω the "width" and R the "size of the solution"
Lower bound: no faster than Ω(nm), for constant ε, r, ω, R.
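To make the input/output convention concrete, here is a small numerical sketch (with a made-up 2x2 instance; this is a feasibility/objective check, not the paper's algorithm):

```python
import numpy as np

def is_psd(X, tol=1e-9):
    # X Hermitian; positive semidefinite iff all eigenvalues >= 0
    return np.all(np.linalg.eigvalsh(X) >= -tol)

def sdp_check(C, As, bs, X):
    """Return (tr(CX), feasible?) for the primal
    max tr(CX) s.t. tr(A_j X) <= b_j, X >= 0."""
    obj = np.trace(C @ X)
    feas = is_psd(X) and all(np.trace(A @ X) <= b + 1e-9 for A, b in zip(As, bs))
    return obj, feas

# Toy instance (illustrative only): n = 2, m = 2
C = np.array([[1.0, 0.0], [0.0, 2.0]])
As = [np.eye(2), np.diag([1.0, 0.0])]
bs = [1.0, 0.5]
X = np.diag([0.2, 0.8])          # a feasible candidate
obj, feas = sdp_check(C, As, bs, X)
print(round(obj, 6), feas)       # tr(CX) = 0.2 + 1.6 = 1.8, feasible
```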
Semidefinite Programming
Normalization: we assume the inputs are suitably normalized.
Reduction from optimization to decision: binary search on the optimal value (is it ≥ c, or ≤ c − δ?).
SDP Duality
Primal: max tr(CX) s.t. tr(A_j X) ≤ b_j (j = 1, ..., m), X ⪰ 0.
Dual: min bᵀy s.t. ∑_j y_j A_j ⪰ C, y ≥ 0.
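Weak duality (every dual-feasible y upper-bounds every primal-feasible X) can be sanity-checked numerically; the instance below is made up for illustration:

```python
import numpy as np

# Toy instance (made up): n = 2, m = 2
C = np.array([[1.0, 0.0], [0.0, 3.0]])
As = [np.eye(2), np.diag([1.0, 0.0])]
bs = np.array([1.0, 0.5])

# A primal-feasible X: X >= 0 and tr(A_j X) <= b_j
X = np.diag([0.2, 0.8])
# A dual-feasible y: y >= 0 and sum_j y_j A_j >= C
y = np.array([3.0, 0.0])          # 3*I - C = diag(2, 0) >= 0

primal = np.trace(C @ X)          # tr(CX) = 0.2 + 2.4 = 2.6
dual = bs @ y                     # b.y = 3.0
# Weak duality: primal objective <= dual objective
assert primal <= dual + 1e-9
print(primal, dual)
```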
Quantum Algorithm for SDP
thm: There is a quantum algorithm for solving SDPs running in time Õ(n^{1/2} m^{1/2} r δ^{-2} R^2).
Input: n x n, r-sparse matrices C, A_1, ..., A_m and numbers b_1, ..., b_m.
Output: samples from y/||y||_2 and the value ||y||_2; quantum samples from X/tr(X) and the value tr(X).
Oracle model: we assume there is an oracle that outputs a chosen non-zero entry of C, A_1, ..., A_m at unit cost (inputs: choice of A_j, row k, index l of a non-zero element).
Quantum Algorithm for SDP
thm: There is a quantum algorithm for solving SDPs running in time Õ(n^{1/2} m^{1/2} r δ^{-2} R^2).
Classical lower bound: at least time Ω(max(n, m)) for r, δ, R = Ω(1); reduction from search.
Quantum lower bound: at least time Ω(max(n^{1/2}, m^{1/2})) for r, δ, R = Ω(1); reduction from search.
Example for the Ω(m) classical bound: take b_i = 1 and C = I, and either A_j = I for all j (objective 1), or A_j = 2I for one random j in [m] and A_k = I for k ≠ j (objective 1/2). Distinguishing the two cases requires inspecting Ω(m) of the A_j classically.
Quantum Algorithm for SDP
thm: There is a quantum algorithm for solving SDPs running in time Õ(n^{1/2} m^{1/2} r δ^{-2} R^2).
thm 2: There is a quantum algorithm for solving SDPs running in time Õ(T_Gibbs m^{1/2} r δ^{-2} R^2), where T_Gibbs is the cost of preparing the relevant Gibbs states.
If Gibbs states can be prepared quickly (e.g. by quantum Metropolis), larger speed-ups are possible.
Quantum Algorithm for SDP
The quantum algorithm builds on a classical SDP algorithm due to Arora and Kale (2007), which is based on the multiplicative weights method. Let's review their method.
The Oracle: searches for a vector y ≥ 0 such that (i) bᵀy ≤ α and (ii) tr((∑_j y_j A_j − C) ρ) ≥ 0 (the standard Arora-Kale oracle conditions).
Dual: min bᵀy s.t. ∑_j y_j A_j ⪰ C, y ≥ 0.
Width of SDP
Arora-Kale Algorithm (sketch):
1. Start with ρ_1 = I/n.
2. Run the Oracle on ρ_t to obtain y_t (if no such y exists, ρ_t certifies a solution).
3. Form the loss matrix M_t from ∑_j (y_t)_j A_j − C.
4. Update ρ_{t+1} ∝ exp(−ε ∑_{s ≤ t} M_s) and repeat; output the average of the y_t.
Why does Arora-Kale work? The Oracle guarantees that each y_t is good against ρ_t; one must check that the averaged output is feasible.
We need: a bound relating ∑_t tr(M_t ρ_t) to λ_min(∑_t M_t), which the matrix multiplicative weight method provides.
Matrix Multiplicative Weight
MMW (Arora, Kale '07): Given n x n matrices M_t with 0 ⪯ M_t ⪯ I and ε < 1/2, the iterates ρ_t ∝ exp(−ε ∑_{s<t} M_s) satisfy
λ_n(∑_t M_t) ≥ (1 − ε) ∑_t tr(M_t ρ_t) − ln(n)/ε,
with λ_n the minimum eigenvalue.
2-player zero-sum game interpretation: player A chooses a density matrix X_t; player B chooses a matrix 0 ⪯ M_t ⪯ I; pay-off: tr(X_t M_t). "X_t = ρ_t is a strategy almost as good as the best global strategy."
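The MMW guarantee can be checked empirically. The sketch below (assuming the Arora-Kale form of the regret bound stated above) runs the exponential-update iterates against random loss matrices 0 ⪯ M_t ⪯ I:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, eps = 4, 50, 0.25             # eps < 1/2, as in the theorem

def gibbs(H):
    """rho = exp(H)/tr(exp(H)) for Hermitian H, via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    w = np.exp(w - w.max())          # subtract max for numerical stability
    return (V * w) @ V.T / w.sum()

S = np.zeros((n, n))                 # running sum of the M_t
lhs = 0.0                            # sum_t tr(M_t rho_t)
for _ in range(T):
    rho = gibbs(-eps * S)            # MMW iterate: rho_t ∝ exp(-eps sum_{s<t} M_s)
    # random loss matrix rescaled so that 0 <= M_t <= I
    A = rng.standard_normal((n, n)); A = A + A.T
    w, V = np.linalg.eigh(A)
    M = (V * ((w - w.min()) / (w.max() - w.min()))) @ V.T
    lhs += np.trace(rho @ M)
    S += M

lam_min = np.linalg.eigvalsh(S)[0]
# MMW bound: lambda_min(sum M_t) >= (1-eps) * sum tr(M_t rho_t) - ln(n)/eps
assert lam_min >= (1 - eps) * lhs - np.log(n) / eps - 1e-9
print("bound holds")
```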
Arora-Kale Algorithm 1. 2. 3. 4. How to implement the Oracle?
Implementing Oracle by Gibbs Sampling
The Oracle searches for a (non-normalized) probability distribution y satisfying two linear constraints.
Claim: we can take y to be of Gibbs form: there are constants N, x, y s.t. y_i ∝ N exp(x tr(A_i ρ) + y b_i).
Jaynes' Principle
(Jaynes '57) Let ρ be a quantum state with given expectation values tr(ρ A_i) = a_i. Then there is a Gibbs state of the form σ ∝ exp(∑_i λ_i A_i) with the same expectation values.
Drawback: no control over the size of the λ_i's.
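A minimal illustration of Jaynes' principle for a single observable: we find the Gibbs state matching one expectation value by bisection (the state, observable, and tolerance below are chosen for illustration):

```python
import numpy as np

def gibbs(H):
    """exp(H)/tr(exp(H)) for Hermitian H."""
    w, V = np.linalg.eigh(H)
    e = np.exp(w - w.max())
    return (V * e) @ V.T / e.sum()

Z = np.diag([1.0, -1.0])
rho = np.diag([0.8, 0.2])                 # a state with tr(rho Z) = 0.6
target = np.trace(rho @ Z)

# Jaynes: find lambda so that exp(lambda*Z)/tr matches the expectation value;
# tr(gibbs(l*Z) Z) = tanh(l) is monotone in l, so bisection works
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if np.trace(gibbs(mid * Z) @ Z) < target:
        lo = mid
    else:
        hi = mid
sigma = gibbs(lo * Z)
print(abs(np.trace(sigma @ Z) - target))  # matches to high precision
```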
Finitary Jaynes' Principle
(Lee, Raghavendra, Steurer '15) Let ρ be a quantum state with given approximate expectation values. Then there is a Gibbs state of the form σ ∝ exp(∑_i λ_i A_i), with control over the size of the λ_i's, reproducing those expectation values approximately.
(Note: used to prove limitations of SDPs for approximating constraint satisfaction problems.)
Implementing Oracle by Gibbs Sampling
Claim: there is a Y of Gibbs form with x, y < log(n)/ε and N < α satisfying the constraints.
One can implement the Oracle by exhaustively searching over x, y, N for a Gibbs distribution satisfying the constraints above (only O(α log²(n)/ε³) different triples need to be checked).
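A classical sketch of this exhaustive search. The Gibbs form matches the claim above; the two linear constraints in the code are illustrative stand-ins (assumptions), since the slide's exact conditions are not spelled out here:

```python
import numpy as np

def gibbs_vec(h):
    """Normalized Gibbs distribution over [m] with 'energies' h."""
    e = np.exp(h - h.max())
    return e / e.sum()

def oracle_search(a, b, alpha, eps, n):
    """Exhaustively search over (x, y, N) for a Gibbs-form vector
    Y = N * gibbs(x*a + y*b) meeting two illustrative linear constraints:
      (i) Y.b <= alpha   (ii) Y.a >= Y.b - eps   (both assumed for this sketch)."""
    grid = np.arange(0.0, np.log(n) / eps, eps)   # x, y < log(n)/eps, step eps
    Ns = np.arange(eps, alpha + eps, eps)          # N < alpha (roughly), step eps
    for x in grid:
        for y in grid:
            p = gibbs_vec(x * a + y * b)
            for N in Ns:
                Y = N * p
                if Y @ b <= alpha and Y @ a >= Y @ b - eps:
                    return x, y, N
    return None

a = np.array([0.9, 0.1, 0.5]); b = np.array([0.2, 0.8, 0.5])
res = oracle_search(a, b, alpha=1.0, eps=0.25, n=8)
print(res is not None)
```

The triple loop visits O(α log²(n)/ε³) candidates, matching the count on the slide.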
Arora-Kale Algorithm 1. 2. 3. 4. Using Gibbs Sampling
Again, it’s Gibbs Sampling
Quantum Speed-ups for Gibbs Sampling
(Poulin, Wocjan '09; Chowdhury, Somma '16) Given an r-sparse D x D Hamiltonian H, one can prepare exp(H)/tr(exp(H)) to accuracy ε on a quantum computer with O(D^{1/2} r ||H|| polylog(1/ε)) calls to an oracle for the entries of H.
Based on amplitude amplification.
Quantizing Arora-Kale
Problem: M_t is not sparse. Solution: sparsify it using the operator Chernoff bound.
1. 2. 3. 4. Using Gibbs sampling. Again, Gibbs sampling.
Sparsification
Suppose Z_1, ..., Z_k are independent n x n Hermitian matrices with E(Z_i) = 0 and ||Z_i|| ≤ λ. Then (operator Chernoff bound)
Pr[ ||(1/k) ∑_i Z_i|| ≥ ε ] ≤ 2n exp(−Ω(k ε²/λ²)).
Applying this by sampling j_1, ..., j_k, k = O(log(n)/ε²) samples suffice.
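The concentration phenomenon behind the sparsification step can be observed directly: averages of independent, zero-mean, norm-bounded Hermitian matrices shrink in operator norm as k grows. The ensemble below is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam = 8, 1.0

def random_zero_mean_Z():
    """Random Hermitian Z with E[Z] = 0 and ||Z|| <= lam:
    a random +/- sign times a spectral-norm-1 symmetric matrix."""
    A = rng.standard_normal((n, n)); A = A + A.T
    A /= np.linalg.norm(A, 2)               # spectral norm 1
    return rng.choice([-1.0, 1.0]) * lam * A

# ||(1/k) sum Z_i|| should shrink roughly like lam * sqrt(log(n)/k)
norms = {}
for k in (10, 1000):
    avg = sum(random_zero_mean_Z() for _ in range(k)) / k
    norms[k] = np.linalg.norm(avg, 2)
    print(k, norms[k])
```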
Quantizing Arora-Kale
1. 2. 3. 4. Use Gibbs sampling (takes Õ(n^{1/2} r) time on a quantum computer). Sparsify by random sampling. Again, Gibbs sampling.
Quantizing Arora-Kale
1. 2. 3. 4. Use Gibbs sampling: needs to prepare exp(∑_i h_i |i⟩⟨i|)/tr(...) with h_i = x tr(A_i ρ_t) + y b_i. Can be done quantumly with Õ(m^{1/2}) calls to an oracle for h. Each call to h consists of estimating the value of tr(A_i ρ_t); the estimation needs copies of ρ_t, which cost Õ(n^{1/2} r) to prepare. Overall: Õ(n^{1/2} m^{1/2} r). Sparsify by random sampling.
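The tr(A_i ρ_t) estimation step has a simple classical Monte Carlo analogue: draw an eigenstate of ρ with probability equal to its eigenvalue and average ⟨v|A|v⟩ (the quantum algorithm instead measures copies of ρ_t; the instance below is made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# A toy density matrix rho (PSD, trace 1) and observable A, made up for illustration
H = rng.standard_normal((n, n)); H = H + H.T
p, V = np.linalg.eigh(H)
p = np.exp(p); p /= p.sum()           # rho = sum_v p_v |v><v| (a Gibbs state)
A = rng.standard_normal((n, n)); A = A + A.T

exact = sum(pv * v @ A @ v for pv, v in zip(p, V.T))   # = tr(A rho)

# Monte Carlo analogue of "measure copies of rho": draw eigenstate v with
# probability p_v, record <v|A|v>, and average
samples = rng.choice(n, size=20000, p=p)
est = np.mean([V[:, i] @ A @ V[:, i] for i in samples])

print(abs(est - exact))               # small sampling error
```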
Conclusion and Open Problems
We showed that quantum computers can speed up the solution of SDPs. One application is solving the Goemans-Williamson SDP for Max-Cut in time Õ(|V|). Are there more applications?
- Can we improve the parameters?
- Can we find (interesting) instances with superpolynomial speed-ups?
- Quantum speed-ups for interior-point methods?
Thanks!
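For the Goemans-Williamson application, the rounding step can be sketched without an SDP solver by using a graph whose optimal vector solution is known in closed form (a triangle, whose three unit vectors sit at 120° in the plane):

```python
import numpy as np

rng = np.random.default_rng(3)

# Max-Cut on a triangle; the optimal GW vector solution places the three
# unit vectors at 120 degrees, so no SDP solver is needed for this sketch
angles = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
vecs = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # 3 x 2 unit vectors
edges = [(0, 1), (1, 2), (0, 2)]

# GW rounding: cut with a uniformly random hyperplane through the origin
def round_once():
    g = rng.standard_normal(2)
    side = vecs @ g >= 0
    return sum(side[i] != side[j] for i, j in edges)

cuts = [round_once() for _ in range(2000)]
avg = np.mean(cuts)
# Each edge is cut with probability theta/pi = (2pi/3)/pi = 2/3, so the
# expected cut is 3 * 2/3 = 2, the Max-Cut of the triangle
print(avg)
```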