Generalised probabilistic theories and the extension complexity of polytopes Serge Massar.

Presentation transcript:

Generalised probabilistic theories and the extension complexity of polytopes Serge Massar

From Foundations to Combinatorial Optimisation
– Physical theories: classical, quantum, generalised probabilistic theories (GPT)
– Factorisation of the communication / slack matrix: linear, SDP, conic
– Extended formulations: linear, SDP, conic
– Polytopes & combinatorial optimisation
– Communication complexity

From Foundations to Combinatorial Optimisation
– M. Yannakakis, Expressing Combinatorial Problems by Linear Programs, STOC 1988
– J. Gouveia, P. Parrilo, R. Thomas, Lifts of Convex Sets and Cone Factorizations, Math. Oper. Res.
– S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012
– S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv:


Generalised Probabilistic Theories Minimal framework to build theories – States = convex set – Measurements: Predict probability of outcomes Adding axioms restricts to Classical or Quantum Theory – Aim: find « Natural » axioms for quantum theory. (Fuchs, Brassar, Hardy, Barrett, Masanes Muller, D’Ariano etal, etc…) GPTs with « unphysical » behavior -> rule them out. – PR boxes make Communication Complexity trivial (vanDam 05) – Correlations that violate Tsirelson bound violate Information Causality (Pawlowski et al 09)

A bit of geometry

Generalised Probabilistic Theories
A mixture of states is a state, so the state space is convex. The theory predicts the probability of each measurement outcome.
Generalised Probabilistic Theory GPT(C,u):
– Space of unnormalised states = cone C ⊂ R^n
– Effects belong to the dual cone C* = {e : e·ω ≥ 0 for all ω ∈ C}
– Normalisation: unit effect u ∈ C*; a normalised state is ω ∈ C with u·ω = 1
– Measurement: a set of effects {e_i}, e_i ∈ C*, with Σ_i e_i = u
– Probability of outcome i: p(i) = e_i·ω
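The GPT(C,u) rules above can be sketched numerically. This is a minimal illustration, assuming the classical special case where C is the positive orthant (all state and effect vectors below are made up for the example):

```python
import numpy as np

# Classical GPT(C, u): C = positive orthant, C* = positive orthant,
# unit effect u = (1, ..., 1).
n = 3
u = np.ones(n)

# Normalised state: omega in C with u . omega = 1 (a probability vector).
omega = np.array([0.2, 0.5, 0.3])
assert np.all(omega >= 0) and np.isclose(u @ omega, 1.0)

# A measurement is a set of effects e_i in C* summing to u;
# here a coarse-grained two-outcome measurement.
effects = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 1.0])]
assert np.allclose(sum(effects), u)

# Probability of outcome i is e_i . omega.
probs = [e @ omega for e in effects]
print(probs)  # [0.2, 0.8]
```

The same three ingredients (cone, unit, effects) describe the quantum case with C the PSD cone and u the identity.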

Classical Theory
– C = positive orthant, u = (1,1,…,1)
– Normalised state ω = (p_1, p_2, …, p_n): a probability distribution over the possible states
– Canonical measurement {e_i}, with e_i = (0,…,0,1,0,…,0)

Quantum Theory
– C = cone of positive semidefinite matrices, u = I = identity matrix
– Normalised states = density matrices
– Measurements = POVMs

Lorentz Cone / Second Order Cone Programming
C_SOCP = {x = (x_0, x_1, …, x_n) : x_1² + x_2² + … + x_n² ≤ x_0², x_0 ≥ 0}
– The Lorentz cone has a natural SDP formulation: it embeds as a subcone of the cone of SDP matrices.
– It can be arbitrarily well approximated using linear inequalities.
– Linear programs ⊆ SOCP ⊆ SDP. Status?
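The SDP embedding of the Lorentz cone can be checked numerically. A sketch using the standard "arrow matrix" representation (the specific test points are illustrative):

```python
import numpy as np

# x = (x_0, xbar) lies in the Lorentz cone iff the arrow matrix
# [[x_0, xbar^T], [xbar, x_0 I]] is positive semidefinite.
def arrow(x):
    x0, xbar = x[0], np.asarray(x[1:], dtype=float)
    n = len(xbar)
    A = x0 * np.eye(n + 1)
    A[0, 1:] = xbar
    A[1:, 0] = xbar
    return A

def in_lorentz(x):
    return x[0] >= np.linalg.norm(x[1:])

def is_psd(A, tol=1e-9):
    return np.min(np.linalg.eigvalsh(A)) >= -tol

inside  = np.array([2.0, 1.0, 1.0])   # 2 >= sqrt(2): in the cone
outside = np.array([1.0, 1.0, 1.0])   # 1 <  sqrt(2): outside

assert in_lorentz(inside) and is_psd(arrow(inside))
assert not in_lorentz(outside) and not is_psd(arrow(outside))
```

This is the Schur-complement argument in matrix form: for x_0 > 0 the arrow matrix is PSD exactly when x_0² ≥ ||xbar||².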

Completely Positive and Co-positive Cones
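The two cones named on this slide can be illustrated in code. A sketch, assuming the standard definitions (completely positive: M = B Bᵀ with B entrywise nonnegative; copositive: xᵀMx ≥ 0 for all entrywise nonnegative x); the random factor B is made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A completely positive matrix: M = B B^T with B entrywise nonnegative.
B = rng.uniform(0, 1, size=(4, 6))
M = B @ B.T

# Every completely positive matrix is doubly nonnegative:
# entrywise nonnegative and positive semidefinite.
assert np.all(M >= 0)
assert np.min(np.linalg.eigvalsh(M)) >= -1e-9

# It is in particular copositive: x^T M x >= 0 on nonnegative x.
for _ in range(100):
    x = rng.uniform(0, 1, size=4)
    assert x @ M @ x >= -1e-9
```

The converse inclusions fail in general: deciding membership in these cones is hard, which is what makes them powerful for extended formulations later in the talk.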

Open Question. Other interesting families of Cones ?

One way communication complexity
Alice receives input a and sends the GPT state ω(a); Bob receives input b, performs the measurement M(b) on ω(a), and outputs the result r.

Classical Capacity: Holevo Theorem
– How much classical information can be stored in a GPT state? Max I(A:R)?
– At most log(n) bits can be stored in GPT(C,u) with C ⊂ R^n.
Alice receives a and sends the state ω(a); Bob measures M and outputs r.

Proof 1: Refining Measurements
Generalised Probabilistic Theory GPT(C,u); states ω ∈ C; measurement {e_i} with Σ_i e_i = u.
Refining measurements:
– If e_i = p f_i + (1−p) g_i with f_i, g_i ∈ C* and 0 < p < 1, then we can refine the measurement to contain the effects p f_i and (1−p) g_i rather than e_i.
Theorem: measurements can be refined so that all effects are extreme points of C* (Krein-Milman theorem).

Proof 2: Extremal Measurements
Generalised Probabilistic Theory GPT(C,u); states ω ∈ C; measurement {e_i} with Σ_i e_i = u.
Convex combinations of measurements:
– M_1 = {e_i} and M_2 = {f_i}: p M_1 + (1−p) M_2 = {p e_i + (1−p) f_i}
If a measurement M has m > n outcomes:
– Carathéodory: there exists a subset S of at most n effects such that u lies in their conic hull.
– Hence M = p M_1 + (1−p) M_2, where M_1 has at most n outcomes and M_2 has m−1 outcomes.
By recurrence: every measurement can be written as a convex combination of measurements with at most n effects.
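The decomposition step above can be verified on a concrete instance. A worked sketch in the classical GPT with n = 2 and u = (1,1): a 3-outcome measurement M written as ½·M1 + ½·M2, where M1 has only two nonzero effects (all the numbers are chosen for illustration):

```python
import numpy as np

u = np.array([1.0, 1.0])

# Three measurements on the classical 2-dimensional GPT.
M  = [np.array([0.5, 0.2]), np.array([0.3, 0.3]), np.array([0.2, 0.5])]
M1 = [np.array([0.7, 0.4]), np.array([0.3, 0.6]), np.array([0.0, 0.0])]
M2 = [np.array([0.3, 0.0]), np.array([0.3, 0.0]), np.array([0.4, 1.0])]

p = 0.5
for e, f, g in zip(M, M1, M2):
    assert np.all(f >= 0) and np.all(g >= 0)      # valid effects in C*
    assert np.allclose(e, p * f + (1 - p) * g)    # e_i = p f_i + (1-p) g_i
for meas in (M, M1, M2):
    assert np.allclose(sum(meas), u)              # each sums to the unit u
```

M1's third effect is zero, so it is effectively a measurement with n = 2 outcomes, exactly as the recursion requires.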

Proof 3: Classical Capacity of GPT
Holevo Theorem for GPT(C,u) with C ⊂ R^n:
– Refining a measurement, or decomposing it into a convex combination, can only increase the capacity of the channel.
– Capacity of the channel ≤ log(# of measurement outcomes) ⇒ capacity of the channel ≤ log(n) bits.
OPEN QUESTION:
– Get better bounds on the classical capacity for specific theories?
Alice receives a and sends the state ω(a); Bob measures M and outputs r.
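The bound can be sanity-checked numerically. A sketch for the classical GPT with C ⊂ R^n: random states and the canonical n-outcome measurement give a channel whose mutual information never exceeds log₂(n) (the states and prior below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4           # dimension of the GPT, so capacity <= log2(n) = 2 bits
n_inputs = 10

# omega(a): random normalised classical states; under the canonical
# measurement, p[a, i] = probability of outcome i given input a.
p = rng.dirichlet(np.ones(n), size=n_inputs)

prior = np.ones(n_inputs) / n_inputs
joint = prior[:, None] * p                  # joint distribution of (A, R)
pr = joint.sum(axis=0)                      # marginal of the outcome R

# I(A:R) = sum joint * log2( joint / (p_A * p_R) )
I = np.sum(joint * np.log2(joint / (prior[:, None] * pr[None, :])))
assert 0.0 <= I <= np.log2(n) + 1e-9
```

The assertion holds for any choice of states and measurement, which is exactly the content of the GPT Holevo bound on this slide.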


Randomised one way communication complexity with positive outcomes
Theorem: A randomised one-way protocol, in which Alice receives a and sends a GPT(C,u) state ω(a) together with one bit of classical communication, and Bob receives b and outputs a nonnegative number r(i,b) depending on his measurement outcome i, produces output C_ab on average on inputs a,b if and only if the matrix (C_ab) admits a cone factorisation with respect to C.

Different Cone factorisations
Theorem: randomised one-way communication with nonnegative outputs using GPT(C,u) states produces output C_ab on average on inputs a,b if and only if the matrix (C_ab) admits a cone factorisation. Different choices of the cone C (positive orthant, SDP cone, general cones) give the linear, SDP, and conic factorisations.


Background: solving NP by LP?
– Famous P problem: linear programming (Khachiyan '79)
– Famous NP-hard problem: travelling salesman problem (TSP)
– A polynomial-size LP for TSP would show P = NP
– Swart '86-'87 claimed to have found such LPs
– Yannakakis '88 showed that any symmetric LP for TSP needs exponential size; Swart's LPs were symmetric, so they couldn't work
– 20-year open problem: what about non-symmetric LPs? There are examples where non-symmetry helps a lot (Kaibel '10)
– Any LP for TSP needs exponential size (Fiorini et al. '12)

Polytope P = conv{vertices} = {x : A_e x ≥ b_e}, with one inequality per facet e.

Combinatorial Polytopes
Travelling Salesman Problem (TSP) polytope:
– R^{n(n−1)/2}: one coordinate per edge of the graph
– Cycle C: v_C = (1,0,0,1,1,…,0)
– P_TSP = conv{v_C}
– Shortest cycle: min over P_TSP of the linear objective given by the edge weights
Correlation polytope:
– Bell polytope with 2 parties, N settings, 2 outcomes
Linear optimisation over these polytopes is NP-hard. Deciding whether a point belongs to the polytope is NP-hard.
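The correlation polytope is small enough to enumerate for tiny n. A sketch, assuming the standard description of COR(n) as the convex hull of the rank-one 0/1 matrices b bᵀ:

```python
import itertools
import numpy as np

# Vertices of COR(n): b b^T for b in {0,1}^n, so 2^n vertices.
n = 3
vertices = [np.outer(b, b) for b in itertools.product([0, 1], repeat=n)]
assert len(vertices) == 2 ** n

# Any convex combination of vertices lies in COR(n) by construction,
# even though deciding membership of a general point is NP-hard.
point = sum(vertices) / len(vertices)
assert np.allclose(point, point.T)              # symmetric
assert np.all((0 <= point) & (point <= 1))      # entries in [0, 1]
```

The exponential number of vertices (and facets) is what makes the small completely positive extension later in the talk surprising.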


Extended Formulations
View the polytope P as the projection π(Q) of a simpler object Q (the extended formulation) in a higher-dimensional space.

Linear extensions: the higher-dimensional object Q is itself a polytope, with P = π(Q). Size of the linear extended formulation = number of facets of Q.

Conic extensions: the extended object Q is the intersection of a cone C with a hyperplane, and the polytope P is its projection.

Conic extensions
– Linear extensions: C = positive orthant
– SDP extensions: C = cone of SDP matrices
– Conic extensions: C = a cone in R^n
Why this construction?
– Small extensions exist for many problems.
– Algorithmics: optimising over a small extended formulation is efficient for linear and SDP extensions.
– It is possible to obtain lower bounds on the size of extensions.
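"Small extensions exist for many problems" can be made concrete with a classic textbook example (not from this talk): the cross-polytope {x : ||x||₁ ≤ 1} has 2ⁿ facets, yet the polytope Q = {(x, y) : −yᵢ ≤ xᵢ ≤ yᵢ, Σᵢ yᵢ = 1} uses only 2n inequalities and projects onto it. A numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

def lift(x):
    # A point (x, y) of Q projecting to x, valid whenever ||x||_1 <= 1:
    # spread the slack 1 - ||x||_1 evenly over the y coordinates.
    y = np.abs(x) + (1 - np.abs(x).sum()) / n
    return x, y

for _ in range(200):
    x = rng.uniform(-1, 1, size=n)
    x /= max(1.0, np.abs(x).sum())          # force x into the ell_1 ball
    x, y = lift(x)
    assert np.all(x <= y + 1e-12) and np.all(-y <= x + 1e-12)
    assert np.isclose(y.sum(), 1.0)         # (x, y) satisfies Q
```

Conversely, any (x, y) in Q has Σ|xᵢ| ≤ Σyᵢ = 1, so the projection is exactly the ball: an exponential facet count collapses to a linear-size lifted description.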


Slack Matrix of a Polytope
P = conv{vertices} = {x : A_e x ≥ b_e}
Slack matrix:
– S_ve = slack of vertex v in facet inequality e (the "distance" between v and e) = A_e x_v − b_e ≥ 0
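The slack matrix is easy to compute for a toy polytope. A sketch for the square [−1,1]², using the slide's convention A_e x ≥ b_e and S_ve = A_e·x_v − b_e:

```python
import numpy as np

# Square [-1,1]^2: 4 vertices, 4 facet inequalities A_e x >= b_e.
vertices = np.array([[ 1,  1], [ 1, -1], [-1,  1], [-1, -1]])
A = np.array([[ 1, 0], [-1, 0], [0,  1], [0, -1]])   # facet normals
b = np.array([-1, -1, -1, -1])

S = vertices @ A.T - b        # S[v, e] = A_e . x_v - b_e
print(S)

assert np.all(S >= 0)                       # slacks are nonnegative
assert np.all((S == 0).sum(axis=1) == 2)    # each vertex lies on 2 facets
```

The zero pattern of S records the vertex-facet incidences; it is the nonnegative-rank structure of exactly this matrix that the factorisation theorem below translates into extension complexity.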

Factorisation Theorem (Yannakakis '88)
Theorem: A polytope P has a cone-C extension iff its slack matrix has a cone-C factorisation, iff Alice and Bob can solve the communication complexity problem based on S_ev by sending GPT(C,u) states.
Alice receives e and sends a GPT(C) state; Bob receives v and outputs s with expectation S_ev.

From Foundations to Combinatorial Optimisation
S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012
– There do not exist polynomial-size linear extensions of the TSP polytope.

A Classical versus Quantum gap
Alice receives a, Bob receives b; using classical or quantum communication, Bob outputs m with expectation M_ab.

Theorem: the linear extension complexity of the correlation polytope is 2^Ω(n).
Alice receives a, Bob receives b; using classical communication, Bob outputs m with expectation M_ab.

Linear extension complexity of polytopes

OPEN QUESTION
Prove that the SDP (quantum) extension complexity of the TSP, correlation, etc. polytopes is exponential.
– Strongly conjectured to be true.
– The converse would almost imply P = NP.
– Requires a method to lower bound quantum communication complexity in the average-output model (one cannot give the parties shared randomness).

From Foundations to Combinatorial Optimisation
S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv:
– GPTs based on the cone of completely positive matrices allow an exponential saving with respect to classical (and conjectured quantum) communication.
– All combinatorial polytopes (vertices computable with a poly-size circuit) have a poly-size completely positive extension.

Recall: Completely Positive and Co-positive Cones

Completely Positive extension of the Correlation Polytope
Theorem: The correlation polytope COR(n) has a size-(2n+1) extension for the completely positive cone.
Sketch of proof:
– Consider an arbitrary linear optimisation over COR(n).
– Use the equivalence (Burer 2009) with linear optimisation over the completely positive cone C*_{2n+1}.
– This implies COR(n) is the projection of the intersection of C*_{2n+1} with a hyperplane.

Polynomially definable 0/1-polytopes

Polynomially definable 0/1-polytopes
Theorem (Maksimenko 2012): All polynomially definable 0/1-polytopes in R^d are projections of faces of the correlation polytope COR(poly(d)).
Corollary: All polynomially definable 0/1-polytopes in R^d have a poly(d)-size extension for the completely positive cone.
– Generalises a large number of special cases proved before.
– A "Cook-Levin"-like theorem for combinatorial polytopes.

Summary
Generalised Probabilistic Theories:
– Holevo theorem for GPTs.
Connection between classical/quantum/GPT communication complexity and extensions of polytopes:
– Exponential lower bound on the linear extension complexity of the COR and TSP polytopes.
– All 0/1 combinatorial polytopes have small extensions for the completely positive cone.
– Hence GPT(completely positive cone) allows an exponential saving with respect to classical (and conjectured quantum) communication. Use this to rule out the theory? (Of course there are many other reasons to rule out the theory using other axioms.)
OPEN QUESTIONS: gaps between classical/quantum/GPT for:
– other models of communication complexity?
– models of computation?

From Foundations to Combinatorial Optimisation
– M. Yannakakis, Expressing Combinatorial Problems by Linear Programs, STOC 1988
– J. Gouveia, P. Parrilo, R. Thomas, Lifts of Convex Sets and Cone Factorizations, Math. Oper. Res.
– S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012 (there do not exist polynomial-size linear extensions of the TSP polytope)
– S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv: (all combinatorial polytopes with vertices computable by a poly-size circuit have a poly-size completely positive extension; GPTs based on the cone of completely positive matrices allow an exponential saving with respect to classical, and conjectured quantum, communication)