The Solution of the Kadison-Singer Problem Adam Marcus (Crisply, Yale) Daniel Spielman (Yale) Nikhil Srivastava (MSR/Berkeley) IAS, Nov 5, 2014

The Kadison-Singer Problem ('59). A positive solution is equivalent to: Anderson's Paving Conjectures ('79, '81), the Bourgain-Tzafriri Conjecture ('91), the Feichtinger Conjecture ('05), and many others. Implied by: Akemann and Anderson's Paving Conjecture ('91) and Weaver's KS₂ Conjecture.

The Kadison-Singer Problem ('59). Let be a maximal Abelian subalgebra of , the algebra of bounded linear operators on . Let be a pure state. Is the extension of to unique? See Nick Harvey's survey or Terry Tao's blog.
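The symbols on this slide were images and did not survive; a hedged reconstruction of the problem as it is usually stated (notation mine, not from the slide):
\[
\text{Let } \mathcal{D} \subset B(\ell_2) \text{ be the algebra of diagonal operators, a maximal Abelian subalgebra of } B(\ell_2).
\]
\[
\text{If } \rho : \mathcal{D} \to \mathbb{C} \text{ is a pure state, is its extension to a state on } B(\ell_2) \text{ unique?}
\]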

Anderson's Paving Conjecture ('79). For all there is a k so that for every n-by-n symmetric matrix A with zero diagonal, there is a partition of into . Recall:
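The quantifiers and norms here were images; a hedged reconstruction of the standard finite-dimensional paving statement (my notation):
\[
\forall\, \varepsilon > 0\ \exists\, k:\ \text{for every } n \times n \text{ symmetric } A \text{ with zero diagonal, there is a partition of } \{1,\dots,n\} \text{ into } S_1,\dots,S_k
\]
\[
\text{such that } \|A_{S_j,S_j}\| \le \varepsilon \|A\| \ \text{ for all } j,
\]
where A_{S_j,S_j} denotes the principal submatrix indexed by S_j.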

Anderson’s Paving Conjecture ‘79 For all there is a k so that for every self-adjoint bounded linear operator A on, there is a partition of into

Anderson's Paving Conjecture ('79). For all there is a k so that for every n-by-n symmetric matrix A with zero diagonal, there is a partition of into . The conjecture is unchanged if we restrict to projection matrices. [Casazza, Edidin, Kalra, Paulsen '07]

Anderson's Paving Conjecture ('79). Equivalent to [Harvey '13]: there exist an and a k so that for such that and , there exists a partition of into k parts s.t.

Moments of Vectors The moment of vectors in the direction of a unit vector is

Vectors with Spherical Moments (also called isotropic position). For every unit vector
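The displayed condition is lost; a hedged reconstruction of the spherical-moment (isotropic) condition (notation mine):
\[
\sum_{i=1}^m \langle v_i, u\rangle^2 = 1 \ \text{ for every unit vector } u
\qquad\Longleftrightarrow\qquad
\sum_{i=1}^m v_i v_i^{*} = I .
\]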

Partition into Approximately ½-Spherical Sets because
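A hedged sketch of what the missing formulas say (notation mine): we want each part to carry about half of every moment,
\[
\sum_{i \in S_j} v_i v_i^{*} \approx \tfrac{1}{2} I \ \ (j = 1,2),
\qquad\text{which is plausible because}\qquad
\sum_{i=1}^m v_i v_i^{*} = I .
\]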

Big vectors make this difficult

Weaver's Conjecture KS₂. There exist positive constants and so that if all , then there exists a partition into S₁ and S₂ with . Implies the Akemann-Anderson Paving Conjecture, which implies Kadison-Singer.
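The constants were images; a hedged reconstruction of Weaver's KS₂ conjecture in the normalized form used in this talk (symbols mine):
\[
\exists\, \alpha, \varepsilon > 0:\ \text{if } \|v_i\|^2 \le \alpha \text{ for all } i \text{ and } \sum_i v_i v_i^{*} = I, \text{ then some partition } S_1 \cup S_2 \text{ satisfies }
\Big\|\sum_{i \in S_j} v_i v_i^{*}\Big\| \le 1 - \varepsilon,\ \ j = 1,2 .
\]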

Main Theorem. For all , if all , then there exists a partition into S₁ and S₂ with . Implies the Akemann-Anderson Paving Conjecture, which implies Kadison-Singer.
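The bound on the slide is lost; the version I recall from the Marcus-Spielman-Srivastava paper (hedged, notation mine) reads:
\[
\text{For all } \alpha > 0:\ \text{if } \|v_i\|^2 \le \alpha \text{ for all } i \text{ and } \sum_i v_i v_i^{*} = I, \text{ then some partition } S_1 \cup S_2 \text{ satisfies }
\Big\|\sum_{i \in S_j} v_i v_i^{*}\Big\| \le \frac{\big(1+\sqrt{2\alpha}\big)^2}{2},\ \ j = 1,2 .
\]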

A Random Partition? Works with high probability if all (by Tropp '11, a variant of Matrix Chernoff, Rudelson). Troublesome case: each is a scaled axis vector, with copies of each. The chance that all copies in one direction land in the same set is , and the chance that there exists a direction in which all land in the same set is .
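A hedged reconstruction of the computation behind the troublesome case (parameters mine): take 1/α copies of each scaled standard basis vector, so the isotropy condition holds and every squared norm is exactly α. Then
\[
\Pr[\text{all } 1/\alpha \text{ copies of direction } e_j \text{ land in one set}] = 2 \cdot 2^{-1/\alpha},
\qquad
\Pr[\exists\, j \text{ with all copies on one side}] \approx n \cdot 2^{1 - 1/\alpha},
\]
which is no longer small once α is around 1/log₂ n, so a uniformly random partition only handles squared norms of order 1/log n.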

The Graphical Case. From a graph G = (V,E) with |V| = n and |E| = m, create m vectors in n dimensions: If G is a good d-regular expander, all eigenvalues are close to d, and the vectors are very close to having spherical moments.
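The vector construction was an image; a hedged guess at the standard construction (the normalization is my assumption): one vector per edge, built from the difference of the endpoint indicator vectors,
\[
v_{(a,b)} = \frac{1}{\sqrt{d}}\,(e_a - e_b) \ \text{ for each edge } (a,b) \in E,
\qquad
\sum_{(a,b) \in E} v_{(a,b)}\, v_{(a,b)}^{T} = \frac{1}{d}\, L_G ,
\]
where L_G is the graph Laplacian. For a good d-regular expander every nonzero eigenvalue of L_G is close to d, so this sum is close to the identity on the space orthogonal to the all-ones vector.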

Partitioning Expanders. One can partition the edges of a good expander to obtain two expanders. Broder-Frieze-Upfal '94: construct a random partition guaranteeing degree at least d/4 and some expansion. Frieze-Molloy '99: Lovász Local Lemma, good expander. The probability it works is low, but one can prove it is non-zero.

We want

Consider the expected polynomial with a random partition.

Our approach: 1. Prove the expected characteristic polynomial has real roots. 2. Prove its largest root is at most . 3. Prove is an interlacing family, so there exists a partition whose polynomial has largest root at most .

The Expected Polynomial Indicate choices by :

The Expected Polynomial

Mixed Characteristic Polynomials. For independently chosen random vectors , is their mixed characteristic polynomial. Theorem: only depends on and , and is real-rooted.
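The defining formula was an image; as I recall it from the Marcus-Spielman-Srivastava paper (hedged, notation mine), with A_i = E[v_i v_i^*]:
\[
\mathbb{E}\,\chi\!\Big(\sum_{i=1}^m v_i v_i^{*}\Big)(x)
= \mu[A_1,\dots,A_m](x)
:= \Big(\prod_{i=1}^m \big(1 - \partial_{z_i}\big)\Big) \det\!\Big(xI + \sum_{i=1}^m z_i A_i\Big)\Big|_{z_1 = \cdots = z_m = 0} .
\]
In particular the expected characteristic polynomial depends only on the matrices A_i, and, as the following slides argue, it is real-rooted.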

Mixed Characteristic Polynomials For independently chosen random vectors is their mixed characteristic polynomial. The constant term is the mixed discriminant of

The constant term. When diagonal and , is a matrix permanent. Van der Waerden's Conjecture becomes: if and , is minimized when . Proved by Egorychev and Falikman '81. Simpler proof by Gurvits (see Laurent-Schrijver).
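For reference, the Van der Waerden conjecture (now the Egorychev-Falikman theorem) in its usual form:
\[
M \ n\times n \text{ doubly stochastic} \ \Longrightarrow\ \operatorname{per}(M) \ \ge\ \frac{n!}{n^n},
\quad\text{with equality iff } M_{ij} = \tfrac{1}{n} \text{ for all } i,j .
\]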

The constant term. For Hermitian matrices, is the mixed discriminant . If and , is minimized when . This was a conjecture of Bapat. Gurvits proved a lower bound on :
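A hedged statement of the mixed-discriminant analogue (Bapat's conjecture, proved by Gurvits), from memory and in my own notation:
\[
A_1,\dots,A_n \succeq 0,\ \ \operatorname{Tr}(A_i) = 1,\ \ \sum_{i=1}^n A_i = I
\ \Longrightarrow\
D(A_1,\dots,A_n) \ \ge\ \frac{n!}{n^n},
\]
with the minimum attained at A_i = I/n for all i.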

Real Stable Polynomials. A multivariate generalization of real-rootedness. Complex roots of come in conjugate pairs, so real rooted iff no roots with positive imaginary part.

Real Stable Polynomials. is real stable if it has no roots in the upper half-plane: for all i implies . Isomorphic to Gårding's hyperbolic polynomials. Used by Gurvits (in his second proof). See the surveys of Pemantle and Wagner.
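A hedged reconstruction of the definition (standard, notation mine):
\[
p \in \mathbb{R}[z_1,\dots,z_m] \text{ is real stable if } \operatorname{Im}(z_i) > 0 \text{ for all } i \ \implies\ p(z_1,\dots,z_m) \neq 0 .
\]
For univariate polynomials with real coefficients, real stable is the same as real rooted.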

Real Stable Polynomials Borcea-Brändén ‘08: For PSD matrices is real stable
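The matrices were images; a hedged reconstruction of the Borcea-Brändén fact being invoked (notation mine):
\[
A_1,\dots,A_m \succeq 0 \ \Longrightarrow\ \det\!\Big(xI + \sum_{i=1}^m z_i A_i\Big) \ \text{is real stable in } (x, z_1,\dots,z_m) .
\]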

Real Stable Polynomials implies is real rooted real stable implies is real stable real stable (Lieb Sokal ‘81)
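A hedged summary of the closure facts this slide lists (standard statements, my notation):
\[
p(z_1,\dots,z_m) \text{ real stable} \ \Rightarrow\ p(z_1,\dots,z_{m-1}, t) \text{ real stable for every real } t ;
\qquad
p \text{ real stable} \ \Rightarrow\ (1 - \partial_{z_i})\, p \text{ real stable (Lieb-Sokal '81)} ;
\]
and a univariate real stable polynomial is exactly a real-rooted one. Applying the Lieb-Sokal operators to the determinant from the previous slide and then setting the variables to real values is what makes the mixed characteristic polynomial real-rooted.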

Real Roots So, every mixed characteristic polynomial is real rooted.

Interlacing Polynomial interlaces if
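A hedged reconstruction of the definition (standard, my notation):
\[
p(x) = \prod_{i=1}^{n-1}(x - \alpha_i) \ \text{ interlaces }\ q(x) = \prod_{i=1}^{n}(x - \beta_i)
\quad\text{if}\quad
\beta_1 \le \alpha_1 \le \beta_2 \le \alpha_2 \le \cdots \le \alpha_{n-1} \le \beta_n .
\]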

Common Interlacing. and have a common interlacing if one can partition the line into intervals so that each contains one root from each polynomial.

Common Interlacing. If p_1 and p_2 have a common interlacing, then for some i the largest root of p_i is at most the largest root of the average.

Without a common interlacing

Common Interlacing. and have a common interlacing iff is real rooted for all.
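A hedged reconstruction of the criterion (a classical fact, notation mine):
\[
p_1, p_2 \ \text{have a common interlacing}
\iff
\lambda p_1 + (1-\lambda) p_2 \ \text{is real rooted for every } \lambda \in [0,1] .
\]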

Interlacing Family of Polynomials. is an interlacing family if its members can be placed on the leaves of a tree so that, when every node is labeled with the average of the leaves below it, siblings have common interlacings.

Interlacing Family of Polynomials Theorem: There is a so that
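The conclusion was an image; a hedged reconstruction (my notation, writing p_∅ for the polynomial at the root of the tree, i.e. the average over all leaves):
\[
\exists\ \text{a leaf } \ell \ \text{such that}\ \lambda_{\max}(p_\ell) \ \le\ \lambda_{\max}(p_{\varnothing}) .
\]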

Our Interlacing Family Indicate choices by :

Common Interlacing. and have a common interlacing iff is real rooted for all . We need to show that is real rooted. It is a mixed characteristic polynomial, so is real-rooted: set with probability , and keep uniform for .

An upper bound on the roots. Theorem: If and , then . An upper bound of 2 is trivial (in our special case); we need any constant strictly less than 2.
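The hypotheses were images; the statement as I recall it from the paper (hedged, notation mine):
\[
\sum_{i=1}^m A_i = I \ \text{ and }\ \operatorname{Tr}(A_i) \le \varepsilon \ \text{for all } i
\ \Longrightarrow\
\text{the largest root of } \mu[A_1,\dots,A_m](x) \ \text{is at most}\ (1+\sqrt{\varepsilon}\,)^2 ,
\]
which is strictly less than 2 once ε is small enough, as the remark on the slide requires.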

An upper bound on the roots Define: is an upper bound on the roots of if for Eventually set so want

Action of the operators

The roots of Define: Theorem (Batson-S-Srivastava): If is real rooted and

The roots of Theorem (Batson-S-Srivastava): If is real rooted and Gives a sharp upper bound on the roots of associated Laguerre polynomials.

The roots of Theorem (Batson-S-Srivastava): If is real rooted and Proof: Define Set, so Algebra + is convex and monotone decreasing
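A hedged sketch of the barrier argument behind this slide, with the barrier function written out (my notation; the exact constants in the talk may differ):
\[
\Phi_p(b) := \frac{p'(b)}{p(b)} = \sum_{i} \frac{1}{b - \lambda_i} \qquad (\lambda_i \text{ the roots of } p).
\]
If b is above the roots of p and Φ_p(b) ≤ 1 − 1/δ, then b + δ is above the roots of (1 − ∂)p and Φ_{(1−∂)p}(b + δ) ≤ Φ_p(b); the proof is algebra plus the fact that Φ_p is positive, decreasing, and convex above the roots.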

An upper bound on the roots Theorem: If and then

A robust upper bound Define: is an -upper bound on if it is an -max, in each, and Theorem: If is an -upper bound on, then is an -upper bound on, for

A robust upper bound. Theorem: If is an -upper bound on , and is an -upper bound on . Proof: the same as before, but we need to know that is decreasing and convex in , above the roots.

A robust upper bound. Proof: the same as before, but we need to know that is decreasing and convex in , above the roots. This follows from a theorem of Helton and Vinnikov '07: every bivariate real stable polynomial can be written
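The representation was an image; the Helton-Vinnikov determinantal representation, stated from memory and hedged:
\[
p(x,y) \ \text{bivariate, real stable, of degree } d
\ \Longrightarrow\
p(x,y) = \pm \det\big(xA + yB + C\big)
\]
for some d × d positive semidefinite A, B and symmetric C.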

A robust upper bound. Proof: the same as before, but we need to know that is decreasing and convex in , above the roots. Or, as pointed out by Renegar, it follows from a theorem of Bauschke, Güler, Lewis, and Sendov '01. Or, see Terry Tao's blog for a (mostly) self-contained proof.

Getting Started

at
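A hedged sketch of the starting computation (my notation, with d the dimension): evaluate the determinant and the barrier functions at the point (t, ..., t), using Σ_i A_i = I and Tr(A_i) ≤ ε:
\[
p(z_1,\dots,z_m) = \det\!\Big(\sum_i z_i A_i\Big),
\qquad
p(t,\dots,t) = \det(tI) = t^{d},
\qquad
\Phi^i_p(t,\dots,t) = \operatorname{Tr}\!\big(A_i\,(tI)^{-1}\big) = \frac{\operatorname{Tr}(A_i)}{t} \le \frac{\varepsilon}{t} .
\]
Running the barrier step once per variable then bounds the largest root of the mixed characteristic polynomial by roughly t + 1/(1 − ε/t), which is minimized at (1 + √ε)².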

An upper bound on the roots Theorem: If and then

A probabilistic interpretation For independently chosen random vectors with finite support such that and then
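The hypotheses and conclusion were images; the theorem as I recall it from the paper (hedged, notation mine):
\[
v_1,\dots,v_m \ \text{independent, finite support},\quad \sum_i \mathbb{E}\,[v_i v_i^{*}] = I,\quad \mathbb{E}\,\|v_i\|^2 \le \varepsilon \ \text{for all } i
\ \Longrightarrow\
\Pr\Big[\Big\|\sum_i v_i v_i^{*}\Big\| \le (1+\sqrt{\varepsilon}\,)^2\Big] > 0 .
\]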

Main Theorem. For all , if all , then there exists a partition into S₁ and S₂ with . Implies the Akemann-Anderson Paving Conjecture, which implies Kadison-Singer.

Anderson's Paving Conjecture ('79). Reduction by Casazza-Edidin-Kalra-Paulsen '07 and Harvey '13: there exist an and a k so that if all and , then there exists a partition of into k parts s.t. We can prove this using the same technique.

A conjecture. For all there exists an so that if all , then there exists a partition into S₁ and S₂ with

Another conjecture If and then is largest when

Questions How do operations that preserve real rootedness move the roots and the Stieltjes transform? Can the partition be found in polynomial time? What else can one construct this way?