Slide 1: Bayesianism, Convexity, and the Quest Towards Optimal Algorithms
Boaz Barak (Harvard University / Microsoft Research)
Slide 2: Talk Plan
- Dubious historical analogy.
- Philosophize about automating algorithms.
- Wave hands about convexity and the Sum of Squares algorithm.
- Sudden shift to Bayesianism vs. frequentism.
- Non-results on the planted clique problem (or, how to annoy your friends).
Skipping today:
- Sparse coding / dictionary learning / tensor prediction [B-Kelner-Steurer'14,'15; B-Moitra'15]
- Unique Games Conjecture / small-set expansion [..., B-Brandao-Harrow-Kelner-Steurer-Zhou'12, ...]
- Connections to quantum information theory
Slide 3: Prologue: Solving Equations
- Babylonians (~2000 BC): solutions for quadratic equations.
- del Ferro-Tartaglia-Cardano-Ferrari (1500s): solutions for cubics and quartics.
- van Roomen/Viete (1593): a "challenge to all mathematicians in the world".
- Euler (1740s), Vandermonde (1777): special cases of quintics, ...
- Gauss (1796): construction of the 17-gon.
- Ruffini-Abel-Galois (early 1800s): some equations can't be solved in radicals; characterization of the solvable equations; birth of group theory.
The 17-gon construction is now "boring": a few lines of Mathematica.
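To make "solved in radicals" concrete: the quadratic formula, and Cardano's formula for the depressed cubic x^3 + px + q = 0 (standard formulas, included here only as a reminder):

```latex
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\qquad\text{(roots of } ax^2 + bx + c = 0\text{)},
```

```latex
x = \sqrt[3]{-\frac{q}{2} + \sqrt{\frac{q^2}{4} + \frac{p^3}{27}}}
  + \sqrt[3]{-\frac{q}{2} - \sqrt{\frac{q^2}{4} + \frac{p^3}{27}}}.
```

Galois theory characterizes exactly which equations admit formulas of this kind, which is the sense in which such constructions became "boring".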
Slide 4: A Prototypical TCS Paper
Interesting problem → efficient algorithm (e.g., MAX-FLOW is in P), or interesting problem → hardness reduction (e.g., MAX-CUT is NP-hard).
Can we make algorithms boring? Can we reduce creativity in algorithm design? Can we characterize the "easy" problems?
Slide 5: Characterizing Easy Problems
Goal: a single simple algorithm that efficiently solves every problem that can be efficiently solved.
Trivially true: an algorithm that enumerates Turing machines (Levin's universal search) achieves this in a formal sense, which shows the goal as stated is too weak.
Revised goal: a single simple algorithm that is conjectured to be optimal in some interesting domain of problems.
Byproducts: new algorithms, a theory of computational knowledge.
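A minimal sketch of why the "trivially true" claim holds: Levin-style universal search dovetails over all programs, giving program i an exponentially decaying share of the time budget and returning the first verified answer. The helpers `enumerate_programs`, `run_for_steps`, and `verify` are hypothetical stand-ins for a program enumeration, a step-bounded interpreter, and an efficient solution checker.

```python
def universal_search(instance, enumerate_programs, run_for_steps, verify):
    """Levin-style universal search (sketch): optimal up to a constant
    factor for any problem whose solutions can be efficiently verified."""
    budget = 1
    while True:
        # One phase: program i gets roughly a 2^-i share of ~2*budget steps.
        for i, prog in enumerate(enumerate_programs()):
            steps = budget >> i  # budget / 2^i, rounded down
            if steps == 0:
                break
            result = run_for_steps(prog, instance, steps)  # None if unfinished
            if result is not None and verify(instance, result):
                return result  # first verified answer wins
        budget *= 2  # double the budget and rerun everything
```

If some program with index i solves the problem in time T, this schedule finds a verified answer within roughly 2^i * T steps: a constant-factor overhead depending on the program but not on the input, which is exactly why the result is formally true yet uninformative.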
Slide 6: Domain: Combinatorial Optimization*
Maximize or minimize an objective subject to constraints.
Examples: satisfiability, graph partitioning and coloring, traveling salesperson, matching, ...
Non-examples: integer factoring, determinant.
Characteristics: ...
Hope: make this formal for some subclass of optimization.
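A canonical problem of this shape (an illustrative choice, not singled out on this slide) is MAX-CUT: over cut indicators x in {±1}^V,

```latex
\mathrm{MAXCUT}(G) \;=\; \max_{x \in \{\pm 1\}^{V}} \;\sum_{\{i,j\} \in E} \frac{1 - x_i x_j}{2},
```

where each edge contributes 1 exactly when its endpoints get opposite signs.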
Slide 7: Theme: Convexity
Slide 8: Convexity in Optimization
The recipe: interesting problem → (creativity!!) → convex problem → general solver.
The Sum of Squares algorithm is an algorithmic version of works related to Hilbert's 17th problem [Artin'27, Krivine'64, Stengle'74].
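A minimal sketch of the "convexify, then call a general solver" pipeline, using the degree-2 sum-of-squares (SDP) relaxation of MAX-CUT as the running example. It assumes the `cvxpy` and `numpy` packages; the small example graph is made up for illustration.

```python
import cvxpy as cp
import numpy as np

# Example graph (made up for illustration): a 5-cycle, as an edge list.
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

# Degree-2 SOS relaxation of MAX-CUT: relax x in {-1,1}^n to a PSD
# matrix X ~ "E[x x^T]" with unit diagonal (the pseudo-moment matrix).
X = cp.Variable((n, n), PSD=True)
objective = cp.Maximize(sum((1 - X[i, j]) / 2 for (i, j) in edges))
constraints = [cp.diag(X) == 1]
prob = cp.Problem(objective, constraints)
prob.solve()

print("SDP upper bound on MAX-CUT:", prob.value)
# For the 5-cycle the true optimum is 4; the SDP value is slightly above it.
```

The creative step is hidden in choosing the relaxation; once the problem is convex, the solver is generic.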
Slide 9: Talk Plan
- Dubious historical analogy.
- Philosophize about automating algorithms.
- Wave hands about convexity and the Sum of Squares algorithm.
- Sudden shift to Bayesianism vs. frequentism.
- Non-results on the planted clique problem.
Slide 10: Frequentists vs. Bayesians
The frequentist objection to assigning probabilities to fixed but unknown quantities (such as a far-out digit of π): "Nonsense! The digit is either 7 or isn't."
Slide 11: Computational Version
Slides 12-13: Making This Formal
Classical Bayesian uncertainty is captured by a posterior distribution; its computational analog is a pseudo-distribution.
Theorem: ...
A general perspective, the computational analog of Bayesian probabilities:
- Algorithms : Proof systems
- Frequentist : Bayesian
- Pseudorandom : Pseudo-distribution
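The standard formalization of the last row (a textbook definition, consistent with the deck's terminology): a degree-d pseudo-distribution over {±1}^n is given by a pseudo-expectation operator, a linear functional on polynomials of degree at most d satisfying

```latex
\tilde{\mathbb{E}}[1] = 1,
\qquad
\tilde{\mathbb{E}}[p^2] \ge 0 \;\;\text{for every polynomial } p \text{ with } \deg(p) \le d/2.
```

A computationally bounded observer (a degree-d SOS proof system) cannot distinguish such an object from the moments of an actual distribution.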
Slide 14: Planted Clique Problem [Karp'76, Kucera'95]
Distinguish a sample of G(n, 1/2) from one with a k-clique planted on a random vertex subset (and find the planted clique). Efficient algorithms are known for k = Ω(√n).
A central problem in average-case complexity, related to problems in statistics, sparse recovery, finding equilibria, ... [Hazan-Krauthgamer'09, Koiran-Zouzias'12, Berthet-Rigollet'12]
Can SOS do better?
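An illustrative sketch of the problem and of the basic spectral heuristic in the k = Ω(√n) regime (following the approach of [Alon-Krivelevich-Sudakov'98]); the function names are mine, and exact recovery would need a cleanup step that this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def planted_clique(n, k):
    """Sample G(n, 1/2) and plant a k-clique on a random vertex subset."""
    A = rng.integers(0, 2, size=(n, n))
    A = np.triu(A, 1)
    A = A + A.T                      # symmetric 0/1 adjacency, zero diagonal
    clique = rng.choice(n, size=k, replace=False)
    A[np.ix_(clique, clique)] = 1    # plant the clique
    np.fill_diagonal(A, 0)
    return A, set(clique)

def spectral_candidate(A, k):
    """Top eigenvector of the centered matrix correlates with the clique
    when k is a large multiple of sqrt(n)."""
    n = A.shape[0]
    M = A - 0.5 * (np.ones((n, n)) - np.eye(n))  # subtract the G(n,1/2) mean
    w, V = np.linalg.eigh(M)
    v = V[:, -1]                     # leading eigenvector
    return set(np.argsort(-np.abs(v))[:k])

A, clique = planted_clique(n=2000, k=100)   # k well above sqrt(2000) ~ 45
guess = spectral_candidate(A, 100)
print("overlap with planted clique:", len(guess & clique))
```

The open question on the slide is whether SOS can beat the √n threshold that this spectral approach achieves.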
Slide 15: The Bug [Pisier]
The concentration bound (in the Meka-Wigderson proof) is false.
Slides 16-17: MW's "Moral" Error
Following A. Einstein: pseudo-distributions should be as simple as possible, but not simpler. That is, a pseudo-distribution should have maximum entropy while still respecting the data.
MW's construction violated this Bayesian reasoning.
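The maximum-entropy principle invoked here is the classical one (stated for intuition; the notation is not from the deck): among all distributions consistent with the observed data, pick

```latex
\mu^{*} \;=\; \operatorname*{argmax}_{\mu \,:\, \mathbb{E}_{\mu}[f_i] = c_i \;\forall i} \; H(\mu),
\qquad
H(\mu) = -\sum_{x} \mu(x)\log \mu(x),
```

whose solution has the exponential-family form μ*(x) ∝ exp(Σᵢ λᵢ fᵢ(x)). Committing to more structure than the constraints force, as MW's construction did, is exactly the "simpler than possible" error.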
Slides 18-19: Why Is MW's Error Interesting?
- It shows that SoS captures Bayesian reasoning in a way that other algorithms do not.
- It suggests a new way to define what a computationally bounded observer knows about some quantity... and a more principled way to design algorithms based on such knowledge (see [B-Kelner-Steurer'14,'15]).
- Even if SoS is not the optimal algorithm we're looking for, the dream of a more general theory of hardness, easiness, and knowledge is worth pursuing.
Thanks!!