Ryan O’Donnell, Carnegie Mellon University (analysisofbooleanfunctions.org)
A function f : {−1,+1}^n → {−1,+1} is identified with the set S ⊆ {−1,+1}^n on which it equals +1, i.e. f = ±1_S. [Picture: the discrete cube {−1,+1}^n, two-colored into the +1 points (S) and the −1 points.]
Form a “ρ-correlated” pair (x, x′): pick x ∈ {−1,+1}^n uniformly at random, and for each 1 ≤ i ≤ n independently set x′_i = x_i with probability ρ, and let x′_i be uniformly random with probability 1−ρ. [Picture: two ±1 strings agreeing in most coordinates.]
[Picture: f = ±1_S on {−1,+1}^n, with a .9-correlated pair (x, x′) drawn as two nearby points of the cube, which usually land on the same side of the boundary of S.]
ρ-Sensitivity[f] := Pr[f(x) ≠ f(x′)] for a ρ-correlated pair (x, x′): a kind of measure of the “boundary size” of S. We’ll focus on “volume-½” sets S; equivalently, “balanced” f, meaning Pr_x[f(x) = +1] = ½.
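A minimal numerical sketch of these definitions (my own illustration, not from the talk; the helper names are mine): sample ρ-correlated pairs and estimate ρ-Sens[f] = Pr[f(x) ≠ f(x′)] by Monte Carlo, here for f = Majority on n = 101 bits.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho_correlated_pair(n, rho, size):
    """Sample `size` rho-correlated pairs (x, x') of uniform +-1 strings:
    x'_i = x_i with probability rho, re-randomized with probability 1 - rho."""
    x = rng.choice([-1, 1], size=(size, n))
    fresh = rng.choice([-1, 1], size=(size, n))
    x_prime = np.where(rng.random((size, n)) < 1 - rho, fresh, x)
    return x, x_prime

def rho_sens(f, n, rho, size=200_000):
    """Monte Carlo estimate of rho-Sens[f] = Pr[f(x) != f(x')]."""
    x, x_prime = rho_correlated_pair(n, rho, size)
    return np.mean(f(x) != f(x_prime))

majority = lambda x: np.sign(x.sum(axis=1))   # n odd, so the sum is never 0

# For large n this tends to (arccos rho)/pi ~ 0.144 (Sheppard's formula, later).
print(rho_sens(majority, n=101, rho=0.9))
```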
Which balanced f : {−1,+1}^n → {−1,+1} minimizes ρ-Sensitivity[f]? (The ρ-isoperimetric problem on the discrete cube.)
Social choice interpretation: an election with n voters and 2 candidates named ±1. f : {−1,+1}^n → {−1,+1} is the voting rule: x_j ∈ {−1,+1} is the j-th voter’s preference, and f(x) = f(x_1, …, x_n) is the winner of the election. E.g.: f(x) = Majority(x) = sgn(x_1 + ⋯ + x_n); f(x) = ElectoralCollege(x); f(x) = +1 (not balanced).
Impartial Culture Assumption [GK’68]: voters’ preferences are uniformly random. “Faulty voting machine” twist: each vote is recorded correctly with probability ρ, and changed to a random vote with probability 1−ρ. Then ρ-Sens[f] = Pr[the faulty machines affect the outcome].
Which balanced f : {−1,+1}^n → {−1,+1} minimizes ρ-Sensitivity[f]? Answer: dictatorships, f(x) = x_j (and negated dictatorships, f(x) = −x_j).
Theorem: for every balanced f : {−1,+1}^n → {−1,+1}, ρ-Sens[f] ≥ ρ-Sens[±Dictator] = (1−ρ)/2. (A dictator’s outcome changes only when vote j is rerandomized to the opposite value, which happens with probability (1−ρ)·½.) Proof of the inequality: Fourier analysis of Boolean functions.
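To make the role of Fourier analysis concrete, here is a brute-force sketch (my own illustration, feasible only for tiny n, with helper names of my choosing): compute every Fourier coefficient f̂(S) = E_x[f(x)·∏_{i∈S} x_i] and use the identity ρ-Sens[f] = ½ − ½ Σ_S ρ^{|S|} f̂(S)², which for a dictator gives exactly (1−ρ)/2.

```python
import itertools
import numpy as np

def fourier_coefficients(f, n):
    """All Fourier coefficients of f : {-1,+1}^n -> {-1,+1}, by brute force:
    f_hat(S) = E_x[ f(x) * prod_{i in S} x_i ]."""
    xs = np.array(list(itertools.product([-1, 1], repeat=n)))
    fx = np.apply_along_axis(f, 1, xs)
    coeffs = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            chi_S = xs[:, list(S)].prod(axis=1) if S else np.ones(len(xs))
            coeffs[S] = np.mean(fx * chi_S)
    return coeffs

def rho_sens_exact(f, n, rho):
    """rho-Sens[f] = 1/2 - 1/2 * sum_S rho^|S| * f_hat(S)^2  (exact, small n)."""
    stab = sum(rho ** len(S) * c ** 2 for S, c in fourier_coefficients(f, n).items())
    return 0.5 - 0.5 * stab

dictator = lambda x: x[0]
majority = lambda x: np.sign(np.sum(x))

print(rho_sens_exact(dictator, 5, 0.9))   # (1 - 0.9)/2 = 0.05
print(rho_sens_exact(majority, 5, 0.9))   # ~ 0.085: larger, as the theorem requires
```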
One more social choice detour…
Three candidates A, B, C, ranked by n voters. The societal ranking is produced by holding 3 pairwise elections using some f : {−1,+1}^n → {−1,+1} (a Condorcet election; “independence of irrelevant alternatives”). Condorcet’s Paradox (1785): with f = Majority, one might obtain “A beats B, B beats C, C beats A”! Arrow’s Theorem (1950): the paradox never occurs ⇒ f = ±Dictator. ☹ Kalai’s proof (2002): the same Fourier analysis as in the previous theorem.
Every mathematics talk should contain… a joke, a proof.
Influence of voter j: Inf_j[f] := Pr_x[f(x) ≠ f(x with coordinate j flipped)]. Examples: Inf_j[i-th Dictator] = 1 if j = i, and 0 otherwise; Inf_j[Majority_n] ≈ √(2/(πn)) → 0, for all j. [Picture: flipping the j-th coordinate of x.]
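For small n these influences can be computed exactly by enumeration; a short sketch (my own, with hypothetical helper names):

```python
import itertools
import numpy as np

def influence(f, n, j):
    """Inf_j[f] = Pr_x[ f(x) != f(x with coordinate j flipped) ],
    computed exactly by enumerating all 2^n inputs (small n only)."""
    xs = np.array(list(itertools.product([-1, 1], repeat=n)))
    xs_flip = xs.copy()
    xs_flip[:, j] *= -1
    return np.mean(np.apply_along_axis(f, 1, xs) != np.apply_along_axis(f, 1, xs_flip))

n = 11
dictator = lambda x: x[0]
majority = lambda x: np.sign(np.sum(x))

print([influence(dictator, n, j) for j in (0, 1)])  # [1.0, 0.0]
print(influence(majority, n, 0))                    # 0.246... ~ sqrt(2/(pi*n))
```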
Which balanced f : {−1,+1}^n → {−1,+1} with Influence_j[f] “small” for all 1 ≤ j ≤ n minimizes ρ-Sensitivity[f]? (The “stablest voting rule” problem.)
Majority Is Stablest Conjecture [KKMO’04] (cf. [Guilbaud’52]): if f : {−1,+1}^n → {−1,+1} is balanced, and Influence_j[f] ≤ δ for all 1 ≤ j ≤ n, then ρ-Sens[f] ≥ ρ-Sens[Majority] − ε(δ), where ε(δ) → 0 as δ → 0.
Majority Is Stablest Theorem [MOO’05] (cf. [Guilbaud’52]): if f : {−1,+1}^n → {−1,+1} is balanced, and Influence_j[f] ≤ δ for all 1 ≤ j ≤ n, then ρ-Sens[f] ≥ ρ-Sens[Majority] − ε(δ), where ε(δ) → 0 as δ → 0.
[Plot: ρ-Sens (probability the outcome is affected) versus ρ (quality of the voting machines), with the dictators’ value (1−ρ)/2 marked.]
Majority Is Stablest [KKMO’04 / MOO’05]: if f : {−1,+1}^n → {−1,+1} is balanced, and Influence_j[f] ≤ δ for all 1 ≤ j ≤ n, then ρ-Sens[f] ≥ ρ-Sens[Majority] − ε(δ), where ε(δ) → 0 as δ → 0. (2013: a new proof by De, Mossel, Neeman.)
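A numerical illustration of the theorem’s spirit (my own sketch; the comparison function, depth-5 recursive majority-of-3 on 3^5 = 243 bits, is my choice of a balanced, low-influence rule): Majority comes out less ρ-sensitive.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho_correlated_pair(n, rho, size):
    """rho-correlated pairs of uniform +-1 strings (re-randomize w.p. 1 - rho)."""
    x = rng.choice([-1, 1], size=(size, n))
    fresh = rng.choice([-1, 1], size=(size, n))
    return x, np.where(rng.random((size, n)) < 1 - rho, fresh, x)

def rho_sens(f, n, rho, size=200_000):
    x, x_prime = rho_correlated_pair(n, rho, size)
    return np.mean(f(x) != f(x_prime))

def majority(x):                    # x has shape (size, n), n odd
    return np.sign(x.sum(axis=1))

def recursive_majority3(x):         # n must be a power of 3
    while x.shape[1] > 1:
        x = np.sign(x.reshape(x.shape[0], -1, 3).sum(axis=2))
    return x[:, 0]

n, rho = 3**5, 0.9                  # both functions are balanced, with small influences
print(rho_sens(majority, n, rho))            # ~ 0.14
print(rho_sens(recursive_majority3, n, rho)) # ~ 0.22: more sensitive than Majority
```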
[KKMO’04] motivation: “Majority Is Stablest” is the exact statement needed to show an optimal computational complexity result for the algorithmic task called Maximum-Cut.
Max-Cut. Input: an “almost bipartite” N-vertex graph. Output: the optimal bipartition, i.e. the two-coloring of the vertices minimizing the number of “mistake edges” (edges whose endpoints get the same color).
Max-Cut. “Brute force” algorithm: ≈ 2^N steps. Question: is there an “efficient” (= N^C-step) algorithm? Answer: No. (Assuming “P ≠ NP”; Max-Cut is “NP-hard”.)
Max-Cut, approximate version. Input: an “almost bipartite” N-vertex graph. Output: an approximately optimal bipartition; do your best to keep the fraction of mistake edges small.
Theorem [GLS’88, DP’90, GW’94]: there is an efficient algorithm such that for all ρ ≥ .69, if the input graph is “ρ-bipartite” (meaning the optimal bipartition has ≤ (1−ρ)/2 fraction of mistake edges), then the algorithm outputs a bipartition with fraction of mistake edges ≤ (arccos ρ)/π.
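Here is a sketch of the GW algorithm (my own illustration, not the talk’s; it assumes the cvxpy package with an SDP-capable solver is available, and the function names are mine): solve the Max-Cut SDP relaxation and round with a random hyperplane. The rounding analysis uses the fact that a random hyperplane separates unit vectors u, v with probability arccos(u·v)/π, the same fact as Sheppard’s formula below.

```python
import cvxpy as cp    # assumption: cvxpy + an SDP-capable solver are installed
import numpy as np

rng = np.random.default_rng(0)

def goemans_williamson(edges, n, rounding_trials=50):
    """Max-Cut via the GW SDP relaxation plus random-hyperplane rounding."""
    # SDP relaxation: maximize sum_{(i,j) in E} (1 - X_ij)/2  s.t.  X PSD, X_ii = 1.
    X = cp.Variable((n, n), PSD=True)
    objective = cp.Maximize(sum((1 - X[i, j]) / 2 for i, j in edges))
    cp.Problem(objective, [cp.diag(X) == 1]).solve()

    # Unit vectors v_1..v_n with v_i . v_j ~ X_ij, via an eigendecomposition.
    w, U = np.linalg.eigh(X.value)
    V = U * np.sqrt(np.maximum(w, 0))

    # Random-hyperplane rounding; keep the best of several trials.
    best_cut, best_signs = -1, None
    for _ in range(rounding_trials):
        signs = np.sign(V @ rng.standard_normal(n))
        cut = sum(signs[i] != signs[j] for i, j in edges)
        if cut > best_cut:
            best_cut, best_signs = cut, signs
    return best_cut, best_signs

# Example: a 5-cycle; the optimal bipartition cuts 4 of its 5 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(goemans_williamson(edges, 5)[0])   # typically prints 4
```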
[Plot: fraction of mistake edges versus ρ (how bipartite the input graph is), showing the optimal bipartition’s curve (1−ρ)/2, the GW algorithm’s guarantee, and the previous best efficient algorithm’s guarantee.]
[KKMO’04] Theorem: “Majority Is Stablest” ⇒ it is “UG-hard” (not quite NP-hard: hard assuming the Unique Games Conjecture) to beat GW’s Max-Cut algorithm. Raghavendra ’08 (see also [KKMO’04, Aus’06, Aus’07, OW’07, RS’08]): there is a generic, efficient algorithm A such that for every constraint satisfaction problem M, it is UG-hard to approximate M better than A does.
Proving Majority Is Stablest: enter Gaussian geometry.
[Picture: f = ±1_S for a balanced f : {−1,+1}^n → {−1,+1}, with a ρ-correlated pair (x, x′) drawn on the cube {−1,+1}^n.]
Majority Is Stablest Theorem: if f : {−1,+1}^n → {−1,+1} is balanced, and Influence_j[f] ≤ δ for all 1 ≤ j ≤ n, then ρ-Sens[f] ≥ “ρ-Sens[Majority]” − ε(δ), where ε(δ) → 0 as δ → 0.
Majority, an n-dimensional Boolean function, is the 1-dimensional Gaussian function sgn in disguise! Here sgn : ℝ^1 → {−1,+1} is ±1_S for S = (0, ∞) ⊆ ℝ^1 (note: S has Gaussian volume ½; i.e., sgn is “balanced”). “Gaussian-ρ-Sensitivity”[sgn] := Pr[sgn(z) ≠ sgn(z′)] for a ρ-correlated pair of standard Gaussians (z, z′), and this equals (arccos ρ)/π [exercise; Sheppard 1899].
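A quick Monte Carlo check of Sheppard’s formula (my own sketch): for ρ-correlated standard Gaussians, Pr[sgn(z) ≠ sgn(z′)] matches (arccos ρ)/π.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_rho_sens_sgn(rho, size=2_000_000):
    """Pr[sgn(z) != sgn(z')] for rho-correlated standard Gaussians z, z'."""
    z = rng.standard_normal(size)
    z_prime = rho * z + np.sqrt(1 - rho**2) * rng.standard_normal(size)
    return np.mean(np.sign(z) != np.sign(z_prime))

rho = 0.9
print(gaussian_rho_sens_sgn(rho))   # Monte Carlo estimate
print(np.arccos(rho) / np.pi)       # Sheppard's formula: ~ 0.1436
```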
More generally, for g : ℝ^d → {−1,+1} with g = ±1_S, define Gaussian-ρ-Sens[g] = Pr[g(z) ≠ g(z′)] for a ρ-correlated pair of standard d-dimensional Gaussians (z, z′). The Gaussian function g can be “disguised” by a sequence of (small-influence) Boolean functions f: as n → ∞, ρ-Sens[f] → Gaussian-ρ-Sens[g], and if g is “balanced” (Pr[z ∈ S] = ½) then f is balanced and Influence_j[f] → 0 for all j, exactly the Majority Is Stablest hypotheses. [Picture: a set S ⊆ ℝ^2.]
∴ the Majority Is Stablest Theorem implies: if g : ℝ^d → {−1,+1} is balanced, then Gaussian-ρ-Sens[g] ≥ Gaussian-ρ-Sens[sgn].
∴ the Majority Is Stablest Theorem implies Borell’s Isoperimetric Inequality [Borell ’85] (special case): if S ⊆ ℝ^d has Gaussian volume ½, then Gaussian-ρ-Sens[±1_S] ≥ (arccos ρ)/π, with equality if S is a halfline in ℝ^1, or indeed any halfspace through 0 in ℝ^d. (Letting ρ → 1 recovers the classical Gaussian Isoperimetric Inequality [Borell’74, Sudakov−Tsirelson’74].)
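A small Monte Carlo illustration of Borell’s inequality in ℝ² (my own sketch; the comparison set, the union of two opposite quadrants, is my choice of another volume-½ set): the halfspace attains the minimum and the other set does strictly worse.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_rho_sens(indicator, d, rho, size=1_000_000):
    """Pr[ 1_S(z) != 1_S(z') ] for rho-correlated standard Gaussians z, z' in R^d."""
    z = rng.standard_normal((size, d))
    z_prime = rho * z + np.sqrt(1 - rho**2) * rng.standard_normal((size, d))
    return np.mean(indicator(z) != indicator(z_prime))

rho = 0.9
halfspace = lambda z: z[:, 0] > 0                 # Gaussian volume 1/2
two_quadrants = lambda z: z[:, 0] * z[:, 1] > 0   # also Gaussian volume 1/2

print(gaussian_rho_sens(halfspace, 2, rho))       # ~ arccos(0.9)/pi ~ 0.144 (Borell's minimum)
print(gaussian_rho_sens(two_quadrants, 2, rho))   # ~ 0.25: strictly larger
```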
∴ Majority Is Stablest ⇒ Borell’s Isoperimetric Inequality. Proofs of Borell’s Isoperimetric Inequality: Borell ’85: Gaussian rearrangement, very hard. Beckner ’90: analogue on the sphere by two-point symmetrization, pretty easy; implies the Gaussian version [CL’90]. [KO’12]: volume-½ case, four sentences.
Every mathematics talk should contain… a joke, a proof.
∴ Majority Is Stablest ⇒ Borell’s Isoperimetric Inequality. Proofs of Borell’s Isoperimetric Inequality: Borell ’85: Gaussian rearrangement, very hard. Beckner ’90: analogue on the sphere by two-point symmetrization, pretty easy; implies the Gaussian version [CL’90]. [KO’12]: volume-½ case, four sentences. First proof of Majority Is Stablest: [MOO’05] proved an “Invariance Principle” (a nonlinear CLT) to obtain Borell’s Isoperimetric Inequality ⇒ Majority Is Stablest, whence UG-hardness of beating the GW Max-Cut algorithm.
∴ Majority Is Stablest ⇒ Borell’s Isoperimetric Inequality. Proofs of Borell’s Isoperimetric Inequality: Borell ’85: Gaussian rearrangement, very hard. Beckner ’90: analogue on the sphere by two-point symmetrization, pretty easy; implies the Gaussian version [CL’90]. [MN’12]: semigroup method. [DMN’13]: discrete proof of Majority Is Stablest (hence also of Borell’s Isoperimetric Inequality) by induction on n. [KO’12]: volume-½ case, four sentences. Eldan ’13: stochastic calculus.
Conclusion: the importance of multiple proofs. [MOO] proof of Majority Is Stablest: Invariance Principle, reducing to Gaussian geometry. Advantage: the Invariance Principle is useful elsewhere: social choice, learning theory, computational complexity [Raghavendra’08]. [DMN] proof of Majority Is Stablest: direct induction on n, completely discrete. Advantage: the proof is expressible in the “SOS proof system”, which has algorithmic implications…