Fourier Analysis, Projections, Influence, Junta, Etc…


© S.Safra Boolean Functions and Juntas Def: A Boolean function is a function f: {-1,1}ⁿ → {-1,1}. Def: f is a j-junta if it depends on at most j of its variables.

Functions as an Inner-Product Vector-Space [figure: a function f as a vector of its 2ⁿ values, one coordinate per string in {-1,1}ⁿ]

Functions' Vector-Space. A function f is a vector. Addition: (f+g)(x) = f(x) + g(x). Multiplication by a scalar: (c·f)(x) = c·f(x).

Variables' Influence. The influence of an index i ∈ [n] on a Boolean function f: {-1,1}ⁿ → {-1,1} is Influence_i(f) = Pr_x[f(x) ≠ f(x with its i-th coordinate flipped)].

Norms. Def (expectation norm): ‖f‖₂ = (E_x[f(x)²])^½. Def (sum norm): |f|₂ = (Σ_x f(x)²)^½.

Inner-Product. A function f is a vector. Inner product (normalized): ⟨f,g⟩ = E_x[f(x)·g(x)].

Simple Observations. Claim: for any function f whose range is {-1, 0, 1}, ‖f‖₂² = E[f²] = Pr_x[f(x) ≠ 0].

Monomials. What would be the monomials over x ∈ P([n])? All powers except 0 and 1 disappear! Hence, one for each character S ⊆ [n]: χ_S(x) = Π_{i∈S} x_i. These are all the multiplicative functions.

Fourier-Walsh Transform. Consider all characters χ_S(x) = Π_{i∈S} x_i. Given any function f, let the Fourier-Walsh coefficients of f be f̂(S) = ⟨f, χ_S⟩; thus f can be described as f = Σ_{S⊆[n]} f̂(S)·χ_S.

Fourier Transform: Norm. Norm (sum): Σ_S f̂(S)². Thm [Parseval]: Σ_S f̂(S)² = ‖f‖₂². Hence, for a Boolean f, Σ_S f̂(S)² = 1.
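For small n the transform and Parseval's identity can be verified exhaustively. A minimal Python sketch (the helpers `chi` and `fourier_coeffs` and the 3-bit majority example are illustrative, not part of the slides):

```python
from itertools import combinations, product

def chi(S, x):
    # Character chi_S(x) = prod_{i in S} x_i, for x in {-1,1}^n
    out = 1
    for i in S:
        out *= x[i]
    return out

def fourier_coeffs(f, n):
    # f_hat(S) = <f, chi_S> = E_x[f(x) * chi_S(x)]  (normalized inner product)
    pts = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * chi(S, x) for x in pts) / len(pts)
            for r in range(n + 1) for S in combinations(range(n), r)}

maj3 = lambda x: 1 if sum(x) > 0 else -1   # majority of 3 bits
fh = fourier_coeffs(maj3, 3)

# Parseval: the squared coefficients sum to E[f^2] = 1 for Boolean f
parseval = sum(c * c for c in fh.values())

# f is recovered from its expansion f = sum_S f_hat(S) * chi_S
recon_ok = all(abs(maj3(x) - sum(c * chi(S, x) for S, c in fh.items())) < 1e-9
               for x in product([-1, 1], repeat=3))
```

Majority of three puts weight ¼ on each singleton and ¼ on the full character, so the squares sum to 1 as Parseval requires.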

Variables' Influence. The influence of an index i ∈ [n] on a Boolean function f: {-1,1}ⁿ → {-1,1} is Influence_i(f) = Pr_x[f(x) ≠ f(x with its i-th coordinate flipped)], which can be expressed in terms of the Fourier coefficients of f. Claim: Influence_i(f) = Σ_{S∋i} f̂(S)².
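The two descriptions of influence, the combinatorial one (probability that flipping coordinate i changes f) and the spectral one (Fourier weight of the characters containing i), can be compared directly; a sketch using the same illustrative 3-bit majority example:

```python
from itertools import combinations, product

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def influence(f, n, i):
    # Pr_x[ f(x) != f(x with coordinate i flipped) ]
    pts = list(product([-1, 1], repeat=n))
    flips = 0
    for x in pts:
        y = list(x)
        y[i] = -y[i]
        if f(x) != f(tuple(y)):
            flips += 1
    return flips / len(pts)

def influence_fourier(f, n, i):
    # Sum of f_hat(S)^2 over all S containing i
    pts = list(product([-1, 1], repeat=n))
    total = 0.0
    for r in range(1, n + 1):
        for S in combinations(range(n), r):
            if i in S:
                c = sum(f(x) * chi(S, x) for x in pts) / len(pts)
                total += c * c
    return total

maj3 = lambda x: 1 if sum(x) > 0 else -1
inf_comb = influence(maj3, 3, 0)
inf_four = influence_fourier(maj3, 3, 0)
```

Each variable of maj3 is pivotal exactly when the other two disagree, so both computations give ½.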

Restriction and Average. Def: let I ⊆ [n] and x ∈ P([n]\I); the restriction function f_{I,x}(y), for y ∈ P(I), fixes the coordinates outside I according to x. Def: the average function is A_I f(x) = E_{y∈P(I)}[f(x∪y)].

In Fourier Expansion. Prop: the average function keeps exactly the characters disjoint from I: A_I f = Σ_{S∩I=∅} f̂(S)·χ_S. And since the expectation of a function is its coefficient on the empty character: E[f] = f̂(∅). Corollary: A_I f(x) = E_y[f_{I,x}(y)].

Expectation and Variance. Claim: E[f] = f̂(∅) and E[f²] = Σ_S f̂(S)². Hence, for any f: Var[f] = Σ_{S≠∅} f̂(S)².

Average Sensitivity. Def: the sensitivity of x w.r.t. f is s_f(x) = #{i : f(x) ≠ f(x with its i-th coordinate flipped)}. Def: the average sensitivity of f is as(f) = E_x[s_f(x)].
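Average sensitivity can be computed by brute force as the expected number of pivotal coordinates; a sketch with illustrative examples (a dictatorship has as(f) = 1, majority of three has as(f) = 3/2):

```python
from itertools import product

def avg_sensitivity(f, n):
    # as(f) = E_x[ #{ i : f(x) != f(x with coordinate i flipped) } ]
    pts = list(product([-1, 1], repeat=n))
    total = 0
    for x in pts:
        for i in range(n):
            y = list(x)
            y[i] = -y[i]
            if f(x) != f(tuple(y)):
                total += 1
    return total / len(pts)

maj3 = lambda x: 1 if sum(x) > 0 else -1
dictator = lambda x: x[0]
as_maj3 = avg_sensitivity(maj3, 3)
as_dict = avg_sensitivity(dictator, 3)
```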

When as(f)=1. Def: a balanced function f is s.t. Pr[f=1] = Pr[f=-1], i.e. f̂(∅) = 0. Thm: a balanced, Boolean f s.t. as(f)=1 is a dictatorship. Proof: observe that as(f) = Σ_S |S|·f̂(S)², and since as(f)=1 and f̂(∅)=0, all the weight must lie on characters of size exactly 1; however ‖f‖₂² = 1, hence Σ_i f̂({i})² = 1, i.e. f is linear.

Linear, Boolean Functions. Proof (cont.): pick any x; f(x) ∈ {-1,1}. Pick {i} with a non-zero coefficient. Observe that f(x△{i}) ∈ {-1,1}, however it differs from f(x). Conclusion: all the weight rests on the one singleton {i}, hence f = ±χ_{{i}}, a dictatorship.

Codes and Boolean Functions. Def: an m-bit code is a subset of the set of all m-bit binary strings, C ⊆ {-1,1}^m. The distance of a code C is the minimum, over all pairs of legal words (in C), of the Hamming distance between the two words. A Boolean function over n binary variables is a 2ⁿ-bit string; hence, a set of Boolean functions can be considered a 2ⁿ-bit code.

Hadamard Code. In the Hadamard code the set of legal words consists of all multiplicative (linear, if over {0,1}) functions C = {χ_S | S ⊆ [n]}, namely all characters.

Hadamard Test. Given a Boolean f, choose random x and y; check that f(x)·f(y) = f(x·y) (coordinatewise product; symmetric difference in set notation). Prop (completeness): a legal Hadamard word (a character) always passes this test.
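The completeness claim can be checked exhaustively for a small character, and a non-linear function shows that the test does reject sometimes; a sketch (the examples `chi01` and `maj3` are illustrative):

```python
from itertools import product

def hadamard_pass_rate(f, n):
    # Fraction of pairs (x, y) with f(x) * f(y) == f(x * y),
    # where x * y is the coordinatewise product
    pts = list(product([-1, 1], repeat=n))
    ok = sum(1 for x in pts for y in pts
             if f(x) * f(y) == f(tuple(a * b for a, b in zip(x, y))))
    return ok / (len(pts) ** 2)

chi01 = lambda x: x[0] * x[1]              # a legal Hadamard word (a character)
maj3 = lambda x: 1 if sum(x) > 0 else -1   # not a character
rate_chi = hadamard_pass_rate(chi01, 3)
rate_maj = hadamard_pass_rate(maj3, 3)
```

A character passes every pair, while majority fails a noticeable fraction of pairs.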

Hadamard Test – Soundness. Prop (soundness): Pr_{x,y}[f(x)·f(y) = f(x·y)] = ½ + ½·Σ_S f̂(S)³. Proof: hence, if the test accepts w.p. ½+δ, then Σ_S f̂(S)³ ≥ 2δ, and since Σ_S f̂(S)² = 1, some coefficient satisfies f̂(S) ≥ 2δ, i.e. f is correlated with a character.

Long-Code. In the long code the set of legal words consists of all monotone dictatorships. This is the most extensive binary code, as its bits represent all possible binary values over n elements.

Long-Code. Encoding an element e ∈ [n]: E_e legally encodes an element e if E_e = f_e, the dictatorship of e.

Testing Long-code. Def (a long-code list-test): given a code word f, probe it in a constant number of entries, and: accept almost always if f is a monotone dictatorship; reject w.h.p. if f does not have a sizeable fraction of its Fourier weight concentrated on a small set of variables, that is, if there is no semi-junta J ⊆ [n] on which most of the weight rests. Note: a long-code list-test distinguishes between the case that f is a dictatorship and the case that f is far from a junta.

Motivation – Testing Long-code. Long-code list-tests are essential tools in proving hardness results. Hence finding simple sufficient conditions for a function to be a junta is important.

Perturbation. Def: denote by μ_ε the distribution over all subsets of [n] which assigns probability to a subset x as follows: independently, for each i ∈ [n], let i ∈ x with probability ε and i ∉ x with probability 1-ε.

Long-Code Test. Given a Boolean f, choose random x and y, and choose z ~ μ_ε; check that f(x)·f(y) = f(x△y△z). Prop (completeness): a legal long-code word (a dictatorship) passes this test w.p. 1-ε.
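Completeness of the noisy test can be simulated: a dictatorship f(x) = x_i accepts exactly when the noise leaves coordinate i alone, i.e. w.p. 1-ε. A seeded Monte Carlo sketch (vector encoding: z_i = -1 means i is in the noise set; names and parameters are illustrative):

```python
import random

def longcode_pass_rate(f, n, eps, trials, seed=0):
    # Check f(x) * f(y) == f(x*y*z), with z_i = -1 independently w.p. eps
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        x = tuple(rng.choice([-1, 1]) for _ in range(n))
        y = tuple(rng.choice([-1, 1]) for _ in range(n))
        z = tuple(-1 if rng.random() < eps else 1 for _ in range(n))
        w = tuple(a * b * c for a, b, c in zip(x, y, z))
        if f(x) * f(y) == f(w):
            ok += 1
    return ok / trials

dictator = lambda x: x[0]
rate = longcode_pass_rate(dictator, 5, eps=0.1, trials=4000)
```

For the dictatorship, f(x)·f(y)·f(x·y·z) = z_0, so the empirical pass rate should sit near 1-ε = 0.9.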

Long-code Test – Soundness. Prop (soundness): Pr[f(x)·f(y) = f(x△y△z)] = ½ + ½·Σ_S (1-2ε)^{|S|}·f̂(S)³. Proof: hence, high acceptance probability forces a large coefficient on a small character, since the factor (1-2ε)^{|S|} kills the contribution of large characters.

Variation. Def: the variation of f over a subset I (an extension of influence): Variation_I(f) = ‖f − A_I f‖₂². Prop: the following are equivalent definitions of the variation: Variation_I(f) = Σ_{S∩I≠∅} f̂(S)².

Proof. Recall A_I f = Σ_{S∩I=∅} f̂(S)·χ_S; therefore f − A_I f = Σ_{S∩I≠∅} f̂(S)·χ_S.

Proof – Cont. Recall f − A_I f = Σ_{S∩I≠∅} f̂(S)·χ_S; therefore (by Parseval): ‖f − A_I f‖₂² = Σ_{S∩I≠∅} f̂(S)².

High vs. Low Frequencies. Def: the section of a function f above k is f^{>k} = Σ_{|S|>k} f̂(S)·χ_S, and the low-frequency portion is f^{≤k} = Σ_{|S|≤k} f̂(S)·χ_S.

Junta Test. Def: a junta test is as follows: a distribution over l-tuples of queries, and for each l-tuple a local test that either accepts or rejects, T[x¹, …, x^l]: {-1,1}^l → {T,F}, s.t. for a j-junta f the test accepts w.h.p., whereas for any f which is not an (ε,j)-junta it rejects w.h.p.

Fourier Representation of Influence. Proof: consider the I-average function, which in Fourier representation is A_I f = Σ_{S∩I=∅} f̂(S)·χ_S, and apply Parseval.

Fourier Representation of Influence. Proof: consider the influence function f − A_I f, which in Fourier representation is Σ_{S∩I≠∅} f̂(S)·χ_S, and apply Parseval.

Subsets' Influence. Def: the variation of a subset I ⊆ [n] on a Boolean function f is Variation_I(f) = Σ_{S∩I≠∅} f̂(S)², and the low-frequency variation is Variation_I^{≤k}(f) = Σ_{S∩I≠∅, |S|≤k} f̂(S)².

Independence-Test. The I-independence-test on a Boolean function f is: choose x and y that agree outside I and are uniform and independent on I; accept iff f(x) = f(y). Lemma: Pr[reject] = ½·Variation_I(f).
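The lemma (in the form Pr[reject] = ½·Variation_I(f), which follows because resampling I decorrelates exactly the characters meeting I) can be verified exhaustively; a sketch with the illustrative majority example:

```python
from itertools import combinations, product

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def variation(f, n, I):
    # Variation_I(f) = sum of f_hat(S)^2 over S intersecting I
    pts = list(product([-1, 1], repeat=n))
    total = 0.0
    for r in range(1, n + 1):
        for S in combinations(range(n), r):
            if set(S) & set(I):
                c = sum(f(x) * chi(S, x) for x in pts) / len(pts)
                total += c * c
    return total

def independence_reject_rate(f, n, I):
    # Pr over x and a fresh reassignment z of the coordinates in I
    pts = list(product([-1, 1], repeat=n))
    zs = list(product([-1, 1], repeat=len(I)))
    bad = 0
    for x in pts:
        for z in zs:
            y = list(x)
            for i, v in zip(I, z):
                y[i] = v
            if f(x) != f(tuple(y)):
                bad += 1
    return bad / (len(pts) * len(zs))

maj3 = lambda x: 1 if sum(x) > 0 else -1
rej1, var1 = independence_reject_rate(maj3, 3, (0,)), variation(maj3, 3, (0,))
rej2, var2 = independence_reject_rate(maj3, 3, (0, 1)), variation(maj3, 3, (0, 1))
```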


Junta Test. The junta-size test JT on a Boolean function f: randomly partition [n] into I₁, …, I_r; run the independence-test t times on each I_h; accept if at most j of the I_h fail their independence-tests. For r ≫ j² and t ≫ j²/ε.
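The test above can be sketched end-to-end; a seeded toy run on n = 8 (the parameters r = 4, t = 20 and the dictatorship/parity examples are illustrative, far smaller than the r ≫ j², t ≫ j²/ε the analysis needs):

```python
import random
from math import prod

def independence_rejects(f, n, I, rng):
    # One run: rerandomize the coordinates in I; did f change value?
    x = [rng.choice([-1, 1]) for _ in range(n)]
    y = list(x)
    for i in I:
        y[i] = rng.choice([-1, 1])
    return f(tuple(x)) != f(tuple(y))

def junta_test(f, n, j, r, t, seed=0):
    # Random partition into r parts; accept if at most j parts ever fail
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    parts = [idx[h::r] for h in range(r)]
    failed = sum(1 for I in parts
                 if any(independence_rejects(f, n, I, rng) for _ in range(t)))
    return failed <= j

dictator = lambda x: x[0]   # a 1-junta: at most one part can ever fail
parity = lambda x: prod(x)  # depends on all n coordinates
accepts_dictator = junta_test(dictator, 8, j=1, r=4, t=20)
accepts_parity = junta_test(parity, 8, j=1, r=4, t=20)
```

For the dictatorship only the part containing its variable can fail, so it is accepted; for parity every part fails its runs with overwhelming probability, so it is rejected.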

Completeness. Lemma: a j-junta f passes the test w.h.p. Proof: only those sets which contain an index of the junta could fail the independence-test, and there are at most j such sets.

Soundness. Lemma: if the test accepts w.h.p., then f is close to a junta. Proof: assume the premise. Fix δ ≪ 1/t and let J = {i ∈ [n] : Influence_i(f) > δ}.

|J| ≤ j. Prop: r ≫ j implies |J| ≤ j. Proof: otherwise, J spreads among the I_h w.h.p., and for any I_h s.t. I_h ∩ J ≠ ∅ it must be that Variation_{I_h}(f) > δ; hence more than j parts would fail their independence-tests.

High Frequencies Contribute Little. Prop: k ≫ r·log r implies the Fourier weight above k is small. Proof: a character S of size larger than k spreads w.h.p. over all parts I_h, hence contributes to the variation of all parts. If such characters were heavy (> ε/4), then surely there would be more than j parts I_h that fail the t independence-tests.

Almost all Weight is on J. Lemma: almost all the Fourier weight of f rests on characters contained in J. Proof: assume, by way of contradiction, otherwise. For a random partition, w.h.p. (Chernoff bound) the leftover weight spreads over every part; but then the variation of every I_h would be ≥ ε/100rk, and too many parts would fail.

Find the Close Junta. Now, since almost all the weight is on J, consider the (non-Boolean) average over the coordinates outside J, which, if rounded, is Boolean, depends only on J, and is not more than ε-far from f.

Product, Biased Distribution. Consider the q-biased product distribution μ_q. Def: the probability of a subset F is μ_q(F) = q^{|F|}·(1-q)^{n-|F|}, and for a family of subsets 𝓕, μ_q(𝓕) = Σ_{F∈𝓕} μ_q(F).

Beckner/Nelson/Bonami Inequality. Def: let T_δ be the following operator on any f: T_δ f = Σ_S δ^{|S|}·f̂(S)·χ_S. Prop: for δ ≤ 1, ‖T_δ f‖₂ ≤ ‖f‖₂. Proof: immediate from Parseval.

Beckner/Nelson/Bonami Inequality. Def: let T_δ be the operator T_δ f = Σ_S δ^{|S|}·f̂(S)·χ_S. Thm: for any p ≥ r and δ ≤ ((r-1)/(p-1))^½, ‖T_δ f‖_p ≤ ‖f‖_r.

Beckner/Nelson/Bonami Corollary. Corollary 1: for any real f and 2 ≥ r ≥ 1, ‖T_{(r-1)^½} f‖₂ ≤ ‖f‖_r. Corollary 2: for real f of degree at most k and r > 2, ‖f‖_r ≤ (r-1)^{k/2}·‖f‖₂.

Average Sensitivity. The sum of the variables' influence is referred to as the average sensitivity: as(f) = Σ_i Influence_i(f), which can be expressed by the Fourier coefficients as as(f) = Σ_S |S|·f̂(S)².

Friedgut's Theorem. Thm: any Boolean f is an [ε,j]-junta for j = 2^{O(as(f)/ε)}. Proof: 1. Specify the junta J. 2. Show the complement of J has little influence.

Specify the Junta. Set k = Θ(as(f)/ε) and δ = 2^{-Θ(k)}, and let J = {i : Influence_i(f) ≥ δ}. We'll prove that the complement of J has variation at most ε; hence, J is an [ε,j]-junta, and |J| = 2^{O(k)}.

High Frequencies Contribute Little. Prop: Σ_{|S|>k} f̂(S)² ≤ as(f)/k ≤ ε/4. Proof: a character S of size larger than k contributes at least k times the square of its coefficient to the average sensitivity; if such characters were heavy (> ε/4), as(f) would have been large.

Altogether. Lemma: the weight on characters not contained in J is at most ε. Proof: split into high frequencies (small by the proposition) and low frequencies meeting [n]\J (small since every index outside J has influence < δ).


Biased μ_q-Influence. The μ_q-influence of an index i ∈ [n] on a Boolean function f: P([n]) → {-1,1} is Influence_i^q(f) = Pr_{x~μ_q}[f(x) ≠ f(x△{i})].

Biased Walsh Product. The usual Fourier basis {χ_S} is not orthogonal with respect to the biased inner product ⟨f,g⟩_q = E_{μ_q}[f·g]; hence, we use the biased Walsh product basis instead.

Thm [Margulis-Russo]: for monotone f, d/dq μ_q(f) = as_q(f). Hence: Lemma: for monotone f and δ > 0, there exists q ∈ [p, p+δ] s.t. as_q(f) ≤ 1/δ. Proof: otherwise μ_{p+δ}(f) > 1.
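The Margulis-Russo identity (the q-derivative of the acceptance probability of a monotone f equals its total μ_q-influence) can be checked numerically; a sketch on monotone majority-of-3 over {0,1}³ (the names and the numeric-derivative step size are illustrative):

```python
from itertools import product

def pr_true(f, n, q):
    # mu_q-probability that the monotone f accepts
    total = 0.0
    for x in product([0, 1], repeat=n):
        if f(x):
            total += q ** sum(x) * (1 - q) ** (n - sum(x))
    return total

def sum_influences(f, n, q):
    # Sum over i of Pr_{mu_q}[ flipping coordinate i changes f ]
    total = 0.0
    for x in product([0, 1], repeat=n):
        p = q ** sum(x) * (1 - q) ** (n - sum(x))
        for i in range(n):
            y = list(x)
            y[i] = 1 - y[i]
            if f(x) != f(tuple(y)):
                total += p
    return total

maj3 = lambda x: sum(x) >= 2   # monotone
q, h = 0.3, 1e-6
deriv = (pr_true(maj3, 3, q + h) - pr_true(maj3, 3, q - h)) / (2 * h)
total_inf = sum_influences(maj3, 3, q)
```

For maj3 each coordinate is pivotal when the other two differ, so both sides equal 3·2q(1-q) = 6q(1-q).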

Proof [Margulis-Russo]:

Influential People and Issues. The theory of the influence of variables on Boolean functions [BL, KKL] and related issues was introduced to tackle social-choice problems, and has motivated a magnificent sequence of works related to economics [K], percolation [BKS], and hardness of approximation [DS], revolving around the Fourier/Walsh analysis of Boolean functions… And the real important question:

Where to go for Dinner? Who has suggestions? Each casts their vote in an (electronic) envelope, and the system decides, not necessarily according to majority… It turns out someone, in the Florida wing, has the power to flip some votes. Power ≈ influence.

Voting Systems. n agents, each voting either "for" (T) or "against" (F); a Boolean function f over the n variables is the outcome. The values of the agents (variables) may each, independently, flip with some probability. It turns out: one cannot design an f that would be robust to such noise, that is, would, on average, change value w.p. < O(1), unless taking into account only very few of the votes.

Noise-Sensitivity. How often does the value of f change when the input is perturbed? [figure: input x, noise subset I, replacement z]

© S.Safra Def( ,p,x [n] ): Let 0< <1, and x  P([n]). Then y~ ,p,x, if y = (x\I)  z where Def( ,p,x [n] ): Let 0< <1, and x  P([n]). Then y~ ,p,x, if y = (x\I)  z where I~  [n] is a noise subset, and I~  [n] is a noise subset, and z~  p I is a replacement. z~  p I is a replacement. Def( -noise-sensitivity): let 0< <1, then [ When p=½ equivalent to flipping each coordinate in x independently w.p. /2.] [n] x I I z Noise-Sensitivity

Noise-Sensitivity – Cont. Advantage: very efficiently testable (using only two queries) by a perturbation-test. Def (perturbation-test): choose x ~ μ_p and y ~ (ε,p,x); check whether f(x) = f(y). The success probability is determined by the noise-sensitivity of f. Prop: for p = ½, the ε-noise-sensitivity is given by ns_ε(f) = ½ − ½·Σ_S (1−ε)^{|S|}·f̂(S)².
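The proposition (for p = ½, where the noise flips each coordinate independently w.p. ε/2) can be verified by exact enumeration over all noise patterns; a sketch on the illustrative majority example:

```python
from itertools import combinations, product

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

def ns_direct(f, n, delta):
    # Pr[f(x) != f(y)], y flipping each coordinate of x independently w.p. delta
    pts = list(product([-1, 1], repeat=n))
    total = 0.0
    for x in pts:
        for flips in product([0, 1], repeat=n):
            p = 1.0
            y = list(x)
            for i, b in enumerate(flips):
                p *= delta if b else (1 - delta)
                if b:
                    y[i] = -y[i]
            if f(x) != f(tuple(y)):
                total += p / len(pts)
    return total

def ns_fourier(f, n, eps):
    # 1/2 - 1/2 * sum_S (1-eps)^{|S|} f_hat(S)^2, with flip prob eps/2
    pts = list(product([-1, 1], repeat=n))
    s = 0.0
    for r in range(n + 1):
        for S in combinations(range(n), r):
            c = sum(f(x) * chi(S, x) for x in pts) / len(pts)
            s += (1 - eps) ** r * c * c
    return 0.5 - 0.5 * s

maj3 = lambda x: 1 if sum(x) > 0 else -1
eps = 0.2
ns_d = ns_direct(maj3, 3, eps / 2)
ns_f = ns_fourier(maj3, 3, eps)
```

The match follows since E[x_i·y_i] = 1−ε, so each character of size |S| decays by (1−ε)^{|S|}.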

Relation between Parameters. Prop: small ns ⇒ small high-frequency weight. Proof: ns_ε(f) ≥ ½·(1−(1−ε)^k)·Σ_{|S|>k} f̂(S)²; therefore, if ns is small, the high frequencies must have small weight. Prop: small as ⇒ small high-frequency weight. Proof: Σ_{|S|>k} f̂(S)² ≤ as(f)/k.

Main Result. Theorem: ∃ constant γ > 0 s.t. any Boolean function f: P([n]) → {-1,1} with small enough noise-sensitivity is an [ε,j]-junta for j = O(ε⁻²·k³·γ^{2k}). Corollary: fix a p-biased distribution μ_p over P([n]); let λ > 0 be any parameter; set k = log_{1-λ}(½). Then ∃ constant γ > 0 s.t. any Boolean function f: P([n]) → {-1,1} satisfying the premise is an [ε,j]-junta for j = O(ε⁻²·k³·γ^{2k}).

First Attempt: Following Friedgut's Proof. Thm: any Boolean function f is an [ε,j]-junta for j = 2^{O(k)}. Proof: 1. Specify the junta J = {i : Influence_i(f) ≥ δ}, where k = O(as(f)/ε) and δ = 2^{-O(k)}. 2. Show the complement of J has small variation.

If k Were 1. Easy case (!?!): if we had a bound on the non-linear weight, we would be done. The linear part is a set of independent characters (the singletons); in order for those to hit close to 1 or -1 most of the time, they must avoid the Law of Large Numbers, namely be almost entirely placed on one singleton [by a Chernoff-like bound]. Thm [FKN, ext.]: assume f is close to linear; then f is close to shallow (a constant function or a dictatorship).

How to Deal with Dependency between Characters. Recall the theorem's premise. Idea: partition [n]\J into I₁, …, I_r, for r ≫ k. W.h.p. f_I[x] is close to linear (low-frequency characters intersect each I in expectedly at most one element, while the high-frequency weight is low).

Shallow Functions. Def: a function f is linear if only singletons have non-zero weight. Def: a function f is shallow if f is either a constant or a dictatorship. Claim: Boolean linear functions are shallow. [figure: Fourier weight as a function of character size, 0 … k … n]

Boolean Linear ⇒ Shallow. Claim: Boolean linear functions are shallow. Proof: let f be a Boolean linear function; we next show: 1. ∃{i₀} s.t. almost all the weight rests on f̂({i₀}); 2. and conclude that either f is constant or f = ±χ_{{i₀}}, i.e. f is shallow.

Claim 1. Claim 1: let f be a Boolean linear function; then ∃{i₀} carrying essentially all the weight. Proof: w.l.o.g. assume f̂({1}) and f̂({2}) are the two largest coefficients. For any z ⊆ {3,…,n} consider x₀₀ = z, x₁₀ = z∪{1}, x₀₁ = z∪{2}, x₁₁ = z∪{1,2}; then among the four values f(x₀₀), f(x₁₀), f(x₀₁), f(x₁₁), the next value must be far from {-1,1}. A contradiction (f is Boolean)! Therefore only one singleton can carry non-negligible weight.

Claim 1 (cont.). Proof: w.l.o.g. assume f̂({1}) and f̂({2}) are the two largest coefficients. For any z ⊆ {3,…,n} consider x₀₀ = z, x₁₀ = z∪{1}, x₀₁ = z∪{2}, x₁₁ = z∪{1,2}; but this is impossible, as f(x₀₀), f(x₁₀), f(x₀₁), f(x₁₁) ∈ {-1,1}, hence their distances cannot all be > 0! Therefore only one singleton can carry non-negligible weight.

Claim 2. Claim 2: let f be a Boolean function whose weight rests on f̂(∅) and f̂({i₀}); then either f is constant or f = ±χ_{{i₀}}. Proof: consider f̂(∅) and f̂({i₀}): f takes the two values f̂(∅) ± f̂({i₀}); but f is Boolean, hence both values lie in {-1,1}; therefore one of f̂(∅), f̂({i₀}) is 0 and the other is ±1.

Linearity and Dictatorship. Prop: let f be a balanced linear Boolean function; then f is a dictatorship. Proof: f̂(∅) ± f̂({i₀}) ∈ {-1,1} and, by balance, f̂(∅) = 0; hence f̂({i₀}) = ±1. Prop: let f be a balanced Boolean function s.t. as(f) = 1; then f is a dictatorship. Proof: as(f) = Σ_S |S|·f̂(S)² = 1, but f is balanced (i.e. f̂(∅) = 0), therefore f is also linear.

Proving FKN: Almost-Linear ⇒ Close to Shallow. Theorem: let f: P([n]) → ℝ be linear and close to Boolean; let i₀ be the index s.t. |f̂({i₀})| is maximal; then the weight on the remaining singletons is small. Note: f is linear, hence w.l.o.g. assume i₀ = 1; then all we need to show is that Σ_{i>1} f̂({i})² is small. We show that in the following claim and lemma.

Corollary. Corollary: let f be linear and close to Boolean; then ∃ a shallow Boolean function g s.t. f is close to g. Proof: let l be the linear function keeping only f̂(∅) and f̂({i₀}), and let g be the Boolean function closest to l. Then f is close to g: the distance from f to l is small (by the theorem), and additionally the distance from l to g is small, since l is close to f, which is close to Boolean.

Claim 1. Claim 1: let f be linear; w.l.o.g. assume f̂({1}) is maximal; then ∃ a global constant c = min{p, 1-p} bounding the weight of each remaining singleton. [figure: the characters {}, {1}, {2}, …, {n}, {1,2}, …, {1,…,n}, each of weight no more than c]

Proof of Claim 1. Proof: assume two singletons carry large weight. For any z ⊆ {3,…,n}, consider x₀₀ = z, x₁₀ = z∪{1}, x₀₁ = z∪{2}, x₁₁ = z∪{1,2}; then the next value must be far from {-1,1}. A contradiction (to f being close to Boolean)!

Where to go for Dinner? Who has suggestions? Each casts their vote in an (electronic) envelope, and the system decides, not necessarily according to majority… It turns out someone, in the Florida wing, has the power to flip some votes. Power ≈ influence. Of course they'll have to discuss it over dinner…