Analysis of Boolean Functions, and Complexity Theory, Economics, Combinatorics, etc. Slides prepared with the help of Ricky Rosen.


Introduction
Objectives: to introduce the Analysis of Boolean Functions and some of its applications.
Overview: basic definitions; first-passage percolation; mechanism design; graph properties; learning functions; and more…

Influential People
The theory of the influence of variables on Boolean functions [KKL, BL, R, M] was introduced to tackle social-choice problems and distributed computing. It has motivated a magnificent body of work, related to sharp thresholds [F, FK], percolation [BKS], economics (Arrow's Theorem [K]), and hardness of approximation [DS], all utilizing harmonic analysis of Boolean functions. And the real important question:

Where to go for Dinner?
The alternatives: diners would cast their votes in an (electronic) envelope, and the system would decide, not necessarily according to majority… And what if someone (in Florida?) can flip some votes? Power = influence.


Boolean Functions
Def: a Boolean function f: {-1,1}^n → {-1,1}. Equivalently, the domain may be viewed as the power set of [n]: choosing the locations of the -1s is the same as choosing a sequence of -1s and 1s.

Noise Sensitivity
The values of the variables may each, independently, flip with some probability ε. It turns out: one cannot design an f that would be robust to such noise, that is, would on average change value w.p. < O(1), unless the outcome is determined according to very few of the voters.

Voting and Influence
Def: the influence of i on f is the probability, over a random input x, that f changes its value when coordinate i is flipped.

Majority: {1,-1}^n → {1,-1}
The influence of i on Majority is the probability, over a random input x, that Majority changes when i is flipped; this happens when half of the other n-1 coordinates (people) vote -1 and half vote 1, i.e., influence_i(Majority) = C(n-1, (n-1)/2) / 2^(n-1) = Θ(1/√n).

Parity: {1,-1}^n → {1,-1}
Flipping any single coordinate always changes the value of Parity.

Dictatorship_i: {1,-1}^n → {1,-1}, Dictatorship_i(x) = x_i
The influence of i on Dictatorship_i is 1; the influence of any j ≠ i on Dictatorship_i is 0.

Average Sensitivity (Total Influence)
Def: the average sensitivity of f, as(f), is the sum of the influences of all coordinates i ∈ [n]: as(f) = Σ_i influence_i(f).
as(Majority) = O(n^½); as(Parity) = n; as(Dictatorship) = 1.
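The definitions above can be checked by brute force for small n. The following is a minimal sketch (not from the slides; function names are mine) that enumerates all 2^n inputs and confirms as(Parity) = n, as(Dictatorship) = 1, and the √n-scale value for Majority:

```python
from itertools import product
import math

def influence(f, i, n):
    # Pr over uniform x in {-1,1}^n that f changes value when coordinate i flips
    flips = 0
    for x in product([-1, 1], repeat=n):
        y = x[:i] + (-x[i],) + x[i + 1:]
        flips += f(x) != f(y)
    return flips / 2 ** n

def avg_sensitivity(f, n):
    return sum(influence(f, i, n) for i in range(n))

n = 5  # odd, so Majority has no ties
majority = lambda x: 1 if sum(x) > 0 else -1
parity = lambda x: math.prod(x)
dictator = lambda x: x[0]

print(avg_sensitivity(parity, n))     # 5.0   (as(Parity) = n)
print(avg_sensitivity(dictator, n))   # 1.0   (as(Dictatorship) = 1)
print(avg_sensitivity(majority, n))   # 1.875 = 5 * C(4,2) / 2^4
```

For Majority on 5 voters, each voter is pivotal exactly when the other four split 2-2, i.e. with probability 6/16, matching the printed value.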

Example: Majority on n = 3
What is its average sensitivity? Each of the three coordinates has influence ½ (a voter is pivotal exactly when the other two split), so as = ½ + ½ + ½ = 1.5.

When as(f) = 1
Def: f is a balanced function if it equals -1 on exactly half of the inputs: E_x[f(x)] = 0.
Can a balanced f have as(f) < 1? What about as(f) = 1, besides dictatorships?
Prop: f is balanced and as(f) = 1 ⟹ f is a dictatorship.

Representing f as a Polynomial
What would be the monomials over x ∈ P([n])? Since x_i² = 1, all powers except 0 and 1 cancel out; hence, there is one monomial for each character S ⊆ [n]: χ_S(x) = Π_{i∈S} x_i. These are all the multiplicative functions.

Fourier-Walsh Transform
Consider all characters χ_S(x) = Π_{i∈S} x_i. Given any function f, let the Fourier-Walsh coefficients of f be f̂(S) = E_x[f(x)·χ_S(x)]; thus f can be described as f = Σ_S f̂(S)·χ_S.

Norms
Def (expectation norm on the function): ‖f‖₂ = (E_x[f(x)²])^½.
Def (summation norm on the transform): ‖f̂‖₂ = (Σ_S f̂(S)²)^½.
Thm [Parseval]: ‖f‖₂ = ‖f̂‖₂. Hence, for a Boolean f, Σ_S f̂(S)² = 1.
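A small sketch (mine, not from the slides) that computes the Fourier-Walsh coefficients by the defining formula and checks Parseval on Majority of 3 variables, whose expansion is Maj₃ = (x₀ + x₁ + x₂)/2 − x₀x₁x₂/2:

```python
from itertools import product, combinations
import math

def fourier_coefficients(f, n):
    # f_hat(S) = E_x[f(x) * chi_S(x)] over the uniform distribution on {-1,1}^n
    xs = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * math.prod(x[i] for i in S) for x in xs) / 2 ** n
            for k in range(n + 1) for S in combinations(range(n), k)}

n = 3
majority = lambda x: 1 if sum(x) > 0 else -1
coeffs = fourier_coefficients(majority, n)
print(coeffs[(0,)], coeffs[(0, 1, 2)])        # 0.5 -0.5
# Parseval for a Boolean function: total squared Fourier weight is 1
print(sum(c * c for c in coeffs.values()))    # 1.0
```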

Distribution over Characters
We may think of the transform as defining a distribution over the characters: S is drawn with probability f̂(S)² (for a Boolean f these sum to 1, by Parseval).

Simple Observations
Claim: for any function f whose range is {-1,0,1}: Σ_S f̂(S)² = E_x[f(x)²] = Pr_x[f(x) ≠ 0].

Variables' Influence
Recall: the influence of an index i ∈ [n] on a Boolean function f: {1,-1}^n → {1,-1} is the probability, over a random input x, that f changes its value when i is flipped. This can be expressed in terms of the Fourier coefficients of f. Claim: influence_i(f) = Σ_{S∋i} f̂(S)². And the average sensitivity: as(f) = Σ_S |S|·f̂(S)².
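The two claims can be verified numerically for a small example. A self-contained sketch (helper names are mine), comparing the combinatorial definition of influence with the spectral formula for Maj₃:

```python
from itertools import product, combinations
import math

def fourier_coefficients(f, n):
    xs = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * math.prod(x[i] for i in S) for x in xs) / 2 ** n
            for k in range(n + 1) for S in combinations(range(n), k)}

def influence(f, i, n):
    # combinatorial definition: probability that flipping i changes f
    return sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:])
               for x in product([-1, 1], repeat=n)) / 2 ** n

n = 3
maj = lambda x: 1 if sum(x) > 0 else -1
coeffs = fourier_coefficients(maj, n)
for i in range(n):
    spectral = sum(c * c for S, c in coeffs.items() if i in S)
    print(i, influence(maj, i, n), spectral)      # both are 0.5 for Maj_3
as_spectral = sum(len(S) * c * c for S, c in coeffs.items())
print(as_spectral)                                # 1.5 = as(Maj_3)
```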

Fourier Representation of Influence
Proof: consider the influence function f_i(x) = (f(x) − f(x with coordinate i flipped))/2, which in Fourier representation is f_i = Σ_{S∋i} f̂(S)·χ_S, and hence influence_i(f) = ‖f_i‖₂² = Σ_{S∋i} f̂(S)².

Balanced f s.t. as(f) = 1 is a Dictatorship
Since f is balanced, f̂(∅) = 0. If ∃ S s.t. |S| > 1 and f̂(S) ≠ 0, then as(f) = Σ_S |S|·f̂(S)² > 1; so f is linear: f = Σ_i f̂({i})·x_i. For any i s.t. f̂({i}) ≠ 0, flipping x_i changes only that one term; since f is Boolean, this forces f = ±x_i, a dictatorship.

Expectation and Variance
Claim: E[f] = f̂(∅) and Var[f] = Σ_{S≠∅} f̂(S)². Hence, for any Boolean f, Var[f] = 1 − f̂(∅)².

First-Passage Percolation [BKS]
Each edge costs a with probability ½ and b with probability ½.

First-Passage Percolation
Consider the grid ℤ². For each edge e, choose independently w_e = 1 or w_e = 2, each with probability ½. This induces a shortest-path metric on ℤ².
Thm [BKS]: the variance of the shortest path from the origin to a vertex v is bounded from above by O(|v| / log |v|).
Proof idea: the average sensitivity of the shortest-path function is bounded by that term.
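This setting is easy to simulate. A sketch of mine (finite L x L grid instead of ℤ², corner-to-corner distance instead of origin-to-v) that samples random {1,2} edge weights, runs Dijkstra, and estimates the mean and variance of the passage time:

```python
import heapq
import random
import statistics

def grid_distance(L, rng):
    """Shortest-path distance from (0,0) to (L-1,L-1) on the L x L grid,
    with each edge weight drawn independently from {1, 2}."""
    weight = {}  # undirected edge -> sampled weight (lazily drawn)
    def w(u, v):
        e = (min(u, v), max(u, v))
        if e not in weight:
            weight[e] = rng.choice((1, 2))
        return weight[e]
    dist = {(0, 0): 0}
    pq = [(0, (0, 0))]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == (L - 1, L - 1):
            return d
        if d > dist[(x, y)]:
            continue
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < L and 0 <= ny < L:
                nd = d + w((x, y), (nx, ny))
                if nd < dist.get((nx, ny), float("inf")):
                    dist[(nx, ny)] = nd
                    heapq.heappush(pq, (nd, (nx, ny)))

rng = random.Random(0)
samples = [grid_distance(10, rng) for _ in range(200)]
print(statistics.mean(samples), statistics.variance(samples))
```

On the 10 x 10 grid every corner-to-corner path uses at least 18 edges, so each sample lies in [18, 36]; the empirical variance comes out far smaller than that range, in the spirit of the BKS bound.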

Proof Outline
Let G denote the grid, and let SP_G be the shortest path in G from the origin to v. Let G^e denote the grid which differs from G only on w_e, i.e., flip the value of e in G. Set …

Observation
If e participates in a shortest path, then flipping its value will increase or decrease the SP by 1; if e is not in the SP, the SP will not change.

Proof, cont.
And by [KKL] there is at least one variable whose influence is at least Ω(log n / n).

Graph Properties
Def: a graph property is a subset of graphs invariant under isomorphism.
Def: a monotone graph property is a graph property P s.t. if P(G), then for every super-graph H of G (namely, a graph on the same set of vertices, which contains all edges of G), P(H) as well.
P is in fact a Boolean function: P: {-1,1}^(V choose 2) → {-1,1}.

Examples of graph properties: G is connected; G is Hamiltonian; G contains a clique of size t; G is not planar; the clique number of G is larger than that of its complement; the diameter of G is at most s; etc.
What is the influence of different edges e on P?

Erdős–Rényi G(n,p) Graphs
The Erdős–Rényi distribution of random graphs: put an edge between any two vertices with probability p, independently.

Definitions
P: a graph property.
μ_p(P): the probability that a random graph on n vertices with edge probability p satisfies P.
G ∼ G(n,p): G is a random graph on n vertices with edge probability p.

Example: Max Clique
Consider G ∼ G(n,p) (n vertices, edge probability p). The size of the interval of probabilities p for which the clique number of G is almost surely k (where k ≈ log n) is of order log⁻¹ n. The threshold interval: the transition between clique numbers k-1 and k.

The probability of having a clique of size k is 1-ε, while the probability of having a (k+1)-clique is still small (≈ log⁻¹ n). The value of p must increase by c·log⁻¹ n before the probability of having a (k+1)-clique reaches ε and another transition interval begins.

Def: Sharp Threshold
A sharp threshold in a monotone graph property: the transition from the property being very unlikely ("G does not satisfy P") to it being very likely ("G satisfies P") is very swift.

Thm [FK]: every monotone graph property has a sharp threshold
Let P be any monotone property of graphs on n vertices. If μ_p(P) > ε, then μ_q(P) > 1-ε for q = p + c₁·log(1/2ε)/log n.
Proof idea: show that as_{p'}(P), for p' > p, is high.
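The sharp-threshold phenomenon is visible even in small simulations. A sketch of mine for the monotone property "G is connected", whose threshold is known to sit at p = ln(n)/n; sweeping a constant c in p = c·ln(n)/n shows μ_p jump from near 0 to near 1:

```python
import math
import random

def is_connected(n, p, rng):
    # sample G(n, p) and test connectivity by DFS from vertex 0
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n

def mu_p(n, p, trials, rng):
    # Monte-Carlo estimate of the probability the property holds
    return sum(is_connected(n, p, rng) for _ in range(trials)) / trials

rng = random.Random(1)
n = 200
for c in (0.5, 1.0, 1.5, 2.0):
    p = c * math.log(n) / n
    print(c, mu_p(n, p, 50, rng))
# mu_p climbs from near 0 (c < 1) to near 1 (c > 1)
```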

Thm [Margulis-Russo]: for a monotone f, dμ_p(f)/dp = as_p(f), the p-biased average sensitivity.

Proof [Margulis-Russo]:

Mechanism Design Problem
N agents; each agent i has a private input t_i ∈ T. All other information is public knowledge. Each agent i has a valuation for all items, and each agent wishes to optimize her own utility.
Objective: minimize the objective function, the total payment.
Means: a protocol between the agents and the auctioneer.

Vickrey-Clarke-Groves (VCG)
A sealed-bid auction. A truth-revealing protocol, namely one in which each agent might as well reveal her valuation to the auctioneer, whereby each agent gets the best (for her) price she could have bid and still won the auction.

Shortest Path using VCG
Problem definition: a communication network modeled by a directed graph G and two vertices, a source s and a target t.
Agents = edges in G. Each agent has a cost, denoted t_e, for sending a single message on her edge.
Objective: find the shortest (cheapest) path from s to t.
Means: a protocol between the agents and the auctioneer.

VCG for Shortest-Path
(Figure: a network with $50 and $10 edges, one of which is always in the shortest path.)

How much will we overpay?
Every agent on the shortest path gets an extra $1.
Thm [Mahedia, Saberi, S]: the expected extra pay is as_SP(G).
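The standard VCG payment rule for this setting pays a used edge e the amount d_{G−e}(s,t) − (d_G(s,t) − t_e): its own cost plus the harm its absence would cause. A small sketch of mine (the toy network and its costs are hypothetical, not the slides' figure):

```python
import heapq

def dijkstra(edges, s, t, skip=None):
    """edges: dict (u,v) -> cost, directed. Returns (distance, path_edges),
    optionally with one edge removed via `skip`."""
    adj = {}
    for (u, v), c in edges.items():
        if (u, v) != skip:
            adj.setdefault(u, []).append((v, c))
    dist, prev = {s: 0}, {}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, c in adj.get(u, []):
            if d + c < dist.get(v, float("inf")):
                dist[v] = d + c
                prev[v] = u
                heapq.heappush(pq, (d + c, v))
    path, u = [], t
    while u != s and u in prev:
        path.append((prev[u], u))
        u = prev[u]
    return dist.get(t, float("inf")), path[::-1]

# toy network: a cheap route s->a->t ($1 + $1) and a backup s->b->t ($2 + $2)
edges = {("s", "a"): 1, ("a", "t"): 1, ("s", "b"): 2, ("b", "t"): 2}
d, path = dijkstra(edges, "s", "t")
for e in path:
    d_without, _ = dijkstra(edges, "s", "t", skip=e)
    payment = d_without - (d - edges[e])
    print(e, payment)   # each $1 edge is paid $3: its cost plus $2 of overpay
```

The overpayment here comes exactly from how much the path length would jump if the edge flipped to being unavailable, which is the connection to average sensitivity that the theorem formalizes.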

Learning Functions
To learn a function f: G^n → G is to sample it in poly(n·|G|/ε) points and come up with some f' that differs from f on at most an ε fraction of the points.
Membership queries: choose your points. Random examples: succeed w.h.p. given the values on a small random set of points.
Applications (when learnable): bioinformatics, economics, etc. Also [Akavia, Goldwasser, S.]: cryptography, hardcore predicates via list-decoding.

Concentrated Functions
Def: the restriction of f to a set of characters Γ is f|_Γ = Σ_{S∈Γ} f̂(S)·χ_S.
Def: f is a concentrated function if ∀ε>0, ∃Γ of poly(n/ε) size s.t. ‖f − f|_Γ‖₂² ≤ ε.
Thm [Kushilevitz, Mansour]: a concentrated f: {0,1}^n → {0,1} is learnable.
Thm [Akavia, Goldwasser, S.]: likewise over any Abelian group, f: G^n → G.

Juntas
A function is a J-junta if its value depends on only J variables. A dictatorship is a 1-junta.

Juntas
A function is a J-junta if its value depends on only J variables.
Thm [Fischer, Kindler, Ron, Samorodnitsky, S]: juntas are testable.
Thm [Kushilevitz, Mansour; Mossel, O'Donnell]: juntas are learnable.

Noise Sensitivity
The noise sensitivity of a function f is the probability that f changes its value when flipping a subset of its variables according to the μ_p distribution: choose a subset I of variables, each variable belonging to I independently with some probability; flip each value in the subset I with probability p. What is the new value of f?

Noise Sensitivity and Juntas
Juntas are noise-insensitive (stable).
Thm [Bourgain; Kindler & S]: noise-insensitive (stable) Boolean functions are juntas.
Choose a subset I of variables, each with some probability; flip each value in the subset I with probability p. What is the new value of f? W.h.p. it stays the same.
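The contrast between juntas and sensitive functions is easy to see empirically. A Monte-Carlo sketch of mine, using the simplest noise model (flip each coordinate independently w.p. ε): a dictatorship (a 1-junta) changes with probability ε regardless of n, while Parity's sensitivity approaches ½:

```python
import math
import random

def noise_sensitivity(f, n, eps, trials, rng):
    # estimate Pr[f(x) != f(y)] where y flips each bit of x indep. w.p. eps
    changed = 0
    for _ in range(trials):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y = [-xi if rng.random() < eps else xi for xi in x]
        changed += f(x) != f(y)
    return changed / trials

rng = random.Random(0)
n, eps = 20, 0.1
dictator = lambda x: x[0]
parity = lambda x: math.prod(x)
print(noise_sensitivity(dictator, n, eps, 20000, rng))  # ~ eps = 0.1
print(noise_sensitivity(parity, n, eps, 20000, rng))    # ~ (1-(1-2*eps)**n)/2, about 0.49
```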

Friedgut's Theorem
Thm: any Boolean f is an [ε, j]-junta for j = 2^O(as(f)/ε).
Proof: 1. specify the junta J; 2. show the complement of J has little influence.

Long-Code
In the long-code, the set of legal words consists of all monotone dictatorships. This is the most extensive binary code, as its bits represent all possible binary values over n elements.

Long-Code
Encoding an element e ∈ [n]: E_e legally encodes the element e if E_e = f_e, the dictatorship of e. (Figure: a truth table of T/F values.)

Where to go for Dinner?
The alternatives: diners would cast their votes in an (electronic) envelope, and the system would decide, not necessarily according to majority… And what if someone (in Florida?) can flip some votes? Power = influence. Of course, they'll have to discuss it over dinner…


Open Questions
Hardness of approximation: MAX-CUT; coloring a 3-colorable graph with the fewest colors.
Graph properties: find sharp thresholds for properties.
Circuit complexity: switching lemmas.
Mechanism design: show a non-truth-revealing protocol in which the pay is smaller (Nash equilibrium when all agents tell the truth?).
Analysis: show the weakest condition for a function to be a junta.
Learning: by random queries.
Apply concentration-of-measure techniques to other problems in complexity theory.


Low-Degree Boolean Functions are Juntas
Corollary: fix a p-biased distribution μ_p over P([n]). Let δ > 0 be any parameter, and set k = log_{1-δ}(½). Then ∃ a constant ε > 0 s.t. any Boolean function f: P([n]) → {-1,1} whose Fourier weight above level k is sufficiently small is an [ε, j]-junta for j = O(ε⁻²·k·3^{2k}).

Noise-Sensitivity
Def (ε,p,x): let 0 < ε < 1 and x ∈ P([n]). Then y ∼ (ε,p,x) if y = (x\I) ∪ z, where I is a noise subset (each i ∈ [n] enters I independently w.p. ε), and z ∼ μ_p restricted to I is a replacement.
Def (ε-noise-sensitivity): let 0 < ε < 1; then ns_ε(f) = Pr_{x, y∼(ε,p,x)}[f(x) ≠ f(y)].
[When p = ½ this is equivalent to flipping each coordinate in x independently w.p. ε/2.]

Noise-Sensitivity, cont.
Advantage: very efficiently testable (using only two queries) by a perturbation test.
Def (perturbation test): choose x ∼ μ_p and y ∼ (ε,p,x), and check whether f(x) = f(y). The success probability is proportional to the noise-sensitivity of f.
Prop: the ε-noise-sensitivity can be expressed in terms of the Fourier coefficients of f.

Relation between Parameters
Prop: small ns ⟹ small high-frequency weight. Proof: … therefore, if ns is small, the high frequencies must have small weight.
Prop: small as ⟹ small high-frequency weight. Proof: …

High vs. Low Frequencies
Def: the section of a function f above k is f^{>k} = Σ_{|S|>k} f̂(S)·χ_S, and the low-frequency portion is f^{≤k} = Σ_{|S|≤k} f̂(S)·χ_S.

Low-Degree Boolean Functions are Juntas
Theorem [Bourgain; Kindler & S]: ∃ a constant ε > 0 s.t. any Boolean function f: P([n]) → {-1,1} whose Fourier weight above level k is sufficiently small is an [ε, j]-junta for j = O(ε⁻²·k·3^{2k}).
Corollary: fix a p-biased distribution μ_p over P([n]). Let δ > 0 be any parameter, and set k = log_{1-δ}(½); then the same conclusion holds.

Specify the Junta
Set k = Θ(as(f)/ε) and δ = 2^{-Θ(k)}. Let J be the set of coordinates whose influence on f is at least δ. We'll prove that f is close to a function determined by J; hence J is an [ε, j]-junta, and |J| = 2^{O(k)}.

Functions' Vector Space
A function f is a vector. Addition: (f+g)(x) = f(x) + g(x). Multiplication by a scalar: (c·f)(x) = c·f(x).

Hadamard Code
In the Hadamard code, the set of legal words consists of all multiplicative (linear, if over {0,1}) functions C = {χ_S | S ⊆ [n]}, namely all characters.

Hadamard Test
Given a Boolean f, choose random x and y; check that f(x)·f(y) = f(x·y).
Prop (completeness): a legal Hadamard word (a character) always passes this test.
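This is the BLR linearity test in ±1 notation, where x·y is the coordinate-wise product. A runnable sketch of mine: a character passes every trial, while a non-multiplicative function such as Majority fails a noticeable fraction of the time:

```python
import random

def blr_pass_rate(f, n, trials, rng):
    # fraction of random pairs (x, y) with f(x)*f(y) == f(x*y)
    ok = 0
    for _ in range(trials):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y = [rng.choice((-1, 1)) for _ in range(n)]
        xy = [a * b for a, b in zip(x, y)]
        ok += f(x) * f(y) == f(xy)
    return ok / trials

rng = random.Random(0)
n = 6
chi = lambda x: x[1] * x[3]                    # the character of S = {1, 3}
majority = lambda x: 1 if sum(x) >= 0 else -1  # not multiplicative
print(blr_pass_rate(chi, n, 2000, rng))        # 1.0: characters always pass
print(blr_pass_rate(majority, n, 2000, rng))   # strictly below 1
```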

Hadamard Test: Soundness
Prop (soundness): if f passes the test with probability ½ + δ, then f has some Fourier coefficient of magnitude ≥ 2δ.
Proof: the acceptance probability equals ½ + ½·Σ_S f̂(S)³ ≤ ½ + ½·max_S f̂(S).

Testing the Long-Code
Def (long-code list-test): given a code word f, probe it in a constant number of entries, and:
accept almost always if f is a monotone dictatorship;
reject w.h.p. if f does not have a sizeable fraction of its Fourier weight concentrated on a small set of variables, that is, if there is no semi-junta J ⊆ [n] on which such weight concentrates.
Note: a long-code list-test distinguishes between the case that f is a dictatorship and the case that f is far from a junta.

Motivation: Testing the Long-Code
Long-code list-tests are essential tools in proving hardness results. Hence finding simple sufficient conditions for a function to be a junta is important.

High Frequencies Contribute Little
Prop: k ≫ r·log r implies …
Proof: a character S of size larger than k spreads w.h.p. over all parts I_h, hence contributes to the influence of all parts. If such characters were heavy (> ε/4), then surely more than j parts I_h would fail the t independence tests.

Altogether Lemma: Proof:

Altogether

Beckner/Nelson/Bonami Inequality
Def: let T_δ be the following (noise) operator on any f: T_δ f = Σ_S δ^{|S|}·f̂(S)·χ_S.
Prop: … Proof: …

Beckner/Nelson/Bonami Inequality
Def: let T_δ be the operator above.
Thm: for any p ≥ r and δ ≤ ((r-1)/(p-1))^½: ‖T_δ f‖_p ≤ ‖f‖_r.
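The theorem can be sanity-checked numerically. A brute-force sketch of mine for the classic instance p = 4, r = 2, δ = 1/√3: apply T_δ on the Fourier side and compare the 4-norm of T_δ f with the 2-norm of f over random Boolean functions of 4 variables:

```python
from itertools import product, combinations
import math
import random

def fourier(vals, xs, n):
    # Fourier coefficients of f given its truth table `vals` on inputs `xs`
    return {S: sum(v * math.prod(x[i] for i in S) for v, x in zip(vals, xs)) / 2 ** n
            for k in range(n + 1) for S in combinations(range(n), k)}

def norm(vals, p):
    # expectation norm: (E|f|^p)^(1/p)
    return (sum(abs(v) ** p for v in vals) / len(vals)) ** (1 / p)

n, rho = 4, 1 / math.sqrt(3)          # p = 4, r = 2, rho = sqrt((r-1)/(p-1))
xs = list(product([-1, 1], repeat=n))
rng = random.Random(0)
worst = 0.0
for _ in range(50):
    vals = [rng.choice((-1, 1)) for _ in xs]           # a random Boolean f
    co = fourier(vals, xs, n)
    tf = [sum(rho ** len(S) * c * math.prod(x[i] for i in S)
              for S, c in co.items()) for x in xs]     # T_rho f, via Fourier
    worst = max(worst, norm(tf, 4) / norm(vals, 2))
print(worst)   # stays <= 1, as the theorem guarantees
```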

Beckner/Nelson/Bonami Corollary
Corollary 1: for any real f and 2 ≥ r ≥ 1: …
Corollary 2: for any real f and r > 2: …

Perturbation
Def: denote by μ_ε the distribution over all subsets of [n] which assigns probability to a subset x as follows: independently, for each i ∈ [n], let i ∈ x with probability ε and i ∉ x with probability 1-ε.

Long-Code Test
Given a Boolean f, choose random x and y, and choose z ∼ μ_ε; check that f(x)·f(y) = f(x·y·z).
Prop (completeness): a legal long-code word (a dictatorship) passes this test w.p. 1-ε.

Long-Code Tests
Def (long-code test): given a code word w, probe it in a constant number of entries, and:
accept w.h.p. if w is a monotone dictatorship;
reject w.h.p. if w is not close to any monotone dictatorship.

Efficient Long-Code Tests
For some applications, it suffices if the test may accept illegal code words, nevertheless ones which have a short list-decoding.
Def (long-code list-test): given a code word w, probe it in 2 or 3 places, and:
accept w.h.p. if w is a monotone dictatorship;
reject w.h.p. if w is not even approximately determined by a short list of domain elements, that is, if there is no junta J ⊆ [n] s.t. f is close to some f' with f'(x) = f'(x ∩ J) for all x.
Note: a long-code list-test distinguishes between the case that w is a dictatorship and the case that w is far from a junta.

General Direction
These tests may vary. The long-code list-tests, in particular their biased-case versions, seem essential in proving improved hardness results for approximation problems, and have other interesting applications. Hence finding simple, as weak as possible, sufficient conditions for a function to be a junta is important.

Codes and Boolean Functions
Def: an m-bit code is a subset of the set of all m-bit binary strings, C ⊆ {-1,1}^m. The distance of a code C is the minimum, over all pairs of legal words (in C), of the Hamming distance between the two words.
Note: a Boolean function over n binary variables is a 2^n-bit string; hence, a set of Boolean functions can be considered a 2^n-bit code.

Long-Code: Monotone Dictatorships
In the long-code, the legal code words are all monotone dictatorships, C = {χ_{i} | i ∈ [n]}, namely, all the singleton characters.