Analysis of Boolean Functions and Complexity Theory Economics Combinatorics …

Introduction Objectives: To introduce Analysis of Boolean Functions and some of its applications. Overview: Basic definitions. First passage percolation. Mechanism design. Graph properties. And more…

Influential People The theory of the Influence of Variables on Boolean Functions [KKL, BL, R, M] was introduced to tackle problems in Social Choice and distributed computing. It has motivated a magnificent body of work, related to: Sharp Thresholds [F, FG]; Percolation [BKS]; Economics: Arrow's Theorem [K]; Hardness of Approximation [DS]; all utilizing Harmonic Analysis of Boolean functions… And the really important question:

Where to go for Dinner? The alternatives. Diners would cast their vote in an (electronic) envelope. The system would decide, not necessarily by majority… And what if someone (in Florida?) can flip some votes? This motivates the notions of power and influence.

Boolean Functions Def: a Boolean function is f : {-1,1}^n → {-1,1}. Equivalently, the domain is the power set of [n]: choosing the locations of the -1 coordinates is the same as choosing a sequence of -1s and 1s.

Def: the influence of i on f is the probability, over a random input x, that f changes its value when coordinate i is flipped: influence_i(f) = Pr_x[f(x) ≠ f(x^(i))], where x^(i) is x with its i-th coordinate flipped.
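This combinatorial definition is easy to check by brute force for small n; a minimal sketch (the helper and function names are mine, not from the slides):

```python
import itertools
from fractions import Fraction

def influence(f, n, i):
    """Pr over uniform x in {-1,1}^n that flipping coordinate i changes f(x)."""
    flips = 0
    for x in itertools.product((-1, 1), repeat=n):
        y = x[:i] + (-x[i],) + x[i + 1:]   # x with coordinate i flipped
        if f(x) != f(y):
            flips += 1
    return Fraction(flips, 2 ** n)

def majority(x):          # assumes an odd number of voters
    return 1 if sum(x) > 0 else -1

def parity(x):
    p = 1
    for xi in x:
        p *= xi
    return p

print(influence(majority, 3, 0))          # 1/2: pivotal iff the other two voters split
print(influence(parity, 4, 2))            # 1: every flip changes Parity
print(influence(lambda x: x[0], 4, 0))    # 1: the dictator coordinate
print(influence(lambda x: x[0], 4, 3))    # 0: any other coordinate
```

The same function reproduces the Majority, Parity, and Dictatorship examples on the following slides.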

Functions as Vector-Spaces [Figure: a Boolean function f viewed as a vector of its 2^n values, one entry per input string in {-1,1}^n.]

Functions’ Vector-Space A function f is a vector. Addition: ‘f+g’(x) = f(x) + g(x). Multiplication by a scalar: ‘c·f’(x) = c·f(x).

Majority : {1,-1}^n → {1,-1}. The influence of i on Majority is the probability, over a random input x, that Majority changes when i is flipped. This happens when half of the other n-1 coordinates (people) vote -1 and half vote 1, i.e. (for odd n) influence_i(Majority) = C(n-1, (n-1)/2) / 2^(n-1) = Θ(1/√n).

Parity : {1,-1}^20 → {1,-1}. Flipping any single coordinate always changes the value of Parity, so influence_i(Parity) = 1 for every i.

Dictatorship_i : {1,-1}^20 → {1,-1}, Dictatorship_i(x) = x_i. The influence of i on Dictatorship_i is 1; the influence of any j ≠ i on Dictatorship_i is 0.

Variables’ Influence The influence of a coordinate i ∈ [n] on a Boolean function f : {1,-1}^n → {1,-1} is the probability, over a random input x, that f changes its value when i is flipped.

Variables’ Influence Average Sensitivity of f (AS): the sum of the influences of all coordinates i ∈ [n]. Equivalently, the Average Sensitivity of f is the expected number of coordinates, for a random input x, whose flipping changes the value of f.

Example: Majority on 3 voters. Each coordinate has influence ½ (it is pivotal exactly when the other two voters split), so AS = ½ + ½ + ½ = 1.5.
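The same brute-force enumeration confirms this arithmetic; a small sketch (names are mine):

```python
import itertools

def influence(f, n, i):
    """Fraction of inputs on which flipping coordinate i changes f."""
    flips = sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:])
                for x in itertools.product((-1, 1), repeat=n))
    return flips / 2 ** n

def average_sensitivity(f, n):
    """AS(f): sum of the influences of all n coordinates."""
    return sum(influence(f, n, i) for i in range(n))

maj3 = lambda x: 1 if sum(x) > 0 else -1
print(average_sensitivity(maj3, 3))   # 1.5
```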

Representing f as a Polynomial What would be the monomials over x ∈ {-1,1}^n? Since x_i² = 1, all powers except 0 and 1 cancel out! Hence there is one monomial χ_S(x) = Π_{i∈S} x_i for each character S ⊆ [n]. These are all the multiplicative functions.

Fourier-Walsh Transform Consider all characters χ_S(x) = Π_{i∈S} x_i. Given any function f : {-1,1}^n → R, let the Fourier-Walsh coefficients of f be f̂(S) = E_x[f(x)·χ_S(x)]; thus f can be described as f = Σ_{S⊆[n]} f̂(S)·χ_S.
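For small n the coefficients can be computed directly from this definition; a sketch (all names are mine), using Maj3, whose expansion is 0.5·(x0 + x1 + x2) - 0.5·x0·x1·x2:

```python
import itertools

def chi(S, x):
    """Character chi_S(x) = product of x_i over i in S."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def fourier_coefficients(f, n):
    """f_hat(S) = E_x[f(x) * chi_S(x)], one coefficient per S subset of [n]."""
    points = list(itertools.product((-1, 1), repeat=n))
    return {S: sum(f(x) * chi(S, x) for x in points) / 2 ** n
            for r in range(n + 1)
            for S in itertools.combinations(range(n), r)}

maj3 = lambda x: 1 if sum(x) > 0 else -1
c = fourier_coefficients(maj3, 3)
print(c[(0,)])        # 0.5
print(c[(0, 1, 2)])   # -0.5
# Parseval: for a Boolean f the squared coefficients sum to E[f^2] = 1.
print(sum(v * v for v in c.values()))   # 1.0
```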

Norms Def: expectation norm on the function: ||f||₂ = (E_x[f(x)²])^½. Def: summation norm on its Fourier transform: ||f̂||₂ = (Σ_S f̂(S)²)^½.

Fourier Transform: Norm Thm [Parseval]: ||f||₂ = ||f̂||₂, i.e. E_x[f(x)²] = Σ_S f̂(S)². Hence, for a Boolean f: Σ_S f̂(S)² = 1.

We may therefore think of the transform as defining a distribution over the characters.

Inner Product Recall the (normalized) inner product: ⟨f, g⟩ = E_x[f(x)·g(x)].

Simple Observations Claim: ⟨f, f⟩ = ||f||₂². For any function f whose range is {-1,0,1}: ⟨f, f⟩ = Pr_x[f(x) ≠ 0].

Variables’ Influence Recall: the influence of an index i ∈ [n] on a Boolean function f : {1,-1}^n → {1,-1} is the probability, over a random input x, that f changes its value when i is flipped. This can be expressed in terms of the Fourier coefficients of f. Claim: influence_i(f) = Σ_{S∋i} f̂(S)².
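The identity influence_i(f) = Σ_{S∋i} f̂(S)² can be sanity-checked numerically on Maj3; a sketch (names mine):

```python
import itertools

def chi(S, x):
    p = 1
    for i in S:
        p *= x[i]
    return p

n = 3
points = list(itertools.product((-1, 1), repeat=n))
maj = lambda x: 1 if sum(x) > 0 else -1

# Brute-force Fourier coefficients of Maj3.
coeffs = {S: sum(maj(x) * chi(S, x) for x in points) / 2 ** n
          for r in range(n + 1) for S in itertools.combinations(range(n), r)}

i = 0
# Combinatorial definition of influence_i.
inf_direct = sum(maj(x) != maj((-x[0],) + x[1:]) for x in points) / 2 ** n
# Fourier formula: sum of f_hat(S)^2 over the sets S that contain i.
inf_fourier = sum(c * c for S, c in coeffs.items() if i in S)
print(inf_direct, inf_fourier)   # both 0.5
```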

Average Sensitivity Def: the sensitivity of x w.r.t. f is the number of coordinates whose flipping changes the value of f(x). Thinking of the discrete n-dimensional cube, color each vertex x in color 1 or color -1 (color f(x)). An edge whose endpoints are colored with the same color is called monotone. The average sensitivity counts the edges that are not monotone: AS(f) = 2·#{non-monotone edges}/2^n.

Majority : {1,-1}^19 → {1,-1}. The average sensitivity of Majority is the expected number of coordinates, for a random input x, whose flipping changes the value of Majority; by the influence computation above this is n·Θ(1/√n) = Θ(√n) (here n = 19).

Parity : {1,-1}^20 → {1,-1}. Flipping any coordinate always changes the value of Parity, so AS(Parity) = 20.

Dictatorship_i : {1,-1}^20 → {1,-1}, Dictatorship_i(x) = x_i. The influence of i on Dictatorship_i is 1; the influence of any j ≠ i on Dictatorship_i is 0.

Average Sensitivity Claim: AS(f) = Σ_{S⊆[n]} |S|·f̂(S)². Proof: sum the Fourier expression for influence_i(f) over all i ∈ [n]; each S is counted once per element i ∈ S.
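A quick numeric check of the claim AS(f) = Σ_S |S|·f̂(S)² on Parity, where all Fourier weight sits on S = [n] and so AS = n; a sketch (names mine):

```python
import itertools

def chi(S, x):
    p = 1
    for i in S:
        p *= x[i]
    return p

n = 4
points = list(itertools.product((-1, 1), repeat=n))
parity = lambda x: chi(range(n), x)

# Brute-force Fourier coefficients of Parity on n=4 variables.
coeffs = {S: sum(parity(x) * chi(S, x) for x in points) / 2 ** n
          for r in range(n + 1) for S in itertools.combinations(range(n), r)}

# AS(f) = sum over S of |S| * f_hat(S)^2; all the weight is on S = {0,1,2,3}.
as_fourier = sum(len(S) * c * c for S, c in coeffs.items())
print(as_fourier)   # 4.0
```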

When AS(f)=1 Def: f is a balanced function if Pr_x[f(x)=1] = Pr_x[f(x)=-1] = ½. THM: f is balanced and AS(f)=1 ⟺ f is a dictatorship. Proof (⟸): for a dictatorship, every x has sens(x)=1, because only the dictator coordinate can change the value of f, and AS(f)=1 follows. f is balanced since the dictator is 1 on half of the inputs x and -1 on the other half.

When AS(f)=1 (⟹): f is balanced, so f̂(∅) = 0. If there were an S with |S| > 1 and f̂(S) ≠ 0, then AS(f) = Σ_S |S|·f̂(S)² > Σ_S f̂(S)² = 1. So all the Fourier weight lies on sets of size 1, and f is linear: f = Σ_i f̂({i})·x_i. Since f is Boolean, exactly one i has f̂({i}) = ±1, so f is a dictatorship (up to sign).

First Passage Percolation

Choose each edge with probability ½ to be a and ½ to be b

First Passage Percolation Consider the grid Z². For each edge e of Z², choose independently w_e = a or w_e = b, each with probability ½, where 0 < a < b < ∞. This induces a random metric on the vertices of Z². Proposition [BKS]: the variance of the shortest path from the origin to a vertex v is bounded by O(|v| / log |v|).
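The model is easy to simulate: draw random edge weights on a small grid and run Dijkstra. A sketch with a = 1, b = 2 (all names and parameters are mine, and the tiny grid is only illustrative, not the asymptotic regime of the proposition):

```python
import heapq
import random

def grid_shortest_path(n, a, b, rng):
    """Dijkstra on an n x n grid; each edge weight is a or b with prob. 1/2."""
    adj = {(i, j): [] for i in range(n) for j in range(n)}
    for i in range(n):
        for j in range(n):
            if i + 1 < n:
                w = a if rng.random() < 0.5 else b
                adj[(i, j)].append(((i + 1, j), w))
                adj[(i + 1, j)].append(((i, j), w))
            if j + 1 < n:
                w = a if rng.random() < 0.5 else b
                adj[(i, j)].append(((i, j + 1), w))
                adj[(i, j + 1)].append(((i, j), w))
    dist, pq = {(0, 0): 0}, [(0, (0, 0))]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist[(n - 1, n - 1)]

rng = random.Random(0)
samples = [grid_shortest_path(10, 1, 2, rng) for _ in range(200)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)   # the empirical variance is small relative to the distance
```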

First Passage Percolation Choose each edge with probability ½ to be 1 and ½ to be 2

First Passage Percolation Consider the grid Z². For each edge e of Z², choose independently w_e = 1 or w_e = 2, each with probability ½. This induces a random metric on the vertices of Z². Proposition: the variance of the shortest path from the origin to vertex v is bounded by O(|v| / log |v|).

Proof outline Let G denote the grid. SP_G – the length of the shortest path in G from the origin to v. Let G^e denote the grid which differs from G only on w_e, i.e. flip coordinate e in G. Consider SP as a function of the edge weights and examine the influence of each edge e.

Observation: if e participates in a shortest path, then flipping its value will increase or decrease the SP by 1; if e is not in any SP, the SP will not change.

Proof cont. And by [KKL] there is at least one variable whose influence is as big as Ω(n / log n).

Graph property Every Monotone Graph Property has a sharp threshold

Graph property A graph property is a property of graphs which is closed under isomorphism. Monotone graph property: let P be a graph property such that if a graph G satisfies P, then every graph H on the same set of vertices which contains G as a subgraph satisfies P as well.

Examples of graph properties: G is connected; G is Hamiltonian; G contains a clique of size t; G is not planar; the clique number of G is larger than that of its complement; the diameter of G is at most s; … etc.

Erdős–Rényi Graph Model Erdős–Rényi model for a random graph: choose every edge independently with probability p.


Every Monotone Graph Property has a sharp threshold Ehud Friedgut & Gil Kalai

Definitions GP – a graph property. μ_p(GP) – the probability that a random graph on n vertices with edge probability p satisfies GP. G ∈ G(n,p) – G is a random graph with n vertices and edge probability p.

Main Theorem Let GP be any monotone property of graphs on n vertices. If μ_p(GP) > ε then μ_q(GP) > 1-ε for q = p + c₁·log(1/2ε)/log n, where c₁ is an absolute constant.
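The sharp-threshold phenomenon can be observed experimentally for a concrete monotone property, connectivity, whose threshold sits at p = ln(n)/n; a simulation sketch (names and parameters are mine):

```python
import math
import random

def gnp_connected(n, p, rng):
    """Sample G(n,p) and test connectivity by depth-first search."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n

rng = random.Random(1)
n = 200
fracs = {}
for mult in (0.5, 1.0, 2.0):       # p = mult * ln(n)/n, around the threshold
    p = mult * math.log(n) / n
    fracs[mult] = sum(gnp_connected(n, p, rng) for _ in range(50)) / 50
print(fracs)   # the connectivity rate jumps from ~0 below the threshold to ~1 above it
```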

Example – Max Clique Consider G ∈ G(n,p) (p is the probability of choosing an edge, n the number of vertices). The length of the interval of probabilities p for which the clique number of G is almost surely k (where k ≈ log n) is of order 1/log n. The threshold interval: the transition between clique numbers k-1 and k.

The probability of having a clique of size k is 1-ε, while the probability of having a (k+1)-clique is still small (≈ ε). The value of p must increase by c/log n before the probability of having a (k+1)-clique reaches ε and another transition interval begins.

Def: Sharp threshold A monotone graph property has a sharp threshold when the transition from the property being very unlikely to it being very likely is very swift. [Figure: G does not satisfy property P → G satisfies property P.]

Conjecture Let GP be any monotone property of graphs on n vertices. If μ_p(GP) > ε then μ_q(GP) > 1-ε for q = p + c·log(1/2ε)/log² n.

Graph property Every Monotone Graph Property has a sharp threshold

Graph property A graph property is a property of graphs which is closed under isomorphism. Hereditary: let P be a monotone graph property; that is, if a graph G satisfies P, then every graph H on the same set of vertices which contains G as a subgraph satisfies P as well.

Hereditary in 3-colorable graphs

Examples of graph properties: G is connected; G is Hamiltonian; G contains a clique of size t; G is not planar; the clique number of G is larger than that of its complement; the diameter of G is at most s; G admits a transitive orientation; … etc.


Example – max clique Let G ∈ G(n,p).


Mechanism Design Shortest Path Problem

Mechanism Design Problem N agents (bidders); each agent i has a private input t_i ∈ T. Everything else in this scenario is public knowledge. The output specification maps each type vector t = t_1…t_n to a set of allowed outputs o ∈ O. Each agent i has a valuation for the outcome: v_i(t_i, o). Each agent wishes to optimize his own utility. Objective: minimize the objective function, e.g. the total payment. Means: a protocol between the agents and the auctioneer.

Truthful implementation The action of an agent consists of reporting its type, ideally its true type. Playing the truth is then a dominant strategy. THM: if there exists a mechanism, then there also exists a truthful implementation. Proof: simulate the hypothetical implementation based on the actions derived from the reported types.

Vickrey-Groves-Clarke (VCG)

Mechanism Design for SP [Figure: a network with edges priced 50$ and 10$; one edge lies on every shortest path.]

Shortest Path using VCG Problem definition: a communication network modeled by a directed graph G and two vertices, a source s and a target t. Agents = edges in G. Each agent has a cost, denoted t_e, for sending a single message on his edge. Objective: find the shortest (cheapest) path from s to t. Means: a protocol between the agents and the auctioneer.

Shortest Path using VCG C(G) = cost of the shortest path (s,t) in G. Compute a shortest path in G, at cost C(G). Each agent whose edge e participates in the SP obtains the payment she demanded plus [C(G\e) - C(G)], where C(G\e) is the cost of the shortest path in the graph with e removed.
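These payments can be computed by running Dijkstra once on G and once on G\e for each shortest-path edge. A sketch on a hypothetical 4-edge network (the graph, edge names, and costs are my illustration, loosely echoing the 50$/10$ example):

```python
import heapq

def dijkstra(adj, s, t):
    """Return (cost, edge-ids of a cheapest s-t path); (inf, []) if unreachable."""
    dist, prev, pq = {s: 0}, {}, [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w, eid in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                prev[v] = (u, eid)
                heapq.heappush(pq, (d + w, v))
    if t not in dist:
        return float("inf"), []
    path, node = [], t
    while node != s:
        node, eid = prev[node]
        path.append(eid)
    return dist[t], path

# Hypothetical network: edge id -> (tail, head, declared cost t_e).
edges = {"e1": ("s", "a", 10), "e2": ("a", "t", 10),
         "e3": ("s", "b", 50), "e4": ("b", "t", 50)}

def build(excluded=None):
    adj = {}
    for eid, (u, v, w) in edges.items():
        if eid != excluded:
            adj.setdefault(u, []).append((v, w, eid))
    return adj

cost, sp = dijkstra(build(), "s", "t")
payments = {}
for eid in sp:
    alt_cost, _ = dijkstra(build(excluded=eid), "s", "t")
    payments[eid] = edges[eid][2] + (alt_cost - cost)   # t_e + [C(G\e) - C(G)]
print(cost, payments)   # cost 20; each path edge is paid 10 + (100 - 20) = 90
```

Note the total payment (180) far exceeds the cost of the alternative route (100), which is the point of the "How much will we pay?" slide below.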

How much will we pay? [Figure: the same 50$/10$ network; the total VCG payment can far exceed the cost of the chosen path.]

Junta A function is a J-junta if its value depends on only J variables. A Dictatorship is a 1-junta.

High vs. Low Frequencies Def: the section of a function f above k is f^{>k} = Σ_{|S|>k} f̂(S)·χ_S, and the low-frequency portion is f^{≤k} = Σ_{|S|≤k} f̂(S)·χ_S.

Friedgut's Theorem Thm: any Boolean f is an [ε, j]-junta for j = 2^{O(as(f)/ε)}. Proof: 1. Specify the junta J. 2. Show the complement of J has little influence.

Specify the Junta Set k = Θ(as(f)/ε) and δ = 2^{-Θ(k)}. Let J be the set of coordinates whose influence on the low-frequency portion f^{≤k} is at least δ. We'll prove that the coordinates outside J have little influence; hence J is an [ε, j]-junta, and |J| = 2^{O(k)}.

High Frequencies Contribute Little Prop: k ≫ r·log r implies that the high-frequency Fourier weight is small. Proof: a character S of size larger than k spreads w.h.p. over all parts I_h, hence contributes to the influence of all parts. If such characters were heavy (> ε/4), then surely there would be more than j parts I_h that fail the t independence-tests.


Beckner/Nelson/Bonami Inequality Def: let T_ρ be the following operator on any f: T_ρ f = Σ_{S⊆[n]} ρ^{|S|}·f̂(S)·χ_S. Prop: equivalently, (T_ρ f)(x) = E_y[f(y)], where y is a ρ-correlated copy of x (each coordinate of x is flipped independently with probability (1-ρ)/2).

Beckner/Nelson/Bonami Inequality Def: let T_ρ be the operator above. Thm (hypercontractivity): for any p ≥ r and ρ ≤ ((r-1)/(p-1))^½, ||T_ρ f||_p ≤ ||f||_r.
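The theorem can be spot-checked numerically: with r = 2, p = 4 the admissible noise rate is ρ = (1/3)^½, and ||T_ρ f||₄ ≤ ||f||₂ should hold for every Boolean f. A sketch (names mine):

```python
import itertools
import math
import random

def fourier(f_vals, n, points):
    """Brute-force Fourier-Walsh coefficients of a function given as a table."""
    return {S: sum(f_vals[x] * math.prod(x[i] for i in S) for x in points) / 2 ** n
            for r in range(n + 1) for S in itertools.combinations(range(n), r)}

def norm(vals, p):
    """Expectation p-norm: (E|v|^p)^(1/p)."""
    return (sum(abs(v) ** p for v in vals) / len(vals)) ** (1 / p)

n = 4
points = list(itertools.product((-1, 1), repeat=n))
rng = random.Random(0)
rho = math.sqrt(1 / 3)   # ((r-1)/(p-1))^(1/2) with r=2, p=4

for _ in range(20):
    f_vals = {x: rng.choice((-1, 1)) for x in points}   # a random Boolean function
    coeffs = fourier(f_vals, n, points)
    T_f = [sum(rho ** len(S) * coeffs[S] * math.prod(x[i] for i in S) for S in coeffs)
           for x in points]
    assert norm(T_f, 4) <= norm(list(f_vals.values()), 2) + 1e-9
print("||T_rho f||_4 <= ||f||_2 held for all 20 random Boolean functions")
```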

Beckner/Nelson/Bonami Corollary Corollary 1: for any real f and 2 ≥ r ≥ 1, ||T_ρ f||₂ ≤ ||f||_r for ρ ≤ (r-1)^½. Corollary 2: for real f and r > 2, ||T_ρ f||_r ≤ ||f||₂ for ρ ≤ (r-1)^{-½}. Both follow by substituting p = 2 (resp. r = 2) in the theorem.

Friedgut's Theorem Thm: any Boolean f is an [ε, j]-junta for j = 2^{O(as(f)/ε)}. Proof: 1. Specify the junta J. 2. Show the complement of J has little influence.

Altogether [The final calculation combines the lemmas via the Beckner inequality.]