An Upper Bound on the GKS Game via Max Bipartite Matching

An Upper Bound on the GKS Game via Max Bipartite Matching DeVon Ingram (Georgia Institute of Technology)

A Brief Introduction to Query Complexity Settling open questions about the power of Turing machines is usually out of reach of current techniques. Query complexity is a restricted model of computation in which the goal is to compute a function f(x_1,…,x_n) of variables x_i that can only be accessed via queries. Lower bounds on query complexity are usually more feasible to prove and sometimes yield insight into structural complexity.

Multilinear Polynomial Degree Definition: Let f: {0,1}^n → {0,1} be a Boolean function. A real multivariate polynomial p: ℝ^n → ℝ represents f if, for every n-bit binary string x, f(x) = p(x). Every Boolean function is represented by a unique multilinear polynomial. For a Boolean function f, the degree of f, denoted deg(f), is the degree of that unique real multilinear polynomial.

Multilinear Polynomial Degree (cont.) Example: Let f be the AND function: f(x_1,…,x_n) = 1 if and only if x_1 = … = x_n = 1. The multilinear polynomial representing f is x_1⋯x_n, so deg(f) = n. Example: Let f be the two-variable OR function: f(x,y) = 1 if at least one of x, y equals 1. The multilinear polynomial representing f is x + y − xy, so deg(f) = 2.
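The representing polynomial can be computed explicitly by Möbius inversion over the subset lattice. The sketch below is mine, not from the talk; helper names such as multilinear_coeffs are hypothetical. It recovers the coefficients of p and hence deg(f), and reproduces the AND and OR examples above.

```python
# Minimal sketch: compute the unique multilinear polynomial representing a
# Boolean function by Mobius inversion over subsets, then read off deg(f).
from itertools import combinations, chain

def subsets(s):
    """All subsets of the tuple s, smallest first."""
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def multilinear_coeffs(f, n):
    """Return {S: c_S} with p(x) = sum_S c_S * prod_{i in S} x_i representing f."""
    coeffs = {}
    for S in subsets(tuple(range(n))):
        c = 0
        for T in subsets(S):
            x = [1 if i in T else 0 for i in range(n)]
            c += (-1) ** (len(S) - len(T)) * f(x)
        if c != 0:
            coeffs[S] = c
    return coeffs

def degree(f, n):
    return max((len(S) for S in multilinear_coeffs(f, n)), default=0)

AND = lambda x: int(all(x))
OR = lambda x: int(any(x))

print(degree(AND, 4))             # 4, matching deg(AND_n) = n
print(multilinear_coeffs(OR, 2))  # {(0,): 1, (1,): 1, (0, 1): -1}, i.e. x + y - xy
```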

Sensitivity Definition: Let f: {0,1}^n → {0,1} be a Boolean function. On an input x, an n-bit binary string, the ith bit of x is said to be sensitive if f(x) ≠ f(x'), where x' is x with its ith bit flipped. The sensitivity of f, denoted s(f), is the maximum number of sensitive bits over all inputs. Example: Let f be the AND function. Then s(f) = n, because flipping any bit of the input 11…1 changes f's value from 1 to 0.
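For concreteness, here is a brute-force computation of s(f) for small n (my own sketch, not from the talk), which confirms the AND example.

```python
# Minimal sketch: compute s(f) by checking every input and every single-bit flip.
from itertools import product

def sensitivity(f, n):
    best = 0
    for x in product((0, 1), repeat=n):
        x = list(x)
        sens = 0
        for i in range(n):
            y = x.copy()
            y[i] ^= 1                  # flip the ith bit
            if f(x) != f(y):
                sens += 1
        best = max(best, sens)
    return best

AND = lambda x: int(all(x))
print(sensitivity(AND, 5))             # 5: every bit of 11111 is sensitive
```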

The Sensitivity Conjecture Conjecture: Let f be an arbitrary Boolean function. Then deg(f) ≤ poly(s(f)). Since it is known that s(f) ≤ poly(deg(f)), the conjecture states that multilinear polynomial degree and sensitivity are polynomially related, i.e., both deg(f) ≤ poly(s(f)) and s(f) ≤ poly(deg(f)) hold. If the conjecture is true, then sensitivity is also polynomially related to several other query complexity measures.

The GKS Game A reformulation of the sensitivity conjecture in terms of a cooperative two-player communication game. A polynomial lower bound on the cost of the game would imply a polynomial upper bound on multilinear polynomial degree in terms of sensitivity. Alice and Bob agree on a protocol in advance and try to pin down, as precisely as possible, the position of a bit that an adversary writes into a Boolean array.

The GKS Game (cont.) The game is played on a Boolean array of length n whose entries can be set to either 0 or 1. Alice is shown the locations of a permutation σ of {1,…,n} one at a time and writes either 0 or 1 at location σ(i), for i = 1,…,n−1. Eve then adversarially fills the last remaining location and sends the completed array to Bob. Bob returns a subset of locations, as small as possible, that is guaranteed to contain the location Eve filled. The cost of the game is the maximum size of the subset Bob returns, over all inputs.

An O(n^0.5) Upper Bound on the GKS Game Split the Boolean array into n^0.5 blocks of size n^0.5. When Alice edits a bit, she places a 1 if it is the last unedited bit in its block, and a 0 otherwise. The cost of this strategy is n^0.5: regardless of the bit Eve places, Bob returns a subset of size n^0.5 containing her location. (Slide illustration of a filled array: 0 0 1 0 1 0 0 0)
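Below is a minimal Python simulation of this block strategy, under my reading of the rules above (Alice sees σ one location at a time; n is assumed to be a perfect square). Bob's decoding rule and the function names (play_round, bob_decode) are mine, not from the talk: a block with no 1 must be Eve's block, and otherwise Eve's bit is one of the √n cells holding a 1.

```python
# Minimal simulation of the sqrt(n) block strategy for the GKS game.
import math
import random

def play_round(n, sigma, eve_value):
    """Alice fills cells sigma[0..n-2] per the block strategy; Eve fills sigma[n-1]."""
    b = math.isqrt(n)               # block size = number of blocks = sqrt(n)
    assert b * b == n
    arr = [None] * n
    remaining = [b] * b             # unfilled cells left in each block
    for i in sigma[:-1]:
        block = i // b
        remaining[block] -= 1
        # Write 1 only when completing a block; otherwise write 0.
        arr[i] = 1 if remaining[block] == 0 else 0
    eve_index = sigma[-1]           # Eve adversarially sets the last cell
    arr[eve_index] = eve_value
    return arr, eve_index, b

def bob_decode(arr, b):
    """Return a set of indices guaranteed to contain Eve's cell."""
    blocks = [arr[k * b:(k + 1) * b] for k in range(b)]
    # A block with no 1 must be Eve's block (Alice never completed it).
    for k, blk in enumerate(blocks):
        if sum(blk) == 0:
            return set(range(k * b, (k + 1) * b))
    # Otherwise every block has exactly one 1, and Eve's bit is one of them.
    return {i for i, v in enumerate(arr) if v == 1}

n = 64
for _ in range(1000):
    sigma = random.sample(range(n), n)
    for eve_value in (0, 1):
        arr, eve_index, b = play_round(n, sigma, eve_value)
        answer = bob_decode(arr, b)
        assert eve_index in answer and len(answer) <= math.isqrt(n)
print("sqrt(n) strategy verified on random permutations")
```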

An O(n^0.4732) Upper Bound on the GKS Game Main idea: split the Boolean array into blocks and place codewords in each block. Definition (Szegedy): Suppose the GKS game is played on a Boolean array of length n. A (k,n) strategy for the GKS game is a strategy such that, over all inputs, the size of the subset returned by Bob is at most k. Theorem (Szegedy): If (k,n) and (x,y) strategies exist, then there exists a (kx, ny) strategy.

An O(n^0.4732) Upper Bound on the GKS Game (cont.) Strategy: Split a Boolean array of length 30 into 5 blocks of length 6. Suppose Alice must edit the ith bit of a block. If it is the first bit to be edited in that block, write the ith bit of the string w_i. If it is neither the first nor the last bit to be edited in the block, then the bits placed so far are consistent with some string w_k; place the ith bit of w_k. If it is the last bit to be edited in the block and the bits placed so far are consistent with a string w_k, place 1−j, where j is the ith bit of w_k.

An O(n^0.4696) Upper Bound on the GKS Game Main idea: Split the Boolean array into blocks and place codewords in each block. However, instead of using only one bit to uniquely identify a Hamming codeword, use several bits. The problem reduces to finding a bipartite matching between subsets of {1,…,m} and Hamming codewords. A matching was found between subsets of size 4 and Hamming codewords of length 15.
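The transcript does not spell out the compatibility condition between size-4 subsets and codewords, so the sketch below (mine, not from the paper) only illustrates the ingredients: enumerating the 2^11 = 2048 codewords of the [15,11] Hamming code, listing the C(15,4) = 1365 subsets of size 4 of {1,…,15}, and a generic augmenting-path maximum bipartite matching routine. The `compatible` predicate is a placeholder to be replaced by the paper's actual condition.

```python
# Ingredients for a subset-to-codeword matching; the compatibility predicate
# below is a stand-in, NOT the condition used in the actual construction.
from itertools import combinations

def hamming_15_codewords():
    """All length-15 words with zero Hamming syndrome (2^11 = 2048 of them)."""
    words = []
    for w in range(1 << 15):
        syndrome = 0
        for pos in range(1, 16):
            if (w >> (pos - 1)) & 1:
                syndrome ^= pos
        if syndrome == 0:
            words.append(w)
    return words

def max_bipartite_matching(left, right, compatible):
    """Kuhn's augmenting-path algorithm; returns {left_item: right_index}."""
    match_right = {}                       # right index -> matched left item
    def try_augment(u, seen):
        for j, r in enumerate(right):
            if j in seen or not compatible(u, r):
                continue
            seen.add(j)
            if j not in match_right or try_augment(match_right[j], seen):
                match_right[j] = u
                return True
        return False
    for u in left:
        try_augment(u, set())
    return {v: j for j, v in match_right.items()}

codewords = hamming_15_codewords()
subsets4 = list(combinations(range(1, 16), 4))
print(len(codewords), len(subsets4))       # 2048 codewords vs. 1365 subsets

# Placeholder predicate: the codeword's support is exactly the chosen subset.
compatible = lambda S, w: w == sum(1 << (p - 1) for p in S)
matching = max_bipartite_matching(subsets4, codewords, compatible)
print(len(matching))                       # size of the matching under the placeholder
```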

An O(n^0.4696) Upper Bound on the GKS Game (cont.) Strategy: Split the Boolean array into 11 blocks of size 15. Suppose Alice must edit the ith bit of a block. If fewer than four bits have been edited in that block, place a 0. Otherwise, if it is not the last bit to be edited in the block, the bits placed so far correspond to a codeword C; place the ith bit of C. If it is the last bit to be edited in the block and the block corresponds to the codeword C, place 1−j, where j is the ith bit of C. Using Szegedy's composition theorem, repeatedly composing (11,165) strategies yields an upper bound of O(n^0.4696).
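Why composition gives these exponents: iterating Szegedy's theorem on a single (k, m) strategy gives a (k^t, m^t) strategy, so on arrays of length n = m^t the cost is k^t = n^(log k / log m). The quick check below uses the (11,165) strategy stated above; reading the earlier length-30, 5-block construction as a (5, 30) strategy is my assumption.

```python
# Cost exponent from repeatedly composing a single (k, m) strategy with itself.
import math

def composed_exponent(k, m):
    """Composing (k, m) t times gives (k^t, m^t), i.e. cost n^(log k / log m)."""
    return math.log(k) / math.log(m)

print(composed_exponent(11, 165))   # ~0.4696, matching the stated bound
print(composed_exponent(5, 30))     # ~0.4732 (assuming a (5, 30) strategy)
```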

An O(n^(1/3)) Upper Bound on the GKS Game Main idea: Split the Boolean array into n^(1/3) blocks of size n^(2/3) and place a codeword in each block. Information is encoded in the codewords by a method that guarantees all but n^(1/2) of the bits in a block cannot contain the bit Eve set. Each codeword is split into sections: a section that acts as a Hamming checksum for the entire codeword; three sections that encode the locations of the checksum bits within the first section; and sections filled with 0s that are known not to contain the last bit placed in the codeword.

Future Directions Given a strategy for the GKS game, create an explicit function f that realizes a separation between multilinear polynomial degree and sensitivity. Improve the O(n^(1/3)) upper bound on the cost of the GKS game. Prove a nontrivial lower bound on the cost of the GKS game.

References
Ambainis, A. (2017). Understanding Quantum Algorithms via Query Complexity. arXiv preprint arXiv:1712.06349.
Arora, S., & Barak, B. (2009). Computational Complexity: A Modern Approach. Cambridge University Press.
Nisan, N., & Szegedy, M. (1994). On the degree of Boolean functions as real polynomials. Computational Complexity, 4(4), 301-313.
Gilmer, J., Koucký, M., & Saks, M. (2015). A communication game related to the sensitivity conjecture. arXiv preprint arXiv:1511.07729.
Szegedy, M. (2015). An O(n^0.4732) upper bound on the complexity of the GKS communication game. arXiv preprint arXiv:1506.06456.
Ingram, D. (2017). An Upper Bound on the GKS Game via Max Bipartite Matching. arXiv preprint arXiv:1712.01149.
Brand, M. (2017, October). October 2017 solution. Using your Head is Permitted. Retrieved from https://www.brand.site.co.il/riddles/201710a.html