Isaac Newton Institute, 9th of May 2012. Søren Riis, Queen Mary University of London. Valiant’s Shift problem: a reduction to a problem about graph guessing games.

2 Outline of talk
1. Guessing Games and Graph Entropy*
2A. Linking Valiant’s shift conjecture to GG & GE
2B. Construction addressing a combinatorial conjecture by Valiant
3. Final remarks and a result from my visit to the Newton Institute (2012)
* Parts of this part of the talk were given in a lecture series at the University of Bielefeld (2008)

3 Description of the Guessing Game on K_n. Talk as part of the Isaac Newton program: Combinatorics and Statistical Mechanics (2008).

1. Guessing Games and Graph Entropy
Guessing Game (Riis 2005)
Requirements: A directed graph with n nodes (n players) and n fair dice with s sides.
Rules: The players/nodes can agree on a protocol in advance. Each player rolls their die. A player is typically NOT allowed to see his/her own die value. Each player is only allowed to see the dice values of the “incoming” nodes.
The play: The game is a cooperative game. Each player has to guess (possibly by following some protocol) the value of their own die. No communication between the players is allowed. The players have to make their guesses simultaneously (and independently).
Task: Maximise the probability that every player correctly guesses the value of their own die.
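To make the rules concrete, here is a minimal brute-force evaluator (a sketch, not taken from the slides): nodes are 0..n-1, dice values are 0..s-1, and a strategy is one deterministic guessing function per node mapping the visible in-neighbour values to a guess. The naive strategy below, in which everyone simply guesses 0, illustrates the baseline success probability (1/s)^n.

```python
# A minimal sketch of the game: nodes 0..n-1, dice values 0..s-1, and a
# "strategy" given as one guessing function per node that maps the visible
# in-neighbour values to a guess.  Success = every player is correct.
from itertools import product

def success_probability(in_nbrs, strategy, s):
    n = len(in_nbrs)
    wins = 0
    for dice in product(range(s), repeat=n):
        guesses = [strategy[j]({i: dice[i] for i in in_nbrs[j]})
                   for j in range(n)]
        wins += all(g == d for g, d in zip(guesses, dice))
    return wins / s ** n

# A naive strategy (everyone guesses 0) on K_3 with bi-directed edges:
# only the all-zeros assignment wins, giving probability (1/s)^3.
in_nbrs = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
naive = {j: (lambda seen: 0) for j in range(3)}
print(success_probability(in_nbrs, naive, s=6))   # 1/216 ≈ 0.00463
```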

1. Guessing Games and Graph Entropy
Question: What is the probability that every player correctly guesses the value of their own die?
Naïve (wrong) answer: Each player has no “relevant” information about their own die value. Each player has probability 1/s of guessing their own die value. The n dice values are independent, so the probability that all players guess correctly is (1/s)^n.

1. Guessing Games and Graph Entropy
For the complete graph on n nodes with bi-directed edges, the players have a guessing strategy where they all succeed with probability 1/s (independent of n). This guessing strategy is optimal. The strategy can be described globally: each player guesses under the assumption that the sum of all dice values is 0 modulo s.
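A brute-force check of this clique strategy (a sketch, under the convention that dice values are 0..s-1): on K_3 with s = 6, exactly the s^(n-1) assignments whose sum is 0 modulo s are guessed correctly by everyone, giving success probability 1/s.

```python
# Brute-force check of the clique strategy on K_3: every player guesses as
# if the sum of all n dice values were 0 modulo s.
from itertools import product

n, s = 3, 6
wins = 0
for dice in product(range(s), repeat=n):
    # player j sees everyone else and guesses -(sum of the others) mod s
    guesses = [(-sum(dice[i] for i in range(n) if i != j)) % s for j in range(n)]
    wins += list(dice) == guesses
print(wins / s ** n, 1 / s)   # both 1/6 ≈ 0.1667
```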

1. Guessing Games and Graph Entropy
For the complete graph on n nodes, uncoordinated random guessing succeeds with probability (1/s)^n, whereas the optimal guessing strategy succeeds with probability 1/s. Thus, by cooperating, the players succeed with a probability that is s^(n-1) times higher than under uncoordinated random guessing.

1. Guessing Games and Graph Entropy
The exponent n-1 in the factor s^(n-1) plays an important role in our theory.
Definition: The guessing number of a strategy is α if the probability that the players all succeed is s^α times higher than the probability of success under uncoordinated random guessing.

1. Guessing Games and Graph Entropy
The complete graph K_n with bi-directed edges has guessing number n-1 (independent of s), because we can do s^(n-1) times better than random guessing.

1. Guessing Games and Graph Entropy
Examples: guess(K_n) = n-1

1. Guessing Games and Graph Entropy
Example: The graph C_n, oriented as a loop, has guessing number 1 (independently of s = 2, 3, 4, …): guess(C_n) = 1

1. Guessing Games and Graph Entropy
Consider the following guessing strategy.
Global description: each player assumes that all dice values are the same.
Local description: each player guesses that their die value is the same as the die value they see.
This strategy succeeds with probability (1/s)^(n-1), i.e. it is a factor s = s^1 better than random guessing. Thus C_n has guessing number ≥ 1.
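A brute-force check of this copy strategy on the oriented cycle (a sketch): with n = 4 and s = 3, the only assignments on which every player is correct are the s constant ones, so the success probability is (1/s)^(n-1).

```python
# Brute-force check of the "copy what you see" strategy on the oriented
# cycle C_n: player j sees only player j-1 and repeats that value.
from itertools import product

n, s = 4, 3
wins = 0
for dice in product(range(s), repeat=n):
    guesses = [dice[(j - 1) % n] for j in range(n)]
    wins += list(dice) == guesses
print(wins / s ** n, (1 / s) ** (n - 1))   # both 1/27 ≈ 0.037
```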

1. Guessing Games and Graph Entropy
Are there any better guessing strategies for C_n (viewed as an oriented graph)? This type of question can be answered naturally by use of information theory.

1. Guessing Games and Graph Entropy
Let G be a directed graph and let A = {1, 2, …, s} be an alphabet with s ≥ 2 letters. Consider any fixed guessing strategy agreed by the players. Define a code C that consists exactly of the words that correspond to dice assignments where all players are correct. Assume that C contains m words.

1. Guessing Games and Graph Entropy
The entropy of C is log_s(m) if we assume that each word w_1 w_2 … w_n in C has the same probability (namely 1/m). Notice that we calculate the entropy in base s. Let x_1, x_2, …, x_n be stochastic variables, where x_j is the value of the j-th letter w_j of the chosen word. Then H(C) = log_s(m) and H(x_j) ≤ 1. Crucially, we also have the information constraints determined by the graph G: for each node j we have the condition H(x_j | x_{i_1}, x_{i_2}, …, x_{i_d}) = 0, where i_1, i_2, …, i_d are all the incoming nodes of j. To simplify the notation we write this as H(j | i_1, i_2, …, i_d) = 0.

1. Guessing Games and Graph Entropy
Now let us return to the graph C_n (oriented):
H(1, 2, …, n) = H(1, 2, …, n-1) = H(1, 2, …, n-2) = … = H(1) ≤ 1.
This shows that each guessing strategy is successful on at most s = s^1 of the s^n words of length n. Thus the entropy calculation shows that the guessing number of C_n (oriented) is at most 1.

1. Guessing Games and Graph Entropy
Non-trivial example: Consider the pentagon C_5 with bi-directed edges. Let A be an alphabet with s letters. Assume that s is a square number, i.e. that s = t^2 for some t = 2, 3, 4, …

1. Guessing Games and Graph Entropy
W.l.o.g. we can assume that each of the 5 players has been assigned two dice (each with t sides): one labeled “l” for left and one labeled “r” for right.

1. Guessing Games and Graph Entropy
Guessing strategy for C_5. Global description: each “left” die has the same value as its corresponding “right” die. This strategy succeeds with probability (1/t)^5 = (1/s)^2.5.
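A brute-force check of this paired-dice strategy (a sketch; the pairing convention below, in which player i's right die is matched with player i+1's left die, is one concrete choice of the correspondence, which the slides leave implicit): with t = 2, all five players are correct on exactly t^5 of the (t^2)^5 assignments, i.e. with probability (1/t)^5 = (1/s)^2.5.

```python
# Brute-force check of the "paired dice" strategy on the bi-directed
# pentagon C_5 with alphabet size s = t^2 (here t = 2).  Assumed pairing:
# player i's right die should equal player i+1's left die.
from itertools import product

t = 2
n = 5

def guesses(vals):
    # vals[i] = (l_i, r_i); player i sees players i-1 and i+1.
    out = []
    for i in range(n):
        left_nb  = vals[(i - 1) % n]   # (l_{i-1}, r_{i-1})
        right_nb = vals[(i + 1) % n]   # (l_{i+1}, r_{i+1})
        # assume r_j == l_{j+1} for every j, hence guess (r_{i-1}, l_{i+1})
        out.append((left_nb[1], right_nb[0]))
    return out

wins = total = 0
for vals in product(product(range(t), repeat=2), repeat=n):
    total += 1
    if guesses(list(vals)) == list(vals):
        wins += 1

print(wins, total, wins / total)        # 32 1024 0.03125
print((1 / t) ** 5, (1 / t**2) ** 2.5)  # both 0.03125
```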

1. Guessing Games and Graph Entropy
This shows that the pentagon has guessing number at least 2.5 if s = 4, 9, 16, 25, … is a square number. Using information theory (Shannon information inequalities) we now show that any guessing strategy leads to a code C of entropy ≤ 2.5.

1. Guessing Games and Graph Entropy
Let H be the entropy function corresponding to the code C. In general H satisfies H(X,Y,Z) + H(Z) ≤ H(X,Z) + H(Y,Z). If we let X = {1}, Y = {3} and Z = {4,5} we get:
H(1,2,3,4,5) + H(4,5) = H(1,3,4,5) + H(4,5) [since H(2 | 1,3) = 0] ≤ H(1,4,5) + H(3,4,5) = H(1,4) + H(3,5) [since H(5 | 1,4) = 0 and H(4 | 3,5) = 0] ≤ H(1) + H(3) + H(4) + H(5),
and thus H(1,2,3,4,5) ≤ H(1) + H(3) + H(4) + H(5) − H(4,5).
Next notice that H(1,2,3,4,5) − H(2,5) = H(2,3,4,5) − H(2,5) = H(3,4 | 2,5) = H(4 | 2,5) ≤ H(4 | 5) = H(4,5) − H(5), which shows that H(1,2,3,4,5) ≤ H(2,5) + H(4,5) − H(5) ≤ H(4,5) + H(2).
Adding up: 2·H(1,2,3,4,5) ≤ H(1) + H(2) + H(3) + H(4) + H(5) ≤ 5, so H(1,2,3,4,5) ≤ 2.5.

1. Guessing Games and Graph Entropy
Proposition: The pentagon has guessing number log_2(5) = 2.32… when s = 2 (A = {head, tail}).
An optimal guessing strategy: the players assume that no 3 consecutive players have been assigned tails, and no 2 consecutive players have been assigned heads.
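A quick count of the words consistent with this assumption (a sketch; heads = 1, tails = 0). The strategy succeeds exactly on these words, and by the earlier formula H(C) = log_s(m) the guessing number it achieves is log_2 of the count.

```python
# Count the length-5 cyclic binary words with no 2 consecutive heads and
# no 3 consecutive tails; the pentagon strategy succeeds exactly on these.
from itertools import product
from math import log2

def ok(w, n=5):
    for i in range(n):
        if w[i] == 1 and w[(i + 1) % n] == 1:                           # two heads in a row
            return False
        if w[i] == 0 and w[(i + 1) % n] == 0 and w[(i + 2) % n] == 0:   # three tails in a row
            return False
    return True

words = [w for w in product((0, 1), repeat=5) if ok(w)]
print(len(words), log2(len(words)))   # 5  2.3219...
```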

1. Guessing Games and Graph Entropy
Theorem (Riis 2007): The graph C_n (with bi-directed edges) has entropy n/2 for n = 2, 4, 5, 6, 7, … For n = 3 there is an exception, as C_3 has entropy 2.

1. Guessing Games and Graph Entropy (in a new network coding sense)
The entropy calculation in the previous section used Shannon’s information inequalities to derive an upper bound on the guessing number.
Definition: The pseudo entropy* (“Shannon entropy” in some papers) of a graph is the minimal upper bound on the guessing number that can be derived from Shannon’s information inequalities: H(X,Y,Z) + H(Z) ≤ H(X,Z) + H(Y,Z) and H(Ø) = 0.
* Pseudo entropy is a better term, as there are non-Shannon information inequalities.
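The pseudo entropy can be phrased as a linear programme: maximise H of the full vertex set over all set functions satisfying monotonicity, submodularity, H(singleton) ≤ 1, and the constraints H(j | in-neighbours of j) = 0 imposed by the graph. The sketch below (assuming SciPy is available) runs this LP on the bi-directed pentagon and recovers the bound 2.5 derived above.

```python
# Rough sketch: compute the pseudo entropy of a directed graph as an LP
# over all nonempty vertex subsets, here for the bi-directed pentagon.
from itertools import combinations
from scipy.optimize import linprog

n = 5
in_nbrs = {j: {(j - 1) % n, (j + 1) % n} for j in range(n)}   # pentagon

subsets = [frozenset(c) for k in range(1, n + 1)
           for c in combinations(range(n), k)]
idx = {S: i for i, S in enumerate(subsets)}
def H(S):                      # column index of variable H(S); H(empty)=0
    return idx[frozenset(S)]
def row():
    return [0.0] * len(subsets)

A_ub, b_ub, A_eq, b_eq = [], [], [], []

# Monotonicity: H(S) <= H(S ∪ {i})
for S in subsets:
    for i in range(n):
        if i not in S:
            r = row(); r[H(S)] += 1; r[H(S | {i})] -= 1
            A_ub.append(r); b_ub.append(0.0)

# Submodularity: H(S ∪ T) + H(S ∩ T) <= H(S) + H(T)
for S, T in combinations(subsets, 2):
    r = row()
    r[H(S | T)] += 1
    if S & T:
        r[H(S & T)] += 1
    r[H(S)] -= 1; r[H(T)] -= 1
    A_ub.append(r); b_ub.append(0.0)

# Each node carries one s-ary symbol: H({i}) <= 1
for i in range(n):
    r = row(); r[H({i})] = 1
    A_ub.append(r); b_ub.append(1.0)

# Graph constraints: H(in(j) ∪ {j}) = H(in(j))
for j in range(n):
    r = row(); r[H(in_nbrs[j] | {j})] += 1; r[H(in_nbrs[j])] -= 1
    A_eq.append(r); b_eq.append(0.0)

c = row(); c[H(set(range(n)))] = -1.0        # maximise H(full vertex set)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
print(-res.fun)                               # 2.5 for the pentagon
```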

1. Guessing Games and Graph Entropy
Definition: The entropy E(G, s) of a graph G over an alphabet A of size s = 2, 3, 4, … is the supremum of H(1, 2, …, n) over all entropy functions H over A that satisfy the information constraints* determined by G.
* The information constraints are the same constraints we used for calculating the pseudo entropy of G.

1. Guessing Games and Graph Entropy
The pseudo entropy seems to approximate the guessing number on small graphs extremely well: in fact ALL directed graphs with fewer than 10 nodes that we investigated by computer have identical pseudo entropy and guessing number (Sun Yun and Emil Vaughan).
Theorem (Riis 2007): The graph entropy* H(G) of a directed graph satisfies H(G) = lim_{s→∞} Guess(G, s).
* Not to be confused with Körner’s notion.

27 1. Guessing Games and Graph Entropy
So far, all of this was known in 2007. Since then I have worked on these problems with two PhD students*, postdoctoral researchers** and other collaborators***, as part of an EPSRC-funded project, Information Flows and Information Bottlenecks.
* Sun Yun and Taoyang Wu
** Maximilien Gadouleau, Antonia Katzouraki, Emil Vaughan and Rahil Baber
*** Peter Cameron, Peter Keevash and Demetres Christofides

28 1. Guessing Games and Graph Entropy
Each perfect graph* has guessing number given by its maximal clique cover (D. Christofides and K. Markström, 2011).
Open Questions:
Is the guessing number of an undirected graph always given by its fractional clique cover?
Given a directed graph G, do G and its “reverse” have the same pseudo entropy?
Do a directed graph G and its reverse have the same entropy?
* Undirected edges are viewed as bi-directed edges.

29 2A. Linking Valiant’s shift conjecture to GG & GE
Simultaneous lower bounds on size and depth. Problem stated in the mid-70s: find an explicit family of Boolean functions B_n that cannot be computed by circuits of size O(n) and depth O(log(n)).

30 2A. Linking Valiant’s shift conjecture to GG & GE
Shift problem: the task is to compute the Boolean function {0,1}^(n+log(n)) → {0,1}^n which, given a binary input string (x_1, x_2, ..., x_n), shifts it cyclically by j places (j = 0, 1, 2, ..., n-1).
Valiant (1977): Can the shift function be computed in size O(n) and depth O(log(n))? This problem is still open.
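For reference, the function itself is trivial to compute directly (a sketch; the difficulty lies entirely in realising it with O(n)-size, O(log(n))-depth circuits, which is the open question):

```python
# The cyclic-shift function: n data bits plus roughly log2(n) control bits
# selecting the shift amount j.
def cyclic_shift(x, j):
    """Shift the bit tuple x cyclically by j places."""
    n = len(x)
    j %= n
    return x[-j:] + x[:-j] if j else x

print(cyclic_shift((1, 0, 0, 0, 1, 1), 2))   # (1, 1, 1, 0, 0, 0)
```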

31 2A. Linking Valiant’s shift conjecture to GG & GE
In 1992 Valiant stated a combinatorial conjecture which would give such lower bounds on “shifting” and “sorting”.
Valiant’s Combinatorial Conjecture: If G is a bipartite graph with O(n^(1+ε)) edges, then there is a cyclic shift σ such that G does not realise σ with n/2 common bits.

32 2A. Linking Valiant’s shift conjecture to GG & GE
We translate Valiant’s combinatorial conjecture as follows. Given a directed graph G whose nodes are labelled 1, 2, ..., n, we define the shifted graphs G_0, G_1, G_2, ..., G_{n-1}: the graph G_σ is obtained by maintaining the tail of each arc but shifting the head of each arc by σ (σ = 0, 1, 2, ..., n-1).
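A minimal sketch of this operation (node labels 0..n-1 for convenience; the edge set below is an arbitrary illustrative example, not a graph from the slides):

```python
# Shifted graphs G_sigma: keep each arc's tail, move its head by sigma mod n.
def shifted_graph(edges, n, sigma):
    return {(tail, (head + sigma) % n) for (tail, head) in edges}

edges = {(0, 1), (2, 4), (3, 0)}                 # arbitrary example on 5 nodes
print(sorted(shifted_graph(edges, 5, 2)))        # [(0, 3), (2, 1), (3, 2)]
```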

33 2A. Linking Valiant’s shift conjecture to GG & GE
Reformulation of Valiant’s combinatorial conjecture*: Given a directed graph G on n nodes with at most n^(1+ε) edges, there is a shift σ such that G_σ has guessing number < n/2.
Bad news: the conjecture (and the version stated by Valiant) is not quite correct (Riis 2007).
Good news: the conjecture can be modified, which leaves the viability of the common-information approach to proving lower bounds in circuit complexity open.
* Not entirely equivalent to Valiant’s conjecture.

34 2B. Construction addressing a combinatorial conjecture by Valiant
For n, d, r ∈ N with d < n/2 and 0 ≤ r ≤ n − 1, let G_{n,d,r} denote the graph with vertex set {0, 1, 2, ..., n − 1} and edge set E = E_{n,d,r} consisting of all edges (i, j) with i + j ∈ {r, r+1, r+2, ..., r+d−1} (sums taken modulo n).
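A minimal sketch of the construction (the check at the end is just a sanity check that every vertex gets exactly d out-neighbours):

```python
# The G_{n,d,r} construction: an arc (i, j) whenever i + j lies in the
# window {r, r+1, ..., r+d-1} modulo n.
def G(n, d, r):
    window = {(r + k) % n for k in range(d)}
    return {(i, j) for i in range(n) for j in range(n)
            if (i + j) % n in window}

g = G(9, 3, 1)               # the G_{9,3,1} shown on the next slide
print(len(g))                # 27 arcs
print(all(sum(1 for (i, j) in g if i == v) == 3 for v in range(9)))   # True
```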

35 2B. Construction addressing a combinatorial conjecture by Valiant
(Figure: G_{9,3,1} and its 8 shifts.)

36 2B. Construction addressing a combinatorial conjecture by Valiant
Each of the 23 shifts of G_{23,5,1} produces a graph with guessing number at least 14 (rather than 11). The construction can be converted back to a bipartite graph in which all shifts are realised with 9 common bits (rather than the 12 that would follow from a version of Valiant’s conjecture).

37 2B. Construction addressing a combinatorial conjecture by Valiant
G_{n,n^ε,1} has guessing number (n + n^ε)/2, and converting back to Valiant’s combinatorial problem we get:
Riis (2007): For each 0 < ε < 1 there is a bipartite graph G with O(n^(1+ε)) edges such that G realises every cyclic shift σ with less than (n − n^ε)/2 common bits.

38 2B. Construction addressing a combinatorial conjecture by Valiant
The viability of Valiant’s common-information approach to proving lower bounds in circuit complexity remains open:
Conjecture: Given ε > 0, there exists δ > 0 such that for each directed graph G on n nodes with at most n^(1+ε) edges there is a shift σ such that G_σ has guessing number less than (1−δ)n.
Conjecture: Given ε > 0, there exists δ > 0 such that each directed graph G on n nodes with at most n^(1+ε) edges and without self-loops or bi-directed edges has guessing number at most (1−δ)n.
Each of these conjectures implies that the shift problem cannot be computed by circuits of size O(n) and depth O(log(n)).

39 3. Final remarks and a result from my visit to the Newton Institute
There is an interesting link between communication networks (network coding) and circuits.
Example: The matrix transposition problem can be viewed as a reverse engineering problem in communication networks.
Example: The multiuser multiple unicast problem can be restated as the problem of determining if a given acyclic graph with n inputs and n corresponding outputs can compute the identity function.

40 3. Final remarks and a result from my visit to the Newton Institute
Let t_1 = s_1, t_2 = s_2, ..., t_k = s_k be k term equations in variables x_1, x_2, ..., x_n. We are interested in determining the maximal number of solutions over finite universal algebras of size s, in which the function symbols occurring in the terms are interpreted.

41 3. Final remarks and a result from my visit to the Newton Institute
Theorem (Riis, INI): The maximal number of solutions to a collection of term equations is, up to a constant factor (independent of s), given by s^Guess(G,s), where s is the size of the universal algebra in which the terms are interpreted and G is a directed graph that can be constructed by purely syntactical means from the term equations.

42 Thank you!