Derandomizing LOGSPACE Based on a paper by Russell Impagliazzo, Noam Nisan and Avi Wigderson Presented by Amir Rosenfeld.



Derandomization & Pseudorandomness Pseudorandomness is about understanding the minimum amount of randomness actually required by a probabilistic model of computation. A pseudorandom generator takes m << n truly random bits and deterministically stretches them into n pseudorandom bits. A pseudorandom generator is said to "fool" a computational model when the n truly random bits used by the model can be replaced by the n pseudorandom bits produced by the generator without a significant difference in the model's behavior. If we can "fool" a probabilistic algorithm using only m << n truly random bits, then we can derandomize it by trying out all 2^m possible seeds.
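As a toy illustration of this last point, here is a sketch of seed enumeration in Python. Both the generator and the probabilistic test are stand-ins invented for the example (a real instantiation would use the generator built later in these slides):

```python
def toy_generator(seed, m, n):
    # Toy stretch of an m-bit seed into n bits -- NOT a real pseudorandom
    # generator, it just cycles the seed bits; it merely stands in for the
    # generator constructed later in these slides.
    bits = [(seed >> i) & 1 for i in range(m)]
    return [bits[i % m] for i in range(n)]

def prob_algorithm(x, coins):
    # Stand-in probabilistic test (hypothetical): accepts iff the coin
    # flips disagree with the input bits in more than half the positions.
    return 2 * sum(a ^ b for a, b in zip(x, coins)) > len(coins)

def derandomized(x, m, n):
    # Deterministic simulation: run the algorithm on generator(seed) for
    # every one of the 2**m seeds and take the majority vote.
    votes = sum(prob_algorithm(x, toy_generator(s, m, n)) for s in range(2 ** m))
    return 2 * votes > 2 ** m
```

The point is only the shape of the argument: 2^m deterministic runs replace one randomized run, so a small seed length m makes full enumeration feasible.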

Relevant Computational Models The generator described by INW fools every computational model that can be described as a network of probabilistic processors. The number of truly random bits (m) required by the generator depends on the communication bandwidth of the algorithm run by the network. No assumptions are made about the computational power of the processors.

Communication Network [Figure: a network of processors v1, …, v5.] Each processor receives part of the input. Each processor requires a preset number of random coin flips. Each processor sends and receives at most c bits of information. Each processor calculates some function of its input, random bits and received information.

Some Intuition In a network algorithm that uses probabilistic processors, we can reuse the same random bits on many processors, provided their communication is limited enough that a processor cannot learn much about the random bits of the others. [Figure: Alice and Bob each require r random bits and exchange c bits of communication. From Bob's point of view, the entropy of Alice's bits is still r − c, so Bob can reuse most of the entropy in the bits given to Alice.]

The Basic Two-Party Model Yao (1979) described a communication complexity model of computation in which two parties are required to compute a function. The communication complexity is the minimum number of bits that must be communicated between the parties until one of them outputs the answer.

Defining Protocol The network algorithm uses a specific protocol to communicate between the parties. We call a protocol "normal" if the total amount of information sent/received by a party equals the total number of bits that it sent/received (i.e., the lengths and timing of messages are known in advance). A c-protocol is a normal protocol in which, on any input to the network and any random choices, every party sends and receives at most c bits of communication.

Definition of the Basic Generator

The Basic Generator Fix an expander graph H=(V,E) with 2^r vertices and degree D=2^d. The input is the name of a random directed edge in E; it therefore requires m = r + d random bits. The output is the two vertices on that edge; thus the generator produces two r-bit output strings.
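A minimal sketch of this basic generator in Python. The circulant "neighbor" map below is only a placeholder with the right shape; a real instantiation needs an explicit expander with small second eigenvalue λ:

```python
import random

def make_regular_graph(num_vertices, degree, seed=0):
    # Placeholder D-regular graph: neighbor i of v is v + shifts[i] (mod |V|).
    # A circulant graph like this is NOT a guaranteed expander; it only
    # provides a neighbor function of the right shape for the sketch.
    shifts = random.Random(seed).sample(range(1, num_vertices), degree)
    return lambda v, i: (v + shifts[i]) % num_vertices

def basic_generator(edge_name, r, d, neighbor):
    # The m = r + d input bits name a random directed edge: the first r
    # bits pick a vertex v, the last d bits pick an outgoing edge index.
    # The output is the edge's two endpoints -- two r-bit strings.
    v = edge_name >> d
    i = edge_name & ((1 << d) - 1)
    return v, neighbor(v, i)
```

For r = 4 and d = 2, the generator stretches 6 truly random bits into 8 pseudorandom bits: a small gain here, but the recursion later in the slides compounds it.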

What does it Fool? Theorem 1: g is a c-generator, i.e. it fools 2-party c-protocols, for c = (d − log λ)/2, where λ is the second largest eigenvalue of H. Recall that g is built from an expander graph H of degree D = 2^d.

Proof of Theorem 1 For every graph H=(V,E) of degree D with second largest eigenvalue λ, and for every S, T ⊆ V, the following inequality holds: | e(S,T) − D·|S|·|T| / |V| | ≤ λ·√(|S|·|T|), where e(S,T) is the number of edges between S and T. This is the Mixing Lemma that was presented earlier in the course.

Proof of Theorem 1 (cont.)

Extractor – An Alternative View The same construction can be achieved with an extractor: [Figure: the m-bit seed is split into r bits and d auxiliary bits used by the extractor, which outputs E(r, d). Because communication is bounded, r still contains a lot of entropy from the other party's point of view.]

Expanding the Model The communication network is a graph H=(V,E) whose nodes are the parties/processors and whose directed edges represent communication lines between them. Each processor has unlimited computational power and may use any input information and any communicated information it receives. We are concerned with network algorithms that use c-protocols.

Partition Trees A partition tree T of a graph H=(V,E) is a rooted binary tree with a one-to-one, onto mapping of V to the leaves of T. T is called balanced if the depth of T is O(log |V|). Every internal node ν of T partitions V into three sets A_ν, B_ν and C_ν: the vertices of V residing in the leaves of the left child of ν, of the right child of ν, and in the remaining leaves of T, respectively.

A Partition Tree [Figure: a partition tree with internal nodes T1–T4 and leaves v1, …, v5.]


Partition Trees (cont.) cut(ν) is the subset of E consisting of the edges that connect vertices in two different sets. The width of ν is the smallest number of vertices that cover all of the edges in cut(ν). The width of a tree is the maximal width over its internal nodes. The width of a graph H is the smallest width of a balanced partition tree for H.
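These definitions are easy to check by brute force on the small examples in these slides. A sketch (the vertex cover is found by exhaustive search, so this is only suitable for tiny graphs):

```python
from itertools import combinations

def cut(edges, part_of):
    # cut(nu): the edges whose endpoints lie in two different sets
    # (A_nu, B_nu or C_nu) of the partition induced by node nu.
    return [(u, v) for (u, v) in edges if part_of[u] != part_of[v]]

def node_width(edges, part_of, vertices):
    # Width of nu: the smallest number of vertices covering every edge
    # of cut(nu), found by exhaustive search over all vertex subsets.
    cut_edges = cut(edges, part_of)
    for k in range(len(vertices) + 1):
        for cover in combinations(vertices, k):
            chosen = set(cover)
            if all(u in chosen or v in chosen for (u, v) in cut_edges):
                return k
```

On the 5-processor line graph used later in the slides, any node splitting a prefix {v1, v2} from the suffix {v3, v4, v5} cuts a single edge, so its width is 1.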

A Partition Tree [Figure: the same partition tree over v1, …, v5; here cut(T2) = 4.]

k-measurement A k-measurement M on a protocol is a k-bit function where each of the bits can be computed, at the end of the protocol, by at least one single processor. [Figure: processors 1 through n jointly produce the entire k-bit measurement.]

The Required Generator

Constructing the Generator

Constructing the Generator (2)

The Generator (3) [Figure: the truly random seed x is fed through the generator and distributed to processors v1, …, v5.]

What can the Generator Do? Main Theorem:

Random LOGSPACE The class of problems decidable by a Turing machine with the following characteristics: an input tape of length n, a work tape of length O(log n), and a random-bits tape.

Non-Uniform Machine The Turing machine just described belongs to a uniform model of computation. We can build a non-uniform machine by creating a specific hardwired machine for every input. This machine has the random-bit tape as its only input.

Non-Uniform Machine (2) We are now left with an OBDD that must accept or reject according to the random tape. [Figure: an OBDD reading the random-bit tape, of width at most poly(n) resulting from the LOGSPACE bound, ending in Accept/Reject states.] The OBDD implies read-once access to the random bits, but this is only a simplification for the purpose of explanation.

The Communication Network Reducing the OBDD to a network and protocol:  Each random-bit cell is a processor in the network.  Whenever the head moves from cell A_i to its neighbor A_{i+1}, the entire state of the machine is sent from processor i to processor i+1.
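The reduction above can be sketched as folding the machine's transition function along the line of processors; the transition used here is a made-up toy, standing in for the actual O(log n)-bit LOGSPACE state:

```python
def run_line_protocol(random_bits, step, init_state):
    # Processor i owns random bit i.  It receives the machine state from
    # processor i-1, applies the transition on its own bit, and forwards
    # the updated state to processor i+1 -- each message is one state,
    # i.e. O(log n) bits.
    state = init_state
    for i, bit in enumerate(random_bits):
        state = step(state, bit, i)
    return state

def toy_step(state, bit, i):
    # Hypothetical transition: count the random 1s seen so far, mod 8.
    return (state + bit) % 8
```

Only the communication pattern matters for the argument: the whole computation becomes a chain of small messages between neighboring processors.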

The Resulting Line Protocol [Figure: processors P1, P2, …, Pr in a line; at each random-cell transition a processor sends the O(log n)-bit system state to its neighbor.]

Tree Width of the Line Protocol [Figure: a balanced partition tree over the line of processors P1, …, P5; every cut has constant width.]

What do we need to fool? A processor sends the state at most a constant number of times; thus this is an O(S)-protocol, where S = O(log n) is the size of the state. The tree-width of the network is O(1). The k-measurement is actually a 1-measurement: Accept or Reject by the last processor.

Derandomizing LOGSPACE & Bounded Read-Multiplicity Machines The total state of the machine is held in O(log n) bits. Therefore, our generator requires only O(log² n) random bits, allowing us to derandomize the algorithm by enumerating n^{O(log n)} input strings.
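The O(log² n) seed length comes from the recursive structure of the generator: each level doubles the number of output blocks while adding only d seed bits, and O(log n) levels suffice. A sketch, with a circulant stand-in for the per-level expanders:

```python
def inw_generator(level, seed, r, d, neighbor):
    # A level-k seed has r + k*d bits.  Its last d bits name an edge out
    # of the remaining bits, viewed as a vertex (a level-(k-1) seed) of
    # an expander over all level-(k-1) seeds; both endpoints are expanded
    # recursively, so output length doubles per level.  With r, d =
    # O(log n) and log n levels, the total seed is O(log^2 n) bits.
    if level == 0:
        return [seed]                     # a single r-bit output block
    i = seed & ((1 << d) - 1)             # edge index: d bits
    x = seed >> d                         # one endpoint: level-(k-1) seed
    y = neighbor(level, x, i)             # the other endpoint
    return (inw_generator(level - 1, x, r, d, neighbor)
            + inw_generator(level - 1, y, r, d, neighbor))

def toy_neighbor(level, x, i):
    # Placeholder neighbor map -- NOT an expander, just the right shape.
    return (x + i + 1) % (1 << 10)
```

With r = 3 and d = 2, a level-2 seed of 7 bits already yields 4 output blocks; in general, level k produces 2^k blocks from r + k·d seed bits.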

Proof (for LOGSPACE Machines)

Proof (cont.) Induction base:  For a leaf, the distributions are identical. For the induction step we create hybrid distributions and prove that their combined distance from the truly random distribution meets the goal.

The Hybrid Distributions [Figure: a sequence of hybrid distributions, starting from all truly random bits (R…R), passing through a hybrid whose right half is generated (R…R G…G), and ending with fully generated bits (G…G).] By the induction hypothesis, and averaging over the possible values of the right side.
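The induction step is the standard hybrid/triangle-inequality argument; in symbols (notation mine, not taken from the slides), writing R, R' for truly random halves, G, G' for generated halves, and M for the measurement:

```latex
\bigl|\Pr_{R,R'}[M=1] - \Pr_{G,G'}[M=1]\bigr|
\;\le\;
\bigl|\Pr_{R,R'}[M=1] - \Pr_{R,G'}[M=1]\bigr|
\;+\;
\bigl|\Pr_{R,G'}[M=1] - \Pr_{G,G'}[M=1]\bigr|
```

Each term on the right is bounded by the induction hypothesis on one half, after averaging over the fixed values of the other half.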

Summary We showed a generator that can fool randomized network algorithms. We showed a reduction of LOGSPACE machines to a network algorithm. We proved that the generator works for the networks that result from that reduction. This proves that we can derandomize LOGSPACE using on the order of log² n random bits.

The END