Randomness and Computation
Oded Goldreich
Dept. of Computer Science and Applied Math., Weizmann Institute
Slides for MINERVA presentation
MINERVA MINI-SYMPOSIA (Natural Sciences) – 7/10/13, Session IB
The Interplay of Randomness and Computation
Odd: It seems that computation (as a deterministic process) stands in contrast to the notion of randomness. But if you consider the output of a deterministic process on a randomly distributed input, then… Or an algorithm that tosses coins:
- Probabilistic Proof Systems: e.g., IP, ZK, and PCP.
- Pseudorandomness and De-randomization.
- Property Testing and Sub-linear Time Algorithms.
Comment: PPS and PRG are related to the Foundations of Cryptography; in particular, ZK and PRG are closely related to Cryptography.
Probabilistic Proof Systems
Odd: Don't we want certainty in proofs? Well, we'll get certainty up to an explicitly bounded error probability, and this will allow us advantages over the traditional notion of a (deterministically verifiable) proof.
- IP (interactive proof): convincing via interaction.
- ZK (zero-knowledge): without revealing anything (beyond the validity of the claim).
- PCP (probabilistically checkable proof): verifiable while reading only three bits.
PCP: a trade-off between the number of bits read and the confidence parameter! (A toy 3-query test is sketched below.)
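To illustrate the flavor of verification that reads only a few bits, here is a minimal Python sketch (not from the talk; names and parameters are illustrative) of the classic BLR linearity test, a 3-query check used inside known PCP constructions. Each round reads only three values of the alleged proof, and running more rounds trades extra bits read for higher confidence.

```python
import random

def blr_linearity_test(f, n, num_rounds):
    """BLR linearity test: a classic 3-query check used inside PCP constructions.
    f gives oracle access to a function from n-bit strings to bits (viewed here
    as a huge alleged proof); each round reads only three of its values.
    A toy sketch for illustration, not a full PCP verifier."""
    for _ in range(num_rounds):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        # Three queries per round: f(x), f(y), f(x XOR y).
        if f(x) ^ f(y) != f(x ^ y):
            return False   # caught evidence of non-linearity -> reject
    return True            # all rounds passed -> accept (with bounded error)

# A genuinely linear function (parity of a fixed subset of bits) always passes;
# functions far from linear are rejected with probability growing in num_rounds.
subset_mask = 0b1011
linear_f = lambda x: bin(x & subset_mask).count("1") % 2
print(blr_linearity_test(linear_f, n=8, num_rounds=20))   # True
```

The point of the sketch is only the trade-off named on the slide: reading 3*num_rounds bits buys error probability that decays with num_rounds.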
Pseudorandomness (and PRGs)
What does this mean (in CS)? A deterministic process that stretches short random seeds into (much) longer sequences that "look random".
What does "look random" mean? See the 1st parameter (below).
PRG is a "family name"; members differ by parameters:
(1) Computational indistinguishability w.r.t. a specific class of observers (defined by their resources).
(2) Computational complexity of generation (i.e., of the PRG).
(3) The amount of stretch.
Re (1): e.g., all efficient observers, all small-space observers, all linear tests, all tests that look at five bits. (A toy sketch follows below.)
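As an illustration of these parameters, here is a minimal Python sketch of a pairwise-independent generator (my own toy choice, not a construction from the talk): it "looks random" to every observer that inspects at most two output positions, and it stretches a 2-element seed into p output elements.

```python
import random

# Toy PRG sketch: seed (a, b) in Z_P x Z_P is stretched into the P values
# x_i = a*i + b mod P.  For any two fixed positions i != j, the pair
# (x_i, x_j) is uniformly distributed when the seed is random, so the output
# fools every test that reads at most two positions -- illustrating how the
# observer class (parameter 1) and the stretch (parameter 3) are separate knobs.

P = 101  # a small prime; the output lives in Z_P

def prg(seed_a, seed_b):
    """Stretch the 2-element seed (a, b) into P pseudorandom values."""
    return [(seed_a * i + seed_b) % P for i in range(P)]

# Usage: a short random seed yields a much longer sequence.
a, b = random.randrange(P), random.randrange(P)
output = prg(a, b)
print(len(output), output[:10])
# Note: a test reading three or more positions can easily distinguish this
# output from truly random values, so the guarantee is only w.r.t. the stated
# limited class of observers.
```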
Property Testing (sublinear-time approximate decision)
What is an approximate decision? Distinguishing objects that have a (predetermined) property from objects that are "far" from having this property.
Advantage: the algorithm may inspect only a small portion of the tested object! That's typically the goal in PT.
An example you all know: estimating the average value of a function defined over a huge domain (sketched below).
This simple example refers to "unstructured" objects and properties. Our focus is on "structured" ones, e.g., testing that a graph is bipartite or cycle-free.
Distinctions: one-sided error vs. two-sided error; adaptive vs. non-adaptive.
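A minimal Python sketch of the example just mentioned: estimating the average of a function over a huge domain by sampling. The function, domain size, and sample count are illustrative, not taken from the talk.

```python
import random

def estimate_average(f, domain_size, num_samples):
    """Estimate the average of f over {0, ..., domain_size-1} by sampling.
    Only num_samples points of a possibly huge domain are inspected; by
    standard concentration bounds, the estimate is close to the true average
    with high probability."""
    total = 0.0
    for _ in range(num_samples):
        x = random.randrange(domain_size)   # a uniformly random domain point
        total += f(x)
    return total / num_samples

# Usage: the domain has 2**40 points, but f is queried only 400 times.
f = lambda x: x % 2          # toy 0/1-valued function whose true average is 0.5
print(estimate_average(f, 2**40, 400))
```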
One concrete research project (i.e., recent, ongoing)
Context: de-randomization.
Known: Given a "very simple algorithm" (i.e., a constant-depth circuit) that evaluates to the value 1 on at least half of its inputs, we can deterministically find an input that evaluates to 1 "almost efficiently" (i.e., in quasi-polynomial time).
N.B.: It is easy to find such an input probabilistically (sketched below).
New: If we are guaranteed that the number of bad inputs is sub-exponential (in the input length), then we can find a good input efficiently. If the class of circuits is mildly extended, then this relaxation does not help.
Constant-depth circuits = the simplest model of algorithms.
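To make the N.B. concrete, here is a minimal Python sketch of the easy probabilistic search (the toy circuit and names are illustrative, not from the talk); the project described above is about achieving the same goal deterministically.

```python
import random

def find_accepting_input(circuit, input_length, max_tries=100):
    """Probabilistic search for an input on which the circuit outputs 1.
    If the circuit accepts at least half of its 2**input_length inputs, each
    random trial succeeds with probability >= 1/2, so the chance of failing
    all max_tries trials is at most 2**(-max_tries).  This is the 'easy
    probabilistic' route; removing the randomness is the hard part."""
    for _ in range(max_tries):
        x = [random.randint(0, 1) for _ in range(input_length)]
        if circuit(x) == 1:
            return x            # found a good input
    return None                 # extremely unlikely under the promise

# Usage with a toy stand-in circuit: an OR of the first three input bits,
# which accepts 7/8 of all inputs.
toy_circuit = lambda bits: 1 if (bits[0] or bits[1] or bits[2]) else 0
print(find_accepting_input(toy_circuit, input_length=10))
```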
End
The slides of this talk are available at http://www.wisdom.weizmann.ac.il/~oded/T/minerva.ppt