Randomness – A computational complexity view Avi Wigderson Institute for Advanced Study

Plan of the talk
- Computational complexity: efficient algorithms, hard and easy problems, P vs. NP
- The power of randomness: in saving time
- The weakness of randomness: what is randomness? The hardness vs. randomness paradigm
- The power of randomness: in saving space; to strengthen proofs

Easy and Hard Problems (asymptotic complexity of functions)
Multiplication: mult(23, 67) = 1541. Grade-school algorithm: ~n^2 steps on n-digit inputs. EASY (P – polynomial-time algorithm).
Factoring: factor(1541) = (23, 67). Best known algorithm: roughly exp(n^{1/3}) steps on n digits. HARD? We don't know, but the whole world thinks so!
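A small sketch of my own (not from the talk) contrasting the two sides of this slide: multiplying is a quick grade-school procedure, while the obvious way to undo it, trial division, can take a number of steps exponential in the digit length n.

```python
# Illustrative sketch: multiplication is cheap; naive factoring by trial
# division may need ~sqrt(m), i.e. about 10^(n/2), divisions for an n-digit m.

def trial_factor(m):
    """Return a nontrivial factorization of m by trial division, or None if m is prime."""
    d = 2
    while d * d <= m:
        if m % d == 0:
            return d, m // d
        d += 1
    return None

print(23 * 67)             # 1541 -- a handful of digit operations
print(trial_factor(1541))  # (23, 67) -- found only after trying many candidate divisors
```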

Map Coloring and P vs. NP
Input: a planar map M (with n countries).
2-COL: is M 2-colorable? Easy.
3-COL: is M 3-colorable? Hard?
4-COL: is M 4-colorable? Trivial (always yes, by the Four-Color Theorem).
Thm: If 3-COL is Easy then Factoring is Easy.
P vs. NP problem. Formal: Is 3-COL Easy? Informal: Can creativity be automated?
- Thm [Cook-Levin '71, Karp '72]: 3-COL is NP-complete.
- ... numerous equally hard problems in all the sciences.
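To make the NP flavor of 3-COL concrete, here is a small sketch of my own (graph given as an edge list): a claimed 3-coloring is verifiable in linear time, even though no polynomial-time algorithm for finding one is known.

```python
# The NP pattern for 3-COL: verifying a proposed coloring is easy,
# finding one is believed hard.

def verify_3coloring(edges, coloring):
    """Check that a claimed 3-coloring is proper -- linear time in the graph size."""
    return (all(c in (0, 1, 2) for c in coloring.values())
            and all(coloring[u] != coloring[v] for u, v in edges))

# Triangle graph: 3-colorable, and the certificate is trivial to check.
edges = [(0, 1), (1, 2), (2, 0)]
print(verify_3coloring(edges, {0: 0, 1: 1, 2: 2}))  # True
print(verify_3coloring(edges, {0: 0, 1: 0, 2: 2}))  # False
```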

Fundamental question #1
Is NP ⊆ P? More generally, how fast can we solve:
- Factoring integers
- Map coloring
- Satisfiability of Boolean formulae
- Computing the permanent of a matrix
- Computing optimal Chess/Go strategies
- ...
Best known algorithms: exponential time/size. Is exponential time/size necessary for some of them? Conjecture 1: YES.

The Power of Randomness
A host of problems for which:
- we have probabilistic polynomial-time algorithms;
- we (still) have no deterministic algorithms running in subexponential time.

Coin Flips and Errors
Algorithms will make decisions using coin flips... (flips are independent and unbiased). When using coin flips, we'll guarantee: "the task will be achieved, with probability > 99%".
Why tolerate errors?
- We tolerate uncertainty in life.
- Here we can reduce the error arbitrarily, to < exp(-n).
- To compensate, we can do much more...
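A hedged sketch of the "reduce the error arbitrarily" remark (the function names below are mine): repeating a probabilistic test and taking a majority vote drives the error down exponentially fast in the number of repetitions.

```python
# Error reduction by repetition: one run errs with probability 1/3; the
# majority of k independent runs errs with probability exp(-Omega(k)).
import random

def noisy_test(truth, error=1/3):
    """One run: returns the correct answer except with probability `error`."""
    return truth if random.random() > error else (not truth)

def amplified_test(truth, k=101):
    """Majority vote of k independent runs."""
    votes = sum(noisy_test(truth) for _ in range(k))
    return votes > k / 2

trials = 10_000
wrong = sum(amplified_test(True) is False for _ in range(trials))
print(f"empirical error after amplification: {wrong / trials:.4f}")  # ~0
```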

Number Theory: Primes
Problem 1 [Gauss]: Given x ∈ [2^n, 2^{n+1}], is x prime?
1975 [Solovay-Strassen, Rabin]: probabilistic polynomial-time algorithm.
2002 [Agrawal-Kayal-Saxena]: deterministic!!
Problem 2: Given n, find a prime in [2^n, 2^{n+1}].
Algorithm: pick x_1, x_2, ..., x_{1000n} at random; apply the primality test to each x_i.
Prime Number Theorem ⇒ Pr[some x_i is prime] > .99.
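A runnable sketch of the slide's algorithm for Problem 2, using the Miller-Rabin test (the Rabin side of the 1975-era probabilistic tests) in place of Solovay-Strassen: sample random n-bit numbers and test each until a prime appears.

```python
# Randomized prime finding: by the Prime Number Theorem, a random n-bit
# number is prime with probability ~1/n, so O(n) samples suffice w.h.p.
import random

def is_probable_prime(x, rounds=40):
    """Miller-Rabin: each round catches a composite with probability >= 3/4."""
    if x < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if x % p == 0:
            return x == p
    d, s = x - 1, 0                     # write x-1 = d * 2^s with d odd
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, x - 1)
        y = pow(a, d, x)
        if y in (1, x - 1):
            continue
        for _ in range(s - 1):
            y = pow(y, 2, x)
            if y == x - 1:
                break
        else:
            return False                # a witnesses that x is composite
    return True

def random_prime(n_bits):
    """Sample random n-bit integers until one passes the primality test."""
    while True:
        x = random.randrange(2 ** (n_bits - 1), 2 ** n_bits)
        if is_probable_prime(x):
            return x

print(random_prime(256))
```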

Algebra: Polynomial Identities
Is det(V) − ∏_{i<k} (x_i − x_k) ≡ 0, where V is the Vandermonde matrix of x_1, ..., x_n? Theorem [Vandermonde]: YES.
Given (implicitly, e.g. as a formula) a polynomial p of degree d, is p(x_1, x_2, ..., x_n) ≡ 0?
Algorithm [Schwartz-Zippel '80]: pick the r_i independently at random in {1, 2, ..., 100d}.
p ≡ 0 ⇒ Pr[p(r_1, r_2, ..., r_n) = 0] = 1
p ≢ 0 ⇒ Pr[p(r_1, r_2, ..., r_n) ≠ 0] > .99
Applications: program testing.
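A hedged sketch of Schwartz-Zippel testing applied to the Vandermonde identity on this slide (exact integer arithmetic; the helper names and the sign convention with rows 1, x, x^2, ... are my choices): evaluate both sides at random points from a large enough range and compare. If the two sides were not identically equal, a random evaluation would expose the difference with high probability.

```python
# Randomized identity test: a nonzero n-variate polynomial of degree d
# vanishes on at most a d/|S| fraction of points of S^n (Schwartz-Zippel).
import random
from math import prod

def det(m):
    """Exact determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def identity_holds_at(xs):
    V = [[x ** j for j in range(len(xs))] for x in xs]       # Vandermonde matrix
    rhs = prod(xs[k] - xs[i]
               for i in range(len(xs)) for k in range(i + 1, len(xs)))
    return det(V) == rhs

n, d = 4, 6                        # the identity polynomial has degree n(n-1)/2 = 6
trials = 20
print(all(identity_holds_at([random.randrange(1, 100 * d) for _ in range(n)])
          for _ in range(trials)))  # True: the identity survives every random test
```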

Analysis: Fourier Coefficients
Given (implicitly, e.g. as a formula) a function f: (Z_2)^n → {-1, 1} and ε > 0, find all characters χ such that |⟨f, χ⟩| ≥ ε.
Comment: there are at most 1/ε² such χ.
Algorithm [Goldreich-Levin '89]: ...adaptive sampling... Pr[success] > .99.
[AGS]: extension to other Abelian groups.
Applications: coding theory, complexity theory, learning theory, game theory.
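A hedged illustration of one ingredient, not the full adaptive Goldreich-Levin algorithm (the example function and names are mine): a single Fourier coefficient ⟨f, χ_α⟩ = E_x[f(x)·χ_α(x)] of a black-box function can be estimated by random sampling, with O(1/ε² · log(1/δ)) samples sufficing by a Chernoff bound.

```python
# Estimating one Fourier coefficient of f: (Z_2)^n -> {-1,1} by sampling.
import random

n = 10
alpha_star = 0b1011011001              # hidden character; f correlates with it

def f(x):
    """Noisy oracle: chi_{alpha*}(x) flipped with probability 0.1."""
    sign = (-1) ** bin(x & alpha_star).count("1")
    return sign if random.random() > 0.1 else -sign

def estimate_coefficient(alpha, samples=20_000):
    total = 0
    for _ in range(samples):
        x = random.randrange(2 ** n)
        chi = (-1) ** bin(x & alpha).count("1")   # chi_alpha(x) = (-1)^<alpha,x>
        total += f(x) * chi
    return total / samples

print(estimate_coefficient(alpha_star))    # ~0.8  (= 1 - 2 * noise rate)
print(estimate_coefficient(0b0000000001))  # ~0.0  (uncorrelated character)
```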

Geometry: Estimating Volumes
Given (implicitly) a convex body K in R^d (d large!), e.g. by a set of linear inequalities, estimate volume(K).
Comment: computing volume(K) exactly is #P-complete.
Algorithm [Dyer-Frieze-Kannan '91]: approximate counting ⇔ random sampling; random walk inside K; rapidly mixing Markov chain.
Analysis: spectral gap ⇔ isoperimetric inequality.
Applications: statistical mechanics, group theory.
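A toy sketch of the "random walk inside K" step only (a simple ball walk on a body given by linear inequalities; my own code, and the full Dyer-Frieze-Kannan algorithm wraps such a sampler in a multi-phase counting scheme that is not shown here).

```python
# Ball walk inside K = {x : A x <= b}: propose a small random step, reject
# proposals that leave K; after many steps the point is close to uniform in K.
import random

def inside(x, A, b):
    return all(sum(a_i * x_i for a_i, x_i in zip(row, x)) <= b_j
               for row, b_j in zip(A, b))

def ball_walk(A, b, x0, steps=10_000, delta=0.1):
    x = list(x0)
    for _ in range(steps):
        y = [xi + random.uniform(-delta, delta) for xi in x]
        if inside(y, A, b):
            x = y
    return x

# K = the unit cube [0,1]^3, written as  x_i <= 1  and  -x_i <= 0.
A = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [-1, 0, 0], [0, -1, 0], [0, 0, -1]]
b = [1, 1, 1, 0, 0, 0]
print(ball_walk(A, b, [0.5, 0.5, 0.5]))   # an (approximately) uniform point in K
```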

Fundamental question #2
Does randomness help? Are there problems with a probabilistic polynomial-time algorithm but no deterministic one? Conjecture 2: YES.
Fundamental question #1
Does NP require exponential time/size? Conjecture 1: YES.
Theorem: One of these two conjectures is false!

Hardness vs. Randomness
Theorems [Blum-Micali, Yao, Nisan-Wigderson, Impagliazzo-Wigderson, ...]: If there are natural hard problems, then randomness can be efficiently eliminated.
Theorem [Impagliazzo-Wigderson '98]: If NP requires exponential-size circuits, then every probabilistic polynomial-time algorithm has a deterministic counterpart.

Computational Pseudo-Randomness
[Diagram] The same efficient algorithm is run twice on the same input: once with n truly random bits (many, unbiased, independent), and once with n bits produced by an efficient deterministic pseudorandom generator from a short random seed of only k ~ c·log n bits (so the n generated bits are biased and dependent). The generator is pseudorandom if, for every efficient algorithm and every input, the two outputs are distributed (essentially) the same.

Hardness ⇒ Pseudorandomness
[Diagram] Have: f hard on some input (worst-case hardness). Need: f hard on a random input (average-case hardness); the gap is bridged by hardness amplification.
Need a generator G: k bits → n bits, with k ~ c·log n. It suffices to show how to stretch by a single bit, G: k bits → k+1 bits (append f of the seed); the NW generator then stretches to n bits.

Derandomization
[Diagram] Replace the algorithm's n random bits with the output of an efficient deterministic pseudorandom generator G on a seed of k ~ c·log n truly random bits.
Deterministic algorithm: try all possible 2^k = n^c seeds and take a majority vote over the outputs.
Pseudorandomness paradigm: specific algorithms can be derandomized without any assumptions! e.g. primality testing & maze exploration.
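A minimal sketch of the "try all 2^k seeds and take a majority vote" step. The generator and the randomized algorithm below are toy stand-ins of my own; the point is only the enumeration pattern.

```python
# Seed enumeration: if the algorithm only needs the k ~ c*log n bits coming
# out of a generator G, trying all 2^k = n^c seeds is deterministic
# polynomial time, and the majority vote recovers the right answer.

def derandomize(randomized_alg, G, x, k):
    """Run randomized_alg(x, bits) on G(seed) for every k-bit seed; majority vote."""
    votes = sum(1 if randomized_alg(x, G(seed)) else -1 for seed in range(2 ** k))
    return votes > 0

# Toy stand-ins (hypothetical, purely for illustration):
k = 8
G = lambda seed: [(seed >> i) & 1 for i in range(k)]     # "generator" = identity

def alg(x, bits):
    """A fake BPP-style test for 'x is even': errs exactly when bits[0]=bits[1]=1."""
    correct = (x % 2 == 0)
    return (not correct) if (bits[0] and bits[1]) else correct

print(derandomize(alg, G, x=6, k=k))   # True: 3/4 of the seeds give the right answer
print(derandomize(alg, G, x=7, k=k))   # False
```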

Randomness and space complexity

Getting out of mazes (when your memory is weak)
Theseus & Ariadne, Crete, ~1000 BC. An n-vertex maze/graph; the explorer has only a local view (logspace).
Theorem [Aleliunas-Karp-Lipton-Lovász-Rackoff '80]: a random walk will visit every vertex within n^2 steps (with probability > 99%).
Theorem [Reingold '06]: a deterministic walk, computable in logspace, will visit every vertex. Uses zig-zag expanders [Reingold-Vadhan-Wigderson '02].
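A small sketch of the random-walk exploration on a toy graph (my own example). The walker itself remembers only its current vertex; the `seen` set below is just the demo's bookkeeping for measuring cover time, not part of the walker's memory.

```python
# Random-walk exploration: step to a uniformly random neighbor until every
# vertex has been visited, and report how many steps that took.
import random

def random_walk_cover(adj, start=0):
    seen, v, steps = {start}, start, 0
    while len(seen) < len(adj):
        v = random.choice(adj[v])      # local view: only the neighbors of v
        seen.add(v)
        steps += 1
    return steps

# A small "maze": the cycle on 100 vertices.
n = 100
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(random_walk_cover(adj))          # typically on the order of n^2 steps
```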

The power of randomness in Proof Systems

Probabilistic Proof Systems [Goldwasser-Micali-Rackoff, Babai '85]
Is a mathematical claim true? E.g.
claim: "No integers x, y, z, n > 2 satisfy x^n + y^n = z^n."
claim: "The Riemann Hypothesis has a 200-page proof."
An efficient verifier V(claim, argument) satisfies:
*) If the claim is true, then V(claim, argument) = TRUE for some argument (in which case claim = theorem, argument = proof).
**) If the claim is false, then V(claim, argument) = FALSE for every argument – always for a classical verifier, with probability > 99% for a probabilistic one.

Remarkable properties of Probabilistic Proof Systems
- Probabilistically Checkable Proofs (PCPs)
- Zero-Knowledge (ZK) proofs

Probabilistically Checkable Proofs (PCPs)
claim: The Riemann Hypothesis. Prover: (argument). Verifier: (editor/referee/amateur).
Verifier's concern: is the argument correct?
PCPs: the Verifier reads only 100 (random) bits of the argument.
Thm [Arora-Lund-Motwani-Safra-Sudan-Szegedy '90]: every proof can be efficiently transformed into a PCP.
Refereeing (even by amateurs) in a jiffy! Major application: approximation algorithms.

Zero-Knowledge (ZK) Proofs [Goldwasser-Micali-Rackoff '85]
claim: The Riemann Hypothesis. Prover: (argument). Verifier: (editor/referee/amateur).
Prover's concern: will the Verifier publish first?
ZK proofs: the argument reveals nothing but its own correctness!
Theorem [Goldreich-Micali-Wigderson '86]: every proof can be efficiently transformed into a ZK proof, assuming Factoring is HARD.
Major application: cryptography.

Conclusions & Problems
When resources are limited, basic notions get new meanings (randomness, learning, knowledge, proof, ...).
- Randomness is in the eye of the beholder.
- Hardness can generate (good enough) randomness.
- Probabilistic algorithms seem powerful but probably are not.
- Sometimes this can be proven! (mazes, primality)
- Randomness is essential in some settings.
Open problems: Is Factoring hard? Is electronic commerce secure? Is theorem proving hard? Is P ≠ NP? Can creativity be automated?