CS151 Complexity Theory Lecture 8 April 26, 2019.

Randomized complexity classes
model: probabilistic Turing Machine
– deterministic TM with additional read-only tape containing “coin flips”
BPP (Bounded-error Probabilistic Poly-time)
L ∈ BPP if there is a p.p.t. TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ 2/3
  x ∉ L ⇒ Pr_y[M(x,y) rejects] ≥ 2/3
(“p.p.t.” = probabilistic polynomial time)

Randomized complexity classes
RP (Random Polynomial-time)
L ∈ RP if there is a p.p.t. TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ ½
  x ∉ L ⇒ Pr_y[M(x,y) rejects] = 1
coRP (complement of Random Polynomial-time)
L ∈ coRP if there is a p.p.t. TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] = 1
  x ∉ L ⇒ Pr_y[M(x,y) rejects] ≥ ½
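To make the one-sided error concrete, here is a sketch (not from the lecture) of Freivalds' algorithm for verifying a matrix product. The language {(A,B,C) : AB = C} is decided in coRP style: if AB = C the check always accepts, and if AB ≠ C each random trial catches the error with probability at least ½.

```python
import random

def freivalds(A, B, C, trials=30):
    """Check whether A*B == C for n x n integer matrices in O(trials * n^2).
    If A*B == C, always accepts; if A*B != C, each random 0/1 vector r
    catches the error with probability >= 1/2, so the overall probability
    of a wrong accept is at most 2**-trials."""
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # compute A*(B*r) and C*r: three matrix-vector products, no n^3 step
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # certain: A*B != C
    return True  # probably A*B == C
```

Note the asymmetry matching the coRP definition: a "reject" answer is always correct, only "accept" can err.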

Randomized complexity classes
One more important class: ZPP (Zero-error Probabilistic Poly-time)
ZPP = RP ∩ coRP
  Pr_y[M(x,y) outputs “fail”] ≤ ½
  otherwise M outputs the correct answer
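A minimal sketch of why ZPP = RP ∩ coRP gives a zero-error ("Las Vegas") decider: an accept by the RP machine is always correct, a reject by the coRP machine is always correct, and otherwise we report "fail" and retry. The machines `rp_even` and `corp_even` below are toy stand-ins of my own, not from the lecture.

```python
import random

def rp_even(x):
    """Toy RP-style machine for L = even numbers: an accept is always
    correct; an even x may be wrongly rejected (prob 1/2 per run)."""
    return x % 2 == 0 and random.random() < 0.5

def corp_even(x):
    """Toy coRP-style machine for the same L: a reject is always correct;
    an odd x may be wrongly accepted (prob 1/2 per run)."""
    return x % 2 == 0 or random.random() < 0.5

def zpp_decide(x, rp, corp):
    """ZPP = RP ∩ coRP: an accept by the RP machine or a reject by the
    coRP machine is conclusive. Each round is conclusive with probability
    >= 1/2, so the expected number of rounds is O(1), and the answer
    returned is never wrong."""
    while True:
        if rp(x):
            return True
        if not corp(x):
            return False
        # neither was conclusive: this round "fails"; retry with fresh coins
```

The answers below are certain whenever the loop terminates (which happens with probability 1); only the running time is random.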

Relationship to other classes
ZPP, RP, coRP, BPP all contain P
– they can simply ignore the tape with coin flips
all are in PSPACE
– can exhaustively try all strings y, count accepts/rejects, and compute the probability exactly
RP ⊆ NP (and coRP ⊆ coNP)
– an RP machine has a multitude of accepting computations; NP requires only one

Relationship to other classes
[diagram: P ⊆ RP ⊆ NP and P ⊆ coRP ⊆ coNP; RP and coRP are inside BPP; NP, coNP, and BPP all sit inside PSPACE]

BPP
How powerful is BPP?
We have seen an example of a problem in BPP that we only know how to solve deterministically in EXP.
Is randomness a panacea for intractability?

BPP
It is not known if BPP = EXP (or even NEXP!), but there are strong hints that it does not
Is there a deterministic simulation of BPP that does better than brute-force search?
– yes, if we allow non-uniformity
Theorem (Adleman): BPP ⊆ P/poly

BPP and Boolean circuits
Proof: language L ∈ BPP
error reduction gives a TM M such that
– if x ∈ L of length n: Pr_y[M(x, y) accepts] ≥ 1 – (½)^(n^2)
– if x ∉ L of length n: Pr_y[M(x, y) rejects] ≥ 1 – (½)^(n^2)
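The error reduction invoked here is just independent repetition plus a majority vote. A sketch, with a toy ¾-correct machine of my own standing in for an arbitrary BPP machine:

```python
import random

def amplify(machine, x, reps):
    """Run a randomized decider that is correct with probability strictly
    above 1/2 for `reps` independent trials and take the majority vote;
    a Chernoff bound makes the error probability exponentially small in reps."""
    votes = sum(1 for _ in range(reps) if machine(x))
    return 2 * votes > reps

def noisy_positive(x):
    """Toy stand-in for a BPP machine deciding 'x > 0': correct w.p. 3/4."""
    return (x > 0) if random.random() < 0.75 else (x <= 0)
```

Taking reps = poly(n) trials drives the error below (½)^(n^2), as the proof needs.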

BPP and Boolean circuits
say “y is bad for x” if M(x,y) gives the incorrect answer
for fixed x: Pr_y[y is bad for x] ≤ (½)^(n^2)
Pr_y[y is bad for some x] ≤ 2^n · (½)^(n^2) < 1
Conclude: there exists some y on which M(x, y) is correct for every x of length n
build a circuit for M, and hardwire this y
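The union-bound arithmetic can be checked directly: 2^n · (½)^(n^2) drops below 1 as soon as n ≥ 2 (at n = 1 the bound is exactly 1, which is why the amplified error (½)^(n^2) is chosen rather than a constant).

```python
def union_bound(n):
    """Pr_y[y is bad for some x of length n] <= 2^n * (1/2)^(n^2):
    2^n inputs, each with failure probability at most (1/2)^(n^2)."""
    return 2 ** n * 0.5 ** (n ** 2)
```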

BPP and Boolean circuits
Does BPP = EXP?
Adleman’s Theorem shows: BPP = EXP would imply EXP ⊆ P/poly
If you believe that randomness is all-powerful, you must also believe that non-uniformity gives an exponential advantage.

Next: further explore the relationship between randomness and non-uniformity
Main tool: pseudo-random generators

Derandomization
Goal: try to simulate BPP in subexponential time (or better)
use a Pseudo-Random Generator (PRG):
[diagram: G maps a seed of t bits to an output string of m bits]
often: a PRG is deemed “good” if it passes (ad-hoc) statistical tests

Derandomization
ad-hoc tests are not good enough to prove that BPP has non-trivial simulations
Our requirements:
– G is efficiently computable
– G “stretches” t bits into m bits
– G “fools” small circuits: for all circuits C of size at most s:
  |Pr_y[C(y) = 1] – Pr_z[C(G(z)) = 1]| ≤ ε

Simulating BPP using PRGs
Recall: L ∈ BPP implies there is a p.p.t. TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ 2/3
  x ∉ L ⇒ Pr_y[M(x,y) rejects] ≥ 2/3
given an input x:
– convert M into a circuit C(x, y)
– simplification: pad y so that |C| = |y| = m
– hardwire the input x to get circuit C_x:
  Pr_y[C_x(y) = 1] ≥ 2/3 (“yes”)
  Pr_y[C_x(y) = 1] ≤ 1/3 (“no”)

Simulating BPP using PRGs
Use a PRG G with:
– output length m
– seed length t « m
– error ε < 1/6
– fooling size s = m
Compute Pr_z[C_x(G(z)) = 1] exactly:
– evaluate C_x(G(z)) on every seed z ∈ {0,1}^t
– running time: (O(m) + (time for G)) · 2^t
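A sketch of this brute-force simulation. `toy_G` is a placeholder stretch function of my own (certainly not a real PRG, which must fool all size-m circuits); the point is only the exact enumeration over all 2^t seeds.

```python
from itertools import product

def toy_G(z):
    """Placeholder 'PRG' stretching t bits to 2t by repetition.
    NOT a real PRG; for illustrating the enumeration only."""
    return z + z

def derandomized_decide(Cx, G, t):
    """Evaluate Cx on G(z) for every seed z in {0,1}^t and compare the
    exact acceptance fraction to 1/2: with PRG error eps < 1/6, a 'yes'
    instance gives a fraction >= 2/3 - eps > 1/2 while a 'no' instance
    gives <= 1/3 + eps < 1/2."""
    accepts = sum(1 for z in product((0, 1), repeat=t) if Cx(G(z)))
    return 2 * accepts > 2 ** t
```

The loop runs in deterministic time 2^t times the cost of one evaluation, which is why shrinking the seed length t is the whole game.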

Simulating BPP using PRGs
knowing Pr_z[C_x(G(z)) = 1], we can distinguish between the two cases:
[diagram: two number lines from 0 to 1 marked at 1/3, 1/2, 2/3 — in the “yes” case the acceptance probability lies within ε of [2/3, 1], in the “no” case within ε of [0, 1/3]; since ε < 1/6 the two ranges do not overlap]

Blum-Micali-Yao PRG
Initial goal: for all δ with 0 < δ < 1, we will build a family of PRGs {G_m} with:
– output length m
– fooling size s = m
– seed length t = m^δ
– running time m^c
– error ε < 1/6
implies: BPP ⊆ ∩_{δ>0} TIME(2^(n^δ)) ⊆ EXP
Why? on an input of length n, with m = n^k, the simulation runs in time
  O(m + m^c) · 2^(m^δ) = O(2^(m^(2δ))) = O(2^(n^(2kδ)))

Blum-Micali-Yao PRG
PRGs of this type imply the existence of one-way functions
– we’ll use widely believed cryptographic assumptions
Definition: One-Way Function (OWF): a function family f = {f_n}, f_n:{0,1}^n → {0,1}^n, such that:
– f_n is computable in poly(n) time
– for every family of poly-size circuits {C_n}:
  Pr_x[C_n(f_n(x)) ∈ f_n^(-1)(f_n(x))] ≤ ε(n)
  with ε(n) = o(n^(-c)) for all c

Blum-Micali-Yao PRG
we believe one-way functions exist
– e.g. integer multiplication, discrete log, RSA (with minor modifications)
Definition: One-Way Permutation: an OWF in which f_n is 1-1
– can simplify “Pr_x[C_n(f_n(x)) ∈ f_n^(-1)(f_n(x))] ≤ ε(n)” to
  Pr_y[C_n(y) = f_n^(-1)(y)] ≤ ε(n)

First attempt
attempt at a PRG from an OWP f:
– y_0 ∈ {0,1}^t
– y_i = f_t(y_(i-1))
– G(y_0) = y_(k-1) y_(k-2) y_(k-3) … y_0, with k = m/t
computable in time at most k·t^c = m·t^(c-1) < m^c
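A sketch of the iteration structure, with an affine map mod 2^t of my own standing in for the one-way permutation (it is a permutation, but certainly not one-way):

```python
def iterate_generator(f, y0, k):
    """First-attempt construction: y_i = f(y_{i-1}) for i = 1..k-1,
    output the blocks y_{k-1} y_{k-2} ... y_0 (most recent block first)."""
    ys = [y0]
    for _ in range(k - 1):
        ys.append(f(ys[-1]))
    return list(reversed(ys))

def toy_perm(t):
    """Stand-in permutation on t-bit integers (NOT one-way; any affine map
    with an odd multiplier is a bijection mod 2^t)."""
    return lambda y: (5 * y + 3) % (1 << t)
```

With k = m/t blocks of t bits each, the total output length is m, matching the slide's accounting.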

First attempt
the output is “unpredictable”: no poly-size circuit C can output y_(i-1) given y_(k-1) y_(k-2) … y_i with non-negligible success probability
– if C could, then given y_i we can compute y_(k-1), y_(k-2), …, y_(i+2), y_(i+1) and feed them to C
– the result is a poly-size circuit computing y_(i-1) = f_t^(-1)(y_i) from y_i
– note: we’re using that f_t is 1-1

First attempt
attempt: y_0 ∈ {0,1}^t; y_i = f_t(y_(i-1)); G(y_0) = y_(k-1) y_(k-2) y_(k-3) … y_0
[diagram: G(y_0) produces y_5 y_4 y_3 y_2 y_1 y_0 by applying f_t forward from y_0; G’(y_3) produces the same string by applying f_t forward and f_t^(-1) backward from y_3 — same distribution!]

First attempt
one problem: it is hard to compute y_(i-1) from y_i, but it might be easy to compute a single bit (or several bits) of y_(i-1) from y_i
– this could be used to build a small circuit C that distinguishes G’s output from the uniform distribution on {0,1}^m

First attempt
second problem: we don’t know if “unpredictability” given a prefix is sufficient to meet the fooling requirement:
  |Pr_y[C(y) = 1] – Pr_z[C(G(z)) = 1]| ≤ ε

Hard bits
If {f_n} is a one-way permutation we know:
– no poly-size circuit can compute f_n^(-1)(y) from y with non-negligible success probability:
  Pr_y[C_n(y) = f_n^(-1)(y)] ≤ ε’(n)
We want to identify a single bit position j for which:
– no poly-size circuit can compute (f_n^(-1)(y))_j from y with non-negligible advantage over a coin flip:
  Pr_y[C_n(y) = (f_n^(-1)(y))_j] ≤ ½ + ε(n)

Hard bits
For some specific functions f we know of such a bit position j
More generally: we allow a function h_n:{0,1}^n → {0,1} rather than just a bit position j.

Hard bits
Definition: a hard bit for g = {g_n} is a family h = {h_n}, h_n:{0,1}^n → {0,1}, such that if a circuit family {C_n} of size s(n) achieves:
  Pr_x[C_n(x) = h_n(g_n(x))] ≥ ½ + ε(n)
then there is a circuit family {C’_n} of size s’(n) that achieves:
  Pr_x[C’_n(x) = g_n(x)] ≥ ε’(n)
with:
  ε’(n) = (ε(n)/n)^O(1)
  s’(n) = (s(n)·n/ε(n))^O(1)

Goldreich-Levin
To get a generic hard bit, we first need to modify our one-way permutation
Define f’_n : {0,1}^n × {0,1}^n → {0,1}^(2n) as:
  f’_n(x,y) = (f_n(x), y)

Goldreich-Levin
f’_n(x,y) = (f_n(x), y)
Two observations:
– f’ is a permutation if f is
– if a circuit C_n achieves Pr_{x,y}[C_n(x,y) = f’_n^(-1)(x,y)] ≥ ε(n), then for some y*:
  Pr_x[C_n(x,y*) = f’_n^(-1)(x,y*) = (f_n^(-1)(x), y*)] ≥ ε(n)
and so f’ is a one-way permutation if f is.

Goldreich-Levin
The Goldreich-Levin function GL_(2n) : {0,1}^n × {0,1}^n → {0,1} is defined by:
  GL_(2n)(x,y) = ⊕_{i : y_i = 1} x_i
– the parity of the subset of bits of x selected by the 1’s of y
– the inner product of the n-vectors x and y over GF(2)
Theorem (G-L): for every function f, GL is a hard bit for f’. (proof: problem set)
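The GL function itself is a one-line computation, the inner product over GF(2); a sketch with bits represented as 0/1 lists:

```python
def gl_bit(x, y):
    """GL(x, y): parity of the bits of x selected by the 1's of y,
    i.e. the inner product <x, y> over GF(2)."""
    return sum(xi & yi for xi, yi in zip(x, y)) % 2
```

GL is linear in x over GF(2): gl_bit(x ⊕ x', y) = gl_bit(x, y) ⊕ gl_bit(x', y), which is the structural property the proof of the G-L theorem exploits.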

Distinguishers and predictors
a distribution D on {0,1}^n
D ε-passes statistical tests of size s if for all circuits C of size s:
  |Pr_{y←U_n}[C(y) = 1] – Pr_{y←D}[C(y) = 1]| ≤ ε
a circuit violating this is sometimes called an efficient “distinguisher”

Distinguishers and predictors
D ε-passes prediction tests of size s if for all circuits C of size s and all i:
  Pr_{y←D}[C(y_1 y_2 … y_(i-1)) = y_i] ≤ ½ + ε
a circuit violating this is sometimes called an efficient “predictor”
a predictor seems stronger; Yao showed they are essentially the same!
an important result and proof (the “hybrid argument”)

Distinguishers and predictors
Theorem (Yao): if a distribution D on {0,1}^n (ε/n)-passes all prediction tests of size s, then it ε-passes all statistical tests of size s’ = s – O(n).

Distinguishers and predictors
Proof idea: by contradiction
given a size s’ distinguisher C:
  |Pr_{y←U_n}[C(y) = 1] – Pr_{y←D}[C(y) = 1]| > ε
produce a size s predictor P:
  Pr_{y←D}[P(y_1 … y_(i-1)) = y_i] > ½ + ε/n
work with distributions that are “hybrids” of the uniform distribution U_n and D

Distinguishers and predictors
given a size s’ distinguisher C:
  |Pr_{y←U_n}[C(y) = 1] – Pr_{y←D}[C(y) = 1]| > ε
define n+1 hybrid distributions; hybrid distribution D_i:
– sample b = b_1 b_2 … b_n from D
– sample r = r_1 r_2 … r_n from U_n
– output: b_1 b_2 … b_i r_(i+1) r_(i+2) … r_n

Distinguishers and predictors
Hybrid distributions:
[diagram: D_0 = U_n, …, D_(i-1), D_i, …, D_n = D — each successive hybrid replaces one more uniform bit with the corresponding bit from D]

Distinguishers and predictors
Define: p_i = Pr_{y←D_i}[C(y) = 1]
Note: p_0 = Pr_{y←U_n}[C(y) = 1]; p_n = Pr_{y←D}[C(y) = 1]
by assumption: ε < |p_n – p_0|
triangle inequality: |p_n – p_0| ≤ Σ_{1 ≤ i ≤ n} |p_i – p_(i-1)|
so there must be some i for which |p_i – p_(i-1)| > ε/n
WLOG assume p_i – p_(i-1) > ε/n (we can invert the output of C if necessary)
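The telescoping step can be verified exactly on a toy example of my own by enumerating the hybrids; here D puts all its mass on 111 and C computes the AND of the bits:

```python
from itertools import product
from fractions import Fraction

def hybrid_probs(D, C, n):
    """Exact p_i = Pr_{y <- D_i}[C(y) = 1] for the hybrids D_0 .. D_n,
    where D_i takes its first i bits from D and its last n-i bits uniform.
    D is a dict mapping n-bit tuples to Fraction probabilities."""
    ps = []
    for i in range(n + 1):
        p = Fraction(0)
        for b, pb in D.items():
            for r in product((0, 1), repeat=n - i):
                if C(b[:i] + r):
                    p += pb * Fraction(1, 2 ** (n - i))
        ps.append(p)
    return ps

# toy example: D concentrated on 111, C = AND of the bits
D = {(1, 1, 1): Fraction(1)}
ps = hybrid_probs(D, lambda y: all(y), 3)
```

Here p_0 = 1/8 and p_3 = 1, so the gap is 7/8 and some adjacent pair of hybrids must differ by at least 7/24, exactly as the triangle-inequality step promises.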

Distinguishers and predictors
define distribution D_i’ to be D_i with its i-th bit flipped
  p_i’ = Pr_{y←D_i’}[C(y) = 1]
notice: D_(i-1) = (D_i + D_i’)/2, so p_(i-1) = (p_i + p_i’)/2

Distinguishers and predictors
randomized predictor P’ for the i-th bit:
– input: u = y_1 y_2 … y_(i-1) (which comes from D)
– flip a coin: d ∈ {0,1}
– sample w = w_(i+1) w_(i+2) … w_n ← U_(n-i)
– evaluate C(udw); if 1, output d; if 0, output ¬d
Claim: Pr_{y←D, d, w←U_(n-i)}[P’(y_1 … y_(i-1)) = y_i] > ½ + ε/n.
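The predictor P’ is a few lines given oracle access to the distinguisher C. A sketch, tested on a toy case of my own (C is the AND of the bits, predicting bit 3 from the prefix 11, where it always succeeds):

```python
import random

def predictor(C, u, n):
    """Yao's randomized predictor P' for bit i = len(u)+1: flip a coin d,
    pad with uniform bits w, run the distinguisher C on u d w, and output
    d if C answers 1 ("looks like D") and the negation of d otherwise."""
    i = len(u) + 1
    d = random.randint(0, 1)
    w = tuple(random.randint(0, 1) for _ in range(n - i))
    return d if C(u + (d,) + w) else 1 - d
```

The proof then fixes the best choice of the coins d, w to obtain a deterministic circuit of size s’ + O(n).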

Distinguishers and predictors
P’ is a randomized procedure
– there must be some fixing of its random bits d, w that preserves the success probability
the final predictor P has d* and w* hardwired (and may need an added ¬ gate)
its size is s’ + O(n) = s, as promised
[diagram: circuit for P — the distinguisher C with d* and w* hardwired]

Distinguishers and predictors
Proof of claim (writing u = y_1 y_2 … y_(i-1)):
Pr_{y←D, d, w←U_(n-i)}[P’(y_1 … y_(i-1)) = y_i]
  = Pr[y_i = d | C(u,d,w) = 1]·Pr[C(u,d,w) = 1] + Pr[y_i = ¬d | C(u,d,w) = 0]·Pr[C(u,d,w) = 0]
  = Pr[y_i = d | C(u,d,w) = 1]·p_(i-1) + Pr[y_i = ¬d | C(u,d,w) = 0]·(1 – p_(i-1))

Distinguishers and predictors
Observe (with u = y_1 y_2 … y_(i-1)):
Pr[y_i = d | C(u,d,w) = 1]
  = Pr[C(u,d,w) = 1 | y_i = d]·Pr[y_i = d] / Pr[C(u,d,w) = 1]   (conditioned on y_i = d, (u,d,w) is distributed as D_i)
  = p_i / (2·p_(i-1))
Pr[y_i = ¬d | C(u,d,w) = 0]
  = Pr[C(u,d,w) = 0 | y_i = ¬d]·Pr[y_i = ¬d] / Pr[C(u,d,w) = 0]   (conditioned on y_i = ¬d, (u,d,w) is distributed as D_i’)
  = (1 – p_i’) / (2·(1 – p_(i-1)))

Distinguishers and predictors
Success probability:
  Pr[y_i = d | C(u,d,w) = 1]·p_(i-1) + Pr[y_i = ¬d | C(u,d,w) = 0]·(1 – p_(i-1))
We know:
  Pr[y_i = d | C(u,d,w) = 1] = p_i/(2·p_(i-1))
  Pr[y_i = ¬d | C(u,d,w) = 0] = (1 – p_i’)/(2·(1 – p_(i-1)))
  p_(i-1) = (p_i + p_i’)/2, so p_i’/2 = p_(i-1) – p_i/2
  p_i – p_(i-1) > ε/n
Conclude:
  Pr[P’(y_1 … y_(i-1)) = y_i] = p_i/2 + (1 – p_i’)/2 = ½ + (p_i – p_i’)/2
    = ½ + p_i/2 – (p_(i-1) – p_i/2) = ½ + p_i – p_(i-1) > ½ + ε/n.

The BMY Generator
Recall goal: for all δ with 0 < δ < 1, a family of PRGs {G_m} with:
– output length m
– fooling size s = m
– seed length t = m^δ
– running time m^c
– error ε < 1/6
If one-way permutations exist, then WLOG there is an OWP f = {f_n} with a hard bit h = {h_n}

The BMY Generator
Generator G^δ = {G^δ_m}:
– t = m^δ
– y_0 ∈ {0,1}^t
– y_i = f_t(y_(i-1))
– b_i = h_t(y_i)
– G^δ(y_0) = b_(m-1) b_(m-2) b_(m-3) … b_0
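Putting the pieces together, a sketch of the BMY data flow with toy stand-ins of my own: an affine permutation mod 2^t for f_t and the parity of the bits for h_t (neither is cryptographically hard; they only illustrate the structure):

```python
def bmy_generator(f, h, y0, m):
    """BMY construction: iterate y_i = f(y_{i-1}) from the seed y_0,
    extract one hard bit b_i = h(y_i) per step, and output
    b_{m-1} b_{m-2} ... b_0 (later bits first)."""
    y = y0
    bits = []
    for _ in range(m):
        bits.append(h(y))  # bits[i] = b_i
        y = f(y)
    return list(reversed(bits))

# toy stand-ins (NOT one-way / NOT a hard bit; illustration only)
toy_f = lambda y: (5 * y + 3) % 16       # affine permutation on 4-bit ints
toy_h = lambda y: bin(y).count("1") % 2  # parity of the bits of y
```

Note the output is one bit per iteration of f, so a seed of t = m^δ bits is stretched to m pseudo-random bits, unlike the first attempt's t-bit blocks.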

The BMY Generator
Theorem (BMY): for every δ > 0, there is a constant c such that for all d, e, G^δ is a PRG with:
– error ε < 1/m^d
– fooling size s = m^e
– running time m^c
Note: this is stronger than we needed; it was sufficient to have ε < 1/6 and s = m

The BMY Generator
Generator G^δ = {G^δ_m}: t = m^δ; y_0 ∈ {0,1}^t; y_i = f_t(y_(i-1)); b_i = h_t(y_i); G^δ_m(y_0) = b_(m-1) b_(m-2) b_(m-3) … b_0
Proof:
– computable in time at most m·t^c < m^(c+1)
– assume G^δ does not (1/m^d)-pass some statistical test C = {C_m} of size m^e:
  |Pr_{y←U_m}[C(y) = 1] – Pr_{z←D}[C(z) = 1]| > 1/m^d

The BMY Generator
Generator G^δ = {G^δ_m}: t = m^δ; y_0 ∈ {0,1}^t; y_i = f_t(y_(i-1)); b_i = h_t(y_i); G^δ_m(y_0) = b_(m-1) b_(m-2) b_(m-3) … b_0
transform this distinguisher into a predictor P of size m^e + O(m):
  Pr_y[P(b_(m-1) … b_(m-i)) = b_(m-i-1)] > ½ + 1/m^(d+1)

The BMY Generator
Generator G^δ = {G^δ_m}: t = m^δ; y_0 ∈ {0,1}^t; y_i = f_t(y_(i-1)); b_i = h_t(y_i); G^δ_m(y_0) = b_(m-1) b_(m-2) b_(m-3) … b_0
a procedure to compute h_t(f_t^(-1)(y)):
– set y_(m-i) = y; b_(m-i) = h_t(y_(m-i))
– compute y_j, b_j for j = m-i+1, m-i+2, …, m-1 as above
– evaluate P(b_(m-1) b_(m-2) … b_(m-i))
f a permutation implies b_(m-1) b_(m-2) … b_(m-i) is distributed as a (prefix of the) output of the generator:
  Pr_y[P(b_(m-1) b_(m-2) … b_(m-i)) = b_(m-i-1)] > ½ + 1/m^(d+1)

The BMY Generator
Generator G^δ = {G^δ_m}: t = m^δ; y_0 ∈ {0,1}^t; y_i = f_t(y_(i-1)); b_i = h_t(y_i); G^δ_m(y_0) = b_(m-1) b_(m-2) b_(m-3) … b_0
  Pr_y[P(b_(m-1) b_(m-2) … b_(m-i)) = b_(m-i-1)] > ½ + 1/m^(d+1)
What is b_(m-i-1)?
  b_(m-i-1) = h_t(y_(m-i-1)) = h_t(f_t^(-1)(y_(m-i))) = h_t(f_t^(-1)(y))
We have described a family of polynomial-size circuits that computes h_t(f_t^(-1)(y)) from y with success probability greater than ½ + 1/poly(m). Contradiction.

The BMY Generator
[diagram: as in the first attempt, the blocks y_5 y_4 y_3 y_2 … produced by iterating f_t have the same distribution whether generated forward from y_0 or partly backward via f_t^(-1)]