More on Randomized Classes

–Def: A language L is in BPP_{c,s} (0 ≤ s(n) ≤ c(n) ≤ 1 for all n ∈ ℕ) if there exists a probabilistic poly-time TM M s.t.:
1. ∀w ∈ L, Pr[M accepts w] ≥ c(|w|),
2. ∀w ∉ L, Pr[M accepts w] ≤ s(|w|).
Thm (Amplification of BPP): For all poly-time computable functions c(n), s(n) : ℕ → [0,1] such that there exists a polynomial Q(n) with c(n) − s(n) ≥ 1/Q(n) for all n, and for any constant m = O(1):
BPP_{c,s} = BPP_{1−2^{−n^m}, 2^{−n^m}}.

Pf: Given a BPP machine M with c(n), s(n), we construct a BPP machine for the same language with c′(n) = 1 − 2^{−n^m} and s′(n) = 2^{−n^m}, for any m = O(1).
–Define M′:
1. Run M on w k times independently.
2. Accept iff the number of times M accepted is ≥ k·(c(n)+s(n))/2.
Let X_i be the indicator random variable for the event that the i-th run of M accepts w.

By the definition of BPP_{c,s} we have: w ∈ L ⇒ E[X_i] ≥ c(n); w ∉ L ⇒ E[X_i] ≤ s(n).

Chernoff bound: For any k independent identically distributed random variables X_1,…,X_k with values in {0,1} and expected value E[X_i] = p, and for any ε ∈ (0,1):
–Pr[ |(1/k)·Σ_{i=1..k} X_i − p| ≥ ε ] ≤ 2e^{−2ε²k}.

Using the Chernoff bound, choose ε = (c(n) − s(n))/2 ≥ 1/(2Q(n)), so the acceptance threshold (c(n)+s(n))/2 sits at distance ε from both c(n) and s(n). Setting k = 2Q(n)²·(n^m + 2), which is polynomial in n, gives 2e^{−2ε²k} ≤ 2e^{−(n^m+2)} ≤ 2^{−n^m}.

So Pr[M′ errs on w] ≤ 2e^{−2ε²k} ≤ 2^{−n^m}, i.e., L ∈ BPP_{1−2^{−n^m}, 2^{−n^m}}. ■
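To make the majority-vote amplification concrete, here is a minimal Python simulation (the base machine and its parameters are illustrative assumptions, not part of the lecture): a machine that answers correctly with probability c = 0.6 (so s = 0.4 and c − s = 0.2) is run k times, and the threshold (c+s)/2 = 0.5 is applied; the empirical error shrinks roughly like 2e^{−2ε²k}.

```python
import random

def base_machine(truth: bool, c: float = 0.6) -> bool:
    """One run of M: returns the correct answer with probability c."""
    return truth if random.random() < c else (not truth)

def amplified(truth: bool, k: int, threshold: float = 0.5) -> bool:
    """M': run M k times; accept iff the number of accepting runs >= k * threshold."""
    accepts = sum(base_machine(truth) for _ in range(k))
    return accepts >= k * threshold

trials = 10_000
for k in (1, 25, 101):  # odd k avoids ties at the threshold
    err_in  = sum(not amplified(True, k) for _ in range(trials)) / trials
    err_out = sum(amplified(False, k)    for _ in range(trials)) / trials
    print(f"k={k:3d}  Pr[reject w in L] ~ {err_in:.4f}  Pr[accept w not in L] ~ {err_out:.4f}")
```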

P/poly and circuit complexity
–Def: P/poly = { L | ∃A ∈ P, a sequence of strings {S_i}_{i∈ℕ}, and a constant k s.t. |S_i| = O(i^k) and x ∈ L ⇔ (x, S_{|x|}) ∈ A }.
–Def: A language L has poly circuit complexity if there exists a constant k such that for all n, the function f_n that is 1 iff its input (of length n) is in L has circuit complexity O(n^k).

Prop: L ∈ P/poly iff L has poly circuit complexity.
Pf: ⇐: If L has poly circuit complexity, then for each n there is a circuit of size poly(n) that decides membership in L for all words of length n. Encode this circuit as a poly-size string S_n. Construct a poly-time TM that takes (x, S_{|x|}) and simulates the circuit S_{|x|} on input x.

⇒: Assume L ∈ P/poly, witnessed by A ∈ P and advice strings {S_n}. If the TM M deciding A runs in TIME(O(n^k)), then by the standard tableau construction we can build a circuit of size O(n^{2k}) that simulates M on inputs of length n. Hardwire into each circuit the advice string S_n, which is constant for each input length n and grows polynomially in n. ■
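A toy illustration of the correspondence (all encodings here are hypothetical choices, not the lecture's): the advice S_n encodes a circuit for inputs of length n, and one fixed polynomial-time evaluator, playing the role of A, decides membership given (x, S_{|x|}).

```python
from typing import List, Tuple

# A gate is (op, i, j): wires w[i], w[j] feed the gate; each gate appends one wire.
Gate = Tuple[str, int, int]

def eval_circuit(x: List[int], gates: List[Gate]) -> int:
    """The language A in P: simulate the advice circuit on input x."""
    w = list(x)                          # wires 0..n-1 carry the input bits
    for op, i, j in gates:
        if op == "AND":
            w.append(w[i] & w[j])
        elif op == "OR":
            w.append(w[i] | w[j])
        else:                            # "NOT" ignores its second argument
            w.append(1 - w[i])
    return w[-1]                         # the last wire is the output

# Advice for n = 2 deciding PARITY: x0 XOR x1 built from AND/OR/NOT gates.
S_2 = [("NOT", 0, 0), ("NOT", 1, 1), ("AND", 0, 3), ("AND", 1, 2), ("OR", 4, 5)]
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, eval_circuit(x, S_2))
```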

Thm: BPP ⊆ P/poly.
Pf: Let L be an arbitrary language in BPP.
–By amplification of BPP, we have a TM M′ that decides L with error probability ≤ 2^{−2n}. Classify all possible random strings R as follows:
R is bad for an input x if M′(x,R) is wrong.
R is bad if there exists an input w (of length n) for which R is bad; R is good otherwise.
Fix w: Pr[R is bad for w] ≤ 2^{−2n}.

Pr[R is bad] ≤ Σ_{w : |w|=n} Pr[R is bad for w] ≤ 2^n · 2^{−2n} = 2^{−n} < 1.
Therefore, Pr[R is good] = 1 − Pr[R is bad] > 0.
Thus a good R exists, and it serves as a poly-size advice string for all inputs of length n. ■
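The counting step is just a union bound; these few lines (purely illustrative arithmetic) tabulate it for small n:

```python
# If each fixed input of length n is misclassified by a random string R with
# probability at most 2^(-2n), the union bound over all 2^n inputs gives
# Pr[R bad] <= 2^n * 2^(-2n) = 2^(-n) < 1, so a good R must exist.
for n in range(1, 11):
    pr_bad = 2 ** n * 2 ** (-2 * n)
    print(f"n={n:2d}  Pr[R bad] <= {pr_bad:.6f}  Pr[R good] >= {1 - pr_bad:.6f}")
```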

Thm: BPP ⊆ Σ₂P (Sipser, Lautemann).
Pf: Suppose L ∈ BPP.
–Goal: show that there is a Σ₂P machine that decides L.
–I.e., show that ∃ a deterministic poly-time TM M(x,y,z) s.t.:
x ∈ L ⇒ ∃y s.t. ∀z, M(x,y,z) = 1;
x ∉ L ⇒ ∀y, ∃z s.t. M(x,y,z) = 0.

Let A be a BPP machine for L that uses Q(n) random bits, with c(n) = 1/2 and s(n) = 1/(3Q(n)), where n is the input length and Q(n) is a polynomial. Let R be the set of all random strings of length Q(n) that can appear on A's random tape; |R| = 2^{Q(n)}.
–Define F_s(y) = y ⊕ s for s, y ∈ R. F_s(y) is uniformly random if s is chosen uniformly.
Imagine a new machine A′(x,y,S), where S = (s_1, s_2, …, s_k) is a sequence of random strings, y ∈ R, and x is the input to be tested for membership in L_A.
A′ is a deterministic TM s.t.: A′(x,y,S) = 1 ⇔ ∃ s_i ∈ S s.t. A accepts x with y ⊕ s_i on its random tape.

If x ∉ L_A, then for any fixed S ∈ R^k and y chosen at random (taking k = 2Q(n)):
Pr_y[A′(x,y,S) = 1] ≤ Σ_{i=1..k} Pr_y[A accepts x with y ⊕ s_i] ≤ k·s(n) = 2Q(n)·(1/(3Q(n))) = 2/3 < 1.
I.e., if x ∉ L_A, then for any S ∈ R^k, ∃y ∈ R s.t. A′(x,y,S) = 0.

If x ∈ L_A and a specific y is fixed, then for S chosen at random:
Pr_S[A′(x,y,S) = 0] = Π_{i=1..k} Pr[A rejects x with y ⊕ s_i] ≤ (1 − c(n))^k = 2^{−k}.

Let k = 2Q(n). Then Pr_S[∃y ∈ R s.t. A′(x,y,S) = 0] ≤ 2^{Q(n)} · 2^{−2Q(n)} = 2^{−Q(n)} < 1.
I.e., if x ∈ L_A, then ∃S ∈ R^k s.t. ∀y ∈ R, A′(x,y,S) = 1.
Therefore, x ∈ L_A ⇔ ∃S ∈ R^k s.t. ∀y ∈ R, A′(x,y,S) = 1.
So a Σ₂P machine decides L_A by existentially guessing S, universally checking all y, and verifying A′(x,y,S) = 1.
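A small brute-force check of the shift-covering idea, at toy scale (Q = 8 and the densities below are illustrative assumptions): W is the set of random tapes on which A accepts. If |W| ≥ |R|/2, a random S = (s_1,…,s_k) with k = 2Q covers R, meaning every y has some y ⊕ s_i in W; if W is tiny, k shifts of W cannot cover R at all.

```python
import random

Q = 8                                 # random-tape length (toy value)
R = range(2 ** Q)
k = 2 * Q

def covers(W: set, S: list) -> bool:
    """True iff every y in R has some y XOR s_i in W, i.e. A'(x,y,S)=1 for all y."""
    return all(any((y ^ s) in W for s in S) for y in R)

random.seed(0)
W_dense  = set(random.sample(range(2 ** Q), 2 ** (Q - 1)))  # density 1/2: x in L
W_sparse = set(random.sample(range(2 ** Q), 2))             # near-empty: x not in L
S = [random.randrange(2 ** Q) for _ in range(k)]

print("dense W  covered:", covers(W_dense, S))   # True with high probability
print("sparse W covered:", covers(W_sparse, S))  # always False: k*|W| < |R|
```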

USAT: φ ∈ USAT if φ is satisfied by exactly one truth assignment.
–Suppose φ is satisfiable by at most one truth assignment. We want to decide whether φ ∈ USAT. It turns out that deciding φ ∈ USAT is as difficult as deciding φ ∈ SAT.

Randomized reduction from SAT to USAT: a randomized poly-time TM M s.t.:
–φ ∉ SAT ⇒ M(φ) ∉ SAT (hence M(φ) ∉ USAT);
–φ ∈ SAT ⇒ Pr[M(φ) ∈ USAT] ≥ 1/8.
Universal hashing: Given sets S and T, a family H of functions from S to T is a universal family of hash functions from S to T if:
–1) ∀x ∈ S, ∀w ∈ T: Pr_{h∈H}[h(x) = w] = 1/|T|;
–2) ∀x ≠ y ∈ S, ∀w, z ∈ T: Pr_{h∈H}[(h(x) = w) ∧ (h(y) = z)] = 1/|T|².

E.g., let S = {0,1}^n, T = {0,1}^k, and for x ∈ {0,1}^n let h_{M,b}(x) = Mx + b, where M is a k×n Boolean matrix, b is a column vector in {0,1}^k, and arithmetic is over GF(2).
–H = { h_{M,b} : all possible M and b }.
Prop: The above H is a universal family of hash functions from {0,1}^n to {0,1}^k.
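The family is easy to implement; the sketch below (bit-vectors as Python ints, parameters chosen arbitrarily) computes h_{M,b}(x) = Mx + b row by row and empirically checks property 1, Pr[h(x) = w] = 1/2^k.

```python
import random

def dot(row: int, x: int) -> int:
    """Inner product of two bit-vectors (ints) over GF(2)."""
    return bin(row & x).count("1") % 2

def h(M: list, b: int, x: int) -> int:
    """h_{M,b}(x) = Mx + b over GF(2); one output bit per row of M."""
    y = 0
    for i, row in enumerate(M):
        y |= dot(row, x) << i
    return y ^ b

n, k = 6, 3
x, w = 0b101101, 0b010
trials, hits = 20_000, 0
for _ in range(trials):
    M = [random.randrange(2 ** n) for _ in range(k)]  # random k x n Boolean matrix
    b = random.randrange(2 ** k)                      # random shift vector
    hits += h(M, b, x) == w
print(f"Pr[h(x)=w] ~ {hits / trials:.3f}  (property 1 predicts 1/2^k = {1 / 2 ** k:.3f})")
```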

Pf:
–1) For any fixed x ∈ {0,1}^n and w ∈ {0,1}^k: for every choice of M, Pr_b[Mx + b = w] = Pr_b[b = w − Mx] = 1/2^k, since b is uniform over {0,1}^k.

–2) For x ≠ y ∈ {0,1}^n and w, z ∈ {0,1}^k: Pr_{M,b}[(Mx + b = w) ∧ (My + b = z)] = 1/2^{2k}. This follows from the uniform choice of b together with the following proposition about M alone.

Prop: ∀x ≠ y ∈ {0,1}^n \ {0} and ∀w, z ∈ {0,1}^k, we have Pr_M[(Mx = w) ∧ (My = z)] = 1/2^{2k}.
Pf: If x and y are e_1 = (1,0,…,0) and e_2 = (0,1,0,…,0) respectively, the claim is true: each row of M contributes two independent uniform bits. Since neither x nor y is 0 and x ≠ y, they are linearly independent over GF(2). Thus there exists a rank-n matrix A s.t. Ax = e_1 and Ay = e_2. Since rank(A) = n, A is invertible and M′ = MA^{−1} is uniform when M is; writing Mx = M′e_1 and My = M′e_2 reduces the general case to the special one. ■

Side calculation (one row of M times x): fix a nonzero (b_1,…,b_n) ∈ {0,1}^n and c ∈ {0,1}, and choose a_1,…,a_n ∈ {0,1} uniformly at random. Then Pr[a_1b_1 + a_2b_2 + … + a_nb_n = c] = 1/2, since some b_j = 1 and a_j is uniform and independent of the remaining terms.

Prop: Let S ⊆ {0,1}^n with 2^{k−2} ≤ |S| ≤ 2^{k−1}. Then Pr[∃! s ∈ S s.t. Ms = 0] ≥ 1/8, where the probability is taken over the uniform choice of M from the set of all k×n Boolean matrices.
Pf: Assume first 0 ∉ S. For each s ∈ S, Pr[Ms = 0] = 2^{−k}, and for s ≠ t, Pr[(Ms = 0) ∧ (Mt = 0)] = 2^{−2k} (by the previous propositions). Hence
Pr[∃! s] ≥ Σ_{s∈S} ( Pr[Ms = 0] − Σ_{t≠s} Pr[(Ms = 0) ∧ (Mt = 0)] ) ≥ |S|·2^{−k}·(1 − |S|·2^{−k}) ≥ (1/4)·(1 − 1/2) = 1/8.
(If 0 ∈ S, then 0 always maps to 0, and it is the unique such element with probability ≥ 1 − |S|·2^{−k} ≥ 1/2.) ■
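An empirical check of the isolation proposition, at toy sizes (n, k, and the seed are arbitrary choices): pick S of size 2^{k−2} among the nonzero vectors, draw a random k×n matrix M, and measure how often exactly one s ∈ S satisfies Ms = 0.

```python
import random

def dot(row: int, x: int) -> int:
    """Inner product of bit-vectors over GF(2)."""
    return bin(row & x).count("1") % 2

def maps_to_zero(M: list, s: int) -> bool:
    """True iff Ms = 0 over GF(2)."""
    return all(dot(row, s) == 0 for row in M)

n, k = 10, 6
random.seed(1)
S = random.sample(range(1, 2 ** n), 2 ** (k - 2))     # 2^{k-2} nonzero vectors
trials, unique = 5_000, 0
for _ in range(trials):
    M = [random.randrange(2 ** n) for _ in range(k)]  # random k x n matrix
    if sum(maps_to_zero(M, s) for s in S) == 1:
        unique += 1
print(f"Pr[exactly one s with Ms=0] ~ {unique / trials:.3f}  (claim: >= 1/8)")
```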

Successive restrictions: Given a CNF formula φ on n variables x = (x_1,…,x_n), choose n+1 random vectors v_1,…,v_{n+1} ∈ {0,1}^n and create φ_i for 1 ≤ i ≤ n+1 as follows:
–φ_i = φ ∧ (v_1·x = 0) ∧ (v_2·x = 0) ∧ … ∧ (v_i·x = 0),
where v·x denotes the inner product over GF(2). (The first i vectors form a random i×n Boolean matrix M_i, so φ_i asserts that x satisfies φ and M_i·x = 0.)

Lemma: If φ is not satisfiable, then none of the φ_i's are satisfiable.
Lemma: If φ is satisfiable, then with probability at least 1/8, at least one of the φ_i's has a unique satisfying assignment.
Pf: Let S be the set of satisfying assignments of φ; by hypothesis |S| ≥ 1. Let k be such that 2^{k−2} ≤ |S| ≤ 2^{k−1} (note 1 ≤ k ≤ n+1). By the previous prop., φ_k has probability ≥ 1/8 of having exactly one satisfying assignment. ■
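A brute-force sketch of the whole successive-restriction experiment, at toy scale and with hypothetical parameters: φ is represented directly by its solution set S within {0,1}^n, and we measure how often some prefix of the random parity constraints isolates a unique solution.

```python
import random

def dot(v: int, x: int) -> int:
    """Inner product of bit-vectors over GF(2)."""
    return bin(v & x).count("1") % 2

def restrict(S: set, vs: list) -> set:
    """Solutions of phi_i = phi AND (v_1.x = 0) AND ... AND (v_i.x = 0)."""
    return {x for x in S if all(dot(v, x) == 0 for v in vs)}

n = 8
random.seed(2)
S = set(random.sample(range(1, 2 ** n), 40))   # satisfying assignments of phi
trials, success = 2_000, 0
for _ in range(trials):
    vs = [random.randrange(2 ** n) for _ in range(n + 1)]
    if any(len(restrict(S, vs[:i])) == 1 for i in range(1, n + 2)):
        success += 1
print(f"Pr[some phi_i has a unique solution] ~ {success / trials:.3f}  (lemma: >= 1/8)")
```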

Thus, detecting unique solutions is as hard as NP.
–UP: the class of promise problems whose instances are promised to have either zero or one solution.
Thm: NP ⊆ RP^UP.
Thm: If UP ⊆ RP, then NP = RP.