Isolation Technique
April 16, 2001
Jason Ku, Tao Li
Outline
1) Show that we can reduce NP, with high probability, to US. That is: NP randomized-reduces to detecting unique solutions.
2) PH ⊆ P^PP
Isolation Lemma
1) Definitions
2) Isolation Lemma
3) Example of using the Isolation Lemma
Definition of weight functions
A weight function W maps a finite set U to Z⁺. For a set S ⊆ U, W(S) = Σ_{x∈S} W(x).
Let F be a family of non-empty subsets of U. A weight function W is "good for" F if there is a unique minimum-weight set in F, and "bad for" F otherwise.
Ex: let U = {u1, u2, u3} and F = {{u1}, {u2}, {u3}, {u1,u2}}. Define
  W1(u1) = 1    W2(u1) = 1
  W1(u2) = 2    W2(u2) = 1
  W1(u3) = 3    W2(u3) = 2
W1 is good for F (the unique minimum-weight set is {u1}), while W2 is bad for F ({u1} and {u2} tie at weight 1).
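To make the definition concrete, here is a minimal sketch in Python (our own illustration; the names weight_of and is_good_for are not from the slides) that computes W(S) and tests whether a weight function is good for F, using the example above:

```python
# Minimal sketch (ours): weights are positive integers on the elements of U.

def weight_of(W, S):
    """W(S) = sum of W(x) over x in S."""
    return sum(W[x] for x in S)

def is_good_for(W, F):
    """W is good for F iff F has a unique minimum-weight set."""
    set_weights = [weight_of(W, S) for S in F]
    return set_weights.count(min(set_weights)) == 1

F = [{"u1"}, {"u2"}, {"u3"}, {"u1", "u2"}]
W1 = {"u1": 1, "u2": 2, "u3": 3}
W2 = {"u1": 1, "u2": 1, "u3": 2}

print(is_good_for(W1, F))  # True:  {u1} is the unique minimum (weight 1)
print(is_good_for(W2, F))  # False: {u1} and {u2} tie at weight 1
```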
Isolation Lemma
Let U be a finite set, and let F1, F2, …, Fm be families of non-empty subsets of U.
Let D = ||U|| and let R > mD.
Let Z be the set of weight functions whose weight on any individual element of U is at most R.
Let α = mD/R, so 0 < α < 1.
Then more than (1 − α)||Z|| of the weight functions in Z are good for all of F1, F2, …, Fm.
Proof of Isolation Lemma
Proof sketch: By definition, a weight function W is bad if some F_i contains at least two distinct minimum-weight sets. Let S1 and S2 be two distinct sets with the same minimum weight; then there exists x ∈ S1 with x ∉ S2 (or vice versa). Call such an x ambiguous. Fix the weights of all elements of U other than x: changing W(x) changes W(S1) but not W(S2), so either x is unambiguous under every weight, or there is exactly one weight for x that makes x ambiguous.
Let's Count
So, for a fixed x ∈ U and a fixed F_i, there are at most R^(D−1) weight functions for which x is ambiguous: the other D − 1 elements can be weighted freely, and at most one weight for x completes a bad function. There are R^D weight functions in total, m choices of F_i, and D choices of x. Thus the fraction of weight functions that are bad for some F_i is at most mD·R^(D−1)/R^D = mD/R = α, so the fraction of weight functions good for all F_i is more than 1 − α.
Example of the Isolation Lemma
Let U = {u1, u2, u3}, so D = 3.
Let F1 = {{u1}, {u1,u3}, {u1,u2,u3}, {u2}}, so m = 1.
R = 4 > mD = 3, and ||Z|| = R^D = 64.
Then more than (1 − 3/4)·64 = 16 weight functions are good for F1. For example:
              W1  W2  W3  W4
  u1           1   2   1   1
  u2           2   3   2   3
  u3           3   4   2   3
  variations:  6   6   3   3
18 variations in all, and more.
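The lemma's bound can be verified exhaustively for this example. The sketch below (ours, reusing weight_of and is_good_for from the earlier sketch) enumerates all R^D = 4³ = 64 weight functions and counts the good ones; the lemma promises more than 16, and here a weight function turns out to be good exactly when W(u1) ≠ W(u2), since {u1} ⊂ {u1,u3} ⊂ {u1,u2,u3} forces the minimum to lie between {u1} and {u2}:

```python
from itertools import product

def weight_of(W, S):
    return sum(W[x] for x in S)

def is_good_for(W, F):
    set_weights = [weight_of(W, S) for S in F]
    return set_weights.count(min(set_weights)) == 1

U = ["u1", "u2", "u3"]                                    # D = 3
F1 = [{"u1"}, {"u1", "u3"}, {"u1", "u2", "u3"}, {"u2"}]   # m = 1
R = 4                                                     # R > mD = 3

# Z = all weight functions with weights in {1, ..., R}: R**D = 64 of them.
Z = [dict(zip(U, ws)) for ws in product(range(1, R + 1), repeat=len(U))]

good = sum(is_good_for(W, F1) for W in Z)
print(good, "good out of", len(Z))   # 48 good > (1 - 3/4) * 64 = 16
```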
Definition of US
US = {L | (∃ NPTM M)(∀x)[x ∈ L ⟺ #acc_M(x) = 1]}
NP randomized reduces to US: NP ⊆ RP^US
Proof map:
1) Definitions I, II
2) Apply the Isolation Lemma to get a probability bound
3) Construct an oracle B ∈ US
4) Construct a machine N that uses oracle B
5) Show N is an RP^US machine
6) Show that x ∈ L implies N accepts x with high probability, so L ∈ RP^US
Definitions I
Let A = {⟨x,y⟩ | NPTM M_L on input x accepts along path y}.
For L ∈ NP there is a polynomial p such that for all x ∈ Σ*: x ∈ L iff there is at least one y, |y| = p(|x|), with ⟨x,y⟩ ∈ A.
Encode y as a subset: y = y1 y2 … yn corresponds to {i | 1 ≤ i ≤ p(n), y_i = 1}.
Ex: y = 1001 corresponds to {1, 4} (1 = take the right branch, 0 = take the left branch).
Definitions II
Let U(x) = {1, 2, …, p(|x|)}, so D = ||U|| = p(|x|).
Let F(x) = {y | ⟨x,y⟩ ∈ A}, the collection of accepting paths viewed as subsets of U(x), so m = 1.
Let Z(x) be the set of weight functions that assign weights no greater than 4p(|x|), so R = 4p(|x|).
Applying the Isolation Lemma
By the Isolation Lemma, with α = mD/R = 1/4:
- If x ∈ L, more than 3/4 of the weight functions in Z(x) give F(x) a unique minimum-weight set.
- If x ∉ L, there are no accepting paths, so F(x) is empty and zero weight functions give F(x) a unique minimum-weight set.
Construct an oracle B ∈ US
Let B = {⟨x, W, j⟩ | W ∈ Z(x), 1 ≤ j ≤ 4p²(|x|), and there is a unique y ∈ F(x) with W(y) = j}.
(A set has at most p(|x|) elements, each of weight at most 4p(|x|), so 4p²(|x|) bounds every possible weight.)
NPTM M_B on input u:
1) If u is not of the form ⟨x, W, j⟩, reject.
2) Else, using p(|x|) nondeterministic moves, select y, and accept iff ⟨x,y⟩ ∈ A and W(y) = j.
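A toy version of the oracle (ours; the predicate accepts stands in for ⟨x,y⟩ ∈ A, and paths y are bit tuples encoded as the set of positions carrying a 1):

```python
from itertools import product

def path_weight(W, y):
    """Weight of path y under W: sum of W(i) over positions i with y_i = 1."""
    return sum(W[i] for i in range(1, len(y) + 1) if y[i - 1] == 1)

def in_B(x, W, j, accepts, p):
    """<x, W, j> is in B iff there is a UNIQUE accepting y (|y| = p) of weight j."""
    hits = [y for y in product((0, 1), repeat=p)
            if accepts(x, y) and path_weight(W, y) == j]
    return len(hits) == 1

# Toy predicate standing in for <x,y> in A: accept iff y has exactly two 1s.
accepts = lambda x, y: sum(y) == 2
W = {1: 1, 2: 2, 3: 3}
print(in_B("x", W, 3, accepts, 3))   # True: only y = 110 has weight 3
```

An actual US machine would select y nondeterministically rather than enumerate all paths; the enumeration here is only so the sketch runs.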
Why B ∈ US
If u ∈ B, there is a unique path y ∈ F(x) with W(y) = j, so M_B has exactly one accepting path.
If u ∉ B, there are either zero or more than one y ∈ F(x) with W(y) = j, so M_B has either zero or more than one accepting path.
Hence M_B witnesses B ∈ US.
Construct an RP machine with oracle B
NPTM N on input x:
1) Pick a random weight function W ∈ Z(x), i.e., with weights bounded by 4p(|x|).
2) For each j, 1 ≤ j ≤ 4p²(|x|), ask oracle B whether ⟨x, W, j⟩ ∈ B. If any answer is yes, accept; if all answers are no, reject.
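A runnable sketch of one run of N (ours; it reuses path_weight and in_B from the previous sketch as the oracle):

```python
import random
from itertools import product   # needed by in_B from the previous sketch

def N(x, accepts, p):
    """One run of N: accept iff some query <x, W, j> is answered yes by B."""
    # Step 1: a random weight function, weights bounded by R = 4p.
    W = {i: random.randint(1, 4 * p) for i in range(1, p + 1)}
    # Step 2: query the US oracle for every candidate weight j <= 4p^2.
    for j in range(1, 4 * p * p + 1):
        if in_B(x, W, j, accepts, p):
            return True    # accept
    return False           # reject

accepts = lambda x, y: sum(y) == 2
print(N("x", accepts, 3))  # accepts with probability > 3/4 (accepting paths exist)
```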
N ∈ RP^US, and x ∈ L iff N accepts x with high probability
For every x ∈ Σ*:
- If x ∈ L, then with probability more than 3/4 the chosen W gives F(x) a unique minimum-weight set; its weight is some j ≤ 4p²(|x|), so ⟨x, W, j⟩ ∈ B and N accepts. Thus N accepts with probability more than 3/4.
- If x ∉ L, then F(x) is empty, no query is in B, and N rejects with probability 1.
Since x ∈ L implies N accepts with probability more than 3/4 > 1/2, and x ∉ L implies N rejects with probability 1, we have L ∈ RP^US.
Definition of ⊕P and #P
⊕P = {L | (∃ NPTM M)(∀x)[x ∈ L ⟺ #acc_M(x) is odd]}
#P = {f | (∃ NPTM M)(∀x)[f(x) = #acc_M(x)]}
Toda's Theorem: PH ⊆ P^PP
Three major parts of the proof:
1) (Valiant & Vazirani) NP ⊆ BPP^⊕P
2) Theorem 4.5: PH ⊆ BPP^⊕P
3) Lemma 4.13: BPP^⊕P ⊆ P^PP; hence PH ⊆ P^PP
(Valiant & Vazirani) NP ⊆ BPP^⊕P
Proof: Let A ∈ NP with A = L(M), where M runs in time p(|x|).
Let B = {⟨x, w, k⟩ : M has an odd number of accepting paths on input x having weight k}, where w : {1, …, p(|x|)} → {1, …, 4p(|x|)}.
Then B ∈ ⊕P.
(Valiant & Vazirani) cont.
For a BPP^⊕P algorithm, consider:
  On input x:
    randomly pick w
    for k := 1 to 4p²(|x|):
      ask whether ⟨x, w, k⟩ ∈ B
      if so, then halt and accept
    end-for
    if control reaches here, then halt and reject
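The same loop in runnable form (our sketch; in_B_parity stands in for the ⊕P oracle B, with paths enumerated explicitly, and accepts stands in for M's accepting predicate):

```python
import random
from itertools import product

def in_B_parity(x, w, k, accepts, p):
    """Stand-in for the parity oracle: is the number of accepting paths
    of weight k odd?  (Weight of y = sum of w(i) over positions with y_i = 1.)"""
    count = sum(1 for y in product((0, 1), repeat=p)
                if accepts(x, y)
                and sum(w[i] for i in range(1, p + 1) if y[i - 1] == 1) == k)
    return count % 2 == 1

def vv_round(x, accepts, p):
    """One round of the Valiant-Vazirani BPP-with-parity-oracle algorithm."""
    w = {i: random.randint(1, 4 * p) for i in range(1, p + 1)}  # random weights
    for k in range(1, 4 * p * p + 1):
        if in_B_parity(x, w, k, accepts, p):
            return True    # halt and accept
    return False           # control reached here: halt and reject

# If x is in A, some weight class holds exactly one accepting path with
# probability > 3/4, making its count odd; if x is not in A, every count is 0.
print(vv_round("x", lambda x, y: sum(y) == 2, 3))
```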
Note
The Valiant & Vazirani theorem relativizes. In other words, NP^A ⊆ BPP^(⊕P^A) for every oracle A.
Theorem 4.5: PH ⊆ BPP^⊕P
We prove it by induction on the levels of PH. Three steps for the induction step:
1) Apply Valiant & Vazirani to the base machine
2) Swap BPP and ⊕P in the middle
3) Collapse: BPP^BPP = BPP and ⊕P^⊕P = ⊕P
Step 1 for Thm. 4.5
Induction hypothesis: Σ_k^p ⊆ BPP^⊕P.
Since NP^A ⊆ BPP^(⊕P^A) for every oracle A, we have
  Σ_{k+1}^p = NP^(Σ_k^p) ⊆ BPP^(⊕P^(Σ_k^p)) ⊆ BPP^(⊕P^(BPP^⊕P)).
Step 2: Swapping
By Lemma 4.9, ⊕P^(BPP^A) ⊆ BPP^(⊕P^A). Hence
  BPP^(⊕P^(BPP^⊕P)) ⊆ BPP^(BPP^(⊕P^⊕P)).
Step 3: Collapse
Proposition 4.6: BPP^(BPP^A) = BPP^A.
Proposition 4.8: ⊕P^⊕P = ⊕P.
Hence BPP^(BPP^(⊕P^⊕P)) = BPP^⊕P, completing the induction.
Toda's Theorem
Proposition 4.15: P^PP = P^#P.
Toda's Theorem: PP is hard for the polynomial hierarchy:
  PH ⊆ P^PP = P^#P.
Proof that BPP^⊕P ⊆ P^#P
Let A ∈ BPP^⊕P, where A is accepted by M^B with B ∈ ⊕P, and let f be the #P function for B (so w ∈ B iff f(w) is odd). Let n^k bound the running time of M. Assume first that M makes only one query along any path. Then let g(x,y) be the #P function defined as the number of accepting paths of the following machine:
Proof cont. 1
On input ⟨x, y⟩:
  run M(x) along path y
  when a query "w ∈ B?" is made, nondeterministically guess a bit c ∈ {0,1}, use it as the oracle answer, and continue simulating M(x)
  if the simulation accepts, then generate f(w) + (1 − c) paths and accept on all of them
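The parity bookkeeping can be checked by brute force. The small check below (ours) confirms that, for a single query, g(x,y), the sum of f(w) + (1 − c) over the guesses c whose simulation accepts, is odd exactly when the simulation that received the true oracle answer accepts:

```python
# sim[c] = does M(x) along y accept if the query "w in B?" is answered c?
def g_parity(f_w, sim):
    total = sum(f_w + (1 - c) for c in (0, 1) if sim[c])
    return total % 2 == 1

for f_w in range(8):                      # the true answer is "f(w) is odd"
    for sim0 in (False, True):
        for sim1 in (False, True):
            sim = (sim0, sim1)
            true_answer = f_w % 2         # 1 iff w is in B
            assert g_parity(f_w, sim) == sim[true_answer]
print("parity gadget verified")
```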
Proof cont. 2
g(x,y) is odd if and only if M^B(x) accepts along y.
For g(x,y), consider a #P function g′(x,y), obtained via Fact 1 below, such that:
- if g(x,y) is odd, then g′(x,y) ≡ 1 (mod 2^(n^k))
- if g(x,y) is even, then g′(x,y) ≡ 0 (mod 2^(n^k))
Define h(x) = Σ_{|y| = n^k} g′(x,y).
Proof cont. 3
The value h(x) (mod 2^(n^k)) is the number of y's such that M^B(x) accepts along path y.
Our P^#P algorithm: on input x, query the #P oracle for h(x) and decide whether
  h(x) (mod 2^(n^k)) > (1/2)·2^(n^k),
i.e., whether a majority of the paths y are accepting. If so, x is accepted; if not, x is rejected.
Proof cont. 4
If M makes more than one query, modify g(x,y) as follows:
On input ⟨x, y⟩:
  repeat:
    run M(x) along path y
    when a query "w ∈ B?" is made, guess a bit c ∈ {0,1}, generate f(w) + (1 − c) paths, use c as the oracle answer, and continue simulating M(x)
  until no more queries are asked
  if the simulation of M(x) along path y accepts with this sequence of guessed oracle answers, then accept; else reject
Proof cont. 5
Call the above machine N.
Claim: M^B accepts x along y if and only if #acc_N(⟨x,y⟩) = g(x,y) is odd.
Fact 1
Let k ∈ N and f ∈ #P. Then there exists g ∈ #P such that:
- if f(x) is odd, then g(x) ≡ 1 (mod 2^(n^k))
- if f(x) is even, then g(x) ≡ 0 (mod 2^(n^k))
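The modulus-raising arithmetic behind Fact 1 can be checked numerically. The sketch below (ours) iterates q(a) = 3a² − 2a³, which sends a ≡ 1 (mod m) to 1 (mod m²) and a ≡ 0 (mod m) to 0 (mod m²), so j rounds lift the parity of f(x) to modulus 2^(2^j); this is only an arithmetic check in the +1 convention the slides use, not the actual #P construction, which must avoid negative coefficients:

```python
def amplify(f_x, rounds):
    """Apply q(a) = 3a^2 - 2a^3 repeatedly; after j rounds the value is
    congruent to the parity bit of f_x modulo 2**(2**j)."""
    a = f_x
    for _ in range(rounds):
        a = 3 * a * a - 2 * a * a * a
    return a

for f_x in (5, 12, 7, 30):                  # odd, even, odd, even
    a = amplify(f_x, 4)
    assert a % 2 ** 16 == f_x % 2           # residue mod 2**(2**4) is 1 or 0
print("amplification verified mod 2**16")
```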
Fact 2
Let f(⟨x,y⟩) be a #P function. Then g(x) = Σ_{|y| = |x|^k} f(⟨x,y⟩) is also a #P function.
Let M be the machine with #acc_M(⟨x,y⟩) = f(⟨x,y⟩), and consider the following machine M′:
  on input x:
    compute |x|^k
    guess y of length |x|^k
    run M(⟨x,y⟩)
Then g(x) = #acc_{M′}(x).
Discussions
- UL/poly = NL/poly?
- UL = NL?
- UP = NP?
- NP^PSPACE = UP^PSPACE = PSPACE
- There is an oracle relative to which NP ≠ UP.
Conclusions
We've shown, by use of the Isolation Lemma, that NP ⊆ RP^US ⊆ BPP^⊕P. This was the base case of an inductive proof showing PH ⊆ BPP^⊕P. From there we extended to Toda's Theorem: PH ⊆ P^PP = P^#P.