The Polynomial Hierarchy

1 The Polynomial Hierarchy and Randomized Computations
Complexity ©D. Moshkovitz

2 Introduction
Objectives:
- To introduce the polynomial-time hierarchy (PH)
- To introduce BPP
- To show the relationship between the two
Overview:
- Satisfiability and PH
- Probabilistic TMs and BPP
- BPP ⊆ Σ2

3 Deciding Satisfiability
We've already seen that deciding whether a formula is satisfiable depends on the quantifiers allowed:
∃x1…∃xn [(x1∨x2∨x8)∧…∧(x6∨x3)]   (only existential quantifiers: NP-complete)
∃x1∀x2∃x3… [(x1∨x2∨x8)∧…∧(x6∨x3)]   (existential & universal quantifiers: PSPACE-complete)
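The deck contains no code; purely as an illustration (exponential-time brute force, with function and variable names of my choosing), here is a minimal Python sketch of the difference between the two problems: plain satisfiability asks whether some assignment works, while the quantified version alternates ∃ and ∀ over the variables.

from itertools import product

def sat_brute_force(phi, n):
    # Existential quantifiers only (SAT): accept iff SOME assignment satisfies phi.
    return any(phi(a) for a in product([False, True], repeat=n))

def qbf_brute_force(phi, n):
    # Alternating quantifiers (exists x1, forall x2, exists x3, ...), evaluated recursively.
    def eval_from(i, assignment):
        if i == n:
            return phi(tuple(assignment))
        branches = (eval_from(i + 1, assignment + [b]) for b in (False, True))
        return any(branches) if i % 2 == 0 else all(branches)
    return eval_from(0, [])

# toy formula on 3 variables: (x1 or x2) and ((not x2) or x3)
phi = lambda x: (x[0] or x[1]) and ((not x[1]) or x[2])
print(sat_brute_force(phi, 3))   # True: some assignment satisfies phi
print(qbf_brute_force(phi, 3))   # evaluates  exists x1  forall x2  exists x3  phi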

4 Technical Note
∃x1∃x2…∃xk is the same as ∃x, where x = ⟨x1,x2,…,xk⟩.
Thus, allowing several adjacent quantifiers of the same type does not change the problem.

5 The Hierarchy
Definition (Σi): Σi is the class of all languages reducible to deciding the satisfiability of a formula of the type
∃x1 ∀x2 ∃x3 … R(x1,x2,x3,…)
with i alternating quantifiers, starting with ∃.

6 The Hierarchy
Definition (Πi): Πi is the class of all languages reducible to deciding the satisfiability of a formula of the type
∀x1 ∃x2 ∀x3 … R(x1,x2,x3,…)
with i alternating quantifiers, starting with ∀.

7 PH (Polynomial-time Hierarchy)
Definition: PH = ∪i Σi

8 Simple Observations
“base”: Σ1 = NP
“connection between Σ and Π”: Πi = coΣi
“hierarchy”: Σi ⊆ Σi+1 and Πi ⊆ Σi+1
“upper bound”: PH ⊆ PSPACE

9 Can the Hierarchy Collapse?
Proposition: If NP = coNP, then PH = NP.
Proof Idea: By induction on i, Σi = NP.
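The slide leaves the induction step implicit; a sketch of it in LaTeX (standard argument, my phrasing), assuming NP = coNP and, inductively, Σi = NP:

\[
\Sigma_{i+1}: \quad \exists y_1 \underbrace{\forall y_2 \exists y_3 \cdots R(x,y_1,y_2,\dots)}_{\text{a }\Pi_i\text{ predicate}}
\]
% With \Pi_i = \mathrm{co}\Sigma_i = \mathrm{coNP} = \mathrm{NP}, the inner predicate has an NP verifier V:
\[
x \in L \iff \exists y_1 \exists w\; V(x,y_1,w),
\]
% and merging the two existential quantifiers (as in the Technical Note on slide 4) gives L \in \mathrm{NP},
% hence \Sigma_{i+1} = \mathrm{NP} and the whole hierarchy collapses to NP.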

10 Probabilistic Turing Machines
Probabilistic TMs have an “extra” tape: the random tape.
“standard” TMs: M(x), where x is the content of the input tape.
probabilistic TMs: Prr[M(x,r)], where r is the content of the random tape.

11 Does It Really Capture The Notion of Randomized Algorithms?
It doesn’t matter if you toss all your coins in advance or throughout the computation…

12 BPP (Bounded-Probability Polynomial-Time)
Definition: BPP is the class of all languages L which have a probabilistic polynomial-time TM M s.t. ∀x: Prr[M(x,r) = L(x)] ≥ 2/3, where L(x)=1 ⟺ x∈L.
Such TMs are called “Atlantic City” machines.
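The deck gives no concrete example of such a machine; purely as an illustration (my choice of example and names, not from the slides), Freivalds’ randomized verification of matrix products has exactly this flavour of a bounded-error polynomial-time algorithm (here the error is even one-sided):

import random

def freivalds_check(A, B, C, trials=10):
    # Randomized check that A*B == C for n x n integer matrices.
    # If A*B == C it always accepts; otherwise each trial detects the
    # difference with probability >= 1/2, so the error is <= 2**(-trials).
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False        # definitely A*B != C
    return True                 # probably A*B == C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(freivalds_check(A, B, [[19, 22], [43, 50]]))   # True
print(freivalds_check(A, B, [[19, 22], [43, 51]]))   # almost surely False

Each trial costs only O(n^2) arithmetic operations, versus O(n^3) for recomputing the product.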

13 BPP Illustrated
Note: TMs which are right for most x’s (e.g. for PRIMES: always say ‘NO’) are NOT acceptable!
[Illustration: for any input x, within the set of all random strings, the random strings for which M is right make up at least a 2/3 fraction.]

14 Claim: If L∈BPP, then there exists a probabilistic polynomial-time TM M’ and a polynomial p(n) s.t. ∀x∈{0,1}n:
Prr∈{0,1}p(n)[M’(x,r) ≠ L(x)] < 1/(3p(n))
We can get better amplifications, but this will suffice here...
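The claim is stated without a quantitative argument; one standard route (a sketch of mine, not spelled out on the slides) is to run the original 2/3-correct machine k times on independent random strings and take the majority vote, bounding the error with a Chernoff bound:

\[
\Pr[\text{majority of } k \text{ runs is wrong}] \;\le\; e^{-ck} \quad \text{for some constant } c > 0,
\]
% so k = O(\log p(n)) repetitions already push the error below 1/(3p(n)),
% while the total number of random bits used remains polynomial in n.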

15 Proof Idea
M’(x): Repeat: pick r uniformly at random, simulate M(x,r). Output the majority answer.
[Illustration: a sequence of simulated answers, e.g. Yes Yes No Yes No Yes, whose majority is output.]
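A minimal Python sketch of this majority-vote amplification (the machine toy_M, the helper names, and the repetition count are all illustrative assumptions, not from the deck):

import random

rng = random.Random(0)           # source of the contents of the "random tape"

def amplify(M, x, k):
    # Run the bounded-error machine M on x with k independent random
    # strings and output the majority answer.
    votes = sum(M(x, rng.getrandbits(64)) for _ in range(k))
    return votes > k // 2

def toy_M(x, r):
    # Toy stand-in for a BPP machine: deterministic in (x, r), and
    # correct for roughly a 0.7 fraction of the random strings r.
    local = random.Random(r)
    correct_answer = (x % 2 == 0)
    return correct_answer if local.random() < 0.7 else (not correct_answer)

print(amplify(toy_M, 10, 101))   # True with overwhelming probability (10 is even)
print(amplify(toy_M, 7, 101))    # False with overwhelming probability (7 is odd)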

16 Relations to P and NP
P ⊆ BPP: a deterministic polynomial-time machine can simply ignore the random input.
The relation between BPP and NP is not clear (?).

17 Does BPP ⊆ NP? We may have considered saying:
“Use the random string as a witness.” Why is that wrong? Because non-members may be recognized as members: even for x∉L, up to a 1/3 fraction of the random strings make M accept, and an NP verifier would accept x given any one of them.

18 “Some Comfort”
Theorem (Sipser, Lautemann): BPP ⊆ Σ2
Underlying observation: L∈BPP ⟹ there exists a poly-time probabilistic TM M and a polynomial p, s.t. for any n and x∈{0,1}n, with m=p(n):
x∈L ⟺ ∃s1,…,sm∈{0,1}m ∀r∈{0,1}m ⋁1≤i≤m M(x,r⊕si)=1
Make sure you understand why the theorem follows.
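For reference, the same characterization in LaTeX, together with the reason it yields the theorem: the predicate inside the quantifiers is decidable in polynomial time, so the displayed ∃∀ statement is exactly a Σ2 formula.

\[
x \in L \;\iff\; \exists\, s_1,\dots,s_m \in \{0,1\}^m \;\; \forall\, r \in \{0,1\}^m \;\; \bigvee_{1 \le i \le m} M(x, r \oplus s_i) = 1
\]
% The inner disjunction is checked by running M at most m times, i.e., in polynomial time,
% so L is reducible to deciding an exists-forall (\Sigma_2) formula.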

19 Yes-instance
[Illustration over {0,1}m: for x∈L, the shifted copies of M’s accepting set cover all of {0,1}m.]

20 No-instance
[Illustration over {0,1}m: for x∉L, the m shifted copies of M’s (tiny) accepting set cannot cover {0,1}m.]

21 Our Starting Point
L∈BPP. By amplification, there’s a poly-time machine M which, on an n-bit input x and an m-bit random string r, decides “x∈L?” and errs w.p. < 1/3m, i.e., its answer is false for less than a 1/3m fraction of the r’s.

22 Proving the Underlying Observation
We will follow the Probabilistic Method: Prr[r has property P] > 0 ⟹ ∃r with property P.

23 Yes-Instances Accepted
Let xL. We want s1,…,sm{0,1}m s.t r{0,1}m 1imM(x,rsi)=1 So we’ll bound the probability over si’s that it doesn’t hold. Complexity ©D. Moshkovitz

24 Bounding The Probability Random si’s Do Not Satisfy This
(Annotations: union-bound over r; the si’s are independent; for a fixed r, s is random ⟹ r⊕s is random; x∈L.)
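The calculation these annotations refer to, reconstructed in LaTeX (standard in the Sipser–Lautemann proof; the bound 1/3m on M’s error comes from the amplification slide):

\begin{align*}
\Pr_{s_1,\dots,s_m}\Big[\exists r\; \bigwedge_{i=1}^{m} M(x, r \oplus s_i)=0\Big]
 &\le \sum_{r \in \{0,1\}^m} \Pr_{s_1,\dots,s_m}\Big[\bigwedge_{i=1}^{m} M(x, r \oplus s_i)=0\Big]  % union bound over r
 \\ &= \sum_{r} \prod_{i=1}^{m} \Pr_{s_i}\big[M(x, r \oplus s_i)=0\big]  % the s_i's are independent
 \\ &< 2^m \Big(\tfrac{1}{3m}\Big)^{m} \;<\; 1.  % r \oplus s_i is uniform and x \in L
\end{align*}
% Since the failure probability is below 1, some choice of s_1,...,s_m works (probabilistic method).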

25 No-Instances Rejected
Let xL. Let s1,…,sm{0,1}m . We want r{0,1}m s.t 1imM(x,rsi)=0 So we’ll bound the probability over r that it doesn’t hold. Complexity ©D. Moshkovitz

26 Bounding The Probability Random r Does Not Satisfy This
(Annotations: union-bound over the m shifts; x∉L.)
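And the corresponding reconstructed calculation for the no-instance case:

\begin{align*}
\Pr_{r}\Big[\bigvee_{i=1}^{m} M(x, r \oplus s_i)=1\Big]
 &\le \sum_{i=1}^{m} \Pr_{r}\big[M(x, r \oplus s_i)=1\big]  % union bound over the m shifts
 \\ &< m \cdot \frac{1}{3m} \;=\; \frac{1}{3} \;<\; 1.  % r \oplus s_i is uniform and x \notin L
\end{align*}
% So for some r we have M(x, r \oplus s_i)=0 for every i, as required.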

27 Q.E.D!
It follows that: L∈BPP ⟹ there’s a poly-time probabilistic TM M s.t. for any x there is m s.t.
x∈L ⟺ ∃s1,…,sm ∀r ⋁1≤i≤m M(x,r⊕si)=1
Thus L∈Σ2, i.e., BPP ⊆ Σ2.
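Not part of the deck, but a tiny brute-force sanity check of this characterization in Python (all names and parameters are my own toy choices): a machine that errs on exactly one 4-bit random string (error 1/16 < 1/(3·4)), tried once as a yes-instance and once as a no-instance.

from itertools import product

M_BITS = 4                                   # m: length of the random string = number of shifts
ALL_R = list(range(2 ** M_BITS))             # every random string, encoded as an integer

def make_M(is_member, bad_r):
    # Toy amplified machine: answers correctly except on the single random string bad_r.
    def M(r):
        correct = 1 if is_member else 0
        return (1 - correct) if r == bad_r else correct
    return M

def sigma2_predicate(M):
    # Evaluate  exists s1..sm  forall r  OR_i M(r xor s_i) = 1  by brute force.
    for shifts in product(ALL_R, repeat=M_BITS):
        if all(any(M(r ^ s) == 1 for s in shifts) for r in ALL_R):
            return True
    return False

print(sigma2_predicate(make_M(is_member=True, bad_r=5)))    # True: covering shifts exist
print(sigma2_predicate(make_M(is_member=False, bad_r=9)))   # False: no shifts can cover {0,1}^m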

28 Summary
We defined the polynomial-time hierarchy.
Saw: NP ⊆ PH ⊆ PSPACE, and NP = coNP ⟹ PH = NP (“the hierarchy collapses”).

29 Summary
We presented probabilistic TMs.
We defined the complexity class BPP.
We saw how to amplify randomized computations.
We proved P ⊆ BPP ⊆ Σ2.

30 Summary
We also presented a new paradigm for proving existence, the probabilistic method, utilizing tools of probability theory:
Prr[r has property P] > 0 ⟹ ∃r with property P.

