Probabilistic algorithms (Section 10.2)
Giorgi Japaridze, Theory of Computability
Definition of probabilistic Turing machines (10.2.a)

Definition 10.3. A probabilistic Turing machine M is a type of nondeterministic TM in which each nondeterministic step is called a coin-flip step and has two legal next moves. We assign a probability to each branch b of M's computation on input w as follows. Define the probability of branch b to be

Pr[b] = 2^{-k},

where k is the number of coin-flip steps that occur on branch b. We define the probability that M accepts w to be

Pr[M accepts w] = Σ_{b an accepting branch} Pr[b].

In other words, the probability that M accepts w is the probability that we would reach an accepting configuration if we simulated M on w by flipping a coin to determine which move to follow at each coin-flip step. We let

Pr[M rejects w] = 1 - Pr[M accepts w].
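This definition lends itself to a short computation: since each coin flip halves a branch's weight, Pr[M accepts w] can be computed exactly by summing 2^{-k} over the accepting branches. Below is a minimal Python sketch of that enumeration; the `step` interface and the toy machine are hypothetical illustrations (not from the text), and the sketch assumes every branch halts.

```python
from fractions import Fraction

def acceptance_probability(step, config):
    """Exact Pr[M accepts w] by enumerating all coin-flip branches.
    `step` (a hypothetical interface) maps a configuration to 'accept',
    'reject', or a pair of successor configurations -- the two legal
    next moves of a coin-flip step.  Assumes every branch halts."""
    outcome = step(config)
    if outcome == 'accept':
        return Fraction(1)
    if outcome == 'reject':
        return Fraction(0)
    heads, tails = outcome
    # Each coin flip multiplies the branch's weight by 1/2, so an
    # accepting branch with k flips contributes exactly 2^{-k}.
    return (acceptance_probability(step, heads)
            + acceptance_probability(step, tails)) / 2

# Toy branching structure: accepts on branches of weight 1/2 and 1/4,
# so Pr[accept] = 1/2 + 1/4 = 3/4.
toy = {'start': ('A', 'B'), 'A': 'accept',
       'B': ('C', 'D'), 'C': 'accept', 'D': 'reject'}
print(acceptance_probability(toy.get, 'start'))  # prints 3/4
```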
Example (10.2.b)

[Figure: state diagram of a probabilistic TM with coin-flip branching from the start state to accept and reject states; the transition details did not survive extraction.]

What is the probability that 0 is accepted? 75%
What is the probability that 00 is rejected? 100%
The language {0} is recognized with what error probability (see next slide)? 25%
Any other language is recognized with what error probability (see next slide)? 100%
The class BPP (10.2.c)

For 0 ≤ ε < ½, we say that M recognizes language A with error probability ε if the probability that we would obtain the wrong answer by simulating M is at most ε. That is:
1. w ∈ A implies Pr[M accepts w] ≥ 1 - ε, and
2. w ∉ A implies Pr[M rejects w] ≥ 1 - ε.

We also consider error probability bounds that depend on the input length n. For example, error probability ε = 2^{-n} indicates an exponentially small probability of error.

Definition 10.4. BPP is the class of languages that are recognized by probabilistic polynomial time TMs with an error probability of 1/3.

Instead of 1/3, any ε strictly between 0 and ½ would yield an equivalent definition, by virtue of the amplification lemma (on the next slide). It gives a simple way of making the error probability exponentially small. Note that a probabilistic algorithm with an error probability of 2^{-100} is far more likely to give an erroneous result because the computer on which it runs has a hardware failure than because of an unlucky toss of its coins.
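As a quick check of this definition against the example on the previous slide (restricted to the two strings whose probabilities were computed there): taking ε = 1/4, the machine recognizes {0} with error probability 1/4, since

Pr[M accepts 0] = 3/4 ≥ 1 - 1/4,   and   Pr[M rejects 00] = 1 ≥ 1 - 1/4.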
The amplification lemma (10.2.d)

Lemma 10.5. Let ε be a fixed constant strictly between 0 and ½, and let p(n) be any polynomial. Then any probabilistic polynomial time TM M1 that operates with error probability ε has an equivalent probabilistic polynomial time TM M2 that operates with an error probability of 2^{-p(n)}.

Proof idea: M2 simulates M1 by running it a polynomial number of times and taking the majority vote of the outcomes. The probability of error decreases exponentially with the number of runs of M1 made.
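The proof idea translates directly into code. Below is a minimal Python sketch of the majority-vote construction, assuming a hypothetical run_M1(w) that performs one independent randomized run of M1 and returns accept (True) or reject (False); the toy machine is an invented stand-in with error probability 1/4. To reach error 2^{-p(n)}, the number of trials must grow polynomially in the input length n.

```python
import random

def majority_vote(run_M1, w, trials):
    """Amplification sketch: run M1 on w `trials` times independently
    and accept iff a strict majority of the runs accept.  If each run
    errs with probability eps < 1/2, the majority errs with probability
    that shrinks exponentially in `trials` (by a Chernoff bound)."""
    accepts = sum(run_M1(w) for _ in range(trials))
    return 2 * accepts > trials  # strict majority; a tie counts as reject

# Hypothetical M1 for the language {0}: answers correctly with
# probability 3/4 on every input (error probability 1/4).
def toy_M1(w):
    correct = (w == '0')
    return correct if random.random() < 0.75 else not correct

print(majority_vote(toy_M1, '0', 101))   # True with overwhelming probability
print(majority_vote(toy_M1, '00', 101))  # False with overwhelming probability
```

Using an odd number of trials avoids ties altogether, which is why 101 is chosen in the usage example above.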
Open problems surrounding BPP (10.2.e)

Besides the problems in P, which are obviously in BPP, many problems have been known to be in BPP but not known to be in P. The number of such problems is decreasing, and it is conjectured that P = BPP. For a long time, one of the most famous problems known to be in BPP but not known to be in P was PRIMES. However, in 2002, Agrawal and his students showed that PRIMES ∈ P.

The relationship between BPP and NP is unknown: it is not known whether BPP is a subset of NP, whether NP is a subset of BPP, or whether they are incomparable.

BPP is known to be a subset of PSPACE; whether the converse inclusion also holds is unknown. It is also known that either P = BPP, or P ≠ NP, or both.