Alternation and Polynomial Time Hierarchy. Alternating Turing Machine (ATM): an ATM is an NTM in which every state, except the accept and reject states, is designated either existential (∃) or universal (∀). In the computation tree, an ∃ node is marked accept iff at least one of its children is marked accept; a ∀ node is marked accept iff all of its children are marked accept. [Figure: a computation tree whose internal nodes alternate between ∃ and ∀ and whose leaves are accept/reject configurations.]
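As a concrete illustration of this marking rule, here is a small Python sketch (not part of the slides) that evaluates an explicit computation tree bottom-up; the Node class and its field names are hypothetical.

```python
# Hypothetical illustration: evaluating an ATM computation tree bottom-up.
# A leaf is "accept" or "reject"; an internal node is "exists" or "forall".
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    kind: str                      # "accept", "reject", "exists", or "forall"
    children: List["Node"] = field(default_factory=list)

def accepts(node: Node) -> bool:
    if node.kind == "accept":
        return True
    if node.kind == "reject":
        return False
    results = [accepts(c) for c in node.children]
    # existential node: accept iff ANY child accepts
    # universal node:   accept iff ALL children accept
    return any(results) if node.kind == "exists" else all(results)

# Example: an existential root accepts because its second universal child accepts.
tree = Node("exists", [
    Node("forall", [Node("accept"), Node("reject")]),   # rejected
    Node("forall", [Node("accept"), Node("accept")]),   # accepted
])
print(accepts(tree))  # True
```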

Alternating Time and Space: ATIME(f(n)) = { L : there is an ATM that accepts L and uses O(f(n)) time on all branches }. ASPACE(f(n)) is defined similarly. AL = ASPACE(log n), AP = ∪_k ATIME(n^k). E.g. MIN-FORMULA = { ⟨φ⟩ | φ is a minimal Boolean formula, i.e., there is no shorter equivalent formula }.

MIN-FORMULA  AP. On input  : 1. Universally select all formula f that are shorter than . 2. Existentially select an assignment to the input variables of . 3. Evaluate both  and f on this assignment. 4. Accept if the formulas evaluate to different values; else reject. Thm: ATIME(f(n))  SPACE(f(n))  ATIME(f 2 (n)). Thus, AP = PSPACE. Thm: ASPACE(f(n)) = TIME(2 O(f(n)) ). Thus, ASPACE = EXP.
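The alternation in this algorithm is just a ∀/∃ quantifier pattern: for all shorter ψ there exists an assignment on which φ and ψ disagree. A deterministic (exponential-time) Python sketch of that pattern is shown below; it assumes formulas are given as Python boolean expressions and that the caller supplies the candidate shorter formulas, so all names here are illustrative only.

```python
# Hypothetical sketch of the quantifier structure behind MIN-FORMULA in AP.
# Universal branching becomes all(...), existential branching becomes any(...).
from itertools import product
from typing import Iterable, List

def evaluate(formula: str, assignment: dict) -> bool:
    # Formulas are Python boolean expressions over the given variables, e.g. "(x or y) and not z".
    return bool(eval(formula, {}, assignment))

def is_minimal(phi: str, variables: List[str], shorter_formulas: Iterable[str]) -> bool:
    # phi is minimal iff FOR ALL shorter psi THERE EXISTS an assignment
    # on which phi and psi disagree (i.e., no shorter psi is equivalent).
    assignments = [dict(zip(variables, bits))
                   for bits in product([False, True], repeat=len(variables))]
    return all(
        any(evaluate(phi, a) != evaluate(psi, a) for a in assignments)
        for psi in shorter_formulas
    )

# Tiny usage example: "x and x" is not minimal because "x" is shorter and equivalent.
print(is_minimal("x and x", ["x"], ["x", "not x"]))  # False
```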

Lemma: For f(n) ≥ n, ATIME(f(n)) ⊆ SPACE(f(n)). Proof: Convert an O(f(n))-time ATM M into an O(f(n))-space DTM S. S performs a depth-first search of M's computation tree to determine whether the start configuration is accepting. The search is recursive, and its depth is O(f(n)) because the computation tree has depth O(f(n)). At each level of recursion the stack stores only the nondeterministic choice that M made to reach that configuration from its parent, which uses constant space per level (storing whole configurations would cost O(f(n)²) space). S can recover the current configuration whenever it needs it by replaying the computation from the start and following the recorded "signposts."
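A rough Python rendering of this depth-first search is given below. The helpers step, kind, and degree describing M are hypothetical parameters, and the sketch illustrates the replay idea (recovering a configuration from the recorded choices) rather than the exact space accounting, since Python keeps one replayed configuration alive per stack frame.

```python
# Hypothetical sketch: the recursion passes down only the sequence of nondeterministic
# choices made so far and rebuilds the current configuration by replaying from the start.

def replay(start, choices, step):
    """Recover the configuration reached from start after making the given choices."""
    c = start
    for ch in choices:
        c = step(c, ch)
    return c

def atm_accepts(start, step, kind, degree, choices=()):
    # step(c, i): the configuration reached by taking choice i   (hypothetical helper)
    # kind(c):   "accept", "reject", "exists", or "forall"        (hypothetical helper)
    # degree(c): number of nondeterministic choices available     (hypothetical helper)
    c = replay(start, choices, step)
    k = kind(c)
    if k == "accept":
        return True
    if k == "reject":
        return False
    results = (atm_accepts(start, step, kind, degree, choices + (i,))
               for i in range(degree(c)))
    return any(results) if k == "exists" else all(results)
```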

Lemma: For f(n) ≥ n, SPACE(f(n)) ⊆ ATIME(f(n)²). Proof: Let M be an O(f(n))-space DTM. We construct an O(f(n)²)-time ATM S to simulate M; the construction closely follows the proof of Savitch's theorem. Define φ_{c1,c2,t} = ∃m [ φ_{c1,m,⌈t/2⌉} ∧ φ_{m,c2,⌊t/2⌋} ], which asserts that configuration c2 is reachable from c1 in at most t steps of M. S uses this recursive alternating procedure to test whether the start configuration can reach an accepting configuration within 2^{d·f(n)} steps, for a suitable constant d. The recursion depth is O(f(n)), since t is halved at each level and log 2^{d·f(n)} = O(f(n)). Each level takes O(f(n)) time, because writing down a configuration takes O(f(n)) time (a configuration has length O(f(n))). Thus the algorithm runs in O(f(n)²) alternating time.
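A deterministic Python sketch of the same recursion follows; the existential choice of the midpoint m is rendered as a search over all configurations, and yields(a, b) is a hypothetical one-step relation of M.

```python
# Hypothetical sketch of the Savitch-style recursion φ_{c1,c2,t} = ∃m [φ_{c1,m,⌈t/2⌉} ∧ φ_{m,c2,⌊t/2⌋}].

def reach(c1, c2, t, configs, yields):
    """True iff configuration c2 is reachable from c1 in at most t steps of M.
    configs: all configurations; yields(a, b): a yields b in one step (hypothetical)."""
    if t == 0:
        return c1 == c2
    if t == 1:
        return c1 == c2 or yields(c1, c2)
    # existentially pick a midpoint m, then verify both halves
    return any(reach(c1, m, (t + 1) // 2, configs, yields) and
               reach(m, c2, t // 2, configs, yields)
               for m in configs)

# Tiny usage example on the line graph 0 -> 1 -> 2 -> 3 -> 4.
print(reach(0, 3, 4, range(5), lambda a, b: b == a + 1))  # True
```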

Lemma: For f(n) ≥ log n, ASPACE(f(n)) ⊆ TIME(2^{O(f(n))}). Proof: Construct a 2^{O(f(n))}-time DTM S to simulate an O(f(n))-space ATM M. On input w, S builds the configuration graph of M on w: the nodes are the configurations of M on w, and edges go from a configuration to the configurations it can yield in one step. After the graph is constructed, S repeatedly scans it and marks certain configurations as accepting. Initially, only the actual accepting configurations are marked. A configuration that performs universal branching is marked accepting if all of its children are marked, and an existential configuration is marked if any of its children is marked. S continues until a scan marks no additional nodes, and finally checks whether the start configuration is marked. There are 2^{O(f(n))} configurations of M on w, which is also the size of the graph; each scan takes time polynomial in the graph size and at most that many scans are needed, so the total time used is still 2^{O(f(n))}.
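The marking procedure can be sketched directly over an explicit configuration graph; the dictionary representation and kind labels below are illustrative assumptions, not part of the slides.

```python
# Hypothetical sketch of the marking procedure on an explicit configuration graph.
# graph maps each configuration to its list of successors; kind(c) labels each node.

def accepting_configs(graph, kind):
    """Return the set of configurations marked 'accepting' by repeated scans."""
    marked = {c for c in graph if kind(c) == "accept"}
    changed = True
    while changed:                       # scan until no new node gets marked
        changed = False
        for c, children in graph.items():
            if c in marked or kind(c) in ("accept", "reject"):
                continue
            ok = (any(ch in marked for ch in children) if kind(c) == "exists"
                  else all(ch in marked for ch in children))
            if ok:
                marked.add(c)
                changed = True
    return marked

# Usage: the machine accepts iff the start configuration ends up marked.
graph = {"s": ["a", "r"], "a": [], "r": []}
kind = {"s": "exists", "a": "accept", "r": "reject"}.get
print("s" in accepting_configs(graph, kind))  # True
```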

Lemma: For f(n) ≥ log n, TIME(2^{O(f(n))}) ⊆ ASPACE(f(n)). Proof: Construct an O(f(n))-space ATM S to simulate a 2^{O(f(n))}-time DTM M. On input w, S has only enough space to store pointers into a tableau of the computation of M on w. [Tableau figure: the first row is the start configuration # q0 w1 … wn #, each subsequent row is the next configuration, and the content of a cell d is determined by the contents of the three cells a, b, c in the window directly above it.]

S operates recursively, guessing and then verifying the contents of individual cells of the tableau. Cells in the first row are easy to verify, since S knows the start configuration of M on w. For any other cell d, S existentially guesses the contents of its three parents a, b, c, checks that these contents would yield d's content according to M's transition function, and then universally branches to verify the three guesses recursively. Assume that M moves its head to the leftmost cell after entering its accept state; then S can determine whether M accepts w by checking the content of the lower leftmost cell of the tableau. S never needs to store more than a pointer into the tableau and a constant number of cell contents, so it uses only O(f(n)) space.
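A minimal deterministic sketch of this guess-and-verify recursion is shown below, assuming hypothetical helpers first_row_symbol(j) (the start configuration) and window_ok(a, b, c, d) (M's transition rules for a legal window); tableau boundaries are ignored for brevity.

```python
# Hypothetical sketch of the tableau check: existential guessing becomes a search
# over candidate parent windows, universal verification becomes all(...).

def cell_is(i, j, sym, alphabet, first_row_symbol, window_ok):
    """True iff cell (row i, column j) of the tableau can legally contain sym."""
    if i == 0:
        return sym == first_row_symbol(j)     # first row: the start configuration (hypothetical helper)
    # ∃ parents a, b, c consistent with M's transitions ... ∀ verify each parent recursively
    return any(
        window_ok(a, b, c, sym) and           # hypothetical encoding of M's transition rules
        all(cell_is(i - 1, j + dj, p, alphabet, first_row_symbol, window_ok)
            for dj, p in ((-1, a), (0, b), (1, c)))
        for a in alphabet for b in alphabet for c in alphabet
    )
```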

Polynomial Hierarchy: Let Σ_k TIME(T(n)) be the class of languages L accepted by an alternating Turing machine that begins in an existential state, alternates between ∃ and ∀ states at most k−1 times, and halts within O(T(n)) time. Define Σ_k P = ∪_c Σ_k TIME(n^c). The class of complements of languages in Σ_k P is Π_k P = co-Σ_k P; equivalently, Π_k P is defined via ATMs that begin in a ∀ state. Note that Σ_k P ⊆ Σ_{k+1} P and Π_k P ⊆ Π_{k+1} P, Σ_1 P = NP and Π_1 P = coNP. MIN-CIRCUIT is in Π_2 P.

–Def: The polynomial hierarchy is PH = ∪_{k≥0} Σ_k P. Thm: If Σ_k P ≠ Π_k P for some k ≥ 1, then P ≠ NP. Prop: If there exists a language L that is complete for PH, then there exists a k such that the hierarchy collapses to level k, i.e., PH = Σ_k P.
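For reference, a standard equivalent characterization of these classes by polynomially bounded quantifiers, written out in LaTeX (a sketch of the usual formulation, not taken from the slides):

```latex
% Standard quantifier characterization of the levels of the polynomial hierarchy.
\[
  L \in \Sigma_k \mathrm{P} \;\iff\;
  \text{there are a polynomial } p \text{ and a poly-time relation } R \text{ with}\quad
  x \in L \iff \exists y_1\,\forall y_2\,\cdots\, Q_k y_k\; R(x, y_1, \ldots, y_k),
\]
\[
  \text{where } |y_i| \le p(|x|) \text{ and } Q_k = \exists \text{ if } k \text{ is odd},\
  Q_k = \forall \text{ if } k \text{ is even};\qquad
  \Pi_k \mathrm{P} = \mathrm{co}\Sigma_k \mathrm{P}, \quad
  \mathrm{PH} = \bigcup_{k \ge 0} \Sigma_k \mathrm{P}.
\]
```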