CSC 4170 Theory of Computation Space complexity Chapter 8.

Based on PowerPoint slides by Giorgi Japaridze, Villanova University: Space Complexity and Interactive Proof Systems, Sections 8.0, 8.1, 8.2, 8.3, 10.4.
Space complexity defined

Definition 8.1 Let M be a deterministic Turing machine that halts on all inputs. The space complexity of M is the function f: N → N, where f(n) is the maximum number of tape cells that M scans on any input of length n. If the space complexity of M is f(n), we also say that M runs in space f(n).

If M is a nondeterministic TM wherein all branches halt on all inputs, we define its space complexity f(n) to be the maximum number of tape cells that M scans on any branch of its computation for any input of length n.

SPACE and NSPACE (8.0.b)

Definition 8.2 Let f: N → R+ be a function. The space complexity classes SPACE(f(n)) and NSPACE(f(n)) are defined as follows.

SPACE(f(n)) = {L | L is a language decided by an O(f(n)) space deterministic Turing machine}
NSPACE(f(n)) = {L | L is a language decided by an O(f(n)) space nondeterministic Turing machine}

Space is more powerful than time

Example 8.3 Space is more powerful than time, because it can be reused. E.g., while we believe that SAT has no polynomial (let alone linear) time algorithm, it can easily be decided in linear space:

M1 = “On input ⟨φ⟩, where φ is a Boolean formula:
1. For each truth assignment to the variables x1, ..., xm of φ:
2. Evaluate φ on that truth assignment.
3. If φ ever evaluated to 1, accept; if not, reject.”

Each iteration of the loop needs extra memory only for remembering the current truth assignment and for evaluating φ on that assignment. This takes O(n) space, which can then be recycled during the next iteration. Thus, the overall space complexity remains linear.

What is the time complexity of this algorithm, by the way? Exponential.
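As a sketch, the loop of M1 can be mimicked in Python. The CNF encoding used here (clauses as lists of signed integers) is an assumption made for illustration; the slide speaks of arbitrary Boolean formulas.

```python
from itertools import product

def sat_bruteforce(num_vars, clauses):
    """Decide satisfiability of a CNF formula by trying every truth
    assignment, as machine M1 does.  Only the current assignment is
    kept in memory, so the extra space is linear in the formula size,
    while the running time is exponential (2**num_vars iterations).

    `clauses` is a list of clauses; each clause is a list of ints,
    where i means variable i and -i means its negation (1-based).
    """
    for assignment in product([False, True], repeat=num_vars):
        # a clause is satisfied if some literal in it evaluates to True
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True   # phi evaluated to 1 on this assignment
    return False          # no assignment worked
```

The space reuse is automatic here: each new `assignment` tuple replaces the previous one.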

Nondeterministic space is not much more powerful than deterministic space

Theorem 8.5 (Savitch’s Theorem) For any function f: N → R+, where f(n) ≥ n, we have NSPACE(f(n)) ⊆ SPACE(f²(n)).
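The engine behind Savitch’s theorem is a recursive test for whether one configuration can yield another within t steps, found by trying every midpoint. A sketch, under the assumption that configurations and their one-step transitions are given as a plain directed graph (the names `can_yield`, `adj`, `nodes` are illustrative):

```python
def can_yield(adj, nodes, c1, c2, t):
    """Savitch-style reachability: can configuration c1 reach c2 in at
    most (roughly) t steps?  The recursion has depth O(log t), and each
    level remembers only one candidate midpoint m -- this reuse of the
    midpoint storage is what gives the f(n)^2 space bound.

    `adj` maps each node to the collection of its one-step successors;
    `nodes` lists all configurations (possible midpoints)."""
    if t == 0:
        return c1 == c2
    if t == 1:
        return c1 == c2 or c2 in adj.get(c1, ())
    half = (t + 1) // 2
    # try every configuration m as the midpoint of the path
    return any(can_yield(adj, nodes, c1, m, half) and
               can_yield(adj, nodes, m, c2, half)
               for m in nodes)
```

A nondeterministic f(n)-space machine has 2^O(f(n)) configurations, so calling this with t equal to that count, and storing one O(f(n))-size configuration per recursion level, uses O(f(n)) · O(f(n)) = O(f²(n)) space.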

PSPACE defined

Definition 8.6 PSPACE is the class of languages that are decidable in polynomial space on a deterministic TM. In other words,
PSPACE = SPACE(n) ∪ SPACE(n²) ∪ SPACE(n³) ∪ ...

NPSPACE can be defined similarly. However, the latter is not a very interesting class because, as an immediate corollary of Savitch’s theorem, it coincides with PSPACE (squaring polynomial space again yields polynomial space).

This is what we know (why?): P ⊆ NP ⊆ PSPACE = NPSPACE ⊆ EXPTIME. We do not know, however, whether any of the three ⊆’s can be replaced by =. Another set of huge open problems! It can be proven, however, that P ≠ EXPTIME. So, at least one of the three containments must be proper (⊆ but not =), even though we do not know which one(s)!

The TQBF problem (8.3.b)

Universal quantifier ∀: ∀x P(x) means “for any x ∈ {0,1}, P(x) is true”.
Existential quantifier ∃: ∃x P(x) means “for some x ∈ {0,1}, P(x) is true”.

We consider fully quantified Boolean formulas (in prenex form). These are Boolean formulas prefixed with either ∀x or ∃x for each variable x. Examples (true or false?):

∀x (x ∨ ¬x)
∃x (x ∧ ¬x)
∀x∃y ((x ∨ y) ∧ (¬x ∨ ¬y))
∃y∀x ((x ∨ y) ∧ (¬x ∨ ¬y))
∃z∀x∃y ((x ∨ y ∨ z) ∧ (¬x ∨ ¬y ∨ ¬z))

TQBF = {⟨φ⟩ | φ is a true fully quantified Boolean formula} (True Quantified Boolean Formulas)

The PSPACE-completeness of TQBF

Theorem 8.9 TQBF is PSPACE-complete.

Intuitively, this means that no problem in PSPACE is really harder than TQBF. Technically, it means that if one finds a polynomial time algorithm for TQBF, then polynomial time algorithms for all other problems in PSPACE can be automatically generated, and thus PSPACE = P.

A polynomial space algorithm for TQBF (8.3.d)

The following algorithm obviously decides TQBF:

T = “On input ⟨φ⟩, a fully quantified Boolean formula:
1. If φ contains no quantifiers, then it is an expression with only constants, so evaluate φ and accept if true; otherwise, reject.
2. If φ is ∃x ψ, recursively call T on ψ, first with 0 substituted for x and then with 1 substituted for x. If either result is accept, then accept; otherwise, reject.
3. If φ is ∀x ψ, recursively call T on ψ, first with 0 substituted for x and then with 1 substituted for x. If both results are accept, then accept; otherwise, reject.”

Analysis: Let m be the number of variables that appear in φ. The depth of recursion does not exceed m, and at each level of recursion we need only store the value of one variable. So the total space used is O(m), and hence linear in the size of φ.

To complete the proof of Theorem 8.9, we also need to show that TQBF is PSPACE-hard. A detailed proof of this part is technically somewhat trickier than (but otherwise similar to) the proof of the Cook-Levin theorem, and we omit it.
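The recursive machine T can be sketched directly in Python; the nested-tuple encoding of formulas below is an assumption made for illustration, not fixed by the slides.

```python
def tqbf(phi, env=None):
    """Decide a fully quantified Boolean formula, mirroring machine T.
    Formulas are nested tuples (an assumed encoding):
      ('forall', x, body), ('exists', x, body),
      ('and', a, b), ('or', a, b), ('not', a),
    and a bare string is a variable name.
    Recursion depth equals the number of quantifiers, and each level
    stores one variable binding, so the space used is linear."""
    env = env or {}
    if isinstance(phi, str):            # a variable: look up its value
        return env[phi]
    op = phi[0]
    if op == 'exists':                  # step 2 of T: either may accept
        _, x, body = phi
        return any(tqbf(body, {**env, x: v}) for v in (False, True))
    if op == 'forall':                  # step 3 of T: both must accept
        _, x, body = phi
        return all(tqbf(body, {**env, x: v}) for v in (False, True))
    if op == 'not':
        return not tqbf(phi[1], env)
    if op == 'and':
        return tqbf(phi[1], env) and tqbf(phi[2], env)
    return tqbf(phi[1], env) or tqbf(phi[2], env)   # 'or'
```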

Formulas as games

Each fully quantified Boolean formula (in prenex form) φ can be seen as a game between two players, A and E.
If φ = ∃x ψ(x), it is E’s move, who should select x=0 or x=1, after which the game continues as ψ(0) or ψ(1), respectively.
If φ = ∀x ψ(x), it is A’s move, who should select x=0 or x=1, after which the game continues as ψ(0) or ψ(1), respectively.
The play continues until all quantifiers are stripped off, after which E is considered the winner iff the final, variable-free formula is true.

∃x∀y∃z [(x∨y) ∧ (y∨z) ∧ (y∨¬z)]   E moves, selects x=1
∀y∃z [(1∨y) ∧ (y∨z) ∧ (y∨¬z)]   A moves, selects y=0
∃z [(1∨0) ∧ (0∨z) ∧ (0∨¬z)]   E moves, selects z=1
(1∨0) ∧ (0∨1) ∧ (0∨¬1)   A wins

Who has a winning strategy (a strategy that guarantees a win no matter how the adversary acts) in this example?

The FORMULA-GAME problem

Who has a winning strategy in ∃x∀y∃z [(x∨y) ∧ (y∨z) ∧ (y∨¬z)]?

FORMULA-GAME = {⟨φ⟩ | Player E has a winning strategy in φ}

Theorem 8.11 FORMULA-GAME is PSPACE-complete.

Proof. This is so for a simple reason: we simply have FORMULA-GAME = TQBF. To see this, observe that φ is true iff player E has a winning strategy in it. A detailed proof of this fact (if it were necessary) could proceed by induction on the length of the quantifier prefix of φ.

The child’s game Geography

Players, called I and II, take turns naming cities from anywhere in the world (player I starts). Each city chosen must begin with the same letter that ended the previous city’s name. Repetitions are not permitted. The player who is unable to continue loses.

We can model this game with a directed graph whose nodes are the cities of the world. There is an edge from one city to another if the first can lead to the second according to the game rules. One node is designated as the start node/city. The condition that cities cannot be repeated means that the path that is being spelled out must be simple.

(The slide shows a fragment of this graph, with cities such as Peoria → Austin → Nashua → Albany, along with Orsay, Tokyo, Amherst, Tucson, and Oakland.)

Generalized Geography

In Generalized Geography, we take an arbitrary directed graph with a designated start node instead of the graph associated with the actual cities.

(The slide shows two example digraphs on nodes 1 through 9, asking for each: who has a winning strategy here?)
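Who wins a Generalized Geography position can be decided by exhaustive game-tree search: the player to move wins iff some legal move leads to a position where the opponent loses. The sketch below (function name and graph encoding are illustrative) takes exponential time, but its recursion stores only the current simple path, which hints at why the problem lies inside PSPACE; Theorem 8.14 below says it is in fact complete for that class.

```python
def mover_wins(adj, node, visited=frozenset()):
    """Decide Generalized Geography on digraph `adj` (node -> successors)
    with the current token on `node`.  Returns True iff the player about
    to move has a winning strategy.  The recursion depth is at most the
    number of nodes, and each level adds one node to the path."""
    visited = visited | {node}
    # the mover wins iff some unvisited successor is losing for the opponent;
    # with no legal move (empty generator), any(...) is False: the mover loses
    return any(not mover_wins(adj, nxt, visited)
               for nxt in adj.get(node, ()) if nxt not in visited)
```

Called on the start node b of graph G, this decides membership of ⟨G, b⟩ in GG, since player I is the first to move.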

GG and its PSPACE-completeness

GG = {⟨G, b⟩ | Player I has a winning strategy for the Generalized Geography game played on graph G starting at node b}

Theorem 8.14 GG, just like TQBF, is PSPACE-complete.

Sublinear complexity

Sublinear (less than linear) time complexity does not make sense (why?). But it does make sense as space complexity. We need to slightly modify our TM model of computation, though. The modification consists in separating the input tape from the work tape. The work tape remains read/write, but the input tape is read-only. When counting space complexity, we only look at how many cells of the work tape are utilized.

This separation is not artificial. In real life, it is often the case that a read-only input is bigger than the computer’s memory (“work tape”); a CD-ROM is an example. Or, if we want to focus on fast computations, the computer’s memory becomes even more limited: registers only. Logarithmic space computability can be seen as computability with registers only, or, more generally, computability with memory that is “much smaller than” the input.

L and NL defined

Definition 8.17
1. L is the class of languages that are decidable in logarithmic space on a deterministic Turing machine. In other words, L = SPACE(log n).
2. NL is the class of languages that are decidable in logarithmic space on a nondeterministic Turing machine. In other words, NL = NSPACE(log n).

{0^k 1^k | k ≥ 0} ∈ L

Example 8.18 Remember the machine for {0^k 1^k | k ≥ 0} from Section 7.1, which works by zigzagging back and forth. What is its space complexity? Linear.

To improve the space (but not time!) complexity, we can set up a different machine. Instead of zigzagging, such a machine just makes one pass through the input and records on its work tape, in binary notation, the number m of 0s and the number n of 1s. Holding m and n requires only logarithmic space. Then the machine compares these two numbers by zigzagging, which does not take any additional space.
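The one-pass counting machine can be sketched as follows; the Python integers play the role of the two binary counters on the work tape, so the work space is O(log n) while the read-only input string is never modified.

```python
def in_0k1k(s):
    """One-pass decision of {0^k 1^k | k >= 0}.  The input is scanned
    left to right once; only the two counters (and one flag) are 'work
    tape' contents, i.e. O(log n) bits for an input of length n."""
    zeros = ones = 0
    seen_one = False
    for ch in s:
        if ch == '0':
            if seen_one:        # a 0 after a 1: wrong shape, reject
                return False
            zeros += 1
        elif ch == '1':
            seen_one = True
            ones += 1
        else:
            return False        # not a string over {0,1}
    return zeros == ones        # final comparison of the two counters
```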

Example 8.19 Remember our polynomial-time algorithm for PATH 8.4.d PATH  NL Example 8.19 Remember our polynomial-time algorithm for PATH from Section 7.2, which works by marking nodes. What is its space complexity? Linear We can set up a nondeterministic logarithmic space machine for PATH. Starting from s as the “current node”, every time it guesses a next node --- among the nodes pointed to by the current node --- and jumps to it, until it hits t or has already visited more nodes than the number of nodes in the graph without hitting t. In the former case it accepts, and in the latter case rejects. At any time, the machine thus only needs to remember “the current node”, as well as the node count. How much space do these two pieces of information take? Logarithmic. So, the whole algorithm runs in logarithmic space.
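One branch of this nondeterministic machine can be mimicked by making each guess with a coin flip. Note that sampling random runs only approximates the existential acceptance condition ("some branch accepts"); this is purely an illustration of how little the machine remembers, and the function names are made up for it.

```python
import random

def nl_path_one_run(adj, s, t):
    """One run of the nondeterministic log-space PATH machine, with the
    guesses made by coin flips (a stand-in for nondeterminism).  Only
    the current node and a step counter are remembered: O(log n) bits
    for a graph with n nodes.  `adj` maps nodes to successor sets."""
    n = len(adj)
    current, steps = s, 0
    while steps <= n:
        if current == t:
            return True                  # hit t: this branch accepts
        nxts = list(adj.get(current, ()))
        if not nxts:
            return False                 # dead end: this branch rejects
        current = random.choice(nxts)    # the nondeterministic guess
        steps += 1
    return False                         # counter exceeded n: reject

def nl_path(adj, s, t, runs=5000):
    """PATH membership means *some* branch accepts; sampling many
    random runs approximates that check (illustration only)."""
    return any(nl_path_one_run(adj, s, t) for _ in range(runs))
```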

PATH is NL-complete

Theorem 8.25 PATH is NL-complete.

Intuitively, this means that no problem in NL is really harder than PATH. Technically, it means that if one finds a logarithmic space deterministic algorithm for PATH, then logarithmic space deterministic algorithms for all other problems in NL can be automatically generated, and thus NL = L.

Of course, L ⊆ NL. It is also known that L, NL ⊆ P. But whether L = NL, or whether L = P, remains among the greatest open problems in theoretical computer science.