Resource-Bounded Computation
Previously: can it be done? Now: how efficiently can it be done?
Goal: conserve computational resources: time, space, other.
Def: L is decidable within time O(t(n)) if some TM M that decides L always halts on all w ∈ Σ* within O(t(|w|)) steps.
Def: L is decidable within space O(s(n)) if some TM M that decides L always halts on all w ∈ Σ* while never using more than O(s(|w|)) tape cells.

Complexity Classes
Def: DTIME(t(n)) = {L | L is decidable within time O(t(n)) by some deterministic TM}
Def: NTIME(t(n)) = {L | L is decidable within time O(t(n)) by some non-deterministic TM}
Def: DSPACE(s(n)) = {L | L is decidable within space O(s(n)) by some deterministic TM}
Def: NSPACE(s(n)) = {L | L is decidable within space O(s(n)) by some non-deterministic TM}
Properties!

Time is Tape Dependent
Theorem: The time depends on the # of TM tapes.
Idea: more tapes can enable higher efficiency.
Ex: {0^n 1^n | n>0} is in DTIME(n²) for 1-tape TMs, and is in DTIME(n) for 2-tape TMs.
Note: For multi-tape TMs, input-tape space does not "count" toward the total space s(n). This enables analyzing sub-linear space complexities.
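To make the 2-tape speedup concrete, here is a minimal Python sketch (not from the original slides; the list-based "tapes" and function name are illustrative, not a formal TM encoding) of the linear-time 2-tape strategy for {0^n 1^n}: copy the 0-prefix onto a work tape, then cancel one work-tape cell per 1.

```python
def decide_0n1n_two_tape(w: str) -> bool:
    """Illustrative 2-tape idea for {0^n 1^n | n > 0} in O(n) steps:
    pass 1 copies the 0-prefix to a work tape, pass 2 cancels one
    work-tape cell per 1, then checks both 'heads' reached the end."""
    work_tape = []                       # second tape
    i = 0
    while i < len(w) and w[i] == '0':    # copy leading 0s
        work_tape.append('0')
        i += 1
    if i == 0:                           # need at least one 0
        return False
    j = 0
    while i < len(w) and w[i] == '1' and j < len(work_tape):
        i += 1                           # consume a 1 ...
        j += 1                           # ... and cancel one stored 0
    return i == len(w) and j == len(work_tape)

assert decide_0n1n_two_tape("000111")
assert not decide_0n1n_two_tape("00111")
```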

Space is Tape Independent
Theorem: The space does not depend on the # of tapes.
Idea: tapes can be "interlaced" space-efficiently.
Note: This does not asymptotically increase the overall space (but can increase the total time).
Theorem: A 1-tape TM can simulate a t(n)-time-bounded k-tape TM in time O(t²(n)).
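A small sketch, assuming Python lists stand in for tapes, of the interlacing idea: cell i of tape j is stored at slot i·k + j of a single combined tape, so simulating k tapes on one tape does not change the asymptotic space.

```python
def interlace(tapes):
    """Store k tapes on one tape: cell i of tape j goes to slot i*k + j.
    Total cells used is k * (max tape length), so the number of tapes
    does not change the asymptotic space bound."""
    k = len(tapes)
    length = max(len(t) for t in tapes)
    combined = [None] * (k * length)
    for j, tape in enumerate(tapes):
        for i, cell in enumerate(tape):
            combined[i * k + j] = cell
    return combined

def read_cell(combined, k, tape_index, cell_index):
    """Simulated head of tape `tape_index` reading its cell `cell_index`."""
    return combined[cell_index * k + tape_index]

tapes = [list("abc"), list("01")]
merged = interlace(tapes)
assert read_cell(merged, 2, 0, 1) == 'b'
assert read_cell(merged, 2, 1, 0) == '0'
```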

Space-Time Relations
Theorem: If t(n) ≤ t'(n) for all n ≥ 1, then:
  DTIME(t(n)) ⊆ DTIME(t'(n))
  NTIME(t(n)) ⊆ NTIME(t'(n))
Theorem: If s(n) ≤ s'(n) for all n ≥ 1, then:
  DSPACE(s(n)) ⊆ DSPACE(s'(n))
  NSPACE(s(n)) ⊆ NSPACE(s'(n))
Example: NTIME(n) ⊆ NTIME(n²)
Example: DSPACE(log n) ⊆ DSPACE(n)

Examples of Space & Time Classes
Let L₁ = {0^n 1^n | n>0}:
For 1-tape TMs: L₁ ∈ DTIME(n²), L₁ ∈ DSPACE(n), L₁ ∈ DTIME(n log n)
For 2-tape TMs: L₁ ∈ DTIME(n), L₁ ∈ DSPACE(log n)

Examples of Space & Time Classes
Let L₂ = Σ*:
  L₂ ∈ DTIME(n)   (Theorem: every regular language is in DTIME(n))
  L₂ ∈ DSPACE(1)  (Theorem: every regular language is in DSPACE(1))
  L₂ ∈ DTIME(1)   (accept immediately, without reading the input)
Let L₃ = {w$w | w ∈ Σ*}:
  L₃ ∈ DTIME(n²), L₃ ∈ DSPACE(n), L₃ ∈ DSPACE(log n)
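As a hedged illustration (Python; the function name is mine, not the slides') of why L₃ = {w$w} needs only O(log n) work space: the input is treated as read-only, and the only writable storage is a few integer counters of O(log n) bits each.

```python
def decide_w_dollar_w(x: str) -> bool:
    """Log-space style check for {w$w | w in Sigma*}: the input is read-only;
    the only 'work space' is a handful of position counters."""
    sep, count = -1, 0
    for i, c in enumerate(x):            # find the (unique) separator
        if c == '$':
            sep, count = i, count + 1
    if count != 1:
        return False
    left_len = sep
    right_len = len(x) - sep - 1
    if left_len != right_len:
        return False
    # Compare position i of the left half to position i of the right half,
    # re-reading the input each time instead of copying it.
    for i in range(left_len):
        if x[i] != x[sep + 1 + i]:
            return False
    return True

assert decide_w_dollar_w("ab$ab")
assert not decide_w_dollar_w("ab$ba")
```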

Special Time Classes
Def: P = ∪_{k≥1} DTIME(n^k)   (P ≡ deterministic polynomial time)
Note: P is robust / model-independent.
Def: NP = ∪_{k≥1} NTIME(n^k)   (NP ≡ non-deterministic polynomial time)
Theorem: P ⊆ NP
Conjecture: P ≠ NP   (million-dollar question!)

Other Special Space Classes
Def: PSPACE = ∪_{k≥1} DSPACE(n^k)   (PSPACE ≡ deterministic polynomial space)
Def: NPSPACE = ∪_{k≥1} NSPACE(n^k)   (NPSPACE ≡ non-deterministic polynomial space)
Theorem: PSPACE ⊆ NPSPACE (obvious)
Theorem: PSPACE ⊇ NPSPACE (not obvious)

Other Special Time & Space Classes
Def: EXPTIME = ∪_{k≥1} DTIME(2^(n^k))   (EXPTIME ≡ exponential time)
Def: EXPSPACE = ∪_{k≥1} DSPACE(2^(n^k))   (EXPSPACE ≡ exponential space)
Def: L = LOGSPACE = DSPACE(log n)
Def: NL = NLOGSPACE = NSPACE(log n)

Space/Time Relationships
Theorem: DTIME(f(n)) ⊆ DSPACE(f(n))
Theorem: NTIME(f(n)) ⊆ DTIME(c^f(n)) for some constant c depending on the language.
Theorem: DSPACE(f(n)) ⊆ DTIME(c^f(n)) for some constant c depending on the language.
Theorem [Savitch]: NSPACE(f(n)) ⊆ DSPACE(f²(n))
Corollary: PSPACE = NPSPACE
Theorem: NSPACE(n^r) ⊆ DSPACE(n^(r+ε)) ∀ r>0, ε>0
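The containment DSPACE(f(n)) ⊆ DTIME(c^f(n)) follows by counting configurations; a short worked bound (a standard argument, sketched here under the assumption f(n) ≥ log n):

```latex
\[
  \#\text{configurations} \;\le\;
      \underbrace{|Q|}_{\text{state}}
      \cdot \underbrace{n}_{\text{input head}}
      \cdot \underbrace{f(n)}_{\text{work head}}
      \cdot \underbrace{|\Gamma|^{f(n)}}_{\text{work-tape contents}}
  \;\le\; c^{f(n)} \quad\text{for some constant } c .
\]
% A halting deterministic computation never repeats a configuration,
% so its running time is at most the number of configurations, i.e. c^{f(n)}.
```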

The Extended Chomsky Hierarchy
[Figure: nested language classes — Finite ({a,b}) ⊂ Regular (a*) ⊂ Deterministic CF (a^n b^n) ⊂ Context-free (ww^R) ⊂ ... ⊂ P (a^n b^n c^n) ⊂ NP ⊂ PSPACE ⊂ EXPTIME ⊂ EXPSPACE ⊂ Decidable (Presburger arithmetic) ⊂ Recognizable (=RE; H) ⊂ all languages over Σ* (most not finitely describable), with Turing degrees beyond; Context-sensitive (LBA) sits inside PSPACE. Complete problems marked: SAT (NP-complete), QBF (PSPACE-complete), Go (EXPTIME-complete), and an EXPSPACE-complete problem.]

Time Complexity Hierarchy
Theorem: For any t(n) > 0, there exists a decidable language L ∉ DTIME(t(n)).
⟹ No time complexity class contains all the decidable languages, and the time hierarchy is infinite!
⟹ There are decidable languages that take arbitrarily long times to decide!
Note: t(n) must be computable & everywhere defined.
Proof: (by diagonalization)
Fix a lexicographic order for TMs: M₁, M₂, M₃, ...
Interpret TM inputs i ∈ Σ* as encodings of integers: a=1, b=2, aa=3, ab=4, ba=5, bb=6, aaa=7, ...

[Cartoon slide: "Lexicographic order."]

Time Hierarchy (proof)
Define L = {i | M_i does not accept i within t(i) time}.
Note: L is decidable (by simulation).
Q: Is L ∈ DTIME(t(n))?
Assume (towards contradiction) L ∈ DTIME(t(n)), i.e., there is a fixed K ∈ ℕ such that Turing machine M_K decides L within time bound t(n).
[Diagram: M_K on input i — if M_i accepts i within t(i) time then Reject, else Accept.]

Time Hierarchy (proof)
Consider whether K ∈ L:
K ∈ L ⟹ M_K accepts K within t(K) time ⟹ (by definition of L) K ∉ L
K ∉ L ⟹ M_K rejects K within t(K) time ⟹ M_K does not accept K within t(K) time ⟹ K ∈ L
So (K ∈ L) ⟺ (K ∉ L), a contradiction!
⟹ The assumption is false ⟹ L ∉ DTIME(t(n))
[Diagram: M_K on input K — if M_K accepts K within t(K) time then Reject, else Accept.]
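A sketch of the diagonal decider in Python (assumption: a hypothetical helper `accepts_within(i, x, steps)` that runs a universal simulation of M_i on input x for a bounded number of steps; it is deliberately left unimplemented here, and the choice of t is a placeholder):

```python
def accepts_within(machine_index: int, x: int, steps: int) -> bool:
    """Hypothetical helper: simulate TM M_i on input x for at most `steps`
    steps and report whether it accepted within that budget.  A universal
    TM implements this; the details are omitted in this sketch."""
    raise NotImplementedError

def t(n: int) -> int:
    """Placeholder time bound; any computable, everywhere-defined t works."""
    return n ** 2

def in_diagonal_language(i: int) -> bool:
    """Decides L = { i | M_i does not accept i within t(i) steps }."""
    return not accepts_within(i, i, t(i))
```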

Time Hierarchy (another proof)
[Diagonalization table: rows list all t(n)-time-bounded TMs M₁, M₂, M₃, ..., columns list all inputs i (i.e., w_i ∈ Σ* = a, b, aa, ab, ...); each entry records whether M_j accepts input i.]
Define M'(i) to flip the diagonal entry: M' accepts i iff M_i does not accept i within t(i) time. M' is t(n)-time-bounded, but M' computes a different function than any M_j ⟹ contradiction!

Space Complexity Hierarchy
Theorem: For any s(n) > 0, there exists a decidable language L ∉ DSPACE(s(n)).
⟹ No space complexity class contains all the decidable languages, and the space hierarchy is infinite!
⟹ There are decidable languages that take arbitrarily much space to decide!
Note: s(n) must be computable & everywhere defined.
Proof: (by diagonalization)
Fix a lexicographic order for TMs: M₁, M₂, M₃, ...
Interpret TM inputs i ∈ Σ* as encodings of integers: a=1, b=2, aa=3, ab=4, ba=5, bb=6, aaa=7, ...

Space Hierarchy (proof)
Define L = {i | M_i does not accept i within s(i) space}.
Note: L is decidable (by simulation; infinite loops can be detected by counting configurations).
Q: Is L ∈ DSPACE(s(n))?
Assume (towards contradiction) L ∈ DSPACE(s(n)), i.e., there is a fixed K ∈ ℕ such that Turing machine M_K decides L within space bound s(n).
[Diagram: M_K on input i — if M_i accepts i within s(i) space then Reject, else Accept.]

Space Hierarchy (proof)
Consider whether K ∈ L:
K ∈ L ⟹ M_K accepts K within s(K) space ⟹ (by definition of L) K ∉ L
K ∉ L ⟹ M_K rejects K within s(K) space ⟹ M_K does not accept K within s(K) space ⟹ K ∈ L
So (K ∈ L) ⟺ (K ∉ L), a contradiction!
⟹ The assumption is false ⟹ L ∉ DSPACE(s(n))
[Diagram: M_K on input K — if M_K accepts K within s(K) space then Reject, else Accept.]

Space Hierarchy (another proof)
[Diagonalization table: rows list all s(n)-space-bounded TMs M₁, M₂, M₃, ..., columns list all inputs i (w_i ∈ Σ*); each entry records whether M_j accepts input i.]
Define M'(i) to flip the diagonal entry: M' accepts i iff M_i does not accept i within s(i) space. M' is s(n)-space-bounded, but M' computes a different function than any M_j ⟹ contradiction!

Savitch's Theorem
Theorem: NSPACE(f(n)) ⊆ DSPACE(f²(n))
Proof: Simulation — the idea is to aggressively conserve and reuse space while sacrificing (lots of) time.
Consider a sequence of TM configurations in one branch of an NSPACE(f(n))-bounded computation.
The computation time / length is bounded by c^f(n) (why? — there are only c^f(n) distinct configurations).
We need to simulate this branch and all others too!
Q: How can we space-efficiently simulate this?
A: Use divide-and-conquer with heavy space reuse!

Savitch's Theorem
Pick a midpoint configuration along the target path; verify it is a valid intermediate configuration by recursively solving both subproblems. Iterate over all possible midpoint configurations!
The recursion stack depth is at most log(c^f(n)) = O(f(n)).
Each recursion stack frame has size O(f(n)).
⟹ Total space needed is O(f(n) · f(n)) = O(f²(n))
Note: total time is exponential (but that's OK).
⟹ Non-determinism can be eliminated by squaring the space: NSPACE(f(n)) ⊆ DSPACE(f²(n))
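A Python sketch of the divide-and-conquer reachability at the heart of Savitch's simulation (the configuration graph is abstracted as a `successors` function plus an iterable of configurations; both names are mine, not the slides'). Recursion depth is O(log of the time bound) = O(f(n)), and each frame holds only O(1) configurations of size O(f(n)), giving O(f²(n)) space overall.

```python
def can_yield(c1, c2, t, successors, configs):
    """Savitch-style test: can configuration c1 reach c2 in <= t steps?
    Divide and conquer: guess a midpoint, solve both halves recursively,
    reusing the same space for every subproblem (time is exponential)."""
    if t <= 1:
        return c1 == c2 or (t == 1 and c2 in successors(c1))
    half = t // 2
    for mid in configs:                  # try every possible midpoint
        if can_yield(c1, mid, half, successors, configs) and \
           can_yield(mid, c2, t - half, successors, configs):
            return True
    return False

# Tiny usage example on a 4-node path graph a -> b -> c -> d.
edges = {'a': {'b'}, 'b': {'c'}, 'c': {'d'}, 'd': set()}
succ = lambda c: edges[c]
assert can_yield('a', 'd', 4, succ, edges.keys())
assert not can_yield('d', 'a', 4, succ, edges.keys())
```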

Savitch's Theorem
Corollary: NPSPACE = PSPACE
Proof: NPSPACE = ∪_{k≥1} NSPACE(n^k) ⊆ ∪_{k≥1} DSPACE(n^(2k)) = ∪_{k≥1} DSPACE(n^k) = PSPACE
i.e., polynomial space is invariant with respect to non-determinism!
Q: What about polynomial time?
A: Still open! (P = NP?)

The Extended Chomsky Hierarchy
[Figure: the same hierarchy diagram as before, now with PSPACE = NPSPACE marked.]

Enumeration of Resource-Bounded TMs (extra credit)
Q: Can we enumerate TMs for all languages in P?
Q: Can we enumerate TMs for all languages in PSPACE? EXPTIME? EXPSPACE?
Note: not necessarily in a lexicographic order.

Denseness of Space Hierarchy
Q: How much additional space does it take to recognize more languages?
A: Very little more!
Theorem: Given two space bounds s₁ and s₂ such that lim_{n→∞} s₁(n)/s₂(n) = 0, i.e., s₁(n) = o(s₂(n)), there is a decidable language L such that L ∈ DSPACE(s₂(n)) but L ∉ DSPACE(s₁(n)).
Note: s₂(n) must be computable within s₂(n) space.
⟹ The space hierarchy is infinite and very dense!

Denseness of Space Hierarchy
The space hierarchy is infinite and very dense!
Examples:
  DSPACE(log n) ⊊ DSPACE(log² n)
  DSPACE(n) ⊊ DSPACE(n log n)
  DSPACE(n²) ⊊ DSPACE(n^(2+ε)) for any ε>0
  DSPACE(n^x) ⊊ DSPACE(n^y) ∀ 1<x<y
Corollary: LOGSPACE ⊊ PSPACE
Corollary: PSPACE ⊊ EXPSPACE

Denseness of Time Hierarchy
Q: How much additional time does it take to recognize more languages?
A: At most a logarithmic factor more!
Theorem: Given two time bounds t₁ and t₂ such that t₁(n)·log(t₁(n)) = o(t₂(n)), there is a decidable language L such that L ∈ DTIME(t₂(n)) but L ∉ DTIME(t₁(n)).
Note: t₂(n) must be computable within t₂(n) time.
⟹ The time hierarchy is infinite and pretty dense!

Denseness of Time Hierarchy
The time hierarchy is infinite and pretty dense!
Examples:
  DTIME(n) ⊊ DTIME(n log² n)
  DTIME(n²) ⊊ DTIME(n^(2+ε)) for any ε>0
  DTIME(2^n) ⊊ DTIME(n² 2^n)
  DTIME(n^x) ⊊ DTIME(n^y) ∀ 1<x<y
Corollary: LOGTIME ⊊ P
Corollary: P ⊊ EXPTIME

Complexity Classes Relationships
Theorems: LOGTIME ⊆ L ⊆ NL ⊆ P ⊆ NP ⊆ PSPACE ⊆ EXPTIME ⊆ NEXPTIME ⊆ EXPSPACE ⊆ ...
Theorems: L ⊊ PSPACE ⊊ EXPSPACE ⊊ ...
Theorems: LOGTIME ⊊ P ⊊ EXPTIME ⊊ ...
Conjectures: L ≠ NL, NL ≠ P, P ≠ NP, NP ≠ PSPACE, PSPACE ≠ EXPTIME, EXPTIME ≠ NEXPTIME, NEXPTIME ≠ EXPSPACE, ...
Theorem: At least two of the above conjectures are true!

Theorem: At least two of the following conjectures are true:
  L ≠ NL
  NL ≠ P
  P ≠ NP
  NP ≠ PSPACE
  PSPACE ≠ EXPTIME
  EXPTIME ≠ NEXPTIME
  NEXPTIME ≠ EXPSPACE
Open problems!

The Extended Chomsky Hierarchy
[Figure: the hierarchy diagram repeated, with PSPACE = NPSPACE marked.]

Gap Theorems
There exist arbitrarily large space & time complexity gaps!
Theorem [Borodin]: For any computable function g(n), there exists t(n) such that DTIME(t(n)) = DTIME(g(t(n))).
  Ex: DTIME(t(n)) = DTIME(2^(2^t(n))) for some t(n)
Theorem [Borodin]: For any computable function g(n), there exists s(n) such that DSPACE(s(n)) = DSPACE(g(s(n))).
  Ex: DSPACE(s(n)) = DSPACE(s(n)^s(n)) for some s(n)
Corollary: There exists f(n) such that DTIME(f(n)) = DSPACE(f(n)).
Note: This does not contradict the space and time hierarchy theorems, since t(n), s(n), f(n) may not be computable.

The First Complexity Gap
The first space "gap" is between O(1) and O(log log n).
Theorem: L ∈ DSPACE(o(log log n)) ⟹ L ∈ DSPACE(O(1)) ⟹ L is regular!
All space classes below O(log log n) collapse to O(1).

Speedup Theorem
There are languages for which there are no asymptotic space or time lower bounds for deciding them.
Theorem: For any computable function g(n), there exists a language L such that if TM M accepts L within t(n) time, there is also another TM M' that accepts L within g(t(n)) time.
Corollary: There is a problem such that if any algorithm solves it in time t(n), there is also another algorithm that solves it in time O(log t(n)).
Note: This does not contradict the time hierarchy theorem.
The speedup theorem was proved independently by Boris Trakhtenbrot [1964] and Alan Borodin [1972].

Abstract Complexity Theory
Complexity theory can be machine-independent!
Instead of referring to TMs, we state simple axioms that any complexity measure Φ must satisfy.
Example: the Blum axioms:
  1) Φ(M,w) is finite iff M(w) halts; and
  2) The predicate "Φ(M,w) = n" is decidable.
Theorem: Any complexity measure that satisfies these axioms gives rise to hierarchy, gap, & speedup theorems.
Corollary: Space & time measures satisfy these axioms.
Abstract (or axiomatic) complexity was developed by Manuel Blum in 1967.