TM Design Universal TM MA/CSSE 474 Theory of Computation.

Your Questions?
● Previous class days' material
● Reading assignments
● HW 12 problems
● Anything else
I have included some slides online that we will not have time to do in class, but may be helpful to you anyway.

The CFL Hierarchy

Context-Free Languages Over a Single-Letter Alphabet
Theorem: Any context-free language over a single-letter alphabet is regular.
Proof: Requires Parikh's Theorem, which we are skipping.

Algorithms and Decision Procedures for Context-Free Languages Chapter 14

Decision Procedures for CFLs
Membership: Given a language L and a string w, is w in L? Two approaches:
● If L is context-free, then there exists some context-free grammar G that generates it. Try derivations in G and see whether any of them generates w. Problem (later slide): derivations can grow without bound, so a blind search may never terminate.
● If L is context-free, then there exists some PDA M that accepts it. Run M on w. Problem (later slide): M may follow ε-transitions forever, so it is not guaranteed to halt.

Decision Procedures for CFLs
Membership: Given a language L and a string w, is w in L? Two approaches:
● If L is context-free, then there exists some context-free grammar G that generates it. Try derivations in G and see whether any of them generates w.
Example: S → S T | a. Try to derive aaa: S ⇒ S T ⇒ S T T ⇒ … (the search can go on forever).

Decision Procedures for CFLs
Membership: Given a language L and a string w, is w in L?
● If L is context-free, then there exists some PDA M that accepts it. Run M on w. Problem: M may loop forever on ε-transitions, so it is not guaranteed to halt.

Using a Grammar
decideCFLusingGrammar(L: CFL, w: string) =
1. If given a PDA, build G so that L(G) = L(M).
2. If w = ε then if S_G is nullable then accept, else reject.
3. If w ≠ ε then:
   3.1 Construct G′ in Chomsky normal form such that L(G′) = L(G) − {ε}.
   3.2 If G′ derives w, it does so in 2|w| − 1 steps. Try all derivations in G′ of 2|w| − 1 steps. If one of them derives w, accept. Otherwise reject.
Alternative O(n³) algorithm: CKY.
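The CKY alternative mentioned above can be sketched directly. This is a minimal illustration, not the book's code; the grammar representation (two lists of rules, lowercase strings as terminals) is my own convention.

```python
# Hedged sketch of the O(n^3) CKY membership algorithm for a grammar
# in Chomsky normal form.

def cky(binary_rules, terminal_rules, start, w):
    """Decide whether a CNF grammar derives the nonempty string w."""
    n = len(w)
    if n == 0:
        return False  # a CNF grammar cannot derive the empty string
    # table[(i, j)] = set of nonterminals that derive the substring w[i:j]
    table = {}
    for i, c in enumerate(w):
        table[(i, i + 1)] = {A for A, t in terminal_rules if t == c}
    for length in range(2, n + 1):
        for i in range(0, n - length + 1):
            j = i + length
            cell = set()
            for k in range(i + 1, j):          # try every split point
                for A, (B, C) in binary_rules:
                    if B in table[(i, k)] and C in table[(k, j)]:
                        cell.add(A)
            table[(i, j)] = cell
    return start in table[(0, n)]

# Example: a CNF grammar for { a^n b^n : n >= 1 }
# S -> A Y | A B,  Y -> S B,  A -> a,  B -> b
binary = [("S", ("A", "Y")), ("S", ("A", "B")), ("Y", ("S", "B"))]
terminal = [("A", "a"), ("B", "b")]
print(cky(binary, terminal, "S", "aabb"))  # True
print(cky(binary, terminal, "S", "aab"))   # False
```

Unlike the try-all-derivations procedure, CKY fills in each cell of the table exactly once, which is where the polynomial bound comes from.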

Membership Using a PDA
Recall CFGtoPDAtopdown, which built M = ({p, q}, Σ, V, Δ, p, {q}), where Δ contains:
● The start-up transition ((p, ε, ε), (q, S)).
● For each rule X → s₁s₂…sₙ in R, the transition ((q, ε, X), (q, s₁s₂…sₙ)).
● For each character c ∈ Σ, the transition ((q, c, c), (q, ε)).
Can we make this work so there are no ε-transitions? If every transition consumes an input character, then M would have to halt after |w| steps.
Put the grammar into Greibach normal form: all rules are of the form X → aA, where a ∈ Σ and A ∈ (V − Σ)*. We can combine pushing the rest of the RHS of the production with matching the first character a. Details on p

Greibach Normal Form
All rules are of the following form:
● X → aA, where a ∈ Σ and A ∈ (V − Σ)*.
No need to push the a and then immediately pop it. So M = ({p, q}, Σ, V, Δ, p, {q}), where Δ contains:
1. The start-up transitions: for each rule S → cs₂…sₙ, the transition ((p, c, ε), (q, s₂…sₙ)).
2. For each rule X → cs₂…sₙ (where c ∈ Σ and s₂ through sₙ are elements of V − Σ), the transition ((q, c, X), (q, s₂…sₙ)).

A PDA Without ε-Transitions Must Halt
Consider the execution of M on input w:
● Each individual path of M must halt within |w| steps.
● The total number of paths pursued by M must be less than or equal to P = B^|w|, where B is the maximum number of competing transitions from any state in M.
● The total number of steps that will be executed by all paths of M is bounded by P · |w|.
So all paths must eventually halt.

An Algorithm to Decide Whether M Accepts w
decideCFLusingPDA(L: CFL, w: string) =
1. If L is specified as a PDA, use PDAtoCFG to construct a grammar G such that L(G) = L(M).
2. If L is specified as a grammar G, simply use G.
3. If w = ε then if S_G is nullable then accept, otherwise reject.
4. If w ≠ ε then:
   4.1 From G, construct G′ such that L(G′) = L(G) − {ε} and G′ is in Greibach normal form.
   4.2 From G′ construct a PDA M′ such that L(M′) = L(G′) and M′ has no ε-transitions.
   4.3 All paths of M′ are guaranteed to halt within a finite number of steps. So run M′ on w. Accept if it accepts and reject otherwise.
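Steps 4.2 and 4.3 can be illustrated with a breadth-first simulation of an ε-free PDA: since every transition consumes one input character, every path dies or finishes within |w| steps. The sketch below is my own modeling (not the slides'); the example grammar S → aSB | aB, B → b for { aⁿbⁿ : n ≥ 1 } and its transition table are assumptions chosen to match the GNF-to-PDA construction above.

```python
# Simulate every path of an epsilon-free PDA at once. Each configuration
# is a (state, stack) pair; after |w| rounds no live paths remain.

def runs_and_accepts(transitions, start, accepting, w):
    """transitions: {(state, char, stack_top_or_''): [(new_state, push_string)]}
    A stack top of '' marks the start-up moves, which fire on an empty stack."""
    frontier = {(start, "")}              # all live configurations
    for c in w:
        nxt = set()
        for state, stack in frontier:
            top, rest = (stack[0], stack[1:]) if stack else ("", "")
            for new_state, push in transitions.get((state, c, top), []):
                nxt.add((new_state, push + rest))
        frontier = nxt                    # paths with no move simply die here
    return any(state in accepting and stack == "" for state, stack in frontier)

# Hypothetical GNF grammar for { a^n b^n : n >= 1 }: S -> aSB | aB, B -> b,
# converted to an epsilon-free PDA as on the previous slide.
delta = {
    ("p", "a", ""):  [("q", "SB"), ("q", "B")],   # start-up transitions
    ("q", "a", "S"): [("q", "SB"), ("q", "B")],
    ("q", "b", "B"): [("q", "")],
}
print(runs_and_accepts(delta, "p", {"q"}, "aabb"))  # True
print(runs_and_accepts(delta, "p", {"q"}, "aab"))   # False
```

Note how the frontier never exceeds B^|w| configurations, matching the path-counting bound on the previous slide.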

Emptiness
Given a context-free language L, is L = ∅?
decideCFLempty(G: context-free grammar) =
1. Let G′ = removeunproductive(G).
2. If S is not present in G′ then return True, else return False.
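The heart of removeunproductive is a fixed-point computation of the productive nonterminals. A minimal sketch, assuming my own grammar representation (rules as (lhs, rhs) pairs, rhs a list of symbols, anything not on a left-hand side treated as a terminal):

```python
# decideCFLempty as a fixed-point computation: a nonterminal is productive
# if some rule rewrites it entirely into terminals and productive symbols.

def decide_cfl_empty(rules, start):
    nonterminals = {lhs for lhs, _ in rules}
    productive = set()
    changed = True
    while changed:                        # iterate to a fixed point
        changed = False
        for lhs, rhs in rules:
            if lhs not in productive and all(
                    s in productive or s not in nonterminals for s in rhs):
                productive.add(lhs)
                changed = True
    return start not in productive        # True means L(G) is empty

# S -> A B, A -> a, B -> B b : B never terminates, so L(G) is empty
print(decide_cfl_empty([("S", ["A", "B"]), ("A", ["a"]), ("B", ["B", "b"])], "S"))  # True
# S -> A, A -> a : L(G) = {a}, not empty
print(decide_cfl_empty([("S", ["A"]), ("A", ["a"])], "S"))  # False
```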

Finiteness
Given a context-free language L, is L infinite?
decideCFLinfinite(G: context-free grammar) =
1. Lexicographically enumerate all strings in Σ* of length greater than bⁿ and less than or equal to bⁿ⁺¹ + bⁿ.
2. If, for any such string w, decideCFL(L, w) returns True, then return True: L is infinite.
3. If, for all such strings w, decideCFL(L, w) returns False, then return False: L is not infinite.
Why these bounds?
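As an alternative to enumerating strings between the pumping bounds, there is a standard graph characterization (my substitution, not the slide's procedure): for a grammar in Chomsky normal form whose nonterminals are all reachable and productive, L(G) is infinite iff the graph of "appears on a right-hand side" edges over the nonterminals has a cycle, because a cycle is exactly a self-embedding derivation A ⇒⁺ αAβ that can be pumped.

```python
# Finiteness test by cycle detection. Assumption: the grammar is in CNF
# and every nonterminal is both reachable and productive.

def decide_cfl_infinite(binary_rules):
    """binary_rules: list of (A, (B, C)) for CNF rules A -> B C."""
    graph = {}
    for a, (b, c) in binary_rules:
        graph.setdefault(a, set()).update({b, c})

    WHITE, GRAY, BLACK = 0, 1, 2          # classic DFS cycle detection
    color = {}

    def dfs(v):
        color[v] = GRAY
        for u in graph.get(v, ()):
            if color.get(u, WHITE) == GRAY:
                return True               # back edge: found a cycle
            if color.get(u, WHITE) == WHITE and dfs(u):
                return True
        color[v] = BLACK
        return False

    return any(color.get(v, WHITE) == WHITE and dfs(v) for v in list(graph))

# S -> S S | a : self-embedding, so L is infinite
print(decide_cfl_infinite([("S", ("S", "S"))]))   # True
# S -> A B, A -> a, B -> b : L = {ab}, finite
print(decide_cfl_infinite([("S", ("A", "B"))]))   # False
```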

Equivalence of DCFLs
Theorem: Given two deterministic context-free languages L₁ and L₂, there exists a decision procedure to determine whether L₁ = L₂.
Proof: Given in [Sénizergues 2001].

Some Undecidable Questions about CFLs
● Is L = Σ*?
● Is the complement of L context-free?
● Is L regular?
● Is L₁ = L₂?
● Is L₁ ⊆ L₂?
● Is L₁ ∩ L₂ = ∅?
● Is L inherently ambiguous?
● Is G ambiguous?

Regular and CF Languages
Regular Languages:
● regular exprs. or regular grammars
● = DFSMs
● recognize
● minimize FSMs
● closed under: ♦ concatenation ♦ union ♦ Kleene star ♦ complement ♦ intersection
● pumping theorem
● D = ND
Context-Free Languages:
● context-free grammars
● = NDPDAs
● parse
● try to find unambiguous grammars; try to reduce nondeterminism in PDAs; find efficient parsers
● closed under: ♦ concatenation ♦ union ♦ Kleene star ♦ intersection w/ reg. langs
● pumping theorem
● D ≠ ND

Languages and Machines
[hierarchy diagram] Regular Languages (reg exps, FSMs) ⊂ Context-Free Languages (cfgs, PDAs) ⊂ D ⊂ SD (unrestricted grammars, Turing Machines)

Grammars, SD Languages, and Turing Machines
[diagram] An SD language L, generated by an unrestricted grammar and accepted by a Turing machine.

Turing Machines
We want a new kind of automaton:
● powerful enough to describe all computable things, unlike FSMs and PDAs;
● simple enough that we can reason formally about it, like FSMs and PDAs, unlike real computers.
Goal: Be able to prove things about what can and cannot be computed.

Turing Machines
At each step, the machine must:
● choose its next state,
● write on the current square, and
● move left or right.

A Formal Definition
A (deterministic) Turing machine M is (K, Σ, Γ, δ, s, H):
● K is a finite set of states;
● Σ is the input alphabet, which does not contain ❑;
● Γ is the tape alphabet, which must contain ❑ and have Σ as a subset;
● s ∈ K is the initial state;
● H ⊆ K is the set of halting states;
● δ is the transition function: (K − H) × Γ → K × Γ × {→, ←}, taking a (non-halting state, tape char) pair to a (state, tape char, direction to move (R or L)) triple.

Notes on the Definition
1. The input tape is infinite in both directions.
2. δ is a function, not a relation. So this is a definition for deterministic Turing machines.
3. δ must be defined for all (state, tape symbol) pairs unless the state is a halting state.
4. Turing machines do not necessarily halt (unlike FSMs and most PDAs). Why? To halt, they must enter a halting state. Otherwise they loop.
5. Turing machines generate output, so they can compute functions.

An Example
M takes as input a string in the language { aⁱbʲ : 0 ≤ j ≤ i }, and adds b's as required to make the number of b's equal the number of a's.
The input to M will look like this: ❑aⁱbʲ❑. The output should be: ❑aⁱbⁱ❑.

The Details
K = {1, 2, 3, 4, 5, 6}, Σ = {a, b}, Γ = {a, b, ❑, $, #}, s = 1, H = {6}, δ = [transition table lost in the transcript]
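Since the slide's δ table did not survive transcription, here is a hedged reconstruction: a tiny simulator plus one possible 6-state table with the behavior described (using $ to mark a's and # to mark b's, matching Γ above). The state numbering, the marking scheme, and the convention that the head starts on the first input symbol are my assumptions, not necessarily the slide's.

```python
# One possible TM that pads a^i b^j (0 <= j <= i) out to a^i b^i,
# together with a dictionary-tape simulator.

BLANK = " "
R, L = 1, -1

# delta[(state, symbol)] = (new_state, symbol_to_write, direction)
delta = {
    (1, "$"): (1, "$", R),     # state 1: scan right for an unmarked a
    (1, "a"): (2, "$", R),     #   mark it and go look for a b to match
    (1, "#"): (4, "b", R),     #   no a's left: start unmarking the #'s
    (1, BLANK): (6, BLANK, R), #   empty input: halt immediately
    (2, "a"): (2, "a", R),     # state 2: scan right for an unmarked b
    (2, "#"): (2, "#", R),
    (2, "b"): (3, "#", L),     #   mark the matching b ...
    (2, BLANK): (3, "#", L),   #   ... or append one at the right end
    (3, "a"): (3, "a", L),     # state 3: rewind to the blank on the left
    (3, "b"): (3, "b", L),
    (3, "#"): (3, "#", L),
    (3, "$"): (3, "$", L),
    (3, BLANK): (1, BLANK, R),
    (4, "#"): (4, "b", R),     # state 4: sweep right turning # into b
    (4, BLANK): (5, BLANK, L),
    (5, "b"): (5, "b", L),     # state 5: sweep left turning $ into a
    (5, "$"): (5, "a", L),
    (5, BLANK): (6, BLANK, R), # state 6 is the halting state
}

def run(w):
    tape = {i: c for i, c in enumerate(w)}     # unwritten squares are blank
    state, head = 1, 0
    while state != 6:
        state, write, move = delta[(state, tape.get(head, BLANK))]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip()

print(run("aab"))  # aabb
print(run("aa"))   # aabb
print(run("ab"))   # ab
```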

Notes on Programming
The machine has a strong procedural feel, with one phase coming after another. There are common idioms, like "scan left until you find a blank". There are two common ways to scan back and forth, marking things off. Often there is a final phase to fix up the output. Even a very simple machine is a nuisance to write.

Halting
● A DFSM M, on input w, is guaranteed to halt in |w| steps.
● A PDA M, on input w, is not guaranteed to halt. To see why, consider again M = [PDA diagram lost in the transcript]. But there exists an algorithm to construct an equivalent PDA M′ that is guaranteed to halt.
● A TM M, on input w, is not guaranteed to halt. And there is no algorithm to construct an equivalent TM that is guaranteed to halt.

Formalizing the Operation
A configuration of a Turing machine M = (K, Σ, Γ, δ, s, H) is an element of:
K × ((Γ − {❑}) Γ* ∪ {ε}) × Γ × (Γ* (Γ − {❑}) ∪ {ε})
that is: state, tape up to the current square, current square, tape after the current square.

Example Configurations
(1) (q, ab, b, b) = (q, abbb), where the current square holds the third symbol.
(2) (q, ε, ❑, aabb) = (q, ❑aabb)
Initial configuration is (s, ❑w).

Yields
(q₁, w₁) |-M (q₂, w₂) iff (q₂, w₂) is derivable, via δ, in one step.
For any TM M, let |-M* be the reflexive, transitive closure of |-M.
Configuration C₁ yields configuration C₂ if C₁ |-M* C₂.
A path through M is a sequence of configurations C₀, C₁, …, Cₙ for some n ≥ 0 such that C₀ is the initial configuration and C₀ |-M C₁ |-M C₂ |-M … |-M Cₙ.
A computation by M is a path that halts. If a computation is of length n (has n steps), we write C₀ |-Mⁿ Cₙ.

Exercise
A TM to recognize { wwᴿ : w ∈ {a, b}* }. If the input string is in the language, the machine should halt with y as its current tape symbol. If not, it should halt with n as its current tape symbol. The final symbols on the rest of the tape may be anything.

Hidden: Exercise
A TM to recognize wwᴿ, where w ∈ {a, b}*. New tape symbols: A, B.
(1, ❑) → (2, ❑, →)
(2, ❑) → (3, ❑, →)   (3 is a halting state)
(2, a) → (4, A, →)
(4, a) → (4, a, →)
(4, b) → (4, b, →)
(4, ❑) → (5, ❑, ←)
(4, A) → (5, A, ←)

TMs are complicated
… and low-level! We need higher-level "abbreviations":
– Macros
– Subroutines

A Macro Language for Turing Machines
(1) Define some basic machines:
● Symbol writing machines: for each x ∈ Γ, define Mₓ, written just x, to be a machine that writes x.
● Head moving machines:
  R: for each x ∈ Γ, δ(s, x) = (h, x, →)
  L: for each x ∈ Γ, δ(s, x) = (h, x, ←)
● Machines that simply halt:
  h, which simply halts (don't care whether it accepts);
  n, which halts and rejects;
  y, which halts and accepts.

Checking Inputs and Combining Machines
Next we need to describe how to:
● Check the tape and branch based on what character we see, and
● Combine the basic machines to form larger ones.
To do this, we need two forms:
● M₁ M₂ (run M₁, then run M₂)
● M₁ a M₂ (run M₁; if the current character is then a, run M₂)

Turing Machine Macros, Cont'd
Example: >M₁ with an a-branch to M₂ and a b-branch to M₃.
● Start in the start state of M₁.
● Compute until M₁ reaches a halt state.
● Examine the tape and take the appropriate transition.
● Start in the start state of the next machine, etc.
● Halt if any component reaches a halt state and has no place to go.
● If any component fails to halt, then the entire machine may fail to halt.
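The execution rule above can be sketched in code: model each component machine as a function from (tape, head) to (tape, head), and model the labeled arrows as a dispatch on the symbol under the head after the first machine halts. This modeling is my own illustration, not the book's formalism.

```python
# A "machine" is a function (tape, head) -> (tape, head); composition with
# labeled branches is a dispatch on the current symbol.

BLANK = " "

def read(tape, head):
    return tape.get(head, BLANK)

def move_right(tape, head):   # the basic machine R
    return tape, head + 1

def move_left(tape, head):    # the basic machine L
    return tape, head - 1

def writer(x):                # the basic symbol-writing machine "x"
    def m(tape, head):
        tape = dict(tape)     # copy so component machines stay pure
        tape[head] = x
        return tape, head
    return m

def then(m1, branches):
    """Run m1; then dispatch on the current symbol, as in  >M1 a M2, b M3.
    branches maps a symbol to the next machine; an unlisted symbol halts."""
    def m(tape, head):
        tape, head = m1(tape, head)
        nxt = branches.get(read(tape, head))
        return (tape, head) if nxt is None else nxt(tape, head)
    return m

# Example: "move right; if you then see a, overwrite it with b"
machine = then(move_right, {"a": writer("b")})
tape, head = machine({0: "c", 1: "a"}, 0)
print(read(tape, head))  # b
```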

More Macros
● M₁ with arrows labeled a and b to M₂ becomes: M₁ a, b M₂
● M₁ with arrows for all elements of Γ to M₂ becomes: M₁ M₂, or simply M₁M₂
Variables:
● M₁ with arrows for all elements of Γ except a to M₂ becomes: M₁ x ≠ a M₂, where x takes on the value of the current square
● M₁ a, b M₂ becomes: M₁ x ← a, b M₂, and x takes on the value of the current square
● M₁ x = y M₂: if x = y, then take the transition
e.g., > x ≠ ❑ R x: if the current square is not blank, go right and copy it.

Blank/Non-blank Search Machines
R❑: find the first blank square to the right of the current square.
L❑: find the first blank square to the left of the current square.
R¬❑: find the first nonblank square to the right of the current square.
L¬❑: find the first nonblank square to the left of the current square.

More Search Machines
L_a: find the first occurrence of a to the left of the current square.
R_a,b: find the first occurrence of a or b to the right of the current square.
L_a,b with an a-branch to M₁ and a b-branch to M₂: find the first occurrence of a or b to the left of the current square, then go to M₁ if the detected character is a, or to M₂ if it is b.
L_x←a,b: find the first occurrence of a or b to the left of the current square and set x to the value found.
L_x←a,b R x: find the first occurrence of a or b to the left of the current square, set x to the value found, move one square to the right, and write x (a or b).

An Example
Input: ❑w❑, where w ∈ {1}*
Output: ❑w³❑
Example: ❑111❑ (for which the output would be ❑111111111❑)

What does this machine do?