
CSE 3813 Introduction to Formal Languages and Automata Chapter 10 Other Models of Turing Machines These class notes are based on material from our textbook, An Introduction to Formal Languages and Automata, 4th ed., by Peter Linz, published by Jones and Bartlett Publishers, Inc., Sudbury, MA. They are intended for classroom use only and are not a substitute for reading the textbook.

Diagrams from some slides are from a previous year’s textbook: Martin, John C., Introduction to Languages and the Theory of Computation. Boston: WCG McGraw-Hill. Slides are for use of this class only.

Variations on TMs Most variations neither add to nor subtract from the power of the standard TM. Additions: a stay instruction (on each move, the R/W head moves right, moves left, or stays on the same cell); a tape that is infinite in both directions; multiple tapes (see proof in textbook). Restriction: each move either writes to the tape or moves the head, but not both.

Computational power No attempt to extend the computational power of Turing machines yields a model of computation more powerful than the standard one-tape, one-head, deterministic Turing machine. By computational power, we mean what can be computed -- not how fast it can be computed. Your desktop may run faster than a Turing machine, but it can’t compute anything that a Turing machine can’t also compute.

Off-line Turing machine What if the TM has a second tape that holds the original input string, while the main tape is used for processing? Then you never have to write over the original string. Does this add any power to the TM? No. Imagine writing the string onto the main tape, then inserting a special mark on the tape, then copying the string after the mark and doing all the processing after the mark.

Multiple tapes Consider a TM with k tapes and a separate head for each tape. It reads and writes on these tapes in parallel. We can show that this does not increase the computational power of a TM by showing that any multi-tape TM can be simulated by a standard, single-tape TM. The details of the simulation involve dividing a single tape into multiple tracks -- using an alphabet consisting of tuples, with one element for each track.

Multiple tapes A TM with 3 tapes holding abc..., lmn..., xyz... can be simulated by:

a TM with 1 tape that has 3 tracks on it (abc... / lmn... / xyz...), or a TM with 1 track whose cells hold tuples: (a,l,x)(b,m,y)(c,n,z)...,

or a TM with “words” instead of tuples: alx bmy cnz..., or a TM with each “word” replaced by a unique symbol, one symbol per tuple: @ $ %...
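The track construction can be sketched in a few lines of Python. This is only an illustration, not the textbook’s construction: the function name `combine_tapes` and the `#` padding symbol are my own. Each cell of the single tape becomes a k-tuple holding one symbol from each of the k simulated tapes.

```python
BLANK = "#"  # hypothetical blank symbol, used only to pad short tapes

def combine_tapes(*tapes):
    """Merge k tapes into one tape whose cells are k-tuples,
    one 'track' per simulated tape."""
    width = max(len(t) for t in tapes)
    padded = [list(t) + [BLANK] * (width - len(t)) for t in tapes]
    return list(zip(*padded))

tape = combine_tapes("abc", "lmn", "xyz")
# tape[0] is the tuple ("a", "l", "x"), matching the 3-track picture
```

The simulating machine would also need marker tracks for the k head positions; that bookkeeping is omitted here to keep the tuple idea visible.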

Multiple heads In this case, there is a single tape but k heads that can read/write at different places on the tape at the same time. We show that this does not increase the computational power of TMs by showing that a multiple-head TM can be simulated by a standard single-head TM. The simulation details are similar to those for a multi-tape TM.

Multiple heads A tape holding xyz... with heads at different positions, simulated by a tape with a single head that reads and writes tuples recording the symbol and head markers in each cell: (x, z)(y, Δ)(Δ, Δ)...

Two-dimensional tapes A 2-dimensional tape is a grid that extends infinitely downward as well as to the right. The head can move in 4 directions: right, left, up, and down. This TM can also be simulated by a TM with a single, one-dimensional tape.

Two-dimensional tapes A grid with rows abc..., lmn..., xyz... is stored on a one-dimensional tape row by row, with a separator symbol between rows: abc#lmn#xyz...
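The row-by-row flattening can be made concrete. In this minimal sketch (helper names are my own), the cell at grid position (row, col) lands at index row*(width+1)+col on the flat tape, because each row occupies width cells plus one separator:

```python
def flatten_grid(rows, sep="#"):
    """Store a 2-D tape on a 1-D tape: concatenate the rows,
    separated by a marker symbol."""
    return sep.join(rows)

def cell(flat, row, col, width, sep="#"):
    """Read grid cell (row, col) back off the flattened tape.
    Each row takes width cells plus one separator."""
    return flat[row * (width + 1) + col]

flat = flatten_grid(["abc", "lmn", "xyz"])  # "abc#lmn#xyz"
```

A real simulation must also handle the grid growing downward (appending a new row) and rows growing to the right (shifting everything past the insertion point), which this sketch omits.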

Random-access Turing machine Instead of accessing data on the tape sequentially, imagine a TM that has random-access memory and can go to any cell of the tape in one step. To allow this, the TM has registers that can store memory addresses. We can simulate this by a multi-tape TM in which one tape is used as memory and the extra tapes are used as registers.

Random-access Turing machine A tape holding abcabcabc..., together with registers (Register 1, Register 2) holding memory addresses.

Nondeterministic Turing machine A nondeterministic TM (NTM) has more than one transition with the same left-hand part, which means more than one transition can be made from the same configuration. Nondeterminism allows a TM to have different outputs for the same input. This does not make sense when computing a function, but makes sense for language-recognition in the same way as before. A string is accepted if some computation leads to the halting state.

Non-determinism Back when we looked at finite state machines, we discovered that, although it might take fewer moves to process a string in a regular language with a nondeterministic finite automaton, we could always build a deterministic finite automaton to recognize the same strings.

Non-determinism The relationship between deterministic and nondeterministic Turing machines is similar; it may be possible to do things faster with a nondeterministic TM, but it is always possible to build an equivalent deterministic TM that recognizes the same language.

Nondeterminism and computational power Nondeterminism does not increase the computational power of a TM. We can show this by showing that any NTM can be simulated by a DTM using a technique that the book calls “dovetailing.”

Nondeterminism and efficiency Although nondeterminism does not increase the computational power of a TM, it lets a TM compute some things more efficiently by guessing the right thing to do. Although a DTM can always simulate an NTM, the DTM may be much less efficient, because it has to try all possibilities to find the right one.

Nondeterminism and efficiency Surprisingly, the question of whether a DTM can simulate an NTM efficiently is still unresolved. It is the famous question of whether P = NP. P stands for “can be solved by a standard deterministic Turing machine in polynomial time”; NP stands for “can be solved by a nondeterministic Turing machine in polynomial time.”

Nondeterministic TMs Non-determinism doesn’t give a TM the power to solve harder problems. We can always simulate an ordinary TM on a nondeterministic Turing machine (NTM) by not using the freedom to be nondeterministic. Theorem: A nondeterministic Turing machine (NTM) can be simulated exactly by a deterministic Turing machine. So TMs and NTMs are equivalent.
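One concrete way to realize the dovetailing idea is a breadth-first search over configurations, explored one step at a time. This is a sketch under my own conventions, not the textbook’s construction: `delta` maps (state, symbol) to a list of possible moves, and names like `ntm_accepts` are hypothetical.

```python
from collections import deque

def ntm_accepts(delta, start, accept, tape, max_steps=10_000):
    """Deterministic breadth-first simulation of a nondeterministic TM:
    explore every branch of every choice, one step at a time."""
    BLANK = "_"
    # A configuration is (state, head position, tape contents).
    frontier = deque([(start, 0, tuple(tape))])
    seen = set(frontier)
    for _ in range(max_steps):
        if not frontier:
            break
        state, pos, cells = frontier.popleft()
        if state == accept:
            return True
        symbol = cells[pos] if pos < len(cells) else BLANK
        for nstate, write, move in delta.get((state, symbol), []):
            new = list(cells) + ([BLANK] if pos == len(cells) else [])
            new[pos] = write
            npos = pos + {"R": 1, "L": -1, "S": 0}[move]
            if npos < 0:
                continue  # this branch falls off the left end and dies
            cfg = (nstate, npos, tuple(new))
            if cfg not in seen:
                seen.add(cfg)
                frontier.append(cfg)
    return False

# Toy NTM over {a, b}: in q0 it may either keep scanning right or,
# on reading an 'a', guess that this is the moment to accept.
delta = {("q0", "a"): [("acc", "a", "S"), ("q0", "a", "R")],
         ("q0", "b"): [("q0", "b", "R")]}
```

Breadth-first order matters: a depth-first simulation could follow one infinite branch forever and never reach an accepting branch elsewhere.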

Variations of TM that limit its power What if we change the transition rules so that the read/write head can only move right? Or delete the finite state controller? Wouldn’t those changes limit the power of a TM? Yes! In fact, those changes would limit the power of the TM so much that you really couldn’t call it a TM any more.

Variations of TM that limit its power Restricting the amount of tape that a TM can use limits its computational power. Theory tells us that this is the only modification to the standard TM that can limit the power of the TM.

Variations of TM that limit its power What if we limit the size of the tape to some arbitrary constant, no matter what language we are trying to recognize? A string may then be too long to fit on the tape. The resulting machine is weaker than a pushdown automaton, which has an unbounded stack; the advantages of a tape can’t make up for the lack of adequate storage. In fact, it is equivalent to a finite state automaton (consider tape size = 0).

Variations of TM that limit its power What if we limit the size of the tape to the size of the input string? This gives us a Linear-Bounded Automaton (LBA). An LBA can accept all context-free languages plus other languages like {a^n b^n c^n | n ≥ 0} and {ww | w ∈ {a,b}*}, but not some of the other languages accepted by a standard TM. It is more powerful than a PDA but less powerful than a TM.

Universal Turing Machines The Universal Turing machine simulates any other TM with any tape. The UTM tape has a description of another TM on it, followed by an encoding of the tape that the machine will run on. The Universal Turing machine decodes and simulates the represented TM.

Encoding function A specific TM is defined primarily by its transition function. Each move of a TM is described by the formula: δ(p, a) = (q, b, D) where: p is the current state, a is the current character on the tape, q is the state moved to, b is the character written on the tape, and D is the direction the tape head moves.

Encoding function Suppose that we represent a move, such as δ(q3, a) = (q4, Δ, R), like this: q3 a q4 Δ R (current state, current character on the tape, state moved to, character written on the tape, direction the tape head moves). Can you tell what q3 a q4 Δ R is supposed to represent?

Encoding function So here is our “condensed” rule: q3 a q4 Δ R. Now let’s encode each of these 5 components as a sequence of 0’s, separated by 1’s. For example, the halt state will be represented by a single 0, q0 by two 0’s, q1 by three 0’s, etc.

Encoding function Characters: Δ = 0, a = 00, b = 000. States: halt = 0, q0 = 00, q1 = 000. Directions: Stay = 0, Left = 00, Right = 000.

Encoding function But 00 can stand for both the character a and the state q0; won’t we get confused? No, because there are 5 parts to each rule, the parts are separated by 1’s, and the parts always come in the same order. So 00101000101000 unambiguously represents: q0 Δ q1 Δ R

Change leftmost a to b This TM has 6 transition rules.

Change leftmost a to b This TM has 6 transition rules:
q0 Δ q1 Δ R
q1 b q1 b R
q1 a q2 b L
q1 Δ q2 Δ L
q2 b qh b L
q2 Δ qh Δ S

Change leftmost a to b We use “11” to separate the rules from each other. So this TM can be represented by:
q0 Δ q1 Δ R = 00101000101000
q1 b q1 b R = 0001000100010001000
q1 a q2 b L = 000100100001000100
q1 Δ q2 Δ L = 000101000010100
q2 b qh b L = 00001000101000100
q2 Δ qh Δ S = 000010101010

Change leftmost a to b We can also encode the input string. The string baa would be encoded as: 000100100. We use 11 to separate this string from the TM. So, an encoding of the entire TM, plus the string that it is supposed to process, looks like this (spaces added around the 11 separators for readability only): 00101000101000 11 0001000100010001000 11 000100100001000100 11 000101000010100 11 00001000101000100 11 000010101010 11 000100100
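The encoding scheme can be checked mechanically. In this small sketch the function names are mine, but the tables are exactly the ones from the encoding slides above:

```python
# Each item is a run of 0's; items within a rule are separated by 1,
# and rules (and the input string) are separated by 11.
CHARS = {"Δ": "0", "a": "00", "b": "000"}
STATES = {"halt": "0", "q0": "00", "q1": "000", "q2": "0000"}
DIRS = {"S": "0", "L": "00", "R": "000"}

def encode_rule(p, read, q, write, d):
    """Encode one rule 'p read q write d' as 0-runs separated by 1's."""
    return "1".join([STATES[p], CHARS[read], STATES[q], CHARS[write], DIRS[d]])

def encode_string(s):
    return "1".join(CHARS[c] for c in s)

def encode_machine(rules, tape):
    """Rules separated by 11, then 11, then the encoded input string."""
    return "11".join(encode_rule(*r) for r in rules) + "11" + encode_string(tape)
```

For example, `encode_rule("q0", "Δ", "q1", "Δ", "R")` yields 00101000101000, and `encode_string("baa")` yields 000100100.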

How does the Tu work? The universal Turing machine, Tu, will have 3 tapes. The first tape will be the input/output tape, and initially it contains the entire string, representing both the specific TM we want to simulate and the string that TM is supposed to process. The second tape is the work tape. We will move the encoded string to this tape.

How does the Tu work? The third tape will be used to represent the state that the simulated TM is currently in. We start off by copying the initial state of the TM (q0, or 00, in this case) to tape 3.

How does the Tu work? Tape 1: input/output tape. Tape 2: work tape; contains the encoded string. Tape 3: state the simulated TM is in.

How does the Tu work? You can see how the Tu is going to work. The precondition of any transition rule is: the current state the TM is in (available on tape 3), and the character on the TM’s tape that we are currently reading (available on tape 2). We then look on tape 1 to find the rule whose precondition matches this one.

How does the Tu work? Finally, we execute the postcondition part: changing the TM’s state to the new state (replacing the old state on tape 3), writing a character onto the TM’s tape (on Tu’s tape 2), and moving the tape head (on tape 2) left, right, or keeping it in place.

Does the Tu model the encoded TM? Yes. Why? Because it is deterministic, there are only 2 possibilities: crash or halt. It will crash when the encoded TM does, and halt when the encoded TM does.

Does the Tu model the encoded TM? Crash: if the encoded TM crashes, Tu will not find a matching transition and will crash. Halt: if the encoded TM halts, Tu notices this when it tries to write a single 0 (the halt state) to tape 3 (which keeps track of the current state the TM is in). At this point it erases tape 1, copies tape 2 onto tape 1, and halts.
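The Tu control loop described above can be sketched on already-decoded rules. For clarity this skips the 0-and-1 decoding step; the function name, the rule-tuple format, and the crash-as-exception convention are my own, not the textbook’s.

```python
def universal_tm(rules, tape):
    """Sketch of the Tu control loop on decoded rules.  The cell list
    plays the role of tape 2; the 'state' variable plays tape 3."""
    BLANK = "Δ"
    table = {(p, a): (q, b, d) for (p, a, q, b, d) in rules}
    cells = list(tape) or [BLANK]
    pos, state = 0, "q0"
    while state != "halt":
        key = (state, cells[pos])
        if key not in table:
            raise RuntimeError("simulated TM crashes: no matching rule")
        state, cells[pos], d = table[key]
        pos += {"R": 1, "L": -1, "S": 0}[d]
        if pos == len(cells):
            cells.append(BLANK)  # extend the tape to the right on demand
        if pos < 0:
            raise RuntimeError("simulated TM crashes: fell off the left end")
    return "".join(cells)

# The "change leftmost a to b" machine from the earlier slides:
rules = [("q0", "Δ", "q1", "Δ", "R"), ("q1", "b", "q1", "b", "R"),
         ("q1", "a", "q2", "b", "L"), ("q1", "Δ", "q2", "Δ", "L"),
         ("q2", "b", "halt", "b", "L"), ("q2", "Δ", "halt", "Δ", "S")]
```

Running `universal_tm(rules, "Δbaa")` changes the leftmost a to b, leaving the tape contents Δbba. Note the sketch will loop forever if the simulated machine does, which is exactly the behavior a faithful universal machine must have.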

Conclusion: Anything that is effectively calculable can be executed on a TM. A universal TM can compute anything that any other Turing machine can compute. The universal TM is itself a standard TM. A CPU with RAM is a finite version of a TM; it has the power of a TM up to the point that it runs out of memory. Languages or hardware that provide comparisons, loops, and increments are termed Turing complete and can also compute anything that is effectively calculable.