A shortened version from: Anastasia Berdnikova & Denis Miretskiy.


Transformational grammars

 ‘Colourless green ideas sleep furiously.’
 Chomsky constructed finite formal machines – ‘grammars’.
 ‘Does the language contain this sentence?’ is intractable, but ‘Can the grammar create this sentence?’ can be answered.
 Transformational grammars (TG) are sometimes called generative grammars.

 TG = ( {symbols}, {rewriting rules α → β, called productions} )
 {symbols} = {nonterminals} ∪ {terminals}
 α contains at least one nonterminal; β consists of terminals and/or nonterminals.
 S → aS, S → bS, S → e (shorthand: S → aS | bS | e)
 Derivation: S ⇒ aS ⇒ abS ⇒ abbS ⇒ abb.
 Parse tree: the root is the start nonterminal S, the leaves are the terminal symbols of the sequence, and the internal nodes are nonterminals.
 The children of an internal node are given by the production applied at that node.
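As a minimal sketch (assuming nothing beyond the productions listed above), the derivation S ⇒ aS ⇒ abS ⇒ abbS ⇒ abb can be replayed mechanically:

```python
# Replaying a derivation in the grammar S -> aS | bS | e.
# The empty string e is represented here by "".

PRODUCTIONS = {"S": ["aS", "bS", ""]}

def apply_production(sentential_form, nonterminal, replacement):
    """Rewrite the leftmost occurrence of `nonterminal` with `replacement`."""
    assert replacement in PRODUCTIONS[nonterminal], "not a valid production"
    return sentential_form.replace(nonterminal, replacement, 1)

# Derive the string 'abb' step by step, as in the text.
form = "S"
for rhs in ["aS", "bS", "bS", ""]:
    form = apply_production(form, "S", rhs)
print(form)  # abb
```

Each step rewrites the single remaining S, mirroring how the parse tree grows one production at a time.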

 W – a nonterminal, a – a terminal, α and γ – strings of nonterminals and/or terminals, possibly empty; β – the same, but non-empty.
 regular grammars: W → aW or W → a
 context-free grammars: W → β
 context-sensitive grammars: α₁Wα₂ → α₁βα₂, e.g. AB → BA
 unrestricted (phrase structure) grammars: α₁Wα₂ → γ

 Each grammar has a corresponding abstract computational device – an automaton.
 Grammars are generative models; automata are parsers that accept or reject a given sequence.
 Automata are often easier to describe and understand than their equivalent grammars, and they give a more concrete idea of how we might recognise a sequence using a formal grammar.

Grammar                       Parsing automaton
regular grammars              finite state automaton
context-free grammars         push-down automaton
context-sensitive grammars    linear bounded automaton
unrestricted grammars         Turing machine

 W → aW or W → a
 sometimes allowed: W → e
 Regular grammars (RG) generate sequences from left to right (or right to left: W → Wa or W → a).
 RG cannot describe long-range correlations between the terminal symbols (the ‘primary sequence’).

 An example of a regular grammar that generates only strings of a’s and b’s that have an odd number of a’s: start from S, with S → aT | bS and T → aS | bT | e.

 The automaton reads one symbol at a time from the input string.
 If the symbol is accepted, the automaton enters a new state.
 If the symbol is not accepted, the automaton halts and rejects the string.
 If the automaton reaches a final ‘accepting’ state, the input string has been successfully recognised and parsed by the automaton.
 The {states, state transitions} of the FSA correspond to the {nonterminals, productions} of the corresponding grammar.
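This loop can be sketched directly for the odd-number-of-a’s grammar above; the states mirror the nonterminals S and T, and T is accepting because of the production T → e (the state and table names below are mine):

```python
# FSA for the regular grammar S -> aT | bS, T -> aS | bT | e.
# Reading an 'a' toggles between S and T; 'b' leaves the state unchanged.

TRANSITIONS = {("S", "a"): "T", ("S", "b"): "S",
               ("T", "a"): "S", ("T", "b"): "T"}
ACCEPTING = {"T"}

def accepts(string):
    """Return True iff `string` over {a, b} has an odd number of a's."""
    state = "S"
    for symbol in string:
        if (state, symbol) not in TRANSITIONS:
            return False          # symbol not accepted: halt and reject
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING     # did we end in a final 'accepting' state?

print(accepts("ababa"))  # True (three a's)
```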

RG cannot describe a language L when:
 L contains all strings of the form aa, bb, abba, baab, abaaba, etc. (a palindrome language).
 L contains all strings of the form aa, abab, aabaab, etc. (a copy language).

 Regular language: a b a a a b
 Palindrome language: a a b b a a
 Copy language: a a b a a b
 Palindrome and copy languages have correlations between distant positions.

 The reason this matters: RNA secondary structure is a kind of palindrome language.
 Context-free grammars (CFG) permit additional rules that allow the grammar to create nested, long-distance pairwise correlations between terminal symbols.
 S → aSa | bSb | aa | bb
 S ⇒ aSa ⇒ aaSaa ⇒ aabSbaa ⇒ aabaabaa
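The language of this grammar is exactly the nonempty even-length palindromes over {a, b}, so membership can be checked by peeling matched outer symbols, mirroring the rules S → aSa | bSb down to the terminating S → aa | bb (the function name is mine):

```python
# Recursive membership test for the language of S -> aSa | bSb | aa | bb.

def in_language(s):
    if s in ("aa", "bb"):               # termination rules S -> aa | bb
        return True
    if len(s) >= 4 and s[0] == s[-1] and s[0] in "ab":
        return in_language(s[1:-1])     # rules S -> aSa | bSb
    return False

print(in_language("aabaabaa"))  # True: S => aSa => aaSaa => aabSbaa => aabaabaa
```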

 The parsing automaton for CFGs is called a push-down automaton (PDA).
 A limited number of symbols are kept in a push-down stack.
 A push-down automaton parses a sequence from left to right according to the following algorithm.
 The stack is initialised by pushing the start nonterminal into it.
 The steps are iterated until no input symbols remain.
 If the stack is empty at the end, the sequence has been successfully parsed.

 Pop a symbol off the stack.
 If the popped symbol is a nonterminal:
   - Peek ahead in the input from the current position and choose a valid production for the nonterminal. If there is no valid production, terminate and reject the sequence.
   - Push the right side of the chosen production onto the stack, rightmost symbols first.
 If the popped symbol is a terminal:
   - Compare it to the current symbol of the input. If it matches, move the automaton to the right on the input (the input symbol is accepted). If it does not match, terminate and reject the sequence.
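The loop above can be sketched for a grammar where one symbol of lookahead picks a unique production. The palindrome grammar in the text needs more lookahead, so this sketch substitutes the grammar S → aSb | e (the language aⁿbⁿ, n ≥ 0); the grammar choice is mine, not from the slides:

```python
# Deterministic push-down parsing loop for the grammar S -> aSb | e.

def parse(s):
    stack = ["S"]                       # initialise: push the start nonterminal
    pos = 0
    while stack:
        top = stack.pop()               # pop a symbol off the stack
        if top == "S":                  # nonterminal: peek ahead, pick a production
            if pos < len(s) and s[pos] == "a":
                stack.extend(["b", "S", "a"])  # push RHS, rightmost symbols first
            # otherwise choose S -> e: push nothing
        elif pos < len(s) and s[pos] == top:
            pos += 1                    # terminal matches the input: accept it
        else:
            return False                # mismatch: terminate and reject
    return pos == len(s)                # empty stack, no input left: parsed

print(parse("aaabbb"), parse("aab"))  # True False
```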

 Copy language: cc, acca, agaccaga, etc.

  initialisation:           S → CW
  nonterminal generation:   W → AÂW | GĜW | C
  nonterminal reordering:   ÂA → AÂ, ÂG → GÂ, ĜA → AĜ, ĜG → GĜ
  terminal generation:      CA → aC, CG → gC, ÂC → Ca, ĜC → Cg
  termination:              CC → cc
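Since every production in this grammar is non-contracting, a brute-force breadth-first search over sentential forms no longer than the target string can decide membership; this sketch (mine, not in the slides) shows the grammar really does derive the example strings:

```python
# Breadth-first search over derivations of the copy-language
# context-sensitive grammar given above.
from collections import deque

RULES = [("S", "CW"), ("W", "AÂW"), ("W", "GĜW"), ("W", "C"),
         ("ÂA", "AÂ"), ("ÂG", "GÂ"), ("ĜA", "AĜ"), ("ĜG", "GĜ"),
         ("CA", "aC"), ("CG", "gC"), ("ÂC", "Ca"), ("ĜC", "Cg"),
         ("CC", "cc")]

def derivable(target):
    seen, queue = {"S"}, deque(["S"])
    while queue:
        form = queue.popleft()
        if form == target:
            return True
        for lhs, rhs in RULES:
            start = form.find(lhs)
            while start != -1:          # try the rule at every position
                new = form[:start] + rhs + form[start + len(lhs):]
                # no rule shrinks the string, so longer forms can be pruned
                if len(new) <= len(target) and new not in seen:
                    seen.add(new)
                    queue.append(new)
                start = form.find(lhs, start + 1)
    return False

print(derivable("acca"))  # True
```

The search space grows quickly with the target length, which foreshadows why parsing context-sensitive grammars is expensive in general.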

 Parsing uses a mechanism for working backwards through all possible derivations: either the start nonterminal is reached, or no valid derivation is found.
 There is a finite number of possible derivations to examine.
 Abstractly, the parser is a ‘tape’ of linear memory with a read/write head (a linear bounded automaton).
 However, the number of possible derivations is exponentially large.

 Nondeterministic polynomial (NP) problems: no polynomial-time algorithm is known for finding a solution, but a proposed solution can be checked for correctness in polynomial time. [Context-sensitive grammar parsing.]
 A subclass of NP problems are the NP-complete problems: a polynomial-time algorithm that solves one NP-complete problem would solve all of them. (Context-free grammar parsing, by contrast, is solvable in polynomial time, e.g. by the CYK algorithm.)

 The left and right sides of the production rules can be any combinations of symbols.
 The parsing automaton is a Turing machine.
 There is no general algorithm for determining, in less than infinite time, whether a string has a valid derivation.