Presentation transcript:

1  CFG → PDA Construction (Dr. Torng)
– Shows that for any CFL L, there exists a PDA M such that L(M) = L
– The reverse is true, but we skip the proof
– Parsing

2  CFL ⊆ LPDA
– Let L be an arbitrary CFL
– Let G be a CFG such that L(G) = L
  (G exists because L is context-free)
– Construct a PDA M such that L(M) = L(G)
– Argue that L(M) = L
– Then there exists a PDA M such that L(M) = L
– Therefore L is in LPDA, by the definition of LPDA

3  Visualization
[Diagram: the language classes CFL and LPDA, each containing L; the set of CFG's containing G and the set of PDA's containing M]
– Let L be an arbitrary CFL
– Let G be a CFG such that L(G) = L
  (G exists because L is context-free)
– Construct a PDA M such that L(M) = L(G)
  (M is constructed from the CFG G)
– Argue that L(M) = L
– Then there exists a PDA M such that L(M) = L
– Therefore L is in LPDA, by the definition of LPDA

4  Algorithm Specification
– Input: CFG G
– Output: PDA M such that L(M) = L(G)
[Diagram: CFG G → algorithm A → PDA M]

5  Construction Idea
The basic idea is a 2-phase PDA.
– Phase 1: Derive all strings in L(G) on the stack nondeterministically
  Do not process any input while deriving the string on the stack
– Phase 2: Match the input string against the derived string on the stack
  This is a deterministic process
  Move to an accepting state only when the stack is empty
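A rough sketch of phase 1 (my own illustration, not from the slides): the Python snippet below repeatedly expands the leftmost variable of a sentential form, which is exactly the string the PDA keeps on its stack while it is deriving. It uses the grammar S → aSb | λ from the next slide; the function and variable names are illustrative.

    def leftmost_expansions(form, productions, variables):
        """Yield every sentential form obtained by expanding the leftmost variable once."""
        for i, symbol in enumerate(form):
            if symbol in variables:
                for rhs in productions[symbol]:
                    yield form[:i] + rhs + form[i + 1:]
                return  # only the leftmost variable is expanded

    # Grammar for {a^n b^n | n >= 0}: S -> aSb | λ  (λ is written as the empty tuple)
    productions = {"S": [("a", "S", "b"), ()]}
    variables = {"S"}

    frontier = [("S",)]
    for round_no in range(1, 4):
        frontier = [new for form in frontier for new in leftmost_expansions(form, productions, variables)]
        print(round_no, ["".join(form) or "λ" for form in frontier])
    # Prints: 1 ['aSb', 'λ']   2 ['aaSbb', 'ab']   3 ['aaaSbbb', 'aabb']

Forms containing no variable are finished strings of L(G); in the PDA these are exactly the stack contents that phase 2 matches against the input.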

6  Illustration
Input grammar G: V = {S}, Σ = {a,b}, start symbol S, P: S → aSb | λ
What is L(G)?
1. Derive all strings in L(G) on the stack
2. Match the derived string against the input
(q0, aabb, Z)
/* put S on stack */
(q1, aabb, SZ)
/* derive aabb on stack */
(q1, aabb, aSbZ)
(q1, aabb, aaSbbZ)
(q1, aabb, aabbZ)
/* match stack vs. input */
(q2, aabb, aabbZ)
(q2, abb, abbZ)
(q2, bb, bbZ)
(q2, b, bZ)
(q2, λ, Z)
(q3, λ, Z)
This is an illustration of how the PDA might work, though it is not completely accurate.

7  Difficulty
1. Derive all strings in L(G) on the stack
2. Match the derived string against the input
(q0, aabb, Z)
/* put S on stack */
(q1, aabb, SZ)
/* derive aabb on stack */
(q1, aabb, aSbZ)
(q1, aabb, aaSbbZ)
(q1, aabb, aabbZ)
/* match stack vs. input */
(q2, aabb, aabbZ)
(q2, abb, abbZ)
(q2, bb, bbZ)
(q2, b, bZ)
(q2, λ, Z)
(q3, λ, Z)
What is illegal with the computation graph on the left?

8  Construction
Input: grammar G = (V, Σ, S, P)
Output: PDA M = (Q, Σ, Γ, q0, Z, F, δ)
– Q = {q0, q1, q2}
– Σ = Σ (the terminal alphabet of G)
– Γ = V ∪ Σ ∪ {Z}
– Z = Z
– q0 = q0
– F = {q2}
δ:
– Fixed transitions
  δ(q0, λ, Z) = (q1, SZ)
  δ(q1, λ, Z) = (q2, Z)
– Production transitions
  For every production A → α, add δ(q1, λ, A) = (q1, α)
– Matching transitions
  For every a ∈ Σ, add δ(q1, a, a) = (q1, λ)
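Below is a sketch of this construction in Python (my own code; λ is represented as the empty string, and the grammar is given by its terminal set, start variable, and productions). It returns the PDA's transition relation as a dictionary.

    def cfg_to_pda(terminals, start, productions):
        """Return the PDA's transitions as (state, input, stack top) -> set of (next state, push string)."""
        delta = {}

        def add(state, inp, top, nstate, push):
            delta.setdefault((state, inp, top), set()).add((nstate, push))

        # Fixed transitions: push the start symbol, and accept when only Z remains
        add("q0", "", "Z", "q1", start + "Z")
        add("q1", "", "Z", "q2", "Z")
        # Production transitions: for every A -> alpha, replace A on the stack by alpha
        for lhs, alternatives in productions.items():
            for alpha in alternatives:
                add("q1", "", lhs, "q1", alpha)
        # Matching transitions: for every terminal a, read a and pop a
        for a in terminals:
            add("q1", a, a, "q1", "")
        return delta

    # Example: the grammar S -> aSb | λ for {a^n b^n | n >= 0} (slide 16)
    delta = cfg_to_pda({"a", "b"}, "S", {"S": ["aSb", ""]})
    for key, moves in sorted(delta.items()):
        print(key, "->", sorted(moves))

The output lists the two fixed transitions, one production transition per production of G, and one matching transition per terminal.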

9 Examples

10  Balanced Parentheses
Grammar BALG: V = {S}, Σ = {(, )}, start symbol S, P: S → SS | (S) | λ
Output PDA M = (Q, Γ, q0, Z, F, δ)
– Q = {q0, q1, q2}
– Γ = {(, ), S, Z}
– q0 = q0
– Z = Z
– F = {q2}
δ:
– Fixed transitions
  δ(q0, λ, Z) = (q1, SZ)
  δ(q1, λ, Z) = (q2, Z)
– Production transitions
  δ(q1, λ, S) = (q1, SS)
  δ(q1, λ, S) = (q1, (S))
  δ(q1, λ, S) = (q1, λ)
– Matching transitions
  δ(q1, (, ( ) = (q1, λ)
  δ(q1, ), ) ) = (q1, λ)

11  BALG Transition Table
Transition  Current  Input   Top of  Next   Stack
Number      State    Symbol  Stack   State  Update
1           q0       λ       Z       q1     SZ
2           q1       λ       Z       q2     Z
3           q1       λ       S       q1     SS
4           q1       λ       S       q1     (S)
5           q1       λ       S       q1     λ
6           q1       (       (       q1     λ
7           q1       )       )       q1     λ

12  Partial Computation Graph
(q0, ()(), Z)
(q1, ()(), SZ)
(q1, ()(), SSZ)      (other branches not shown)
(q1, ()(), (S)SZ)    (other branches not shown)
(q1, )(), S)SZ)
(q1, )(), )SZ)       (other branches not shown)
(q1, (), SZ)
(q1, (), (S)Z)       (other branches not shown)
(q1, ), S)Z)
(q1, ), )Z)          (other branches not shown)
(q1, λ, Z)
(q2, λ, Z)
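To explore a computation graph like this mechanically, here is a bounded breadth-first search over configurations of the BALG PDA from slide 10 (my own sketch, not from the slides). The stack-length prune is a heuristic that keeps the search finite; it is generous enough for the short strings used here, so this is an illustration rather than a general decision procedure.

    from collections import deque

    # (state, input symbol or "" for a λ-move, stack top) -> list of (next state, push string)
    DELTA = {
        ("q0", "", "Z"): [("q1", "SZ")],
        ("q1", "", "Z"): [("q2", "Z")],
        ("q1", "", "S"): [("q1", "SS"), ("q1", "(S)"), ("q1", "")],
        ("q1", "(", "("): [("q1", "")],
        ("q1", ")", ")"): [("q1", "")],
    }

    def successors(state, rest, stack):
        if not stack:
            return
        top, tail = stack[0], stack[1:]
        for nstate, push in DELTA.get((state, "", top), []):           # λ-moves
            yield (nstate, rest, push + tail)
        if rest:
            for nstate, push in DELTA.get((state, rest[0], top), []):  # read one input symbol
                yield (nstate, rest[1:], push + tail)

    def accepts(w, slack=4):
        start = ("q0", w, "Z")
        queue, seen = deque([start]), {start}
        while queue:
            state, rest, stack = queue.popleft()
            if state == "q2" and rest == "":
                return True
            for cfg in successors(state, rest, stack):
                # prune stacks far longer than the remaining input could ever need
                if cfg not in seen and len(cfg[2]) <= len(cfg[1]) + slack:
                    seen.add(cfg)
                    queue.append(cfg)
        return False

    print(accepts("()()"))   # True  -- an accepting path like the one above is found
    print(accepts("(()"))    # False -- the bounded search exhausts every branch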

13  Palindromes
Grammar PALG: V = {S}, Σ = {a,b}, start symbol S, P: S → aSa | bSb | a | b | λ
Output PDA M = (Q, Γ, q0, Z, F, δ)
– Q = {q0, q1, q2}
– Γ = {a, b, S, Z}
– q0 = q0
– Z = Z
– F = {q2}
δ:
– Fixed transitions
  δ(q0, λ, Z) = (q1, SZ)
  δ(q1, λ, Z) = (q2, Z)
– Production transitions
  δ(q1, λ, S) = (q1, aSa)
  δ(q1, λ, S) = (q1, bSb)
  δ(q1, λ, S) = (q1, a)
  δ(q1, λ, S) = (q1, b)
  δ(q1, λ, S) = (q1, λ)
– Matching transitions
  δ(q1, a, a) = (q1, λ)
  δ(q1, b, b) = (q1, λ)

14  Palindrome Transition Table
Transition  Current  Input   Top of  Next   Stack
Number      State    Symbol  Stack   State  Update
1           q0       λ       Z       q1     SZ
2           q1       λ       Z       q2     Z
3           q1       λ       S       q1     aSa
4           q1       λ       S       q1     bSb
5           q1       λ       S       q1     a
6           q1       λ       S       q1     b
7           q1       λ       S       q1     λ
8           q1       a       a       q1     λ
9           q1       b       b       q1     λ

15  Partial Computation Graph
(q0, aba, Z)
(q1, aba, SZ)
(q1, aba, aSaZ)     (other branches not shown)
(q1, ba, SaZ)
(q1, ba, baZ)       (other branches not shown)
(q1, a, aZ)
(q1, λ, Z)
(q2, λ, Z)
On your own, draw computation trees for other strings not in the language and see that they are not accepted.

16  {a^n b^n | n ≥ 0}
Grammar G: V = {S}, Σ = {a,b}, start symbol S, P: S → aSb | λ
Output PDA M = (Q, Γ, q0, Z, F, δ)
– Q = {q0, q1, q2}
– Γ = {a, b, S, Z}
– q0 = q0
– Z = Z
– F = {q2}
δ:
– Fixed transitions
  δ(q0, λ, Z) = (q1, SZ)
  δ(q1, λ, Z) = (q2, Z)
– Production transitions
  δ(q1, λ, S) = (q1, aSb)
  δ(q1, λ, S) = (q1, λ)
– Matching transitions
  δ(q1, a, a) = (q1, λ)
  δ(q1, b, b) = (q1, λ)

17  {a^n b^n | n ≥ 0} Transition Table
Transition  Current  Input   Top of  Next   Stack
Number      State    Symbol  Stack   State  Update
1           q0       λ       Z       q1     SZ
2           q1       λ       Z       q2     Z
3           q1       λ       S       q1     aSb
4           q1       λ       S       q1     λ
5           q1       a       a       q1     λ
6           q1       b       b       q1     λ

18  Partial Computation Graph
(q0, aabb, Z)
(q1, aabb, SZ)
(q1, aabb, aSbZ)    (other branch not shown)
(q1, abb, SbZ)
(q1, abb, aSbbZ)    (other branch not shown)
(q1, bb, SbbZ)
(q1, bb, bbZ)       (other branch not shown)
(q1, b, bZ)
(q1, λ, Z)
(q2, λ, Z)

19  {a^i b^j | i = j or i = 2j}
Grammar G: V = {S,T,U}, Σ = {a,b}, start symbol S
P: S → T | U
   T → aTb | λ
   U → aaUb | λ
Output PDA M = (Q, Γ, q0, Z, F, δ)
– Q = {q0, q1, q2}
– Γ = {a, b, S, T, U, Z}
– q0 = q0
– Z = Z
– F = {q2}
δ:
– Fixed transitions
  δ(q0, λ, Z) = (q1, SZ)
  δ(q1, λ, Z) = (q2, Z)
– Production transitions
  δ(q1, λ, S) = (q1, T)
  δ(q1, λ, S) = (q1, U)
  δ(q1, λ, T) = (q1, aTb)
  δ(q1, λ, T) = (q1, λ)
  δ(q1, λ, U) = (q1, aaUb)
  δ(q1, λ, U) = (q1, λ)
– Matching transitions
  δ(q1, a, a) = (q1, λ)
  δ(q1, b, b) = (q1, λ)

20  {a^i b^j | i = j or i = 2j} Transition Table
Transition  Current  Input   Top of  Next   Stack
Number      State    Symbol  Stack   State  Update
1           q0       λ       Z       q1     SZ
2           q1       λ       Z       q2     Z
3           q1       λ       S       q1     T
4           q1       λ       S       q1     U
5           q1       λ       T       q1     aTb
6           q1       λ       T       q1     λ
7           q1       λ       U       q1     aaUb
8           q1       λ       U       q1     λ
9           q1       a       a       q1     λ
10          q1       b       b       q1     λ

21  Partial Computation Graph
(q0, aab, Z)
(q1, aab, SZ)
(q1, aab, UZ)       (other branch not shown)
(q1, aab, aaUbZ)    (other branch not shown)
(q1, ab, aUbZ)
(q1, b, UbZ)
(q1, b, bZ)         (other branch not shown)
(q1, λ, Z)
(q2, λ, Z)

22 Parsing

23  Eliminating Nondeterminism
Let's revisit the BALG grammar: S → SS | (S) | λ
– Whenever S is on top of the stack, we do not look at the input and nondeterministically select one of the 3 productions
– Think about the PDA parsing ( )
– How might we try to eliminate the nondeterminism in choosing between the productions?
– Will that work for this grammar?

24  Another Grammar
BALG2 grammar: S → (S)S | λ
– Now try parsing ( )
– Any issues?
– We need an end marker, say $: T → S$, S → (S)S | λ

25  Resulting Transition Table
Transition  Current  Input   Top of  Next   Stack
Number      State    Symbol  Stack   State  Update
1           q0       λ       Z       q1     TZ
2           q1       λ       Z       q2     Z
3           q1       λ       T       q1     S$
4           q1       λ       S       q1     (S)S
5           q1       λ       S       q1     λ
6           q1       (       (       q1     λ
7           q1       )       )       q1     λ
8           q1       $       $       q1     λ
4'          q1       (       S       q1     S)S
5'          q1       )       S       q)     λ
5''         q)       λ       )       q1     λ
5'''        q1       $       S       q$     λ
5''''       q$       λ       $       q1     λ

26  Deterministic Parsers
– The BALG2 grammar {S → (S)S | λ} is called an LL(1) grammar
– The nondeterministic top-down PDA can be converted into a deterministic top-down parser by "looking ahead" 1 character
– This generalizes to LL(k) grammars with k characters of lookahead
– LR(k) grammars correspond to bottom-up parsers using "shift" and "reduce" operations
  Shift: read and push the input symbol onto the stack
  Reduce: replace a string on top of the stack with the variable that derives it
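As a concrete sketch of the LL(1) idea (my own code; the parse table is hand-built for this one grammar, and the names are illustrative), the deterministic top-down parser below drives the same stack with a table indexed by (top of stack, lookahead), using the end marker $ from the previous slides.

    # Table-driven predictive parser for the LL(1) grammar S -> (S)S | λ with end marker $
    PARSE_TABLE = {
        ("S", "("): "(S)S",   # lookahead ( : use S -> (S)S
        ("S", ")"): "",       # lookahead ) : use S -> λ
        ("S", "$"): "",       # lookahead $ : use S -> λ
    }

    def ll1_parse(w):
        tokens = list(w) + ["$"]        # append the end marker
        stack = ["$", "S"]              # bottom marker $, start symbol S on top
        pos = 0
        while stack:
            top = stack.pop()
            look = tokens[pos]
            if top == look:             # match a terminal (or the end marker)
                pos += 1
            elif (top, look) in PARSE_TABLE:
                stack.extend(reversed(PARSE_TABLE[(top, look)]))  # push the RHS, leftmost symbol on top
            else:
                return False            # no applicable move: reject
        return pos == len(tokens)

    print(ll1_parse("()()"))   # True
    print(ll1_parse("(()"))    # False

Each step either matches one input symbol or consults the table exactly once, which is why the parse is deterministic.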

27  Comments
– You should be able to execute the algorithm: given any CFG, construct an equivalent PDA
– You should understand the idea behind this algorithm: derive a string on the stack and then match it against the input
– You should understand how this construction can help you design PDA's
– You should understand that it can be used in answer-preserving input transformations between decision problems about CFL's
– You should have a basic intuition about parsing