Conversion of CFG to PDA Conversion of PDA to CFG


Equivalence of PDA and CFG Conversion of CFG to PDA Conversion of PDA to CFG

Overview When we talked about closure properties of regular languages, it was useful to be able to jump between RE and DFA representations. Similarly, CFG’s and PDA’s are both useful to deal with properties of the CFL’s.

Overview – (2) Also, PDA’s, being “algorithmic,” are often easier to use when arguing that a language is a CFL. Example: it is easy to see how a PDA can recognize balanced parentheses; it is not so easy to see how a grammar does. But all of this depends on knowing that CFG’s and PDA’s both define the CFL’s.

Converting a CFG to a PDA Let L = L(G). Construct a PDA P such that N(P) = L (acceptance by empty stack). P has: One state q. Input symbols = terminals of G. Stack symbols = all symbols of G (terminals and variables). Start stack symbol = start symbol of G.

Intuition About P Given input w, P will step through a leftmost derivation of w from the start symbol S. Since P can’t know what this derivation is, or even what the end of w is, it uses nondeterminism to “guess” the production to use at each step.

Intuition – (2) At each step, P represents some left-sentential form (step of a leftmost derivation). If the stack of P is α, and P has so far consumed x from its input, then P represents the left-sentential form xα. At empty stack, the input consumed is a string in L(G).

Transition Function of P δ(q, a, a) = {(q, ε)}. (Type 1 rules) This step does not change the LSF represented, but “moves” responsibility for a from the stack to the consumed input. If A -> β is a production of G, then δ(q, ε, A) contains (q, β). (Type 2 rules) Guess a production for A, and represent the next LSF in the derivation.
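
To make the construction concrete, here is a minimal Python sketch of the two rule types. The grammar encoding (a dict from each variable to a list of bodies), the names cfg_to_pda and EPSILON, and the single state 'q' are conventions of this sketch, not anything fixed by the slides.

```python
# A minimal sketch of the two rule types above, assuming the grammar is given as
# a dict mapping each variable to a list of bodies (tuples of grammar symbols,
# () for an epsilon body).  The names cfg_to_pda and EPSILON and the single
# state 'q' are conventions of this sketch, not fixed by the slides.

EPSILON = ''   # the empty string stands for an epsilon input / empty push

def cfg_to_pda(grammar, terminals):
    """Return delta for the one-state PDA P with N(P) = L(G).

    delta maps (state, input_symbol_or_EPSILON, stack_symbol) to a set of
    (state, pushed_symbols) pairs; the only state is 'q'.
    """
    delta = {}

    def add(key, value):
        delta.setdefault(key, set()).add(value)

    # Type 1 rules: delta(q, a, a) = {(q, epsilon)} for every terminal a.
    for a in terminals:
        add(('q', a, a), ('q', ()))

    # Type 2 rules: for each production A -> beta,
    # delta(q, epsilon, A) contains (q, beta).
    for A, bodies in grammar.items():
        for beta in bodies:
            add(('q', EPSILON, A), ('q', tuple(beta)))

    return delta

# Example: balanced parentheses, S -> (S)S | epsilon; S is the start stack symbol.
delta = cfg_to_pda({'S': [('(', 'S', ')', 'S'), ()]}, terminals={'(', ')'})
```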

Proof That N(P) = L(G) We need to show that (q, wx, S) ⊦* (q, x, α) for any x if and only if S =>*lm wα. Part 1: “only if” is an induction on the number of steps made by P. Basis: 0 steps. Then α = S, w = ε, and S =>*lm S is surely true.

Induction for Part 1 Consider n moves of P: (q, wx, S) ⊦* (q, x, α) and assume the IH for sequences of n-1 moves. There are two cases, depending on whether the last move uses a Type 1 or a Type 2 rule.

Use of a Type 1 Rule The move sequence must be of the form (q, yax, S) ⊦* (q, ax, aα) ⊦ (q, x, α), where ya = w. By the IH applied to the first n-1 steps, S =>*lm yaα. But ya = w, so S =>*lm wα.

Use of a Type 2 Rule The move sequence must be of the form (q, wx, S) ⊦* (q, x, Aγ) ⊦ (q, x, βγ), where A -> β is a production and α = βγ. By the IH applied to the first n-1 steps, S =>*lm wAγ. Thus, S =>*lm wβγ = wα.

Proof of Part 2 (“if”) We also must prove that if S =>*lm wα, then (q, wx, S) ⊦* (q, x, α) for any x. Induction on the number of steps in the leftmost derivation. Ideas are similar; read in the text.

Proof – Completion We now have (q, wx, S) ⊦* (q, x, α) for any x if and only if S =>*lm wα. In particular, let x = α = ε. Then (q, w, S) ⊦* (q, ε, ε) if and only if S =>*lm w. That is, w is in N(P) if and only if w is in L(G).
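
As a sanity check on this equivalence, here is a small nondeterministic simulator for the one-state PDA built by cfg_to_pda in the earlier sketch; it accepts by empty stack and is meant only to illustrate N(P) = L(G) on an example. The function name and the step cap are our own choices.

```python
# Explores configurations (remaining input, stack) breadth-first and accepts by
# empty stack.  The step cap guards against grammars (e.g. left-recursive ones)
# whose epsilon moves can grow the stack without consuming input, where this
# naive search may give up.
from collections import deque

def accepts_by_empty_stack(delta, w, start_symbol, max_steps=100_000):
    start_config = (tuple(w), (start_symbol,))   # stack as a tuple, top at index 0
    seen = {start_config}
    queue = deque([start_config])
    steps = 0
    while queue and steps < max_steps:
        steps += 1
        rest, stack = queue.popleft()
        if not rest and not stack:
            return True                          # input consumed, stack empty: accept
        if not stack:
            continue                             # stack empty but input remains: dead end
        X, below = stack[0], stack[1:]
        successors = []
        if rest:                                 # Type 1: consume the next input symbol
            for _, push in delta.get(('q', rest[0], X), ()):
                successors.append((rest[1:], push))
        for _, push in delta.get(('q', '', X), ()):   # Type 2: epsilon moves
            successors.append((rest, push))
        for new_rest, push in successors:
            config = (new_rest, push + below)
            if config not in seen:
                seen.add(config)
                queue.append(config)
    return False

# Using the balanced-parentheses PDA from the earlier sketch:
delta = cfg_to_pda({'S': [('(', 'S', ')', 'S'), ()]}, terminals={'(', ')'})
print(accepts_by_empty_stack(delta, '(()())', 'S'))   # expected: True
print(accepts_by_empty_stack(delta, '(()', 'S'))      # expected: False
```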

From a PDA to a CFG Now, assume L = N(P). We’ll construct a CFG G such that L = L(G). Intuition: G will have variables generating exactly the inputs that cause P to have the net effect of popping a stack symbol X while going from state p to state q. P never gets below this X while doing so.

Variables of G G’s variables are of the form [pXq]. This variable generates all and only the strings w such that (p, w, X) ⊦* (q, ε, ε). There is also a start symbol S, which we’ll discuss later.

[Hayes] A quick example: parse ‘a = b + c;’. Grammar: ASSIGN -> VAR ‘=’ EXPR ‘;’. This production has the general form X -> Y a Z b, mixing variables and terminals. We need to handle such general productions in our proof, but let’s start with the simpler cases. We will also name our nonterminals according to the state changes they induce.

Productions of G Each production for [pXq] comes from a move of P in state p with stack symbol X. Simplest case: δ(p, a, X) contains (q, ε). Then the production is [pXq] -> a. Note a can be an input symbol or ε. Here, [pXq] generates a, because reading a is one way to pop X and go from p to q.

Productions of G – (2) Next simplest case: δ(p, a, X) contains (r, Y) for some state r and symbol Y. G has production [pXq] -> a[rYq]. We can erase X and go from p to q by reading a (entering state r and replacing the X by Y) and then reading some w that gets P from r to q while erasing the Y. Note: [pXq] =>* aw whenever [rYq] =>* w.

Productions of G – (3) Third simplest case: δ(p, a, X) contains (r, YZ) for some state r and symbols Y and Z. Now, P has replaced X by YZ. To have the net effect of erasing X, P must erase Y, going from state r to some state s, and then erase Z, going from s to q.

Picture of Action of P [Diagram: in state p with X on top of the stack and input awx remaining, P reads a and replaces X by YZ, entering state r with wx remaining; consuming w erases Y, reaching state s with Z on top and x remaining; consuming x erases Z, reaching state q.]

Third-Simplest Case – Concluded Since we do not know state s, we must generate a family of productions: [pXq] -> a[rYs][sZq] for all states s. [pXq] =>* awx whenever [rYs] =>* w and [sZq] =>* x.

Productions of G: General Case Suppose δ(p, a, X) contains (r, Y1Y2…Yk) for some state r and k ≥ 3. Generate the family of productions [pXq] -> a[rY1s1][s1Y2s2]…[sk-2Yk-1sk-1][sk-1Ykq] for all choices of the states s1, …, sk-1.
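
Here is a sketch of how this family of productions can be enumerated. The helper name productions_for_move and the encoding of a variable [pXq] as the triple (p, X, q) are our own conventions for illustration.

```python
# Enumerates the family of productions contributed by one move where
# delta(p, a, X) contains (r, Y1...Yk).  The input symbol a may be '' (epsilon).
from itertools import product

def productions_for_move(p, a, X, r, gamma, states):
    """Return (head, body) pairs; bodies mix input symbols and variable triples."""
    prods = []
    prefix = (a,) if a else ()                   # leave a out of the body when a = epsilon
    if len(gamma) == 0:                          # simplest case: the move pops X, so q = r
        prods.append(((p, X, r), prefix))        # [pXr] -> a
        return prods
    for q in states:                             # one subfamily for every target state q
        # choose the intermediate states s1, ..., s(k-1) in all possible ways
        for mids in product(states, repeat=len(gamma) - 1):
            chain = (r,) + mids + (q,)
            body = tuple((chain[i], gamma[i], chain[i + 1]) for i in range(len(gamma)))
            prods.append(((p, X, q), prefix + body))   # [pXq] -> a[rY1s1]...[s(k-1)Yk q]
    return prods
```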

Completion of the Construction We can prove that (q0, w, Z0) ⊦* (p, ε, ε) if and only if [q0Z0p] =>* w. Proof is in the text; it is two easy inductions. But state p can be anything. Thus, add to G another variable S, the start symbol, and add productions S -> [q0Z0p] for each state p.
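
Putting the pieces together, a sketch of the whole PDA-to-CFG construction, reusing productions_for_move from above and adding the start productions S -> [q0Z0p]. The name pda_to_cfg and the dict encoding of δ are assumptions of this sketch, not part of the slides.

```python
# Assembling all of G: productions from every move of P, plus the start
# productions S -> [q0 Z0 p] for every state p.  delta maps
# (state, input_or_'', stack_symbol) to a set of (state, pushed_symbols) pairs.

def pda_to_cfg(states, delta, q0, Z0):
    productions = []
    for (p, a, X), moves in delta.items():
        for r, gamma in moves:
            productions += productions_for_move(p, a, X, r, tuple(gamma), states)
    for p in states:                             # the new start symbol S
        productions.append(('S', ((q0, Z0, p),)))
    return productions
```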