LR(1) Parsers, Comp 412, Fall 2010

LR(1) Parsers
Comp 412, Fall 2010

Copyright 2010, Keith D. Cooper & Linda Torczon, all rights reserved. Students enrolled in Comp 412 at Rice University have explicit permission to make copies of these materials for their personal use. Faculty from other educational institutions may use these materials for nonprofit educational purposes, provided this copyright notice is preserved.

LR Parsers — A Roadmap

Last lecture: Handles, handles, and more handles

Today:
  Revisit handles (briefly)
  The skeleton LR(1) parser & its operation
  Notation & concepts for LR(1) table construction

Monday: LR(1) table construction

Handles

One more time, with less tedium. Consider our example, the rightmost derivation of x - 2 * y:

  Sentential Form
  Goal
  Expr
  Expr - Term
  Expr - Term * Factor
  Expr - Term * <id,y>
  Expr - Factor * <id,y>
  Expr - <num,2> * <id,y>
  Term - <num,2> * <id,y>
  Factor - <num,2> * <id,y>
  <id,x> - <num,2> * <id,y>

An unambiguous grammar implies a unique rightmost derivation. At each step, exactly one choice leads to x - 2 * y; any other choice leads to another, distinct expression. A bottom-up parse reverses the rightmost derivation, so it has a unique reduction at each step. The key is finding that reduction.

Handles

Consider our example again, this time with the reduction that rewrites each sentential form into the one above it:

  Sentential Form            | Reduction
  Goal                       | —
  Expr                       | Goal → Expr
  Expr - Term                | Expr → Expr - Term
  Expr - Term * Factor       | Term → Term * Factor
  Expr - Term * <id,y>       | Factor → id
  Expr - Factor * <id,y>     | Term → Factor
  Expr - <num,2> * <id,y>    | Factor → num
  Term - <num,2> * <id,y>    | Expr → Term
  Factor - <num,2> * <id,y>  | Term → Factor
  <id,x> - <num,2> * <id,y>  | Factor → id

Handles

Now, look at the sentential forms in the example. They have a specific form: NT* (NT | T)* T*
  Reductions happen in the (NT | T)* portion
  Track the right end of that region and search left from there
  The set of rhs strings to search for is finite
We know that each step has a unique reduction; that reduction is the handle.

From Handles to Parsers

Consider the sentential forms in the derivation: NT* (NT | T)* T*
  They are the upper fringe of a partially complete syntax tree
  The suffix consisting of T* is, at each step, the unread input
So, our shift-reduce parser operates by:
  Keeping the portion NT* (NT | T)* on a stack (leftmost symbol at the bottom of the stack, rightmost at the stack top)
  Searching for handles from stack top to stack bottom
  If the search fails, shifting another terminal onto the stack

Bottom-up Parser

A conceptual shift-reduce parser:

    push INVALID
    word ← NextWord( )
    repeat until (top of stack = Goal and word = EOF)
      if the top of the stack is a handle A → β
        then              // reduce β to A
          pop |β| symbols off the stack
          push A onto the stack
      else if (word ≠ EOF)
        then              // shift
          push word
          word ← NextWord( )
      else                // need to shift, but out of input
          report an error

What happens on an error? The parser fails to find a handle, so it keeps shifting; eventually, it consumes all of the input. This parser reads all of the input before reporting an error, which is not a desirable property. Table-driven LR parsers do much better: the handle finder reports errors as soon as possible.

Actual LR(1) Skeleton Parser

    stack.push(INVALID);
    stack.push(s0);                      // initial state
    word = scanner.next_word();
    loop forever {
      s = stack.top();
      if ( ACTION[s,word] == “reduce A → β” ) then {
        stack.popnum(2*|β|);             // pop 2*|β| symbols
        s = stack.top();                 // state uncovered by the pops
        stack.push(A);                   // push A
        stack.push(GOTO[s,A]);           // push next state
      }
      else if ( ACTION[s,word] == “shift si” ) then {
        stack.push(word);
        stack.push(si);
        word = scanner.next_word();
      }
      else if ( ACTION[s,word] == “accept” and word == EOF )
        then break;
      else throw a syntax error;
    }
    report success;

The skeleton parser follows the basic scheme for shift-reduce parsing from the last slide. It relies on a stack & a scanner, keeps both symbols & states on the stack, and uses two tables, called ACTION & GOTO. It shifts |words| times, reduces |derivation| times, and accepts at most once. It detects errors by failure of the other three cases, which is a failure to find a handle.
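
The same scheme can be written as a short, table-driven routine. The sketch below is not the lecture's code; it is a minimal Python rendering of the skeleton above, assuming that ACTION and GOTO are supplied as dictionaries and that productions is a list of (lhs, rhs) pairs. All names are illustrative.

    # ACTION maps (state, word) to ("shift", k), ("reduce", p), or ("accept",);
    # GOTO maps (state, nonterminal) to a state; productions[p] = (lhs, rhs_tuple).
    def lr1_parse(tokens, action, goto, productions):
        words = list(tokens) + ["EOF"]
        stack = [("INVALID", 0)]              # (symbol, state) pairs; state 0 is s0
        i = 0
        word = words[i]
        while True:
            state = stack[-1][1]
            act = action.get((state, word))
            if act is None:                   # no entry: failure to find a handle
                raise SyntaxError("no action in state %d on %r" % (state, word))
            if act[0] == "shift":
                stack.append((word, act[1]))
                i += 1
                word = words[i]
            elif act[0] == "reduce":
                lhs, rhs = productions[act[1]]
                del stack[len(stack) - len(rhs):]   # pop |rhs| (symbol, state) pairs
                revealed = stack[-1][1]             # state uncovered by the pops
                stack.append((lhs, goto[(revealed, lhs)]))
            else:                             # ("accept",) on EOF
                return True

The tables for the parentheses language on the next slides can be plugged straight into this driver.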

The Parentheses Language

The language of balanced parentheses
  It is beyond the power of regular expressions
  It exhibits the role of context in LR(1) parsing

The grammar:
    Goal → List
  1 List → List Pair
  2      | Pair
  3 Pair → ( Pair )
  4      | ( )

The Parentheses Language

ACTION table (for the parentheses grammar above; blank entries are errors):

  State | eof      | (        | )
  0     |          | shift 3  |
  1     | accept   | shift 3  |
  2     | reduce 2 | reduce 2 |
  3     |          | shift 6  | shift 7
  4     | reduce 1 | reduce 1 |
  5     |          |          | shift 8
  6     |          | shift 6  | shift 10
  7     | reduce 4 | reduce 4 |
  8     | reduce 3 | reduce 3 |
  9     |          |          | shift 11
  10    |          |          | reduce 4
  11    |          |          | reduce 3

GOTO table:

  State | List | Pair
  0     | 1    | 2
  1     |      | 4
  3     |      | 5
  6     |      | 9
  (all entries not shown are empty)
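
To make the tables concrete, here is one way to transcribe them for the Python driver sketched above. The dictionaries mirror the table entries, and the production list follows the grammar's numbering, with index 0 reserved for Goal → List. This encoding is an illustration, not part of the original slides.

    PAREN_PRODUCTIONS = [
        ("Goal", ("List",)),            # 0
        ("List", ("List", "Pair")),     # 1
        ("List", ("Pair",)),            # 2
        ("Pair", ("(", "Pair", ")")),   # 3
        ("Pair", ("(", ")")),           # 4
    ]
    PAREN_ACTION = {
        (0, "("): ("shift", 3),
        (1, "EOF"): ("accept",),   (1, "("): ("shift", 3),
        (2, "EOF"): ("reduce", 2), (2, "("): ("reduce", 2),
        (3, "("): ("shift", 6),    (3, ")"): ("shift", 7),
        (4, "EOF"): ("reduce", 1), (4, "("): ("reduce", 1),
        (5, ")"): ("shift", 8),
        (6, "("): ("shift", 6),    (6, ")"): ("shift", 10),
        (7, "EOF"): ("reduce", 4), (7, "("): ("reduce", 4),
        (8, "EOF"): ("reduce", 3), (8, "("): ("reduce", 3),
        (9, ")"): ("shift", 11),
        (10, ")"): ("reduce", 4),
        (11, ")"): ("reduce", 3),
    }
    PAREN_GOTO = {(0, "List"): 1, (0, "Pair"): 2, (1, "Pair"): 4,
                  (3, "Pair"): 5, (6, "Pair"): 9}

    # The two strings traced on the following slides both parse successfully.
    assert lr1_parse(list("()"), PAREN_ACTION, PAREN_GOTO, PAREN_PRODUCTIONS)
    assert lr1_parse(list("(())()"), PAREN_ACTION, PAREN_GOTO, PAREN_PRODUCTIONS)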

The Parentheses Language

Parsing "( )":

  State | Lookahead | Stack        | Handle | Action
  —     | (         | $ 0          | —none— | shift 3
  3     | )         | $ 0 ( 3      | —none— | shift 7
  7     | eof       | $ 0 ( 3 ) 7  | ( )    | reduce 4
  2     | eof       | $ 0 Pair 2   | Pair   | reduce 2
  1     | eof       | $ 0 List 1   | List   | accept

The Parentheses Language

Parsing "(( )) ( )":

  State | Lookahead | Stack                 | Handle    | Action
  —     | (         | $ 0                   | —none—    | shift 3
  3     | (         | $ 0 ( 3               | —none—    | shift 6
  6     | )         | $ 0 ( 3 ( 6           | —none—    | shift 10
  10    | )         | $ 0 ( 3 ( 6 ) 10      | ( )       | reduce 4
  5     | )         | $ 0 ( 3 Pair 5        | —none—    | shift 8
  8     | (         | $ 0 ( 3 Pair 5 ) 8    | ( Pair )  | reduce 3
  2     | (         | $ 0 Pair 2            | Pair      | reduce 2
  1     | (         | $ 0 List 1            | —none—    | shift 3
  3     | )         | $ 0 List 1 ( 3        | —none—    | shift 7
  7     | eof       | $ 0 List 1 ( 3 ) 7    | ( )       | reduce 4
  4     | eof       | $ 0 List 1 Pair 4     | List Pair | reduce 1
  1     | eof       | $ 0 List 1            | List      | accept

Let's look at how the parser reduces "( )".

The Parentheses Language

Parsing "( )" (the same trace as above).

Here, peeling ( ) off the stack reveals s0. Goto(s0, Pair) is s2.

The Parentheses Language

Parsing "(( )) ( )" (the same trace as above).

Here, peeling ( ) off the stack reveals s3, which represents the left context of an unmatched (. Goto(s3, Pair) is s5, a state in which we expect a ). That path leads to a reduction by production 3, Pair → ( Pair ).

The Parentheses Language

Parsing "(( )) ( )" (the same trace as above).

Here, peeling ( ) off the stack reveals s1, which represents the left context of a previously recognized List. Goto(s1, Pair) is s4, a state in which we can reduce List Pair to List (on a lookahead of either ( or eof).

Three copies of "reduce 4", with different context, produce three distinct behaviors.

LR(1) Parsers

How does this LR(1) stuff work?
  An unambiguous grammar ⇒ a unique rightmost derivation
  Keep the upper fringe on a stack
    All active handles include the top of stack (TOS)
    Shift inputs until TOS is the right end of a handle
  The language of handles is regular (finite)
    Build a handle-recognizing DFA to control the stack-based recognizer
    The ACTION & GOTO tables encode the DFA
  To match a subterm, invoke the DFA recursively: leave the old DFA state on the stack and go on
  A final state in the DFA ⇒ a reduce action
    Pop the rhs off the stack to reveal the invoking state ("It would be legal to recognize an x, and we did …")
    The new state is GOTO[revealed state, lhs]: take a DFA transition on the new NT, the lhs we just pushed

The Control DFA for the Parentheses Language

  Transitions on terminals represent shift actions [recorded in ACTION]
  Transitions on nonterminals are taken after reduce actions [recorded in GOTO]
  The table construction derives this DFA from the grammar

[Figure: the control DFA for the parentheses language, with states S0 through S11 and transitions labeled (, ), List, and Pair]

Building LR(1) Tables

How do we generate the ACTION and GOTO tables?
  Use the grammar to build a model of the control DFA
  Encode actions & transitions in the ACTION & GOTO tables
  If the construction succeeds, the grammar is LR(1); "succeeds" means it defines each table entry uniquely

The Big Picture
  Model the state of the parser
  Use two functions: goto( s, X ) and closure( s )
    X is a grammar symbol, T or NT
    goto() is analogous to move() in the subset construction
    closure() adds information to round out a state
  Build up the states and transition functions of the DFA (a fixed-point algorithm, like the subset construction)
  Use this information to fill in the ACTION and GOTO tables

LR(k) Items

The LR(1) table construction algorithm represents a valid configuration of an LR(1) parser with a data structure called an LR(1) item.

An LR(k) item is a pair [P, δ], where
  P is a production A → β with a • at some position in the rhs
  δ is a lookahead string of length ≤ k (words or EOF)

The • in an item indicates the position of the top of the stack
  [A → •βγ, a] means that the input seen so far is consistent with the use of A → βγ immediately after the symbol on top of the stack
  [A → β•γ, a] means that the input seen so far is consistent with the use of A → βγ at this point in the parse, and that the parser has already recognized β (that is, β is on top of the stack)
  [A → βγ•, a] means that the parser has seen βγ, and that a lookahead symbol of a is consistent with reducing to A

LR(1) Items

The production A → β, where β = B1 B2 B3, with lookahead a, can give rise to 4 items:
  [A → •B1 B2 B3, a], [A → B1 • B2 B3, a], [A → B1 B2 • B3, a], & [A → B1 B2 B3 •, a]
The set of LR(1) items for a grammar is finite.

What's the point of all these lookahead symbols?
  Carry them along to help choose the correct reduction
  Lookaheads are bookkeeping, unless the item has • at its right end
    The lookahead has no direct use in [A → β•γ, a]
    In [A → β•, a], a lookahead of a implies a reduction by A → β
    For { [A → β•, a], [B → β•γ, b] }, a ⇒ reduce to A; FIRST(γ) ⇒ shift
  Limited right context is enough to pick the actions
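
As a concrete aside (not from the slides), an LR(1) item can be represented as a small tuple, and the four items above fall out by moving a dot position through the right-hand side. The snippet below is a hypothetical illustration in Python.

    # An LR(1) item as (lhs, rhs, dot_position, lookahead); the dot position runs
    # from 0 (nothing recognized yet) to len(rhs) (a complete item).
    lhs, rhs = "A", ("B1", "B2", "B3")
    items = [(lhs, rhs, dot, "a") for dot in range(len(rhs) + 1)]
    # items[0] is [A -> . B1 B2 B3, a]; items[3] is the complete item [A -> B1 B2 B3 ., a]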

LR(1) Table Construction

High-level overview
  Build the canonical collection of sets of LR(1) items, I
    Begin in an appropriate state, s0: [S' → •S, EOF], along with any equivalent items; derive the equivalent items as closure( s0 )
    Repeatedly compute, for each sk and each grammar symbol X, goto(sk, X); if the set is not already in the collection, add it
    Record all the transitions created by goto( )
    This eventually reaches a fixed point
  Fill in the table from the collection of sets of LR(1) items

The states of the canonical collection are precisely the states of the control DFA; the construction traces the DFA's transitions.

Let's build the tables for the SheepNoise grammar:
  1 Goal → SheepNoise
  2 SheepNoise → SheepNoise baa
  3            | baa

Computing Closures

Closure(s) adds all the items implied by the items already in s
  Any item [A → β•Bδ, a] implies [B → •γ, x] for each production with B on the lhs, and each x ∈ FIRST(δa)
  Since βB is valid, any way to derive βB is valid, too

The algorithm:

    Closure( s )
      while ( s is still changing )
        ∀ items [A → β•Bδ, a] ∈ s
          ∀ productions B → γ ∈ P
            ∀ b ∈ FIRST(δa)          // δ might be ε
              if [B → •γ, b] ∉ s
                then s ← s ∪ { [B → •γ, b] }

A classic fixed-point method; it halts because s ⊆ ITEMS. A worklist version is faster. Closure "fills out" a state. The lookaheads are generated here.
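
Here is one way the closure computation might look in Python, continuing the tuple representation of items sketched earlier. FIRST sets for individual symbols (and the set of nullable symbols) are assumed to be precomputed and passed in; the names are illustrative, not the lecture's code.

    # FIRST of a sequence of symbols, given per-symbol FIRST sets and the set of
    # nullable symbols; terminals default to FIRST(t) = {t}.
    def first_of_seq(seq, first, nullable):
        out = set()
        for sym in seq:
            out |= first.get(sym, {sym})
            if sym not in nullable:
                break
        return out

    # Fixed-point closure of a set of LR(1) items, following the algorithm above.
    # Items are (lhs, rhs, dot, lookahead) tuples; productions is a list of
    # (lhs, rhs_tuple) pairs.
    def closure(s, productions, first, nullable=frozenset()):
        s = set(s)
        changed = True
        while changed:
            changed = False
            for (lhs, rhs, dot, a) in list(s):
                if dot >= len(rhs):
                    continue                          # dot at the right end: adds nothing
                B, delta = rhs[dot], rhs[dot + 1:]
                for b in first_of_seq(delta + (a,), first, nullable):
                    for (plhs, prhs) in productions:
                        if plhs == B and (B, prhs, 0, b) not in s:
                            s.add((B, prhs, 0, b))
                            changed = True
        return frozenset(s)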

Example From SheepNoise

The initial step builds the item [Goal → •SheepNoise, EOF] and takes its closure( ):

  Item                                 | Source
  [Goal → •SheepNoise, EOF]            | original item
  [SheepNoise → •SheepNoise baa, EOF]  | production 2, δa is EOF
  [SheepNoise → •baa, EOF]             | production 3, δa is EOF
  [SheepNoise → •SheepNoise baa, baa]  | production 2, δa is baa EOF
  [SheepNoise → •baa, baa]             | production 3, δa is baa EOF

So, S0 is { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Remember, this is the left-recursive SheepNoise; EaC shows the right-recursive version.
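
The closure sketch above reproduces this set. The grammar encoding and FIRST sets below are illustrative names, not part of the slides.

    # The left-recursive SheepNoise grammar; list indices 0-2 correspond to the
    # slides' productions 1-3.
    SN_PRODUCTIONS = [
        ("Goal", ("SheepNoise",)),
        ("SheepNoise", ("SheepNoise", "baa")),
        ("SheepNoise", ("baa",)),
    ]
    SN_FIRST = {"Goal": {"baa"}, "SheepNoise": {"baa"}}

    s0 = closure({("Goal", ("SheepNoise",), 0, "EOF")}, SN_PRODUCTIONS, SN_FIRST)
    assert len(s0) == 5    # the five items listed above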

Computing Gotos

Goto(s, X) computes the state that the parser would reach if it recognized an X while in state s
  Goto( { [A → β•Xδ, a] }, X ) produces [A → βX•δ, a]   (obviously)
  It finds all such items & uses closure( ) to fill out the state

The algorithm:

    Goto( s, X )
      new ← Ø
      ∀ items [A → β•Xδ, a] ∈ s
        new ← new ∪ { [A → βX•δ, a] }
      return closure( new )

Not a fixed-point method; a straightforward computation that uses closure( ). Goto( ) moves us forward.
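
Continuing the sketch, goto can be written as a few lines on top of closure; again, this is an illustration rather than the lecture's code.

    # Advance the • over X in every item of s that has the • immediately before X,
    # then take the closure of the result.
    def goto(s, X, productions, first, nullable=frozenset()):
        moved = {(lhs, rhs, dot + 1, a)
                 for (lhs, rhs, dot, a) in s
                 if dot < len(rhs) and rhs[dot] == X}
        return closure(moved, productions, first, nullable)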

Example from SheepNoise

S0 is { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Goto( S0, baa ):
  The loop produces

    Item                        | Source
    [SheepNoise → baa •, EOF]   | item 3 in S0
    [SheepNoise → baa •, baa]   | item 5 in S0

  Closure adds nothing, since • is at the end of the rhs in each item

In the construction, this produces s2: { [SheepNoise → baa •, {EOF, baa}] }
  (a new, but obvious, notation for the two distinct items [SheepNoise → baa •, EOF] & [SheepNoise → baa •, baa])
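
With the sketches above, the same set falls out directly (illustrative code, continuing the earlier definitions):

    s2 = goto(s0, "baa", SN_PRODUCTIONS, SN_FIRST)
    assert s2 == {("SheepNoise", ("baa",), 1, "EOF"),
                  ("SheepNoise", ("baa",), 1, "baa")}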

Building the Canonical Collection

Start from s0 = closure( { [S' → •S, EOF] } )
Repeatedly construct new states, until all are found

The algorithm:

    s0 ← closure( { [S' → •S, EOF] } )
    S ← { s0 }
    k ← 1
    while ( S is still changing )
      ∀ sj ∈ S and ∀ x ∈ ( T ∪ NT )
        sk ← goto( sj, x )
        record sj → sk on x
        if sk ∉ S then
          S ← S ∪ { sk }
          k ← k + 1

A fixed-point computation; the loop adds to S. S ⊆ 2^ITEMS, so S is finite. A worklist version is faster.
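
A compact Python version of this fixed-point loop, continuing the earlier sketches. The first production in the list is assumed to be the goal production, and symbols are visited in sorted order only to make the state numbering deterministic; all names are illustrative.

    # Build the canonical collection of sets of LR(1) items and record the goto
    # transitions. Returns (states, transitions), where transitions[(i, X)] = j
    # means goto(states[i], X) = states[j].
    def canonical_collection(productions, first, symbols, nullable=frozenset()):
        goal_lhs, goal_rhs = productions[0]
        states = [closure({(goal_lhs, goal_rhs, 0, "EOF")},
                          productions, first, nullable)]
        transitions = {}
        changed = True
        while changed:
            changed = False
            for i, s in enumerate(list(states)):
                for X in sorted(symbols):
                    t = goto(s, X, productions, first, nullable)
                    if not t:
                        continue                  # no item in s has the dot before X
                    if t not in states:
                        states.append(t)
                        changed = True
                    transitions[(i, X)] = states.index(t)
        return states, transitions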

Example from SheepNoise

Starts with S0
  S0 : { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Iteration 1 computes
  S1 = Goto(S0, SheepNoise) = { [Goal → SheepNoise •, EOF], [SheepNoise → SheepNoise • baa, EOF], [SheepNoise → SheepNoise • baa, baa] }
  S2 = Goto(S0, baa) = { [SheepNoise → baa •, EOF], [SheepNoise → baa •, baa] }

Iteration 2 computes
  S3 = Goto(S1, baa) = { [SheepNoise → SheepNoise baa •, EOF], [SheepNoise → SheepNoise baa •, baa] }

Nothing more to compute, since • is at the end of every item in S3.

Example from SheepNoise

  S0 : { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }
  S1 = Goto(S0, SheepNoise) = { [Goal → SheepNoise •, EOF], [SheepNoise → SheepNoise • baa, EOF], [SheepNoise → SheepNoise • baa, baa] }
  S2 = Goto(S0, baa) = { [SheepNoise → baa •, EOF], [SheepNoise → baa •, baa] }
  S3 = Goto(S1, baa) = { [SheepNoise → SheepNoise baa •, EOF], [SheepNoise → SheepNoise baa •, baa] }
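
Running the canonical-collection sketch on the SheepNoise encoding above produces exactly these four states (illustrative code):

    states, transitions = canonical_collection(
        SN_PRODUCTIONS, SN_FIRST, {"Goal", "SheepNoise", "baa"})
    assert len(states) == 4                  # S0 through S3
    assert transitions[(0, "SheepNoise")] == 1
    assert transitions[(0, "baa")] == 2
    assert transitions[(1, "baa")] == 3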

Filling in the ACTION and GOTO Tables

The algorithm (x is the state number of set Sx):

    ∀ set Sx ∈ S
      ∀ item i ∈ Sx
        if i is [A → β•aδ, b] and goto(Sx, a) = Sk, with a ∈ T
          then ACTION[x, a] ← "shift k"
        else if i is [S' → S•, EOF]
          then ACTION[x, EOF] ← "accept"
        else if i is [A → β•, a]
          then ACTION[x, a] ← "reduce A → β"
      ∀ n ∈ NT
        if goto(Sx, n) = Sk
          then GOTO[x, n] ← k

Many items generate no table entry; closure( ) instantiates FIRST(X) directly for [A → β•Xδ, a].
  • before a T ⇒ shift
  the Goal production with • at the end ⇒ accept
  • at the end ⇒ reduce
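
In the running Python sketch, the same three clauses translate almost line for line. This is illustrative code; note that reduce entries here carry the zero-based index of the production in the list, while the slides use the grammar's own numbering.

    # Derive ACTION and GOTO from the canonical collection and its transitions.
    # A full construction would also report a conflict if an entry were defined twice.
    def fill_tables(states, transitions, productions, terminals):
        action, goto_table = {}, {}
        goal = productions[0][0]
        for x, s in enumerate(states):
            for (lhs, rhs, dot, a) in s:
                if dot < len(rhs) and rhs[dot] in terminals:          # clause 1: shift
                    action[(x, rhs[dot])] = ("shift", transitions[(x, rhs[dot])])
                elif dot == len(rhs) and lhs == goal and a == "EOF":  # clause 2: accept
                    action[(x, "EOF")] = ("accept",)
                elif dot == len(rhs):                                 # clause 3: reduce
                    action[(x, a)] = ("reduce", productions.index((lhs, rhs)))
        for (i, X), j in transitions.items():                        # GOTO: NT transitions
            if X not in terminals:
                goto_table[(i, X)] = j
        return action, goto_table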

Example from SheepNoise

  S0 : { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }
  S1 = Goto(S0, SheepNoise) = { [Goal → SheepNoise •, EOF], [SheepNoise → SheepNoise • baa, EOF], [SheepNoise → SheepNoise • baa, baa] }
  S2 = Goto(S0, baa) = { [SheepNoise → baa •, EOF], [SheepNoise → baa •, baa] }
  S3 = Goto(S1, baa) = { [SheepNoise → SheepNoise baa •, EOF], [SheepNoise → SheepNoise baa •, baa] }

• before a T ⇒ shift k; so, ACTION[S0, baa] is "shift S2" (clause 1). Both items in S0 with • before baa define the same entry.

Example from SheepNoise

(the same canonical collection S0–S3 as above)

From the items in S1 with • before baa, ACTION[S1, baa] is "shift S3" (clause 1).

Example from SheepNoise

(the same canonical collection S0–S3 as above)

From the item [Goal → SheepNoise •, EOF] in S1, ACTION[S1, EOF] is "accept" (clause 2).

Example from SheepNoise

(the same canonical collection S0–S3 as above)

From the item [SheepNoise → baa •, EOF] in S2, ACTION[S2, EOF] is "reduce 3" (clause 3).

Example from SheepNoise

(the same canonical collection S0–S3 as above)

Likewise (clause 3):
  ACTION[S2, baa] is "reduce 3"
  ACTION[S3, EOF] is "reduce 2"
  ACTION[S3, baa] is "reduce 2", as well

Example from SheepNoise

The GOTO table records the Goto transitions on NTs.

(the same canonical collection S0–S3 as above)

S2 and S3 are reached on transitions based on a T, not an NT, so there is only 1 transition in the entire GOTO table: Goto(S0, SheepNoise) = S1. Remember, we recorded these transitions during the construction, so we don't need to recompute them.

ACTION & GOTO Tables

Here are the tables for the augmented, left-recursive SheepNoise grammar. (Remember, this is the left-recursive SheepNoise; EaC shows the right-recursive version.)

The grammar:
  1 Goal → SheepNoise
  2 SheepNoise → SheepNoise baa
  3            | baa

ACTION table:
  State | EOF      | baa
  0     | —        | shift 2
  1     | accept   | shift 3
  2     | reduce 3 | reduce 3
  3     | reduce 2 | reduce 2

GOTO table:
  State | SheepNoise
  0     | 1
  1     | —
  2     | —
  3     | —
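
As a closing check on the running sketch, generating the tables from the canonical collection built earlier and feeding them to the driver reproduces these entries. The reduce actions below carry zero-based production indices, so ("reduce", 2) corresponds to the slides' production 3 and ("reduce", 1) to production 2; all of this is illustrative code, not the lecture's.

    action, goto_table = fill_tables(states, transitions,
                                     SN_PRODUCTIONS, {"baa", "EOF"})
    assert action[(0, "baa")] == ("shift", 2)
    assert action[(1, "EOF")] == ("accept",)
    assert action[(1, "baa")] == ("shift", 3)
    assert action[(2, "EOF")] == action[(2, "baa")] == ("reduce", 2)
    assert action[(3, "EOF")] == action[(3, "baa")] == ("reduce", 1)
    assert goto_table == {(0, "SheepNoise"): 1}

    # The generated tables drive the skeleton parser from the earlier sketch.
    assert lr1_parse(["baa", "baa", "baa"], action, goto_table, SN_PRODUCTIONS)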

What Can Go Wrong?

What if set s contains both [A → β•aγ, b] and [B → β•, a]?
  The first item generates "shift", the second generates "reduce"
  Both define ACTION[s, a]; the parser cannot do both actions
  This is a fundamental ambiguity, called a shift/reduce conflict
  Modify the grammar to eliminate it (the if-then-else problem); shifting will often resolve it correctly

What if set s contains both [A → γ•, a] and [B → γ•, a]?
  Each generates "reduce", but with a different production
  Both define ACTION[s, a]; the parser cannot do both reductions
  This is a fundamental ambiguity, called a reduce/reduce conflict
  Modify the grammar to eliminate it (PL/I's overloading of (...))

In either case, the grammar is not LR(1).

Shrinking the Tables

Three options:
  Combine terminals, such as number & identifier, + & -, * & /
    Directly removes a column, and may remove a row
    For the expression grammar, 198 (vs. 384) table entries
  Combine rows or columns
    Implement identical rows once & remap states
    Requires extra indirection on each lookup (the classic space-time tradeoff)
    Use a separate mapping for ACTION & for GOTO
  Use another construction algorithm
    Both LALR(1) and SLR(1) produce smaller tables
    Implementations are readily available

(The expression grammar here is the left-recursive expression grammar with precedence; see §3.7.2 in EaC. Is the handle-recognizing DFA minimal?)

LR(k) versus LL(k): Finding Reductions

LR(k) ⇒ Each reduction in the parse is detectable with
  the complete left context,
  the reducible phrase itself, and
  the k terminal symbols to its right

LL(k) ⇒ The parser must select the expansion based on
  the complete left context, and
  the next k terminals

Thus, LR(k) examines more context.

"… in practice, programming languages do not actually seem to fall in the gap between LL(1) languages and deterministic languages"
(J.J. Horning, "LR Grammars and Analysers", in Compiler Construction, An Advanced Course, Springer-Verlag, 1976)

(LR(k) and LL(k) are generalizations of LR(1) and LL(1) to longer lookaheads.)

Summary

                | Top-down recursive descent                            | LR(1)
  Advantages    | Fast; good locality; simplicity; good error detection | Fast; deterministic languages; automatable; left associativity
  Disadvantages | Hand-coded; high maintenance; right associativity     | Large working sets; poor error messages; large table sizes