
1 Context-Free Parsing Read J & M Chapter 10.

2 Basic Parsing Facts
                                  Regular Languages    Context-Free Languages
Required automaton                FSM                  PDA
Algorithm to get rid of ND?       Yes                  No
Inherent ambiguity?               No                   Yes
Parse time                        Θ(n)                 Θ(n³) (*)
Algorithm to minimize?            Yes                  No

3 Parsing Formal Languages vs Natural Ones
                                              Formal Languages    Natural Languages
Do we get to design for efficiency?           Yes                 No
Is existence of a single parse critical?      Yes                 No
Can we get what we need without ambiguity?    Yes                 Apparently not

4 Example Comparison
Formal language: 17 * 8 + 24
First, we define which interpretation we want. Second, we exploit parsing techniques so that it is not even necessary to try multiple paths to get to that answer.
If a = 2 then if b = 3 then i := 7 else i := 9
English: I hit the boy with a bat.
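To make the arithmetic example concrete, here is a small sketch (not from the slides) showing the two possible bracketings of 17 * 8 + 24; the formal grammar's precedence rules commit us to exactly one of them before parsing even begins.

```python
# The two bracketings of 17 * 8 + 24 give different answers; precedence in the
# grammar selects the first without ever exploring the second.
chosen_by_precedence = (17 * 8) + 24   # 160
rejected_bracketing = 17 * (8 + 24)    # 544

print(chosen_by_precedence, rejected_bracketing)   # 160 544
```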

5 Ambiguity
Lexical category ambiguity: Type this report.
Structural ambiguity:
Attachment ambiguity: I hit the boy with the bat. I saw the Statue of Liberty flying over New York. A Belgian furniture shop is offering special packages for divorced men who hate shopping in a country where half of all marriages end in a divorce after five years.
Coordination ambiguity: Would you like pancakes or bacon and eggs?
Noun phrase bracketing: I’d like Canadian bacon and eggs.
Quantifier scope ambiguity: All the boys read a book about politics.

6 How Bad is the Ambiguity Problem?
Consider: Show me the meal on flight 286 from SF to Denver and NY.
Bracketings: how many are there?
Constituents    Parses
2               2
3               5
4               14
5               132
6               469
7               1430
8               4867

7 What’s the Fix?
There isn’t an easy one. Two approaches:
Semantic filtering
Representing families of parses, not individual ones.

8 Ambiguity and Nondeterminism Imply Search
Two basic approaches:
Top-down parsing
Bottom-up parsing
Example: Book that flight.

9 A Miniature Grammar

10 A Top-Down, Recursive Descent Parser

Def S:
    S := Build(S, NP(), VP());         If success then return S;
    S := Build(S, Aux(), NP(), VP());  If success then return S;
    S := Build(S, VP());               If success then return S;
    S := Build(S, S(), Conj(), S());   If success then return S;
    Return fail

Def NP:
    NP := Build(NP, Det(), Nom());     If success then return NP;
    NP := Build(NP, PropN());          If success then return NP;
    NP := Build(NP, NP(), S());        If success then return NP;
    Return fail

Def VP:
    VP := Build(VP, V());              If success then return VP;
    VP := Build(VP, V(), NP());        If success then return VP;
    Return fail

Def PropN:
    PropN := Lookup("PropN");  If success then return PropN;  Return fail

Def V:
    V := Lookup("V");  If success then return V;  Return fail

Def Det:
    Det := Lookup("Det");  If success then return Det;  Return fail
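For concreteness, here is a minimal runnable Python sketch of the same top-down strategy. The grammar and the toy lexicon are reconstructions from this slide and the nearby examples (Lucy purred, Book that flight), not the slide's own figure, and the left-recursive alternatives (S → S Conj S, NP → NP S) are omitted because naive recursive descent loops on them (see slide 13).

```python
# Minimal recursive-descent parser sketch (an illustration, not the slide's code).
# Grammar and lexicon are assumptions reconstructed from the surrounding slides;
# left-recursive rules are left out because a naive top-down parser would
# recurse forever on them.

GRAMMAR = {
    "S":   [["NP", "VP"], ["Aux", "NP", "VP"], ["VP"]],
    "NP":  [["Det", "Nom"], ["PropN"]],
    "Nom": [["Noun"]],
    "VP":  [["V"], ["V", "NP"]],
}

LEXICON = {  # hypothetical toy lexicon
    "lucy": "PropN", "purred": "V", "book": "V",
    "that": "Det", "flight": "Noun",
}

def parse(cat, words, i):
    """Yield (tree, next_position) for every way `cat` can start at position i."""
    if cat in GRAMMAR:                       # non-terminal: try each rule in turn
        for rhs in GRAMMAR[cat]:
            yield from expand(rhs, words, i, (cat,))
    elif i < len(words) and LEXICON.get(words[i]) == cat:
        yield (cat, words[i]), i + 1         # pre-terminal matches the next word

def expand(rhs, words, i, tree):
    """Parse the symbols of one right-hand side left to right."""
    if not rhs:
        yield tree, i
        return
    for subtree, j in parse(rhs[0], words, i):
        yield from expand(rhs[1:], words, j, tree + (subtree,))

def parse_sentence(sentence):
    words = sentence.lower().split()
    return [tree for tree, j in parse("S", words, 0) if j == len(words)]
```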

11 Example Let’s try to parse: Lucy purred.
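Running the sketch above on this sentence (and on slide 8's Book that flight), with its assumed toy lexicon:

```python
print(parse_sentence("Lucy purred"))
# [('S', ('NP', ('PropN', 'lucy')), ('VP', ('V', 'purred')))]

print(parse_sentence("Book that flight"))
# [('S', ('VP', ('V', 'book'), ('NP', ('Det', 'that'), ('Nom', ('Noun', 'flight')))))]
```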

12 Adding Bottom Up Filtering
Consider the sentence: Listen to me.
We know that we cannot use either of the first two S rules because listen cannot be the left corner of either an NP or an Aux. Facts such as this can be compiled from the grammar and stored in a table that lists, for each constituent, those parts of speech that can form the left corner.
Example using S:
Rule              Left corners
S → NP VP         Det, PropN
S → Aux NP VP     Aux
S → VP            V
S → S Conj S
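Such a left-corner table can be computed once from the grammar by iterating to a fixpoint. A sketch, again over the reconstructed grammar (an assumption), with the left-recursive S → S Conj S rule included to show that the iteration still terminates:

```python
# Compute, for each non-terminal, the parts of speech that can be its left corner.
# The grammar here is the same reconstruction used in the slide-10 sketch.
GRAMMAR = {
    "S":   [["NP", "VP"], ["Aux", "NP", "VP"], ["VP"], ["S", "Conj", "S"]],
    "NP":  [["Det", "Nom"], ["PropN"]],
    "Nom": [["Noun"]],
    "VP":  [["V"], ["V", "NP"]],
}

def left_corner_table(grammar):
    corners = {nt: set() for nt in grammar}
    changed = True
    while changed:                          # iterate until no set grows
        changed = False
        for nt, alternatives in grammar.items():
            for rhs in alternatives:
                first = rhs[0]
                # a part of speech is its own left corner;
                # a non-terminal contributes all of its left corners
                new = corners[first] if first in grammar else {first}
                if not new <= corners[nt]:
                    corners[nt] |= new
                    changed = True
    return corners

print(sorted(left_corner_table(GRAMMAR)["S"]))   # ['Aux', 'Det', 'PropN', 'V']
```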

13 Problems with Top-Down Parsers
Left recursive rules: Let’s try: Cats and dogs make good pets.
Rewrite  A → A α | β   as   A → β A′,  A′ → α A′ | ε
Ambiguity: We will only find the first parse.
Repeated parsing of subtrees: Imagine we’d solved the conjunction problem. Now consider: The large furry cat with the huge tail purred and the baby smiled.
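As a worked instance (not on the slide), here is that rewrite applied mechanically, for example to the left-recursive S → S Conj S rule of the miniature grammar; the empty list stands in for ε:

```python
# Eliminate immediate left recursion for one non-terminal:
#   A -> A a1 | ... | A an | b1 | ... | bm
# becomes
#   A  -> b1 A' | ... | bm A'
#   A' -> a1 A' | ... | an A' | []      ([] plays the role of epsilon)
def remove_left_recursion(nt, alternatives):
    rec = [rhs[1:] for rhs in alternatives if rhs[0] == nt]    # the alphas
    base = [rhs for rhs in alternatives if rhs[0] != nt]       # the betas
    if not rec:
        return {nt: alternatives}
    prime = nt + "'"
    return {
        nt: [beta + [prime] for beta in base],
        prime: [alpha + [prime] for alpha in rec] + [[]],
    }

print(remove_left_recursion("S", [["S", "Conj", "S"], ["NP", "VP"], ["VP"]]))
# {'S': [['NP', 'VP', "S'"], ['VP', "S'"]],
#  "S'": [['Conj', 'S', "S'"], []]}
```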

14 The Earley Algorithm Goal: find one or more edges from 0 to 3 labeled S.
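The chart itself is not reproduced in the transcript. Assuming the three-word sentence Book that flight (so positions 0 through 3) and the reconstructed grammar used above, a compact sketch of an Earley recognizer looks like this; the test at the end asks exactly the slide's question, whether there is a completed S edge spanning 0 to 3.

```python
# Compact Earley recognizer sketch (an illustration; grammar, lexicon, and the
# example sentence are assumptions carried over from the earlier sketches).
GRAMMAR = {
    "S":   [["NP", "VP"], ["Aux", "NP", "VP"], ["VP"], ["S", "Conj", "S"]],
    "NP":  [["Det", "Nom"], ["PropN"]],
    "Nom": [["Noun"]],
    "VP":  [["V"], ["V", "NP"]],
}
LEXICON = {"book": {"V"}, "that": {"Det"}, "flight": {"Noun"}}

def earley_recognize(words, start="S"):
    # A state is (lhs, rhs, dot, origin); chart[i] holds states ending at position i.
    chart = [set() for _ in range(len(words) + 1)]
    chart[0] = {(start, tuple(rhs), 0, 0) for rhs in GRAMMAR[start]}
    for i in range(len(words) + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, origin = agenda.pop()
            if dot < len(rhs):
                nxt = rhs[dot]
                if nxt in GRAMMAR:                                   # predictor
                    for alt in GRAMMAR[nxt]:
                        new = (nxt, tuple(alt), 0, i)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
                elif i < len(words) and nxt in LEXICON.get(words[i], set()):
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))    # scanner
            else:                                                    # completer
                for l2, r2, d2, o2 in list(chart[origin]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        new = (l2, r2, d2 + 1, o2)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
    # success = a completed S edge from position 0 to the end of the input
    return any(l == start and d == len(r) and o == 0
               for l, r, d, o in chart[len(words)])

print(earley_recognize("book that flight".split()))   # True
```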

15 Could We Do it with an FSM?
Not the way we’ve been doing it, because of the recursive rules. But the FSM idea may be useful if we do one of the following:
Just parse fragments of sentences.
Augment the FSM formalism to allow more powerful operations.

16 Parsing Fragments
Example: noun phrases. Maybe we don’t need to handle:
Lawyers whose clients committed fraud
Lawyers who committed fraud
Clients whose lawyers committed fraud
But we do care about:
The huge and friendly cat
Then we could build an FSM like:
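The FSM diagram itself did not survive in the transcript. A plausible sketch of such a machine for simple noun phrases like the huge and friendly cat (the states, category labels, and tiny lexicon are all assumptions):

```python
# A hypothetical FSM for simple noun phrases, roughly Det? Adj* (Conj Adj)* Noun.
# (The slide's actual diagram is not in the transcript; this is a guess.)
LEXICON = {"the": "Det", "huge": "Adj", "friendly": "Adj",
           "and": "Conj", "cat": "Noun"}

# transitions[state][category] -> next state
TRANSITIONS = {
    "q0": {"Det": "q1", "Adj": "q1", "Noun": "accept"},
    "q1": {"Adj": "q1", "Conj": "q2", "Noun": "accept"},
    "q2": {"Adj": "q1"},
}

def accepts_np(words):
    state = "q0"
    for w in words:
        cat = LEXICON.get(w)
        state = TRANSITIONS.get(state, {}).get(cat)   # dead end -> None
        if state is None:
            return False
    return state == "accept"

print(accepts_np("the huge and friendly cat".split()))   # True
print(accepts_np("the and cat".split()))                 # False
```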

17 Augmenting the FSM - Recursion
A Recursive Transition Network has the same power as a PDA or a context-free parser. Here’s a fragment of an RTN that recognizes sentences, with arcs labeled S, Conj, and S.
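A minimal RTN-style recognizer might look like the sketch below (an illustration, not the slide's network). Each named network is a set of labeled arcs; an arc whose label names another network is traversed by recursively calling that network. To keep this naive depth-first traversal from looping on a left-corner S arc, the first S of the slide's S Conj S fragment is expanded inline as NP VP with a Conj loop back to the start; that restructuring, the networks, and the toy lexicon are assumptions.

```python
# Minimal RTN recognizer sketch. Each network has arcs (state, label, next_state)
# and a set of final states; a label naming another network triggers a recursive
# call (the "push" that gives RTNs context-free power).
NETWORKS = {
    "S":  {"arcs": [(0, "NP", 1), (1, "VP", 2), (2, "Conj", 0)], "final": {2}},
    "NP": {"arcs": [(0, "Det", 1), (1, "Noun", 2), (0, "PropN", 2)], "final": {2}},
    "VP": {"arcs": [(0, "V", 1), (1, "NP", 2)], "final": {1, 2}},
}
LEXICON = {"lucy": "PropN", "purred": "V", "and": "Conj",
           "the": "Det", "baby": "Noun", "smiled": "V"}

def traverse(net, state, words, i):
    """Yield every input position reachable from `state` in network `net`."""
    if state in NETWORKS[net]["final"]:
        yield i
    for src, label, dst in NETWORKS[net]["arcs"]:
        if src != state:
            continue
        if label in NETWORKS:                      # push: call the sub-network
            for j in traverse(label, 0, words, i):
                yield from traverse(net, dst, words, j)
        elif i < len(words) and LEXICON.get(words[i]) == label:
            yield from traverse(net, dst, words, i + 1)

def recognize(sentence, net="S"):
    words = sentence.lower().split()
    return any(j == len(words) for j in traverse(net, 0, words, 0))

print(recognize("Lucy purred and the baby smiled"))   # True
```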

18 Augmenting the FSM – Allow Actions
Who did she say she saw ____ coming down the hill?
OBJ ← QWORD
Augmented Transition Networks (ATNs)
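A toy illustration of the register-setting idea (the arc structure is not shown on the slide; the register names follow the slide's OBJ and QWORD, everything else is an assumption): the wh-word is stored in a QWORD register when it is read, and when the parse reaches the object gap the action OBJ ← QWORD fills the missing object.

```python
# Toy ATN-style register actions (a sketch, not the slide's network).
registers = {}

def consume_qword(word):
    registers["QWORD"] = word               # action on the wh-word arc

def reach_object_gap():
    registers["OBJ"] = registers["QWORD"]   # action at the gap: OBJ <- QWORD

consume_qword("who")     # "Who did she say she saw ____ coming down the hill?"
reach_object_gap()
print(registers)         # {'QWORD': 'who', 'OBJ': 'who'}
```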

19 Demos
A parser with a nice graphical interface: http://www.ling.helsinki.fi/~tapanain/dg/eng/demo.html
A parser with machine readable output: http://www.lingsoft.fi/cgi-bin/engcg?snt=Who+did+you+say+you+saw%3F&h=on
A shallow parser: http://ilk.kub.nl/cgi-bin/tstchunk/demo.pl
Whatever you pick, it needs to connect with the modules that come before and after it.

