CS626-449: Speech, NLP and the Web / Topics in AI. Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 13: Deeper Adjective and PP Structure; Structural Ambiguity.


CS626-449: Speech, NLP and the Web / Topics in AI. Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 13: Deeper Adjective and PP Structure; Structural Ambiguity and Parsing

Types of Grammar
Prescriptive Grammar: taught in schools; emphasis is on usage.
Descriptive Grammar: also known as Linguistic Grammar; describes the language as it is actually used.

Types of Languages
SVO (Subject-Verb-Object): e.g., English. "Ram likes music." (S V O)
SOV (Subject-Object-Verb): e.g., Indian languages. राम पानी पी रहा है ("Ram water drinking is", i.e., "Ram is drinking water") (S O V)

More deeply embedded structure
[Tree for the NP "the big book of poems with the blue cover": NP → the N'1; N'1 → [AP big] N'2; N'2 → N'3 [PP with the blue cover]; N'3 → [N book] [PP of poems]]

Bar-level projections
Add intermediate structures:
NP → (D) N'
N' → (AP) N' | N' (PP) | N (PP)
( ) indicates optionality
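Rules with optional elements like (D) and (AP) abbreviate several plain CFG productions, and the expansion can be done mechanically. A minimal sketch (the `expand_optional` helper is hypothetical, not from the lecture):

```python
from itertools import product

def expand_optional(rhs):
    """Expand a rule RHS containing (optional) symbols into all the
    plain CFG right-hand sides it abbreviates."""
    choices = [[sym[1:-1], None] if sym.startswith("(") else [sym]
               for sym in rhs]
    return [[s for s in combo if s is not None] for combo in product(*choices)]

# NP -> (D) N' abbreviates the two ordinary productions NP -> D N' and NP -> N'
print(expand_optional(["(D)", "N'"]))    # [['D', "N'"], ["N'"]]
```

Applied to N' → (AP) N', it likewise yields the two productions N' → AP N' and N' → N'.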

New rules produce this tree
[Same NP "the big book of poems with the blue cover", now with the intermediate N-bar levels N'1, N'2, N'3 between NP and the head N "book"]

As opposed to this tree
[Flat NP: the [AP big] [N book] [PP of poems] [PP with the blue cover], all attached directly under NP with no intermediate structure]

V-bar
What is the element in verbs corresponding to one-replacement for nouns? It is do-so (or did-so) replacement.

I [eat beans with a fork]
[Flat VP: [V eat] [NP beans] [PP with a fork]]
There is no constituent that groups together V and NP and excludes the PP.

Need for intermediate constituents
I [eat beans] with a fork, but Ram [does so] with a spoon.
[Tree: VP → V1'; V1' → V2' [PP with a fork]; V2' → [V eat] [NP beans]]
VP → V'
V' → V' (PP)
V' → V (NP)

How to target V1'
I [eat beans with a fork], and Ram [does so] too.
[Same tree: here does-so replaces V1', i.e., "eat beans with a fork"]
VP → V'
V' → V' (PP)
V' → V (NP)

Case of conjunction
[Tree for "eat beans and drink coffee in the afternoon": VP → V1'; V1' → V2' [PP in the afternoon]; V2' → V3' [Conj and] V4'; V3' → [V eat] [NP beans]; V4' → [V drink] [NP coffee]]

A-bar: adjectives
[Tree for "very bright blue and dull green": a conjunction of two APs, with intermediate A-bar levels A1'-A6' and [AP very bright] modifying "blue"]
AP → A'
A' → (AP) A'
A' → A (PP)

So-replacement for adjectives Ram is very serious about studies, but less so than Shyam

P-bar: prepositions
[Tree for "right off the table and into the trash": a conjunction of two P-bars under PP, P2' = [P off] [NP the table] modified by [AP right], and P3' = [P into] [NP the trash]]
PP → P'
P' → P' (PP)
P' → P (NP)

So-replacement for Prepositions Ram is utterly in debt, but Shyam is only partly so.

Complements and Adjuncts or Arguments and Adjuncts

Rules in bar notation: Noun
NP → (D) N'
N' → (AP) N'
N' → N' (PP)
N' → N (PP)

Rules in bar notation: Verb
VP → V'
V' → V' (PP)
V' → V (NP)

Rules in bar notation: Adjective
AP → A'
A' → (AP) A'
A' → A (PP)

Rules in bar notation: Preposition
PP → P'
P' → P' (PP)
P' → P (NP)

Introducing the “X factor” Let X stand for any category N, V, A, P Let XP stand for NP, VP, AP and PP Let X’ stand for N’, V’, A’ and P’

XP to X'
Collect the first-level rules:
NP → (D) N'
VP → V'
AP → A'
PP → P'
And produce: XP → (YP) X'

X' to X'
Collect the 2nd-level rules:
N' → (AP) N' or N' (PP)
V' → V' (PP)
A' → (AP) A'
P' → P' (PP)
And produce: X' → (ZP) X' or X' (ZP)

X' to X
Collect the 3rd-level rules:
N' → N (PP)
V' → V (NP)
A' → A (PP)
P' → P (NP)
And produce: X' → X (WP)

Basic observations about X and X'
X' → X (WP)
X' → X' (ZP)
X is called the Head.
Phrases must have Heads: the Headedness property.
Category of XP and X must match: Endocentricity.

Basic observations about X and X'
X' → X (WP)
X' → X' (ZP)
Sisters of X are complements; they roughly correspond to objects.
Sisters of X' are adjuncts; PPs and adjectives are typical adjuncts.
We have adjunct rules and complement rules.

Structural difference between complements and adjuncts
[Diagram: the complement WP is the sister of the head X under X'; the adjunct ZP is the sister of X' under a higher X' within XP]

Complements and Adjuncts in NPs
[Tree for "book of poems with red cover": the complement PP "of poems" is the sister of N "book"; the adjunct PP "with red cover" is the sister of N']

Any number of Adjuncts
[Tree for "book of poems with red cover from Oxford Press": adjunct PPs stack, each one the sister of an N' level]

Parsing Algorithm

A simplified grammar
S → NP VP
NP → DT N | N
VP → V ADV | V

A segment of English Grammar
S' → (C) S
S → {NP/S'} VP
VP → (AP+) (VAUX) V (AP+) ({NP/S'}) (AP+) (PP+) (AP+)
NP → (D) (AP+) N (PP+)
PP → P NP
AP → (AP) A

Example Sentence: "People laugh"
Lexicon:
people – N, V
laugh – N, V
The lexicon entries indicate that both Noun and Verb are possible categories for each word.

Top-Down Parsing
State | Backup State | Action
1. ((S) 1) | – | –
2. ((NP VP) 1) | – | –
3a. ((DT N VP) 1) | ((N VP) 1) | –
3b. ((N VP) 1) | – | –
4. ((VP) 2) | – | Consume "People"
5a. ((V ADV) 2) | ((V) 2) | –
6. ((ADV) 3) | ((V) 2) | Consume "laugh"; dead end, back up
5b. ((V) 2) | – | Consume "laugh", giving ((.) 3)
The number after the symbol list is the position of the input pointer.
Termination condition: all input consumed and no symbols remaining.
Note: input symbols can be pushed back on backtracking.
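The trace above can be sketched as a backtracking recursive-descent parser. This is a minimal illustration under the slide's grammar and lexicon, not the lecture's exact bookkeeping: there is no explicit backup-state list, since Python's call stack does the backtracking.

```python
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["DT", "N"], ["N"]],
    "VP": [["V", "ADV"], ["V"]],
}
lexicon = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def parse(symbols, words):
    """Rewrite the leftmost symbol until the word list is exactly consumed."""
    if not symbols:
        return not words                     # success iff all input consumed
    head, rest = symbols[0], symbols[1:]
    if head in grammar:                      # non-terminal: try rules in order
        return any(parse(rhs + rest, words) for rhs in grammar[head])
    # terminal category: consume one word if the lexicon licenses it
    return bool(words) and head in lexicon.get(words[0], set()) \
        and parse(rest, words[1:])

print(parse(["S"], ["people", "laugh"]))     # True
```

As in the trace, the expansion NP → DT N is tried first and fails ("people" is not a DT), and VP → V ADV fails because no input remains for the ADV; backtracking then finds NP → N and VP → V.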

Discussion of Top-Down Parsing
This kind of searching is goal-driven.
It gives importance to textual precedence (rule precedence).
It has no regard for the data a priori, so useless expansions are made.

Bottom-Up Parsing
Some conventions: N_12 denotes an N spanning positions 1 to 2.
In S_1? → NP_12 • VP_2?, the subscripts represent positions ("?" means the end position is as yet unknown); material left of the dot (•) is work done, material right of it is work remaining.

Bottom-Up Parsing (pictorial representation)
People (1–2), laugh (2–3):
N_12, V_12; N_23, V_23
NP_12 → N_12 •; NP_23 → N_23 •
VP_12 → V_12 •; VP_23 → V_23 •
S_1? → NP_12 • VP_2?
S → NP_12 VP_23 •
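The pictorial chart can be sketched as a naive bottom-up recogniser that fills a span chart upward from the lexical level. A minimal sketch (the rule encoding and the `bottom_up` helper are illustrative, not from the slides; spans are 0-indexed rather than the slides' 1-indexed positions):

```python
# Span chart: cell (i, j) holds every category derivable over words[i:j].
grammar = {("NP", "VP"): "S", ("DT", "N"): "NP", ("N",): "NP",
           ("V", "ADV"): "VP", ("V",): "VP"}
lexicon = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def bottom_up(words):
    n = len(words)
    chart = {(i, i + 1): set(lexicon[w]) for i, w in enumerate(words)}
    changed = True
    while changed:                      # keep applying rules until no change
        changed = False
        for (i, j), cats in list(chart.items()):       # unary rules
            for c in list(cats):
                new = grammar.get((c,))
                if new and new not in cats:
                    cats.add(new)
                    changed = True
        for k in range(1, n):                          # binary rules
            for i in range(k):
                for j in range(k + 1, n + 1):
                    for a in chart.get((i, k), set()):
                        for b in chart.get((k, j), set()):
                            new = grammar.get((a, b))
                            if new and new not in chart.setdefault((i, j), set()):
                                chart[(i, j)].add(new)
                                changed = True
    return chart

chart = bottom_up(["people", "laugh"])
print("S" in chart[(0, 2)])            # True: the sentence is accepted
```

Exactly as in the picture, both words get N and V (hence NP and VP) over their one-word spans, and only the NP_01 + VP_12 combination yields an S over the whole input.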

Problem with Top-Down Parsing: Left Recursion
Suppose the grammar has the rule A → A B. Then top-down expansion never terminates:
((A) K) → ((A B) K) → ((A B B) K) → …
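A few lines make the non-termination concrete: repeatedly rewriting the leading A by A → A B only grows the sentential form, so a naive top-down parser never reaches a terminal. Shown here with a cut-off, since the real expansion never stops:

```python
def expand_left_recursive(limit):
    """Expand the leading A using A -> A B, up to `limit` steps."""
    states, symbols = [], ["A"]
    for _ in range(limit):
        states.append(list(symbols))
        symbols = ["A", "B"] + symbols[1:]   # the sentential form only grows
    return states

print(expand_left_recursive(4))
# [['A'], ['A', 'B'], ['A', 'B', 'B'], ['A', 'B', 'B', 'B']]
```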

Combining top-down and bottom-up strategies

Top-Down Bottom-Up Chart Parsing
Combines the advantages of top-down and bottom-up parsing.
Does not work in case of left recursion.
E.g., "People laugh"
people – noun, verb; laugh – noun, verb
Grammar:
S → NP VP
NP → DT N | N
VP → V ADV | V

Transitive Closure: "People laugh" (positions 1 2 3)
At 1: S → • NP VP; NP → • N; NP → • DT N
At 2 (after "people"): NP → N •; S → NP • VP; VP → • V; VP → • V ADV
At 3 (after "laugh"): VP → V •; S → NP VP • (success); VP → V • ADV

Arcs in Parsing
Each arc represents a chart entry (a dotted rule) which records:
Completed work (left of the dot •)
Expected work (right of the dot)

Example: "People laugh loudly" (positions 1 2 3 4)
At 1: S → • NP VP; NP → • N; NP → • DT N
At 2 (after "people"): NP → N •; S → NP • VP; VP → • V; VP → • V ADV
At 3 (after "laugh"): VP → V •; VP → V • ADV; S → NP VP •
At 4 (after "loudly"): VP → V ADV •; S → NP VP •
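The chart construction above is essentially what an Earley-style recogniser does with its predict, scan, and complete steps. A minimal sketch (the `earley` helper and its item bookkeeping are simplified relative to the slides; positions are 0-indexed):

```python
def earley(words, grammar, lexicon, root="S"):
    """Minimal Earley-style recogniser. chart[i] holds items
    (lhs, rhs, dot, start): a dotted rule with `dot` symbols matched,
    started at position `start` and currently ending at position i."""
    n = len(words)
    chart = [set() for _ in range(n + 1)]
    chart[0].add(("GAMMA", (root,), 0, 0))           # dummy start item
    for i in range(n + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, start = agenda.pop()
            if dot < len(rhs) and rhs[dot] in grammar:           # predict
                for prod in grammar[rhs[dot]]:
                    item = (rhs[dot], tuple(prod), 0, i)
                    if item not in chart[i]:
                        chart[i].add(item); agenda.append(item)
            elif dot < len(rhs):                                 # scan (POS)
                if i < n and rhs[dot] in lexicon.get(words[i], set()):
                    chart[i + 1].add((lhs, rhs, dot + 1, start))
            else:                                                # complete
                for l2, r2, d2, s2 in list(chart[start]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        item = (l2, r2, d2 + 1, s2)
                        if item not in chart[i]:
                            chart[i].add(item); agenda.append(item)
    return ("GAMMA", (root,), 1, 0) in chart[n]

grammar = {"S": [["NP", "VP"]],
           "NP": [["DT", "N"], ["N"]],
           "VP": [["V", "ADV"], ["V"]]}
lexicon = {"people": {"N", "V"}, "laugh": {"N", "V"}, "loudly": {"ADV"}}
print(earley("people laugh loudly".split(), grammar, lexicon))   # True
```

Each dotted rule is an arc in the slides' sense: completed work left of the dot, expected work right of it.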

Dealing With Structural Ambiguity
Multiple parses for a sentence:
The man saw the boy with a telescope.
The man saw the mountain with a telescope.
The man saw the boy with the ponytail.
At the level of syntax, all these sentences are ambiguous, but semantics can disambiguate the 2nd and 3rd sentences.

Prepositional Phrase (PP) Attachment Problem
V – NP1 – P – NP2 (here P means preposition)
Does NP2 attach to NP1, or does NP2 attach to V?

Parse Trees for a Structurally Ambiguous Sentence
Let the grammar be:
S → NP VP
NP → DT N | DT N PP
PP → P NP
VP → V NP PP | V NP
For the sentence "I saw a boy with a telescope":

Parse Tree 1
(S (NP (N I)) (VP (V saw) (NP (Det a) (N boy) (PP (P with) (NP (Det a) (N telescope))))))
Here the PP attaches inside the NP "a boy".

Parse Tree 2
(S (NP (N I)) (VP (V saw) (NP (Det a) (N boy)) (PP (P with) (NP (Det a) (N telescope)))))
Here the PP attaches to the VP (the verb).

Parsing Structural Ambiguity

Topics to be covered Dealing with Structural Ambiguity Moving towards Dependency Parsing

Parsing for Structurally Ambiguous Sentences
Sentence: "I saw a boy with a telescope"
Grammar:
S → NP VP
NP → ART N | ART N PP | PRON
VP → V NP PP | V NP
ART → a | an | the
N → boy | telescope
PRON → I
V → saw

Ambiguous Parses
Two possible parses:
PP attached to the Verb (i.e., I used a telescope to see):
(S (NP (PRON "I")) (VP (V "saw") (NP (ART "a") (N "boy")) (PP (P "with") (NP (ART "a") (N "telescope")))))
PP attached to the Noun (i.e., the boy had a telescope):
(S (NP (PRON "I")) (VP (V "saw") (NP (ART "a") (N "boy") (PP (P "with") (NP (ART "a") (N "telescope"))))))
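Under this grammar, the degree of ambiguity can be computed by counting parses per span with memoisation. A minimal sketch (the `count_parses` helper is illustrative, not the lecture's algorithm; P → with is an assumption, since the slide's lexical rules omit the preposition):

```python
from functools import lru_cache

grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["ART", "N"], ["ART", "N", "PP"], ["PRON"]],
    "VP": [["V", "NP", "PP"], ["V", "NP"]],
    "PP": [["P", "NP"]],
}
# the slides' lexicon, plus P -> with (assumed; implied by PP -> P NP)
lexicon = {"I": {"PRON"}, "saw": {"V"}, "a": {"ART"}, "an": {"ART"},
           "the": {"ART"}, "boy": {"N"}, "telescope": {"N"}, "with": {"P"}}

def count_parses(words, symbol="S"):
    n = len(words)

    @lru_cache(maxsize=None)
    def count(sym, i, j):
        """Number of parse trees for `sym` spanning words[i:j]."""
        total = 1 if j == i + 1 and sym in lexicon.get(words[i], set()) else 0
        for rhs in grammar.get(sym, []):
            total += seq(tuple(rhs), i, j)
        return total

    @lru_cache(maxsize=None)
    def seq(rhs, i, j):
        """Number of ways the symbol sequence `rhs` spans words[i:j]."""
        if not rhs:
            return 1 if i == j else 0
        return sum(count(rhs[0], i, k) * seq(rhs[1:], k, j)
                   for k in range(i + 1, j + 1))

    return count(symbol, 0, n)

print(count_parses("I saw a boy with a telescope".split()))   # 2
```

The two counted trees are exactly the verb-attachment and noun-attachment parses above.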

Top Down Parse
Step | State | Backup State | Action | Comments
1 | ((S) 1) | − | − | Use S → NP VP
2 | ((NP VP) 1) | − | − | Use NP → ART N / ART N PP / PRON
3 | ((ART N VP) 1) | (a) ((ART N PP VP) 1); (b) ((PRON VP) 1) | − | ART does not match "I"; backup state (b) used
3B | ((PRON VP) 1) | − | − |
4 | ((VP) 2) | − | Consumed "I" |
5 | ((V NP PP) 2) | ((V NP) 2) | − | Verb Attachment Rule used
6 | ((NP PP) 3) | − | Consumed "saw" |
7 | ((ART N PP) 3) | (a) ((ART N PP PP) 3); (b) ((PRON PP) 3) | − |
8 | ((N PP) 4) | − | Consumed "a" |
9 | ((PP) 5) | − | Consumed "boy" |
10 | ((P NP) 5) | − | − |
11 | ((NP) 6) | − | Consumed "with" |
12 | ((ART N) 6) | (a) ((ART N PP) 6); (b) ((PRON) 6) | − |
13 | ((N) 7) | − | Consumed "a" |
14 | ((−) 8) | − | Consumed "telescope" | Finish parsing

Top Down Parsing - Observations
Top-down parsing gave us the Verb Attachment parse tree (i.e., I used a telescope).
To obtain the alternate parse tree, the backup state in step 5 will have to be invoked.
Is there an efficient way to obtain all parses?

Bottom Up Parse: "I saw a boy with a telescope"
Colour scheme used on the slides: blue for normal parse, green for verb-attachment parse, purple for noun-attachment parse, red for invalid parse.

Bottom Up Parse of "I saw a boy with a telescope" (word boundaries numbered 1–8). The chart entries, in the order built:
NP_12 → PRON_12; S_1? → NP_12 • VP_2?
VP_2? → V_23 • NP_3? PP_??; VP_2? → V_23 • NP_3?
NP_35 → ART_34 N_45; NP_3? → ART_34 N_45 • PP_5?
VP_25 → V_23 NP_35; S_15 → NP_12 VP_25; VP_2? → V_23 NP_35 • PP_5?
PP_5? → P_56 • NP_6?
NP_68 → ART_67 N_78; NP_6? → ART_67 N_78 • PP_8?
PP_58 → P_56 NP_68
NP_38 → ART_34 N_45 PP_58 (noun attachment)
VP_28 → V_23 NP_38; VP_28 → V_23 NP_35 PP_58 (verb attachment)
S_18 → NP_12 VP_28 (both parses complete)

Bottom Up Parsing - Observations Both Noun Attachment and Verb Attachment Parses obtained by simply systematically applying the rules Numbers in subscript help in verifying the parse and getting chunks from the parse

Exercise For the sentence, “The man saw the boy with a telescope” & the grammar given previously, compare the performance of top-down, bottom-up & top-down chart parsing.

Verb Alternation (1/2) (ref: Natural Language Understanding, James Allen)
Verb | Complement Structure | Example
laugh | Empty (intransitive) | Ram laughed
find | NP (transitive) | Ram found the key
give | NP+NP (ditransitive) | Ram gave Sita the paper
give | NP+PP[to] | Ram gave the paper to Sita
reside | Loc. phrase | Ram resides in Mumbai
put | NP+loc. phrase | Ram put the book inside the box
speak | PP[with]+PP[about] | Ram spoke with Sita about floods
try | VP[to] | Ram tried to apologise
tell | NP+VP[to] | Ram told the man to go

Verb Alternation (2/2)
Verb | Complement Structure | Example
wish | S[to] | Ram wished for the man to go
keep | VP[ing] | Ram keeps hoping for the best
catch | NP+VP[ing] | Ram caught Shyam looking in his desk
watch | NP+VP[base] | Ram watched Shyam eat the pizza
regret | S[that] | Ram regretted that he had eaten the whole thing
tell | NP+S[that] | Ram told Sita that he was sorry
seem | ADJP | Ram seems unhappy in his new job
think | NP+ADJP | Ram thinks Sita is happy
know | S[wh] | Ram knows where to go
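The subcategorisation tables can be read as a lexicon mapping each verb to the complement frames it licenses. A minimal sketch (the frame names and the `licenses` helper are hypothetical, not from the lecture):

```python
# A few rows of the table as a lexicon of licensed complement frames.
subcat = {
    "laugh": [[]],                          # empty frame: intransitive
    "find":  [["NP"]],                      # transitive
    "give":  [["NP", "NP"], ["NP", "PP[to]"]],
    "put":   [["NP", "LOC"]],
    "tell":  [["NP", "VP[to]"], ["NP", "S[that]"]],
    "seem":  [["ADJP"]],
}

def licenses(verb, frame):
    """Does `verb` allow this complement structure?"""
    return frame in subcat.get(verb, [])

print(licenses("give", ["NP", "NP"]))      # True: "Ram gave Sita the paper"
print(licenses("laugh", ["NP"]))           # False: no transitive frame
```

A parser can use such a lexicon to reject analyses whose verb frame is not licensed, e.g. a VP that gives "laugh" a direct object.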