CS 182 Sections 103 - 104 Eva Mok April 21, 2004.

Presentation transcript:

CS 182 Sections 103 - 104 Eva Mok April 21, 2004

Announcements
–a9 out sometime Thursday, due Wednesday April 22nd, 11:59pm
–How’s your progress on the BBS articles?

Schedule
Last Week
–SHRUTI
–Formal Grammar and Parsing
This Week
–Construction Grammar, ECG
–Psychological model of sentence processing
Next Week
–Language learning
–Learning ECG

Quiz
1. What is top-down parsing? Using a plausible CFG grammar, what is the top-down parse of “Pat ate the kiwi”?
2. How well can CFGs represent English? What are some mechanisms for improvement?
3. What are constructions?
4. What is the difference between subcasing and evoking another schema or construction?

Quiz
1. What is top-down parsing? Using a plausible CFG grammar, what is the top-down parse of “Harry likes the cafe”?
2. How well can CFGs represent English? What are some mechanisms for improvement?
3. What are constructions?
4. What is the difference between subcasing and evoking another schema or construction?

Grammar
A grammar is a set of rules defining a formal language
–an example is a right-regular grammar
–a more common example is a Context-Free Grammar (CFG)
CFG rules have the form A → β, where A is a single non-terminal and β is any combination of terminals and non-terminals

S → NP VP
NP → Det Noun | ProperNoun
VP → Verb NP | Verb PP
PP → Preposition NP
Noun → kiwi | orange | store
ProperNoun → Pat | I
Det → a | an | the
Verb → ate | went | shop
Preposition → to | at

Top Down Parsing: “Pat ate the kiwi”
start from S and apply all applicable rules
forward search (use your favorite search algorithm…)

S → NP VP
NP → Det Noun | ProperNoun
VP → Verb NP | Verb PP
PP → Preposition NP
Noun → kiwi | orange | store
ProperNoun → Pat | I
Det → a | an | the
Verb → ate | went | shop
Preposition → to | at

Search states (DFS):
S
NP VP
Det Noun VP, ProperNoun VP
a Noun VP, an Noun VP, the Noun VP
a kiwi VP, a orange VP, a store VP
…
succeed when you encounter “Pat ate the kiwi” in a state without any non-terminals
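The top-down search above can be sketched as a short program. This is a minimal illustration, not the course's reference implementation: `GRAMMAR` encodes the toy CFG from the slide, and `top_down_parse` does a depth-first search that always expands the leftmost non-terminal, pruning states that are already longer than the input.

```python
# Toy CFG from the slide: each non-terminal maps to its alternative
# right-hand sides (lists of symbols).
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "Noun"], ["ProperNoun"]],
    "VP": [["Verb", "NP"], ["Verb", "PP"]],
    "PP": [["Preposition", "NP"]],
    "Noun": [["kiwi"], ["orange"], ["store"]],
    "ProperNoun": [["Pat"], ["I"]],
    "Det": [["a"], ["an"], ["the"]],
    "Verb": [["ate"], ["went"], ["shop"]],
    "Preposition": [["to"], ["at"]],
}

def top_down_parse(state, words):
    """DFS from `state` (a list of symbols): True if it derives `words`."""
    if all(s not in GRAMMAR for s in state):   # all terminals: done searching
        return state == words
    if len(state) > len(words):                # rules never shrink the state
        return False
    i = next(j for j, s in enumerate(state) if s in GRAMMAR)  # leftmost NT
    return any(top_down_parse(state[:i] + rhs + state[i + 1:], words)
               for rhs in GRAMMAR[state[i]])

print(top_down_parse(["S"], "Pat ate the kiwi".split()))  # True
```

Starting the search from `["S"]` mirrors the slide: the state `S` expands to `NP VP`, then to `ProperNoun VP`, and eventually reaches the terminal string `Pat ate the kiwi`.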

Bottom Up Parsing: “Pat ate the kiwi”
start from the sentence and try to match non-terminals to it
backward search (use your favorite search algorithm…)

S → NP VP
NP → Det Noun | ProperNoun
VP → Verb NP | Verb PP
PP → Preposition NP
Noun → kiwi | orange | store
ProperNoun → Pat | I
Det → a | an | the
Verb → ate | went | shop
Preposition → to | at

Reduction sequence (DFS):
Pat ate the kiwi
ProperNoun ate the kiwi
NP ate the kiwi
NP Verb the kiwi
NP Verb Det kiwi
NP Verb Det Noun
NP Verb NP
NP VP
S
succeed when you encounter S in a state by itself
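The bottom-up direction can be sketched the same way. This is a hedged illustration (names like `RULES` and `bottom_up_parse` are mine): instead of expanding non-terminals, the search repeatedly replaces any substring matching a rule's right-hand side with that rule's left-hand side, succeeding when only `S` remains.

```python
# Toy CFG from the slide as (lhs, rhs) reduction rules.
RULES = [
    ("S", ["NP", "VP"]),
    ("NP", ["Det", "Noun"]), ("NP", ["ProperNoun"]),
    ("VP", ["Verb", "NP"]), ("VP", ["Verb", "PP"]),
    ("PP", ["Preposition", "NP"]),
    ("Noun", ["kiwi"]), ("Noun", ["orange"]), ("Noun", ["store"]),
    ("ProperNoun", ["Pat"]), ("ProperNoun", ["I"]),
    ("Det", ["a"]), ("Det", ["an"]), ("Det", ["the"]),
    ("Verb", ["ate"]), ("Verb", ["went"]), ("Verb", ["shop"]),
    ("Preposition", ["to"]), ("Preposition", ["at"]),
]

def bottom_up_parse(state):
    """DFS over reductions: True if `state` reduces to ["S"]."""
    if state == ["S"]:
        return True
    for lhs, rhs in RULES:
        for i in range(len(state) - len(rhs) + 1):
            if state[i:i + len(rhs)] == rhs:
                # Replace the matched right-hand side with the non-terminal.
                if bottom_up_parse(state[:i] + [lhs] + state[i + len(rhs):]):
                    return True
    return False

print(bottom_up_parse("Pat ate the kiwi".split()))  # True
```

The successful branch is exactly the reduction sequence on the slide: the words reduce to preterminals, `Det Noun` to `NP`, `Verb NP` to `VP`, and `NP VP` to `S`.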

Quiz
1. What is top-down parsing? Using a plausible CFG grammar, what is the top-down parse of “Harry likes the cafe”?
2. How well can CFGs represent English? What are some mechanisms for improvement?
3. What are constructions?
4. What is the difference between subcasing and evoking another schema or construction?

Notice the ungrammatical and/or odd sentences that we can generate?
* Pat ate a orange
* Pat shop at the store
* Pat went a store
? Pat ate a store
? The kiwi went to an orange

S → NP VP
NP → Det Noun | ProperNoun
VP → Verb NP | Verb PP
PP → Preposition NP
Noun → kiwi | orange | store
ProperNoun → Pat | I
Det → a | an | the
Verb → ate | went | shop
Preposition → to | at

need to capture agreement, subcategorization, etc.
you could make many versions of verbs, nouns, dets → cumbersome
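You can see the overgeneration directly by enumerating the toy grammar's entire (finite) language. A small hedged demo, reusing the same `GRAMMAR` encoding as before: `expand` yields every terminal string derivable from a symbol, and membership checks show that the starred sentences above are all licensed.

```python
from itertools import product

# Toy CFG from the slide.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "Noun"], ["ProperNoun"]],
    "VP": [["Verb", "NP"], ["Verb", "PP"]],
    "PP": [["Preposition", "NP"]],
    "Noun": [["kiwi"], ["orange"], ["store"]],
    "ProperNoun": [["Pat"], ["I"]],
    "Det": [["a"], ["an"], ["the"]],
    "Verb": [["ate"], ["went"], ["shop"]],
    "Preposition": [["to"], ["at"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in GRAMMAR:
        yield symbol
        return
    for rhs in GRAMMAR[symbol]:
        # Cartesian product of the expansions of each right-hand-side symbol.
        for parts in product(*(list(expand(s)) for s in rhs)):
            yield " ".join(parts)

language = set(expand("S"))
print(len(language))                            # 1089 sentences in total
print("Pat shop at the store" in language)      # True: no agreement check
print("Pat ate a orange" in language)           # True: no a/an selection
```

The grammar licenses 1089 sentences (11 NPs times 99 VPs), and nothing in the CFG rules out the agreement and determiner-selection violations, which is exactly the motivation for the feature structures on the next slide.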

Unification Grammar
Basic idea: capture these agreement features for each non-terminal in feature structures
Enforce constraints on these features using unification rules

Pat: agreement [ number: SG, person: 3rd ]
I: agreement [ number: SG, person: 1st ]
went: agreement [ ]
shop: agreement [ number: _, person: 1st ]

VP → Verb NP, with VP.agreement ↔ Verb.agreement
S → NP VP, with NP.agreement ↔ VP.agreement
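The unification idea can be sketched over flat attribute-value dictionaries. This is a deliberately minimal illustration (real feature structures also handle nesting and reentrancy, and the function name `unify` is mine): two structures unify when their shared features agree, and a clash makes the combination fail.

```python
def unify(fs1, fs2):
    """Merge two flat feature dicts; None signals failure (a feature clash)."""
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result and result[feat] != val:
            return None                # clash, e.g. 3rd vs 1st person
        result[feat] = val
    return result

# Agreement features from the slide.
pat  = {"number": "SG", "person": "3rd"}
i_   = {"number": "SG", "person": "1st"}
shop = {"person": "1st"}               # "shop" wants a 1st-person subject
went = {}                              # "went" is unconstrained

print(unify(i_, shop))   # {'number': 'SG', 'person': '1st'}: "I shop" is fine
print(unify(pat, shop))  # None: "*Pat shop" fails agreement
print(unify(pat, went))  # {'number': 'SG', 'person': '3rd'}: "Pat went" is fine
```

Enforcing the rule annotation `NP.agreement ↔ VP.agreement` then amounts to requiring that the subject's and verb's feature structures unify, which blocks "*Pat shop" without multiplying out versions of every verb and noun.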

Quiz
1. What is top-down parsing? Using a plausible CFG grammar, what is the top-down parse of “Harry likes the cafe”?
2. How well can CFGs represent English? What are some mechanisms for improvement?
3. What are constructions?
4. What is the difference between subcasing and evoking another schema or construction?

Embodied constructions (ECG Notation)

construction HARRY
  form : /hEriy/
  meaning : Harry

construction CAFE
  form : /khaefej/
  meaning : Cafe

Constructions have form and meaning poles that are subject to type constraints.

Quiz
1. What is top-down parsing? Using a plausible CFG grammar, what is the top-down parse of “Harry likes the cafe”?
2. How well can CFGs represent English? What are some mechanisms for improvement?
3. What are constructions?
4. What is the difference between subcasing and evoking another schema or construction?

A schema hierarchy of objects (Nomi)

schema Entity

schema Place

schema Object
  subcase of Entity

schema Referent
  subcase of Entity
  roles
    category
    distribution
    boundedness
    number
    gender
    accessibility
    resolved-ref

schema Physical-Object
  subcase of Object, Place

schema Animate
  subcase of Physical-Object
  roles
    animacy
  constraints
    animacy ← true

schema Human
  subcase of Animate
  roles
    sex

schema Nomi
  subcase of Human
  sex ← female          (slot filler)

schema Manipulable-Object
  subcase of Physical-Object

schema Toy
  subcase of Manipulable-Object

schema Ball
  subcase of Toy

schema Cup
  subcase of Manipulable-Object

The schemas we just defined
Entity, Place, Object, Physical-Object, Referent, Animate, Manipulable-Object, Human, Toy, Nomi, Mother, Father, Book, Cup, Ball, Block, Food, Doll, Bed, Cake, Juice

A schema hierarchy of actions (Nomi)

schema Action
  roles
    agent : Entity          (type constraint)

schema DirectedAction
  subcase of Action
  roles
    patient : Entity

schema Move
  subcase of Action
  roles
    mover : Entity
    direction : Place

schema CauseMove
  subcase of DirectedAction, Move
  roles
    causer : Human
    mover : Physical-Object
    motion : Move
  constraints
    motion.mover ↔ mover          (identification constraint)
    motion.agent ↔ causer
    motion.direction ↔ direction
    agent ↔ causer
    patient ↔ mover
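One way to picture the identification constraints is as shared slots: binding two roles with ↔ means they refer to the same value cell, so filling one fills the other. The sketch below is not the actual ECG implementation, just a hedged illustration of that idea using a tiny `Slot` class of my own.

```python
class Slot:
    """A shared value cell; roles bound by ↔ all point at the same Slot."""
    def __init__(self):
        self.filler = None

# Per the CauseMove constraints above, these roles are identified:
# agent ↔ causer, patient ↔ mover, and the evoked Move's roles
# (motion.agent, motion.mover, motion.direction) share the same cells.
causer, mover, direction = Slot(), Slot(), Slot()

cause_move = {"agent": causer, "causer": causer,
              "patient": mover, "mover": mover,
              "direction": direction}
motion = {"agent": causer, "mover": mover, "direction": direction}

cause_move["causer"].filler = "Nomi"   # fill one role...
print(motion["agent"].filler)          # "Nomi": the identified role follows
```

This is why an identification constraint is different from a type constraint (`agent : Entity`): the former makes two roles one and the same, while the latter only restricts what may fill a role.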

The schemas we just defined
Action, DirectedAction, Move, Cause-Move, JointMove, Transfer

Constructions, finally (Nomi)

construction Ref-Expr
  form : Schematic-Form
  meaning : Referent

construction Nomi-Cn
  level 0
  subcase of Ref-Expr
  form : Word
    self.f.orth ← "Nomi"
  meaning
    evokes Nomi as n          (local name)
    self.m.category ↔ n       (a fancy way of saying that the category of the referent is Nomi)
    self.m.resolved-ref ↔ n

construction Cup-Cn
  level 0
  subcase of Ref-Expr
  form : Word
    self.f.orth ← "cup"
  meaning
    evokes Cup as n
    self.m.category ↔ n
    self.m.resolved-ref ↔ n

Constructions, finally (Nomi)

construction Motion-Verb
  meaning : Move

construction Cause-Motion-Verb
  subcase of Motion-Verb
  meaning : CauseMove

construction Get-Cn          (lexical construction)
  level 0
  subcase of Cause-Motion-Verb
  form : Word
    self.f.orth ← "get"

Constructions, finally (Nomi)

construction Transitive-Cn
  level 2
  constructional
    constituents                (smaller constructions that it takes)
      agt : Ref-Expr
      v : Cause-Motion-Verb
      obj : Ref-Expr
  form                          (ordering constraints on the constituents)
    agt.f before v.f
    v.f before obj.f
  meaning
    v.m.agent ↔ agt.m.resolved-ref
    v.m.patient ↔ obj.m.resolved-ref