Presentation transcript:

CSE391 – 2005 NLP 1 Events (from the KRR lecture)

CSE391 – 2005 NLP 2 Ask Jeeves – a Q/A, IR example
Query: What do you call a successful movie?
Retrieved hits:
– Tips on Being a Successful Movie Vampire...
– I shall call the police.
– Successful Casting Call & Shoot for "Clash of Empires"... thank everyone for their participation in the making of yesterday's movie.
– Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
– VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer. Blockbuster

CSE391 – 2005 NLP 3 Ask Jeeves – filtering with POS tags
Query: What do you call a successful movie?
The same hits; POS tagging separates verb uses of "call" from noun uses:
– Tips on Being a Successful Movie Vampire...
– I shall call the police.
– Successful Casting Call & Shoot for "Clash of Empires"... thank everyone for their participation in the making of yesterday's movie.
– Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
– VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.
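This kind of filter can be sketched with NLTK's off-the-shelf tokenizer and tagger: keep a hit only if "call" is tagged as a verb. A minimal sketch, not the Ask Jeeves pipeline; it assumes nltk is installed with the 'punkt' and 'averaged_perceptron_tagger' resources downloaded.

    # POS-based filtering with NLTK (assumes the 'punkt' and
    # 'averaged_perceptron_tagger' resources are available).
    import nltk

    def call_is_verb(snippet):
        """True if some token "call" in the snippet is tagged as a verb (VB*)."""
        tagged = nltk.pos_tag(nltk.word_tokenize(snippet))
        return any(w.lower() == "call" and tag.startswith("VB")
                   for w, tag in tagged)

    print(call_is_verb("I shall call the police."))            # expected: True (verb)
    print(call_is_verb("Successful Casting Call for extras"))  # expected: False (noun)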

CSE391 – 2005 NLP 4 Filtering out "call the police"
call(you, movie, what) ≠ call(you, police)
Different senses:
– different syntax
– different participants (you, movie, what vs. you, police)

CSE391 – 2005 NLP 5 "Meaning"
Shallow semantic annotation:
– captures critical dependencies
– highlights participants
– reflects clear sense distinctions
– supports training of supervised automatic taggers
– works for other languages

CSE391 – 2005 NLP 6 Cornerstone: an English lexical resource
– that provides sets of possible syntactic frames for verbs
– and that provides clear, replicable sense distinctions
Ask Jeeves: Who do you call for a good electronic lexical database for English?

CSE391 – 2005 NLP 7 WordNet – Princeton (Miller 1985, Fellbaum 1998)
On-line lexical reference (dictionary)
– Nouns, verbs, adjectives, and adverbs grouped into synonym sets (synsets)
– Other relations include hypernyms (ISA), antonyms, meronyms

CSE391 – 2005 NLP 8 WordNet

CSE391 – 2005 NLP 9 WordNet – call, 28 senses
1. name, call -- (assign a specified, proper name to; "They named their son David"; …) -> LABEL
2. call, telephone, call up, phone, ring -- (get or try to get into communication (with someone) by telephone; "I tried to call you all night"; …) -> TELECOMMUNICATE
3. call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard"; …) -> LABEL
4. call, send for -- (order, request, or command to come; "She was called into the director's office"; "Call the police!") -> ORDER
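These senses can be pulled up programmatically; a minimal sketch with NLTK's WordNet interface (assumes the 'wordnet' corpus is downloaded; sense numbering and counts vary with the WordNet version, so the 28 on the slide may not match exactly):

    # List the verb senses of "call" from WordNet via NLTK.
    from nltk.corpus import wordnet as wn

    for i, synset in enumerate(wn.synsets("call", pos=wn.VERB), start=1):
        print(i, synset.name(), "-", synset.definition())
    # e.g. 1 name.v.01 - assign a specified (recurring) proper name to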

CSE391 – 2005 NLP 10 WordNet – Princeton (Miller 1985, Fellbaum 1998)
Limitations as a computational lexicon:
– Contains little syntactic information (COMLEX has syntax but no sense distinctions)
– No explicit lists of participants
– Sense distinctions very fine-grained; definitions often vague
This causes problems when creating training data for supervised machine learning (SENSEVAL-2):
– Verbs with > 16 senses (including call)
– Inter-annotator agreement (ITA) 73%; automatic word sense disambiguation (WSD) 60.2% (Dang & Palmer, SIGLEX02)

CSE391 – 2005 NLP 11 WordNet – call, 28 senses
(diagram: the 28 WordNet senses of "call" scattered, as yet ungrouped)

CSE391 – 2005 NLP 12 WordNet – call, 28 senses, Senseval-2 groups (engineering!)
(diagram: the 28 senses clustered into groups: Loud cry; Label; Phone/radio; Bird or animal cry; Request; Call a loan/bond; Visit; Challenge; Bid)

CSE391 – 2005 NLP 13 Grouping improved scores: ITA 82%, MaxEnt WSD 69%
Call: 31% of errors were due to confusion between senses within the same group 1:
– name, call -- (assign a specified, proper name to; "They named their son David")
– call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard")
– call -- (consider or regard as being; "I would not call her beautiful")
75% accuracy with training and testing on grouped senses vs. 43% with training and testing on fine-grained senses. (Palmer, Dang, Fellbaum, submitted, NLE)
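The gain from grouping amounts to scoring predictions modulo a sense-to-group map. A toy sketch; the group table below is an invented fragment for illustration, not the actual Senseval-2 grouping:

    # Scoring modulo a sense-to-group map (illustrative groups only).
    GROUP = {"WN1": "Label", "WN3": "Label", "WN2": "Phone/radio"}

    def accuracy(gold, predicted, groups=None):
        """Exact-sense accuracy, or group accuracy if a map is given."""
        norm = (lambda s: groups.get(s, s)) if groups else (lambda s: s)
        hits = sum(norm(g) == norm(p) for g, p in zip(gold, predicted))
        return hits / len(gold)

    gold = ["WN1", "WN2", "WN3"]
    pred = ["WN3", "WN2", "WN1"]        # confuses WN1 with WN3
    print(accuracy(gold, pred))         # 0.33... fine-grained
    print(accuracy(gold, pred, GROUP))  # 1.0   WN1 and WN3 share a group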

CSE391 – 2005 NLP 14 Proposition Bank: from sentences to propositions (predicates!)
Powell met Zhu Rongji
Powell met with Zhu Rongji
Powell and Zhu Rongji met
Powell and Zhu Rongji had a meeting
...
-> Proposition: meet(Powell, Zhu Rongji); schematically meet(Somebody1, Somebody2), and likewise for debate, consult, join, wrestle, battle
When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
-> meet(Powell, Zhu)
-> discuss([Powell, Zhu], return(X, plane))
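As a data structure, this normalization is just a predicate name plus an argument tuple; a minimal sketch (the class and names are illustrative, not PropBank's actual file format):

    # A proposition as a normalized predicate-argument structure.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Proposition:
        predicate: str
        args: tuple

    # All four surface variants above normalize to the same object:
    p = Proposition("meet", ("Powell", "Zhu Rongji"))
    print(p)  # Proposition(predicate='meet', args=('Powell', 'Zhu Rongji'))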

CSE391 – 2005 NLP 15 Capturing semantic roles*
Richard broke [ARG1 the laser pointer].
[ARG1 The windows] were broken by the hurricane.
[ARG1 The vase] broke into pieces when it toppled over.
The same ARG1 role surfaces as the object in the first sentence and as the SUBJ in the other two.
*See also FrameNet.

CSE391 – 2005 NLP 16 Word senses in PropBank
Orders to ignore word sense proved not feasible for 700+ verbs:
– Mary left the room
– Mary left her daughter-in-law her pearls in her will
Frameset leave.01 "move away from": Arg0: entity leaving; Arg1: place left
Frameset leave.02 "give": Arg0: giver; Arg1: thing given; Arg2: beneficiary
How do these relate to traditional word senses in VerbNet and WordNet?
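Written out as data, the two framesets are small role inventories; an illustrative encoding (field names are invented, not the official frame-file schema):

    # The two "leave" framesets as role inventories (illustrative only).
    FRAMESETS = {
        "leave.01": {"gloss": "move away from",
                     "roles": {"Arg0": "entity leaving",
                               "Arg1": "place left"}},
        "leave.02": {"gloss": "give",
                     "roles": {"Arg0": "giver",
                               "Arg1": "thing given",
                               "Arg2": "beneficiary"}},
    }
    # "Mary left the room"                       -> leave.01
    # "Mary left her daughter-in-law her pearls" -> leave.02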

CSE391 – 2005 NLP 17 WordNet – call, 28 senses, groups
(the diagram from slide 12 again: the 28 senses clustered into the Senseval-2 groups, from Loud cry to Bid)

CSE391 – 2005 NLP 18 Overlap with PropBank framesets
(diagram: the same grouped senses of "call", overlaid with the PropBank framesets the groups map to)

CSE391 – 2005 NLP 19 Overlap between Senseval-2 groups and framesets – 95%
(diagram for the verb "develop": WordNet senses WN1–WN20 partitioned between Frameset1 and Frameset2)
Palmer, Babko-Malaya, Dang, SNLU 2004

CSE391 – 2005 NLP 20 Sense hierarchy
– PropBank framesets – coarse-grained distinctions: on the 20 Senseval-2 verbs with > 1 frameset, a MaxEnt WSD system reaches 90% accuracy (73.5% baseline)
– Sense groups (Senseval-2) – intermediate level (includes Levin classes): 69%
– WordNet – fine-grained distinctions: 60.2%

CSE391 – 2005 NLP 21 Maximum entropy WSD
Hoa Dang – best performer on verbs. Maximum entropy framework: p(sense | context).
Contextual linguistic features for the target word W:
– Topical features: +2.5% – keywords (determined automatically)
– Local syntactic features: +1.5 to +5% – presence of subject, complements, passive; words in subject and complement positions; particles, prepositions, etc.
– Local semantic features: +6% – semantic class information from WordNet (synsets, etc.); Named Entity tags (PERSON, LOCATION, ...) for proper nouns; words within a +/- 2 word window
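A maximum entropy model of p(sense | context) is equivalent to multinomial logistic regression, so it can be sketched with scikit-learn over topical and local-window features. The training examples and feature templates below are invented for illustration, not Dang's actual system:

    # Maxent p(sense | context) via logistic regression (toy data).
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def features(sentence, target_index):
        """Topical keywords plus a +/-2 word window around the target."""
        words = sentence.lower().split()
        feats = {f"kw={w}": 1 for w in words}          # topical
        for off in (-2, -1, 1, 2):                     # local window
            i = target_index + off
            if 0 <= i < len(words):
                feats[f"w{off}={words[i]}"] = 1
        return feats

    train = [("I will call you on the phone tonight", 2, "telephone"),
             ("please call the office number",        1, "telephone"),
             ("they call him a hero",                 1, "label"),
             ("critics call the film a success",      1, "label")]

    X = [features(s, i) for s, i, _ in train]
    y = [sense for _, _, sense in train]
    model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(X, y)
    print(model.predict([features("call me on my phone", 0)]))
    # expected: ['telephone']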

CSE391 – 2005 NLP 22 Context sensitivity
Programming languages are context-free. Are natural languages context-sensitive?
– Movement
– Features
– respectively: John, Mary and Bill ate peaches, pears and apples, respectively.

CSE391 – 2005 NLP 23 The Chomsky grammar hierarchy
– Regular grammars (e.g. aabbbb): S → aS | bS | nil
– Context-free grammars (e.g. aaabbb): S → aSb | nil
– Context-sensitive grammars (e.g. aaabbbccc): rules may rewrite a symbol in context, e.g. xSy → xby
– Turing machines
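Toy recognizers make the three example languages concrete (a sketch; each function decides membership in the corresponding language from the slide):

    import re

    # Regular: strings over {a, b} are a regular-expression check.
    def regular(s):
        return re.fullmatch(r"[ab]*", s) is not None

    # Context-free: a^n b^n needs one count, which a stack provides.
    def context_free(s):
        n = len(s) // 2
        return s == "a" * n + "b" * n

    # Context-sensitive: a^n b^n c^n needs two coordinated counts.
    def context_sensitive(s):
        n = len(s) // 3
        return s == "a" * n + "b" * n + "c" * n

    print(regular("aabbbb"), context_free("aaabbb"),
          context_sensitive("aaabbbccc"))  # True True True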

CSE391 – 2005 NLP 24 Recursive transition nets for CFGs
s :- np, vp.
np :- pronoun; noun; det, adj, noun; np, pp.
(diagram: the S network, states S1 → S2 → S3 via NP then VP arcs; the NP network, states S4 → S5 → S6 via det, adj, noun arcs, with alternative noun and pronoun arcs and a pp loop)
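The rules translate directly into a recursive-descent recognizer. A sketch with an invented mini-lexicon; the left-recursive alternative (np :- np, pp) is unrolled as a loop over trailing PPs, and since the slide leaves vp unspecified, a VP of verb plus optional object NP is assumed:

    # Recursive-descent recognizer for the slide's grammar (toy lexicon).
    LEXICON = {"det": {"the", "a"}, "adj": {"big"},
               "noun": {"dog", "cat"}, "pronoun": {"she"},
               "verb": {"ran", "saw"}, "prep": {"in"}}

    def cat(word, c):
        return word in LEXICON[c]

    def parse_np(words, i):
        """Return the position after an NP starting at i, or None."""
        if i < len(words) and cat(words[i], "pronoun"):
            j = i + 1
        elif (i + 2 < len(words) and cat(words[i], "det")
              and cat(words[i + 1], "adj") and cat(words[i + 2], "noun")):
            j = i + 3
        elif i < len(words) and cat(words[i], "noun"):
            j = i + 1
        else:
            return None
        while j + 1 < len(words) and cat(words[j], "prep"):  # np :- np, pp
            k = parse_np(words, j + 1)
            if k is None:
                break
            j = k
        return j

    def parse_s(words):
        """s :- np, vp, with vp assumed to be verb plus optional NP."""
        j = parse_np(words, 0)
        if j is None or j >= len(words) or not cat(words[j], "verb"):
            return False
        k = parse_np(words, j + 1)
        return (k if k is not None else j + 1) == len(words)

    print(parse_s("the big dog saw the big cat".split()))  # True
    print(parse_s("she ran".split()))                      # True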

CSE391 – 2005 NLP 25 Most parsers are Turing machines
– to give a more natural and comprehensible treatment of movement
– for a more efficient treatment of features
– not because of respectively – most parsers can't handle it

CSE391 – 2005 NLP 26 Nested dependencies and crossing dependencies
Nested (context-free):
– The dog chased the cat that bit the mouse that ran.
– The mouse the cat the dog chased bit ran.
Crossing (context-sensitive):
– John, Mary and Bill ate peaches, pears and apples, respectively.
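A stack is exactly the memory a CFG adds, and it handles nested (last-opened, first-closed) dependencies but not crossing ones. An illustrative check over open/close events for the dependencies above:

    # Nested dependencies obey a last-in-first-out discipline, so a stack
    # suffices (the CF case); crossing dependencies would have to close
    # the OLDEST open item first, which a stack cannot do.
    def nested_ok(events):
        stack = []
        for action, label in events:
            if action == "open":
                stack.append(label)
            elif not stack or stack.pop() != label:
                return False
        return not stack

    # "The mouse the cat ... chased bit ran": dependencies nest.
    print(nested_ok([("open", "mouse"), ("open", "cat"),
                     ("close", "cat"), ("close", "mouse")]))   # True
    # "John, Mary ... peaches, pears ... respectively": they cross.
    print(nested_ok([("open", "John"), ("open", "Mary"),
                     ("close", "John"), ("close", "Mary")]))   # False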

CSE391 – 2005 NLP 27 Movement
What did John give to Mary?
*Where did John give to Mary?
John gave cookies to Mary.
John gave to Mary.

CSE391 – 2005 NLP 28 Handling movement: hold registers / slash categories
S :- Wh, S/NP
S/NP :- VP
S/NP :- NP, VP/NP
VP/NP :- Verb
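Slash categories can be implemented by threading a gap flag through the parse, so that S/NP is just S parsed with gap=True and the missing NP is realized as an empty trace. A minimal sketch with an invented toy lexicon (auxiliary do-support and verb morphology are glossed over):

    # Gap threading: S/NP is "an S missing an NP"; the fronted wh-phrase
    # licenses one gap, which fills the verb's object slot as a trace.
    NPS = {"John", "Mary", "cookies"}
    VERBS = {"give", "gave"}

    def parse_np(words, i, gap):
        if gap:                                  # NP/NP: an empty trace
            return i
        if i < len(words) and words[i] in NPS:
            return i + 1
        return None

    def parse_vp(words, i, gap):
        """VP :- verb, NP, 'to', NP; the gap may fill the object NP."""
        if i < len(words) and words[i] in VERBS:
            j = parse_np(words, i + 1, gap)      # object, possibly gapped
            if j is not None and words[j:j + 1] == ["to"]:
                return parse_np(words, j + 1, False)
        return None

    def parse_s(words, i, gap):
        """S :- NP, VP; pass gap=True to get S/NP."""
        j = parse_np(words, i, False)
        return None if j is None else parse_vp(words, j, gap)

    def parse_question(words):
        """S :- Wh, S/NP, per the slash rules on the slide."""
        return (words[0] in {"What", "Who"} and words[1] == "did"
                and parse_s(words, 2, gap=True) == len(words))

    print(parse_question("What did John give to Mary".split()))         # True
    print(parse_s("John gave cookies to Mary".split(), 0, False) == 5)  # True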