
CSE391 – 2005 NLP

Slide 1: Events (from the KRR lecture)

Slide 2: Ask Jeeves – a Q/A and IR example

Query: What do you call a successful movie?

Returned results:
- Tips on Being a Successful Movie Vampire ... I shall call the police.
- Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
- Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague ...
- VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.

Intended answer: Blockbuster

Slide 3: Ask Jeeves – filtering with POS tags

Query: What do you call a successful movie? (call used as a verb)

The same results; those where call or successful carries the wrong part of speech can be filtered out:
- Tips on Being a Successful Movie Vampire ... I shall call the police.
- Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
- Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague ...
- VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.
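A rough sketch of this kind of filtering (my illustration, not Ask Jeeves' actual pipeline) using NLTK's off-the-shelf tagger, with the result snippets shortened from the list above:

# Rough sketch of POS-based filtering with NLTK's tagger.
# Requires: pip install nltk, plus the punkt and
# averaged_perceptron_tagger resources via nltk.download().
import nltk

QUERY = "What do you call a successful movie?"
RESULTS = [
    "I shall call the police.",
    "Successful Casting Call & Shoot for Clash of Empires.",
    "I wouldn't go so far as to call it successful.",
]

def coarse_tag(word, sentence):
    """First letter of the Penn Treebank tag of `word` in `sentence`."""
    for token, tag in nltk.pos_tag(nltk.word_tokenize(sentence)):
        if token.lower() == word:
            return tag[0]        # 'V' for verbs, 'N' for nouns, ...
    return None

want = coarse_tag('call', QUERY)                  # 'V': call as a verb
kept = [r for r in RESULTS if coarse_tag('call', r) == want]
print(kept)  # drops "Casting Call" (noun); "call the police" survives,
             # since POS alone cannot distinguish it -- see the next slide
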

Slide 4: Filtering out "call the police"

call(you, movie, what) ≠ call(you, police)

These are different senses, with:
- different syntax
- different participants: (you, movie, what) vs. (you, police)

Slide 5: "Meaning"

Shallow semantic annotation:
- captures critical dependencies
- highlights participants
- reflects clear sense distinctions

It supports training of supervised automatic taggers, and it works for other languages.

Slide 6: Cornerstone: an English lexical resource

A resource that provides sets of possible syntactic frames for verbs, and clear, replicable sense distinctions.

Ask Jeeves: Who do you call for a good electronic lexical database for English?

Slide 7: WordNet – Princeton (Miller 1985, Fellbaum 1998)

An on-line lexical reference (dictionary):
- nouns, verbs, adjectives, and adverbs grouped into synonym sets (synsets)
- other relations include hypernyms (ISA), antonyms, meronyms

Slide 8: WordNet – http://www.cogsci.princeton.edu/~wn/

Slide 9: WordNet – call, 28 senses

1. name, call -- (assign a specified, proper name to; "They named their son David"; ...) -> LABEL
2. call, telephone, call up, phone, ring -- (get or try to get into communication (with someone) by telephone; "I tried to call you all night"; ...) -> TELECOMMUNICATE
3. call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard"; ...) -> LABEL
4. call, send for -- (order, request, or command to come; "She was called into the director's office"; "Call the police!") -> ORDER
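These senses can be browsed programmatically. A minimal sketch with NLTK's WordNet interface (note that NLTK ships a later WordNet release, so the sense count and ordering may differ from the 28 senses the slide cites):

# Minimal sketch: browsing WordNet verb senses of "call" with NLTK.
# Requires: pip install nltk, then nltk.download('wordnet').
from nltk.corpus import wordnet as wn

senses = wn.synsets('call', pos=wn.VERB)
print(len(senses), 'verb senses of "call"')

for s in senses[:4]:
    lemmas = ', '.join(l.name() for l in s.lemmas())
    hyper = s.hypernyms()
    parent = hyper[0].lemma_names()[0].upper() if hyper else '(root)'
    print(f'{s.name()}: [{lemmas}] -- {s.definition()} -> {parent}')
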

Slide 10: WordNet – Princeton (Miller 1985, Fellbaum 1998)

Limitations as a computational lexicon:
- contains little syntactic information (Comlex has syntax but no sense distinctions)
- no explicit lists of participants
- sense distinctions are very fine-grained
- definitions are often vague

This causes problems when creating training data for supervised machine learning. In Senseval-2, for verbs with more than 16 senses (including call), inter-annotator agreement (ITA) was 73% and automatic word sense disambiguation (WSD) reached 60.2% (Dang & Palmer, SIGLEX 2002).

Slide 11: WordNet – call, 28 senses

[Diagram: the 28 WordNet senses of call, WN1 through WN28, scattered without any grouping.]

Slide 12: WordNet – call, 28 senses, Senseval-2 groups (engineering!)

[Diagram: the same 28 senses, now clustered into Senseval-2 groups labeled: Loud cry; Label; Phone/radio; Bird or animal cry; Request; Call a loan/bond; Visit; Challenge; Bid.]

Slide 13: Grouping improved scores: ITA 82%, MaxEnt WSD 69%

For call, 31% of errors were due to confusion between senses within the same group 1:
- name, call -- (assign a specified, proper name to; "They named their son David")
- call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard")
- call -- (consider or regard as being; "I would not call her beautiful")

75% accuracy with training and testing on grouped senses vs. 43% with training and testing on fine-grained senses. (Palmer, Dang, Fellbaum, submitted, NLE)
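The mechanics of group-level scoring are simple: map each fine-grained sense to its group and compare at that level. A toy sketch (the sense-to-group mapping and the predictions below are invented for illustration):

# Toy illustration of fine-grained vs. grouped WSD scoring.
# The sense-to-group mapping and the predictions are invented.
GROUP = {'WN1': 'Label', 'WN3': 'Label', 'WN22': 'Label',
         'WN2': 'Phone/radio', 'WN13': 'Phone/radio', 'WN28': 'Phone/radio'}

gold = ['WN1', 'WN3', 'WN2', 'WN13']
pred = ['WN3', 'WN3', 'WN2', 'WN28']  # errors stay within the gold group

fine = sum(g == p for g, p in zip(gold, pred)) / len(gold)
grouped = sum(GROUP[g] == GROUP[p] for g, p in zip(gold, pred)) / len(gold)
print(f'fine-grained: {fine:.0%}, grouped: {grouped:.0%}')  # 50% vs 100%
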

Slide 14: Proposition Bank: from sentences to propositions (predicates!)

Powell met Zhu Rongji → Proposition: meet(Powell, Zhu Rongji)

All of these variants map to the same proposition, meet(Somebody1, Somebody2):
- Powell met with Zhu Rongji
- Powell and Zhu Rongji met
- Powell and Zhu Rongji had a meeting
- ...

Other verbs with the same behavior: debate, consult, join, wrestle, battle.

When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
→ meet(Powell, Zhu)
→ discuss([Powell, Zhu], return(X, plane))
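Such nested propositions fit naturally into a small data structure; a hypothetical in-memory representation (PropBank's actual annotations are distributed in their own file format):

# A hypothetical in-memory representation of nested propositions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    predicate: str
    args: tuple  # each arg: a string, a tuple of strings, or a Proposition

meet = Proposition('meet', ('Powell', 'Zhu'))
discuss = Proposition('discuss',
                      (('Powell', 'Zhu'),              # conjoined arg
                       Proposition('return', ('X', 'plane'))))
print(discuss)
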

Slide 15: Capturing semantic roles*

Richard broke [ARG1 the laser pointer].
[ARG1 The windows] were broken by the hurricane.
[ARG1 The vase] broke into pieces when it toppled over.

The same role (Arg1, the thing broken) surfaces as the object in the first sentence and as the grammatical subject (SUBJ) in the other two.

*See also FrameNet, http://www.icsi.berkeley.edu/~framenet/

Slide 16: Word senses in PropBank

Ignoring word sense entirely was not feasible for 700+ verbs:
- Mary left the room
- Mary left her daughter-in-law her pearls in her will

Frameset leave.01 "move away from": Arg0: entity leaving, Arg1: place left
Frameset leave.02 "give": Arg0: giver, Arg1: thing given, Arg2: beneficiary

How do these relate to traditional word senses in VerbNet and WordNet?
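A sketch of how such a frameset inventory could be represented and applied to the second example sentence (a hypothetical structure; PropBank distributes its real frame definitions in its own frame files):

# Hypothetical frameset inventory for "leave".
FRAMESETS = {
    'leave.01': {'gloss': 'move away from',
                 'roles': {'Arg0': 'entity leaving', 'Arg1': 'place left'}},
    'leave.02': {'gloss': 'give',
                 'roles': {'Arg0': 'giver', 'Arg1': 'thing given',
                           'Arg2': 'beneficiary'}},
}

# "Mary left the room"                       -> leave.01
# "Mary left her daughter-in-law her pearls" -> leave.02
labeled = {'frameset': 'leave.02', 'Arg0': 'Mary',
           'Arg1': 'her pearls', 'Arg2': 'her daughter-in-law'}
roles = FRAMESETS[labeled['frameset']]['roles']
for arg, desc in roles.items():
    print(f'{arg} ({desc}): {labeled[arg]}')
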

Slide 17: WordNet – call, 28 senses, groups

[Diagram repeated from Slide 12: the 28 senses of call in their Senseval-2 groups (Loud cry; Label; Phone/radio; Bird or animal cry; Request; Call a loan/bond; Visit; Challenge; Bid).]

Slide 18: Overlap with PropBank framesets

[Diagram: the same sense groups for call, now shown against the PropBank framesets they fall under.]

Slide 19: Overlap between Senseval-2 groups and framesets – 95%

[Diagram for the verb develop: its WordNet senses (WN1–WN14, WN19, WN20) partitioned between Frameset1 and Frameset2.]

Palmer, Babko-Malaya, Dang, SNLU 2004

Slide 20: Sense hierarchy

- PropBank framesets – coarse-grained distinctions. For the 20 Senseval-2 verbs with more than one frameset, a MaxEnt WSD system reached 90% accuracy against a 73.5% baseline.
- Sense groups (Senseval-2) – intermediate level (includes Levin classes): 69%
- WordNet – fine-grained distinctions: 60.2%

Slide 21: Maximum entropy WSD

Hoa Dang's system was the best performer on verbs. Maximum entropy framework: p(sense | context).

Contextual linguistic features:
- Topical feature for W (+2.5%): keywords (determined automatically)
- Local syntactic features for W (+1.5 to +5%): presence of subject, complements, passive?; words in subject and complement positions, particles, prepositions, etc.
- Local semantic features for W (+6%): semantic class information from WordNet (synsets, etc.); Named Entity tag (PERSON, LOCATION, ...) for proper nouns; words within a +/- 2 word window
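Maximum entropy classification over such features is equivalent to multinomial logistic regression, so a minimal sketch of this style of WSD system can be assembled with scikit-learn. The feature extractor below implements only the +/- 2 word window, and the training examples are invented:

# Minimal maximum-entropy WSD sketch using scikit-learn's logistic
# regression (a maxent model). Toy features and invented training data.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def features(tokens, i):
    """Local-context features for the target word tokens[i]."""
    feats = {'word': tokens[i].lower()}
    for off in (-2, -1, 1, 2):            # +/- 2 word window
        j = i + off
        if 0 <= j < len(tokens):
            feats[f'w{off:+d}'] = tokens[j].lower()
    return feats

train = [
    ('I will call you tonight'.split(), 2, 'telephone'),
    ('Please call the office tomorrow'.split(), 1, 'telephone'),
    ('They call him a hero'.split(), 1, 'label'),
    ('I would not call her beautiful'.split(), 3, 'label'),
]
X = [features(toks, i) for toks, i, _ in train]
y = [sense for _, _, sense in train]

model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X, y)
test = 'Did you call the police'.split()
print(model.predict([features(test, 2)]))  # e.g. ['telephone']
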

Slide 22: Context sensitivity

Programming languages are context-free. Are natural languages context-sensitive? Candidate evidence:
- movement
- features
- respectively: John, Mary and Bill ate peaches, pears and apples, respectively.

Slide 23: The Chomsky grammar hierarchy

- Regular grammars (e.g. aabbbb): S → aS | nil | bS
- Context-free grammars (e.g. aaabbb): S → aSb | nil
- Context-sensitive grammars (e.g. aaabbbccc): xSy → xby
- Turing machines
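To make the first two levels concrete, a small sketch (my own illustration): strings of a's followed by b's form a regular language, while a^n b^n with matched counts requires counting, which is context-free but not regular:

import re

# a*b*: regular -- recognizable by a finite automaton (here, a regex).
def in_regular_lang(s):
    return re.fullmatch(r'a*b*', s) is not None

# a^n b^n: context-free but not regular -- the recognizer must count,
# the job a pushdown automaton's stack does.
def in_anbn(s):
    n = len(s) // 2
    return len(s) % 2 == 0 and s == 'a' * n + 'b' * n

print(in_regular_lang('aabbbb'), in_anbn('aabbbb'))  # True False
print(in_anbn('aaabbb'))                             # True
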

Slide 24: Recursive transition nets for CFGs

s :- np, vp.
np :- pronoun ; noun ; det, adj, noun ; np, pp.

[Diagram: two transition nets – an S net (states S1, S2, S3) traversed by an NP arc then a VP arc, and an NP net (states S4, S5, S6) with arcs labeled det, adj, noun, pronoun, and pp.]
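A direct recursive-descent rendering of these rules as a recognizer (a sketch: the lexicon and the vp/pp rules are my additions, since the slide defines only s and np). The left-recursive "np, pp" alternative is rewritten as a base NP followed by any number of PPs, because naive recursive descent loops on left recursion:

# Sketch of a recursive-descent recognizer for the toy grammar above.
LEXICON = {'the': 'det', 'big': 'adj', 'dog': 'noun', 'park': 'noun',
           'she': 'pronoun', 'saw': 'verb', 'in': 'prep'}

def word(tokens, i, pos):
    """Consume tokens[i] if it has part of speech `pos`; return next index."""
    if i is not None and i < len(tokens) and LEXICON.get(tokens[i]) == pos:
        return i + 1
    return None

def np(tokens, i):
    # np :- pronoun ; noun ; det, adj, noun ; np, pp.
    j = word(tokens, i, 'pronoun') or word(tokens, i, 'noun')
    if j is None:
        j = word(tokens, word(tokens, word(tokens, i, 'det'), 'adj'), 'noun')
    while j is not None:                 # zero or more trailing PPs
        k = pp(tokens, j)
        if k is None:
            return j
        j = k
    return None

def pp(tokens, i):                       # assumed: pp :- prep, np.
    j = word(tokens, i, 'prep')
    return np(tokens, j) if j is not None else None

def vp(tokens, i):                       # assumed: vp :- verb, np ; verb.
    j = word(tokens, i, 'verb')
    if j is None:
        return None
    return np(tokens, j) or j

def s(tokens):                           # s :- np, vp.
    i = np(tokens, 0)
    i = vp(tokens, i) if i is not None else None
    return i == len(tokens)

print(s('she saw the big dog in the big park'.split()))  # True
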

Slide 25: Most parsers are Turing machines

- to give a more natural and comprehensible treatment of movement
- for a more efficient treatment of features
- not because of respectively – most parsers can't handle it

Slide 26: Nested dependencies and crossing dependencies

Crossing (context-sensitive): John, Mary and Bill ate peaches, pears and apples, respectively.
Nested (context-free): The dog chased the cat that bit the mouse that ran.
Nested, center-embedded (context-free): The mouse the cat the dog chased bit ran.

Slide 27: Movement

What did John give to Mary?
*Where did John give to Mary?

John gave cookies to Mary.
John gave to Mary.

(In the wh-question, the object of give has moved to the front, leaving a gap: "John gave ___ to Mary.")

Slide 28: Handling movement: hold registers / slash categories

S :- Wh, S/NP
S/NP :- VP
S/NP :- NP, VP/NP
VP/NP :- Verb
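A minimal sketch of these slash categories as a recognizer (my illustration; the lexicon and the plain VP rule are assumed, and there is no backtracking). An X/NP constituent is parsed like X but with one NP gap licensed:

# Sketch of slash categories for wh-movement. The lexicon and the
# plain VP rule are assumptions; the slide defines only the rules above.
LEXICON = {'what': 'wh', 'who': 'wh', 'john': 'np', 'mary': 'np',
           'saw': 'verb'}

def word(tokens, i, pos):
    if i < len(tokens) and LEXICON.get(tokens[i].lower()) == pos:
        return i + 1
    return None

def vp(tokens, i):                    # assumed: VP :- Verb, NP
    j = word(tokens, i, 'verb')
    return word(tokens, j, 'np') if j is not None else None

def vp_slash_np(tokens, i):           # VP/NP :- Verb   (object NP is the gap)
    return word(tokens, i, 'verb')

def s_slash_np(tokens, i):
    j = vp(tokens, i)                 # S/NP :- VP        (subject is the gap)
    if j is not None:
        return j
    j = word(tokens, i, 'np')         # S/NP :- NP, VP/NP (object is the gap)
    return vp_slash_np(tokens, j) if j is not None else None

def s(tokens):                        # S :- Wh, S/NP
    j = word(tokens, 0, 'wh')
    return j is not None and s_slash_np(tokens, j) == len(tokens)

print(s('who saw mary'.split()), s('what john saw'.split()))  # True True
print(s('what john saw mary'.split()))                        # False
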

