Lecture 21 Computational Lexical Semantics
Topics: Features in NLTK III; Computational Lexical Semantics; Semantic Web USC
Readings: NLTK book Chapter 10; Text Chapter 20
April 3, 2013 CSCE 771 Natural Language Processing

– 2 – CSCE 771 Spring 2013 Overview
Last Time (Programming): Features in NLTK; NL queries → SQL; NLTK support for interpretations and models; propositional and predicate logic support; Prover9
Today: Last lecture's slides; Features in NLTK; Computational Lexical Semantics
Readings: Text Chapters 19, 20; NLTK Book Chapter 10
Next Time: Computational Lexical Semantics II

– 3 – CSCE 771 Spring 2013 Model Building in NLTK - Chapter 10 continued
Mace model builder

lp = nltk.LogicParser()
# install Mace4 and point NLTK at it (raw string so backslashes survive)
config_mace4(r'c:\Python26\Lib\site-packages\prover9')
a3 = lp.parse('exists x.(man(x) & walks(x))')
c1 = lp.parse('mortal(socrates)')
c2 = lp.parse('-mortal(socrates)')
mb = nltk.Mace(5)
print mb.build_model(None, [a3, c1])   # True  -- consistent, model found
print mb.build_model(None, [a3, c2])   # True  -- consistent, model found
print mb.build_model(None, [c1, c2])   # False -- contradictory, no model

– 4 – CSCE 771 Spring 2013 >>> a4 = lp.parse('exists y. (woman(y) & all x. (man(x) -> love(x,y)))') >>> a5 = lp.parse('man(adam)') >>> a6 = lp.parse('woman(eve)') >>> g = lp.parse('love(adam,eve)') >>> mc = nltk.MaceCommand(g, assumptions=[a4, a5, a6]) >>> mc.build_model() True

– 5 – CSCE 771 Spring 2013 The Semantics of English Sentences
Principle of compositionality -- the meaning of a complex expression is determined by the meanings of its parts and the way they are combined.

– 6 – CSCE 771 Spring 2013 Representing the λ-Calculus in NLTK
(33) a. (walk(x) ∧ chew_gum(x))
     b. λx.(walk(x) ∧ chew_gum(x))
     c. \x.(walk(x) & chew_gum(x)) -- the NLTK way!

– 7 – CSCE 771 Spring 2013 Lambda0.py

import nltk
lp = nltk.LogicParser()
e = lp.parse(r'\x.(walk(x) & chew_gum(x))')
print e            # \x.(walk(x) & chew_gum(x))
print e.free()     # no free variables -- x is bound by the lambda
print lp.parse(r'\x.(walk(x) & chew_gum(y))')
                   # \x.(walk(x) & chew_gum(y)) -- y stays free

– 8 – CSCE 771 Spring 2013 Simple β-reductions >>> e = lp.parse(r'\x.(walk(x) & chew_gum(x))(gerald)') >>> print e \x.(walk(x) & chew_gum(x))(gerald) >>> print e.simplify() [1] (walk(gerald) & chew_gum(gerald))

– 9 – CSCE 771 Spring 2013 Predicate reductions >>> e3 = lp.parse('\P.exists x.P(x)(\y.see(y, x))') >>> print e3 (\P.exists x.P(x))(\y.see(y,x)) >>> print e3.simplify() exists z1.see(z1,x)

– 10 – CSCE 771 Spring 2013 Figure 19.7 Inheritance of Properties
∃e,x,y Eating(e) ∧ Agent(e,x) ∧ Theme(e,y)
"hamburger edible?" from WordNet
Copyright ©2009 by Pearson Education, Inc., Upper Saddle River, New Jersey. All rights reserved. Speech and Language Processing, Second Edition, Daniel Jurafsky and James H. Martin

– 11 – CSCE 771 Spring 2013 Figure 20.1 Possible sense tags for bass
Chapter 20 – Word Sense Disambiguation (WSD)
Machine translation
Supervised vs. unsupervised learning
Semantic concordance – corpus with words tagged with sense tags

– 12 – CSCE 771 Spring 2013 Feature Extraction for WSD
Feature vectors
Collocation: [w_{i-2}, POS_{i-2}, w_{i-1}, POS_{i-1}, w_i, POS_i, w_{i+1}, POS_{i+1}, w_{i+2}, POS_{i+2}]
Bag-of-words – unordered set of neighboring words; represent the most frequent content words with a membership vector, e.g. [0,0,1,0,0,0,1] – the 3rd and 7th most frequent content words occur nearby
Window of nearby words/features
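The two feature types above can be sketched in Python; the window size, the (word, POS) token format, and the tiny vocabulary are illustrative assumptions, not the textbook's exact setup:

```python
def collocation_features(tokens, i, window=2):
    """(word, POS) pairs in a fixed window around position i."""
    feats = []
    for j in range(i - window, i + window + 1):
        if 0 <= j < len(tokens):
            feats.append(tokens[j])
        else:
            feats.append(('<pad>', '<pad>'))  # pad at sentence edges
    return feats

def bow_vector(tokens, i, vocab, window=2):
    """Membership vector over a fixed content-word vocabulary."""
    nearby = {w for j, (w, _) in enumerate(tokens)
              if j != i and abs(j - i) <= window}
    return [1 if v in nearby else 0 for v in vocab]

sent = [('an', 'DT'), ('electric', 'JJ'), ('guitar', 'NN'),
        ('and', 'CC'), ('bass', 'NN'), ('player', 'NN'), ('stand', 'VB')]
print(collocation_features(sent, 4))
print(bow_vector(sent, 4, ['fish', 'guitar', 'player']))  # [0, 1, 1]
```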

– 13 – CSCE 771 Spring 2013 Naïve Bayes Classifier
w – word vector; s – sense tag vector; f – feature vector [w_i, POS_i] for i = 1, …, n
ŝ = argmax_{s∈S} P(s | f)
Approximate by frequency counts. But how practical?

– 14 – CSCE 771 Spring 2013 Looking for a practical formula
By Bayes' rule: ŝ = argmax_s P(f | s) P(s) / P(f) = argmax_s P(f | s) P(s)
Still not practical: there are far too many distinct feature vectors f to estimate P(f | s) directly.

– 15 – CSCE 771 Spring 2013 Naïve == Assume Independence
P(f | s) ≈ ∏_{j=1}^{n} P(f_j | s), so ŝ = argmax_s P(s) ∏_{j=1}^{n} P(f_j | s)
Now practical, but realistic?

– 16 – CSCE 771 Spring 2013 Training = count frequencies
Maximum likelihood estimators (20.8):
P(s_i) = count(s_i, w_j) / count(w_j)
P(f_j | s) = count(f_j, s) / count(s)
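The whole classifier can be sketched from these counts; the toy training data and the add-one smoothing are my assumptions (the slide's equations use raw MLE counts, which would zero out unseen features):

```python
from collections import defaultdict
import math

def train_nb(examples):
    """examples: list of (feature_list, sense). Returns count tables."""
    sense_count = defaultdict(int)
    feat_count = defaultdict(int)   # (feature, sense) -> count
    vocab = set()
    for feats, sense in examples:
        sense_count[sense] += 1
        for f in feats:
            feat_count[(f, sense)] += 1
            vocab.add(f)
    return sense_count, feat_count, vocab

def classify(feats, sense_count, feat_count, vocab):
    """argmax_s  log P(s) + sum_j log P(f_j | s), with add-one smoothing."""
    total = sum(sense_count.values())
    best, best_lp = None, float('-inf')
    for s, c in sense_count.items():
        lp = math.log(c / total)
        for f in feats:
            lp += math.log((feat_count[(f, s)] + 1) / (c + len(vocab)))
        if lp > best_lp:
            best, best_lp = s, lp
    return best

train = [(['play', 'guitar'], 'bass/music'),
         (['deep', 'sound'], 'bass/music'),
         (['catch', 'river'], 'bass/fish'),
         (['fishing', 'river'], 'bass/fish')]
model = train_nb(train)
print(classify(['river', 'catch'], *model))  # bass/fish
```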

– 17 – CSCE 771 Spring 2013 Decision List Classifiers
Naïve Bayes → hard for humans to examine its decisions and understand why
Decision list classifiers – like a "case" statement: a sequence of (test, returned-sense-tag) pairs, tried in order
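The "case statement" view can be sketched as an ordered list of (test, sense) pairs where the first matching test wins; the rules and default below are invented for illustration:

```python
def make_rule(word, sense):
    # hypothetical collocation test: is `word` in the context window?
    return (lambda context: word in context), sense

rules = [make_rule('fish', 'bass/fish'),
         make_rule('river', 'bass/fish'),
         make_rule('guitar', 'bass/music'),
         make_rule('play', 'bass/music')]

def decide(context, rules, default='bass/music'):
    for test, sense in rules:   # first matching test wins
        if test(context):
            return sense
    return default              # fall through to the default sense

print(decide(['caught', 'a', 'fish'], rules))  # bass/fish
```

Because each rule is a legible test, a human can read the list top to bottom and see exactly why a sense was chosen, unlike the summed log-probabilities of Naïve Bayes.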

– 18 – CSCE 771 Spring 2013 Figure 20.2 Decision List Classifier Rules

– 19 – CSCE 771 Spring 2013 WSD Evaluation, baselines, ceilings
Extrinsic evaluation – evaluating WSD embedded in end-to-end applications (in vivo)
Intrinsic evaluation – evaluating WSD by itself (in vitro): sense accuracy
Corpora – SemCor, SENSEVAL, SEMEVAL
Baseline – most frequent sense (WordNet sense 1)
Ceiling – gold standard: human experts with discussion and agreement
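The most-frequent-sense baseline can be sketched on toy sense-tagged data (the (word, sense) pairs below are invented; with WordNet the baseline is simply sense 1):

```python
from collections import Counter

def mfs_baseline(train, test):
    """Predict each word's most frequent training sense; return accuracy."""
    by_word = {}
    for word, sense in train:
        by_word.setdefault(word, Counter())[sense] += 1
    correct = 0
    for word, gold in test:
        pred = by_word[word].most_common(1)[0][0]
        correct += (pred == gold)
    return correct / len(test)

train = [('bass', 'fish'), ('bass', 'fish'), ('bass', 'music'),
         ('plant', 'factory'), ('plant', 'factory'), ('plant', 'flora')]
test = [('bass', 'fish'), ('bass', 'music'), ('plant', 'factory')]
print(mfs_baseline(train, test))  # 2 of 3 correct
```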

– 20 – CSCE 771 Spring 2013 Figure 20.3 Simplified Lesk Algorithm gloss/sentence overlap

– 21 – CSCE 771 Spring 2013 Simplified Lesk example The bank can guarantee deposits will eventually cover future tuition costs because it invests in adjustable rate mortgage securities.
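Simplified Lesk on this sentence can be sketched with a hand-coded two-sense inventory; the glosses below only paraphrase WordNet-style definitions and the stoplist is an assumption:

```python
STOP = {'a', 'an', 'the', 'of', 'in', 'or', 'to', 'that', 'it',
        'will', 'and', 'as', 'such'}

# hypothetical mini sense inventory for "bank"
SENSES = {
    'bank1': 'a financial institution that accepts deposits and channels '
             'the money into lending activities',
    'bank2': 'sloping land beside a body of water such as a river',
}

def content_words(text):
    return {w for w in text.lower().split() if w not in STOP}

def simplified_lesk(senses, sentence):
    context = content_words(sentence)
    # choose the sense whose gloss overlaps the sentence context most
    return max(senses, key=lambda s: len(content_words(senses[s]) & context))

sent = ('The bank can guarantee deposits will eventually cover future '
        'tuition costs because it invests in adjustable rate mortgage '
        'securities')
print(simplified_lesk(SENSES, sent))  # bank1 -- "deposits" overlaps
```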

– 22 – CSCE 771 Spring 2013 SENSEVAL competitions
Check the Senseval-3 website.

– 23 – CSCE 771 Spring 2013 Corpus Lesk
Weights applied to overlap words: inverse document frequency
idf_i = log(N_docs / nd_i), where nd_i = number of documents containing w_i
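The idf weight falls straight out of that formula; the toy document collection is invented:

```python
import math

def idf(word, docs):
    """idf_i = log(N_docs / nd_i) over a list of tokenized documents."""
    nd = sum(1 for d in docs if word in d)
    return math.log(len(docs) / nd)

docs = [['the', 'bank', 'river'],
        ['the', 'bank', 'loan'],
        ['the', 'river', 'boat'],
        ['the', 'loan', 'rate']]
print(idf('the', docs))    # 0.0 -- occurs everywhere, carries no weight
print(idf('river', docs))  # log(4/2): rarer words weigh more in the overlap
```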

– 24 – CSCE 771 Spring 2013 Selectional Restrictions and Preferences

– 25 – CSCE 771 Spring 2013 WordNet semantic classes of objects

– 26 – CSCE 771 Spring 2013 Minimally Supervised WSD: Bootstrapping
Yarowsky algorithm
Heuristics:
1. one sense per collocation
2. one sense per discourse
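The bootstrapping loop can be sketched as: seed a few labels via one-sense-per-collocation, gather collocation evidence from the labeled set, then label confident unlabeled examples and repeat. The seeds, contexts, and "single unambiguous cue" confidence rule below are illustrative; Yarowsky's actual algorithm trains a decision list at each round:

```python
def bootstrap(contexts, seeds, rounds=3):
    """contexts: list of word sets. seeds: seed_word -> sense."""
    labels = {}
    for i, ctx in enumerate(contexts):          # seed labeling
        for word, sense in seeds.items():
            if word in ctx:
                labels[i] = sense
    for _ in range(rounds):
        # collect collocation evidence from currently labeled examples
        evidence = {}
        for i, sense in labels.items():
            for w in contexts[i]:
                evidence.setdefault(w, set()).add(sense)
        # one sense per collocation: only unambiguous cues may label
        cues = {w: next(iter(s)) for w, s in evidence.items() if len(s) == 1}
        for i, ctx in enumerate(contexts):
            if i not in labels:
                hits = {cues[w] for w in ctx if w in cues}
                if len(hits) == 1:              # confident: one sense cued
                    labels[i] = hits.pop()
    return labels

contexts = [{'plant', 'life', 'animal'},
            {'plant', 'manufacturing', 'car'},
            {'plant', 'species', 'life'},
            {'plant', 'car', 'assembly'}]
seeds = {'life': 'flora', 'manufacturing': 'factory'}
print(bootstrap(contexts, seeds))
```

Note how 'plant' itself occurs with both senses, so it never becomes a cue, while 'car', seen only in factory contexts, propagates the factory label to the last example.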

– 27 – CSCE 771 Spring 2013 Figure 20.4 Two senses of plant

– 28 – CSCE 771 Spring 2013 Figure 20.5

– 29 – CSCE 771 Spring 2013 Figure 20.6 Path Based Similarity

– 30 – CSCE 771 Spring 2013 Figure 20.6 Path Based Similarity
sim_path(c_1, c_2) = 1 / pathlen(c_1, c_2)
where pathlen(c_1, c_2) is the shortest path between c_1 and c_2 counted in nodes (edge count + 1)
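Path similarity can be sketched over a toy is-a taxonomy with a breadth-first shortest path; the hierarchy fragment is invented, and pathlen is counted in nodes (edges + 1) per the formula above:

```python
from collections import deque

# hypothetical is-a fragment: child -> parent
PARENT = {'nickel': 'coin', 'dime': 'coin', 'coin': 'money',
          'money': 'medium_of_exchange', 'credit': 'medium_of_exchange'}

def neighbors(c):
    """Parent and children of concept c in the taxonomy."""
    ns = set()
    if c in PARENT:
        ns.add(PARENT[c])
    ns |= {ch for ch, p in PARENT.items() if p == c}
    return ns

def pathlen(c1, c2):
    """Shortest path in nodes (= edges + 1) via BFS."""
    frontier, seen = deque([(c1, 1)]), {c1}
    while frontier:
        node, length = frontier.popleft()
        if node == c2:
            return length
        for n in neighbors(node) - seen:
            seen.add(n)
            frontier.append((n, length + 1))
    return None

def sim_path(c1, c2):
    return 1.0 / pathlen(c1, c2)

print(sim_path('nickel', 'dime'))    # siblings via coin -> 1/3
print(sim_path('nickel', 'credit'))  # longer path -> lower similarity
```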

– 31 – CSCE 771 Spring 2013 Information Content word similarity
IC(c) = -log P(c)
sim_resnik(c_1, c_2) = -log P(LCS(c_1, c_2)), the information content of the lowest common subsumer
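Resnik similarity can be sketched with toy concept probabilities; the P(c) values and hierarchy are invented, whereas in practice P(c) comes from corpus counts over the WordNet hierarchy as in Figure 20.7:

```python
import math

# hypothetical P(c) values: probability grows toward the root
P = {'entity': 1.0, 'money': 0.02, 'coin': 0.005,
     'nickel': 0.001, 'dime': 0.001}
PARENT = {'nickel': 'coin', 'dime': 'coin', 'coin': 'money',
          'money': 'entity'}

def ancestors(c):
    out = [c]
    while c in PARENT:
        c = PARENT[c]
        out.append(c)
    return out

def lcs(c1, c2):
    """Lowest common subsumer: first shared ancestor walking up from c1."""
    a2 = set(ancestors(c2))
    for a in ancestors(c1):
        if a in a2:
            return a

def sim_resnik(c1, c2):
    return -math.log(P[lcs(c1, c2)])   # IC of the lowest common subsumer

print(lcs('nickel', 'dime'))           # coin
print(sim_resnik('nickel', 'dime'))    # -log 0.005
```

Because the LCS of two near-root concepts has high P(c), its IC is low, so unrelated words correctly get low similarity.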

– 32 – CSCE 771 Spring 2013 Figure 20.7 Wordnet with P(c) values

– 33 – CSCE 771 Spring 2013 Figure 20.8

– 34 – CSCE 771 Spring 2013

– 35 – CSCE 771 Spring 2013 Figure 20.9

– 36 – CSCE 771 Spring 2013 Figure 20.10

– 37 – CSCE 771 Spring 2013 Figure 20.11

– 38 – CSCE 771 Spring 2013 Figure 20.12

– 39 – CSCE 771 Spring 2013 Figure 20.13

– 40 – CSCE 771 Spring 2013 Figure 20.14

– 41 – CSCE 771 Spring 2013 Figure 20.15

– 42 – CSCE 771 Spring 2013 Figure 20.16