Word Sense and Subjectivity Jan Wiebe Rada Mihalcea University of Pittsburgh University of North Texas.

Introduction  Growing interest in the automatic extraction of opinions, emotions, and sentiments in text (subjectivity)

Subjectivity Analysis: Applications  Opinion-oriented question answering: How do the Chinese regard the human rights record of the United States?  Product review mining: What features of the ThinkPad T43 do customers like and which do they dislike?  Review classification: Is a review positive or negative toward the movie?  Tracking emotions toward topics over time: Is anger ratcheting up or cooling down toward an issue or event?  Etc.

Introduction  Continuing interest in word sense –Sense-annotated resources being developed for many languages –Active participation in evaluations such as SENSEVAL

Word Sense and Subjectivity  Though both are concerned with text meaning, they have mainly been investigated independently

Subjectivity Labels on Senses Alarm, dismay, consternation – (fear resulting from the awareness of danger) Alarm, warning device, alarm system – (a device that signals the occurrence of some undesirable event) S O

Subjectivity Labels on Senses Interest, involvement -- (a sense of concern with and curiosity about someone or something; "an interest in music") Interest -- (a fixed charge for borrowing money; usually a percentage of the amount borrowed; "how much interest do you pay on your mortgage?") S O

WSD using Subjectivity Tagging  The notes do not pay interest. → Sense 1 “a fixed charge for borrowing money” (O)  He spins a riveting plot which grabs and holds the reader’s interest. → Sense 4 “a sense of concern with and curiosity about someone or something” (S)  The WSD System must decide between Sense 1 and Sense 4; a Subjectivity Classifier first tags each sentence S or O

Subjectivity Tagging using WSD  The notes do not pay interest. → WSD System: Sense 1 “a fixed charge for borrowing money” → tag O  He spins a riveting plot which grabs and holds the reader’s interest. → WSD System: Sense 4 “a sense of concern with and curiosity about someone or something” → tag S

Goals  Explore interactions between word sense and subjectivity –Can subjectivity labels be assigned to word senses? »Manually »Automatically –Can subjectivity analysis improve word sense disambiguation? –Can word sense disambiguation improve subjectivity analysis? (future work)

Outline  Motivation and Goals  Assigning Subjectivity Labels to Word Senses –Manually –Automatically  Word Sense Disambiguation using Automatic Subjectivity Analysis  Conclusions

Prior Work on Subjectivity Tagging  Identifying words and phrases associated with subjectivity –Think ~ private state; Beautiful ~ positive sentiment »Hatzivassiloglou & McKeown 1997; Wiebe 2000; Kamps & Marx 2002; Turney 2002; Esuli & Sebastiani 2005; etc.  Subjectivity classification of sentences, clauses, phrases, or word instances in context –subjective/objective; positive/negative/neutral »Riloff & Wiebe 2003; Yu & Hatzivassiloglou 2003; Dave et al 2003; Hu & Liu 2004; Kim & Hovy 2004; etc.  Here: subjectivity labels are applied to word senses

Outline  Motivation and Goals  Assigning Subjectivity Labels to Word Senses –Manually –Automatically  Word Sense Disambiguation using Automatic Subjectivity Analysis  Conclusions

Annotation Scheme  Assigning subjectivity labels to WordNet senses –S: subjective –O: objective –B: both

Annotators are given the synset and its hypernym Alarm, dismay, consternation – (fear resulting from the awareness of danger) –Fear, fearfulness, fright – (an emotion experienced in anticipation of some specific pain or danger (usually accompanied by a desire to flee or fight)) S

Subjective Sense Definition  When the sense is used in a text or conversation, we expect it to express subjectivity, and we expect the phrase/sentence containing it to be subjective.

Objective Senses: Observation  We don’t necessarily expect phrases/sentences containing objective senses to be objective –Would you actually be stupid enough to pay that rate of interest? –Will someone shut that darn alarm off?  Subjective, but not due to interest or alarm

Objective Sense Definition  When the sense is used in a text or conversation, we don’t expect it to express subjectivity and, if the phrase/sentence containing it is subjective, the subjectivity is due to something else.

Senses that are Both  Covers both subjective and objective usages  Example: absorb, suck, imbibe, soak up, sop up, suck up, draw, take in, take up – (take in, also metaphorically; “The sponge absorbs water well”; “She drew strength from the Minister’s words”)

Annotated Data  64 words; 354 senses –Balanced subset [32 words; 138 senses]; 2 judges –The ambiguous nouns of the SENSEVAL-3 English Lexical Task [20 words; 117 senses]; 2 judges »[Mihalcea, Chklovski & Kilgarriff, 2004] –Others [12 words; 99 senses]; 1 judge

Annotated Data: Agreement Study  64 words; 354 senses –Balanced subset [32 words; 138 senses]; 2 judges »16 words have both S and O senses »16 words do not (8 only S and 8 only O) »All subsets balanced between nouns and verbs »Uncertain tags also permitted

Inter-Annotator Agreement Results  Overall: –Kappa=0.74 –Percent Agreement=85.5%  Without the 12.3% of cases where a judge tagged U: –Kappa=0.90 –Percent Agreement=95.0%  16 words with both S and O senses: Kappa=0.75  16 words with only S or only O senses: Kappa=0.73 Comparable difficulty

Inter-Annotator Agreement Results  The ambiguous nouns of the SENSEVAL-3 English Lexical Task [20 words; 117 senses]; 2 judges »U tags not permitted »Even so, Kappa=0.71

Outline  Motivation and Goals  Assigning Subjectivity Labels to Word Senses –Manually –Automatically  Word Sense Disambiguation using Automatic Subjectivity Analysis  Conclusions

Related Work  unsupervised word-sense ranking algorithm of [McCarthy et al 2004] –That task: approximate corpus frequencies of word senses –Our task: predict a word-sense property (subjectivity)  method for learning subjective adjectives of [Wiebe 2000] –That task: label words –Our task: label word senses

Overview  Main idea: assess the subjectivity of a word sense based on information about the subjectivity of –a set of distributionally similar words –in a corpus annotated with subjective expressions

MPQA Opinion Corpus  10,000 sentences from the world press annotated for subjective expressions »[Wiebe et al., 2005]

Subjective Expressions  Subjective expressions: opinions, sentiments, speculations, etc. (private states) expressed in language

Examples  His alarm grew.  The leaders roundly condemned the Iranian President’s verbal assault on Israel.  He would be quite a catch.  That doctor is a quack.

Preliminaries: subjectivity of word w  From an unannotated corpus (BNC), find the distributionally similar words of w: DSW = {dsw_1, …, dsw_j} [Lin 1998]  Score them against the annotated corpus (MPQA): subj(w) = (#insts(DSW) in subjective expressions − #insts(DSW) not in subjective expressions) / #insts(DSW)  subj(w) ∈ [-1, 1], from highly objective to highly subjective

Subjectivity of word w: example  DSW = {dsw_1, dsw_2}, with three instances in the MPQA corpus: dsw_1 inst_1 (in a subjective expression, +1), dsw_1 inst_2 (not, −1), dsw_2 inst_1 (in, +1)  subj(w) = (2 − 1) / 3 = 1/3
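The word-level score above can be sketched in a few lines, assuming each corpus instance is reduced to a boolean for "falls inside a subjective expression":

```python
def subj_word(instances_in_se):
    """subj(w) = (#insts in subjective expressions - #insts not in SE) / #insts.
    instances_in_se: one boolean per corpus instance of any distributionally
    similar word of w (True = inside a subjective expression).
    Returns a score in [-1, 1], from highly objective to highly subjective."""
    n_in = sum(instances_in_se)
    n_out = len(instances_in_se) - n_in
    return (n_in - n_out) / len(instances_in_se)

# The toy example from the slide: dsw_1 has two instances (one subjective,
# one not), dsw_2 has one instance (subjective).
print(subj_word([True, False, True]))  # (2 - 1) / 3 ≈ 0.333
```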

Subjectivity of word sense w_i  Rather than 1, add or subtract sim(w_i, dsw_j)  For the same three instances: subj(w_i) = (sim(w_i, dsw_1) − sim(w_i, dsw_1) + sim(w_i, dsw_2)) / (2·sim(w_i, dsw_1) + sim(w_i, dsw_2)) ∈ [-1, 1]

Method – Step 1  Given word w  Find distributionally similar words [Lin 1998] –DSW = {dsw_j | j = 1..n} –Experiment with top 100 and 160

Method – Step 2  word w = Alarm; DSW_1 = Panic, DSW_2 = Detector  Sense w_1 “fear resulting from the awareness of danger”: sim(w_1, panic), sim(w_1, detector)  Sense w_2 “a device that signals the occurrence of some undesirable event”: sim(w_2, panic), sim(w_2, detector)

Method – Step 2  Find the similarity between each word sense and each distributionally similar word  wnss can be any concept-based similarity measure between word senses  we use Jiang & Conrath 1997
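Jiang & Conrath similarity is commonly defined from information content (IC): distance = IC(c1) + IC(c2) − 2·IC(lcs), similarity = 1/distance. A toy sketch of that formula with invented IC values (a real system would derive IC from WordNet plus corpus counts):

```python
def jcn_similarity(ic1, ic2, ic_lcs):
    """Jiang & Conrath (1997) similarity from information-content values:
    distance = IC(c1) + IC(c2) - 2 * IC(lcs); similarity = 1 / distance."""
    distance = ic1 + ic2 - 2 * ic_lcs
    if distance == 0:          # identical concepts
        return float("inf")
    return 1.0 / distance

# Hypothetical IC values for alarm sense w_1 (fear) vs. 'panic' and
# 'detector'; the numbers are invented purely for illustration.
sim_panic = jcn_similarity(ic1=6.2, ic2=5.8, ic_lcs=5.0)     # close concepts
sim_detector = jcn_similarity(ic1=6.2, ic2=7.1, ic_lcs=1.5)  # distant concepts
print(sim_panic > sim_detector)  # the fear sense is nearer to 'panic'
```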

Method – Step 3  Input: word sense w_i of word w; DSW = {dsw_j | j = 1..n}; sim(w_i, dsw_j); the MPQA Opinion Corpus  Output: subjectivity score subj(w_i)

Method – Step 3
    totalsim = Σ_j #insts(dsw_j) · sim(w_i, dsw_j)
    subj = 0
    for each dsw_j in DSW:
        for each instance k in insts(dsw_j):
            if k is in a subjective expression: subj += sim(w_i, dsw_j)
            else: subj −= sim(w_i, dsw_j)
    subj(w_i) = subj / totalsim
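The Step 3 loop can be run directly once instances are pre-extracted; a sketch assuming each distributionally similar word maps to a list of booleans (True = instance inside a subjective expression), with invented similarities:

```python
def subj_sense(sim, insts):
    """subj(w_i) in [-1, 1] for one word sense, given:
    sim:   dict dsw -> sim(w_i, dsw), a concept-based similarity score
    insts: dict dsw -> [bool, ...], True if that corpus instance falls
           inside a subjective expression in the annotated (MPQA) corpus."""
    total_sim = sum(len(ks) * sim[dsw] for dsw, ks in insts.items())
    subj = 0.0
    for dsw, ks in insts.items():
        for in_subj_expr in ks:
            # Add the similarity for subjective instances, subtract otherwise.
            subj += sim[dsw] if in_subj_expr else -sim[dsw]
    return subj / total_sim

# Toy data mirroring the earlier example: dsw1 appears twice (one subjective
# instance), dsw2 once (subjective). Similarity values are invented.
sim = {"dsw1": 0.6, "dsw2": 0.3}
insts = {"dsw1": [True, False], "dsw2": [True]}
print(round(subj_sense(sim, insts), 3))  # (0.6 - 0.6 + 0.3) / 1.5 = 0.2
```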

Method – Optional Variation  For each sense w_1, w_2, w_3, use only a “Selected” subset of the distributionally similar words {dsw_1, dsw_2, dsw_3} in the update: if k is in a subjective expression: subj += sim(w_i, dsw_j); else: subj −= sim(w_i, dsw_j)

Evaluation  Calculate subj scores for all word senses, and sort them  While 0 is a natural candidate for division between S and O, we perform the evaluation for different thresholds in [-1,+1]  Calculate the precision of the algorithm at different points of recall

Evaluation  Automatic assignment of subjectivity for 272 word senses (no DSW instances for 82 senses)  Baseline: random selection of S labels »Number of assigned S labels matches number of S labels in the gold standard (recall = 1.0)

Evaluation: precision/recall curves  Number of distributionally similar words = 160

Evaluation  Break-even point »Point where precision and recall are equal
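The threshold sweep can be sketched as follows: label every sense scoring at or above a threshold as S, compute precision and recall against gold labels, and locate the break-even point where the two meet. Scores and gold labels below are invented for illustration:

```python
def precision_recall_at(scores, gold_s, threshold):
    """Label senses with score >= threshold as S; score against the gold
    set of subjective senses. Returns (precision, recall)."""
    predicted_s = {w for w, s in scores.items() if s >= threshold}
    if not predicted_s:
        return 1.0, 0.0
    tp = len(predicted_s & gold_s)
    return tp / len(predicted_s), tp / len(gold_s)

# Hypothetical subj scores in [-1, 1] and hypothetical gold S senses.
scores = {"alarm#1": 0.8, "interest#4": 0.5, "catch#hidden": 0.1,
          "interest#1": -0.4, "alarm#2": -0.7}
gold_s = {"alarm#1", "interest#4", "catch#hidden"}
p, r = precision_recall_at(scores, gold_s, threshold=0.0)
print(p, r)  # all three gold-S senses score above 0 → 1.0 1.0
```

Sweeping the threshold from +1 down to −1 traces out the precision/recall curve; the break-even point is read off where precision equals recall.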

Outline  Motivation and Goals  Assigning Subjectivity Labels to Word Senses –Manually –Automatically  Word Sense Disambiguation using Automatic Subjectivity Analysis  Conclusions

Overview  Augment an existing WSD system with a feature reflecting the subjectivity of the context of the ambiguous word  Compare the performance of original and subjectivity-aware WSD systems  The ambiguous nouns of the SENSEVAL-3 English Lexical Task  SENSEVAL-3 data

Original WSD System  Integrates local and topical features: »Local: context of three words to the left and right, their part-of-speech »Topical: top five words occurring at least three times in the context of a word sense »[Ng & Lee, 1996], [Mihalcea, 2002]  Naïve Bayes classifier »[Lee & Ng, 2003]

Automatic Subjectivity Classifier  Rule-based automatic sentence classifier from [Wiebe & Riloff 2005]  Included in OpinionFinder

Subjectivity Tagging for WSD  The Subjectivity Classifier tags each sentence of the SENSEVAL-3 data that contains a target noun (e.g., one sentence with “interest” tagged S, another tagged O, a sentence with “atmosphere” tagged S, …)

WSD using Subjectivity Tagging  For a sentence containing “interest”, the Subjectivity Classifier outputs S, O, or B, which is added as a feature; the Original WSD System and the Subjectivity-Aware WSD System may then disagree, e.g., Sense 1 “a fixed charge for borrowing money” vs. Sense 4 “a sense of concern with and curiosity about someone or something”

Words with S and O Senses  4.3% error reduction; significant (p < 0.05, paired t-test)  Per word, the subjectivity-aware system outperforms the original on 8 of the 10 words and ties on the remaining 2

Words with Only O Senses  Per word, the subjectivity-aware system degrades on 2 of the 10 words, improves on 1, and ties on the remaining 7

Conclusions  Can subjectivity labels be assigned to word senses? –Manually »Good agreement; Kappa=0.74 »Very good when uncertain cases removed; Kappa=0.90 –Automatically »Method substantially outperforms baseline »Showed feasibility of assigning subjectivity labels to the fine-grained level of word senses

Conclusions  Can subjectivity analysis improve word sense disambiguation? –Improves performance, but mainly for words with both S and O senses (4.3% error reduction; significant (p < 0.05)) –Performance largely remains the same or degrades for words that don’t –Assign subjectivity labels to WordNet; WSD system should consult WordNet tags to decide when to pay attention to the contextual subjectivity feature.

 Thank You

Refining WordNet  Semantic Richness  Find inconsistencies and gaps –Verb assault – attack, round, assail, lash out, snipe, assault (attack in speech or writing) “The editors of the left-leaning paper attacked the new House Speaker” –But no sense for the noun as in “His verbal assault was vicious”

Observation: MPQA corpus  Corpus somewhat noisy for our task »MPQA annotates subjective expressions »Objective senses can appear in subjective expressions  Hypothesis: subjective senses tend to appear more often in subjective expressions than objective senses do, so the appearance of words in subjective expressions is evidence of sense subjectivity

WSD using Subjectivity Tagging Hypothesis: instances of subjective senses are more likely to be in subjective sentences, so sentence subjectivity is an informative feature for WSD of words with both subjective and objective senses

Subjective Sense Examples  He was boiling with anger Seethe, boil – (be in an agitated emotional state; “The customer was seething with anger”) –Be – (have the quality of being; (copula, used with an adjective or a predicate noun); “John is rich”; “This is not a good answer”)

Subjective Sense Examples  What’s the catch? Catch – (a hidden drawback; “it sounds good but what’s the catch?”) Drawback – (the quality of being a hindrance; “he pointed out all the drawbacks to my plan”)  That doctor is a quack. Quack – (an untrained person who pretends to be a physician and who dispenses medical advice) –Doctor, doc, physician, MD, Dr., medico

Objective Sense Examples  The alarm went off Alarm, warning device, alarm system – (a device that signals the occurrence of some undesirable event) –Device – (an instrumentality invented for a particular purpose; “the device is small enough to wear on your wrist”; “a device intended to conserve water”)  The water boiled Boil – (come to the boiling point and change from a liquid to vapor; “Water boils at 100 degrees Celsius”) –Change state, turn – (undergo a transformation or a change of position or action)

Objective Sense Examples  He sold his catch at the market Catch, haul – (the quantity that was caught; “the catch was only 10 fish”) –Indefinite quantity – (an estimated quantity)  The duck’s quack was loud and brief Quack – (the harsh sound of a duck) –Sound – (the sudden occurrence of an audible event)