1
Lexical Acquisition of Verb Direct-Object Selectional Preferences Based on the WordNet Hierarchy
Emily Shen and Sushant Prakash
2
Selectional Preferences: V-DO
- "Eat a carrot" and "drive a truck" sound natural; "eat a truck" and "drive a carrot" do not
- Goal: find the general noun classes a verb takes as its direct-object arguments
- Useful for word sense disambiguation, choosing among parses, capturing some essence of semantics, etc.
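As an illustration of where these general classes come from, the sketch below (not from the slides; it assumes NLTK with the WordNet corpus downloaded) walks one hypernym chain for "carrot" and for "truck", the kind of structure the model draws its noun classes from.

```python
# Illustration only: inspect the WordNet classes a direct object can fall under.
# Assumes NLTK is installed and the WordNet corpus has been fetched with
# nltk.download('wordnet').
from nltk.corpus import wordnet as wn

for noun in ("carrot", "truck"):
    synset = wn.synsets(noun, pos=wn.NOUN)[0]   # first (most common) noun sense
    path = synset.hypernym_paths()[0]           # one chain from the root down to the noun
    print(noun, ":", " > ".join(s.name() for s in path))

# "carrot" descends through food/vegetable classes and "truck" through
# artifact/vehicle classes, which is exactly the distinction that lets a model
# prefer "eat a carrot" and "drive a truck" over the reversed pairings.
```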
3
Strategy
- P(v,c) = \frac{1}{N} \sum_{n \in \mathrm{words}(c)} \frac{C(v,n)}{|\mathrm{classes}(n)|}
- S(v) = D(P(C|v) \,\|\, P(C)) = \sum_{c} P(c|v) \log \frac{P(c|v)}{P(c)}
- A(v,c) = \frac{P(c|v) \log [P(c|v)/P(c)]}{S(v)}
- A(v,n) = \max_{c \in \mathrm{classes}(n)} A(v,c)
- But this assumes a flat set of classes; we wanted to exploit the hierarchy by propagating probability counts to hypernyms:
- P_{\mathrm{mod}}(v,c) = P_{\mathrm{orig}}(v,c) + \sum_{c_k \in \mathrm{descendants}(c)} P_{\mathrm{orig}}(v,c_k)
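A minimal sketch of the flat model above, assuming verb/direct-object counts C(v,n) extracted from a parsed corpus; the input dictionary `pair_counts` and the helper names are mine, not the authors'.

```python
import math
from collections import defaultdict

from nltk.corpus import wordnet as wn


def noun_classes(noun):
    """classes(n): the WordNet noun synsets the noun belongs to."""
    return wn.synsets(noun, pos=wn.NOUN)


def joint_probs(pair_counts):
    """P(v,c) = (1/N) * sum over n in words(c) of C(v,n) / |classes(n)|."""
    p_vc = defaultdict(float)
    for (v, n), c_vn in pair_counts.items():
        cls = noun_classes(n)
        for c in cls:
            p_vc[(v, c)] += c_vn / len(cls)   # split the noun's count over its classes
    total = sum(p_vc.values())                # the 1/N normaliser
    return {vc: p / total for vc, p in p_vc.items()}


def associations(p_vc):
    """S(v) = D(P(C|v) || P(C)) and A(v,c) = P(c|v) log[P(c|v)/P(c)] / S(v)."""
    p_v, p_c = defaultdict(float), defaultdict(float)
    for (v, c), p in p_vc.items():
        p_v[v] += p
        p_c[c] += p
    term = {(v, c): (p / p_v[v]) * math.log((p / p_v[v]) / p_c[c])
            for (v, c), p in p_vc.items()}
    s = defaultdict(float)
    for (v, c), t in term.items():
        s[v] += t
    a = {(v, c): (t / s[v] if s[v] else 0.0) for (v, c), t in term.items()}
    return s, a


def association_to_noun(a, v, n):
    """A(v,n) = max over c in classes(n) of A(v,c)."""
    return max((a.get((v, c), 0.0) for c in noun_classes(n)), default=0.0)


# Toy usage with made-up counts; real counts would come from a parsed corpus.
counts = {("eat", "carrot"): 12, ("eat", "bread"): 20, ("drive", "truck"): 9}
s, a = associations(joint_probs(counts))
print(association_to_noun(a, "eat", "carrot"))
```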
4
This may seem a little screwy…
- No discount factor for each step up the hierarchy
- No splitting of the count across branches
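To make those two caveats concrete, here is a sketch of the count propagation, continuing the previous block. The `discount` argument is my own addition and is not part of the model on the slides, which effectively fixes it at 1.0.

```python
from collections import defaultdict


def propagate_up(p_vc, discount=1.0):
    """P_mod(v,c) = P_orig(v,c) + sum over descendants c_k of c of P_orig(v,c_k).

    With discount = 1.0 this matches the slides: every hypernym receives a
    descendant's full mass (no decay per step up), and that mass is not split
    when a synset has several hypernym branches.
    """
    p_mod = defaultdict(float, p_vc)
    for (v, c), p in p_vc.items():
        for ancestor in set(c.closure(lambda s: s.hypernyms())):
            steps = c.shortest_path_distance(ancestor)   # hops up the hierarchy
            p_mod[(v, ancestor)] += p * (discount ** steps)
    return p_mod
```

A discounted or count-splitting variant would only change how `p` is attenuated before being added to each ancestor.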
5
Results
- Most selective verbs: discipline, sigh, slice, shoot down, elongate
- Least selective verbs: make, have, see, get, include
- Top noun classes by verb: for "plant": plant, explosive device; for "transplant": kidney, internal organ, body part
- Tested WSD on WSJ and BLLIP:
  - Random baseline: 26.39% P, 100% R, 41.76% F1
  - Flat WSJ: 28.39% P, 71.67% R, 40.67% F1
  - Hyper WSJ: 51.44% P, 65.21% R, 57.51% F1
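As a quick sanity check (mine, not in the slides), the reported F1 values do follow from the stated precision and recall via F1 = 2PR / (P + R):

```python
def f1(p, r):
    """Harmonic mean of precision and recall, in percent."""
    return 2 * p * r / (p + r)

for name, p, r in [("Random baseline", 26.39, 100.0),
                   ("Flat WSJ", 28.39, 71.67),
                   ("Hyper WSJ", 51.44, 65.21)]:
    print(f"{name}: F1 = {f1(p, r):.2f}")   # 41.76, 40.67, 57.51
```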
6
Future Work
- Feed disambiguated nouns back into the model for training
- Model class-to-class relationships
- Also take the subject into account