Semantics in NLP (part 2)
MAS.S60
Rob Speer and Catherine Havasi
* Lots of slides borrowed from lots of sources! See end.
Are people doing logic?
Language Log: "Russia sentences"
– *More people have been to Russia than I have.
– *It just so happens that more people are bitten by New Yorkers than they are by sharks.
Are people doing logic?
The thing is, is people come up with new ways of speaking all the time.
More lexical semantics
Quantifiers
Every/all: \P. \Q. all x. (P(x) -> Q(x))
A/an/some: \P. \Q. exists x. (P(x) & Q(x))
The:
– \P. \Q. Q(x)
– P(x) goes in the presuppositions
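These lambda-calculus denotations can be sketched as higher-order functions over a small finite domain. This is a minimal plain-Python illustration, not the course's logic notation or NLTK's machinery; the domain and predicates are invented examples.

```python
# Generalized quantifiers as higher-order functions over a finite domain.
# The domain and predicates below are invented for illustration.

domain = {"fido", "rex", "felix"}

def every(P):
    # \P. \Q. all x. (P(x) -> Q(x))
    return lambda Q: all(Q(x) for x in domain if P(x))

def some(P):
    # \P. \Q. exists x. (P(x) & Q(x))
    return lambda Q: any(P(x) and Q(x) for x in domain)

dog = lambda x: x in {"fido", "rex"}
barks = lambda x: x in {"fido", "rex"}
meows = lambda x: x == "felix"

print(every(dog)(barks))  # True: every dog barks
print(some(dog)(meows))   # False: no dog meows
```

"The" does not fit this truth-conditional mold as directly, which is why the slide pushes its P(x) into the presuppositions rather than the assertion.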
High-level overview of C&C
Parses using a Combinatory Categorial Grammar (CCG)
– fancier than a CFG
– includes multiple kinds of "slash rules"
– lots of grad student time spent transforming the Treebank
A MaxEnt "supertagger" tags each word with a CCG lexical category
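The "slash rules" combine adjacent categories: X/Y looks for a Y to its right, X\Y for a Y to its left. A toy sketch of forward and backward application (an invented mini-lexicon, not the C&C implementation):

```python
# CCG categories as tuples: (result, slash, argument).
# (X, '/', Y) combines with a Y on its right; (X, '\\', Y) with a Y on its left.

def combine(left, right):
    """Forward application (X/Y Y => X), then backward (Y X\\Y => X)."""
    if isinstance(left, tuple) and left[1] == '/' and left[2] == right:
        return left[0]
    if isinstance(right, tuple) and right[1] == '\\' and right[2] == left:
        return right[0]
    return None  # the categories don't combine

NP = 'NP'
S = 'S'
# Transitive verb, e.g. "saw" in "Kim saw Sandy": (S\NP)/NP
saw = ((S, '\\', NP), '/', NP)

vp = combine(saw, NP)        # (S\NP)/NP + NP => S\NP
sentence = combine(NP, vp)   # NP + S\NP => S
print(sentence)  # S
```

The supertagger's job is to pick categories like (S\NP)/NP for each word before combination, which is where most of the ambiguity lives.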
High-level overview of C&C
Find the highest-probability result with coherent semantics.
Doesn't this create millions of parses that need to be checked? Yes.
A typical sentence uses 25 GB of RAM. That's where the Beowulf cluster comes in.
Can we do this with NLTK?
NLTK's feature-based parser has some machinery for doing semantics.
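The idea behind that machinery is that each grammar rule pairs syntax with a semantic function, and parsing composes the functions bottom-up. A plain-Python sketch of that composition step (an invented mini-grammar, not NLTK's actual API):

```python
# Rule-to-semantics composition in miniature. Lexical NPs denote
# individuals; the transitive verb is a curried function, applied
# object-first, as in "S -> NP V NP" with semantics V'(obj)(subj).

lexicon = {
    "Kim":   "kim",
    "Sandy": "sandy",
    "saw":   lambda obj: lambda subj: ("saw", subj, obj),
}

def interpret(subject, verb, obj):
    """Compose the semantics for an S -> NP V NP parse."""
    return lexicon[verb](lexicon[obj])(lexicon[subject])

print(interpret("Kim", "saw", "Sandy"))  # ('saw', 'kim', 'sandy')
```

In NLTK itself the same effect is achieved by attaching `sem` features to feature-grammar rules and beta-reducing lambda terms during parsing.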