Semantics. CS 224n / Lx 237, Tuesday, May 11, 2004. With slides borrowed from Jason Eisner.
Objects. Three kinds:
- Booleans: the semantic values of sentences.
- Entities: objects, NPs; maybe also space/time specifications.
- Functions: predicates are functions returning a boolean. Functions might return other functions, and functions might take other functions as arguments.
Nouns and their modifiers:
- expert: λg expert(g)
- big fat expert: λg big(g) ∧ fat(g) ∧ expert(g)
- But: bogus expert. Wrong: λg bogus(g) ∧ expert(g). Right: λg (bogus(expert))(g) … bogus maps expert to a new concept.
- Baltimore expert (white-collar expert, TV expert …): λg Related(Baltimore, g) ∧ expert(g). Or with different intonation: λg (Modified-by(Baltimore, expert))(g).
- Can't use Related for that case: law expert and dog catcher = λg Related(law, g) ∧ expert(g) ∧ Related(dog, g) ∧ catcher(g) = dog expert and law catcher.
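The contrast between intersective modifiers (big, fat) and predicate-modifying ones (bogus) can be sketched with ordinary Python functions as semantic values. Everything here, the entity sets included, is an invented toy model, not part of the slides' formalism:

```python
# Predicates as Python functions over entities (strings).
# The sets and names are illustrative assumptions.
experts = {"alice", "bob"}
expert = lambda g: g in experts      # λg expert(g)
big    = lambda g: g == "bob"        # λg big(g)
fat    = lambda g: g == "bob"        # λg fat(g)

# Intersective: "big fat expert" = λg big(g) ∧ fat(g) ∧ expert(g)
big_fat_expert = lambda g: big(g) and fat(g) and expert(g)

# Non-intersective: "bogus" takes the PREDICATE and returns a new one,
# λg (bogus(expert))(g) -- a bogus expert need not satisfy expert(g).
def bogus(pred):
    return lambda g: not pred(g)     # toy definition of bogus

bogus_expert = bogus(expert)

print(big_fat_expert("bob"))    # True
print(bogus_expert("alice"))    # False: alice really is an expert
print(bogus_expert("carol"))    # True under this toy definition
```

The key design point is that bogus applies to the predicate itself rather than conjoining with it, which is why the two cases need different types.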
Compositional Semantics. We've discussed what semantic representations should look like. But how do we get them from sentences?
- First, parse to get a syntax tree.
- Second, look up the semantics for each word.
- Third, build the semantics for each constituent, working from the bottom up. The syntax tree is a "recipe" for how to do it.
Compositional Semantics. Add a "sem" feature to each context-free rule, so that S → NP loves NP becomes S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y]. The meaning of S depends on the meanings of the NPs. [Trees: S(NP[sem=x], VP(V loves, NP[sem=y])) with S meaning loves(x,y); and for the idiom, S(NP[sem=x], VP(V kicked, NP the bucket)) with S meaning died(x).]
Compositional Semantics. Instead of S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y], we might want general rules like S → NP VP:
- V[sem=loves] → loves
- VP[sem=v(obj)] → V[sem=v] NP[sem=obj]
- S[sem=vp(subj)] → NP[sem=subj] VP[sem=vp]
Now "George loves Laura" has sem = loves(Laura)(George). In this manner we'll sketch a version where we still compute semantics bottom-up and the grammar is in Chomsky Normal Form, so each node has 2 children: one function and one argument. To get a node's semantics, apply the function to the argument!
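The "apply the function child to the argument child" idea can be sketched in a few lines of Python. The tree encoding (nested pairs) and the tiny lexicon are illustrative assumptions, not the course's actual grammar machinery:

```python
# A minimal bottom-up evaluator for binary (Chomsky Normal Form) trees.
# Leaves are words; each binary node has one function child and one
# argument child, and the node's semantics is the application.
LEXICON = {
    "George": "George",
    "Laura":  "Laura",
    # curried verb meaning: takes the object, then the subject
    "loves":  lambda obj: lambda subj: f"loves({obj})({subj})",
}

def sem(node):
    if isinstance(node, str):                # leaf: look up the word
        return LEXICON[node]
    left, right = (sem(c) for c in node)     # binary node: recurse
    # whichever child is callable is the function; apply it to the other
    return left(right) if callable(left) else right(left)

tree = ("George", ("loves", "Laura"))        # S → NP VP, VP → V NP
print(sem(tree))   # loves(Laura)(George)
```

Note that the evaluator never needs to know which child is the function in advance; the types (callable vs. not) decide, which mirrors how the grammar pairs one function with one argument at every node.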
[Parse tree for "Every nation wants George to love Laura.": START → S_fin Punc(.); S_fin → NP VP_fin; NP → Det(Every) N(nation); VP_fin → VP_stem T(-s); VP_stem → V_stem(want) S_inf; S_inf → NP(George) VP_inf; VP_inf → T(to) VP_stem; VP_stem → V_stem(love) NP(Laura).]
[Same parse tree as before.] loves(G,L) is the meaning that we want at the S_inf node: how can we arrange to get it?
[Same parse tree.] NP George denotes G. What function should apply to G to yield the desired result loves(G,L)? (This is like division!)
[Same parse tree.] The VP_inf denotes λx loves(x,L); applied to G, it yields loves(G,L) at S_inf.
[Same parse tree.] We'll say that "to" is just a bit of syntax that changes a VP_stem to a VP_inf with the same meaning: T(to) denotes the identity function λa a, so the lower VP_stem also denotes λx loves(x,L).
[Same parse tree.] V_stem love denotes λy λx loves(x,y); applied to L, it yields λx loves(x,L) for the lower VP_stem.
[Same parse tree.] By analogy, the upper VP_stem ("want George to love Laura") denotes λx wants(x, loves(G,L)).
[Same parse tree.] By analogy with love, V_stem want denotes λy λx wants(x,y); applied to loves(G,L), it yields λx wants(x, loves(G,L)).
[Same parse tree.] T(-s) denotes λv λx present(v(x)); applied to the VP_stem meaning, it yields the VP_fin meaning λx present(wants(x, loves(G,L))).
[Same parse tree.] The subject NP denotes every(nation); applying the VP_fin meaning to it yields present(wants(every(nation), loves(G,L))) at S_fin.
[Same parse tree.] Det Every denotes λn every(n); applied to nation, it yields every(nation) for the subject NP.
[Same parse tree.] Punc(.) denotes λs assert(s), ready to apply to the S_fin meaning present(wants(every(nation), loves(G,L))).
In Summary: From the Words. [Same parse tree.] The lexical meanings (George: G; to: λa a; love: λy λx loves(x,y); Laura: L; want: λy λx wants(x,y); -s: λv λx present(v(x)); Every: λn every(n); nation: nation; ".": λs assert(s)) compose bottom-up to yield assert(present(wants(every(nation), loves(G,L)))).
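The whole derivation can be replayed as curried Python lambdas that build formula strings. The lexical entries mirror the slides; the string encoding of formulas is just an illustrative choice:

```python
# Lexical meanings as curried functions; formulas are plain strings.
G, L  = "G", "L"
love  = lambda y: lambda x: f"loves({x},{y})"    # λy λx loves(x,y)
want  = lambda y: lambda x: f"wants({x},{y})"    # λy λx wants(x,y)
to    = lambda a: a                              # λa a  (identity)
s_aff = lambda v: lambda x: f"present({v(x)})"   # -s: λv λx present(v(x))
every = lambda n: f"every({n})"                  # Det: λn every(n)
punc  = lambda s: f"assert({s})"                 # ".": λs assert(s)

# Compose bottom-up, one tree node at a time:
vp_stem_low = love(L)              # λx loves(x,L)
vp_inf      = to(vp_stem_low)      # λx loves(x,L)  ("to" changes nothing)
s_inf       = vp_inf(G)            # loves(G,L)
vp_stem     = want(s_inf)          # λx wants(x, loves(G,L))
vp_fin      = s_aff(vp_stem)       # λx present(wants(x, loves(G,L)))
s_fin       = vp_fin(every("nation"))
print(punc(s_fin))
# assert(present(wants(every(nation),loves(G,L))))
```

Each assignment corresponds to one function application at one binary node of the tree, so the program is just the tree's "recipe" written out in evaluation order.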
So now what? Now that we have the semantic meaning, what do we do with it? There is a huge literature on logical reasoning and knowledge learning. Reasoning versus inference:
- "John ate a pizza." Q: What was eaten by John? A: A pizza.
- "John ordered a pizza, but it came with anchovies. John then yelled at the waiter and stormed out." Q: What was eaten by John? A: Nothing.
Problem 1a. Write grammar rules, complete with semantic translations, that could be added to the grammar fragment and that will parse the above sentence and generate a semantic representation using the "own" predicate.