Published by Berenice Jordan. Modified over 8 years ago.
Slide 1
Entropy Reduction Model
Resource: "The Information Conveyed by Words in Sentences", John Hale
Slide 2
Assumptions on ambiguity resolution
▪ Sentence understanders determine a syntactic structure for the perceived signal.
▪ Producer and comprehender share the same grammar (possibly a probabilistic one).
▪ Comprehension is eager.
Slide 3
▪ Sentence processing is done incrementally.
▪ There are combinatory relationships between the words in the sentence.
▪ There is a speaker-intended derivation.
▪ Due to ambiguity, there is uncertainty about the speaker-intended derivation.
▪ The uncertainty is greatest in the initial phases and dies out gradually as more words are presented.
Slide 4
Work done by an eager processor
▪ Uncertainty in the derivation: the total amount of ambiguity-resolution work that remains to be done.
▪ Reduction in uncertainty: the maximal amount of work done between one word and the next; this is the information conveyed by a word.
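On this view, the information conveyed by each word is the non-negative drop in entropy over derivations from one prefix to the next. A minimal sketch of that bookkeeping, using hypothetical prefix-entropy values for illustration (the real values would come from the grammar):

```python
# Entropy reduction: the information a word conveys is the drop
# (never negative) in uncertainty about the intended derivation.
def entropy_reduction(prefix_entropies):
    """prefix_entropies[i] = entropy (bits) over derivations after i words."""
    return [max(0.0, h_prev - h_next)
            for h_prev, h_next in zip(prefix_entropies, prefix_entropies[1:])]

# Hypothetical entropies for a 4-word sentence (illustration only).
H = [4.0, 4.0, 2.5, 3.0, 0.0]
print(entropy_reduction(H))  # [0.0, 1.5, 0.0, 3.0]
```

Note that where entropy goes up (word 3 here), the model reports zero information conveyed rather than a negative amount.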
Slide 8
Grenander’s Theorem
▪ Computes the entropy of all derivation trees rooted in a non-terminal symbol.
▪ This entropy is the sum of:
▪ the entropy of the single-rule rewrite decision, and
▪ the expected entropy of any non-terminal children.
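The recursion above can be solved by fixed-point iteration: the derivation entropy of a non-terminal is its one-step rule-choice entropy plus the probability-weighted entropies of its non-terminal children. A sketch on a toy PCFG (a hypothetical grammar, not the one from the slides):

```python
import math

# Grenander's recursion: h(A) = H(rule choice at A)
#                               + sum over rules p(rule) * sum of h(children),
# solved here by fixed-point iteration.
def derivation_entropy(grammar, iters=100):
    """grammar: {A: [(prob, [child symbols]), ...]}; terminals are absent keys."""
    h = {A: 0.0 for A in grammar}
    for _ in range(iters):
        for A, rules in grammar.items():
            rule_h = -sum(p * math.log2(p) for p, _ in rules if p > 0)
            child_h = sum(p * sum(h.get(c, 0.0) for c in children)
                          for p, children in rules)
            h[A] = rule_h + child_h
    return h

# Toy PCFG: S -> NP VP; NP and VP each a fair choice between two terminals.
g = {'S':  [(1.0, ['NP', 'VP'])],
     'NP': [(0.5, ['dog']),  (0.5, ['cat'])],
     'VP': [(0.5, ['runs']), (0.5, ['sleeps'])]}
print(derivation_entropy(g)['S'])  # 2.0 bits
```

For recursive grammars the iteration converges whenever the expected derivation length is finite (a consistent PCFG); S here inherits 1 bit each from NP and VP, and contributes 0 bits itself since its rewrite is deterministic.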
Slide 12
[Table: probability matrix over the symbols S, PP, VP, NP, P, V (rows S, PP, VP, NP); cell values garbled in extraction.]
Slide 16
A horse raced past the barn fell
▪ Left recursion
▪ Reduced relative clause
▪ Past-participle verb
Slide 17
You are stuck here
Slide 18
Remove left recursion
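The standard transformation rewrites rules A → A α | β as A → β A′ and A′ → α A′ | ε. A sketch handling direct left recursion only (rule probabilities are ignored in this simplification):

```python
# Removing direct left recursion: rules  A -> A a | b  become
#   A  -> b A'
#   A' -> a A' | epsilon
# This sketch handles direct left recursion only, without probabilities.
def remove_left_recursion(rules, nonterminal):
    """rules: list of right-hand sides (lists of symbols) for `nonterminal`."""
    recursive = [rhs[1:] for rhs in rules if rhs and rhs[0] == nonterminal]
    others = [rhs for rhs in rules if not rhs or rhs[0] != nonterminal]
    if not recursive:
        return {nonterminal: rules}
    prime = nonterminal + "'"
    return {
        nonterminal: [rhs + [prime] for rhs in others],
        prime: [alpha + [prime] for alpha in recursive] + [[]],  # [] = epsilon
    }

# Example: NP -> NP PP | Det N  becomes  NP -> Det N NP'; NP' -> PP NP' | eps
print(remove_left_recursion([['NP', 'PP'], ['Det', 'N']], 'NP'))
```

In a probabilistic setting the rule probabilities must also be redistributed so that each non-terminal's rules still sum to one; that step is omitted here.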
Slide 20
Non-left recursive grammar
Slide 21
[Table: rule probabilities for the non-left-recursive grammar; values garbled in extraction.]
Slide 22
Entropy by non-terminal (non-terminal labels lost in extraction): 5.019, 2.013, 3.013, 3.006, 0.000, 1.000, 0.000, 2.013, 4.025
Slide 23
the horse raced past the barn fell
Slide 26
‘the’ conveys no information
Slide 27
the horse raced past the barn fell