Entropy Reduction Model Resource: The Information Conveyed by Words in Sentences, John Hale
Assumptions about ambiguity resolution
▪ Sentence understanders determine a syntactic structure for the perceived signal.
▪ Producer and comprehender share the same grammar (it may be a probabilistic one).
▪ Comprehension is eager.
▪ Sentence processing is done incrementally.
▪ There are combinatory relationships between words in the sentence.
▪ There is a speaker-intended derivation.
▪ Due to ambiguity, there is uncertainty about the speaker-intended derivation.
▪ The uncertainty is greatest in the initial phases and dies out gradually as more words are presented.
Work done by an eager processor
Uncertainty in derivation
▪ the total amount of ambiguity-resolution work that needs to be done
Reduction in uncertainty
▪ the maximal amount of work done between a word and the next one
▪ the information conveyed by a word
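The "reduction in uncertainty" metric can be sketched directly: the information conveyed by word i is the drop from the derivation entropy before the word to the entropy after it, floored at zero. The entropy values below are hypothetical, chosen only for illustration.

```python
def entropy_reductions(prefix_entropies):
    """Per-word work: the drop in derivation entropy caused by each word,
    floored at zero (a word that *raises* uncertainty counts as no work
    done). prefix_entropies[0] is the entropy before any word is heard."""
    return [max(0.0, h_prev - h_cur)
            for h_prev, h_cur in zip(prefix_entropies, prefix_entropies[1:])]

# Hypothetical per-prefix entropies (in bits) for a six-word sentence:
print(entropy_reductions([4.0, 4.0, 2.5, 3.0, 3.0, 1.0, 0.0]))
# -> [0.0, 1.5, 0.0, 0.0, 2.0, 1.0]
```

Note that a word after which entropy rises (here the third one) is assigned zero work, matching the idea that the processor only ever does ambiguity-resolution work, never "un-does" it.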
Grenander’s Theorem
Computes the entropy of all derivation trees rooted in a non-terminal symbol.
Entropy is the sum of
▪ the entropy of the single-rule rewrite decision, and
▪ the expected entropy of any children
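The recursive decomposition above can be sketched as a fixed-point computation: each non-terminal's entropy is its one-step rewrite entropy plus the probability-weighted entropies of the non-terminal children its rules introduce. The toy PCFG and its probabilities below are my own illustrative assumptions, not Hale's grammar.

```python
import math

# Toy PCFG (hypothetical probabilities, chosen for illustration).
# Each rule: (probability, right-hand side); lowercase strings are terminals.
GRAMMAR = {
    "S":  [(1.0, ["NP", "VP"])],
    "NP": [(0.5, ["the", "N"]), (0.5, ["a", "N"])],
    "N":  [(0.5, ["horse"]), (0.5, ["barn"])],
    "VP": [(0.5, ["V"]), (0.5, ["V", "PP"])],
    "V":  [(1.0, ["raced"])],
    "PP": [(1.0, ["P", "NP"])],
    "P":  [(1.0, ["past"])],
}

def grenander_entropies(grammar, iters=200):
    """Entropy (in bits) of the distribution over derivation trees rooted
    at each non-terminal: H(X) = h_rule(X) + sum over X's rules of
    p(rule) * sum of H(child) for non-terminal children.  Solved here by
    fixed-point iteration, which converges for consistent grammars."""
    nts = set(grammar)
    # Entropy of the single-rule rewrite decision at each non-terminal.
    h_rule = {x: -sum(p * math.log2(p) for p, _ in rules if p > 0)
              for x, rules in grammar.items()}
    H = {x: 0.0 for x in nts}
    for _ in range(iters):
        H = {x: h_rule[x] + sum(p * sum(H[s] for s in rhs if s in nts)
                                for p, rhs in grammar[x])
             for x in nts}
    return H

H = grenander_entropies(GRAMMAR)
print(H["S"])  # -> 4.0 bits for this toy grammar
```

The same system can equivalently be solved in closed form as H = (I - A)^(-1) b, where A holds the expected counts of non-terminal children; the iteration is used here only to keep the sketch dependency-free.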
[Figure: example grammar over the non-terminals S, PP, VP, NP, P, V]
The horse raced past the barn fell
▪ Left recursion
▪ Reduced relative clause
▪ Past participle verb
With a left-recursive grammar, you are stuck here.
Remove left recursion
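The standard transformation for immediate left recursion rewrites A → A α | β as A → β A′ with A′ → α A′ | ε. A minimal symbolic sketch (probability re-estimation for the resulting PCFG, which Hale's setup would also need, is omitted here):

```python
def remove_immediate_left_recursion(lhs, rules):
    """Classic transformation:  A -> A a | b   becomes
    A -> b A'  and  A' -> a A' | eps.
    Rules are lists of right-hand-side symbol lists; [] denotes epsilon.
    PCFG probability reassignment is omitted in this sketch."""
    rec = [rhs[1:] for rhs in rules if rhs and rhs[0] == lhs]   # the alphas
    base = [rhs for rhs in rules if not rhs or rhs[0] != lhs]   # the betas
    if not rec:
        return {lhs: rules}  # nothing to do
    new = lhs + "'"
    return {
        lhs: [rhs + [new] for rhs in base],
        new: [alpha + [new] for alpha in rec] + [[]],  # [] = epsilon
    }

# NP -> NP PP | Det N   becomes   NP -> Det N NP',  NP' -> PP NP' | eps
print(remove_immediate_left_recursion("NP", [["NP", "PP"], ["Det", "N"]]))
```

Indirect left recursion additionally requires the ordering step of Paull's algorithm; the sketch handles only the immediate case.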
Non-left recursive grammar
[Table: Non-terminal | Entropy]
the horse raced past the barn fell
‘the’ conveys no information (it causes no reduction in derivation entropy)