How much do word embeddings encode about syntax? Jacob Andreas and Dan Klein UC Berkeley
Everybody loves word embeddings [figure: embedding space grouping the determiners few, most, that, the, a, each, this, every] [Collobert 2011, Mikolov 2013, Freitag 2004, Schuetze 1995, Turian 2010]
What might embeddings bring? "Cathleen complained about the magazine's shoddy editorial quality." (embedding neighbors: Cathleen → Mary; editorial → executive, average)
Today’s question Can word embeddings trained with surface context improve a state-of-the-art constituency parser? (no)
Embeddings and parsing: Pre-trained word embeddings are useful for a variety of NLP tasks [Cite XX, Cite XX, Cite XX]. Can they improve a constituency parser? (not very much)
Three hypotheses: Vocabulary expansion (good for OOV words: Cathleen → Mary); Statistic pooling (good for medium-frequency words: editorial, executive, average); Embedding structure (good for features: transitivity, tense)
Vocabulary expansion: Embeddings help handling of out-of-vocabulary words (Cathleen → Mary)
Vocabulary expansion: Cathleen is absent from the training vocabulary (John, Mary, Pierre, yellow, enormous, hungry, ...), but its embedding lies close to Mary's, so "Cathleen complained about the magazine's shoddy editorial quality." can be parsed by giving Cathleen the lexical statistics of Mary (a minimal sketch follows).
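A minimal sketch of the vocabulary-expansion idea, assuming a dict of embedding vectors and a set of training-vocabulary words (the names nearest_in_vocab, embeddings, and train_vocab are illustrative, not the parser's actual API): at test time, an OOV word is mapped to its closest in-vocabulary neighbor by cosine similarity, and that neighbor's lexical statistics are used in its place.

```python
import numpy as np

def nearest_in_vocab(word, embeddings, train_vocab):
    """Map an OOV word to the training-vocabulary word whose embedding
    is closest by cosine similarity; return None if no embedding exists."""
    if word in train_vocab:
        return word
    if word not in embeddings:
        return None  # fall back to the parser's usual unknown-word model
    v = embeddings[word]
    v = v / np.linalg.norm(v)
    best, best_sim = None, -1.0
    for known in train_vocab:
        if known not in embeddings:
            continue
        u = embeddings[known]
        sim = float(np.dot(v, u) / np.linalg.norm(u))
        if sim > best_sim:
            best, best_sim = known, sim
    return best

# e.g. nearest_in_vocab("Cathleen", embeddings, train_vocab) might return "Mary",
# whose tag statistics then stand in for Cathleen's.
```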
Vocab. expansion results: [charts: Baseline vs. +OOV, on the full training set and on 300 training sentences]
Statistic pooling hypothesis: Embeddings help handling of medium-frequency words (editorial, executive, average)
Statistic pooling: nearby words in embedding space were observed with different tag sets in training (executive {NN, JJ}, kind {NN}, giant {NN, JJ}, editorial {JJ}, average {NN}); pooling statistics across neighbors lets the parser consider editorial as an NN even though it was only observed as a JJ (a minimal sketch follows).
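A minimal sketch of the statistic-pooling idea, again with illustrative names (pooled_tag_counts, tag_counts, and nearest_neighbors are assumptions, not the actual implementation): a word's observed tag counts are smoothed with discounted counts from its nearest embedding neighbors, so tags it was never seen with can still receive some mass.

```python
from collections import Counter

def pooled_tag_counts(word, tag_counts, nearest_neighbors, k=5, alpha=0.1):
    """Smooth a word's tag counts with discounted counts from its k nearest
    embedding neighbors.

    tag_counts:        dict mapping word -> Counter of observed tags
    nearest_neighbors: function mapping word -> ranked list of neighbor words
    """
    pooled = Counter(tag_counts.get(word, Counter()))
    for neighbor in nearest_neighbors(word)[:k]:
        for tag, count in tag_counts.get(neighbor, Counter()).items():
            pooled[tag] += alpha * count  # neighbor evidence, discounted by alpha
    return pooled

# e.g. if "editorial" was seen only as JJ but its neighbors "executive" and
# "average" were seen as NN, the pooled counts give "editorial" some NN mass.
```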
Statistic pooling results: [charts: Baseline vs. +Pooling, on the full training set and on 300 training sentences]
Embedding structure hypothesis: The organization of the embedding space directly encodes useful features (e.g. transitivity, tense)
Embedding structure: in the embedding space, the verbs vanished, dined, vanishing, dining, devoured, assassinated, devouring, assassinating line up along directions interpretable as "transitivity" and "tense", so embedding coordinates can be added as features on lexical entries such as (VBD, dined) [Huang 2011] (a minimal sketch follows).
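A minimal sketch of the embedding-structure idea, assuming a feature-based (log-linear) lexicon over preterminal rules (lexical_features and the feature names are illustrative, not the parser's actual feature set): the coordinates of a word's embedding are exposed directly as real-valued features conjoined with the candidate tag, so directions like "tense" or "transitivity" can be picked up by the learned weights.

```python
def lexical_features(tag, word, embeddings, dim=50):
    """Feature map for the preterminal rule tag -> word: standard indicator
    features plus one real-valued feature per embedding dimension."""
    feats = {("BIAS", tag): 1.0, ("WORD", tag, word): 1.0}
    vec = embeddings.get(word)
    if vec is not None:
        for i in range(min(dim, len(vec))):
            # each embedding coordinate, conjoined with the proposed tag
            feats[("EMB", tag, i)] = float(vec[i])
    return feats

# e.g. lexical_features("VBD", "dined", embeddings) exposes "tense"-like and
# "transitivity"-like directions of the space to the learned weight vector.
```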
Embedding structure results: [charts: Baseline vs. +Features, on the full training set and on 300 training sentences]
To summarize: [summary chart, 300 training sentences]. Combined results: [charts: Baseline vs. +OOV +Pooling, on the full training set and on 300 training sentences]
What about… Domain adaptation? (no significant gain) French? (no significant gain) Other kinds of embeddings? (no significant gain)
Why didn't it work? Context clues often provide enough information to reason around words with incomplete or incorrect statistics; the parser already has robust OOV and small-count models; and sometimes "help" from the embeddings is worse than nothing (bad neighbor suggestions involving e.g. bifurcate, Soap, homered, Paschi, tuning, unrecognized).
What about other parsers? Dependency parsers (continuous repr. as syntactic abstraction); neural networks (continuous repr. as structural requirement) [Henderson 2004, Socher 2013, Koo 2008, Bansal 2014]
What didn't we try? Hard clustering (some evidence that this is useful for morphologically rich languages [Candito 09]); a nonlinear feature-based model; embeddings in higher constituents (e.g. in a CRF parser)
Conclusion: Embeddings provide no apparent benefit to a state-of-the-art parser for: – OOV handling – parameter pooling – lexicon features. Code online at