1
Lecture 23: More on Word Embeddings
Kai-Wei Chang, University of Virginia. Course webpage: CS6501-NLP
2
Recap: Representation Learning
How to represent words/phrases/sentences?
3
Auto-encoder (encoder and decoder)
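A minimal sketch of the idea, assuming PyTorch; the layer sizes and the random toy batch are illustrative, not the lecture's setup. The encoder compresses the input into a low-dimensional code, and the decoder reconstructs the input from that code:

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, input_dim=784, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU(),
                                     nn.Linear(128, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(),
                                     nn.Linear(128, input_dim))

    def forward(self, x):
        code = self.encoder(x)      # compressed representation
        return self.decoder(code)   # reconstruction of x

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 784)                        # toy batch standing in for real data
loss = nn.functional.mse_loss(model(x), x)     # reconstruction loss
loss.backward()
opt.step()
```

Training to minimize reconstruction error forces the code to capture the input's salient structure, which is why the code layer can serve as a learned representation.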
4
Sequence to sequence models [Sutskever, Vinyals & Le 14]
Have been shown effective in machine translation, image captioning, and many other structured tasks.
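A hedged sketch, assuming PyTorch, of the basic encoder-decoder recipe from [Sutskever, Vinyals & Le 14]: one LSTM compresses the source sequence into a fixed-size state, and a second LSTM generates the target conditioned on that state. Vocabulary sizes and the random batches below are placeholders:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=1000, tgt_vocab=1000, dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.decoder = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src, tgt):
        _, state = self.encoder(self.src_emb(src))      # summarize the source
        h, _ = self.decoder(self.tgt_emb(tgt), state)   # condition on its state
        return self.out(h)                              # logits per target step

model = Seq2Seq()
src = torch.randint(0, 1000, (8, 12))   # toy source batch
tgt = torch.randint(0, 1000, (8, 10))   # toy target batch (teacher forcing)
logits = model(src, tgt)                # shape: (8, 10, tgt_vocab)
```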
5
Structured prediction ⋂ Representation Learning
NLP problems are structured: output variables are inter-correlated, so we need joint predictions. Traditional approaches (a toy sketch of the first follows this list):
Graphical-model approaches, e.g., probabilistic graphical models, structured perceptron
Sequence-of-decisions approaches, e.g., incremental perceptron, learning to search (L2S), transition-based methods
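A hedged sketch of a structured perceptron for sequence labeling: Viterbi performs the joint prediction over inter-correlated tags, and the weights are updated toward the gold structure on mistakes. The feature set and two-sentence corpus are toy assumptions, not the lecture's setup:

```python
TAGS = ["N", "V"]

def features(words, tags):
    """Sparse indicator features: (word, tag) emissions, (prev, tag) transitions."""
    feats, prev = {}, "<s>"
    for word, tag in zip(words, tags):
        for key in [(word, tag), (prev, tag)]:
            feats[key] = feats.get(key, 0) + 1
        prev = tag
    return feats

def viterbi(words, w):
    """Exact search for the best joint tag sequence under weights w."""
    delta = {t: (w.get((words[0], t), 0.0) + w.get(("<s>", t), 0.0), [t])
             for t in TAGS}
    for word in words[1:]:
        new = {}
        for t in TAGS:
            prev = max(TAGS, key=lambda p: delta[p][0] + w.get((p, t), 0.0))
            s = delta[prev][0] + w.get((prev, t), 0.0) + w.get((word, t), 0.0)
            new[t] = (s, delta[prev][1] + [t])
        delta = new
    return max(delta.values())[1]

def train(data, epochs=10):
    w = {}
    for _ in range(epochs):
        for words, gold in data:
            pred = viterbi(words, w)
            if pred != gold:
                # promote gold features, demote predicted ones
                for k, v in features(words, gold).items():
                    w[k] = w.get(k, 0.0) + v
                for k, v in features(words, pred).items():
                    w[k] = w.get(k, 0.0) - v
    return w

data = [(["they", "fish"], ["N", "V"]),
        (["fish", "swim"], ["N", "V"])]
w = train(data)
print(viterbi(["they", "fish"], w))   # -> ['N', 'V']
```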
6
Recent trends: landscape of methods in Deep ⋂ Structure
Deep learning / hidden representations, e.g., seq2seq, RNNs
Deep features fed into factors, with traditional factor-graph inference, e.g., LSTM+CRF, graph transformer networks
Globally optimized transition-based approaches, e.g., beam-search seq2seq, SyntaxNet (a decoding sketch follows this list)
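A hedged illustration of beam-search decoding, the inference step behind "beam-search seq2seq" above. Here `step_logprobs` is a stand-in for any model that scores the next token given a prefix; a toy bigram table replaces the real decoder:

```python
def step_logprobs(prefix):
    """Toy next-token log-probabilities; a real system would call the decoder."""
    table = {"<s>": {"a": -0.4, "b": -1.1},
             "a":   {"b": -0.3, "</s>": -1.4},
             "b":   {"a": -0.9, "</s>": -0.6}}
    return table[prefix[-1]]

def beam_search(beam_size=2, max_len=5):
    beams = [(0.0, ["<s>"])]            # (log-prob, prefix)
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, prefix in beams:
            for tok, lp in step_logprobs(prefix).items():
                cand = (score + lp, prefix + [tok])
                (finished if tok == "</s>" else candidates).append(cand)
        # keep only the top `beam_size` unfinished hypotheses: global pruning,
        # in contrast to greedy one-step-at-a-time decoding
        beams = sorted(candidates, reverse=True)[:beam_size]
        if not beams:
            break
    return max(finished) if finished else max(beams)

print(beam_search())   # best-scoring hypothesis under the toy model
```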
7
How to represent words?
Treat words/n-grams as discrete tokens
Word clusters, e.g., Brown clustering
Lexical knowledge bases, e.g., WordNet, thesauri
Word embeddings: continuous representations (a minimal training sketch follows this list)
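A minimal usage sketch, assuming the gensim library, of training continuous word embeddings (word2vec) from tokenized text; the corpus and dimensions are toy placeholders:

```python
from gensim.models import Word2Vec

sentences = [["the", "bank", "approved", "the", "loan"],
             ["interest", "rates", "rose", "at", "the", "bank"],
             ["her", "interest", "lies", "in", "history"]]

# skip-gram (sg=1) with 50-dimensional vectors; min_count=1 only because
# the toy corpus is tiny
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

vec = model.wv["bank"]                        # 50-dim continuous vector
print(model.wv.most_similar("bank", topn=2))  # nearest neighbors in the space
```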
8
Research on word embeddings
Two directions: multi-sense word embeddings and multi-lingual word embeddings. The ambiguity motivating the multi-sense line: "I got high interest on my savings from the bank." vs. "My interest lies in History."
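A hedged sketch of one common multi-sense strategy (in the spirit of context-clustering approaches, not necessarily the lecture's method): represent each occurrence of an ambiguous word like "interest" by the average embedding of its context words, cluster the occurrences, and keep one vector per cluster as a sense embedding. The embeddings below are random stand-ins:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
emb = {w: rng.normal(size=10) for w in
       ["savings", "bank", "loan", "history", "music", "hobby"]}

# contexts in which the ambiguous word "interest" occurred
contexts = [["savings", "bank"], ["bank", "loan"],     # financial sense
            ["history", "hobby"], ["music", "hobby"]]  # personal sense

occ = np.array([np.mean([emb[w] for w in ctx], axis=0) for ctx in contexts])
senses = KMeans(n_clusters=2, n_init=10, random_state=0).fit(occ)

sense_vecs = senses.cluster_centers_   # one embedding per induced sense
print(senses.labels_)                  # which sense each occurrence uses
```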
9
Research on word embeddings
Multi-sense, multi-lingual word embeddings: "I got high interest on my savings from the bank." / 我從銀行的儲蓄得到高額利息 ("I got high interest from my savings at the bank.") "My interest lies in History." / 我的興趣在於歷史 ("My interest lies in history.")
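A hedged sketch of one standard way to obtain multi-lingual embeddings: learn an orthogonal map W from source-language vectors X to target-language vectors Y over a small translation dictionary (the Procrustes solution via SVD). The matrices below are random stand-ins for real monolingual embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # e.g., English vectors for dictionary words
Y = rng.normal(size=(100, 50))   # e.g., Chinese vectors for their translations

# argmin_W ||XW - Y||_F  subject to W orthogonal:
# with U S V^T = svd(X^T Y), the solution is W = U V^T
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

aligned = X @ W   # English vectors mapped into the Chinese embedding space
```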
10
Research on word embeddings
Encode lexical information in word embeddings
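A hedged sketch of one way to do this: retrofitting (in the spirit of Faruqui et al., 2015) pulls each word's vector toward the vectors of its neighbors in a lexicon such as WordNet, while staying close to the original distributional vector. The lexicon and vectors below are toy:

```python
import numpy as np

def retrofit(vectors, lexicon, alpha=1.0, beta=1.0, iters=10):
    new = {w: v.copy() for w, v in vectors.items()}
    for _ in range(iters):
        for w, nbrs in lexicon.items():
            nbrs = [n for n in nbrs if n in new]
            if not nbrs:
                continue
            # weighted average of the original vector and lexicon neighbors
            total = alpha * vectors[w] + beta * sum(new[n] for n in nbrs)
            new[w] = total / (alpha + beta * len(nbrs))
    return new

rng = np.random.default_rng(0)
vectors = {w: rng.normal(size=8) for w in ["happy", "glad", "joyful", "sad"]}
lexicon = {"happy": ["glad", "joyful"], "glad": ["happy"], "joyful": ["happy"]}
retrofitted = retrofit(vectors, lexicon)
```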