1
Word2Vec
2
- Introduction: What is word2vec?
- Motivation: Why word2vec?
- Word2vec models: Continuous Bag-of-Words and Skip-gram
- Demo
3
What is Word2Vec?
- Introduced by Google in 2013
- Computes vector representations of words
- Encodes word meanings and the relationships between words spatially
- Learns from input texts
4
Motivation
- Images and speech are easily represented as vectors. What about text?
- Word2vec learns word embeddings: it converts words into meaningful vectors.
- It trains a neural network with a single hidden layer on an auxiliary prediction task, but does not use the network's output; instead it keeps the learned weights, which serve as the vector representations.
5
Contextual Representation
I eat an apple every day.
I eat an orange every day.
I like driving my car to work.
Words that occur in similar contexts, like "apple" and "orange" above, end up with similar vectors; "car", which appears in a different context, does not.
6
Word vectors
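Word vectors are typically compared with cosine similarity. The 4-dimensional vectors below are invented purely for illustration (real word2vec embeddings usually have 100-300 dimensions), but they sketch how spatially encoded meaning can be measured:

```python
import numpy as np

# Toy vectors, invented for illustration only; real word2vec
# embeddings typically have 100-300 dimensions.
vectors = {
    "apple":  np.array([0.9, 0.1, 0.0, 0.3]),
    "orange": np.array([0.8, 0.2, 0.1, 0.3]),
    "car":    np.array([0.0, 0.9, 0.8, 0.1]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(vectors["apple"], vectors["orange"]))  # high
print(cosine_similarity(vectors["apple"], vectors["car"]))     # low
```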
7
Learning Algorithms
- Continuous Bag-of-Words (CBOW)
- Continuous Skip-gram
8
Continuous Bag-of-Words
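CBOW predicts a centre word from the words around it. Below is a minimal sketch of how the (context, target) training pairs could be built; the sentence and the window size of 2 are illustrative choices, not taken from the slides:

```python
# Build CBOW training pairs: surrounding window -> centre word.
sentence = "i eat an apple every day".split()
window = 2  # illustrative choice

pairs = []
for i, target in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    pairs.append((context, target))

for context, target in pairs:
    print(context, "->", target)
# e.g. ['i', 'eat', 'apple', 'every'] -> an
```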
10
Continuous Skip-gram
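Skip-gram inverts the task: the centre word is used to predict each surrounding word separately. A sketch of pair generation under the same assumed window size:

```python
# Build skip-gram training pairs: centre word -> each context word.
sentence = "i eat an apple every day".split()
window = 2  # illustrative choice

pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((center, sentence[j]))

print(pairs[:4])
# [('i', 'eat'), ('i', 'an'), ('eat', 'i'), ('eat', 'an')]
```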
12
Hidden Layer
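Because the input word is a one-hot vector and the hidden layer has no activation function, multiplying the input by the input weight matrix simply selects one row of that matrix. A small numpy sketch with illustrative sizes:

```python
import numpy as np

vocab_size, embedding_dim = 6, 3           # tiny, illustrative sizes
rng = np.random.default_rng(0)
W_in = rng.normal(size=(vocab_size, embedding_dim))  # input -> hidden weights

word_index = 3
one_hot = np.zeros(vocab_size)
one_hot[word_index] = 1.0

hidden = one_hot @ W_in                       # the hidden-layer activation
print(np.allclose(hidden, W_in[word_index]))  # True: just a row lookup
```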
13
How do we get word vectors?
After training, the network's output is discarded: each row of the input weight matrix is kept as the vector for the corresponding word.
14
Output layer
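The output layer scores every word in the vocabulary from the hidden activation and normalises the scores into probabilities with a softmax. A sketch continuing the illustrative sizes above:

```python
import numpy as np

vocab_size, embedding_dim = 6, 3
rng = np.random.default_rng(1)
W_out = rng.normal(size=(embedding_dim, vocab_size))  # hidden -> output weights
hidden = rng.normal(size=embedding_dim)               # some word's hidden vector

scores = hidden @ W_out                        # one score per vocabulary word
probs = np.exp(scores) / np.exp(scores).sum()  # softmax
print(probs, probs.sum())                      # probabilities summing to 1.0
```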
15
How is this different?
17
Demo
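The deck does not include the demo code itself. As a stand-in, here is a minimal sketch using the gensim library on the toy sentences from slide 5; all parameter settings are illustrative:

```python
from gensim.models import Word2Vec

sentences = [
    ["i", "eat", "an", "apple", "every", "day"],
    ["i", "eat", "an", "orange", "every", "day"],
    ["i", "like", "driving", "my", "car", "to", "work"],
]

# sg=0 selects CBOW, sg=1 selects skip-gram; tiny settings for a toy corpus.
model = Word2Vec(sentences, vector_size=10, window=2, min_count=1, sg=1, epochs=200)

print(model.wv["apple"])                       # the learned vector
print(model.wv.similarity("apple", "orange"))  # should be relatively high
print(model.wv.most_similar("apple", topn=3))
```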