1
Learning Emoji Embeddings Using Emoji Co-Occurrence Network Graph
1st International Workshop on Emoji Understanding and Applications in Social Media, Stanford University, California, 25 June 2018. Presenting author: Anurag Illendula, Indian Institute of Technology (IIT) Kharagpur. Co-author: Manish Reddy Yedulla, Indian Institute of Technology (IIT) Hyderabad. Good afternoon. I am Anurag Illendula, a fourth-year undergraduate student at the Indian Institute of Technology Kharagpur, India. I am here to present my work on Learning Emoji Embeddings Using Emoji Co-Occurrence Network Graph.
2
Why Emoji Embeddings Every deep learning model requires us to represent each entity in numbers. Why do we need to learn emoji embeddings? Given how heavily emojis are used on social media platforms, understanding emojis in different contexts has become an interesting research area. Every machine learning or deep learning model requires us to represent each entity in numbers. Illendula, Anurag et al. Learning Emoji Embeddings using Emoji Co-Occurrence Network Graph
3
Why Emoji Embeddings Every deep learning model requires us to represent each entity in numbers. Emoji embeddings have increased the accuracies of many emoji understanding tasks. It has already been shown that emoji embeddings improve accuracy on emoji understanding tasks, including emoji similarity, emoji sense disambiguation, and sentiment analysis.
4
Why Emoji Embeddings Every deep learning model requires us to represent each entity in numbers. Emoji embeddings have increased the accuracies of many emoji understanding tasks: emoji similarity, emoji sense disambiguation, and sentiment analysis.
5
Why Emoji Co-Occurrence
Emoji co-occurrence helps us understand the context in which multiple emojis are used in social media posts. Embedding models such as skip-gram and CBOW are generally used to learn representations of entities in a finite-dimensional vector space, but we observed that emoji co-occurrence can also help us learn emoji representations, and that it helps us understand the context in which multiple emojis are used. For example:
6
Why Emoji Co-Occurrence
Emoji co-occurrence helps us understand the context in which multiple emojis are used in social media posts. For example: in the first tweet, the face-with-thermometer emoji and the spiral-notepad emoji co-occurred, and we can infer that the tweet describes a user visiting a doctor for treatment. In the second tweet, the skull emoji and the gun emoji co-occurred, and the tweet describes a user coming across a murder. ( ) ( )
7
Why Emoji Co-Occurrence
How does emoji co-occurrence help us learn Emoji Embeddings? You may now wonder how this feature can help us learn emoji embeddings.
8
Why Emoji Co-Occurrence
How does emoji co-occurrence help us learn Emoji Embeddings? Consider the tweet: “I got betrayed by a , I want to kill you ”
9
Why Emoji Co-Occurrence
How does emoji co-occurrence help us learn Emoji Embeddings? “I got betrayed by a , I want to kill you ” Here both emojis ( , ) carry the same sentiment as the overall sentiment of the tweet, which is negative. We observe that emojis which co-occur with other emojis carry the same sentiment. Hence our hypothesis: emojis which co-occur in a tweet carry the same sentiment as the overall sentiment of the tweet.
10
Emoji Co-Occurrence Network
We create the emoji network using 14.3 million tweets, each of which contains multiple emojis. Each tweet generates a polygon of n edges, where n is the number of emojis embedded in the tweet; each node represents one emoji, and each edge weight represents the number of co-occurrences between the connected emojis.
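As an illustration of this construction, here is a minimal sketch in plain Python. The tweet data and emoji strings are placeholders, and the sketch connects every co-occurring pair of emojis, which matches the polygon construction for tweets with up to three emojis; how the paper handles larger emoji sets is an assumption here.

```python
from itertools import combinations
from collections import defaultdict

def build_cooccurrence_network(tweets_emojis):
    """Build a weighted, undirected emoji co-occurrence graph:
    one node per emoji, edge weight = number of co-occurrences."""
    graph = defaultdict(lambda: defaultdict(int))
    for emojis in tweets_emojis:
        # Each tweet contributes an edge for every pair of distinct emojis.
        for a, b in combinations(sorted(set(emojis)), 2):
            graph[a][b] += 1
            graph[b][a] += 1
    return graph

# Toy input: two "tweets" given as lists of their embedded emojis.
g = build_cooccurrence_network([["😔", "🔪"], ["😔", "🔪", "💀"]])
# The 😔/🔪 pair appears in both tweets, so that edge has weight 2.
```

Merging all the per-tweet polygons simply means accumulating these weighted edges into one shared graph, as the next slides illustrate.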
11
Emoji Co-Occurrence Network
EMOJI POLYGONS “I got betrayed by a ” “I would die for ”
12
Emoji Co-Occurrence Network
MORE EXAMPLES
13
Emoji Co-Occurrence Network
By combining all the emoji polygons.
14
Emoji Co-Occurrence Network
By combining all the emoji polygons.
15
Emoji Co-Occurrence Network
By combining all the emoji polygons, we obtain the full co-occurrence network. NETWORK EMBEDDING MODEL We input this emoji co-occurrence network graph into our graph embedding model to learn 300-dimensional emoji embeddings. Emoji Embeddings: 300 dimensions
16
Network Embedding Model
17
Model Parameters We define two types of measures which signify the proximity between nodes of the network.
18
Model Parameters We define two types of measures which signify the proximity between nodes of the network. First Order Proximity: the direct, local pairwise proximity, which can be related to the weight of the edge joining two emoji nodes.
19
Model Parameters We define two types of measures which signify the proximity between nodes of the network. First Order Proximity: the local pairwise proximity, which can be related to the weight of the edge joining two emoji nodes. Second Order Proximity: the similarity between the neighbourhood network structures of two nodes.
20
First Order Proximity Let ui, uj represent the emoji embeddings. The joint probability which signifies the first-order proximity between emoji nodes vi, vj is given by p1(vi, vj). We first randomly initialize an embedding for each emoji node and subsequently learn the emoji representations using optimization.
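The equation on this slide did not survive extraction. In a LINE-style formulation consistent with these slides, the joint probability is the sigmoid of the dot product of the two node embeddings; the sketch below assumes that form, with toy vectors in place of the learned 300-dimensional embeddings.

```python
import math

def first_order_proximity(u_i, u_j):
    """Joint probability p1(vi, vj) = 1 / (1 + exp(-ui · uj)),
    i.e. the sigmoid of the dot product of the two embeddings
    (assumed LINE-style form; the slide's equation image is lost)."""
    dot = sum(a * b for a, b in zip(u_i, u_j))
    return 1.0 / (1.0 + math.exp(-dot))

# Orthogonal embeddings give probability 0.5; aligned ones approach 1.
p = first_order_proximity([1.0, 0.0], [0.0, 1.0])
# p == 0.5
```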
21
First Order Proximity Let ui, uj represent the emoji embeddings. The joint probability which signifies the first-order proximity between emoji nodes vi, vj is given by p1(vi, vj). The empirical probability between the vertices vi, vj is defined from the edge weights between emoji node pairs: the weight of the edge (vi, vj) divided by the total edge weight of the network.
22
First Order Proximity Let ui, uj represent the emoji embeddings. The joint probability which signifies the first-order proximity between emoji nodes vi, vj is given by p1(vi, vj), and the empirical probability between the vertices vi, vj is the normalized edge weight. From these we obtain the objective function for this case. We then minimize the distance between the two probability distributions.
23
First Order Proximity Empirical probability example: a four-node network with edge weights w(1,2) = 3, w(1,4) = 4, w(2,3) = 6, w(2,4) = 7, w(3,4) = 5 (total edge weight 25). The empirical probabilities are P(1,2) = 3/25, P(1,3) = 0, P(1,4) = 4/25, P(2,3) = 6/25, P(2,4) = 7/25, P(3,4) = 5/25.
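The empirical probabilities in the slide's four-node example can be reproduced with a few lines of Python. Node labels are the slide's 1 through 4; the emoji identities behind them were lost in extraction.

```python
def empirical_probabilities(edges):
    """Empirical first-order distribution: P(i, j) = w_ij / W,
    where W is the total edge weight of the network."""
    total = sum(edges.values())
    return {pair: w / total for pair, w in edges.items()}

# Edge weights of the slide's four-node example; total weight is 25.
edges = {(1, 2): 3, (1, 4): 4, (2, 3): 6, (2, 4): 7, (3, 4): 5}
probs = empirical_probabilities(edges)
# probs[(1, 2)] == 3 / 25; P(1, 3) is 0 because that edge is absent.
```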
24
First Order Proximity The same four-node empirical probability network, after optimizing the distance between the empirical and model probability distributions.
25
Second Order Proximity
The second-order proximity of two nodes (vi, vj) measures the similarity of the neighbourhood network structures of the respective nodes (referred to as the “context”).
26
Second Order Proximity
The second-order proximity of two nodes (vi, vj) measures the similarity of the neighbourhood network structures of the respective nodes (referred to as the “context”). The empirical probability between the vertices vi, vj is given by
27
Second Order Proximity
The second-order proximity of two nodes (vi, vj) measures the similarity of the neighbourhood network structures of the respective nodes (referred to as the “context”). The empirical probability between the vertices vi, vj is defined as before. In the objective function for this case, λi is the prestige of vertex i, which is measured by PageRank.
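The equations here were also lost in extraction. In LINE-style models, the second-order probability p2(vj | vi) is a softmax of dot products between the node embedding and the context embeddings; the sketch below assumes that form, with toy vectors.

```python
import math

def second_order_proximity(u_i, context_embeddings, j):
    """Conditional probability p2(vj | vi): softmax over the dot
    products of node embedding u_i with every context embedding
    (assumed LINE-style form; the slide's equation image is lost)."""
    scores = [sum(a * b for a, b in zip(u_k, u_i))
              for u_k in context_embeddings]
    exps = [math.exp(s) for s in scores]
    return exps[j] / sum(exps)

# With two identical context embeddings, the distribution is uniform.
p = second_order_proximity([1.0, 1.0], [[0.5, 0.5], [0.5, 0.5]], 0)
# p == 0.5
```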
28
Model Optimization Further, we use KL-divergence to optimize the distance between the probability distributions.
29
Model Optimization Further, we use KL-divergence to optimize the distance between the probability distributions. First Order Proximity
30
Model Optimization Further, we use KL-divergence to optimize the distance between the probability distributions. First Order Proximity Second Order Proximity
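As a concrete reference, KL-divergence between an empirical and a model distribution can be computed as below; the distributions here are toy values, not the paper's data.

```python
import math

def kl_divergence(p_empirical, p_model):
    """D_KL(p_hat || p) = sum of p_hat * log(p_hat / p),
    taken over entries where the empirical probability is nonzero."""
    return sum(p * math.log(p / q)
               for p, q in zip(p_empirical, p_model) if p > 0)

# Identical distributions have zero divergence; mismatches are penalized.
d_zero = kl_divergence([0.5, 0.5], [0.5, 0.5])
d_pos = kl_divergence([0.9, 0.1], [0.5, 0.5])
# d_zero == 0.0 and d_pos > 0
```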
31
Model Optimization Further we use KL-Divergence for optimization.
First Order Proximity Second Order Proximity The model is trained using the RMSProp gradient descent algorithm, with learning rate as and batch size 128, using the TensorFlow library on a CUDA GPU.
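A pure-Python sketch of one RMSProp update follows. The slide's learning-rate value is missing, so the 0.025 below is a placeholder, as are the toy parameter and gradient; the paper's actual training uses TensorFlow.

```python
def rmsprop_step(params, grads, cache, lr=0.025, decay=0.9, eps=1e-8):
    """One RMSProp update: divide each gradient by a running
    root-mean-square of its recent magnitudes, then step."""
    new_params, new_cache = [], []
    for p, g, c in zip(params, grads, cache):
        c = decay * c + (1 - decay) * g * g
        new_params.append(p - lr * g / (c ** 0.5 + eps))
        new_cache.append(c)
    return new_params, new_cache

# One step on a single toy parameter with gradient 0.5.
params, cache = rmsprop_step([1.0], [0.5], [0.0])
# The parameter moves against the gradient; the cache is now positive.
```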
32
Experiments Sentiment Analysis Task:
How can emoji co-occurrence help in the sentiment analysis task? “I got betrayed by a , I want to kill you ”
33
Experiments Sentiment Analysis Task:
How can emoji co-occurrence help in the sentiment analysis task? “I got betrayed by a , I want to kill you ” Here both emojis ( , ) carry the same sentiment as the overall sentiment of the tweet, which is negative.
34
Experiments Sentiment Analysis Task:
How can emoji co-occurrence help in the sentiment analysis task? “I got betrayed by a , I want to kill you ” Here both emojis ( , ) carry the same sentiment as the overall sentiment of the tweet. Hence we evaluate our emoji embeddings on the sentiment analysis task.
35
Experiments Sentiment Analysis Task:
We report our accuracies on the gold-standard dataset developed by Novak et al.
36
Experiments Sentiment Analysis Task:
We report our accuracies on the gold-standard dataset developed by Novak et al. We use FastText word embeddings pre-trained on the Wikipedia corpus to embed words into a low-dimensional space.
37
Experiments Sentiment Analysis Tasks:
We calculate the bag-of-words vector for each tweet and then use this vector as a feature to train an SVM and a random forest model.
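One common way to realize such a tweet-level feature vector is to average the pre-trained embeddings of the tweet's tokens; the sketch below assumes that reading of the slide's "bag of words vector", with hypothetical 4-dimensional embeddings standing in for the real 300-dimensional ones. The resulting vector is what would be fed to the SVM or random forest.

```python
def tweet_feature_vector(tokens, embeddings, dim=4):
    """Average the embeddings of a tweet's tokens (words and emojis)
    into a single fixed-length feature vector; unknown tokens are
    skipped, and an all-zero vector is returned if none are known."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

# Hypothetical 4-dimensional embeddings standing in for the 300-d ones.
emb = {"betrayed": [1.0, 0.0, 0.0, 0.0], "😔": [0.0, 1.0, 0.0, 0.0]}
features = tweet_feature_vector(["betrayed", "😔", "oov"], emb)
# features == [0.5, 0.5, 0.0, 0.0]
```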
38
Experiments Sentiment Analysis Tasks:
We calculate the bag-of-words vector for each tweet and then use this vector as a feature to train an SVM and a random forest model.

Word Embeddings          | Accuracy (RF) | Accuracy (SVM)
State-of-the-art results | 60.7          | 63.6
First Order Embeddings   | 62.1          | 65.2
Second Order Embeddings  | 58.7          | 61.9
39
Experiments 2. Emoji Similarity Task:
How does emoji co-occurrence help in the emoji similarity task?
40
Experiments 2. Emoji Similarity Task:
How does emoji co-occurrence help in the emoji similarity task? We observed that the ( , ) and ( , ) emoji pairs co-occurred most often.
41
Experiments 2. Emoji Similarity Task:
How does emoji co-occurrence help in the emoji similarity task? We observed that the ( , ) and ( , ) emoji pairs co-occurred most often. Hence we evaluate our emoji embeddings on the emoji similarity task.
42
Experiments 2. Emoji Similarity Task:
The similarity measure between two different emojis is defined as the cosine similarity between their embeddings. We compare our similarity values with the similarity measures obtained using the semantic embeddings (SEMANTIC SIMILARITY). Similarity values range over (0, 1).
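Cosine similarity itself is straightforward to sketch in pure Python, with toy 2-dimensional vectors in place of the 300-dimensional embeddings. (In general cosine similarity lies in [-1, 1]; the slide's (0, 1) range reflects the values observed for these embeddings.)

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Vectors pointing the same way score ~1.0; orthogonal ones score 0.0.
sim = cosine_similarity([1.0, 2.0], [2.0, 4.0])
```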
43
Experiments Emoji Similarity Tasks: Emoji similarity measured using first-order embeddings
44
Experiments Emoji Similarity Tasks: Emoji similarity measured using second-order embeddings
45
Experiments Emoji To Emoji Analogy Tasks:
We extrapolate the analogical reasoning task to the context of emojis by replacing words with emojis.
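An emoji-to-emoji analogy a : b :: c : ? can be answered by taking the nearest cosine neighbour of b - a + c, as in word analogy tasks; a sketch with hypothetical toy embeddings (the emoji choices and vectors below are illustrative only, not the paper's results):

```python
import math

def analogy(a, b, c, embeddings):
    """Answer a : b :: c : ? by the nearest cosine neighbour of
    (b - a + c), excluding the three query emojis themselves."""
    target = [eb - ea + ec for ea, eb, ec in
              zip(embeddings[a], embeddings[b], embeddings[c])]

    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        return dot / (math.sqrt(sum(x * x for x in u))
                      * math.sqrt(sum(y * y for y in v)) + 1e-12)

    candidates = {k: v for k, v in embeddings.items() if k not in (a, b, c)}
    return max(candidates, key=lambda k: cos(candidates[k], target))

# Hypothetical toy embeddings: man : king :: woman : queen, as emojis.
emb = {"🤴": [1.0, 1.0], "👨": [1.0, 0.0],
       "👩": [0.0, 0.2], "👸": [0.1, 1.0], "🍕": [1.0, -1.0]}
result = analogy("👨", "🤴", "👩", emb)
# result == "👸"
```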
46
Experiments Emoji To Emoji Analogy Tasks:
We extrapolate the analogical reasoning task to the context of emojis by replacing words with emojis.
47
Thank You