INTERSUBJECTIVITY AND SENTIMENT: FROM LANGUAGE TO KNOWLEDGE
Lin Gui (Harbin Institute of Technology), Ruifeng Xu (Harbin Institute of Technology), Yulan He (Aston University), Qin Lu (Hong Kong PolyU), Zhongyu Wei (The University of Texas at Dallas)
Outline
Background
Network Embedding with Intersubjectivity
Sentiment with Intersubjectivity
Experiment
Background
The history of language
The gap between language and knowledge (Dunbar and Dunbar, 1998)
Subjectivity (e.g., river, lion)
Intersubjectivity (e.g., nation, country)
Background
Intersubjectivity suggests that the meaning of a word or phrase is not encoded in its surface form as a mapping from a term to an object or a subject; rather, it is a conceptualization commonly accepted by the society that shares the same language.
Our Approach
We propose an intersubjectivity-based sentiment classification method:
1. Construct an intersubjectivity network
2. Learn vertex representations with network embedding
3. Perform CNN-based sentiment classification on top of the embedding results
Construct an intersubjectivity network
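The construction itself is shown only as a figure on the original slides. Below is a minimal sketch, assuming a bipartite graph whose vertices are review authors and the subjective terms they use, with edge weights given by usage counts; the `reviews` input format and the use of networkx are illustrative assumptions, not the paper's exact procedure.

```python
# Hypothetical sketch: build a bipartite intersubjectivity network linking
# authors to the subjective terms they use; edge weight = usage count.
from collections import Counter
import networkx as nx

# Toy input: (author, subjective terms extracted from that author's reviews).
reviews = [
    ("author_1", ["great service", "friendly staff", "great service"]),
    ("author_2", ["great service", "terrible food"]),
]

graph = nx.Graph()
for author, terms in reviews:
    for term, count in Counter(terms).items():
        # Accumulate the weight if this author-term edge already exists.
        weight = graph.get_edge_data(author, term, {}).get("weight", 0) + count
        graph.add_edge(author, term, weight=weight)

print(graph.edges(data=True))
```

Under this construction, two authors who use similar subjective terms become close in the network through their shared term neighbours, which is the property the embedding step below relies on.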
Learning vertex representation
How to define the conditional probability: two author vertices that share similar subjective terms should have a high conditional probability. The objective function is:
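The formula itself appears only as an image on the slide. A plausible reconstruction, assuming a softmax conditional probability over vertex embeddings in the style of LINE-type network embedding (the vectors u_i, the context vectors u'_j, and the edge weights w_ij are assumed notation, not copied from the slide):

```latex
% Assumed LINE-style formulation (reconstruction, not taken from the slide):
p(v_j \mid v_i) = \frac{\exp\left( \vec{u}_j^{\,\prime \top} \vec{u}_i \right)}
                       {\sum_{k=1}^{|V|} \exp\left( \vec{u}_k^{\,\prime \top} \vec{u}_i \right)},
\qquad
O = -\sum_{(i,j) \in E} w_{ij} \, \log p\!\left(v_j \mid v_i\right)
```

Here \vec{u}_i is the embedding of vertex v_i, \vec{u}'_j is the context embedding of vertex v_j, and w_{ij} is the edge weight; in practice the normalization over all |V| vertices is typically approximated with negative sampling.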
A CNN-based sentiment classification
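The network architecture is shown only as a figure on the slide. A minimal sketch of a convolutional text classifier in PyTorch follows, assuming the embedding layer is initialised with the intersubjectivity embeddings learned in the previous step; the kernel sizes, filter counts, and num_classes=5 (e.g., 5-star review ratings) are illustrative assumptions rather than the exact model in the paper.

```python
# Hypothetical sketch: CNN text classifier with multiple kernel sizes
# and max-over-time pooling, following the standard convolutional
# sentence-classification design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_classes=5,
                 kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        # This embedding layer can be initialised with the vertex
        # embeddings learned from the intersubjectivity network.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Convolution + ReLU + max-over-time pooling for each kernel size.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))       # (batch, num_classes)

# Usage example with random token ids: a batch of 8 reviews, 50 tokens each.
model = TextCNN(vocab_size=20000)
logits = model(torch.randint(0, 20000, (8, 50)))
```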
Experiment The distribution of the data set
Experiment
The referenced methods:
Paragraph Vector for document modeling
Recursive Neural Tensor Network
Convolutional Neural Network
Jointly Modeling Aspects, Ratings and Sentiments
User Product Neural Network
The metrics:
Experiment The performance of the referenced methods
Experiment The comparison of performance with and without author modeling
Experiment The top 5 most positive/negative authors in the embedding results
Experiment The distribution of terms and authors in the embedding space
Further Discussion Bigram or Unigram? Word embedding?
Thanks!