Improving Knowledge Graph Embedding Using Simple Constraints
Boyang Ding, Quan Wang, Bin Wang, Li Guo
Institute of Information Engineering, Chinese Academy of Sciences; School of Cyber Security, University of Chinese Academy of Sciences; State Key Laboratory of Information Security, Chinese Academy of Sciences
ACL '18
Knowledge Graph Embedding
Scoring Model
Knowledge Graph Embedding
Deep Neural Network: R-GCN, ConvE
Extra Information: Entity Types (TKRL), Relation Paths (PTransE), Textual Descriptions (DKRL, TEKE), Logical Rules (KALE, RUGE, ComplEx-R)
KG Embedding with Constraints
Non-Negativity Constraints: keep entity vectors within the hypercube $[0,1]^d$, promoting sparsity and interpretability
Approximate Entailment Constraints: encode entailment between relations directly into their vectors, improving effectiveness
Basic Embedding Model: ComplEx
Problem set: triples $(e_i, r_k, e_j)$ over entities $e_i, e_j \in \mathcal{E}$ and relations $r_k \in \mathcal{R}$
Representation: each entity and relation is a complex vector $\mathbf{e}, \mathbf{r} \in \mathbb{C}^d$
Score function: $\phi(e_i, r_k, e_j) = \mathrm{Re}(\langle \mathbf{e}_i, \mathbf{r}_k, \bar{\mathbf{e}}_j \rangle)$, the real part of the trilinear product with the conjugated object embedding
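For concreteness, a minimal NumPy sketch of this score function (a toy illustration with made-up embeddings, not the authors' code):

```python
import numpy as np

def complex_score(e_i, r_k, e_j):
    """ComplEx score: Re(<e_i, r_k, conj(e_j)>).

    e_i, r_k, e_j are complex-valued embedding vectors of shape (d,).
    A higher score means the triple (e_i, r_k, e_j) is more plausible.
    """
    return np.real(np.sum(e_i * r_k * np.conj(e_j)))

# toy usage with random d = 4 embeddings
rng = np.random.default_rng(0)
d = 4
e_i = rng.random(d) + 1j * rng.random(d)
r_k = rng.random(d) + 1j * rng.random(d)
e_j = rng.random(d) + 1j * rng.random(d)
print(complex_score(e_i, r_k, e_j))
```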
Non-Negativity of Entity Representations
Simple constraint: $0 \le \mathrm{Re}(\mathbf{e}), \mathrm{Im}(\mathbf{e}) \le 1$ entrywise for every entity, i.e., both the real and the imaginary part of each entity vector are non-negative and bounded
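One simple way to maintain such a box constraint during gradient-based training is to clip the entity embeddings back into $[0,1]$ after each update; a sketch, assuming the real and imaginary parts are stored as separate arrays (the authors' exact optimization procedure may differ):

```python
import numpy as np

def project_to_unit_box(re_part, im_part):
    """Project real and imaginary parts of entity embeddings into [0, 1].

    Applied after each gradient update so that the box constraint
    0 <= Re(e), Im(e) <= 1 always holds. Both arrays have shape
    (n_entities, d) and are clipped in place.
    """
    np.clip(re_part, 0.0, 1.0, out=re_part)
    np.clip(im_part, 0.0, 1.0, out=im_part)
    return re_part, im_part
```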
Approximate Entailment for Relations
An entailment $r_p \Rightarrow r_q$ that holds with confidence $\lambda \in (0,1]$, e.g., BornInCountry approximately entails Nationality: a person born in a country usually, but not always, has that nationality
Approximate Entailment for Relations
With non-negative entity representations, exact entailment $r_p \Rightarrow r_q$ can be guaranteed by requiring $\mathrm{Re}(\mathbf{r}_p) \le \mathrm{Re}(\mathbf{r}_q)$ and $\mathrm{Im}(\mathbf{r}_p) = \mathrm{Im}(\mathbf{r}_q)$ (entrywise)
For approximate entailment with confidence $\lambda$, relax these conditions with a non-negative slack vector $\boldsymbol{\alpha}$: $\lambda\,(\mathrm{Re}(\mathbf{r}_p) - \mathrm{Re}(\mathbf{r}_q)) \preceq \boldsymbol{\alpha}$ and $\lambda\,(\mathrm{Im}(\mathbf{r}_p) - \mathrm{Im}(\mathbf{r}_q))^2 \preceq \boldsymbol{\alpha}$
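These slack-based constraints can be charged to the objective as soft penalties: a hinge penalty on $\mathrm{Re}(\mathbf{r}_p)$ exceeding $\mathrm{Re}(\mathbf{r}_q)$ and a squared penalty on differing imaginary parts, both scaled by the confidence $\lambda$. A sketch of such a penalty term (illustrative; `entailment_penalty` is a made-up name, not the authors' code):

```python
import numpy as np

def entailment_penalty(r_p, r_q, lam):
    """Penalty encouraging r_p to (approximately) entail r_q.

    r_p, r_q: complex relation embeddings of shape (d,)
    lam: confidence of the mined rule r_p => r_q
    Penalizes Re(r_p) exceeding Re(r_q) (hinge, summed like an L1 norm)
    and any difference between the imaginary parts (squared L2),
    both weighted by the rule confidence.
    """
    re_term = np.sum(np.maximum(np.real(r_p) - np.real(r_q), 0.0))
    im_term = np.sum((np.imag(r_p) - np.imag(r_q)) ** 2)
    return lam * (re_term + im_term)
```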
Overall Model
ComplEx loss + slack penalty + L2 regularization, minimized subject to the approximate entailment constraints and the non-negativity constraints
Rewrite: eliminate the slack variables and fold the constraints into the objective as weighted penalty terms
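A sketch of the resulting objective, reconstructed from the components named on the previous slide (logistic ComplEx loss, confidence-weighted entailment penalties, L2 regularization, and the entity box constraint); this is a paraphrase, not a verbatim quote of the paper:

$$\min_{\Theta}\ \sum_{\text{triples}} \log\big(1 + e^{-y\,\phi(e_i, r_k, e_j)}\big) + \mu \sum_{r_p \Rightarrow r_q} \lambda\, \big\lVert [\mathrm{Re}(\mathbf{r}_p) - \mathrm{Re}(\mathbf{r}_q)]_+ \big\rVert_1 + \mu \sum_{r_p \Rightarrow r_q} \lambda\, \big\lVert \mathrm{Im}(\mathbf{r}_p) - \mathrm{Im}(\mathbf{r}_q) \big\rVert_2^2 + \eta \lVert \Theta \rVert_2^2 \quad \text{s.t. } 0 \le \mathrm{Re}(\mathbf{e}), \mathrm{Im}(\mathbf{e}) \le 1$$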
Experiment 1: Link Prediction
Datasets: WN18 (from WordNet), FB15K (from Freebase), DB100K (newly created from the core of DBpedia)
Entailment mining: AMIE+ run on the training set
Link Prediction (results tables)
Experiment 2: Entity Representation Analysis
Compact and Interpretable Representations
Entity Representation Analysis
Semantic Purity
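One plausible way to quantify per-dimension semantic purity is to take, for each dimension, the entities with the largest value on that dimension and measure how dominated they are by a single entity type. A sketch under that assumption (the paper's exact protocol may differ; `entity_types` and `top_k` are hypothetical inputs):

```python
import numpy as np

def dimension_purity(embeddings, entity_types, top_k=50):
    """Illustrative per-dimension semantic purity.

    embeddings: (n_entities, d) non-negative entity matrix
    entity_types: sequence of length n_entities, one type label per entity
    Returns an array of d scores in (0, 1]: the fraction of the top_k
    entities on each dimension that share the most frequent type.
    """
    entity_types = np.asarray(entity_types)
    n, d = embeddings.shape
    purities = np.empty(d)
    for dim in range(d):
        top = np.argsort(embeddings[:, dim])[-top_k:]
        _, counts = np.unique(entity_types[top], return_counts=True)
        purities[dim] = counts.max() / len(top)
    return purities
```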
Experiment 3: Relation Representations
Contributions
- Constraints are imposed directly on entity and relation representations without grounding, making the method easy to scale to large KGs
- Constraints are derived automatically from statistical properties of the data, so they are universal, need no manual effort, and apply to almost all KGs
- Each entity keeps an individual representation, enabling successful predictions between unpaired entities
Supplement about Embeddings
- SGNS (skip-gram with negative sampling) vectors are arranged along a primary axis and are mostly non-negative (Mimno and Thompson, EMNLP '17)
- Multiplicative entity embeddings become more compact (conicity increases and average vector length decreases) as the number of negative samples grows (Chandrahas et al., ACL '18)
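For reference, the two geometry statistics mentioned above can be computed as follows (a small sketch following the standard definitions from Chandrahas et al.; variable names are illustrative):

```python
import numpy as np

def conicity(vectors):
    """Conicity: mean cosine similarity between each vector and the
    mean vector of the set. High conicity means the vectors are
    concentrated in a narrow cone.

    vectors: array of shape (n, d)
    """
    mean_vec = vectors.mean(axis=0)
    cos = vectors @ mean_vec / (
        np.linalg.norm(vectors, axis=1) * np.linalg.norm(mean_vec)
    )
    return cos.mean()

def average_vector_length(vectors):
    """Mean L2 norm of the embedding vectors."""
    return np.linalg.norm(vectors, axis=1).mean()
```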