Knowledge graphs (KGs) typically contain temporal facts indicating relationships among entities at different times. Because KGs are incomplete, several approaches have been proposed to infer new facts from the existing ones — a problem known as KG completion. KG embedding approaches have proved effective for KG completion; however, they have been developed mostly for static KGs. Developing temporal KG embedding models is an increasingly important problem. In this paper, we build novel models for temporal KG completion by equipping static models with a diachronic entity embedding function, which provides the characteristics of entities at any point in time. This is in contrast to existing temporal KG embedding approaches, where only static entity features are provided. The proposed embedding function is model-agnostic and can potentially be combined with any static model. We prove that combining it with SimplE, a recent model for static KG embedding, results in a fully expressive model for temporal KG completion. Our experiments indicate the superiority of our proposal compared to existing baselines.
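To make the idea concrete, here is a minimal sketch of a diachronic entity embedding in the spirit of the paper: a fraction of the embedding dimensions are time-dependent (sine-activated functions of the timestamp), while the remaining dimensions stay static. The function name, the `gamma` parameter, and the per-entity vectors `a`, `w`, `b` are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def diachronic_embedding(a, w, b, t, gamma=0.5):
    """Illustrative sketch of a diachronic entity embedding.

    a : base amplitudes (one vector per entity)
    w, b : frequencies and phase shifts for the temporal dimensions
    t : timestamp at which the entity is embedded
    gamma : fraction of dimensions that are time-dependent (assumed name)

    The first gamma*d dimensions vary with time via a sine activation;
    the remaining dimensions are static entity features.
    """
    d = a.shape[0]
    n_temporal = int(gamma * d)
    z = a.copy()
    # Temporal features: amplitude * sin(frequency * t + phase)
    z[:n_temporal] = a[:n_temporal] * np.sin(w[:n_temporal] * t + b[:n_temporal])
    # Dimensions n_temporal..d are left as static features a[n_temporal:]
    return z

# Usage: embed the same entity at two timestamps; the temporal
# dimensions change while the static ones do not.
rng = np.random.default_rng(0)
a, w, b = rng.normal(size=(3, 8))
z_2014 = diachronic_embedding(a, w, b, t=2014.0)
z_2017 = diachronic_embedding(a, w, b, t=2017.0)
```

Such a time-conditioned entity vector can then be plugged into the scoring function of a static model (e.g., SimplE) in place of the usual static embedding, which is what makes the construction model-agnostic.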
View the code on GitHub.
Bibtex
@inproceedings{goel2020diachronic,
  title={Diachronic Embedding for Temporal Knowledge Graph Completion},
  author={Goel, Rishab and Kazemi, Seyed Mehran and Brubaker, Marcus and Poupart, Pascal},
  booktitle={Thirty-Fourth AAAI Conference on Artificial Intelligence},
  year={2020}
}
Related Research
- Stay Positive: Knowledge Graph Embedding Without Negative Sampling. A. Hajimoradlou and S. M. Kazemi. International Conference on Machine Learning (ICML) Workshop on Graph Representation Learning and Beyond.
- Our NeurIPS 2021 Reading List. Y. Cao, K. Y. C. Lui, T. Durand, J. He, P. Xu, N. Mehrasa, A. Radovic, A. Lehrmann, R. Deng, A. Abdi, M. Schlegel, and S. Liu.
- SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks. B. Fatemi, L. El Asri, and S. M. Kazemi. Conference on Neural Information Processing Systems (NeurIPS).