Learning from History: Modeling Temporal Knowledge Graphs with Sequential Copy-Generation Networks
DOI:
https://doi.org/10.1609/aaai.v35i5.16604
Keywords:
Linked Open Data, Knowledge Graphs & KB Completion
Abstract
Large knowledge graphs often grow to store temporal facts that model the dynamic relations or interactions of entities along the timeline. Since such temporal knowledge graphs often suffer from incompleteness, it is important to develop time-aware representation learning models that help to infer the missing temporal facts. While temporal facts are typically evolving, many of them are observed to show a repeated pattern along the timeline, such as economic crises and diplomatic activities. This observation indicates that a model could potentially learn much from the known facts that appeared in history. To this end, we propose a new representation learning model for temporal knowledge graphs, namely CyGNet, based on a novel time-aware copy-generation mechanism. CyGNet is not only able to predict future facts from the whole entity vocabulary, but is also capable of identifying facts with repetition and accordingly predicting such future facts with reference to the known facts in the past. We evaluate the proposed method on the knowledge graph completion task using five benchmark datasets. Extensive experiments demonstrate the effectiveness of CyGNet for predicting future facts with repetition as well as de novo fact prediction.
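To make the copy-generation idea in the abstract concrete, below is a minimal sketch (not the authors' released code) of how a copy mode, restricted to entities that co-occurred with a query (subject, relation) in past snapshots, can be blended with a generation mode over the whole entity vocabulary. All module names, dimensions, and the mixing weight alpha are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyGenerationScorer(nn.Module):
    """Hypothetical scorer combining a history-restricted copy mode with a
    vocabulary-wide generation mode, in the spirit of CyGNet's abstract."""

    def __init__(self, num_entities, num_relations, dim, alpha=0.5):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.copy_mlp = nn.Linear(2 * dim, num_entities)  # copy mode scores
        self.gen_mlp = nn.Linear(2 * dim, num_entities)   # generation mode scores
        self.alpha = alpha                                 # assumed mixing weight

    def forward(self, subj, rel, history_mask):
        # history_mask: (batch, num_entities) boolean tensor marking objects
        # seen with the same (subject, relation) in earlier time steps.
        x = torch.cat([self.ent_emb(subj), self.rel_emb(rel)], dim=-1)

        # Copy mode: suppress entities never observed with this query in history.
        copy_logits = self.copy_mlp(x).masked_fill(~history_mask, -1e9)
        copy_prob = F.softmax(copy_logits, dim=-1)

        # Generation mode: unrestricted distribution over the whole vocabulary.
        gen_prob = F.softmax(self.gen_mlp(x), dim=-1)

        # Final prediction blends repetition-aware and de novo predictions.
        return self.alpha * copy_prob + (1 - self.alpha) * gen_prob
```

The blend lets repeated facts be recovered from history while still allowing previously unseen facts to be predicted from the full vocabulary.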
Published
2021-05-18
How to Cite
Zhu, C., Chen, M., Fan, C., Cheng, G., & Zhang, Y. (2021). Learning from History: Modeling Temporal Knowledge Graphs with Sequential Copy-Generation Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 4732-4740. https://doi.org/10.1609/aaai.v35i5.16604
Section
AAAI Technical Track on Data Mining and Knowledge Management