Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling

Authors

  • Dongkuan Xu, The Pennsylvania State University
  • Junjie Liang, The Pennsylvania State University
  • Wei Cheng, NEC Laboratories America, Inc.
  • Hua Wei, The Pennsylvania State University
  • Haifeng Chen, NEC Laboratories America, Inc.
  • Xiang Zhang, The Pennsylvania State University

DOI:

https://doi.org/10.1609/aaai.v35i5.16583

Keywords:

Mining of Spatial, Temporal or Spatio-Temporal Data

Abstract

Network modeling aims to learn latent representations of nodes that preserve both network structure and node attribute information. This problem is fundamental due to its prevalence in numerous domains. However, existing approaches either target static networks or struggle to capture complicated temporal dependencies, while most real-world networks evolve over time and the success of network modeling hinges on understanding how entities are temporally connected. In this paper, we present TRRN, a transformer-style relational reasoning network with dynamic memory updating, to address these challenges. TRRN employs multi-head self-attention to reason over a set of memories, which provides a multitude of shortcut paths for information to flow from past observations to the current latent representations. By utilizing policy networks augmented with differentiable binary routers, TRRN estimates the probability of each memory being activated and dynamically updates the memories at the time steps when they are most relevant. We evaluate TRRN on the tasks of node classification and link prediction using four real temporal network datasets. Experimental results demonstrate consistent performance gains for TRRN over the leading competitors.
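To make the two core ideas in the abstract concrete, the following is a minimal PyTorch-style sketch of (a) multi-head self-attention reasoning over a set of memory slots plus the current observation, and (b) a differentiable binary router that gates which memories get updated via a straight-through estimator. All module names, shapes, and hyper-parameters here are illustrative assumptions for exposition; this is not the authors' released implementation of TRRN.

    import torch
    import torch.nn as nn

    class MemoryReasoningSketch(nn.Module):
        """Sketch: transformer-style reasoning over memories with a
        differentiable binary router gating memory updates.
        (Hypothetical names; not the authors' code.)"""

        def __init__(self, num_memories=8, dim=64, num_heads=4):
            super().__init__()
            # Learnable initial memory slots.
            self.init_mem = nn.Parameter(torch.randn(num_memories, dim) * 0.02)
            # Multi-head self-attention over [memories; current observation].
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
            # Policy network: scores how relevant each memory is at this step.
            self.router = nn.Linear(dim, 1)

        def forward(self, x, mem=None):
            # x: (batch, dim) current observation embedding of a node.
            # mem: (batch, M, dim) memories carried from the previous step
            #      (None -> learned initialization).
            b = x.size(0)
            if mem is None:
                mem = self.init_mem.unsqueeze(0).expand(b, -1, -1).contiguous()
            tokens = torch.cat([mem, x.unsqueeze(1)], dim=1)   # (b, M+1, d)

            # Relational reasoning: each memory attends to all memories and
            # to the new observation (shortcut paths from past to present).
            reasoned, _ = self.attn(tokens, tokens, tokens)
            new_mem, out = reasoned[:, :-1, :], reasoned[:, -1, :]

            # Binary router: probability that each memory slot is activated
            # (i.e., should be updated) at this time step.
            p = torch.sigmoid(self.router(new_mem))            # (b, M, 1)
            hard = (p > 0.5).float()
            # Straight-through estimator: hard 0/1 decision in the forward
            # pass, gradients flow through the soft probability p.
            gate = hard + p - p.detach()

            # Update only the activated memories; keep the rest unchanged.
            updated = gate * new_mem + (1.0 - gate) * mem
            return out, updated

    # Usage sketch: roll the memories forward over a node's time steps.
    # model = MemoryReasoningSketch()
    # mems = None
    # for x_t in torch.randn(10, 32, 64):   # 10 steps, batch 32, dim 64
    #     h_t, mems = model(x_t, mems)      # h_t is the latent representation

The gating step is what the abstract calls "dynamic memory updating": memories whose router probability falls below the threshold are carried over untouched, so each memory is rewritten only at the time steps when it is most relevant.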

Published

2021-05-18

How to Cite

Xu, D., Liang, J., Cheng, W., Wei, H., Chen, H., & Zhang, X. (2021). Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 4546-4554. https://doi.org/10.1609/aaai.v35i5.16583

Issue

Vol. 35 No. 5 (2021)

Section

AAAI Technical Track on Data Mining and Knowledge Management