Neural Relational Inference with Efficient Message Passing Mechanisms

Authors

  • Siyuan Chen Sun Yat-sen University
  • Jiahai Wang Sun Yat-sen University
  • Guoqing Li Sun Yat-sen University

Keywords

Graph-based Machine Learning

Abstract

Many complex processes can be viewed as dynamical systems of interacting agents. In many cases, only the state sequences of individual agents are observed, while the interaction relations and the dynamical rules are unknown. The neural relational inference (NRI) model adopts graph neural networks that pass messages over a latent graph to jointly learn the relations and the dynamics from the observed data. However, NRI infers the relations independently and suffers from error accumulation in multi-step prediction during the dynamics-learning procedure. Moreover, relation reconstruction without prior knowledge becomes more difficult in more complex systems. This paper introduces efficient message passing mechanisms into graph neural networks with structural prior knowledge to address these problems. A relation interaction mechanism is proposed to capture the coexistence of all relations, and a spatio-temporal message passing mechanism is proposed to use historical information to alleviate error accumulation. Additionally, structural prior knowledge, with symmetry as a special case, is introduced for better relation prediction in more complex systems. Experimental results on simulated physics systems show that the proposed method outperforms existing state-of-the-art methods.
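As a rough illustration of two ideas mentioned in the abstract (not the authors' implementation, which should be consulted in the paper itself), the sketch below shows how a symmetry prior can be imposed on pairwise relation logits, and what a single message passing step over an inferred relation graph looks like. All function names and shapes here are illustrative assumptions.

```python
import numpy as np

def symmetrize_logits(logits):
    """Enforce a symmetry prior: the relation inferred from agent i to j
    must match the one from j to i. `logits` has shape (N, N, K) for
    N agents and K candidate relation types. Averaging with the transpose
    is one simple way to make the pairwise logits symmetric."""
    return 0.5 * (logits + logits.transpose(1, 0, 2))

def message_passing_step(states, adjacency):
    """One node-to-node message passing step over a relation graph.
    states: (N, D) agent feature vectors; adjacency: (N, N) edge weights.
    Each agent aggregates its neighbors' states and applies a residual
    update -- a minimal stand-in for a learned GNN layer."""
    messages = adjacency @ states  # weighted sum of neighbor states
    return states + messages       # residual update of each agent

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 4, 2))     # raw pairwise relation logits
z_sym = symmetrize_logits(z)       # now z_sym[i, j] == z_sym[j, i]
```

In the actual model, the adjacency would be derived from the inferred relation distribution and the update would be a learned neural network; this sketch only conveys the data flow.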

Published

2021-05-18

How to Cite

Chen, S., Wang, J., & Li, G. (2021). Neural Relational Inference with Efficient Message Passing Mechanisms. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 7055-7063. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16868

Section

AAAI Technical Track on Machine Learning I