Neural Relational Inference with Efficient Message Passing Mechanisms


  • Siyuan Chen Sun Yat-sen University
  • Jiahai Wang Sun Yat-sen University
  • Guoqing Li Sun Yat-sen University



Graph-based Machine Learning


Many complex processes can be viewed as dynamical systems of interacting agents. In many cases, only the state sequences of individual agents are observed, while the interacting relations and the dynamical rules are unknown. The neural relational inference (NRI) model adopts graph neural networks that pass messages over a latent graph to jointly learn the relations and the dynamics from the observed data. However, NRI infers the relations independently and suffers from error accumulation in multi-step prediction during the dynamics learning procedure. Moreover, relation reconstruction without prior knowledge becomes more difficult in more complex systems. This paper introduces efficient message passing mechanisms, together with structural prior knowledge, into the graph neural networks to address these problems. A relation interaction mechanism is proposed to capture the coexistence of all relations, and a spatio-temporal message passing mechanism is proposed to use historical information to alleviate error accumulation. Additionally, structural prior knowledge, with symmetry as a special case, is incorporated for better relation prediction in more complex systems. Experimental results on simulated physics systems show that the proposed method outperforms existing state-of-the-art methods.
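To make the message-passing idea concrete, the following is a minimal sketch of one round of NRI-style message passing over a latent interaction graph. It is not the authors' implementation; the shapes, weight matrices, and function names are illustrative assumptions, with edge messages built from sender/receiver state pairs and aggregated by summation at the receiver.

```python
# Hypothetical sketch of one NRI-style message passing round (NumPy only).
import numpy as np

rng = np.random.default_rng(0)

num_nodes, feat_dim = 4, 8
h = rng.standard_normal((num_nodes, feat_dim))           # node states
# Latent relation graph: adj[i, j] = 1 if an edge j -> i is inferred.
adj = np.ones((num_nodes, num_nodes)) - np.eye(num_nodes)

# Edge model: message from sender j to receiver i from both states.
W_edge = rng.standard_normal((2 * feat_dim, feat_dim)) * 0.1

def edge_messages(h, adj, W_edge):
    msgs = np.zeros_like(h)
    for i in range(num_nodes):
        for j in range(num_nodes):
            if adj[i, j]:
                pair = np.concatenate([h[i], h[j]])
                msgs[i] += np.tanh(pair @ W_edge)        # sum-aggregation
    return msgs

# Node model: update each node from its aggregated incoming messages.
W_node = rng.standard_normal((2 * feat_dim, feat_dim)) * 0.1
m = edge_messages(h, adj, W_edge)
h_next = np.tanh(np.concatenate([h, m], axis=1) @ W_node)
print(h_next.shape)  # (4, 8)
```

In the full NRI setting, the edge model is additionally conditioned on the inferred relation type of each edge, and the node update drives a prediction of the next state; this sketch only illustrates the edge-to-node message flow.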




How to Cite

Chen, S., Wang, J., & Li, G. (2021). Neural Relational Inference with Efficient Message Passing Mechanisms. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 7055-7063.



AAAI Technical Track on Machine Learning I