Scalable Graph Networks for Particle Simulations

Authors

  • Karolis Martinkus, ETH Zurich
  • Aurelien Lucchi, ETH Zurich
  • Nathanaël Perraudin, Swiss Data Science Center

DOI:

https://doi.org/10.1609/aaai.v35i10.17078

Keywords:

Graph-based Machine Learning, Natural Sciences, Scalability of ML Systems, Applications

Abstract

Learning system dynamics directly from observations is a promising direction in machine learning due to its potential to significantly enhance our ability to understand physical systems. However, the dynamics of many real-world systems are challenging to learn due to the presence of nonlinear potentials and a number of interactions that scales quadratically with the number of particles N, as in the case of the N-body problem. In this work, we introduce an approach that transforms a fully-connected interaction graph into a hierarchical one, reducing the number of edges to O(N). This results in linear time and space complexity, while the pre-computation of the hierarchical graph requires O(N log N) time and O(N) space. Using our approach, we are able to train models on much larger particle counts, even on a single GPU. We evaluate how the phase space position accuracy and energy conservation depend on the number of simulated particles. Our approach retains high accuracy and efficiency even on large-scale gravitational N-body simulations which are impossible to run on a single machine if a fully-connected graph is used. Similar results are also observed when simulating Coulomb interactions. Furthermore, we make several important observations regarding the performance of this new hierarchical model, including: i) its accuracy tends to improve with the number of particles in the simulation and ii) its generalisation to unseen particle counts is also much better than for models that use all O(N^2) interactions.
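To make the edge-count argument concrete, here is a small sketch (not the authors' code) comparing a fully-connected interaction graph with one simple hierarchical alternative: particles placed at the leaves of a balanced binary tree, where each node exchanges messages only with its parent and its sibling. The tree construction and the exact edge rules are illustrative assumptions; the point is that the hierarchical edge count grows linearly in N rather than quadratically.

```python
def fully_connected_edges(n):
    """Directed edges in a fully-connected graph on n particles: O(n^2)."""
    return n * (n - 1)


def hierarchical_edges(n):
    """Edges when n particles (n a power of two, for simplicity) are the
    leaves of a balanced binary tree and each node talks only to its
    parent and its sibling. This is a hypothetical hierarchy chosen to
    illustrate the O(n) scaling, not the paper's exact construction."""
    total_nodes = 2 * n - 1                # leaves + internal nodes of a full binary tree
    parent_child = 2 * (total_nodes - 1)   # one up-edge and one down-edge per non-root node
    sibling = total_nodes - 1              # one sibling edge per non-root node
    return parent_child + sibling          # linear in n


for n in (64, 1024, 16384):
    print(f"N={n:6d}  fully-connected={fully_connected_edges(n):>10d}  "
          f"hierarchical={hierarchical_edges(n):>7d}")
```

At N = 16384 the fully-connected graph already has over 2.6×10^8 directed edges, while the hierarchical one stays under 10^5, which is why the latter fits on a single GPU at particle counts where the former does not.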

Published

2021-05-18

How to Cite

Martinkus, K., Lucchi, A., & Perraudin, N. (2021). Scalable Graph Networks for Particle Simulations. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 8912-8920. https://doi.org/10.1609/aaai.v35i10.17078

Section

AAAI Technical Track on Machine Learning III