On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems

Authors

  • Alessio Gravina, University of Pisa
  • Moshe Eliasof, University of Cambridge
  • Claudio Gallicchio, University of Pisa
  • Davide Bacciu, University of Pisa
  • Carola-Bibiane Schönlieb, University of Cambridge

DOI:

https://doi.org/10.1609/aaai.v39i16.33858

Abstract

A common problem in Message-Passing Neural Networks is oversquashing -- the limited ability to facilitate effective information flow between distant nodes. Oversquashing is attributed to the exponential decay in information transmission as node distances increase. This paper introduces a novel perspective to address oversquashing, leveraging the dynamical-systems properties of global and local non-dissipativity, which enable the maintenance of a constant information flow rate. We present SWAN, a uniquely parameterized GNN model with antisymmetry both in space and weight domains, as a means to obtain non-dissipativity. Our theoretical analysis shows that by implementing these properties, SWAN offers an enhanced ability to transmit information over extended distances. Empirical evaluations on synthetic and real-world benchmarks that emphasize long-range interactions validate the theoretical understanding of SWAN and its ability to mitigate oversquashing.
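To give a concrete flavor of the weight-domain antisymmetry the abstract mentions, the sketch below illustrates the standard construction of an antisymmetric weight matrix and checks the spectral property that underlies non-dissipative dynamics. This is a minimal NumPy illustration of the general principle, not the authors' SWAN implementation: the parameterization `M - M.T` and the toy dimensions are assumptions for exposition, and SWAN additionally imposes antisymmetry in the spatial (graph) domain, which is not shown here.

```python
import numpy as np

def antisymmetric(M: np.ndarray) -> np.ndarray:
    """Parameterize an antisymmetric weight matrix from a free matrix M.

    W = M - M^T satisfies W^T = -W, so every eigenvalue of W is purely
    imaginary. For the linear flow dx/dt = W x this means trajectories
    neither decay nor explode: d/dt ||x||^2 = x^T (W + W^T) x = 0,
    i.e. the dynamics are non-dissipative and the "information flow
    rate" stays constant rather than decaying exponentially.
    """
    return M - M.T

rng = np.random.default_rng(0)
d = 4  # toy feature dimension, chosen arbitrarily for illustration
W = antisymmetric(rng.standard_normal((d, d)))

# Sanity checks on the two defining properties:
assert np.allclose(W.T, -W)                         # antisymmetry
assert np.allclose(np.linalg.eigvals(W).real, 0.0,  # purely imaginary spectrum
                   atol=1e-10)

# Norm preservation of the induced flow via the matrix exponential:
from scipy.linalg import expm
x = rng.standard_normal(d)
x_t = expm(W) @ x                                   # evolve for unit time
assert np.isclose(np.linalg.norm(x_t), np.linalg.norm(x))
```

In a message-passing layer, such a W would replace the free weight matrix of the node update, so that repeated propagation steps preserve signal energy across many hops instead of squashing it.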

Published

2025-04-11

How to Cite

Gravina, A., Eliasof, M., Gallicchio, C., Bacciu, D., & Schönlieb, C.-B. (2025). On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems. Proceedings of the AAAI Conference on Artificial Intelligence, 39(16), 16906–16914. https://doi.org/10.1609/aaai.v39i16.33858

Section

AAAI Technical Track on Machine Learning II