Beyond GNNs: An Efficient Architecture for Graph Problems

Authors

  • Pranjal Awasthi Google Research
  • Abhimanyu Das Google Research
  • Sreenivas Gollapudi Google Research

DOI:

https://doi.org/10.1609/aaai.v36i6.20548

Keywords:

Machine Learning (ML)

Abstract

Despite their popularity for graph-structured data, existing Graph Neural Networks (GNNs) have inherent limitations for fundamental graph problems such as shortest paths, k-connectivity, minimum spanning trees, and minimum cuts. For these problems, it is known that one needs GNNs of high depth, scaling polynomially with the number of nodes n, to provably encode the solution space, which in turn affects their statistical efficiency. In this work we propose a new hybrid architecture to overcome this limitation. Our proposed architecture, which we call GNNplus networks, combines multiple parallel low-depth GNNs with simple pooling layers built from low-depth fully connected networks. We provably demonstrate that for many graph problems, the solution space can be encoded by GNNplus networks using depth that scales only poly-logarithmically in the number of nodes. This also yields statistical advantages, which we demonstrate via generalization bounds for GNNplus networks. We empirically show the effectiveness of our proposed architecture on a variety of graph problems and real-world classification problems.
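The abstract describes the architecture only at a high level: several parallel low-depth GNNs whose outputs are combined by a simple pooling step and a shallow fully connected network. The sketch below illustrates that shape with plain NumPy mean-aggregation message passing; all function names, the concatenation-based pooling, and the dimensions are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def shallow_gnn(adj, feats, weights):
    """One low-depth GNN branch (assumed form): each layer averages
    neighbor features and applies a ReLU-activated linear map."""
    deg = np.clip(adj.sum(axis=1, keepdims=True), 1, None)
    h = feats
    for w in weights:                       # len(weights) = branch depth
        h = np.maximum(((adj @ h) / deg) @ w, 0)
    return h

def gnn_plus(adj, feats, branch_weights, mlp_weights):
    """Hypothetical GNNplus forward pass: run several parallel shallow
    GNN branches, pool by concatenating their node embeddings, then
    apply a low-depth fully connected network per node."""
    branches = [shallow_gnn(adj, feats, w) for w in branch_weights]
    h = np.concatenate(branches, axis=1)    # simple pooling: concat
    for w in mlp_weights:
        h = np.maximum(h @ w, 0)
    return h

# Toy usage: a 2-node graph with one edge, two depth-1 branches.
adj = np.array([[0.0, 1.0], [1.0, 0.0]])
feats = np.eye(2)
rng = np.random.default_rng(0)
branch_weights = [[rng.standard_normal((2, 3))],
                  [rng.standard_normal((2, 3))]]
mlp_weights = [rng.standard_normal((6, 4))]
out = gnn_plus(adj, feats, branch_weights, mlp_weights)  # shape (2, 4)
```

The point of the parallel-branch design, per the abstract, is that each branch can stay shallow (poly-logarithmic depth) while the combination still encodes the solution space of problems that would otherwise require polynomial depth in a single GNN.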

Published

2022-06-28

How to Cite

Awasthi, P., Das, A., & Gollapudi, S. (2022). Beyond GNNs: An Efficient Architecture for Graph Problems. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6019-6027. https://doi.org/10.1609/aaai.v36i6.20548

Section

AAAI Technical Track on Machine Learning I