BScNets: Block Simplicial Complex Neural Networks

Authors

  • Yuzhou Chen Princeton University
  • Yulia R. Gel The University of Texas at Dallas
  • H. Vincent Poor Princeton University

DOI:

https://doi.org/10.1609/aaai.v36i6.20583

Keywords:

Machine Learning (ML)

Abstract

Simplicial neural networks (SNNs) have recently emerged as a new direction in graph learning which expands the idea of convolutional architectures from node space to simplicial complexes on graphs. Instead of predominantly assessing pairwise relations among nodes as in the current practice, simplicial complexes allow us to describe higher-order interactions and multi-node graph structures. By building upon the connection between the convolution operation and the new block Hodge-Laplacian, we propose the first SNN for link prediction. Our new Block Simplicial Complex Neural Networks (BScNets) model generalizes existing graph convolutional network (GCN) frameworks by systematically incorporating salient interactions among multiple higher-order graph structures of different dimensions. We discuss the theoretical foundations behind BScNets and illustrate its utility for link prediction on eight real-world and synthetic datasets. Our experiments indicate that BScNets outperforms the state-of-the-art models by a significant margin while maintaining low computational costs. Finally, we show the utility of BScNets as a promising new alternative for tracking the spread of infectious diseases such as COVID-19 and for measuring the effectiveness of healthcare risk mitigation strategies.
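To make the Hodge-Laplacian machinery behind the abstract concrete, here is a minimal sketch of the standard (non-block) Hodge 1-Laplacian on a toy simplicial complex, built from oriented boundary matrices. This illustrates the operator that SNN convolutions act on; the block Hodge-Laplacian of BScNets, which couples Laplacians across dimensions, is defined in the paper itself and is not reproduced here.

```python
import numpy as np

# Toy simplicial complex: a filled triangle on nodes {0,1,2}
# plus a pendant edge (2,3).
# Oriented edges (low -> high): e0=(0,1), e1=(0,2), e2=(1,2), e3=(2,3)
# Oriented triangle: t0=(0,1,2)

# B1: node-to-edge boundary matrix (4 nodes x 4 edges);
# column for edge (i,j) has -1 at row i and +1 at row j.
B1 = np.array([
    [-1, -1,  0,  0],
    [ 1,  0, -1,  0],
    [ 0,  1,  1, -1],
    [ 0,  0,  0,  1],
], dtype=float)

# B2: edge-to-triangle boundary matrix (4 edges x 1 triangle);
# boundary of (0,1,2) = +(0,1) - (0,2) + (1,2).
B2 = np.array([[1], [-1], [1], [0]], dtype=float)

# Fundamental identity of boundary operators: B1 @ B2 = 0
# ("the boundary of a boundary is empty").
assert np.allclose(B1 @ B2, 0)

# Hodge 1-Laplacian acting on edge signals:
# lower part B1^T B1 couples edges sharing a node,
# upper part B2 B2^T couples edges sharing a triangle.
L1 = B1.T @ B1 + B2 @ B2.T
```

An SNN layer then filters an edge feature matrix `X` as, e.g., `sigma(p(L1) @ X @ W)` for a polynomial `p` and learnable weights `W`, in direct analogy to how a GCN applies the graph Laplacian to node features.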

Published

2022-06-28

How to Cite

Chen, Y., Gel, Y. R., & Poor, H. V. (2022). BScNets: Block Simplicial Complex Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6333-6341. https://doi.org/10.1609/aaai.v36i6.20583

Section

AAAI Technical Track on Machine Learning I