Self-Adaptive Graph Mixture of Models

Authors

  • Mohit Meena, Fujitsu Research of India, Bangalore
  • Yash Punjabi, Fujitsu Research of India, Bangalore
  • Abhishek A, Fujitsu Research of India, Bangalore
  • Vishal Sharma, Fujitsu Research of India, Bangalore
  • Mahesh Chandran, Fujitsu Research of India, Bangalore

DOI:

https://doi.org/10.1609/aaai.v40i29.39615

Abstract

Graph Neural Networks (GNNs) have emerged as powerful tools for learning over graph-structured data, yet recent studies have shown that their performance gains are beginning to plateau. In many cases, well-established models such as GCN and GAT, when appropriately tuned, can match or even exceed the performance of more complex, state-of-the-art architectures. This trend highlights a key limitation in the current landscape: the difficulty of selecting the most suitable model for a given graph task or dataset. To address this, we propose Self-Adaptive Graph Mixture of Models (SAGMM), a modular and practical framework that learns to automatically select and combine the most appropriate GNN models from a diverse pool of architectures. Unlike prior mixture-of-experts approaches that rely on variations of a single base model, SAGMM leverages architectural diversity and a topology-aware attention gating mechanism to adaptively assign experts to each node based on the structure of the input graph. To improve efficiency, SAGMM includes a pruning mechanism that reduces the number of active experts during training and inference without compromising performance. We also explore a training-efficient variant in which expert models are pretrained and frozen, and only the gating and task-specific layers are trained. We evaluate SAGMM on 16 benchmark datasets covering node classification, graph classification, regression, and link prediction tasks, and demonstrate that it consistently outperforms or matches leading GNN baselines and prior mixture-based methods, offering a robust and adaptive solution for real-world graph learning.
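The abstract describes a mixture over architecturally diverse GNN experts, a topology-aware gating network that assigns experts per node, and top-k pruning of inactive experts. A minimal NumPy sketch of that idea follows; it is an illustration under stated assumptions, not the paper's implementation. The three toy "experts" (one-hop mean aggregation, a feature-only transform, two-hop aggregation), the degree-based gating features, and all weight matrices are hypothetical stand-ins for the diverse pretrained models and learned gating described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 5 nodes with 4-dim features, plus a row-normalized
# adjacency (with self-loops) for mean-style message passing.
X = rng.normal(size=(5, 4))
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(5)
A_norm = np.diag(1.0 / A_hat.sum(axis=1)) @ A_hat

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Three hypothetical experts with different receptive fields,
# standing in for architecturally diverse GNNs (e.g. GCN-like vs. MLP-like).
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(4, 4))
W3 = rng.normal(size=(4, 4))
experts = [
    lambda X: A_norm @ X @ W1,            # one-hop aggregation
    lambda X: X @ W2,                     # feature-only (structure-free)
    lambda X: A_norm @ A_norm @ X @ W3,   # two-hop aggregation
]
H = np.stack([f(X) for f in experts])     # (n_experts, n_nodes, d)

# Topology-aware gating: scores from node features augmented with
# a simple structural signal (here, node degree).
deg = A.sum(axis=1, keepdims=True)
Wg = rng.normal(size=(5, len(experts)))   # gating weights (hypothetical)
scores = np.concatenate([X, deg], axis=1) @ Wg   # (n_nodes, n_experts)

# Pruning: keep only the top-k experts per node; the rest get zero weight.
k = 2
masked = np.full_like(scores, -np.inf)
top = np.argsort(scores, axis=1)[:, -k:]
rows = np.arange(scores.shape[0])[:, None]
masked[rows, top] = scores[rows, top]
gate = softmax(masked, axis=1)            # pruned experts contribute 0

# Per-node mixture: each node's output is its gated blend of expert outputs.
out = np.einsum("ne,end->nd", gate, H)    # (n_nodes, d)
```

In the training-efficient variant the abstract mentions, the expert transforms above would be pretrained and frozen, with only the gating weights and a task head updated.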

Published

2026-03-14

How to Cite

Meena, M., Punjabi, Y., A, A., Sharma, V., & Chandran, M. (2026). Self-Adaptive Graph Mixture of Models. Proceedings of the AAAI Conference on Artificial Intelligence, 40(29), 24344-24352. https://doi.org/10.1609/aaai.v40i29.39615

Section

AAAI Technical Track on Machine Learning VI