Hypergraph Neural Architecture Search

Authors

  • Wei Lin, Xiamen University
  • Xu Peng, Xiamen University
  • Zhengtao Yu, Kunming University of Science and Technology
  • Taisong Jin, Xiamen University

DOI:

https://doi.org/10.1609/aaai.v38i12.29290

Keywords:

ML: Graph-based Machine Learning, DMKM: Graph Mining, Social Network Analysis & Community, SO: Other Foundations of Search & Optimization

Abstract

In recent years, Hypergraph Neural Networks (HGNNs) have achieved considerable success with manually designed architectures, which can extract effective patterns involving high-order interactions from non-Euclidean data. However, such manual design is highly inefficient, demanding tremendous human effort to tune diverse model parameters. In this paper, we propose a novel Hypergraph Neural Architecture Search (HyperNAS) method to automatically design optimal HGNNs. The proposed model constructs a search space suitable for hypergraphs and derives hypergraph architectures through a differentiable search strategy. A hypergraph structure-aware distance criterion is introduced as a guideline for obtaining an optimal hypergraph architecture via the leave-one-out method. Experimental results for node classification on the benchmark Cora, Citeseer, and Pubmed citation networks, as well as on hypergraph datasets, show that HyperNAS outperforms existing HGNN models and graph NAS methods.
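To make the differentiable-search idea in the abstract concrete, below is a minimal, hypothetical sketch of a DARTS-style mixed operation over candidate hypergraph layers. The candidate operators, dimensions, and the normalized hypergraph propagation matrix are illustrative assumptions and do not reproduce the paper's actual search space or its structure-aware distance criterion.

```python
# Hypothetical sketch: softmax mixture over candidate hypergraph ops,
# in the spirit of differentiable NAS. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def hypergraph_propagation(H):
    """Normalized propagation matrix Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    for an n-nodes x m-hyperedges incidence matrix H (unit edge weights
    assumed for simplicity)."""
    Dv = H.sum(dim=1)                              # node degrees
    De = H.sum(dim=0)                              # hyperedge degrees
    Dv_inv_sqrt = torch.diag(Dv.clamp(min=1).pow(-0.5))
    De_inv = torch.diag(De.clamp(min=1).pow(-1.0))
    return Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt

class MixedHypergraphOp(nn.Module):
    """One searchable layer: a softmax-weighted mixture of candidate
    feature transforms, each followed by hypergraph propagation."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(in_dim, out_dim),                              # linear op
            nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU()),    # nonlinear op
        ])
        # Architecture parameters (one logit per candidate operator);
        # in a full search these are optimized on validation loss.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x, L):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * (L @ op(x)) for w, op in zip(weights, self.ops))

# Toy usage: 4 nodes, 2 hyperedges, 8-dim features.
H = torch.tensor([[1., 0.],
                  [1., 1.],
                  [0., 1.],
                  [1., 1.]])
L = hypergraph_propagation(H)
layer = MixedHypergraphOp(8, 16)
out = layer(torch.randn(4, 8), L)   # (4, 16) mixed-layer output
```

After search converges, the usual recipe is to retain the operator with the largest architecture weight in each layer; the paper's leave-one-out, structure-aware criterion refines this final selection step.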

Published

2024-03-24

How to Cite

Lin, W., Peng, X., Yu, Z., & Jin, T. (2024). Hypergraph Neural Architecture Search. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13837-13845. https://doi.org/10.1609/aaai.v38i12.29290

Section

AAAI Technical Track on Machine Learning III