HyperAim: Hypergraph Contrastive Learning with Adaptive Multi-frequency Filters
DOI:
https://doi.org/10.1609/aaai.v40i27.39472
Abstract
Unsupervised hypergraph representation learning has recently gained traction for its ability to model complex high-order interactions without requiring labeled data. However, existing contrastive learning methods typically overlook the frequency diversity inherent in hypergraph signals. To address this issue, we propose HyperAim, a contrastive learning framework that integrates adaptive multi-frequency filtering through both decoupled and coupled designs. Specifically, HyperAim employs two decoupled channels with polynomial low-pass and high-pass filters to separately capture distinct frequency components, and a third channel based on framelet decomposition that adaptively fuses multi-frequency signals in a coupled manner. A frequency-aware contrastive learning strategy is introduced to align representations across views using a combination of InfoNCE loss and pseudo-label-guided supervision. Extensive experiments across 12 benchmark datasets, covering both homophilic and heterophilic hypergraphs, demonstrate the consistent superiority of HyperAim over 17 baselines. Ablation studies further confirm the benefits of explicitly modeling and aligning frequency-specific representations.
Published
2026-03-14
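The abstract mentions aligning representations across views with an InfoNCE loss. As a rough illustration only (not the authors' implementation), the sketch below shows a standard symmetric InfoNCE objective between two view embeddings, e.g. node embeddings produced by a low-pass and a high-pass channel; the function name, temperature value, and the two-view setup are assumptions for exposition.

```python
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Symmetric InfoNCE between two views of the same nodes.

    Row i of z1 and row i of z2 form the positive pair; all other
    rows in the opposite view serve as negatives.
    """
    z1 = F.normalize(z1, dim=1)        # unit-length embeddings -> cosine similarity
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau         # (N, N) similarity matrix scaled by temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    # Cross-entropy in both directions so each view attracts its counterpart.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))
```

In a multi-channel setup like the one described, such a loss would typically be summed over the channel pairs (e.g. low-pass vs. high-pass, and each vs. the fused framelet view), with the pseudo-label-guided term added separately.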
How to Cite
Li, M., Zhao, R., Yan, Z., Bai, L., Cui, L., & Cao, F. (2026). HyperAim: Hypergraph Contrastive Learning with Adaptive Multi-frequency Filters. Proceedings of the AAAI Conference on Artificial Intelligence, 40(27), 23063-23070. https://doi.org/10.1609/aaai.v40i27.39472
Section
AAAI Technical Track on Machine Learning IV