Heterophily-aware Contrastive Learning for Heterophilic Hypergraphs

Authors

  • Ming Li, Zhejiang Key Laboratory of Intelligent Education Technology and Application, Zhejiang Normal University
  • Yongqi Li, School of Computer Science and Technology, Zhejiang Normal University
  • Yuting Chen, Centre for Learning Sciences and Technologies, The Chinese University of Hong Kong
  • Feilong Cao, School of Mathematical Sciences, Zhejiang Normal University
  • Ke Lv, School of Engineering Science, University of Chinese Academy of Sciences; Peng Cheng Laboratory

DOI:

https://doi.org/10.1609/aaai.v40i27.39473

Abstract

Hypergraph neural networks (HNNs) have emerged as powerful tools for modeling high-order relationships in complex systems. However, most existing HNNs are designed under the assumption of homophily, which does not hold in many real-world scenarios where connected nodes often exhibit diverse semantics, i.e., heterophily. This inconsistency leads to suboptimal aggregation and degraded performance, especially in low-label regimes. While a few recent methods have attempted to enhance heterophilic hypergraph learning, they often rely heavily on label supervision and overlook the potential of self-supervised techniques. In this paper, we propose HeroCL, a heterophily-aware contrastive learning framework that improves hypergraph representation under both structural heterogeneity and label scarcity. Specifically, HeroCL integrates a multi-hop neighbor encoding module to capture informative higher-order context and incorporates two complementary contrastive objectives, label-aware and structure-aware, to guide representation learning from both semantic and relational perspectives. A multi-granularity contrastive strategy is introduced to exploit latent signals across multiple neighborhood levels. Extensive experiments on several benchmark datasets against 11 existing baselines demonstrate that HeroCL achieves consistent and significant performance gains, particularly under strong heterophily and limited supervision, validating its robustness and effectiveness.
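The abstract does not give HeroCL's exact loss formulations, but the contrastive objectives it mentions are typically built on an InfoNCE-style loss, where embeddings of the same node under two views (e.g., raw features vs. a multi-hop neighborhood encoding) form positive pairs and all other cross-view pairs serve as negatives. The sketch below is a generic illustration of that idea, not the paper's method; the function name, the temperature value, and the toy data are all illustrative assumptions.

```python
import math
import random

def info_nce(z1, z2, tau=0.5):
    """Generic InfoNCE contrastive loss.

    z1, z2: lists of embedding vectors; row i of z1 and row i of z2
    are a positive pair, and every other cross-view row is a negative.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    a = [normalize(v) for v in z1]
    b = [normalize(v) for v in z2]
    loss = 0.0
    for i, vi in enumerate(a):
        # Cosine similarities of view-1 node i against every view-2 node
        sims = [sum(x * y for x, y in zip(vi, vj)) / tau for vj in b]
        # Negative log-softmax of the positive (diagonal) similarity
        loss += -(sims[i] - math.log(sum(math.exp(s) for s in sims)))
    return loss / len(a)

# Toy data: 8 nodes with 16-dimensional embeddings in each view
random.seed(0)
z1 = [[random.gauss(0, 1) for _ in range(16)] for _ in range(8)]
z2 = [[random.gauss(0, 1) for _ in range(16)] for _ in range(8)]

loss_aligned = info_nce(z1, z1)  # identical views: positives dominate
loss_random = info_nce(z1, z2)   # unrelated views: near-uniform softmax
```

With identical views the positive similarity dominates each row, so the loss is small; with unrelated random views it approaches log N. A multi-granularity variant, as described in the abstract, would apply such a loss at several neighborhood levels and combine the terms.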

Published

2026-03-14

How to Cite

Li, M., Li, Y., Chen, Y., Cao, F., & Lv, K. (2026). Heterophily-aware Contrastive Learning for Heterophilic Hypergraphs. Proceedings of the AAAI Conference on Artificial Intelligence, 40(27), 23071-23078. https://doi.org/10.1609/aaai.v40i27.39473

Section

AAAI Technical Track on Machine Learning IV