Hyperbolic-Enhanced Mixture-of-Experts Mamba for Sequential Recommendation

Authors

  • Yuwen Liu China University of Petroleum (East China) Shandong Key Laboratory of Intelligent Oil and Gas Industrial Software
  • Lianyong Qi China University of Petroleum (East China) Shandong Key Laboratory of Intelligent Oil and Gas Industrial Software
  • Xingyuan Mao China University of Petroleum (East China) Shandong Key Laboratory of Intelligent Oil and Gas Industrial Software
  • Weiming Liu ByteDance Inc.
  • Xuhui Fan Macquarie University
  • Qiang Ni Lancaster University
  • Xuyun Zhang Macquarie University
  • Yang Zhang University of North Texas
  • Yuan Tian Nanjing Institute of Technology
  • Amin Beheshti Macquarie University

DOI:

https://doi.org/10.1609/aaai.v40i18.38567

Abstract

Sequential recommendation has emerged as a fundamental task in various domains, aiming to predict a user's next interaction based on historical behavior. Recent advances in deep sequence models, particularly Transformer-based architectures and the more recent Mamba, have substantially pushed the boundaries of sequential modeling performance. However, existing methods still face two critical challenges. First, many current approaches overlook the hierarchical structures and high-order dependencies among items, typically restricting representation learning to conventional Euclidean spaces, which limits their capacity to capture complex relational information. Second, although Mamba excels at long-range dependency modeling, its reliance on static Feed-Forward Networks (FFNs) hinders its ability to dynamically adapt to evolving user preferences across diverse contexts. To address these limitations, we propose a Hyperbolic-Enhanced Mixture-of-Experts Mamba recommender (HM2Rec) for sequential recommendation. HM2Rec first encodes user-item relationships through hyperbolic graph convolution to exploit hierarchical structure more effectively. Then, a Variational Graph Auto-Encoder (VGAE) is employed to reconstruct node embeddings, improving structural robustness. To further enhance sequential modeling, we integrate Rotary Positional Encoding (RoPE) into Mamba to better capture relative position dependencies, and replace the FFN with a Mixture-of-Experts (MoE) module, enabling dynamic and personalized expert selection for each token. Our extensive experiments on four widely used public datasets demonstrate that HM2Rec outperforms several advanced baseline models.
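The abstract's first claim rests on a property of hyperbolic geometry: in the Poincaré ball model, volume grows exponentially toward the boundary, so tree-like item hierarchies embed with low distortion, unlike in Euclidean space. The sketch below (an illustrative, self-contained computation, not code from the paper) evaluates the standard Poincaré-ball geodesic distance and shows how points near the boundary, which play the role of deep "leaf" items, end up far from each other even though their Euclidean separation is modest.

```python
import math

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))).
    """
    sq_norm = lambda x: sum(c * c for c in x)
    diff = sq_norm([a - b for a, b in zip(u, v)])
    denom = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + 2.0 * diff / max(denom, eps))

# A "root" item near the origin vs. two "leaf" items near the boundary.
root = [0.0, 0.0]
leaf_a = [0.9, 0.0]
leaf_b = [0.0, 0.9]

d_root_leaf = poincare_distance(root, leaf_a)
d_leaf_leaf = poincare_distance(leaf_a, leaf_b)

# Euclidean distances would be 0.9 and ~1.27; hyperbolically, the
# leaf-to-leaf distance blows up, mimicking separate subtrees.
print(d_root_leaf, d_leaf_leaf)
```

Here `d_leaf_leaf` is noticeably larger than `d_root_leaf`, which is the geometric intuition behind encoding user-item hierarchies with hyperbolic graph convolution rather than Euclidean message passing.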

Published

2026-03-14

How to Cite

Liu, Y., Qi, L., Mao, X., Liu, W., Fan, X., Ni, Q., … Beheshti, A. (2026). Hyperbolic-Enhanced Mixture-of-Experts Mamba for Sequential Recommendation. Proceedings of the AAAI Conference on Artificial Intelligence, 40(18), 15403–15411. https://doi.org/10.1609/aaai.v40i18.38567

Section

AAAI Technical Track on Data Mining & Knowledge Management II