Length-Adaptive Interest Network for Balancing Long and Short Sequence Modeling in CTR Prediction
DOI: https://doi.org/10.1609/aaai.v40i34.40094

Abstract
User behavior sequences in modern recommendation systems exhibit significant length heterogeneity, ranging from sparse short-term interactions to rich long-term histories. While longer sequences provide more context, we observe that increasing the maximum input sequence length in existing CTR models paradoxically degrades performance for short-sequence users due to attention polarization and length imbalance in training data. To address this, we propose LAIN (Length-Adaptive Interest Network), a plug-and-play framework that explicitly incorporates sequence length as a conditioning signal to balance long- and short-sequence modeling. LAIN consists of three lightweight components: a Spectral Length Encoder that maps length into continuous representations, Length-Conditioned Prompting that injects global contextual cues into both long- and short-term behavior branches, and Length-Modulated Attention that adaptively adjusts attention sharpness based on sequence length. Extensive experiments on three real-world benchmarks across five strong CTR backbones show that LAIN consistently improves overall performance, achieving up to 1.15% AUC gain and 2.25% log loss reduction. Notably, our method significantly improves accuracy for short-sequence users without sacrificing long-sequence effectiveness. Our work offers a general, efficient, and deployable solution to mitigate length-induced bias in sequential recommendation.

Published
2026-03-14
How to Cite
Zhang, Z., Du, Z., Zhu, J., Tang, J., Lu, F., Jiaheng, W., … Dong, Z. (2026). Length-Adaptive Interest Network for Balancing Long and Short Sequence Modeling in CTR Prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 40(34), 28627–28635. https://doi.org/10.1609/aaai.v40i34.40094
Section
AAAI Technical Track on Machine Learning XI
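The abstract's Length-Modulated Attention component adjusts attention sharpness based on sequence length. A minimal NumPy sketch of one plausible reading of that idea: the softmax temperature grows with log-length, so long histories get a softer (less polarized) distribution and short ones a sharper one. The function name, the temperature form, and the scalar `alpha` are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def _softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def length_modulated_attention(q, k, v, seq_len, alpha=0.5):
    """Scaled dot-product attention with a length-dependent temperature.

    q, k, v : arrays of shape (B, L, d)
    seq_len : array of shape (B,), the true behavior length per user
    alpha   : assumed scalar controlling how fast attention softens
              with length (in the paper this would be learned)
    """
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -2, -1) / np.sqrt(d)      # (B, L, L)
    # Temperature rises with log-length: dividing scores by a larger
    # tau flattens the softmax for long-sequence users, countering
    # the attention polarization the abstract describes.
    tau = 1.0 + alpha * np.log1p(np.asarray(seq_len, dtype=float))
    weights = _softmax(scores / tau[:, None, None], axis=-1)
    return weights @ v, weights
```

With identical scores, a user with length 50 receives a strictly flatter attention distribution than a user with length 2, which is the qualitative behavior the abstract attributes to this component.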