Rethinking and Improving Student Learning and Forgetting Processes for Attention based Knowledge Tracing Models
DOI:
https://doi.org/10.1609/aaai.v39i27.34998

Abstract
Knowledge tracing (KT) models students' knowledge states and predicts their future performance from their historical interaction data. However, attention-based KT models struggle to accurately capture diverse forgetting behaviors in ever-growing interaction sequences. First, existing models use uniform time-decay matrices, conflating forgetting representations with problem relevance. Second, the fixed-length-window prediction paradigm fails to model continuous forgetting processes in expanding sequences. To address these challenges, this paper introduces LefoKT, a unified architecture that enhances attention-based KT models with a proposed relative forgetting attention. Relative forgetting attention improves forgetting modeling by decoupling forgetting patterns from problem relevance, and it endows attention-based KT models with length-extrapolation capability for capturing continuous forgetting processes in ever-growing interaction sequences. Extensive experiments on three datasets validate the effectiveness of LefoKT.
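To make the core idea concrete, the sketch below illustrates one plausible reading of "decoupling forgetting from problem relevance": instead of multiplying attention scores by a uniform time-decay matrix, a relative time-gap bias is added to the relevance scores before normalization. All names (`relative_forgetting_attention`, the decay rate `theta`) are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_forgetting_attention(Q, K, V, time_gaps, theta=0.1):
    """Toy causal attention for a KT-style sequence.

    Relevance (Q K^T) and forgetting (a bias from relative time gaps)
    are computed separately and combined additively, rather than
    scaling the scores with one uniform decay matrix. `theta` is a
    hypothetical decay-rate hyperparameter.
    """
    T, d = Q.shape
    relevance = Q @ K.T / np.sqrt(d)       # problem-relevance scores, (T, T)
    decay_bias = -theta * time_gaps        # additive relative forgetting bias
    scores = relevance + decay_bias
    # causal mask: a step may only attend to past interactions
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    return softmax(scores) @ V

# Minimal usage example with random interactions and timestamps.
rng = np.random.default_rng(0)
T, d = 5, 8
Q, K, V = (rng.normal(size=(T, d)) for _ in range(3))
t = np.cumsum(rng.uniform(1.0, 5.0, size=T))       # interaction timestamps
time_gaps = np.abs(t[:, None] - t[None, :])        # gap between steps i and j
out = relative_forgetting_attention(Q, K, V, time_gaps)
print(out.shape)  # (5, 8)
```

Because the bias depends only on relative time gaps rather than absolute positions, this style of formulation also extrapolates naturally to sequences longer than those seen during training, which matches the length-extrapolation motivation in the abstract.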
Published
2025-04-11
How to Cite
Bai, Y., Li, X., Liu, Z., Huang, Y., Tian, M., & Luo, W. (2025). Rethinking and Improving Student Learning and Forgetting Processes for Attention based Knowledge Tracing Models. Proceedings of the AAAI Conference on Artificial Intelligence, 39(27), 27822–27830. https://doi.org/10.1609/aaai.v39i27.34998
Section
AAAI Technical Track on AI for Social Impact Track