Time Series Class-Incremental Learning via Confidence-guided Mask Distillation and Prototype-guided Contrastive Learning

Authors

  • Yu Liu Dalian University of Technology
  • Haoqin Yang Dalian University of Technology
  • Jinping Sui Dalian Naval Academy
  • Hui Wang Dalian University of Technology
  • Haipeng Li Dalian University of Technology
  • Weimin Wang Dalian University of Technology
  • Qi Jia Dalian University of Technology

DOI:

https://doi.org/10.1609/aaai.v40i28.39568

Abstract

Class-incremental learning (CIL) has recently gained great attention in the field of time series classification. Existing CIL methods based on knowledge distillation exhibit an impressive ability to retain prior knowledge and overcome catastrophic forgetting; however, their effectiveness faces major challenges posed by time series data. Since temporal data is more susceptible to sensor errors and electronic noise, the distillation process may be significantly affected by noisy knowledge transfer. To address this issue, we propose a novel confidence-guided mask distillation (CMD) framework that prevents noisy knowledge inheritance during distillation. The core of CMD is a dynamic masking mechanism guided by prediction confidence, which allocates higher weights to high-confidence time series and substantially suppresses the influence of low-confidence ones. Additionally, unlike prior work that simply passes a set of feature prototypes to the classifier, we develop prototype-guided contrastive learning (PCL) to alleviate classifier bias toward new classes through extra contrastive constraints that push the feature distributions of old prototypes away from those of new-class features. Extensive experiments on three time-series datasets demonstrate that our method significantly outperforms other replay-free CIL approaches, both in raising average accuracy and in decreasing the forgetting rate.
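The confidence-guided masking idea described above can be illustrated with a minimal sketch: weight each sample's distillation loss by the teacher's prediction confidence and hard-mask samples below a threshold. This is an assumption-laden illustration, not the authors' exact formulation; the function name, temperature `tau`, and `threshold` are hypothetical choices.

```python
import torch
import torch.nn.functional as F

def confidence_masked_distillation_loss(student_logits, teacher_logits,
                                        tau=2.0, threshold=0.5):
    """Illustrative sketch of confidence-weighted distillation (NOT the
    paper's exact CMD loss). Teacher predictions with low max-softmax
    confidence are down-weighted, and those below `threshold` are masked
    out entirely, so noisy samples contribute less transferred knowledge.
    """
    with torch.no_grad():
        teacher_probs = F.softmax(teacher_logits / tau, dim=-1)
        # Per-sample confidence = max softmax probability of the old model.
        confidence = teacher_probs.max(dim=-1).values
        # Soft weight by confidence, hard-mask low-confidence samples.
        weights = confidence * (confidence >= threshold).float()

    log_student = F.log_softmax(student_logits / tau, dim=-1)
    # Per-sample KL divergence between teacher and student distributions.
    per_sample_kl = F.kl_div(log_student, teacher_probs,
                             reduction="none").sum(dim=-1)
    # Weighted average; clamp avoids division by zero if all are masked.
    return (weights * per_sample_kl).sum() / weights.sum().clamp(min=1e-8)
```

In a CIL setting this loss would be added to the standard cross-entropy on new-class labels, so the student inherits only the teacher's confident predictions.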

Published

2026-03-14

How to Cite

Liu, Y., Yang, H., Sui, J., Wang, H., Li, H., Wang, W., & Jia, Q. (2026). Time Series Class-Incremental Learning via Confidence-guided Mask Distillation and Prototype-guided Contrastive Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 40(28), 23917–23925. https://doi.org/10.1609/aaai.v40i28.39568

Section

AAAI Technical Track on Machine Learning V