Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference
DOI:
https://doi.org/10.1609/aaai.v37i8.26216
Keywords:
ML: Lifelong and Continual Learning, CV: Representation Learning for Vision, ML: Classification and Regression, ML: Transfer, Domain Adaptation, Multi-Task Learning
Abstract
This paper addresses two prevalent problems in incremental learning: stage interference and stage performance imbalance. To avoid pronounced stage learning bottlenecks, we propose a new incremental learning framework that leverages a series of stage-isolated classifiers, each performing the learning task of its own stage without interference from the others. Concretely, to aggregate the multiple stage classifiers into a unified one impartially, we first introduce a temperature-controlled energy metric that indicates the confidence level of each stage classifier. We then propose an anchor-based energy self-normalization strategy to ensure the stage classifiers operate at the same energy level. Finally, we design a voting-based inference augmentation strategy for robust inference. The proposed method is rehearsal-free and applies to almost all incremental learning scenarios. We evaluate it on four large datasets; extensive results demonstrate its superiority, setting new state-of-the-art overall performance. Code is available at https://github.com/iamwangyabin/ESN.
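The temperature-controlled energy metric named in the abstract matches the standard energy score from energy-based models, E(x; T) = -T log Σ_i exp(f_i(x)/T), where lower energy indicates higher confidence. The sketch below is a minimal illustration of how such a score could put isolated stage heads on a common scale and select a prediction; the per-stage anchor offsets and the lowest-energy head selection are simplifying assumptions for exposition, not the paper's exact ESN procedure (which additionally learns the normalization and applies voting-based inference augmentation).

```python
import numpy as np


def energy_score(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Temperature-controlled energy: E(x; T) = -T * logsumexp(logits / T).

    Lower energy corresponds to higher classifier confidence; the
    temperature controls how sharply the score reflects the logits.
    """
    z = logits / temperature
    m = z.max(axis=-1, keepdims=True)  # stabilize log-sum-exp numerically
    lse = m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1))
    return -temperature * lse


def aggregate_stage_predictions(stage_logits, stage_anchors, temperature=1.0):
    """For each sample, pick the stage head with the lowest anchor-aligned
    energy and return that head's prediction (a local class index).

    stage_logits : list of (N, C_s) arrays, one per incremental stage.
    stage_anchors: per-stage energy offsets aligning all heads to a shared
                   anchor level (a hypothetical stand-in for the paper's
                   learned energy self-normalization).
    """
    energies = np.stack([energy_score(l, temperature) - a
                         for l, a in zip(stage_logits, stage_anchors)])  # (S, N)
    best = energies.argmin(axis=0)                                       # (N,)
    local_preds = np.stack([l.argmax(axis=-1) for l in stage_logits])    # (S, N)
    return best, local_preds[best, np.arange(local_preds.shape[1])]


if __name__ == "__main__":
    # Toy usage: two stage heads scoring the same 4 samples.
    rng = np.random.default_rng(0)
    heads = [rng.normal(size=(4, 10)), rng.normal(size=(4, 10))]
    stage, pred = aggregate_stage_predictions(heads, stage_anchors=[0.0, 0.0])
    print(stage, pred)
```

Because each head is trained in isolation, its raw energies need not be comparable across stages; subtracting a per-stage anchor is the simplest way to express the "same energy level" requirement before the heads are compared.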
Published
2023-06-26
How to Cite
Wang, Y., Ma, Z., Huang, Z., Wang, Y., Su, Z., & Hong, X. (2023). Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 10209-10217. https://doi.org/10.1609/aaai.v37i8.26216
Issue
Vol. 37 No. 8 (2023)
Section
AAAI Technical Track on Machine Learning III