Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference


  • Yabin Wang, Xi'an Jiaotong University and Singapore Management University
  • Zhiheng Ma, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences
  • Zhiwu Huang, Singapore Management University and University of Southampton
  • Yaowei Wang, Peng Cheng Laboratory
  • Zhou Su, Xi'an Jiaotong University
  • Xiaopeng Hong, Harbin Institute of Technology and Peng Cheng Laboratory



ML: Lifelong and Continual Learning, CV: Representation Learning for Vision, ML: Classification and Regression, ML: Transfer, Domain Adaptation, Multi-Task Learning


This paper addresses the prevalent problems of stage interference and stage performance imbalance in incremental learning. To avoid obvious stage learning bottlenecks, we propose a new incremental learning framework that leverages a series of stage-isolated classifiers, each performing the learning task of its own stage without interference from the others. Concretely, to aggregate the multiple stage classifiers into a unified one impartially, we first introduce a temperature-controlled energy metric that indicates the confidence level of each stage classifier. We then propose an anchor-based energy self-normalization strategy to ensure that all stage classifiers operate at the same energy level. Finally, we design a voting-based inference augmentation strategy for robust inference. The proposed method is rehearsal-free and applicable to almost all incremental learning scenarios. We evaluate it on four large datasets, and extensive results demonstrate its superiority in setting new state-of-the-art overall performance. Code is available at
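The two core ingredients named in the abstract, a temperature-controlled energy score and anchor-based energy self-normalization, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the energy definition follows the standard form E = -T * logsumexp(logits / T), and `calibrate_temperature` is a hypothetical finite-difference routine that tunes a per-stage temperature so the stage classifier's mean energy matches a shared anchor level; the function names, optimizer, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

def energy(logits, T=1.0):
    """Temperature-controlled energy score E = -T * logsumexp(logits / T).

    Lower energy corresponds to higher confidence of the (stage) classifier.
    `logits` has shape (N, C); the result has shape (N,).
    """
    z = logits / T
    m = z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    return -T * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1)))

def calibrate_temperature(logits, anchor_energy, T0=1.0, lr=0.1, steps=200):
    """Anchor-based self-normalization (sketch, not the paper's exact recipe).

    Tunes this stage's temperature T so that the mean energy of its logits
    matches a shared anchor energy, putting all stage classifiers on a
    common confidence scale before their outputs are aggregated.
    """
    T = T0
    eps = 1e-3
    for _ in range(steps):
        # Finite-difference gradient of the loss (mean_energy(T) - anchor)^2.
        loss_hi = (energy(logits, T + eps).mean() - anchor_energy) ** 2
        loss_lo = (energy(logits, T - eps).mean() - anchor_energy) ** 2
        g = (loss_hi - loss_lo) / (2 * eps)
        T = max(T - lr * g, 1e-3)  # keep the temperature positive
    return T
```

At inference time, one plausible way to use the normalized energies is to route each input to the stage classifier with the lowest (most confident) calibrated energy; the voting-based augmentation mentioned in the abstract would then aggregate such decisions over several augmented views of the input.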




How to Cite

Wang, Y., Ma, Z., Huang, Z., Wang, Y., Su, Z., & Hong, X. (2023). Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 10209-10217.



AAAI Technical Track on Machine Learning III