Resolving Task Confusion in Dynamic Expansion Architectures for Class Incremental Learning

Authors

  • Bingchen Huang Shanghai Key Lab of Intelligent Information Processing, School of Computer Science, Fudan University Shanghai Collaborative Innovation Center on Intelligent Visual Computing
  • Zhineng Chen Shanghai Key Lab of Intelligent Information Processing, School of Computer Science, Fudan University Shanghai Collaborative Innovation Center on Intelligent Visual Computing
  • Peng Zhou University of Maryland, College Park, MD, USA
  • Jiayin Chen Shanghai Key Lab of Intelligent Information Processing, School of Computer Science, Fudan University Shanghai Collaborative Innovation Center on Intelligent Visual Computing
  • Zuxuan Wu Shanghai Key Lab of Intelligent Information Processing, School of Computer Science, Fudan University Shanghai Collaborative Innovation Center on Intelligent Visual Computing

DOI:

https://doi.org/10.1609/aaai.v37i1.25170

Keywords:

CV: Object Detection & Categorization, ML: Lifelong and Continual Learning

Abstract

The dynamic expansion architecture is becoming popular in class incremental learning, mainly due to its advantages in alleviating catastrophic forgetting. However, task confusion is not well assessed within this framework, e.g., the discrepancy between classes of different tasks is not well learned (i.e., inter-task confusion, ITC), and certain priority is still given to the latest class batch (i.e., old-new confusion, ONC). We empirically validate the side effects of the two types of confusion. Meanwhile, a novel solution called Task Correlated Incremental Learning (TCIL) is proposed to encourage discriminative and fair feature utilization across tasks. TCIL performs a multi-level knowledge distillation to propagate knowledge learned from old tasks to the new one. It establishes information flow paths at both feature and logit levels, enabling the learning to be aware of old classes. Besides, an attention mechanism and classifier re-scoring are applied to generate fairer classification scores. We conduct extensive experiments on the CIFAR100 and ImageNet100 datasets. The results demonstrate that TCIL consistently achieves state-of-the-art accuracy. It mitigates both ITC and ONC, while showing advantages in combating catastrophic forgetting even when no rehearsal memory is reserved. Source code: https://github.com/YellowPancake/TCIL.
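The multi-level distillation described in the abstract (information flow at both the feature and logit levels) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual loss: the function names, the choice of mean-squared error for the feature level, the KL divergence with temperature for the logit level, and the mixing weight `alpha` are all assumptions; the authors' real formulation is in the linked source code.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multilevel_distill_loss(feat_old, feat_new, logit_old, logit_new,
                            T=2.0, alpha=0.5):
    """Hypothetical two-level distillation term:
    - feature level: mean squared distance between old- and new-model features
    - logit level: KL divergence between temperature-softened output
      distributions, scaled by T^2 as in standard knowledge distillation
    `alpha` (an assumed hyperparameter) balances the two levels.
    """
    feat_loss = np.mean((np.asarray(feat_new, dtype=float)
                         - np.asarray(feat_old, dtype=float)) ** 2)
    p_old = softmax(logit_old, T)
    p_new = softmax(logit_new, T)
    kl = np.sum(p_old * (np.log(p_old + 1e-12) - np.log(p_new + 1e-12)),
                axis=-1).mean()
    logit_loss = (T ** 2) * kl
    return alpha * feat_loss + (1.0 - alpha) * logit_loss
```

When the new model reproduces the old model's features and logits exactly, both terms vanish; any drift on either level increases the loss, which is the sense in which such a term keeps learning "aware of old classes."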

Published

2023-06-26

How to Cite

Huang, B., Chen, Z., Zhou, P., Chen, J., & Wu, Z. (2023). Resolving Task Confusion in Dynamic Expansion Architectures for Class Incremental Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 37(1), 908-916. https://doi.org/10.1609/aaai.v37i1.25170

Section

AAAI Technical Track on Computer Vision I