Mixture Uniform Distribution Modeling and Asymmetric Mix Distillation for Class Incremental Learning

Authors

  • Sunyuan Qiang, Macau University of Science and Technology
  • Jiayi Hou, Lafayette College
  • Jun Wan, Macau University of Science and Technology; National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; School of Artificial Intelligence, University of Chinese Academy of Sciences
  • Yanyan Liang, Macau University of Science and Technology
  • Zhen Lei, National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; School of Artificial Intelligence, University of Chinese Academy of Sciences
  • Du Zhang, Macau University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v37i8.26137

Keywords:

ML: Lifelong and Continual Learning

Abstract

Exemplar rehearsal-based methods with knowledge distillation (KD) have been widely used in class incremental learning (CIL) scenarios. However, they still suffer from performance degradation because of a severe distribution discrepancy between the training and test sets, caused by the limited storage memory for previous classes. In this paper, we mathematically model the data distribution and the discrepancy at the incremental stages with a mixture uniform distribution (MUD). Then, we propose the asymmetric mix distillation method to uniformly minimize the error of each class from the distribution discrepancy perspective. Specifically, we first extend mixup to CIL scenarios with the incremental mix samplers and the incremental mix factor to calibrate the raw training data distribution. Next, mix distillation label augmentation is incorporated into the data distribution to inherit the knowledge from the previous models. Trained on this augmented data distribution, our model effectively alleviates the performance degradation, and extensive experimental results validate that our method exhibits superior performance on CIL benchmarks.
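The core idea sketched in the abstract (mixing a new-class sample with a rehearsal exemplar, then blending the hard new-class label with the previous model's soft prediction) can be illustrated schematically. The snippet below is a minimal, hypothetical sketch only: the function names, the Beta-distributed mixing factor, and the simple pairing of new samples with exemplars are illustrative assumptions, not the paper's actual incremental mix samplers or incremental mix factor.

```python
import numpy as np

def mixup_pair(x_new, x_old, alpha=0.2, rng=None):
    """Mix a new-class sample with a rehearsal (old-class) exemplar.

    The mixing factor lam is drawn from Beta(alpha, alpha), a common
    mixup choice; the paper's incremental mix factor is not specified
    here, so this is a stand-in.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = float(rng.beta(alpha, alpha))
    return lam * x_new + (1.0 - lam) * x_old, lam

def mixed_distillation_target(y_new_onehot, teacher_probs_old, lam):
    """Blend the hard label of the new sample with the previous model's
    soft prediction on the exemplar, mirroring the idea of mix
    distillation label augmentation (sketched, not the exact method)."""
    return lam * y_new_onehot + (1.0 - lam) * teacher_probs_old

# Illustrative usage with toy 4-class data.
x_new = np.array([1.0, 2.0, 3.0])
x_old = np.array([0.0, 0.0, 1.0])
x_mix, lam = mixup_pair(x_new, x_old)

y_new = np.array([0.0, 0.0, 0.0, 1.0])          # hard label, new class
t_old = np.array([0.7, 0.2, 0.05, 0.05])        # teacher softmax, old classes
target = mixed_distillation_target(y_new, t_old, lam)
```

Because both the one-hot label and the teacher's softmax sum to one, the mixed target remains a valid probability distribution, so it can be used directly with a cross-entropy or KL-style distillation loss.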

Published

2023-06-26

How to Cite

Qiang, S., Hou, J., Wan, J., Liang, Y., Lei, Z., & Zhang, D. (2023). Mixture Uniform Distribution Modeling and Asymmetric Mix Distillation for Class Incremental Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9498-9506. https://doi.org/10.1609/aaai.v37i8.26137

Section

AAAI Technical Track on Machine Learning III