Continual Learning by Using Information of Each Class Holistically

Authors

  • Wenpeng Hu, Peking University
  • Qi Qin, Peking University
  • Mengyu Wang, Peking University
  • Jinwen Ma, Peking University
  • Bing Liu, UIC

Keywords:

Classification and Regression

Abstract

Continual learning (CL) incrementally learns a sequence of tasks while solving the catastrophic forgetting (CF) problem. Existing methods mainly try to deal with CF directly. In this paper, we propose to avoid CF by considering the features of each class holistically rather than only the discriminative information for classifying the classes seen so far. The latter approach is prone to CF because the discriminative information for old classes may not be sufficiently discriminative for the new class to be learned. Consequently, in learning each new task, the network parameters for previous tasks have to be revised, which causes CF. With the holistic class information, the system can still perform well on previous tasks after new tasks are added. The proposed technique is called Per-class Continual Learning (PCL). PCL has two key novelties. (1) It proposes a one-class learning based technique for CL, which considers the features of each class holistically and represents a new approach to solving the CL problem. (2) It proposes a method to extract discriminative information after training to further improve accuracy. Empirical evaluation shows that PCL markedly outperforms state-of-the-art baselines in settings with one or more classes per task, and its gains grow with the number of tasks.
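The core idea of the abstract can be illustrated with a toy sketch: one independent one-class model per class, so learning a new class never revises the parameters of earlier classes. This is only a minimal illustration of the per-class principle (the class names, the nearest-mean scorer, and the `PerClassContinualLearner` API below are assumptions for illustration, not the paper's actual architecture or its post-training discriminative-information extraction step).

```python
import numpy as np

class OneClassModel:
    """Toy one-class model: scores a sample by its negative distance to
    the class mean. A stand-in for a real one-class learner."""
    def __init__(self):
        self.mean = None

    def fit(self, X):
        self.mean = X.mean(axis=0)

    def score(self, x):
        return -np.linalg.norm(x - self.mean)

class PerClassContinualLearner:
    """Keeps one independent one-class model per class. Adding a new
    class trains a fresh model and leaves old models untouched, so
    earlier classes are not forgotten."""
    def __init__(self):
        self.models = {}

    def learn_class(self, label, X):
        model = OneClassModel()
        model.fit(X)
        self.models[label] = model

    def predict(self, x):
        # Classify by the class whose one-class model scores x highest.
        return max(self.models, key=lambda c: self.models[c].score(x))

# Learn three well-separated classes sequentially.
rng = np.random.default_rng(0)
learner = PerClassContinualLearner()
learner.learn_class("a", rng.normal(0.0, 0.1, size=(50, 2)))
learner.learn_class("b", rng.normal(5.0, 0.1, size=(50, 2)))
learner.learn_class("c", rng.normal(-5.0, 0.1, size=(50, 2)))

# The first class is still recognized after two more were added.
print(learner.predict(np.array([0.1, -0.1])))  # -> "a"
```

Because each class's model is fitted and stored independently, the contrast with a jointly trained discriminative classifier is direct: there is no shared parameter set that a new task could overwrite.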

Published

2021-05-18

How to Cite

Hu, W., Qin, Q., Wang, M., Ma, J., & Liu, B. (2021). Continual Learning by Using Information of Each Class Holistically. Proceedings of the AAAI Conference on Artificial Intelligence, 35(9), 7797-7805. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16952

Section

AAAI Technical Track on Machine Learning II