Few-Shot Class-Incremental Learning via Relation Knowledge Distillation

Authors

  • Songlin Dong College of Artificial Intelligence, Xi'an Jiaotong University
  • Xiaopeng Hong School of Cyber Science and Engineering, Xi'an Jiaotong University
  • Xiaoyu Tao College of Artificial Intelligence, Xi'an Jiaotong University
  • Xinyuan Chang School of Software Engineering, Xi'an Jiaotong University
  • Xing Wei School of Software Engineering, Xi'an Jiaotong University
  • Yihong Gong School of Software Engineering, Xi'an Jiaotong University

DOI:

https://doi.org/10.1609/aaai.v35i2.16213

Keywords:

Object Detection & Categorization

Abstract

In this paper, we focus on the challenging few-shot class-incremental learning (FSCIL) problem, which requires transferring knowledge from old tasks to new ones while overcoming catastrophic forgetting. We propose the exemplar relation distillation incremental learning framework to balance the tasks of preserving old knowledge and adapting to new knowledge. First, we construct an exemplar relation graph to represent the knowledge learned by the original network and update it gradually as new tasks are learned. Then, an exemplar relation loss function for discovering the relation knowledge between different classes is introduced to learn and transfer the structural information in the relation graph. Extensive experiments demonstrate that relation knowledge does exist in the exemplars and that our approach outperforms other state-of-the-art class-incremental learning methods on the CIFAR100, miniImageNet, and CUB200 datasets.
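
To illustrate the distillation idea described in the abstract, here is a minimal PyTorch sketch, assuming the relations in the exemplar graph are pairwise cosine similarities between exemplar embeddings. The function names (pairwise_relations, relation_distillation_loss) and this particular loss form are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def pairwise_relations(features: torch.Tensor) -> torch.Tensor:
    """Build a dense relation matrix from exemplar embeddings.

    Assumption: the relation between two exemplars is their cosine
    similarity; the paper's exemplar relation graph may define edges
    differently.
    """
    normed = F.normalize(features, dim=1)  # (N, D) unit-norm embeddings
    return normed @ normed.t()             # (N, N) similarity graph

def relation_distillation_loss(
    teacher_feats: torch.Tensor,  # exemplar embeddings from the old (frozen) network
    student_feats: torch.Tensor,  # embeddings of the same exemplars from the updated network
) -> torch.Tensor:
    """Penalize changes in the exemplar relation structure.

    Matching the two relation matrices encourages the updated network
    to preserve the inter-class structure learned on old tasks.
    """
    with torch.no_grad():
        target = pairwise_relations(teacher_feats)
    current = pairwise_relations(student_feats)
    return F.mse_loss(current, target)
```

In training, such a term would be added to the standard classification loss on the few new-class samples, so that the relation structure over old exemplars constrains the update while the classifier adapts to the new task.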

Published

2021-05-18

How to Cite

Dong, S., Hong, X., Tao, X., Chang, X., Wei, X., & Gong, Y. (2021). Few-Shot Class-Incremental Learning via Relation Knowledge Distillation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(2), 1255-1263. https://doi.org/10.1609/aaai.v35i2.16213

Issue

Vol. 35 No. 2 (2021)
Section

AAAI Technical Track on Computer Vision I