Few-Shot Lifelong Learning
DOI: https://doi.org/10.1609/aaai.v35i3.16334
Keywords: Object Detection & Categorization
Abstract
Many real-world classification problems have classes with very few labeled training samples. Moreover, not all classes may be available for training initially; new classes may arrive incrementally. Deep learning models need to deal with this two-fold problem in order to perform well in real-life situations. In this paper, we propose a novel Few-Shot Lifelong Learning (FSLL) method that enables deep learning models to perform lifelong/continual learning on few-shot data. For every new set of classes, our method trains only a very small subset of the model's parameters instead of the full model, which helps prevent overfitting. We select this subset such that only currently unimportant parameters are chosen; by keeping the important parameters intact, our approach minimizes catastrophic forgetting. Furthermore, we minimize the cosine similarity between the new and the old class prototypes in order to maximize their separation, thereby improving classification performance. We also show that integrating our method with self-supervision improves model performance significantly. We experimentally show that our method significantly outperforms existing methods on the miniImageNet, CIFAR-100, and CUB-200 datasets. Specifically, we outperform the state-of-the-art method by an absolute margin of 19.27% on the CUB dataset.
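The cosine-similarity term described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function name and the toy prototypes are hypothetical:

```python
import numpy as np

def prototype_separation_loss(new_protos, old_protos):
    """Mean pairwise cosine similarity between new- and old-class
    prototypes; minimizing it pushes new prototypes away from old ones."""
    # L2-normalize each prototype so dot products equal cosine similarities.
    new_n = new_protos / np.linalg.norm(new_protos, axis=1, keepdims=True)
    old_n = old_protos / np.linalg.norm(old_protos, axis=1, keepdims=True)
    sims = new_n @ old_n.T  # shape (num_new, num_old)
    return sims.mean()

# Toy example: two old prototypes and one new prototype in 3-D.
old = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
aligned = np.array([[1.0, 0.0, 0.0]])     # overlaps an old class -> high loss
orthogonal = np.array([[0.0, 0.0, 1.0]])  # well separated -> zero loss
print(prototype_separation_loss(aligned, old))     # 0.5
print(prototype_separation_loss(orthogonal, old))  # 0.0
```

Gradient descent on such a loss (with respect to the new prototypes) increases the angular separation between new and old classes, which is the effect the paper exploits.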
Published
2021-05-18
How to Cite
Mazumder, P., Singh, P., & Rai, P. (2021). Few-Shot Lifelong Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(3), 2337-2345. https://doi.org/10.1609/aaai.v35i3.16334
Section
AAAI Technical Track on Computer Vision II