Less-Forgetful Learning for Domain Expansion in Deep Neural Networks

Authors

  • Heechul Jung, Korea Advanced Institute of Science and Technology
  • Jeongwoo Ju, Korea Advanced Institute of Science and Technology
  • Minju Jung, Korea Advanced Institute of Science and Technology
  • Junmo Kim, Korea Advanced Institute of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v32i1.11769

Keywords:

deep learning, image classification, domain expansion, catastrophic forgetting, continual learning

Abstract

Expanding the domain that a deep neural network has already learned, without access to old-domain data, is challenging because deep neural networks forget previously learned information when they learn new data from a new domain. In this paper, we propose a less-forgetful learning method for the domain expansion scenario. Whereas existing domain adaptation techniques focus solely on adapting to the new domain, the proposed technique aims to perform well on both the old and new domains without needing to know which domain an input comes from. We first present two naive approaches and show why they are problematic, and then introduce a new method built on two proposed properties for less-forgetful learning. Finally, we demonstrate the effectiveness of our method through experiments on image classification tasks. All datasets used in the paper will be released on our website for follow-up studies.
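To make the domain expansion setting concrete, below is a minimal, hypothetical sketch of one training step that fine-tunes on new-domain data while penalizing drift of the feature representation away from the frozen pre-expansion network. It is an illustration of the general less-forgetful idea described in the abstract, not the paper's exact method; the module names (`features`, `classifier`), the loss weight `lam`, and the choice of PyTorch are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def less_forgetful_step(new_net, old_net, x, y, optimizer, lam=1e-3):
    """One training step on new-domain data (x, y) that keeps the new
    network's features close to those of the frozen old-domain network.
    `new_net` and `old_net` are assumed to expose `features` and
    `classifier` submodules (hypothetical names for this sketch)."""
    old_net.eval()
    with torch.no_grad():
        old_feat = old_net.features(x)          # reference features from the frozen old network

    new_feat = new_net.features(x)              # features from the network being fine-tuned
    logits = new_net.classifier(new_feat)       # classifier typically kept fixed (excluded from optimizer)

    ce_loss = F.cross_entropy(logits, y)        # fit the new-domain labels
    feat_loss = F.mse_loss(new_feat, old_feat)  # discourage forgetting of old-domain representations

    loss = ce_loss + lam * feat_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, only the feature extractor's parameters would be passed to `optimizer`, so the decision boundaries of the original classifier stay fixed while the shared representation is nudged to cover both domains.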

Published

2018-04-29

How to Cite

Jung, H., Ju, J., Jung, M., & Kim, J. (2018). Less-Forgetful Learning for Domain Expansion in Deep Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11769