Leveraging Old Knowledge to Continually Learn New Classes in Medical Images

Authors

  • Evelyn Chee National University of Singapore
  • Mong Li Lee National University of Singapore
  • Wynne Hsu National University of Singapore

DOI:

https://doi.org/10.1609/aaai.v37i12.26659

Keywords:

General

Abstract

Class-incremental continual learning is a core step towards developing artificial intelligence systems that can continuously adapt to changes in the environment by learning new concepts without forgetting those previously learned. This is especially needed in the medical domain, where continually learning from new incoming data is required to classify an expanded set of diseases. In this work, we focus on how old knowledge can be leveraged to learn new classes without catastrophic forgetting. We propose a framework that comprises two main components: (1) a dynamic architecture with expanding representations to preserve previously learned features and accommodate new features; and (2) a training procedure alternating between two objectives to balance the learning of new features while maintaining the model’s performance on old classes. Experimental results on multiple medical datasets show that our solution achieves superior performance over state-of-the-art baselines in terms of class accuracy and forgetting.
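The abstract's first component, a dynamic architecture with expanding representations, can be illustrated with a minimal sketch. This is not the authors' implementation; the class names (`FeatureExtractor`, `ExpandingModel`) and the freezing logic are illustrative assumptions about how a per-task expanding representation is commonly structured: each new task adds a feature extractor, earlier extractors are frozen to preserve old features, and the model's representation is the concatenation of all extractors' outputs.

```python
# Conceptual sketch only (not the paper's code): class-incremental learning
# with an expanding representation. Each task adds a new feature extractor;
# previous extractors are frozen so previously learned features are preserved.

class FeatureExtractor:
    def __init__(self, dim):
        self.dim = dim
        self.frozen = False  # frozen extractors receive no gradient updates

    def extract(self, x):
        # Stand-in for a learned mapping; a real model would use a CNN.
        return [xi * 0.5 for xi in x[: self.dim]]


class ExpandingModel:
    def __init__(self):
        self.extractors = []  # one extractor per learned task
        self.num_classes = 0

    def add_task(self, feature_dim, new_classes):
        # Freeze all earlier extractors, then grow the representation
        # with a fresh extractor for the incoming classes.
        for e in self.extractors:
            e.frozen = True
        self.extractors.append(FeatureExtractor(feature_dim))
        self.num_classes += new_classes

    def representation(self, x):
        # Concatenate old (frozen) and new features into one vector.
        feats = []
        for e in self.extractors:
            feats.extend(e.extract(x))
        return feats


model = ExpandingModel()
model.add_task(feature_dim=4, new_classes=2)  # task 1
model.add_task(feature_dim=4, new_classes=2)  # task 2: representation expands
rep = model.representation([1.0] * 8)
print(len(rep))  # combined feature dimension grows with each task
```

In this sketch the classifier head (omitted) would be trained over the concatenated features, alternating between a new-class objective and an old-class preservation objective, as the abstract's second component describes.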

Published

2023-06-26

How to Cite

Chee, E., Lee, M. L., & Hsu, W. (2023). Leveraging Old Knowledge to Continually Learn New Classes in Medical Images. Proceedings of the AAAI Conference on Artificial Intelligence, 37(12), 14178-14186. https://doi.org/10.1609/aaai.v37i12.26659

Section

AAAI Special Track on AI for Social Impact