Sparse Coding in a Dual Memory System for Lifelong Learning

Authors

  • Fahad Sarfraz, NavInfo Europe, TU/e
  • Elahe Arani, NavInfo Europe, TU/e
  • Bahram Zonooz, NavInfo Europe, TU/e

DOI:

https://doi.org/10.1609/aaai.v37i8.26161

Keywords:

ML: Lifelong and Continual Learning, ML: Bio-Inspired Learning

Abstract

Efficient continual learning in humans is enabled by a rich set of neurophysiological mechanisms and interactions between multiple memory systems. The brain efficiently encodes information in non-overlapping sparse codes, which allows new associations to be learned faster while keeping interference with previous associations under control. To mimic sparse coding in DNNs, we enforce activation sparsity along with a dropout mechanism that encourages the model to activate similar units for semantically similar inputs and to reduce the overlap with the activation patterns of semantically dissimilar inputs. This provides an efficient mechanism for balancing the reusability of features against interference between them, depending on the similarity of classes across tasks. Furthermore, we employ sparse coding in a multiple-memory replay mechanism: our method maintains an additional long-term semantic memory that aggregates and consolidates the information encoded in the synaptic weights of the working model. Our extensive evaluation and characteristics analysis show that, equipped with these biologically inspired mechanisms, the model further mitigates forgetting. Code is available at https://github.com/NeurAI-Lab/SCoMMER.
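The sketch below illustrates the two mechanisms the abstract describes, in PyTorch: (1) activation sparsity via a k-winners-take-all nonlinearity combined with an activation-dependent dropout heuristic, and (2) a long-term semantic memory maintained as an exponential moving average of the working model's weights. All names here (`k_winner`, `SparseLayer`, `update_semantic_memory`, `decay`, `beta`) and the exact dropout rule are illustrative assumptions, not the authors' implementation; see the linked repository for the official code.

```python
# Minimal sketch of activation sparsity + semantic dropout + EMA semantic
# memory, assuming standard PyTorch. Hypothetical names and hyperparameters.
import copy
import torch
import torch.nn as nn


def k_winner(x: torch.Tensor, k: int) -> torch.Tensor:
    """Enforce activation sparsity: keep the k largest activations per
    sample and zero out the rest (k-winners-take-all)."""
    topk = torch.topk(x, k, dim=1)
    mask = torch.zeros_like(x).scatter_(1, topk.indices, 1.0)
    return x * mask


class SparseLayer(nn.Module):
    """Linear layer with k-WTA sparsity and an activation-dependent dropout.
    Units that fire often accumulate higher activation counts and are kept
    with higher probability, nudging semantically similar inputs toward
    reusing the same units (an assumed heuristic, not the paper's rule)."""

    def __init__(self, in_dim: int, out_dim: int, k: int, beta: float = 0.99):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
        self.k, self.beta = k, beta
        self.register_buffer("act_counts", torch.zeros(out_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = k_winner(torch.relu(self.fc(x)), self.k)
        if self.training:
            with torch.no_grad():
                # Running estimate of how often each unit is active.
                self.act_counts.mul_(self.beta).add_((h > 0).float().mean(0))
            keep_prob = torch.sigmoid(self.act_counts)  # frequent units kept
            h = h * torch.bernoulli(keep_prob).expand_as(h)
        return h


@torch.no_grad()
def update_semantic_memory(working: nn.Module, semantic: nn.Module,
                           decay: float = 0.999) -> None:
    """Long-term semantic memory as an exponential moving average of the
    working model's weights: one plausible reading of 'aggregates and
    consolidates information encoded in the synaptic weights'."""
    for w, s in zip(working.parameters(), semantic.parameters()):
        s.mul_(decay).add_(w, alpha=1.0 - decay)


# Usage: after each optimizer step on the working model, consolidate
# its weights into the slowly moving semantic memory.
working = nn.Sequential(SparseLayer(784, 512, k=32), nn.Linear(512, 10))
semantic = copy.deepcopy(working)
update_semantic_memory(working, semantic)
```

Under this reading, the semantic memory changes slowly (decay close to 1), so it retains consolidated knowledge from earlier tasks while the working model adapts to the current one.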

Published

2023-06-26

How to Cite

Sarfraz, F., Arani, E., & Zonooz, B. (2023). Sparse Coding in a Dual Memory System for Lifelong Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9714-9722. https://doi.org/10.1609/aaai.v37i8.26161

Issue

Vol. 37 No. 8 (2023)

Section

AAAI Technical Track on Machine Learning III