Learning Dynamic Latent Spaces for Lifelong Generative Modelling

Authors

  • Fei Ye, University of York
  • Adrian G. Bors, University of York

DOI:

https://doi.org/10.1609/aaai.v37i9.26291

Keywords:

ML: Lifelong and Continual Learning, ML: Ensemble Methods, ML: Representation Learning, ML: Transfer, Domain Adaptation, Multi-Task Learning

Abstract

Task-Free Continual Learning (TFCL) aims to capture novel concepts from non-stationary data streams without forgetting previously learnt knowledge. Mixture models, which add new components when certain conditions are met, have shown promising results on TFCL tasks. However, such approaches do not exploit the knowledge already accumulated for positive knowledge transfer. In this paper, we develop a new model, the Online Recursive Variational Autoencoder (ORVAE). ORVAE exploits prior knowledge when adding new components, selectively incorporating newly learnt information according to what has already been learnt from past data. We introduce a new attention mechanism that regularizes the structured latent space, reusing the most important information while inactivating information that interferes with novel samples. The proposed attention mechanism maximizes the benefit of forward transfer when learning novel information, without forgetting previously learnt knowledge. Experiments show that ORVAE achieves state-of-the-art results under TFCL.
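
The abstract gives no implementation details, so the following PyTorch sketch only illustrates the general dynamic-expansion pattern it describes, not the authors' actual ORVAE. The component architecture, the reconstruction-based novelty threshold, the warm-start expansion rule, and the sigmoid attention gate over latent dimensions (names such as VAEComponent, DynamicMixtureVAE, and threshold) are all assumptions introduced for illustration.

```python
# Hypothetical sketch of a dynamically expanding mixture of VAEs with a
# latent attention gate, loosely following the abstract; not the paper's
# ORVAE formulation.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class VAEComponent(nn.Module):
    """One mixture component: a small Gaussian VAE whose latent dimensions
    are gated by a learnable attention vector (illustrative assumption)."""

    def __init__(self, x_dim: int, z_dim: int, h_dim: int = 128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        # Per-dimension attention logits: sigmoid(logits) near 1 reuses a
        # latent dimension, near 0 inactivates it for interfering samples.
        self.attn_logits = nn.Parameter(torch.zeros(z_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        gate = torch.sigmoid(self.attn_logits)
        return self.dec(z * gate), mu, logvar

    def elbo_loss(self, x):
        x_hat, mu, logvar = self(x)
        rec = F.mse_loss(x_hat, x, reduction="none").sum(-1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (rec + kl).mean()


class DynamicMixtureVAE(nn.Module):
    """Grows a mixture of VAE components on a task-free stream: each batch is
    routed to its best-fitting component, and a new component is spawned
    (warm-started from the best one, mimicking forward transfer) when even
    the best fit is worse than a novelty threshold."""

    def __init__(self, x_dim: int, z_dim: int, threshold: float = 50.0):
        super().__init__()
        self.components = nn.ModuleList([VAEComponent(x_dim, z_dim)])
        self.threshold = threshold

    def train_step(self, x, lr: float = 1e-3):
        with torch.no_grad():
            losses = [c.elbo_loss(x).item() for c in self.components]
        best = min(range(len(losses)), key=losses.__getitem__)
        if losses[best] > self.threshold:
            # Expansion: copy the closest component so accumulated knowledge
            # is reused rather than training the new component from scratch.
            self.components.append(copy.deepcopy(self.components[best]))
            best = len(self.components) - 1
        comp = self.components[best]
        # A fresh optimizer per step keeps the sketch short; a real system
        # would persist one optimizer per component.
        opt = torch.optim.Adam(comp.parameters(), lr=lr)
        opt.zero_grad()
        loss = comp.elbo_loss(x)
        loss.backward()
        opt.step()
        return best, loss.item()


if __name__ == "__main__":
    model = DynamicMixtureVAE(x_dim=784, z_dim=32)
    for _ in range(5):  # stand-in for a non-stationary data stream
        batch = torch.randn(64, 784)
        idx, loss = model.train_step(batch)
        print(f"component {idx}, loss {loss:.1f}")
```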

Published

2023-06-26

How to Cite

Ye, F., & Bors, A. G. (2023). Learning Dynamic Latent Spaces for Lifelong Generative Modelling. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 10891-10899. https://doi.org/10.1609/aaai.v37i9.26291

Issue

Vol. 37 No. 9 (2023)

Section

AAAI Technical Track on Machine Learning IV