MetaZSCIL: A Meta-Learning Approach for Generalized Zero-Shot Class Incremental Learning

Authors

  • Yanan Wu, Beijing Jiaotong University
  • Tengfei Liang, Beijing Jiaotong University
  • Songhe Feng, School of Computer and Information Technology, Beijing Jiaotong University
  • Yi Jin, Beijing Jiaotong University
  • Gengyu Lyu, Beijing University of Technology
  • Haojun Fei, 360 DigiTech, Inc.
  • Yang Wang, Concordia University

DOI:

https://doi.org/10.1609/aaai.v37i9.26238

Keywords:

ML: Multi-Instance/Multi-View Learning, CV: Image and Video Retrieval

Abstract

Generalized zero-shot learning (GZSL) aims to recognize samples whose categories may not have been seen during training. Standard GZSL, however, cannot handle the dynamic addition of new seen and unseen classes. To address this limitation, recent attempts have been made to develop continual GZSL methods. However, these methods require end-users to continuously collect and annotate numerous seen-class samples, which is unrealistic and hampers real-world applicability. Accordingly, in this paper we propose a more practical and challenging setting named Generalized Zero-Shot Class Incremental Learning (CI-GZSL). Our setting aims to incrementally learn unseen classes without any training samples, while recognizing all classes previously encountered. We further propose a bi-level meta-learning based method, MetaZSCIL, which directly optimizes the network to learn how to incrementally learn. Specifically, we sample sequential tasks from the seen classes during offline training to simulate the incremental learning process. For each task, the model is learned with a meta-objective such that it is capable of fast adaptation without forgetting. Note that our optimization can be flexibly equipped with most existing generative methods to tackle CI-GZSL. This work introduces a feature generative framework that leverages visual feature distribution alignment to produce replayed samples of previously seen classes, thereby reducing catastrophic forgetting. Extensive experiments conducted on five widely used benchmarks demonstrate the superiority of the proposed method.
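To make the optimization described in the abstract concrete, the sketch below illustrates what a bi-level meta-learning loop with generative replay could look like. It is not the authors' implementation: the module names, feature and attribute dimensions, toy data, the simple MSE alignment term, and the first-order (Reptile-style) meta-update are all illustrative assumptions standing in for the paper's feature generative framework and visual feature distribution alignment.

```python
# Minimal illustrative sketch (not the authors' code) of bi-level meta-learning
# with generative replay for class-incremental GZSL. Dimensions, data, and the
# Reptile-style meta-update are assumptions made for this example only.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, ATTR_DIM, NUM_CLASSES = 64, 16, 20  # hypothetical sizes

class Generator(nn.Module):
    """Maps a class attribute vector (plus noise) to a synthetic visual feature."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ATTR_DIM * 2, 128), nn.ReLU(),
                                 nn.Linear(128, FEAT_DIM))
    def forward(self, attrs):
        noise = torch.randn_like(attrs)
        return self.net(torch.cat([attrs, noise], dim=-1))

classifier = nn.Linear(FEAT_DIM, NUM_CLASSES)
generator = Generator()
meta_params = list(classifier.parameters()) + list(generator.parameters())
attributes = torch.randn(NUM_CLASSES, ATTR_DIM)     # per-class semantics (assumed given)
features = torch.randn(NUM_CLASSES, 50, FEAT_DIM)   # toy "real" seen-class features

def task_loss(clf, gen, class_ids):
    """Classification loss on real features of `class_ids` plus replayed (generated) ones,
    plus a crude feature-alignment term for the generator."""
    real_x = features[class_ids].reshape(-1, FEAT_DIM)
    real_y = torch.as_tensor(class_ids).repeat_interleave(50)
    fake_x = gen(attributes[class_ids]).detach()     # replay-style synthetic features
    fake_y = torch.as_tensor(class_ids)
    x, y = torch.cat([real_x, fake_x]), torch.cat([real_y, fake_y])
    align = F.mse_loss(gen(attributes[class_ids]), features[class_ids].mean(dim=1))
    return F.cross_entropy(clf(x), y) + align

for meta_step in range(100):                         # offline meta-training
    # Sample a pseudo-incremental sequence of tasks from the seen classes.
    order = torch.randperm(NUM_CLASSES).tolist()
    tasks = [order[i:i + 5] for i in range(0, NUM_CLASSES, 5)]

    fast_clf, fast_gen = copy.deepcopy(classifier), copy.deepcopy(generator)
    inner_opt = torch.optim.SGD(
        list(fast_clf.parameters()) + list(fast_gen.parameters()), lr=1e-2)

    seen_so_far = []
    for task in tasks:                               # inner loop: adapt task by task
        seen_so_far += task
        for _ in range(3):
            inner_opt.zero_grad()
            # Meta-objective: after adapting on the new task, remain accurate on all
            # classes encountered so far (fast adaptation without forgetting).
            loss = task_loss(fast_clf, fast_gen, task) \
                 + task_loss(fast_clf, fast_gen, seen_so_far)
            loss.backward()
            inner_opt.step()

    # First-order (Reptile-style) meta-update: move meta-parameters toward the adapted ones.
    with torch.no_grad():
        adapted = list(fast_clf.parameters()) + list(fast_gen.parameters())
        for p, q in zip(meta_params, adapted):
            p.add_(0.1 * (q - p))
```

In the paper itself, the toy generator and MSE term above would be replaced by the proposed feature generative framework with visual feature distribution alignment, and at deployment time unseen classes are added using only their semantic descriptions, with replayed features standing in for the seen-class data that end-users no longer need to collect.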

Published

2023-06-26

How to Cite

Wu, Y., Liang, T., Feng, S., Jin, Y., Lyu, G., Fei, H., & Wang, Y. (2023). MetaZSCIL: A Meta-Learning Approach for Generalized Zero-Shot Class Incremental Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 10408-10416. https://doi.org/10.1609/aaai.v37i9.26238

Section

AAAI Technical Track on Machine Learning IV