Scaling Few-Shot Learning for the Open World

Authors

  • Zhipeng Lin, National University of Defense Technology
  • Wenjing Yang, National University of Defense Technology
  • Haotian Wang, Academy of Military Science; National University of Defense Technology
  • Haoang Chi, National University of Defense Technology
  • Long Lan, National University of Defense Technology
  • Ji Wang, National University of Defense Technology

DOI:

https://doi.org/10.1609/aaai.v38i12.29291

Keywords:

ML: Transfer, Domain Adaptation, Multi-Task Learning, ML: Representation Learning

Abstract

Few-shot learning (FSL) aims to equip learning models with the ability to adapt automatically to novel (unseen) classes in open-world scenarios. Nonetheless, there is a significant disparity between the vast number of new concepts encountered in the open world and the limited scale of existing FSL work, which primarily focuses on a small number of novel classes. This gap hinders the practical applicability of FSL in realistic scenarios. To bridge this gap, we propose a new problem named Few-Shot Learning with Many Novel Classes (FSL-MNC), which substantially enlarges the number of novel classes, exceeding the count in the traditional FSL setup by over 500-fold. This new problem exhibits two major challenges: the increased computation overhead during meta-training, and the degraded classification performance caused by the large number of classes during meta-testing. To overcome these challenges, we propose a Simple Hierarchy Pipeline (SHA-Pipeline). Due to the inefficiency of traditional episodic meta-learning (EML) protocols, we redesign a lightweight training strategy to reduce the overhead brought by the much larger number of novel classes. To capture discriminative semantics across numerous novel classes, we effectively reconstruct and leverage class hierarchy information during meta-testing. Experiments show that the proposed SHA-Pipeline significantly outperforms not only the ProtoNet baseline but also state-of-the-art alternatives across different numbers of novel classes.
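The coarse-to-fine idea behind the abstract — classify a query against superclass-level prototypes first, then only against the classes inside the chosen superclass — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' SHA-Pipeline: the greedy hierarchy construction, function names, and all parameters here are assumptions for exposition.

```python
# Hedged sketch of two-stage, prototype-based classification with a
# flat two-level class hierarchy (superclass -> classes). All details
# are illustrative assumptions, not the paper's implementation.
from collections import defaultdict
import math


def centroid(vectors):
    """Mean of equal-length vectors, i.e. a ProtoNet-style class prototype."""
    n = len(vectors)
    return [sum(xs) / n for xs in zip(*vectors)]


def build_hierarchy(prototypes, num_groups):
    """Naive hierarchy: take the first `num_groups` class prototypes as
    superclass seeds, then assign every class to its nearest seed."""
    seeds = dict(list(prototypes.items())[:num_groups])
    groups = defaultdict(list)
    for cls, proto in prototypes.items():
        nearest = min(seeds, key=lambda s: math.dist(proto, seeds[s]))
        groups[nearest].append(cls)
    return dict(groups), seeds


def classify(query, prototypes, groups, seeds):
    """Coarse step: nearest superclass seed; fine step: nearest class
    prototype within that superclass only."""
    super_cls = min(seeds, key=lambda s: math.dist(query, seeds[s]))
    return min(groups[super_cls], key=lambda c: math.dist(query, prototypes[c]))
```

With many novel classes, the fine step only compares the query against the classes in one superclass, so the per-query cost drops from O(#classes) to roughly O(#superclasses + #classes per superclass); a real pipeline would build the hierarchy with clustering rather than this greedy seeding.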

Published

2024-03-24

How to Cite

Lin, Z., Yang, W., Wang, H., Chi, H., Lan, L., & Wang, J. (2024). Scaling Few-Shot Learning for the Open World. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13846-13854. https://doi.org/10.1609/aaai.v38i12.29291

Section

AAAI Technical Track on Machine Learning III