Any-Way Meta Learning

Authors

  • JunHoo Lee, Seoul National University
  • Yearim Kim, Seoul National University
  • Hyunho Lee, Seoul National University
  • Nojun Kwak, Seoul National University

DOI:

https://doi.org/10.1609/aaai.v38i12.29242

Keywords:

ML: Deep Learning Algorithms, ML: Representation Learning

Abstract

Although meta-learning shows promising performance in the realm of rapid adaptation, it is constrained by a fixed cardinality: when faced with tasks whose cardinalities were unseen during training, the model fails to adapt. In this paper, we address and resolve this challenge by harnessing the "label equivalence" that emerges from the stochastic numeric label assignment performed during episodic task sampling. Questioning what defines "true" meta-learning, we introduce the "any-way" learning paradigm, a training approach that liberates the model from fixed-cardinality constraints. Surprisingly, this model not only matches but often outperforms traditional fixed-way models in performance, convergence speed, and stability, disrupting established notions about domain generalization. Furthermore, we argue that label equivalence inherently lacks semantic information. To bridge this semantic gap, we further propose a mechanism for infusing semantic class information into the model, enhancing its comprehension and functionality. Experiments conducted on well-known architectures such as MAML and ProtoNet affirm the effectiveness of our method.
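To make the sampling idea concrete, below is a minimal Python sketch of episodic task sampling with stochastic numeric label assignment (the source of the label equivalence the abstract describes), plus an "any-way" variant that also randomizes the cardinality N per episode. The function names, dataset layout, and way/shot parameters are illustrative assumptions, not the authors' exact protocol.

```python
import random

# Minimal sketch of episodic sampling with stochastic numeric labels.
# Dataset layout (a dict of class name -> list of examples), function
# names, and parameters are assumptions for illustration only.

def sample_episode(dataset, ways, shots, queries):
    """Sample one N-way episode.

    The sampled classes are mapped to fresh numeric labels 0..ways-1
    each episode, so label "2" in one episode and label "2" in another
    refer to unrelated classes -- the "label equivalence" the paper
    builds on.
    """
    classes = random.sample(list(dataset.keys()), ways)
    random.shuffle(classes)  # stochastic class -> numeric-label assignment
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = random.sample(dataset[cls], shots + queries)
        support += [(x, label) for x in examples[:shots]]
        query += [(x, label) for x in examples[shots:]]
    return support, query

def sample_any_way_episode(dataset, max_ways, shots, queries):
    """'Any-way' variant: the cardinality N itself is redrawn per
    episode, so training never commits to one fixed output size."""
    ways = random.randint(2, max_ways)
    return sample_episode(dataset, ways, shots, queries)

# Toy usage: 5 classes, each with a handful of dummy examples.
toy = {f"class_{i}": list(range(10)) for i in range(5)}
support, query = sample_any_way_episode(toy, max_ways=5, shots=1, queries=2)
```

A fixed-way trainer would call sample_episode with a constant ways; the any-way variant simply redraws N each episode, which is what frees the model from a single training-time cardinality.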

Published

2024-03-24

How to Cite

Lee, J., Kim, Y., Lee, H., & Kwak, N. (2024). Any-Way Meta Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13400-13408. https://doi.org/10.1609/aaai.v38i12.29242

Issue

Vol. 38 No. 12

Section

AAAI Technical Track on Machine Learning III