Memory-Reduced Meta-Learning with Guaranteed Convergence

Authors

  • Honglin Yang, Department of Automation, Xiamen University; Key Laboratory of Multimedia Trusted Perception and Efficient Computing, Ministry of Education of China
  • Ji Ma, Department of Automation, Xiamen University; Key Laboratory of Multimedia Trusted Perception and Efficient Computing, Ministry of Education of China
  • Xiao Yu, Institute of Artificial Intelligence, Xiamen University; Key Laboratory of Multimedia Trusted Perception and Efficient Computing, Ministry of Education of China

DOI:

https://doi.org/10.1609/aaai.v39i20.35501

Abstract

The optimization-based meta-learning approach is gaining traction because of its unique ability to adapt quickly to a new task using only small amounts of data. However, existing optimization-based meta-learning approaches, such as MAML, ANIL, and their variants, generally employ backpropagation for upper-level gradient estimation, which requires storing historical lower-level parameters/gradients and thus increases the computational and memory overhead of each iteration. In this paper, we propose a meta-learning algorithm that avoids using historical parameters/gradients and significantly reduces per-iteration memory costs compared to existing optimization-based meta-learning approaches. Beyond memory reduction, we prove that our proposed algorithm converges sublinearly in the number of upper-level iterations, and that the convergence error decays sublinearly in the batch size of sampled tasks. In the special case of deterministic meta-learning, we further prove that our proposed algorithm converges to an exact solution. Moreover, we quantify the computational complexity of the algorithm, which matches existing convergence results on meta-learning even without using any historical parameters/gradients. Experimental results on meta-learning benchmarks confirm the efficacy of our proposed algorithm.
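To illustrate the memory issue the abstract refers to (this is a hypothetical toy example, not the paper's algorithm): backpropagating the upper-level gradient through a lower-level update requires the inner Hessian, i.e., quantities from the inner trajectory, whereas a first-order approximation needs only the adapted parameters. On a quadratic task loss L_i(w) = 0.5·||w − c_i||², one inner gradient step gives w' = w − α(w − c_i), and the exact meta-gradient is (1 − α)(w' − c_i) while the first-order surrogate is simply (w' − c_i).

```python
import numpy as np

# Toy task loss: L_i(w) = 0.5 * ||w - c_i||^2 (illustrative only; not the
# setup or algorithm from the paper).
alpha = 0.1                        # lower-level step size
w = np.array([1.0, -2.0])          # meta-parameters
c = np.array([0.5, 0.5])           # task-specific optimum

# One lower-level gradient step: w' = w - alpha * (w - c).
w_inner = w - alpha * (w - c)

# Exact meta-gradient d L_i(w') / d w = (I - alpha * H)(w' - c) with inner
# Hessian H = I here; in general, forming this term is what forces
# backpropagation to retain historical lower-level quantities.
meta_grad_full = (1.0 - alpha) * (w_inner - c)

# First-order approximation: drop the Hessian term, keep only w'.
meta_grad_fo = w_inner - c

print(meta_grad_full, meta_grad_fo)
```

On this quadratic, the two gradients differ only by the scalar factor (1 − α), which makes the trade-off concrete: the approximation discards curvature information in exchange for not storing the inner trajectory.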

Published

2025-04-11

How to Cite

Yang, H., Ma, J., & Yu, X. (2025). Memory-Reduced Meta-Learning with Guaranteed Convergence. Proceedings of the AAAI Conference on Artificial Intelligence, 39(20), 21938–21946. https://doi.org/10.1609/aaai.v39i20.35501

Section

AAAI Technical Track on Machine Learning VI