Riemannian Geometric-based Meta Learning
DOI: https://doi.org/10.1609/aaai.v39i19.34185

Abstract
Meta-learning, or "learning to learn," aims to enable models to quickly adapt to new tasks with minimal data. While traditional methods like Model-Agnostic Meta-Learning (MAML) optimize parameters in Euclidean space, they often struggle to capture complex learning dynamics, particularly in few-shot learning scenarios. To address this limitation, we propose Stiefel-MAML, which integrates Riemannian geometry by optimizing within the Stiefel manifold, a space that naturally enforces orthogonality constraints. By leveraging the geometric structure of the Stiefel manifold, we improve parameter expressiveness and enable more efficient optimization through Riemannian gradient calculations and retraction operations. We also introduce a novel kernel-based loss function defined on the Stiefel manifold, further enhancing the model's ability to explore the parameter space. Experimental results on benchmark datasets—including Omniglot, Mini-ImageNet, FC-100, and CUB—demonstrate that Stiefel-MAML consistently outperforms traditional MAML, achieving superior performance across various few-shot learning tasks. Our findings highlight the potential of Riemannian geometry to enhance meta-learning, paving the way for future research on optimizing over different geometric structures.

Published
2025-04-11
How to Cite
Park, J., Lee, Y., Kim, T.-J., & Choi, J.-H. (2025). Riemannian Geometric-based Meta Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 39(19), 19839–19847. https://doi.org/10.1609/aaai.v39i19.34185
Section: AAAI Technical Track on Machine Learning V
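The abstract's core operations, projecting a Euclidean gradient onto the tangent space of the Stiefel manifold and retracting the update back onto the manifold, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; it assumes the Euclidean metric, a QR-based retraction, and a toy objective f(X) = ||X − A||²_F / 2 chosen purely for demonstration.

```python
import numpy as np

def project_tangent(X, G):
    """Project a Euclidean gradient G onto the tangent space of the
    Stiefel manifold {X : X^T X = I} at X (Euclidean metric)."""
    sym = 0.5 * (X.T @ G + G.T @ X)
    return G - X @ sym

def qr_retraction(X, xi):
    """Map the step X + xi back onto the Stiefel manifold via the
    QR decomposition, fixing column signs so R has a positive diagonal."""
    Q, R = np.linalg.qr(X + xi)
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1.0
    return Q * signs

# Toy usage: one Riemannian gradient step on f(X) = ||X - A||_F^2 / 2.
rng = np.random.default_rng(0)
n, p = 5, 3
X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random Stiefel point
A = rng.standard_normal((n, p))

G = X - A                             # Euclidean gradient of f at X
xi = project_tangent(X, G)            # Riemannian gradient (tangent vector)
X_new = qr_retraction(X, -0.1 * xi)   # descent step + retraction
# X_new remains on the manifold: X_new.T @ X_new is (numerically) the identity.
```

In practice, Stiefel-MAML would apply such projection–retraction steps inside the meta-learning inner loop in place of plain Euclidean gradient updates; libraries such as Pymanopt or Geoopt provide production-grade versions of these primitives.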