Dual-Level Curriculum Meta-Learning for Noisy Few-Shot Learning Tasks
DOI:
https://doi.org/10.1609/aaai.v38i13.29392
Keywords:
ML: Other Foundations of Machine Learning
Abstract
Few-shot learning (FSL) is essential in many practical applications. However, the limited number of training examples makes models more vulnerable to label noise, which can severely degrade generalization. To address this critical challenge, we propose a curriculum meta-learning model that employs a novel dual-level class-example sampling strategy to create a robust curriculum for adaptive task distribution formulation and robust model training. At the class level, the framework uses a heuristic sampling criterion that measures pairwise class boundary complexity to form a class curriculum; at the example level, it selects examples through an under-trained proxy model to form an example curriculum. By utilizing both class-level and example-level information, our approach is more robust in handling the limited training data and noisy labels that commonly occur in few-shot learning tasks. The model exhibits efficient convergence behavior, which we verify through rigorous convergence analysis. Additionally, we establish a novel error bound through a hierarchical PAC-Bayesian analysis for curriculum meta-learning under noise. Extensive experiments demonstrate that our framework outperforms existing noisy few-shot learning methods on various few-shot classification benchmarks. Our code is available at https://github.com/ritmininglab/DCML.
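The dual-level sampling idea described in the abstract can be illustrated with a small toy sketch. The code below is our own simplified re-creation under explicit assumptions (prototype distance as a stand-in for pairwise class boundary complexity, per-example loss from an under-trained proxy model as the noise signal, and a softmax pacing rule); it is not the authors' exact criteria or implementation, for which see the repository linked above.

```python
# Toy sketch of dual-level class-example curriculum sampling.
# Assumptions (not from the paper): prototype distance approximates class
# boundary complexity; low proxy-model loss marks likely-clean examples.
import numpy as np

rng = np.random.default_rng(0)

def class_complexity(prototypes):
    """Heuristic pairwise boundary complexity: classes whose prototypes sit
    closer to the others are treated as harder (illustrative proxy only)."""
    n = len(prototypes)
    dists = np.linalg.norm(prototypes[:, None, :] - prototypes[None, :, :], axis=-1)
    avg_dist = dists.sum(axis=1) / (n - 1)      # mean distance to other classes
    return 1.0 / (avg_dist + 1e-8)              # closer prototypes -> higher complexity

def sample_task(prototypes, labels, proxy_loss, n_way=5, k_shot=5, keep_ratio=0.7):
    """Form one few-shot task: favor easy classes, then keep low-loss examples."""
    # Class-level curriculum: bias sampling toward low-complexity classes.
    comp = class_complexity(prototypes)
    probs = np.exp(-comp) / np.exp(-comp).sum()
    classes = rng.choice(len(prototypes), size=n_way, replace=False, p=probs)

    support_idx = []
    for c in classes:
        idx = np.where(labels == c)[0]
        # Example-level curriculum: keep the examples the under-trained proxy
        # model finds easiest (lowest loss), assumed less likely mislabeled.
        ranked = idx[np.argsort(proxy_loss[idx])]
        keep = ranked[: max(k_shot, int(keep_ratio * len(idx)))]
        support_idx.extend(rng.choice(keep, size=k_shot, replace=False))
    return classes, np.array(support_idx)

# Synthetic usage: 20 classes, 30 examples each, random stand-in proxy losses.
n_classes, dim, per_class = 20, 16, 30
prototypes = rng.normal(size=(n_classes, dim))
labels = np.repeat(np.arange(n_classes), per_class)
proxy_loss = rng.random(n_classes * per_class)
classes, support = sample_task(prototypes, labels, proxy_loss)
print("sampled classes:", classes, "| support set size:", support.size)
```

In this sketch the pacing between "easy" and "hard" classes is fixed by the softmax over complexities; a curriculum schedule that gradually flattens this distribution over meta-training iterations would be the natural extension, but the exact schedule is left to the paper and code.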
Published
2024-03-24
How to Cite
Que, X., & Yu, Q. (2024). Dual-Level Curriculum Meta-Learning for Noisy Few-Shot Learning Tasks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14740-14748. https://doi.org/10.1609/aaai.v38i13.29392
Section
AAAI Technical Track on Machine Learning IV