Bi-Level Optimization for Semi-Supervised Learning with Pseudo-Labeling
DOI: https://doi.org/10.1609/aaai.v39i16.33887

Abstract
Semi-supervised learning (SSL) is a fundamental task in machine learning, enabling models to extract valuable insights from datasets with limited labeled samples and large amounts of unlabeled data. Pseudo-labeling is a widely used SSL approach that generates pseudo-labels for unlabeled data and treats them as ground-truth labels during training; however, traditional pseudo-labeling techniques often face challenges that significantly degrade the quality of the pseudo-labels and hence overall model performance. In this paper, we propose a novel Bi-level Optimization method for Pseudo-label Learning (BOPL) to boost semi-supervised training. It treats pseudo-labels as latent variables and optimizes the model parameters and pseudo-labels jointly within a bi-level optimization framework. By enabling direct optimization of the pseudo-labels towards maximizing prediction performance, the method is expected to produce high-quality pseudo-labels. To evaluate the effectiveness of the proposed approach, we conduct extensive experiments on multiple SSL benchmarks. The experimental results show that BOPL outperforms state-of-the-art SSL techniques.
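The bi-level structure the abstract describes — an inner level that fits model parameters on labeled plus pseudo-labeled data, and an outer level that updates the pseudo-labels themselves to improve prediction performance — can be illustrated with a toy sketch. This is a hypothetical 1-D linear-regression example written for this page, not the authors' actual BOPL algorithm: all function names, data, and the closed-form hypergradient are illustrative assumptions. The inner problem is least squares (so its solution is closed-form), and the outer level performs gradient descent on the pseudo-labels against a held-out labeled validation loss.

```python
# Toy sketch of bi-level pseudo-label optimization (illustrative only,
# NOT the BOPL algorithm from the paper). Model: f(x) = w * x.
# Inner level:  w*(p) = argmin_w sum of squared errors on labeled data
#               plus unlabeled data with pseudo-labels p (closed form).
# Outer level:  gradient descent on the pseudo-labels p to minimize
#               the validation loss V(w*(p)).

def inner_solve(xl, yl, xu, p):
    """Closed-form least-squares fit of w on labeled + pseudo-labeled data."""
    num = sum(x * y for x, y in zip(xl, yl)) + sum(x * q for x, q in zip(xu, p))
    den = sum(x * x for x in xl) + sum(x * x for x in xu)
    return num / den

def optimize_pseudo_labels(xl, yl, xu, xv, yv, steps=200, lr=0.5):
    p = [0.0] * len(xu)  # pseudo-labels, treated as latent variables
    den = sum(x * x for x in xl) + sum(x * x for x in xu)
    for _ in range(steps):
        w = inner_solve(xl, yl, xu, p)
        # Hypergradient via the chain rule: dV/dp_j = dV/dw * dw*/dp_j,
        # where dw*/dp_j = xu[j] / den from the closed-form inner solution.
        dV_dw = sum(2 * (w * x - y) * x for x, y in zip(xv, yv)) / len(xv)
        p = [q - lr * dV_dw * (x / den) for q, x in zip(p, xu)]
    return p, inner_solve(xl, yl, xu, p)

# True relation is y = 2x. One noisy labeled point pulls w off target;
# optimizing the pseudo-labels on a clean validation set corrects it.
xl, yl = [1.0], [2.5]            # single noisy labeled sample
xu = [2.0, 3.0]                  # unlabeled inputs
xv, yv = [1.0, 4.0], [2.0, 8.0]  # clean labeled validation set
p, w = optimize_pseudo_labels(xl, yl, xu, xv, yv)
print(round(w, 2))               # w recovers the true slope 2.0
```

In this toy setting the hypergradient is exact because the inner problem has a closed-form solution; for deep networks, bi-level methods typically approximate it (e.g. by differentiating through a few inner gradient steps or using implicit differentiation), which is where the real algorithmic work of approaches like BOPL lies.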
Published
2025-04-11
How to Cite
Heidari, M., & Guo, Y. (2025). Bi-Level Optimization for Semi-Supervised Learning with Pseudo-Labeling. Proceedings of the AAAI Conference on Artificial Intelligence, 39(16), 17168–17176. https://doi.org/10.1609/aaai.v39i16.33887
Section
AAAI Technical Track on Machine Learning II