Deep Active Learning with Noise Stability
DOI: https://doi.org/10.1609/aaai.v38i12.29270
Keywords: ML: Active Learning, ML: Deep Learning Algorithms
Abstract
Uncertainty estimation for unlabeled data is crucial to active learning. With a deep neural network employed as the backbone model, the data selection process is highly challenging due to the potential over-confidence of the model's inference. Existing methods resort to specialized learning paradigms (e.g., adversarial training) or auxiliary models to address this challenge, which tends to result in complex and inefficient pipelines that render the methods impractical. In this work, we propose a novel algorithm that leverages noise stability to estimate data uncertainty. The key idea is to measure the deviation of the output from the original observation when the model parameters are randomly perturbed by noise. We provide theoretical analyses by leveraging small-Gaussian-noise theory and demonstrate that our method favors a subset with large and diverse gradients. Our method is generally applicable to various tasks, including computer vision, natural language processing, and structured data analysis, and achieves competitive performance compared with state-of-the-art active learning baselines.
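The key idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy one-layer model, the noise scale, the number of trials, and the batch size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(W, x):
    # Toy one-layer model standing in for the deep backbone (assumption).
    return np.tanh(W @ x)

def noise_stability_uncertainty(W, x, noise_scale=0.01, n_trials=10):
    """Score an unlabeled point by how much the model output deviates
    from the unperturbed output when the parameters are perturbed by
    small Gaussian noise (a sketch of the noise-stability idea)."""
    y0 = forward(W, x)
    deviations = []
    for _ in range(n_trials):
        # Scale the noise relative to the parameter magnitude (assumption).
        sigma = noise_scale * np.linalg.norm(W) / np.sqrt(W.size)
        noise = rng.normal(0.0, sigma, size=W.shape)
        deviations.append(np.linalg.norm(forward(W + noise, x) - y0))
    return float(np.mean(deviations))

# Query the unlabeled points whose outputs are least stable under noise.
W = rng.normal(size=(4, 8))
pool = [rng.normal(size=8) for _ in range(20)]
scores = [noise_stability_uncertainty(W, x) for x in pool]
query = np.argsort(scores)[-5:]  # indices of the 5 most uncertain points
```

In an actual active learning loop, the selected batch would be sent for labeling and the model retrained; the paper additionally shows such a selection favors samples with large and diverse gradients.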
Published
2024-03-24
How to Cite
Li, X., Yang, P., Gu, Y., Zhan, X., Wang, T., Xu, M., & Xu, C. (2024). Deep Active Learning with Noise Stability. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13655-13663. https://doi.org/10.1609/aaai.v38i12.29270
Section
AAAI Technical Track on Machine Learning III