STARS: Spatial-Temporal Active Re-sampling for Label-Efficient Learning from Noisy Annotations

Authors

  • Dayou Yu, Rochester Institute of Technology
  • Weishi Shi, University of North Texas
  • Qi Yu, Rochester Institute of Technology

DOI:

https://doi.org/10.1609/aaai.v37i9.26301

Keywords:

ML: Active Learning

Abstract

Active learning (AL) aims to sample the most informative data instances for labeling, which makes model fitting data-efficient while significantly reducing annotation cost. However, most existing AL models make a strong assumption that annotated data instances are always assigned correct labels, which may not hold in many practical settings. In this paper, we develop a theoretical framework to formally analyze the impact of noisy annotations and show that systematic re-sampling is guaranteed to reduce the noise rate, which can lead to improved generalization. More importantly, the theoretical framework demonstrates the key benefit that active re-sampling brings to label-efficient learning, which is critical for AL. The theoretical results also suggest the essential properties an active re-sampling function should possess to achieve fast convergence and guaranteed error reduction. This inspires us to design a novel spatial-temporal active re-sampling function that leverages important spatial and temporal properties of maximum-margin classifiers. Extensive experiments conducted on both synthetic and real-world data clearly demonstrate the effectiveness of the proposed active re-sampling function.
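
To make the re-sampling intuition concrete, the sketch below illustrates why repeated annotation can lower the effective noise rate under a simplified i.i.d. symmetric noise model. It is only an illustration of the general principle, not the paper's spatial-temporal re-sampling function; the `majority_vote_error` helper, the 30% noise rate, and the choice of five re-queries are assumptions made for this example.

```python
from math import comb

def majority_vote_error(rho: float, k: int) -> float:
    """Probability that a majority vote over k independent noisy labels
    (each flipped with probability rho < 0.5) is still wrong; k is odd.

    Simplified illustration only: assumes i.i.d. symmetric label noise,
    not the STARS spatial-temporal re-sampling criterion."""
    return sum(
        comb(k, j) * rho**j * (1 - rho) ** (k - j)
        for j in range(k // 2 + 1, k + 1)
    )

# Example: with a 30% annotation noise rate, re-querying each instance
# 5 times and aggregating by majority vote reduces the effective noise rate.
print(majority_vote_error(0.30, 1))  # 0.300 -> no re-sampling
print(majority_vote_error(0.30, 5))  # ~0.163 -> after re-sampling
```

Under this assumed noise model, any re-sampling budget spent on repeated queries trades labeled-instance coverage for a lower effective noise rate, which is the trade-off the paper's active re-sampling function is designed to manage.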

Published

2023-06-26

How to Cite

Yu, D., Shi, W., & Yu, Q. (2023). STARS: Spatial-Temporal Active Re-sampling for Label-Efficient Learning from Noisy Annotations. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 10980-10988. https://doi.org/10.1609/aaai.v37i9.26301

Section

AAAI Technical Track on Machine Learning IV