Entropic Open-Set Active Learning

Authors

  • Bardia Safaei, Johns Hopkins University
  • Vibashan VS, Johns Hopkins University
  • Celso M. de Melo, DEVCOM Army Research Laboratory
  • Vishal M. Patel, Johns Hopkins University

DOI:

https://doi.org/10.1609/aaai.v38i5.28269

Keywords:

CV: Learning & Optimization for CV, ML: Active Learning

Abstract

Active Learning (AL) aims to enhance the performance of deep models by selecting the most informative samples for annotation from a pool of unlabeled data. Despite impressive performance in closed-set settings, most AL methods fail in real-world scenarios where the unlabeled data contains unknown categories. Recently, a few studies have attempted to tackle the AL problem in the open-set setting. However, these methods focus primarily on selecting known samples and do not efficiently utilize the unknown samples obtained during AL rounds. In this work, we propose an Entropic Open-set AL (EOAL) framework that leverages both known and unknown distributions effectively to select informative samples during AL rounds. Specifically, our approach employs two different entropy scores: one measures the uncertainty of a sample with respect to the known-class distributions, and the other measures its uncertainty with respect to the unknown-class distributions. By utilizing these two entropy scores, we effectively separate the known and unknown samples in the unlabeled data, resulting in better sampling. Through extensive experiments, we show that the proposed method outperforms existing state-of-the-art methods on the CIFAR-10, CIFAR-100, and TinyImageNet datasets. Code is available at https://github.com/bardisafa/EOAL.
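To make the two-entropy idea concrete, below is a minimal PyTorch sketch of how per-sample entropies with respect to known-class and unknown-class distributions could be computed. The function names, the two logit heads (`known_logits`, `unknown_logits`), and the choice to model unknowns with M components are illustrative assumptions, not the authors' implementation; the official code at the GitHub link above defines the actual scoring and how the two entropies are combined into a sampling criterion.

```python
import torch
import torch.nn.functional as F


def shannon_entropy(probs: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    """Shannon entropy of a batch of probability vectors, shape (B, C) -> (B,)."""
    return -(probs * (probs + eps).log()).sum(dim=1)


def entropic_scores(known_logits: torch.Tensor,
                    unknown_logits: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
    """Illustrative two-entropy scoring in the spirit of open-set AL.

    known_logits   : (B, K) logits over the K known classes (hypothetical head).
    unknown_logits : (B, M) logits over M modeled unknown components
                     (hypothetical, e.g. clusters of unknowns collected in
                     earlier AL rounds).

    Returns per-sample entropies w.r.t. the known and the unknown
    distributions; the paper's sampling rule decides how they are combined.
    """
    h_known = shannon_entropy(F.softmax(known_logits, dim=1))
    h_unknown = shannon_entropy(F.softmax(unknown_logits, dim=1))
    return h_known, h_unknown


if __name__ == "__main__":
    # Toy usage on random logits: 8 samples, 10 known classes, 5 unknown components.
    h_k, h_u = entropic_scores(torch.randn(8, 10), torch.randn(8, 5))
    print(h_k.shape, h_u.shape)  # torch.Size([8]) torch.Size([8])
```

Intuitively, a sample with low entropy over the known classes but high entropy over the unknown components is a stronger candidate for annotation, since it is confidently explained by the known distributions rather than by the modeled unknowns.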

Published

2024-03-24

How to Cite

Safaei, B., VS, V., de Melo, C. M., & Patel, V. M. (2024). Entropic Open-Set Active Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(5), 4686-4694. https://doi.org/10.1609/aaai.v38i5.28269

Issue

Vol. 38 No. 5 (2024)

Section

AAAI Technical Track on Computer Vision IV