Three Heads Are Better than One: Complementary Experts for Long-Tailed Semi-supervised Learning
DOI:
https://doi.org/10.1609/aaai.v38i13.29334

Keywords:
ML: Semi-Supervised Learning, ML: Classification and Regression

Abstract
We address the challenging problem of Long-Tailed Semi-Supervised Learning (LTSSL), in which labeled data exhibit an imbalanced class distribution and unlabeled data follow an unknown distribution. Unlike in balanced SSL, the generated pseudo-labels are skewed towards head classes, intensifying the training bias. This phenomenon is further amplified when the class distributions of the labeled and unlabeled datasets are mismatched, since even more unlabeled data are then mislabeled as head classes. To solve this problem, we propose a novel method named ComPlementary Experts (CPE). Specifically, we train multiple experts to model various class distributions, each yielding high-quality pseudo-labels under its own assumed class distribution. In addition, we introduce Classwise Batch Normalization for CPE to avoid the performance degradation caused by the feature distribution mismatch between head and non-head classes. CPE achieves state-of-the-art performance on the CIFAR-10-LT, CIFAR-100-LT, and STL-10-LT benchmarks. For instance, on CIFAR-10-LT, CPE improves test accuracy by more than 2.22% compared to baselines. Code is available at https://github.com/machengcheng2016/CPE-LTSSL.
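To make the abstract's two components concrete, here is a minimal PyTorch sketch of how they might look. The class names (ClasswiseBatchNorm2d, ComplementaryExperts), the three logit-adjustment temperatures (tau in {1, 0, -1}, sweeping from the original long-tailed prior through uniform to an inverse prior), and the head/non-head routing flag are all illustrative assumptions, not the authors' released implementation; see the linked repository for the actual code.

```python
import torch
import torch.nn as nn


class ClasswiseBatchNorm2d(nn.Module):
    """Sketch of classwise BN: separate BN statistics for head vs. non-head
    samples, so the two groups do not share running mean/variance.
    The routing decision (e.g. from pseudo-labels) is left to the caller."""

    def __init__(self, num_features: int):
        super().__init__()
        self.bn_head = nn.BatchNorm2d(num_features)
        self.bn_nonhead = nn.BatchNorm2d(num_features)

    def forward(self, x: torch.Tensor, head: bool = True) -> torch.Tensor:
        return self.bn_head(x) if head else self.bn_nonhead(x)


class ComplementaryExperts(nn.Module):
    """Shared backbone with three expert heads ("three heads"); each expert's
    logits are shifted by a different multiple of the log class prior, so each
    expert is biased toward a different assumed class distribution."""

    def __init__(self, backbone: nn.Module, feat_dim: int,
                 num_classes: int, class_prior: torch.Tensor):
        super().__init__()
        self.backbone = backbone
        self.experts = nn.ModuleList(
            nn.Linear(feat_dim, num_classes) for _ in range(3)
        )
        # tau = 1: original long-tailed prior; 0: uniform; -1: inverse prior.
        log_prior = torch.log(class_prior)
        self.register_buffer(
            "offsets",
            torch.stack([tau * log_prior for tau in (1.0, 0.0, -1.0)]),
        )

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        feats = self.backbone(x)
        # One set of adjusted logits per expert.
        return [expert(feats) + offset
                for expert, offset in zip(self.experts, self.offsets)]


# Usage with a toy backbone and a hypothetical 5-class long-tailed prior.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
prior = torch.tensor([0.5, 0.3, 0.1, 0.06, 0.04])
model = ComplementaryExperts(backbone, feat_dim=128, num_classes=5,
                             class_prior=prior)
logits_per_expert = model(torch.randn(4, 3, 32, 32))  # list of 3 (4, 5) tensors
```

Each expert can then produce pseudo-labels for the portion of the unlabeled data whose distribution matches its assumption, which is the intuition behind "complementary" experts.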
Published
2024-03-24
How to Cite
Ma, C., Elezi, I., Deng, J., Dong, W., & Xu, C. (2024). Three Heads Are Better than One: Complementary Experts for Long-Tailed Semi-supervised Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14229-14237. https://doi.org/10.1609/aaai.v38i13.29334
Issue
Vol. 38 No. 13 (2024)
Section
AAAI Technical Track on Machine Learning IV