Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives
Keywords: Computer Vision (CV), Machine Learning (ML)
Abstract
This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member of the family of InfoNCE losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking to learn a corresponding embedding space. We show that the proposed loss function learns more favorable embeddings than the standard InfoNCE whenever at least noisy ranking information is available or when the definition of positives and negatives is blurry. We demonstrate this for a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised training, with experiments on unsupervised representation learning from videos. In particular, the resulting embedding yields higher classification accuracy and retrieval rates, and performs better on out-of-distribution detection, than one trained with the standard InfoNCE loss.
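To make the contrast with standard InfoNCE concrete, the following is a minimal, hypothetical sketch of a ranked contrastive loss in plain Python. It assumes precomputed (query, sample) similarity scores grouped into ranked positive sets; the actual RINCE formulation, temperatures, and weighting in the paper may differ. The key idea illustrated here is that positives of rank i are contrasted against all lower-ranked positives plus the negatives, so the embedding is pushed to respect the ranking rather than a single positive/negative split.

```python
import math

def rince_loss(sim_pos_ranks, sim_negs, tau=0.1):
    """Hypothetical sketch of a ranked InfoNCE-style loss.

    sim_pos_ranks: list of lists of similarity scores, ordered from the
        most-similar positive rank to the least-similar one.
    sim_negs: list of similarity scores for negative samples.
    tau: softmax temperature (illustrative value).
    """
    loss = 0.0
    for i, pos in enumerate(sim_pos_ranks):
        # numerator: positives of the current rank
        num = sum(math.exp(s / tau) for s in pos)
        # denominator: current rank, all lower ranks, and the negatives
        denom = num
        for lower in sim_pos_ranks[i + 1:]:
            denom += sum(math.exp(s / tau) for s in lower)
        denom += sum(math.exp(s / tau) for s in sim_negs)
        loss += -math.log(num / denom)
    return loss / len(sim_pos_ranks)
```

With a single positive rank this reduces to the standard InfoNCE loss; with several ranks, embeddings whose similarity scores follow the intended ordering incur a lower loss than embeddings that invert it.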
How to Cite
Hoffmann, D. T., Behrmann, N., Gall, J., Brox, T., & Noroozi, M. (2022). Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives. Proceedings of the AAAI Conference on Artificial Intelligence, 36(1), 897-905. https://doi.org/10.1609/aaai.v36i1.19972
AAAI Technical Track on Computer Vision I