Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives

Authors

  • David T. Hoffmann (University of Freiburg; Bosch Center for Artificial Intelligence)
  • Nadine Behrmann (Bosch Center for Artificial Intelligence)
  • Juergen Gall (University of Bonn)
  • Thomas Brox (University of Freiburg)
  • Mehdi Noroozi (Bosch Center for Artificial Intelligence)

DOI:

https://doi.org/10.1609/aaai.v36i1.19972

Keywords:

Computer Vision (CV), Machine Learning (ML)

Abstract

This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member of the family of InfoNCE losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking to learn a corresponding embedding space. We show that the proposed loss function learns favorable embeddings compared to the standard InfoNCE whenever at least noisy ranking information can be obtained or when the definition of positives and negatives is blurry. We demonstrate this for a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised training, with experiments on unsupervised representation learning from videos. In particular, the embedding yields higher classification accuracy and retrieval rates, and performs better on out-of-distribution detection than the standard InfoNCE loss.
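The core idea in the abstract is that positives come with a similarity ranking rather than a single binary label. A minimal sketch of how such a ranked contrastive objective might look is given below; this is an illustrative simplification, not the paper's exact formulation. The function names (`info_nce`, `ranked_info_nce`), the per-rank averaging, and the choice to score each rank's positives against all lower-ranked samples plus the negatives are assumptions made for clarity.

```python
import numpy as np


def info_nce(sim_pos, sim_neg, tau=0.1):
    """Standard InfoNCE over precomputed similarities:
    -log( sum(exp(pos/tau)) / (sum(exp(pos/tau)) + sum(exp(neg/tau))) )."""
    num = np.exp(np.asarray(sim_pos) / tau).sum()
    den = num + np.exp(np.asarray(sim_neg) / tau).sum()
    return -np.log(num / den)


def ranked_info_nce(sims_by_rank, sim_neg, tau=0.1):
    """Illustrative ranked InfoNCE (simplified RINCE-style loss).

    sims_by_rank: list of similarity arrays, most similar rank first.
    Each rank's positives are contrasted against all lower-ranked
    samples plus the true negatives, so the learned embedding must
    respect the full ordering, not just a binary split.
    """
    loss = 0.0
    for i, pos in enumerate(sims_by_rank):
        # Lower-ranked positives act as negatives for rank i.
        lower = [np.asarray(s) for s in sims_by_rank[i + 1:]]
        lower.append(np.asarray(sim_neg))
        loss += info_nce(pos, np.concatenate(lower), tau)
    return loss / len(sims_by_rank)
```

As a sanity check, an embedding whose similarities respect the intended ranking (rank-1 positive most similar) should incur a lower loss than one with the ranking inverted.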

Published

2022-06-28

How to Cite

Hoffmann, D. T., Behrmann, N., Gall, J., Brox, T., & Noroozi, M. (2022). Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives. Proceedings of the AAAI Conference on Artificial Intelligence, 36(1), 897-905. https://doi.org/10.1609/aaai.v36i1.19972

Section

AAAI Technical Track on Computer Vision I