Beyond Learning Features: Training a Fully-Functional Classifier with ZERO Instance-Level Labels

Authors

  • Deepak Babu Sam, Indian Institute of Science
  • Abhinav Agarwalla, Indian Institute of Science
  • Venkatesh Babu Radhakrishnan, Indian Institute of Science

DOI:

https://doi.org/10.1609/aaai.v36i2.20113

Keywords:

Computer Vision (CV), Machine Learning (ML)

Abstract

We attempt to train deep neural networks for classification without using any labeled data. Existing unsupervised methods, though they mine useful clusters or features, require some annotated samples to facilitate the final task-specific predictions. This defeats the true purpose of unsupervised learning, and hence we envisage a paradigm of 'true' self-supervision, where absolutely no annotated instances are used for training a classifier. The proposed method first pretrains a deep network through self-supervision and performs clustering on the learned features. A classifier layer is then appended to the self-supervised network and is trained by matching the distribution of its predictions to a predefined prior. This approach leverages the distribution of labels as the supervisory signal, so no image-label pairs are needed. Experiments reveal that the method works on major nominal as well as ordinal classification datasets and delivers significant performance.
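The core supervisory signal described above, matching the distribution of the classifier's predictions to a predefined prior over labels, can be sketched as a simple loss function. The snippet below is a minimal illustration, not the authors' exact formulation: it assumes the prior-matching objective is a KL divergence between the given class prior and the batch-averaged softmax predictions, and all names (`distribution_matching_loss`, the uniform prior) are hypothetical.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distribution_matching_loss(logits, prior):
    """KL divergence between a predefined class prior and the
    batch-averaged predicted label distribution -- a sketch of
    the label-distribution supervision described in the abstract.
    No instance-level labels appear anywhere in this loss."""
    probs = softmax(logits)      # shape: (batch, num_classes)
    avg = probs.mean(axis=0)     # empirical distribution of predictions
    eps = 1e-12                  # guard against log(0)
    return float(np.sum(prior * np.log((prior + eps) / (avg + eps))))

# Hypothetical usage: 4 classes with a uniform prior.
rng = np.random.default_rng(0)
logits = rng.normal(size=(32, 4))   # stand-in for classifier-layer outputs
prior = np.full(4, 0.25)
loss = distribution_matching_loss(logits, prior)
```

Minimizing this quantity pushes the aggregate prediction histogram toward the prior; in the paper's setting the backbone is frozen-then-finetuned from self-supervised pretraining, so the clustering structure of the features keeps individual predictions from collapsing.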

Published

2022-06-28

How to Cite

Sam, D. B., Agarwalla, A., & Radhakrishnan, V. B. (2022). Beyond Learning Features: Training a Fully-Functional Classifier with ZERO Instance-Level Labels. Proceedings of the AAAI Conference on Artificial Intelligence, 36(2), 2162-2170. https://doi.org/10.1609/aaai.v36i2.20113

Section

AAAI Technical Track on Computer Vision II