On Trivial Solution and High Correlation Problems in Deep Supervised Hashing

Authors

  • Yuchen Guo, Tsinghua University
  • Xin Zhao, Tsinghua University
  • Guiguang Ding, Tsinghua University
  • Jungong Han Lancaster University

Keywords:

Hashing, Deep Learning, Neural Network

Abstract

Deep supervised hashing (DSH), which combines binary code learning with convolutional neural networks, has attracted considerable research interest and achieved promising performance for highly efficient image retrieval. In this paper, we show that the widely used loss functions, the pair-wise loss and the triplet loss, suffer from the trivial solution problem and usually lead to highly correlated bits in practice, limiting the performance of DSH. One important reason is that it is difficult to incorporate proper constraints into these loss functions under mini-batch based optimization. To tackle these problems, we propose to adopt an ensemble learning strategy for deep model training. We find that this simple strategy effectively decorrelates different bits, making the hashcodes more informative. Moreover, it makes the training easy to parallelize and supports incremental model learning, which are very useful for real-world applications but usually ignored by existing DSH approaches. Experiments on benchmark datasets demonstrate that the proposed ensemble-based DSH can improve the performance of DSH approaches significantly.
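The ensemble idea sketched in the abstract can be illustrated with a minimal toy example: each ensemble member is trained independently (here on its own bootstrap resample) to produce a small group of bits, and the final hashcode is the concatenation of all members' bits. This is only an illustrative sketch, not the paper's implementation: the random-projection "members", the feature matrix `X`, and the helper names `train_hash_member`/`encode` are stand-ins invented for this example; the actual method would train small deep hashing models with a pair-wise or triplet loss in place of each projection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy features standing in for CNN embeddings (assumption: a real DSH
# pipeline would produce these with a deep network).
X = rng.normal(size=(1000, 64))

def train_hash_member(X, n_bits, rng):
    """One ensemble member: fit an independent random projection on a
    bootstrap resample (a stand-in for training a small deep hashing
    model with its own data sample and initialization)."""
    idx = rng.integers(0, len(X), size=len(X))   # bootstrap resample
    W = rng.normal(size=(X.shape[1], n_bits))    # member's projection
    # Center on the bootstrap sample so each bit is roughly balanced.
    b = -X[idx].mean(axis=0) @ W
    return W, b

def encode(X, members):
    """Concatenate every member's bits into the final hashcode."""
    codes = [np.sign(X @ W + b) for W, b in members]
    return np.concatenate(codes, axis=1)

# Ensemble of 8 members, 6 bits each -> 48-bit codes.
members = [train_hash_member(X, 6, rng) for _ in range(8)]
codes = encode(X, members)
print(codes.shape)  # (1000, 48)
```

Because each member sees different data and starts from a different initialization, the bit groups are learned independently, which is what discourages the high inter-bit correlation the paper attributes to jointly trained codes; it also makes the members trivially parallelizable, and new bits can be added incrementally by training additional members without touching existing ones.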

Published

2018-04-26

How to Cite

Guo, Y., Zhao, X., Ding, G., & Han, J. (2018). On Trivial Solution and High Correlation Problems in Deep Supervised Hashing. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11855

Section

Main Track: Machine Learning Applications