Accelerated Training for Massive Classification via Dynamic Class Selection

Authors

  • Xingcheng Zhang The Chinese University of Hong Kong
  • Lei Yang The Chinese University of Hong Kong
  • Junjie Yan SenseTime Group Limited
  • Dahua Lin The Chinese University of Hong Kong

DOI

https://doi.org/10.1609/aaai.v32i1.12337

Keywords

Classification, Deep Learning, Softmax

Abstract

Massive classification, a classification task defined over a vast number of classes (hundreds of thousands or even millions), has become an essential part of many real-world systems, such as face recognition. Existing methods, including the deep networks that have achieved remarkable success in recent years, were mostly devised for problems with a moderate number of classes. When applied to massive problems, they run into substantial difficulties, e.g., excessive memory demand and computational cost. We present a new method to tackle this problem: it efficiently and accurately identifies a small number of "active classes" for each mini-batch, based on a set of dynamic class hierarchies constructed on the fly. We also develop an adaptive allocation scheme on top of this selection, which leads to a better tradeoff between performance and cost. On several large-scale benchmarks, our method significantly reduces the training cost and memory demand while maintaining competitive performance.
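
The computational core described in the abstract, evaluating the softmax over only a mini-batch's active classes rather than the full class set, can be illustrated with a short sketch. The sketch below is an assumption-laden illustration rather than the paper's implementation: it is written against PyTorch, the function name selective_softmax_loss is invented for this example, and the extra classes are drawn at random, whereas the paper selects them via dynamic class hierarchies with an adaptive allocation scheme.

    import torch
    import torch.nn.functional as F

    def selective_softmax_loss(features, labels, weight, num_extra=1024):
        """Cross-entropy over a reduced "active" class set for one mini-batch.

        The active set is the union of the ground-truth classes in the batch
        and a pool of extra classes. The extras are drawn at random here only
        to keep the sketch self-contained; the paper instead selects them with
        dynamically maintained class hierarchies and an adaptive allocation
        scheme.
        """
        num_classes = weight.size(0)

        # Classes that must be scored: every label appearing in this batch.
        positives = torch.unique(labels)

        # Placeholder for hierarchy-based selection of additional classes.
        extras = torch.randint(0, num_classes, (num_extra,),
                               device=labels.device)

        # Sorted, de-duplicated active set (torch.unique returns sorted values).
        active = torch.unique(torch.cat([positives, extras]))

        # Re-index the original labels into positions within the active set.
        remapped = torch.searchsorted(active, labels)

        # Logits against only the active rows of the classifier matrix:
        # O(batch * |active|) instead of O(batch * num_classes).
        logits = features @ weight[active].t()
        return F.cross_entropy(logits, remapped)

    # Toy usage: 1,000,000 classes, but each step scores only ~1,000 of them.
    feats = torch.randn(8, 256)
    w = torch.randn(1_000_000, 256)
    y = torch.randint(0, 1_000_000, (8,))
    loss = selective_softmax_loss(feats, y, w)

The saving comes from the last two steps: the per-iteration classifier computation and its gradient scale with the size of the active set rather than the total number of classes.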

Published

2018-04-27

How to Cite

Zhang, X., Yang, L., Yan, J., & Lin, D. (2018). Accelerated Training for Massive Classification via Dynamic Class Selection. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12337