Adaptive Low-Precision Training for Embeddings in Click-Through Rate Prediction

Authors

  • Shiwei Li, Huazhong University of Science and Technology
  • Huifeng Guo, Huawei Noah's Ark Lab
  • Lu Hou, Huawei Noah's Ark Lab
  • Wei Zhang, Huawei Noah's Ark Lab
  • Xing Tang, Huawei Noah's Ark Lab
  • Ruiming Tang, Huawei Noah's Ark Lab
  • Rui Zhang, Tsinghua University
  • Ruixuan Li, Huazhong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v37i4.25564

Keywords:

DMKM: Recommender Systems, DMKM: Data Compression

Abstract

Embedding tables in click-through rate (CTR) prediction models are usually huge. To train and deploy CTR models efficiently and economically, it is necessary to compress their embedding tables. To this end, we formulate a novel quantization training paradigm that compresses the embeddings from the training stage onward, termed low-precision training (LPT), and we provide a theoretical analysis of its convergence. The results show that stochastic weight quantization achieves a faster convergence rate and a smaller convergence error than deterministic weight quantization in LPT. Further, to reduce accuracy degradation, we propose adaptive low-precision training (ALPT), which learns the step size (i.e., the quantization resolution). Experiments on two real-world datasets confirm our analysis and show that ALPT can significantly improve prediction accuracy, especially at extremely low bit widths. For the first time in CTR models, we successfully train 8-bit embeddings without sacrificing prediction accuracy.
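As a rough illustration of the ideas the abstract describes, the sketch below combines stochastic (unbiased) rounding of embedding weights with a learnable step size and a straight-through gradient estimator. This is a minimal PyTorch sketch under those assumptions only; the class and function names are illustrative and do not come from the authors' implementation.

```python
import torch

def stochastic_round(x):
    # Round up with probability equal to the fractional part, so that
    # E[stochastic_round(x)] = x (unbiased stochastic quantization).
    floor = torch.floor(x)
    return floor + (torch.rand_like(x) < (x - floor)).float()

class QuantizedEmbedding(torch.nn.Module):
    """Illustrative (hypothetical) module: embedding lookups quantized to
    low-bit integers with a learnable step size, in the spirit of ALPT."""

    def __init__(self, num_embeddings, dim, bits=8, init_step=0.01):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(num_embeddings, dim) * init_step)
        # Learnable step size (quantization resolution), kept positive via exp().
        self.log_step = torch.nn.Parameter(torch.tensor(init_step).log())
        self.qmax = 2 ** (bits - 1) - 1  # e.g. 127 for signed 8-bit

    def forward(self, ids):
        step = self.log_step.exp()
        w = self.weight[ids] / step
        # Stochastic quantization to the signed integer grid.
        q = torch.clamp(stochastic_round(w), -self.qmax - 1, self.qmax)
        # Straight-through estimator: forward uses the quantized value,
        # backward passes the gradient through to the full-precision weight.
        w_q = w + (q - w).detach()
        # Multiplying by step here also routes a gradient to the step size.
        return w_q * step
```

Note that, for simplicity, this sketch keeps a full-precision master copy of the weights (quantization-aware-training style); in the paper's LPT setting, the embeddings themselves would be stored and updated in low precision.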

Published

2023-06-26

How to Cite

Li, S., Guo, H., Hou, L., Zhang, W., Tang, X., Tang, R., Zhang, R., & Li, R. (2023). Adaptive Low-Precision Training for Embeddings in Click-Through Rate Prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 37(4), 4435-4443. https://doi.org/10.1609/aaai.v37i4.25564

Section

AAAI Technical Track on Data Mining and Knowledge Management