ACMo: Angle-Calibrated Moment Methods for Stochastic Optimization

Authors

  • Xunpeng Huang, ByteDance AI Lab
  • Runxin Xu, Peking University
  • Hao Zhou, ByteDance AI Lab
  • Zhe Wang, Ohio State University
  • Zhengyang Liu, Beijing Institute of Technology
  • Lei Li, ByteDance AI Lab

DOI:

https://doi.org/10.1609/aaai.v35i9.16959

Keywords:

(Deep) Neural Network Algorithms, Optimization

Abstract

Stochastic gradient descent (SGD) is widely used for its simplicity and outstanding generalization ability, and adaptive gradient methods have been proposed to further accelerate optimization. In this paper, we revisit existing adaptive gradient methods from a new perspective, which leads to a refreshed understanding of the role of second moments in stochastic optimization. Based on this insight, we propose the Angle-Calibrated Moment method (ACMo), a novel stochastic optimization method that enjoys the benefits of second moments while performing only first-moment updates. Theoretical analysis shows that ACMo achieves the same convergence rate as mainstream adaptive methods. Experiments on a variety of CV and NLP tasks demonstrate that ACMo converges comparably to state-of-the-art Adam-type optimizers and, in most cases, generalizes better. The code is available at https://github.com/Xunpeng746/ACMo.
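The abstract only sketches the idea; the exact update rule is given in the paper. Below is a minimal, hypothetical PyTorch-style sketch of the general idea the abstract describes: an optimizer that keeps only first-moment state and "angle-calibrates" the momentum so the descent direction never forms an obtuse angle with the current stochastic gradient. The class name, hyperparameters, and the specific projection step are illustrative assumptions, not the authors' ACMo update.

```python
import torch


class AngleCalibratedMomentum(torch.optim.Optimizer):
    """Illustrative sketch of an angle-calibrated, first-moment-only
    optimizer. NOTE: an assumption-laden toy in the spirit of ACMo,
    not the exact update from the paper."""

    def __init__(self, params, lr=1e-3, beta=0.9, eps=1e-8):
        defaults = dict(lr=lr, beta=beta, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            lr, beta, eps = group["lr"], group["beta"], group["eps"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad
                state = self.state[p]
                if "m" not in state:
                    state["m"] = torch.zeros_like(p)
                m = state["m"]
                m.mul_(beta)  # decay the accumulated first moment
                # Angle calibration: if the stale moment conflicts with
                # the current gradient (m . g < 0, i.e. obtuse angle),
                # project out the conflicting component so m . g = 0.
                dot = torch.sum(m * g)
                if dot < 0:
                    coef = (-dot / (g.pow(2).sum() + eps)).item()
                    m.add_(g, alpha=coef)
                m.add_(g)  # fold in the fresh gradient: now m . g > 0
                p.add_(m, alpha=-lr)  # first-moment-only update
```

As with any torch.optim.Optimizer, this plugs into the usual zero_grad() / backward() / step() loop. The projection guarantees the update direction never opposes the current mini-batch gradient, which captures the acute-angle property alluded to by the method's name while storing no second-moment statistics.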

Published

2021-05-18

How to Cite

Huang, X., Xu, R., Zhou, H., Wang, Z., Liu, Z., & Li, L. (2021). ACMo: Angle-Calibrated Moment Methods for Stochastic Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(9), 7857-7864. https://doi.org/10.1609/aaai.v35i9.16959

Issue

Vol. 35 No. 9 (2021)

Section

AAAI Technical Track on Machine Learning II