SPAN: A Stochastic Projected Approximate Newton Method

Authors

  • Xunpeng Huang, Bytedance AI Lab
  • Xianfeng Liang, University of Science and Technology of China
  • Zhengyang Liu, Bytedance AI Lab
  • Lei Li, Bytedance AI Lab
  • Yue Yu, Tsinghua University
  • Yitan Li, Bytedance AI Lab

DOI:

https://doi.org/10.1609/aaai.v34i02.5511

Abstract

Second-order optimization methods have desirable convergence properties. However, the exact Newton method requires expensive computation of the Hessian and its inverse. In this paper, we propose SPAN, a novel fast approximate Newton method. SPAN computes the inverse of the Hessian matrix via low-rank approximation and stochastic Hessian-vector products. Our experiments on multiple benchmark datasets demonstrate that SPAN outperforms existing first-order and second-order optimization methods in terms of wall-clock time to convergence. Furthermore, we provide a theoretical analysis of the per-iteration complexity, the approximation error, and the convergence rate. Both the theoretical analysis and the experimental results show that our proposed method achieves a better trade-off between the convergence rate and the per-iteration efficiency.
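The core idea the abstract describes — building an approximate Hessian inverse from low-rank structure and Hessian-vector products, without ever forming the full Hessian — can be illustrated with a minimal sketch. The code below is not the SPAN algorithm itself; it is a generic low-rank approximate Newton step on a toy quadratic, using a randomized sketch of the Hessian via Hessian-vector products. The names `hvp`, `low_rank_newton_step`, and the damping parameter `lam` are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic objective f(w) = 0.5 w^T A w - b^T w, so the Hessian is A.
# A is built with a dominant rank-10 part plus small damping, the regime
# where a low-rank Hessian surrogate is most effective.
d = 50
V = np.linalg.qr(rng.standard_normal((d, 10)))[0]
A = V @ np.diag(rng.uniform(1.0, 5.0, 10)) @ V.T + 0.1 * np.eye(d)
b = rng.standard_normal(d)

def grad(w):
    return A @ w - b

def hvp(w, v):
    # Hessian-vector product. In practice this would be a stochastic
    # mini-batch estimate computed without materializing the Hessian.
    return A @ v

def low_rank_newton_step(w, k=10, n_probe=20, lam=0.1):
    """One approximate Newton step (illustrative, not the SPAN method):
    sketch the Hessian with HVPs against random probes, keep a rank-k
    eigendecomposition, and invert the damped surrogate cheaply."""
    # Randomized range finding: probe the Hessian through HVPs only.
    Omega = rng.standard_normal((d, n_probe))
    Y = np.column_stack([hvp(w, Omega[:, i]) for i in range(n_probe)])
    Q, _ = np.linalg.qr(Y)                     # orthonormal basis for range(H)
    B = np.column_stack([hvp(w, Q[:, i]) for i in range(Q.shape[1])])
    H_small = Q.T @ B                          # projected (n_probe x n_probe) Hessian
    evals, evecs = np.linalg.eigh((H_small + H_small.T) / 2)
    idx = np.argsort(evals)[::-1][:k]          # keep the top-k eigenpairs
    U = Q @ evecs[:, idx]
    s = evals[idx]
    g = grad(w)
    # Apply (U diag(s) U^T + lam I)^{-1} to g: components in span(U)
    # are scaled by 1/(s + lam), the orthogonal remainder by 1/lam.
    coeff = U.T @ g
    return (U * (coeff / (s + lam))).sum(axis=1) + (g - U @ coeff) / lam

w = np.zeros(d)
for _ in range(20):
    w = w - low_rank_newton_step(w)

w_star = np.linalg.solve(A, b)                 # exact Newton solution
print(np.linalg.norm(w - w_star) / np.linalg.norm(w_star))
```

On this toy problem each step costs O(d · n_probe) HVPs plus an O(n_probe^3) small eigendecomposition, instead of the O(d^3) cost of inverting the exact Hessian, which is the trade-off between convergence rate and per-iteration efficiency that the abstract refers to.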

Published

2020-04-03

How to Cite

Huang, X., Liang, X., Liu, Z., Li, L., Yu, Y., & Li, Y. (2020). SPAN: A Stochastic Projected Approximate Newton Method. Proceedings of the AAAI Conference on Artificial Intelligence, 34(02), 1520-1527. https://doi.org/10.1609/aaai.v34i02.5511

Section

AAAI Technical Track: Constraint Satisfaction and Optimization