Incremental Quasi-Newton Methods with Faster Superlinear Convergence Rates

Authors

  • Zhuanghua Liu, National University of Singapore; CNRS@CREATE LTD, 1 Create Way, #08-01 CREATE Tower, Singapore 138602
  • Luo Luo, Fudan University
  • Bryan Kian Hsiang Low, National University of Singapore

DOI:

https://doi.org/10.1609/aaai.v38i13.29319

Keywords:

ML: Optimization

Abstract

We consider the finite-sum optimization problem, where each component function is strongly convex and has a Lipschitz continuous gradient and Hessian. The recently proposed incremental quasi-Newton method is based on the BFGS update and achieves a local superlinear convergence rate that depends on the condition number of the problem. This paper proposes a more efficient quasi-Newton method by incorporating the symmetric rank-1 (SR1) update into the incremental framework, which yields a condition-number-free local superlinear convergence rate. Furthermore, our method can be accelerated by applying a block update to the Hessian approximation, which leads to an even faster local convergence rate. Numerical experiments show that the proposed methods significantly outperform the baselines.
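
To make the idea concrete, below is a minimal sketch of a cyclic incremental quasi-Newton loop that maintains one SR1 Hessian approximation per component function, run on a toy quadratic finite sum. This is an illustration under stated assumptions, not the authors' exact algorithm (the paper's precise update rules, corrections, and the block-update variant are not reproduced here); the names sr1_update and incremental_qn_sr1, the test problem, and the initialization with an upper curvature bound are all illustrative choices.

# Illustrative sketch only: cyclic incremental quasi-Newton with
# per-component SR1 Hessian approximations on a toy quadratic finite sum.
# Function and variable names are hypothetical, not taken from the paper.
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    # Symmetric rank-1 update: B+ = B + r r^T / (r^T s), where r = y - B s.
    # Skip the update when the denominator is tiny (standard SR1 safeguard).
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

def incremental_qn_sr1(grads, B0, x0, n_epochs=50):
    # grads[i](x) returns the gradient of the i-th component f_i.
    # For each component i, store a point z[i], its gradient g[i], and an
    # SR1 Hessian approximation B[i]; each step refreshes one component
    # cyclically and re-solves the aggregated quadratic model.
    n = len(grads)
    z = np.tile(x0.astype(float), (n, 1))
    g = np.array([grads[i](x0) for i in range(n)])
    B = np.stack([B0.copy() for _ in range(n)])
    x = x0.astype(float)
    for t in range(n_epochs * n):
        i = t % n  # cyclic component selection
        s = x - z[i]
        if np.linalg.norm(s) > 0:
            B[i] = sr1_update(B[i], s, grads[i](x) - g[i])
        z[i], g[i] = x, grads[i](x)
        # Aggregated step: x <- (sum_i B_i)^{-1} (sum_i B_i z_i - sum_i g_i)
        x = np.linalg.solve(B.sum(0), np.einsum('nij,nj->i', B, z) - g.sum(0))
    return x

# Toy problem: f_i(x) = 0.5 x^T A_i x - b_i^T x with curvatures in [1, 3].
# B0 = 3 I upper-bounds every A_i, which keeps the SR1 updates well behaved.
rng = np.random.default_rng(0)
d, n = 5, 8
As = [np.diag(rng.uniform(1.0, 3.0, d)) for _ in range(n)]
bs = [rng.standard_normal(d) for _ in range(n)]
grads = [lambda x, A=A, b=b: A @ x - b for A, b in zip(As, bs)]
x_star = np.linalg.solve(sum(As), sum(bs))
x_hat = incremental_qn_sr1(grads, 3.0 * np.eye(d), np.zeros(d))
print(np.linalg.norm(x_hat - x_star))  # typically close to machine precision

The aggregated step above follows the standard incremental quasi-Newton template; the only change relative to a BFGS-based variant is the single curvature update inside sr1_update, which is where the condition-number-free behavior described in the abstract originates.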

Published

2024-03-24

How to Cite

Liu, Z., Luo, L., & Low, B. K. H. (2024). Incremental Quasi-Newton Methods with Faster Superlinear Convergence Rates. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14097-14105. https://doi.org/10.1609/aaai.v38i13.29319

Issue

Vol. 38 No. 13 (2024)
Section

AAAI Technical Track on Machine Learning IV