Structured BFGS Method for Optimal Doubly Stochastic Matrix Approximation

Authors

  • Dejun Chu, Hefei University of Technology
  • Changshui Zhang, Tsinghua University
  • Shiliang Sun, East China Normal University
  • Qing Tao, Army Academy of Artillery and Air Defense

DOI:

https://doi.org/10.1609/aaai.v37i6.25877

Keywords:

ML: Optimization, CSO: Constraint Optimization, CSO: Constraint Programming

Abstract

Doubly stochastic matrices play an essential role in several areas, such as statistics and machine learning. In this paper we consider the optimal approximation of a square matrix by a doubly stochastic matrix. A structured BFGS method is proposed to solve the dual of the primal problem. The resulting algorithm builds curvature information into the diagonal components of the true Hessian, so that obtaining the descent direction from the gradient requires only additional linear cost, without explicitly storing the inverse Hessian approximation. This cost is substantially lower than the quadratic complexity of the classical BFGS algorithm. Meanwhile, a Newton-based line search method is presented for finding a suitable step size; in practice it exploits existing knowledge and takes only one iteration. The global convergence of our algorithm is established. We verify the advantages of our approach on both synthetic and real data sets. The experimental results demonstrate that our algorithm outperforms state-of-the-art solvers and enjoys outstanding scalability.
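For context, the primal problem the abstract refers to is the Frobenius-norm projection of a square matrix A onto the Birkhoff polytope: minimize ||X − A||_F² subject to X1 = 1, Xᵀ1 = 1, X ≥ 0. The sketch below is not the paper's structured BFGS method; it is a simple baseline, assuming Dykstra's alternating-projection algorithm, that illustrates the problem being solved. The helper names (`proj_affine`, `project_doubly_stochastic`) are hypothetical.

```python
import numpy as np

def proj_affine(A):
    # Closed-form projection onto the affine set {X : X @ 1 = 1, X.T @ 1 = 1}.
    # Derived from the KKT conditions of min ||X - A||_F^2 under the two
    # linear constraints (X = A + mu 1^T + 1 nu^T for multipliers mu, nu).
    n = A.shape[0]
    ones = np.ones(n)
    a = A @ ones          # row sums
    b = A.T @ ones        # column sums
    s = ones @ a          # total sum
    mu = (ones - a) / n
    nu = (ones - b) / n - ((n - s) / n**2) * ones
    return A + np.outer(mu, ones) + np.outer(ones, nu)

def project_doubly_stochastic(A, iters=1000):
    # Dykstra's alternating projections between the affine row/column-sum
    # set and the nonnegative orthant; converges to the Frobenius-norm
    # projection of A onto the doubly stochastic matrices.
    X = A.copy()
    p = np.zeros_like(A)  # correction term for the affine step
    q = np.zeros_like(A)  # correction term for the nonnegativity step
    for _ in range(iters):
        Y = proj_affine(X + p)
        p = X + p - Y
        X = np.maximum(Y + q, 0.0)  # projection onto X >= 0 is a clip
        q = Y + q - X
    return X
```

This baseline costs O(n²) per iteration but may need many iterations; the abstract's point is that the proposed structured BFGS dual solver obtains descent directions at only linear additional cost per step over the gradient computation.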

Published

2023-06-26

How to Cite

Chu, D., Zhang, C., Sun, S., & Tao, Q. (2023). Structured BFGS Method for Optimal Doubly Stochastic Matrix Approximation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 7193-7201. https://doi.org/10.1609/aaai.v37i6.25877

Section

AAAI Technical Track on Machine Learning I