Breaking the Convergence Barrier: Optimization via Fixed-Time Convergent Flows

Authors

  • Param Budhraja (Indian Institute of Technology, Kharagpur)
  • Mayank Baranwal (Tata Consultancy Services Research, Mumbai)
  • Kunal Garg (University of California, Santa Cruz)
  • Ashish Hota (Indian Institute of Technology, Kharagpur)

DOI:

https://doi.org/10.1609/aaai.v36i6.20559

Keywords:

Machine Learning (ML)

Abstract

Accelerated gradient methods are a cornerstone of large-scale, data-driven optimization problems that arise naturally in machine learning and other fields concerned with data analysis. We introduce a gradient-based optimization framework for achieving acceleration, based on the recently introduced notion of fixed-time stability of dynamical systems. The method can be seen as a generalization of simple gradient-based methods, suitably scaled to achieve convergence to the optimizer in a fixed time that is independent of the initialization. We achieve this by first leveraging a continuous-time framework for designing fixed-time stable dynamical systems, and then providing a consistent discretization strategy, such that the equivalent discrete-time algorithm tracks the optimizer in a practically fixed number of iterations. We also provide a theoretical analysis of the convergence behavior of the proposed gradient flows, and of their robustness to additive disturbances, for functions that are strongly convex, strictly convex, or possibly nonconvex but satisfying the Polyak-Łojasiewicz inequality. We further show that the regret bound on the convergence rate is constant by virtue of the fixed-time convergence. The hyperparameters have intuitive interpretations and can be tuned to meet the desired convergence rates. We validate the accelerated convergence properties of the proposed schemes on a range of numerical examples against state-of-the-art optimization algorithms. Our work provides insights into developing novel optimization algorithms via the discretization of continuous-time flows.
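
To make the idea concrete, the sketch below simulates a two-term rescaled gradient flow of the kind studied in the fixed-time-stability literature, discretized with plain forward Euler. This is an illustrative assumption rather than the paper's algorithm: the function fxts_step, the constants c1, c2, p1, p2, the step size eta, and the toy quadratic are all hypothetical choices, and the paper's consistent discretization strategy is more careful than the naive Euler step used here.

    import numpy as np

    # A minimal sketch, NOT the paper's exact scheme: a rescaled gradient
    # flow in the fixed-time-stability style,
    #   dx/dt = -c1 * g/||g||^((p1-2)/(p1-1)) - c2 * g/||g||^((p2-2)/(p2-1)),
    # with g = grad f(x), p1 > 2 and 1 < p2 < 2, discretized by forward
    # Euler. All constants here are illustrative, untuned choices.

    def fxts_step(x, grad_f, c1=1.0, c2=1.0, p1=4.0, p2=1.5, eta=0.05, eps=1e-12):
        """One forward-Euler step of the rescaled gradient flow."""
        g = grad_f(x)
        n = np.linalg.norm(g)
        if n < eps:  # numerically at a critical point; stop moving
            return x
        # Near the optimizer (small n) the p1 term dominates; far from it
        # (large n) the p2 term dominates, bounding the total travel time.
        d = (c1 * g / n ** ((p1 - 2.0) / (p1 - 1.0))
             + c2 * g / n ** ((p2 - 2.0) / (p2 - 1.0)))
        return x - eta * d

    # Toy strongly convex objective f(x) = 0.5 * ||x||^2, minimizer at 0.
    grad_f = lambda x: x

    x_flow = np.array([1.2, -1.6])   # ||x0|| = 2
    x_gd = x_flow.copy()
    for _ in range(50):
        x_flow = fxts_step(x_flow, grad_f)
        x_gd = x_gd - 0.05 * grad_f(x_gd)  # vanilla gradient descent, same eta

    print("rescaled flow   :", np.linalg.norm(x_flow))
    print("gradient descent:", np.linalg.norm(x_gd))

The two printed norms compare how far each iterate is from the optimizer after the same number of steps; the gradient rescaling is what makes the flow fast both far from and close to the optimizer, which plain gradient descent lacks.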

Published

2022-06-28

How to Cite

Budhraja, P., Baranwal, M., Garg, K., & Hota, A. (2022). Breaking the Convergence Barrier: Optimization via Fixed-Time Convergent Flows. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6115-6122. https://doi.org/10.1609/aaai.v36i6.20559

Issue

Vol. 36 No. 6 (2022)

Section

AAAI Technical Track on Machine Learning I