A Fully Single Loop Algorithm for Bilevel Optimization without Hessian Inverse

Authors

  • Junyi Li, University of Pittsburgh
  • Bin Gu, MBZUAI
  • Heng Huang, University of Pittsburgh

DOI:

https://doi.org/10.1609/aaai.v36i7.20706

Keywords:

Machine Learning (ML)

Abstract

In this paper, we propose a novel Hessian-inverse-free Fully Single Loop Algorithm (FSLA) for bilevel optimization problems. Classic algorithms for bilevel optimization admit a double-loop structure, which is computationally expensive. Recently, several single-loop algorithms have been proposed that optimize the inner and outer variables alternately. However, these algorithms are not yet fully single loop, as they overlook the loop needed to evaluate the hyper-gradient for a given inner and outer state. To develop a fully single-loop algorithm, we first study the structure of the hyper-gradient and identify a general approximation formulation of hyper-gradient computation that encompasses several previous common approaches, e.g., back-propagation through time, conjugate gradient, etc. Based on this formulation, we introduce a new state variable to maintain the historical hyper-gradient information. Combining our new formulation with the alternating update of the inner and outer variables, we propose an efficient fully single-loop algorithm. We theoretically show that the error generated by the new state can be bounded and that our algorithm converges. Finally, we verify the efficacy of our algorithm empirically through multiple bilevel-optimization-based machine learning tasks. A long version of this paper can be found at: https://arxiv.org/abs/2112.04660.
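To make the fully single-loop idea concrete, the sketch below performs, in every iteration, one gradient step on the inner variable, one update of an auxiliary state v that tracks the Hessian-inverse-vector product appearing in the hyper-gradient, and one outer step using only Hessian-vector products. It is a minimal illustration on a toy quadratic bilevel problem under assumed step sizes and variable names; it is not the paper's experimental setup or its exact FSLA update rules.

```python
import numpy as np

# Toy quadratic bilevel problem (illustrative assumption, not the paper's experiments):
#   inner:  y*(x) = argmin_y g(x, y) = 0.5 * ||y - A x||^2
#   outer:  min_x  f(x, y*(x))      = 0.5 * ||y*(x) - b||^2
# Hyper-gradient: grad_x f - (grad^2_xy g) [grad^2_yy g]^{-1} grad_y f.
# The state v approximates [grad^2_yy g]^{-1} grad_y f with one cheap step per
# iteration, so no Hessian inverse and no extra inner loop is needed.

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))
A /= np.linalg.norm(A, 2)          # scale for stability with the step sizes below
b = rng.standard_normal(d)

x = np.zeros(d)                    # outer variable
y = np.zeros(d)                    # inner variable
v = np.zeros(d)                    # state variable carrying hyper-gradient history

alpha, beta, gamma = 0.5, 0.5, 0.1  # step sizes (assumed, not tuned)

for t in range(2000):
    # One gradient step on the inner problem.
    grad_y_g = y - A @ x
    y = y - alpha * grad_y_g

    # One step moving v toward the solution of (grad^2_yy g) v = grad_y f.
    grad_y_f = y - b
    hvp_yy_v = v                   # grad^2_yy g = I for this toy g
    v = v - beta * (hvp_yy_v - grad_y_f)

    # Hyper-gradient estimate via a Hessian-vector product, no inverse.
    hvp_xy_v = -A.T @ v            # (grad^2_xy g) v = -A^T v for this toy g
    hyper_grad = -hvp_xy_v         # grad_x f = 0 for this toy f
    x = x - gamma * hyper_grad

# Since y*(x) = A x here, the outer objective is 0.5 * ||A x - b||^2.
print("final outer objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Because every quantity above is a single vector update or Hessian-vector product, the per-iteration cost stays comparable to plain gradient descent, which is the point of maintaining the historical hyper-gradient information in v rather than re-solving a linear system each iteration.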

Published

2022-06-28

How to Cite

Li, J., Gu, B., & Huang, H. (2022). A Fully Single Loop Algorithm for Bilevel Optimization without Hessian Inverse. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7426-7434. https://doi.org/10.1609/aaai.v36i7.20706

Section

AAAI Technical Track on Machine Learning II