Efficient Riemannian Meta-Optimization by Implicit Differentiation


  • Xiaomeng Fan Beijing Institute of Technology
  • Yuwei Wu Beijing Institute of Technology
  • Zhi Gao Beijing Institute of Technology
  • Yunde Jia Beijing Institute of Technology
  • Mehrtash Harandi Monash University




Constraint Satisfaction And Optimization (CSO), Machine Learning (ML)


To solve optimization problems with nonlinear constraints, recently developed Riemannian meta-optimization methods show promise: they train neural networks as optimizers to perform optimization on Riemannian manifolds. A key challenge is their heavy computational and memory burden, because computing the meta-gradient with respect to the optimizer involves a series of time-consuming derivatives and requires storing large computation graphs in memory. In this paper, we propose an efficient Riemannian meta-optimization method that decouples the complex computation scheme from the meta-gradient. We derive Riemannian implicit differentiation to compute the meta-gradient by establishing a link between Riemannian optimization and the implicit function theorem. As a result, updating our optimizer depends only on the final two iterations, which significantly speeds up our method and reduces its memory footprint. We theoretically study the computational load and memory footprint of our method for long optimization trajectories, and conduct an empirical study to demonstrate its benefits. Evaluations on three optimization problems on different Riemannian manifolds show that our method achieves state-of-the-art performance in terms of convergence speed and quality of optima.
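The core idea of implicit differentiation for meta-gradients can be illustrated on a toy Euclidean bilevel problem (a hedged sketch only; the paper's actual derivation is Riemannian and the functions `f`, `L` below are assumed examples, not from the paper). Instead of backpropagating through the whole inner optimization trajectory, the implicit function theorem gives the meta-gradient from quantities evaluated at the converged solution alone:

```python
# Toy illustration of implicit differentiation for meta-gradients
# (assumed Euclidean example; the paper works on Riemannian manifolds).
# Inner problem: w*(theta) = argmin_w f(w, theta), with f(w, theta) = (w - theta)^2
# Outer loss:    L(theta)  = w*(theta)^2

def inner_grad(w, theta):
    """df/dw for the inner objective."""
    return 2.0 * (w - theta)

def solve_inner(theta, lr=0.1, steps=200):
    """Plain gradient descent to (approximately) the inner fixed point."""
    w = 0.0
    for _ in range(steps):
        w -= lr * inner_grad(w, theta)
    return w

def implicit_meta_grad(theta):
    """Meta-gradient dL/dtheta via the implicit function theorem.

    Only needs second derivatives of f at w*; no computation graph
    over the optimization trajectory is stored.
    """
    w_star = solve_inner(theta)
    dL_dw = 2.0 * w_star            # outer gradient at w*
    hess_ww = 2.0                   # d^2 f / dw^2     (analytic for this toy f)
    hess_wtheta = -2.0              # d^2 f / dw dtheta
    # Implicit function theorem: dw*/dtheta = -(d^2f/dw^2)^{-1} d^2f/(dw dtheta)
    dw_dtheta = -hess_wtheta / hess_ww
    return dL_dw * dw_dtheta        # chain rule: dL/dtheta = dL/dw* * dw*/dtheta

print(implicit_meta_grad(3.0))      # analytically dL/dtheta = 2*theta = 6
```

Here the trajectory length (`steps`) affects only the inner solve, not the meta-gradient computation, mirroring the memory and speed advantage the abstract claims for long optimization trajectories.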




How to Cite

Fan, X., Wu, Y., Gao, Z., Jia, Y., & Harandi, M. (2022). Efficient Riemannian Meta-Optimization by Implicit Differentiation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(4), 3733-3740. https://doi.org/10.1609/aaai.v36i4.20287



AAAI Technical Track on Constraint Satisfaction and Optimization