A Stochastic Momentum Accelerated Quasi-Newton Method for Neural Networks (Student Abstract)

Authors

  • S. Indrapriyadarsini, Shizuoka University, Japan
  • Shahrzad Mahboubi, Shonan Institute of Technology, Japan
  • Hiroshi Ninomiya, Shonan Institute of Technology, Japan
  • Takeshi Kamio, Hiroshima City University, Japan
  • Hideki Asai, Shizuoka University, Japan

DOI:

https://doi.org/10.1609/aaai.v36i11.21623

Keywords:

Neural Networks, Stochastic Method, Momentum Acceleration, Online Training, Nesterov's Accelerated Gradient, Quasi-Newton, Limited Memory

Abstract

Incorporating curvature information into stochastic methods has been a challenging task. This paper proposes a momentum-accelerated BFGS quasi-Newton method, in both its full and limited-memory forms, for solving stochastic large-scale non-convex optimization problems in neural networks (NNs).
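Since this page carries only the abstract, the sketch below illustrates the general idea rather than the authors' exact algorithm: a Nesterov-style lookahead gradient combined with a standard BFGS inverse-Hessian update on minibatches. The toy least-squares problem, the momentum mu, step size alpha, and the curvature-pair construction are all illustrative assumptions.

```python
# Minimal sketch of a momentum-accelerated stochastic BFGS step.
# NOT the paper's algorithm; problem and hyperparameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 10

# Toy stochastic objective: linear least squares sampled in minibatches.
A_full = rng.normal(size=(256, n))
b_full = A_full @ rng.normal(size=n) + 0.1 * rng.normal(size=256)

def grad_on(idx, w):
    """Gradient of the mean squared residual on minibatch `idx`."""
    A, b = A_full[idx], b_full[idx]
    return A.T @ (A @ w - b) / len(idx)

w = np.zeros(n)                 # parameters
v = np.zeros(n)                 # momentum term
H = np.eye(n)                   # inverse-Hessian approximation
I = np.eye(n)
mu, alpha = 0.8, 0.1            # momentum and step size (assumed values)

for k in range(300):
    idx = rng.integers(0, A_full.shape[0], size=32)  # one minibatch per step
    lookahead = w + mu * v                  # Nesterov-style lookahead point
    g_la = grad_on(idx, lookahead)          # gradient at the lookahead
    v = mu * v - alpha * (H @ g_la)         # momentum-accelerated QN step
    w_new = w + v
    # Curvature pair between new iterate and lookahead; the same minibatch
    # is reused for both gradients to keep the secant pair consistent.
    s = w_new - lookahead
    y = grad_on(idx, w_new) - g_la
    sy = s @ y
    if sy > 1e-10:                          # update only on positive curvature
        rho = 1.0 / sy
        V = I - rho * np.outer(s, y)        # BFGS inverse-Hessian recursion
        H = V @ H @ V.T + rho * np.outer(s, s)
    w = w_new

print("final minibatch gradient norm:",
      np.linalg.norm(grad_on(np.arange(256), w)))
```

A limited-memory variant would store only the most recent (s, y) pairs and apply the classic two-loop recursion instead of the dense matrix H, which is what makes such methods viable at neural-network scale.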

Published

2022-06-28

How to Cite

Indrapriyadarsini, S., Mahboubi, S., Ninomiya, H., Kamio, T., & Asai, H. (2022). A Stochastic Momentum Accelerated Quasi-Newton Method for Neural Networks (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12973-12974. https://doi.org/10.1609/aaai.v36i11.21623