A Stochastic Momentum Accelerated Quasi-Newton Method for Neural Networks (Student Abstract)
DOI: https://doi.org/10.1609/aaai.v36i11.21623
Keywords: Neural Networks, Stochastic Method, Momentum Acceleration, Online Training, Nesterov's Accelerated Gradient, Quasi-Newton, Limited Memory
Abstract
Incorporating curvature information into stochastic methods has been a challenging task. This paper proposes a momentum-accelerated BFGS quasi-Newton method, in both its full and limited-memory forms, for solving stochastic large-scale non-convex optimization problems in neural networks (NN).
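The abstract combines Nesterov-style momentum acceleration with (L-)BFGS curvature updates. As a rough illustration only — not the authors' actual algorithm, whose details are in the cited paper — the sketch below shows one generic way these ingredients can fit together: the gradient is evaluated at a Nesterov look-ahead point, curvature pairs feed the standard L-BFGS two-loop recursion, and the resulting direction drives a momentum update. The function names (`two_loop`, `momentum_lbfgs`), all hyperparameters, and the deterministic quadratic objective (standing in for a stochastic minibatch NN loss) are illustrative assumptions.

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: approximates H_inv @ grad."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:  # scale by (s.y)/(y.y) as the initial inverse-Hessian guess
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def momentum_lbfgs(grad_fn, w0, lr=0.1, mu=0.8, mem=5, steps=100):
    """Illustrative sketch: Nesterov look-ahead + L-BFGS direction.
    grad_fn would be a stochastic (minibatch) gradient in the NN setting."""
    w = w0.copy()
    v = np.zeros_like(w)
    s_list, y_list = [], []
    g_prev, w_prev = None, None
    for _ in range(steps):
        w_ahead = w + mu * v           # Nesterov look-ahead point
        g = grad_fn(w_ahead)           # gradient at the look-ahead point
        if g_prev is not None:
            s, y = w_ahead - w_prev, g - g_prev
            if s @ y > 1e-10:          # curvature condition; skip bad pairs
                s_list.append(s); y_list.append(y)
                if len(s_list) > mem:  # limited-memory window
                    s_list.pop(0); y_list.pop(0)
        d = two_loop(g, s_list, y_list)
        v = mu * v - lr * d            # momentum update on the QN direction
        w = w + v
        g_prev, w_prev = g, w_ahead
    return w
```

On a small quadratic with a known minimizer, the sketch converges quickly because the two-loop recursion recovers the (inverse) Hessian from a few curvature pairs while the momentum term accelerates along low-curvature directions.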
Published: 2022-06-28
How to Cite:
Indrapriyadarsini, S., Mahboubi, S., Ninomiya, H., Kamio, T., & Asai, H. (2022). A Stochastic Momentum Accelerated Quasi-Newton Method for Neural Networks (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12973-12974. https://doi.org/10.1609/aaai.v36i11.21623
Section: AAAI Student Abstract and Poster Program