Improving First-Order Optimization Algorithms (Student Abstract)
This paper presents a simple and intuitive technique to accelerate the convergence of first-order optimization algorithms. The proposed method modifies the update rule based on how the direction of the gradient changes relative to the previous step taken during training. Experimental results show that the technique has the potential to significantly improve the performance of existing first-order optimization algorithms.
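The abstract does not give the exact update rule, but a plausible minimal sketch of the idea is a per-coordinate rule that compares the sign of the current gradient with the previous one: where the direction is unchanged the step size grows, and where it flips (suggesting an overshoot) the step size shrinks. Everything below (the function name `direction_aware_step`, the growth/shrink factors `up`/`down`, and the cap of 1.0) is a hypothetical illustration of this family of methods, not the authors' algorithm.

```python
import numpy as np

def direction_aware_step(w, grad, prev_grad, lr, up=1.2, down=0.5):
    """One step of a hypothetical direction-aware first-order update.

    Per coordinate: grow the learning rate where the gradient kept its
    sign (consistent direction), shrink it where the sign flipped.
    All constants here are illustrative assumptions.
    """
    agree = np.sign(grad) == np.sign(prev_grad)
    # Cap the grown rate at 1.0 so steps stay stable on this demo problem.
    lr = np.where(agree, np.minimum(lr * up, 1.0), lr * down)
    return w - lr * grad, lr

# Demo: minimize f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([5.0, -3.0])
lr = np.full_like(w, 0.1)
prev_grad = np.zeros_like(w)
for _ in range(50):
    grad = w
    w, lr = direction_aware_step(w, grad, prev_grad, lr)
    prev_grad = grad
print(w)  # close to the minimizer at the origin
```

On this quadratic the gradient direction never flips, so the per-coordinate rates grow until the iterate reaches the minimum; the sign-flip branch matters on noisier or non-quadratic losses.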
How to Cite
Ange, T., & Roger, N. (2020). Improving First-Order Optimization Algorithms (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 34(10), 13935-13936. https://doi.org/10.1609/aaai.v34i10.7240
Student Abstract Track