Stochastic Non-Convex Ordinal Embedding With Stabilized Barzilai-Borwein Step Size

Authors

  • Ke Ma Institute of Information Engineering, Chinese Academy of Sciences; School of Cyber Security, University of Chinese Academy of Sciences
  • Jinshan Zeng School of Computer Information Engineering, Jiangxi Normal University; Hong Kong University of Science and Technology
  • Jiechao Xiong Tencent AI Lab
  • Qianqian Xu Institute of Information Engineering, Chinese Academy of Sciences
  • Xiaochun Cao Institute of Information Engineering, Chinese Academy of Sciences
  • Wei Liu Tencent AI Lab
  • Yuan Yao Hong Kong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v32i1.11599

Keywords:

Stochastic Optimization, Non-convex Optimization, Barzilai-Borwein Step Size

Abstract

Learning a representation from relative similarity comparisons, often called ordinal embedding, has gained rising attention in recent years. Most existing methods are batch methods built mainly on convex optimization, e.g., the projected gradient descent method. However, they are generally time-consuming because a singular value decomposition (SVD) is commonly required at each update, especially when the data size is very large. To overcome this challenge, we propose a stochastic algorithm called SVRG-SBB with the following features: (a) it is SVD-free by dropping convexity and scales well through the use of a stochastic algorithm, namely stochastic variance reduced gradient (SVRG); and (b) it chooses the step size adaptively via a new stabilized Barzilai-Borwein (SBB) method, since the original version designed for convex problems may fail in the considered stochastic non-convex optimization problem. Moreover, we show that the proposed algorithm converges to a stationary point at a rate of O(1/T) in our setting, where T is the total number of iterations. Extensive simulations and real-world experiments demonstrate the effectiveness of the proposed algorithm in comparison with state-of-the-art methods; in particular, it attains much lower computational cost with good prediction performance.
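To make the step-size idea concrete, below is a minimal sketch of SVRG with a BB-type step size on a toy least-squares problem. The stabilization shown here (taking the absolute value of the curvature term and adding an eps*||dx||^2 term in the denominator) is an assumed reading of "stabilized", and the problem, function names, and hyperparameters are illustrative, not the paper's ordinal embedding setup.

```python
# Sketch: SVRG with a stabilized Barzilai-Borwein (SBB) style step size.
# The exact SBB rule here is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10                          # number of samples / parameter dimension
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component loss 0.5*(a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full-batch gradient of the average loss."""
    return A.T @ (A @ x - b) / n

def svrg_sbb(x0, n_epochs=30, m=None, eta0=1e-3, eps=1e-2):
    m = m or n                          # inner-loop length
    x_tilde, eta = x0.copy(), eta0
    prev_x, prev_g = None, None
    for s in range(n_epochs):
        g_tilde = full_grad(x_tilde)    # full gradient snapshot
        if prev_x is not None:
            dx, dg = x_tilde - prev_x, g_tilde - prev_g
            # BB-type step size with stabilization: |<dx, dg>| guards against
            # negative curvature (non-convexity); eps*||dx||^2 keeps the
            # denominator bounded away from zero.
            eta = (dx @ dx) / (m * (abs(dx @ dg) + eps * (dx @ dx)) + 1e-12)
        prev_x, prev_g = x_tilde.copy(), g_tilde.copy()
        x = x_tilde.copy()
        for _ in range(m):              # SVRG inner loop
            i = rng.integers(n)
            # Variance-reduced stochastic gradient.
            v = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
            x -= eta * v
        x_tilde = x
    return x_tilde

x_hat = svrg_sbb(np.zeros(d))
print("final objective:", 0.5 * np.mean((A @ x_hat - b) ** 2))
```

The key design point is that the step size is recomputed only once per epoch from the snapshot iterates and their full gradients, so the adaptive choice adds essentially no overhead to the SVRG inner loop.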

Published

2018-04-29

How to Cite

Ma, K., Zeng, J., Xiong, J., Xu, Q., Cao, X., Liu, W., & Yao, Y. (2018). Stochastic Non-Convex Ordinal Embedding With Stabilized Barzilai-Borwein Step Size. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11599