SC2Net: Sparse LSTMs for Sparse Coding

Authors

  • Joey Tianyi Zhou Institute of High Performance Computing, A*STAR
  • Kai Di Institute of High Performance Computing, A*STAR
  • Jiawei Du Institute of High Performance Computing, A*STAR
  • Xi Peng College of Computer Science, Sichuan University
  • Hao Yang Amazon, Seattle
  • Sinno Jialin Pan Nanyang Technological University
  • Ivor Tsang University of Technology Sydney
  • Yong Liu Institute of High Performance Computing, A*STAR
  • Zheng Qin Institute of High Performance Computing, A*STAR
  • Rick Siow Mong Goh Institute of High Performance Computing, A*STAR

DOI:

https://doi.org/10.1609/aaai.v32i1.11721

Keywords:

sparse representation, LSTM, RNN, model-based optimization

Abstract

The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for computing sparse codes. However, ISTA suffers from the following problems: 1) ISTA employs a non-adaptive updating strategy, learning the parameters on each dimension with a fixed learning rate; such a strategy may lead to inferior performance due to the lack of diversity. 2) ISTA does not incorporate historical information into its updating rules, although such information has been proven helpful for speeding up convergence. To address these issues, we propose a novel formulation of ISTA (named adaptive ISTA) by introducing an adaptive momentum vector. To solve the proposed adaptive ISTA efficiently, we recast it as a recurrent neural network unit and show its connection with the well-known long short-term memory (LSTM) model. Building on the proposed unit, we present a neural network (termed SC2Net) that computes sparse codes in an end-to-end manner. To the best of our knowledge, this is one of the first works to bridge the $\ell_1$-solver and LSTM, and it may provide novel insights into model-based optimization and LSTM. Extensive experiments show the effectiveness of our method on both unsupervised and supervised tasks.
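For context, the classical ISTA that the abstract builds on alternates a gradient step on the data-fitting term with element-wise soft-thresholding. The sketch below is a minimal NumPy illustration of that baseline (not the paper's adaptive ISTA or SC2Net); the function names, the fixed step size $1/L$, and the stopping rule are illustrative choices, with $L$ the Lipschitz constant of the gradient.

```python
import numpy as np

def soft_threshold(v, tau):
    # Element-wise soft-thresholding (shrinkage) operator:
    # the proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(x, D, lam=0.1, n_iter=200):
    """Classical ISTA for min_z 0.5 * ||x - D z||^2 + lam * ||z||_1.

    Uses a fixed step size 1/L for every coordinate, where L is the
    largest eigenvalue of D^T D -- exactly the non-adaptive updating
    strategy the abstract criticizes.
    """
    L = np.linalg.norm(D, ord=2) ** 2  # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)           # gradient of the smooth term
        z = soft_threshold(z - grad / L, lam / L)
    return z
```

Because the same learning rate 1/L is applied to every dimension and no momentum (historical gradient information) is kept, convergence can be slow; the paper's adaptive ISTA addresses both limitations with a per-dimension adaptive momentum vector.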

Published

2018-04-29

How to Cite

Zhou, J. T., Di, K., Du, J., Peng, X., Yang, H., Pan, S. J., Tsang, I., Liu, Y., Qin, Z., & Goh, R. S. M. (2018). SC2Net: Sparse LSTMs for Sparse Coding. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11721