Accelerated Sparse Linear Regression via Random Projection

Authors

  • Weizhong Zhang, Zhejiang University
  • Lijun Zhang, Nanjing University
  • Rong Jin, Alibaba Group
  • Deng Cai, Zhejiang University
  • Xiaofei He, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v30i1.10260

Keywords:

Lasso, Convex Optimization, Machine Learning

Abstract

In this paper, we present an accelerated numerical method based on random projection for sparse linear regression. Previous studies have shown that under appropriate conditions, gradient-based methods enjoy a geometric convergence rate when applied to this problem. However, the time complexity of evaluating the gradient is as large as $\mathcal{O}(nd)$, where $n$ is the number of data points and $d$ is the dimensionality, making those methods inefficient for large-scale, high-dimensional datasets. To address this limitation, we first utilize random projection to find a rank-$k$ approximator for the data matrix, reducing the cost of gradient evaluation to $\mathcal{O}(nk+dk)$, a significant improvement when $k$ is much smaller than $d$ and $n$. Then, we solve the sparse linear regression problem via a proximal gradient method with a homotopy strategy to generate sparse intermediate solutions. Theoretical analysis shows that our method also achieves a global geometric convergence rate, and moreover the sparsity of all the intermediate solutions is well bounded over the iterations. Finally, we conduct experiments to demonstrate the efficiency of the proposed method.
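The gradient-cost reduction described in the abstract can be illustrated with a small numerical sketch. This is not the authors' implementation: the variable names and the use of a QR-orthonormalized Gaussian sketch to build the rank-$k$ approximator are our assumptions. The point is only that once $X \approx QB$ with orthonormal $Q$, the gradient of $\tfrac{1}{2}\|Xw - y\|^2$ can be evaluated in $\mathcal{O}(nk+dk)$ instead of $\mathcal{O}(nd)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 200, 10

# Synthetic data matrix that is exactly rank-k, plus targets and a weight vector.
X = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))
y = rng.standard_normal(n)
w = rng.standard_normal(d)

# Random projection: sketch the range of X with a Gaussian test matrix,
# orthonormalize, and form a rank-k factorization X ~= Q @ B.
Omega = rng.standard_normal((d, k))   # random projection matrix (assumption: Gaussian)
Q, _ = np.linalg.qr(X @ Omega)        # n x k orthonormal basis for the sketched range
B = Q.T @ X                           # k x d factor

# Full gradient of 0.5 * ||Xw - y||^2: two O(nd) matrix-vector products.
grad_full = X.T @ (X @ w - y)

# Approximate gradient via the factorization: since Q^T Q = I,
# (QB)^T (Q B w - y) = B^T (B w - Q^T y), costing only O(nk + dk)
# once Q^T y is precomputed.
Qty = Q.T @ y                         # O(nk), reusable across iterations
grad_fast = B.T @ (B @ w - Qty)

# For a (near) rank-k matrix the two gradients agree up to floating point.
print(np.max(np.abs(grad_full - grad_fast)))
```

Because the synthetic `X` here is exactly rank $k$, the sketch captures its range and the cheap gradient matches the full one; for a general matrix the gap is governed by how well the rank-$k$ approximator fits `X`.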

Published

2016-03-02

How to Cite

Zhang, W., Zhang, L., Jin, R., Cai, D., & He, X. (2016). Accelerated Sparse Linear Regression via Random Projection. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10260

Section

Technical Papers: Machine Learning Methods