Noise-Robust Semi-Supervised Learning by Large-Scale Sparse Coding

Authors

  • Zhiwu Lu, Renmin University of China
  • Xin Gao, King Abdullah University of Science and Technology
  • Liwei Wang, Peking University
  • Ji-Rong Wen, Renmin University of China
  • Songfang Huang, IBM China Research Lab

DOI:

https://doi.org/10.1609/aaai.v29i1.9551

Abstract

This paper presents a large-scale sparse coding algorithm for the challenging problem of noise-robust semi-supervised learning over very large data with only a few noisy initial labels. By giving an L1-norm formulation of Laplacian regularization based directly on the manifold structure of the data, we transform noise-robust semi-supervised learning into a generalized sparse coding problem, so that noise reduction can be imposed on the noisy initial labels. Furthermore, to keep noise-robust semi-supervised learning scalable over very large data, we exploit both nonlinear approximation and dimension reduction techniques to solve this generalized sparse coding problem in linear time and space complexity. Finally, we evaluate the proposed algorithm on the challenging task of large-scale semi-supervised image classification with only a few noisy initial labels. The experimental results on several benchmark image datasets show the promising performance of the proposed algorithm.
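To make the abstract's central idea concrete, the sketch below illustrates one plausible reading of an L1-norm Laplacian regularizer cast as a generalized sparse-coding (Lasso-style) problem on a k-NN graph, with an L1 fitting term that tolerates noise in the initial labels. It is a minimal illustration only: the helper name weighted_incidence, the toy data, and the parameters k, sigma, and lam are assumptions for the example, the objective is not guaranteed to match the paper's exact formulation, and the generic convex solve does not reflect the paper's nonlinear approximation and dimension reduction techniques that achieve linear time and space complexity.

```python
"""Illustrative sketch (not the authors' code): L1-norm graph-Laplacian
regularization written as a generalized sparse-coding / Lasso-style problem,
with an L1 data term on a few noisy initial labels."""
import numpy as np
import cvxpy as cp
from scipy.sparse import coo_matrix
from sklearn.neighbors import kneighbors_graph


def weighted_incidence(X, k=5, sigma=1.0):
    """Edge-incidence matrix M of a k-NN graph: (M @ f) lists
    sqrt(w_ij) * (f_i - f_j) over all edges, so ||M f||_1 penalizes label
    differences along the manifold (an L1 analogue of f^T L f)."""
    A = kneighbors_graph(X, n_neighbors=k, mode='distance', include_self=False)
    A = A.maximum(A.T).tocoo()                      # symmetrize the graph
    keep = A.row < A.col                            # one row per undirected edge
    i, j, d = A.row[keep], A.col[keep], A.data[keep]
    w = np.exp(-d ** 2 / (2.0 * sigma ** 2))        # Gaussian edge weights
    m = len(i)
    data = np.concatenate([np.sqrt(w), -np.sqrt(w)])
    rows = np.concatenate([np.arange(m), np.arange(m)])
    cols = np.concatenate([i, j])
    return coo_matrix((data, (rows, cols)), shape=(m, X.shape[0])).tocsr()


# Toy two-cluster data with a handful of (possibly wrong) initial labels.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y0 = np.zeros(100)              # 0 = unlabeled
y0[[0, 1, 2]] = +1              # noisy seeds for class +1
y0[[50, 51]] = -1               # noisy seeds for class -1
idx = np.flatnonzero(y0 != 0)   # indices of labeled points

M = weighted_incidence(X, k=5)
f = cp.Variable(100)
lam = 0.5   # illustrative regularization weight
# L1 fit on labeled points absorbs label noise; the L1 graph term
# propagates labels smoothly along the data manifold.
objective = cp.Minimize(cp.norm1(f[idx] - y0[idx]) + lam * cp.norm1(M @ f))
cp.Problem(objective).solve()
pred = np.sign(f.value)         # predicted labels for all points
```

In this toy setting the L1 data term keeps a single flipped seed label from dominating the solution, which is the intuition behind imposing noise reduction on the initial labels; scaling the same idea to very large datasets is what the paper's approximation and dimension-reduction machinery addresses.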

Published

2015-02-21

How to Cite

Lu, Z., Gao, X., Wang, L., Wen, J.-R., & Huang, S. (2015). Noise-Robust Semi-Supervised Learning by Large-Scale Sparse Coding. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9551

Section

Main Track: Novel Machine Learning Algorithms