Convex Sparse Coding, Subspace Learning, and Semi-Supervised Extensions

Authors

  • Xinhua Zhang, University of Alberta
  • Yaoliang Yu, University of Alberta
  • Martha White, University of Alberta
  • Ruitong Huang, University of Alberta
  • Dale Schuurmans, University of Alberta

DOI:

https://doi.org/10.1609/aaai.v25i1.7935

Abstract

Automated feature discovery is a fundamental problem in machine learning. Although classical feature discovery methods do not guarantee optimal solutions in general, it has recently been noted that certain subspace learning and sparse coding problems can be solved efficiently, provided the number of features is not restricted a priori. We provide an extended characterization of this optimality result and describe the nature of the solutions under an expanded set of practical contexts. In particular, we apply the framework to a semi-supervised learning problem, and demonstrate that feature discovery can co-occur with input reconstruction and supervised training while still admitting globally optimal solutions. A comparison to existing semi-supervised feature discovery methods shows improved generalization and efficiency.
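As generic background for the abstract's claim (not the authors' algorithm): in standard sparse coding, once the dictionary is held fixed, inferring the code for each input is already a convex Lasso problem. The sketch below illustrates that convex inner step with ISTA (iterative soft-thresholding) in NumPy; the dictionary `D`, signal `x`, and regularization weight `lam` are made-up illustration values.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the L1 norm: shrinks each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(D, x, lam, n_iters=500):
    """Solve min_c 0.5*||x - D c||^2 + lam*||c||_1 by ISTA.

    Convex in c for a fixed dictionary D, so ISTA converges to a
    global minimizer.
    """
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ c - x)             # gradient of the quadratic term
        c = soft_threshold(c - grad / L, lam / L)  # proximal gradient step
    return c

# Tiny demo: recover a 2-sparse code under a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 10))
c_true = np.zeros(10)
c_true[[2, 7]] = [1.5, -2.0]
x = D @ c_true
c_hat = ista_lasso(D, x, lam=0.1)
```

The paper's contribution goes further: it treats dictionary and codes jointly, where the problem is nonconvex for a fixed feature budget, and shows that lifting the restriction on the number of features yields a convex reformulation with a global optimum.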

Published

2011-08-04

How to Cite

Zhang, X., Yu, Y., White, M., Huang, R., & Schuurmans, D. (2011). Convex Sparse Coding, Subspace Learning, and Semi-Supervised Extensions. Proceedings of the AAAI Conference on Artificial Intelligence, 25(1), 567–573. https://doi.org/10.1609/aaai.v25i1.7935

Section

AAAI Technical Track: Machine Learning