Semi-Supervised Classifications via Elastic and Robust Embedding

Authors

  • Yun Liu University of Texas at Arlington
  • Yiming Guo Illinois Institute of Technology
  • Hua Wang Colorado School of Mines
  • Feiping Nie University of Texas at Arlington
  • Heng Huang University of Texas at Arlington

DOI:

https://doi.org/10.1609/aaai.v31i1.10946

Keywords:

Semi-Supervised Classifications, Elastic and Robust Embedding

Abstract

Transductive semi-supervised learning can only predict labels for the unlabeled data that appear in the training set; it cannot predict labels for test data never seen during training. To handle this out-of-sample problem, many inductive methods impose the constraint that the predicted label matrix be exactly equal to a linear model of the data. In practice, this constraint can be too rigid to capture the manifold structure of the data. In this paper, we relax this rigid constraint and propose an elastic constraint on the predicted label matrix so that the manifold structure can be better explored. Moreover, since unlabeled data are often abundant in practice and usually contain outliers, we use a non-squared loss instead of the traditional squared loss to learn a robust model. The derived problem, although convex, contains many nonsmooth terms, which makes it very challenging to solve. In this paper, we propose an efficient optimization algorithm for a more general problem, from which we obtain the optimal solution to the derived problem.
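To make the relaxation concrete, the following display is a minimal sketch of the kind of objective the abstract describes, not the paper's exact formulation. Here F is the predicted label matrix, L is a graph Laplacian built from labeled and unlabeled data, X is the data matrix, W and b are the parameters of a linear model, Y_l collects the given labels, and mu, gamma are hypothetical trade-off parameters introduced only for illustration. The rigid constraint F = X^T W + 1 b^T is replaced by an elastic, non-squared penalty on the residual:

\min_{F,\,W,\,b}\; \operatorname{tr}\!\left(F^{\top} L F\right) \;+\; \mu \,\bigl\| F - \bigl(X^{\top} W + \mathbf{1} b^{\top}\bigr) \bigr\|_{2,1} \;+\; \gamma \,\| W \|_F^2 \quad \text{s.t.}\; F_l = Y_l.

In this sketch the \ell_{2,1}-norm sums the non-squared row norms of the residual, so outlying unlabeled points contribute linearly rather than quadratically, while the weight \mu controls how elastically the embedding F may deviate from the linear model; both the non-squared residual term and the quadratic Laplacian term are nonsmooth or coupled, which is consistent with the optimization difficulty the abstract mentions.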

Published

2017-02-13

How to Cite

Liu, Y., Guo, Y., Wang, H., Nie, F., & Huang, H. (2017). Semi-Supervised Classifications via Elastic and Robust Embedding. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10946