Multiclass Capped ℓp-Norm SVM for Robust Classifications

Authors

  • Feiping Nie Northwestern Polytechnical University
  • Xiaoqian Wang University of Texas at Arlington
  • Heng Huang University of Texas at Arlington

DOI:

https://doi.org/10.1609/aaai.v31i1.10948

Keywords:

Capped ℓp-Norm SVM, Robust Classification

Abstract

The support vector machine (SVM) is one of the most successful machine learning models and has been applied to numerous real-world applications. Because SVM methods use the hinge loss or squared hinge loss for classification, they usually outperform other classification approaches, e.g., methods based on the least-squares loss. However, like most supervised learning algorithms, they learn classifiers from labeled training data without a specific strategy for handling noisy data. In many real-world applications, the training set contains outliers, which can misguide classifier learning and make classification performance suboptimal. To address this problem, we propose a novel capped ℓp-norm SVM classification model that uses a capped ℓp-norm based hinge loss in the objective, which can handle both light and heavy outliers. We use the new formulation to naturally build the multiclass capped ℓp-norm SVM. More importantly, we derive novel optimization algorithms to efficiently minimize the capped ℓp-norm based objectives, and rigorously prove their convergence. We present experimental results showing that the new capped ℓp-norm SVM method consistently improves classification performance, especially as the data noise level increases.
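The key idea of the capped loss can be illustrated with a minimal sketch: each sample's hinge loss is raised to the power p and then capped at a threshold, so a gross outlier contributes at most a bounded amount to the objective. The function below is a hypothetical illustration of this per-sample loss (not the authors' implementation), assuming a linear score f(x) and parameter names `p` and `eps` chosen here for exposition:

```python
import numpy as np

def capped_lp_hinge_loss(scores, y, p=1.0, eps=1.0):
    """Per-sample capped lp-norm hinge loss (illustrative sketch).

    scores : decision values f(x) for each sample
    y      : labels in {-1, +1}
    p, eps : hypothetical names for the norm power and the cap
    """
    hinge = np.maximum(0.0, 1.0 - y * scores)  # standard hinge loss
    return np.minimum(hinge ** p, eps)         # cap bounds each outlier's influence

# Toy example: the last sample is a gross outlier (score -3 with label +1),
# yet its loss is capped at eps = 1 instead of growing to 4.
scores = np.array([2.0, 0.5, -3.0])
y = np.array([1, 1, 1])
losses = capped_lp_hinge_loss(scores, y, p=1.0, eps=1.0)
```

With an uncapped hinge loss the outlier would contribute 4 to the objective and dominate the fit; under the cap its contribution is bounded by eps, which is what makes the resulting classifier robust to heavy outliers.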

Published

2017-02-13

How to Cite

Nie, F., Wang, X., & Huang, H. (2017). Multiclass Capped ℓp-Norm SVM for Robust Classifications. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10948