Random Gradient Descent Tree: A Combinatorial Approach for SVM with Outliers

Authors

  • Hu Ding, State University of New York at Buffalo
  • Jinhui Xu, State University of New York at Buffalo

DOI:

https://doi.org/10.1609/aaai.v29i1.9571

Keywords:

SVM, Outliers, Robust algorithms, Random sampling, Gradient Descent, Boosting

Abstract

Support Vector Machine (SVM) is a fundamental technique in machine learning. A long-standing challenge facing SVM is how to deal with outliers (caused by mislabeling), as they can make the classes nonseparable. Existing techniques, such as soft-margin SVM, ν-SVM, and Core-SVM, can alleviate the problem to a certain extent but cannot completely resolve it. Recently, techniques for explicit outlier removal have also become available, but they suffer from high time complexity and cannot guarantee the quality of the solution. In this paper, we present a new combinatorial approach, called Random Gradient Descent Tree (or RGD-tree), to deal with outliers explicitly; this results in a new algorithm called RGD-SVM. Our technique yields a provably good solution and can be implemented efficiently in practice. The time and space complexities of our approach depend only linearly on the input size and the dimensionality of the space, which is significantly better than existing methods. Experiments on benchmark datasets suggest that our technique considerably outperforms several popular techniques in most cases.
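To make the idea of explicit outlier removal concrete, the sketch below shows a generic trim-and-refit scheme: fit a linear classifier by subgradient descent on the hinge loss, flag the k points with the largest loss as suspected outliers (e.g., mislabeled points), and retrain on the remainder. This is an illustrative simplification only, not the authors' RGD-tree/RGD-SVM algorithm; the function names, learning rate, and epoch count are assumptions for the example.

```python
# Illustrative sketch of explicit outlier removal for linear SVM-style
# training. NOT the authors' RGD-SVM; a generic trim-and-refit baseline.

def hinge_subgradient_fit(points, labels, lr=0.1, epochs=200):
    """Fit (w, b) by subgradient descent on the average hinge loss
    max(0, 1 - y * (w . x + b)) over the data set."""
    d = len(points[0])
    w, b = [0.0] * d, 0.0
    n = len(points)
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:  # hinge subgradient is nonzero only here
                for i in range(d):
                    gw[i] -= y * x[i]
                gb -= y
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def fit_with_outlier_removal(points, labels, k):
    """Train once, discard the k highest-loss points (treated as
    suspected outliers), then retrain on the remaining points."""
    w, b = hinge_subgradient_fit(points, labels)
    losses = [max(0.0, 1 - y * (sum(wi * xi for wi, xi in zip(w, x)) + b))
              for x, y in zip(points, labels)]
    keep = sorted(range(len(points)), key=lambda i: losses[i])[:len(points) - k]
    return hinge_subgradient_fit([points[i] for i in keep],
                                 [labels[i] for i in keep])
```

With one mislabeled point planted deep inside the opposite class, the first fit assigns it the largest hinge loss, so the trim step removes it and the refit recovers a clean separator; schemes like this, however, give no approximation guarantee, which is the gap the paper's combinatorial approach targets.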

Published

2015-02-21

How to Cite

Ding, H., & Xu, J. (2015). Random Gradient Descent Tree: A Combinatorial Approach for SVM with Outliers. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9571

Section

Main Track: Novel Machine Learning Algorithms