Selecting Proper Multi-Class SVM Training Methods

Authors

  • Yawen Chen, South China University of Technology
  • Zeyi Wen, National University of Singapore
  • Jian Chen, South China University of Technology
  • Jin Huang, South China Normal University

DOI:

https://doi.org/10.1609/aaai.v32i1.12156

Keywords:

Multi-class SVMs, one-versus-one, one-versus-all, structural SVMs

Abstract

Support Vector Machines (SVMs) are excellent candidates for solving multi-class problems, and multi-class SVMs can be trained by several different methods. Different training methods commonly produce SVMs with different effectiveness, and no multi-class SVM training method outperforms the others on all problems. This makes it difficult for practitioners to choose the best training method for a given problem. In this work, we propose a Multi-class Method Selection (MMS) approach to help users select the most appropriate method among one-versus-one (OVO), one-versus-all (OVA) and structural SVMs (SSVMs) for a given problem. Our key idea is to select the training method based on the distribution of the training data and the similarity between different classes. Using the distribution and class similarity, we estimate the unclassifiable rate of each multi-class SVM training method, and select the training method with the minimum unclassifiable rate. Our initial findings show: (i) SSVMs with linear kernels perform worse than OVO and OVA; (ii) MMS often produces SVM classifiers that can confidently classify unseen instances.
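To make the OVO/OVA distinction concrete, the sketch below (not the authors' MMS implementation; it uses scikit-learn's meta-estimators on a toy dataset) shows the two decompositions the paper selects between: OVO trains k(k-1)/2 pairwise binary SVMs, while OVA trains k per-class binary SVMs.

```python
# Illustrative sketch only, assuming scikit-learn: contrasting the
# one-versus-one (OVO) and one-versus-all (OVA) decompositions of a
# 3-class problem. This is NOT the paper's MMS selection code.
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Toy 3-class dataset (k = 3).
X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)

# OVO fits k*(k-1)/2 = 3 pairwise binary SVMs.
ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X, y)
# OVA fits k = 3 binary SVMs, each separating one class from the rest.
ova = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y)

print(len(ovo.estimators_))  # 3 pairwise classifiers
print(len(ova.estimators_))  # 3 per-class classifiers
```

The unclassifiable regions the paper estimates arise from how these binary decisions are combined: OVO votes can tie among classes, and all OVA classifiers can reject an instance; MMS picks the decomposition whose estimated unclassifiable rate is lowest.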

Published

2018-04-29

How to Cite

Chen, Y., Wen, Z., Chen, J., & Huang, J. (2018). Selecting Proper Multi-Class SVM Training Methods. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12156