Discriminative Vanishing Component Analysis

Authors

  • Chenping Hou National University of Defense Technology
  • Feiping Nie Northwestern Polytechnical University
  • Dacheng Tao University of Technology, Sydney

DOI:

https://doi.org/10.1609/aaai.v30i1.10223

Abstract

Vanishing Component Analysis (VCA) is a prominent, recently proposed method in machine learning. It narrows the gap between machine learning and computational algebra by bringing a tool from the latter, the vanishing ideal, to bear on classification problems. In this paper, we analyze VCA from the kernel point of view, another important research direction in machine learning. Under a very weak assumption, we provide a different perspective on VCA and make the kernel trick applicable to it. We demonstrate that the projection matrix derived by VCA lies in the same space as that of Kernel Principal Component Analysis (KPCA) with a polynomial kernel, and that the two sets of projections can be expressed in terms of each other by linear transformations. Furthermore, we prove that KPCA and VCA have identical discriminative power, provided that the ratio trace criterion is employed as the measurement. We also show that the kernel formed by the inner products of VCA's projections can be expressed linearly in terms of KPCA's kernel. Based on this analysis, we propose a novel Discriminative Vanishing Component Analysis (DVCA) approach. Experimental results are provided for demonstration.
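The abstract's central claim relates VCA projections to KPCA with a polynomial kernel. As context, the following is a minimal sketch of standard polynomial KPCA in NumPy, the counterpart object in that comparison; it does not implement VCA or the proposed DVCA method, and the function name, degree, and component count are illustrative choices.

```python
import numpy as np

def polynomial_kpca(X, degree=2, n_components=2):
    """Kernel PCA with the polynomial kernel k(x, z) = (1 + <x, z>)^degree.

    Illustrative sketch only: the paper relates VCA's projections to the
    subspace spanned by these KPCA components, not to this exact code.
    """
    n = X.shape[0]
    K = (1.0 + X @ X.T) ** degree                 # polynomial kernel matrix
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one    # center in feature space
    w, V = np.linalg.eigh(Kc)                     # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]      # keep the top components
    w, V = w[idx], V[:, idx]
    alphas = V / np.sqrt(np.maximum(w, 1e-12))    # scale eigenvectors
    return Kc @ alphas                            # projections of the training points

# Toy usage on random data
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
Z = polynomial_kpca(X, degree=2, n_components=2)
print(Z.shape)  # (10, 2)
```

The centering step is what makes this PCA in the implicit feature space rather than an uncentered kernel eigendecomposition; the paper's equivalence results are stated with respect to projections of this kind.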

Published

2016-02-21

How to Cite

Hou, C., Nie, F., & Tao, D. (2016). Discriminative Vanishing Component Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10223

Section

Technical Papers: Machine Learning Methods