Eigenvalues Ratio for Kernel Selection of Kernel Methods


  • Yong Liu Tianjin University
  • Shizhong Liao Tianjin University




kernel selection, matrix spectra, error bound


The selection of the kernel function, which determines the mapping between the input space and the feature space, is of crucial importance to kernel methods. Existing kernel selection approaches commonly use measures of generalization error, which are usually difficult to estimate and have slow convergence rates. In this paper, we propose a novel measure of a tight generalization error bound, called the eigenvalues ratio (ER), for kernel selection. ER is the ratio between the sum of the main eigenvalues and that of the tail eigenvalues of the kernel matrix. Different from most existing measures, ER is defined on the kernel matrix, so it can be estimated easily from the available training data, which makes it usable for kernel selection. We establish tight ER-based generalization error bounds of order $O(\frac{1}{n})$ for several kernel-based methods under certain general conditions, whereas for most existing measures the convergence rate is at most $O(\frac{1}{\sqrt{n}})$. Finally, to guarantee good generalization performance, we propose a novel kernel selection criterion that minimizes the derived tight generalization error bounds. Theoretical analysis and experimental results demonstrate that our kernel selection criterion is a good choice for kernel selection.
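As a rough illustration of the quantity the abstract describes, the sketch below computes an eigenvalues ratio from a kernel matrix: the sum of the top-$t$ ("main") eigenvalues divided by the sum of the remaining ("tail") eigenvalues. The cutoff parameter `t`, the RBF bandwidth, and the function name are illustrative assumptions, not the paper's exact formulation; the paper defines the precise criterion and the bounds built on it.

```python
import numpy as np

def eigenvalues_ratio(K, t):
    """Illustrative ER: (sum of the t largest eigenvalues of K) /
    (sum of the remaining eigenvalues). K must be symmetric PSD;
    the cutoff t is a hypothetical parameter for this sketch."""
    # eigvalsh returns eigenvalues of a symmetric matrix in ascending order
    eigs = np.linalg.eigvalsh(K)[::-1]  # sort descending: main first
    main = eigs[:t].sum()
    tail = eigs[t:].sum()
    return main / tail

# Toy example: Gaussian (RBF) kernel matrix on random data,
# estimated directly from the "training sample" as the abstract notes.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)  # bandwidth chosen arbitrarily for the sketch
er = eigenvalues_ratio(K, t=5)
```

Because ER depends only on the empirical kernel matrix, comparing `er` across candidate kernels (e.g. RBF kernels with different bandwidths) requires no held-out estimate of generalization error, which is the practical point the abstract makes.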




How to Cite

Liu, Y., & Liao, S. (2015). Eigenvalues Ratio for Kernel Selection of Kernel Methods. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9554



Main Track: Novel Machine Learning Algorithms