On Power-Law Kernels, Corresponding Reproducing Kernel Hilbert Space and Applications

Authors

  • Debarghya Ghoshdastidar, Indian Institute of Science, Bangalore
  • Ambedkar Dukkipati, Indian Institute of Science, Bangalore

DOI:

https://doi.org/10.1609/aaai.v27i1.8555

Keywords:

Kernels, Tsallis distributions

Abstract

The role of kernels is central to machine learning. Motivated by the importance of power-law distributions in statistical modeling, in this paper we propose the notion of power-law kernels to investigate power laws in learning problems. We propose two power-law kernels by generalizing the Gaussian and Laplacian kernels. This generalization is based on distributions that arise from maximizing a generalized information measure known as nonextensive entropy, which is well studied in statistical mechanics. We prove that the proposed kernels are positive definite, and provide some insights regarding the corresponding Reproducing Kernel Hilbert Space (RKHS). We also study the practical significance of both kernels in classification and regression, and present some simulation results.
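The abstract does not spell out the kernel forms. As a minimal sketch, the code below assumes the construction replaces the ordinary exponential in the Gaussian and Laplacian kernels with the Tsallis q-exponential, which recovers the standard kernels as q → 1 and yields power-law tails for q > 1. The function names `q_exponential` and `power_law_kernel` and the parameters `q`, `sigma`, and `p` are illustrative choices, not notation from the paper, and the sketch does not reproduce the paper's positive-definiteness analysis.

import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}.
    Recovers the ordinary exponential in the limit q -> 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    # The clamp at zero matters only for q < 1, where the
    # q-exponential has compact support.
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def power_law_kernel(X, Y, q=1.5, sigma=1.0, p=2):
    """Hypothetical power-law kernel: the q-exponential applied to a
    distance, by analogy with the Gaussian (p = 2) and Laplacian
    (p = 1) kernels. For q > 1 the decay is polynomial, not
    exponential, giving heavy (power-law) tails."""
    diffs = X[:, None, :] - Y[None, :, :]  # pairwise row differences
    if p == 2:
        d = np.sum(diffs ** 2, axis=-1)    # squared Euclidean distance
    else:
        d = np.sum(np.abs(diffs), axis=-1) # L1 distance (Laplacian-style)
    return q_exponential(-d / sigma ** 2, q)

# Sanity check: as q -> 1 the kernel matrix approaches the Gaussian one.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K_q = power_law_kernel(X, X, q=1.0001)
K_gauss = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
print(np.allclose(K_q, K_gauss, atol=1e-3))  # True

Note that with q = 1.5 and p = 2 this sketch reduces to a rational-quadratic (Cauchy-like) form (1 + d/(2σ²))⁻², illustrating the polynomial tail behavior; the paper itself should be consulted for the exact kernel definitions and the range of q for which positive definiteness is proved.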

Published

2013-06-30

How to Cite

Ghoshdastidar, D., & Dukkipati, A. (2013). On Power-Law Kernels, Corresponding Reproducing Kernel Hilbert Space and Applications. Proceedings of the AAAI Conference on Artificial Intelligence, 27(1), 365-371. https://doi.org/10.1609/aaai.v27i1.8555