Learning the Kernel Matrix with Low-Rank Multiplicative Shaping

Authors

  • Tomer Levinboim University of Southern California
  • Fei Sha University of Southern California

DOI:

https://doi.org/10.1609/aaai.v26i1.8306

Keywords:

Multiple Kernel Learning, Low-Rank Matrix

Abstract

Selecting the optimal kernel is an important and difficult challenge in applying kernel methods to pattern recognition. To address this challenge, multiple kernel learning (MKL) aims to learn, from a combination of base kernel functions, a kernel that performs optimally on the task. In this paper, we propose a novel MKL-themed approach that combines base kernels which are multiplicatively shaped with low-rank positive semidefinite matrices. The proposed approach generalizes several popular MKL methods and thus provides more flexibility in modeling data. Computationally, we show how these low-rank matrices can be learned efficiently from data using convex quadratic programming. Empirical studies on several standard benchmark datasets for MKL show that the new approach often yields statistically significant improvements in prediction accuracy over very competitive single-kernel and other MKL methods.
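The abstract itself does not spell out the shaping operation, but a minimal sketch of one plausible reading is shown below: a base kernel matrix is shaped elementwise (Hadamard product) by a low-rank positive semidefinite matrix S = U Uᵀ. By the Schur product theorem, the Hadamard product of two PSD matrices is again PSD, so the shaped matrix remains a valid kernel. All variable names and the choice of an RBF base kernel are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 6, 3, 2  # n points, d features, shaping rank r (illustrative sizes)

X = rng.standard_normal((n, d))

# Base kernel: RBF (Gaussian) kernel matrix over the n points.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-0.5 * sq_dists)

# Low-rank PSD shaping matrix S = U U^T, rank at most r.
U = rng.standard_normal((n, r))
S = U @ U.T

# Multiplicative shaping: elementwise (Hadamard) product of K and S.
K_shaped = K * S

# Schur product theorem: K_shaped is PSD since K and S are both PSD.
eigs = np.linalg.eigvalsh(K_shaped)
assert eigs.min() > -1e-9  # numerically nonnegative spectrum
```

In an MKL setting, one low-rank shaping matrix per base kernel would then be learned from data (the paper does this via convex quadratic programming), rather than drawn at random as in this sketch.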

Published

2021-09-20

How to Cite

Levinboim, T., & Sha, F. (2021). Learning the Kernel Matrix with Low-Rank Multiplicative Shaping. Proceedings of the AAAI Conference on Artificial Intelligence, 26(1), 984-990. https://doi.org/10.1609/aaai.v26i1.8306

Section

AAAI Technical Track: Machine Learning