Non-parametric Representation Learning with Kernels

Authors

  • Pascal Esser, Technical University of Munich
  • Maximilian Fleissner, Technical University of Munich
  • Debarghya Ghoshdastidar, Technical University of Munich

DOI:

https://doi.org/10.1609/aaai.v38i11.29077

Keywords:

ML: Unsupervised & Self-Supervised Learning, ML: Kernel Methods, ML: Representation Learning

Abstract

Unsupervised and self-supervised representation learning has become popular in recent years for learning useful features from unlabelled data. Representation learning has mostly been developed in the neural network literature, and other models for representation learning remain surprisingly unexplored. In this work, we introduce and analyze several kernel-based representation learning approaches: first, two kernel Self-Supervised Learning (SSL) models using contrastive loss functions, and second, a Kernel Autoencoder (AE) model based on the idea of embedding and reconstructing data. We argue that the classical representer theorems for supervised kernel machines are not always applicable to (self-supervised) representation learning, and present new representer theorems, which show that the representations learned by our kernel models can be expressed in terms of kernel matrices. We further derive generalisation error bounds for representation learning with kernel SSL and AE, and empirically evaluate the performance of these methods both in small data regimes and in comparison with neural network based models.
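The abstract's central point, that representations learned by kernel models can be expressed in terms of kernel matrices, can be illustrated with a small sketch. The following is a hypothetical toy example (not the authors' actual models): a kernel "autoencoder" whose encoder is restricted, in the spirit of a representer theorem, to the span of kernel functions at the training points, so that the learned embedding takes the form f(x) = Σᵢ αᵢ k(xᵢ, x), i.e. Z = KΑ for the Gram matrix K. The kernel choice, dimensions, learning rate, and the linear decoder are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))            # unlabelled data: n=50 points in 5 dims
n, p, d = X.shape[0], X.shape[1], 2     # embed into d=2 dimensions

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K = rbf_kernel(X, X)                    # n x n Gram matrix

# Representer-theorem-style parameterisation: representations Z = K @ Alpha,
# i.e. each embedding coordinate lies in the span of {k(x_i, .)}.
Alpha = rng.normal(scale=0.1, size=(n, d))
W = rng.normal(scale=0.1, size=(d, p))  # illustrative linear decoder

loss0 = float(((K @ Alpha @ W - X) ** 2).mean())   # loss before training

lr = 1e-3
for _ in range(500):
    Z = K @ Alpha                       # n x d learned representations
    R = Z @ W - X                       # reconstruction residual
    # Gradients of the squared reconstruction loss (constant factor absorbed in lr)
    grad_W = Z.T @ R
    grad_Alpha = K.T @ (R @ W.T)
    W -= lr * grad_W
    Alpha -= lr * grad_Alpha

loss = float(((K @ Alpha @ W - X) ** 2).mean())    # loss after training
print(loss0, loss)
```

The point of the sketch is purely structural: the optimisation variable for the encoder is the coefficient matrix Alpha, so the learned representation is a function of the kernel matrix alone, which is the form the paper's representer theorems establish for its kernel SSL and AE models.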

Published

2024-03-24

How to Cite

Esser, P., Fleissner, M., & Ghoshdastidar, D. (2024). Non-parametric Representation Learning with Kernels. Proceedings of the AAAI Conference on Artificial Intelligence, 38(11), 11910-11918. https://doi.org/10.1609/aaai.v38i11.29077

Section

AAAI Technical Track on Machine Learning II