Generalization Bounds for Inductive Matrix Completion in Low-Noise Settings

Authors

  • Antoine Ledent, Singapore Management University
  • Rodrigo Alves, Czech Technical University
  • Yunwen Lei, Hong Kong Baptist University
  • Yann Guermeur, CNRS
  • Marius Kloft, Technische Universität Kaiserslautern

DOI:

https://doi.org/10.1609/aaai.v37i7.26018

Keywords:

ML: Learning Theory, ML: Matrix & Tensor Methods, ML: Other Foundations of Machine Learning

Abstract

We study inductive matrix completion (matrix completion with side information) under an i.i.d. subgaussian noise assumption in a low-noise regime, with uniform sampling of the entries. We obtain for the first time generalization bounds with the following three properties: (1) they scale like the standard deviation of the noise and, in particular, approach zero in the exact recovery case; (2) even in the presence of noise, they converge to zero as the sample size approaches infinity; and (3) for a fixed dimension of the side information, they have only a logarithmic dependence on the size of the matrix. Unlike many works on approximate recovery, we present results both for bounded Lipschitz losses and for the absolute loss, with the latter relying on Talagrand-type inequalities. The proofs build a bridge between two approaches to the theoretical analysis of matrix completion, as they combine techniques from the exact recovery literature and the approximate recovery literature.
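The paper itself is not excerpted here, so the following is a minimal, hypothetical numpy sketch of the setting the abstract describes, not the authors' method. It assumes the standard inductive matrix completion model M = X Q Yᵀ with known side-information matrices X and Y, uses Gaussian noise as the subgaussian noise, and fits the core matrix Q by plain least squares on the uniformly sampled entries; all variable names (X, Y, Q_true, q_hat, sigma, n_obs) are illustrative choices, not from the paper.

```python
# Hypothetical simulation of the inductive matrix completion setting:
# ground truth M = X @ Q @ Y.T, uniform entry sampling, subgaussian
# (here Gaussian) noise, and a least-squares estimate of Q. This is a
# sketch of the problem setting, not the paper's algorithm or bounds.
import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 150          # matrix size
d1, d2 = 10, 8           # side-information dimensions (d1, d2 << n, m)
sigma = 0.01             # noise standard deviation (low-noise regime)
n_obs = 3000             # number of uniformly sampled entries

# Side information and low-dimensional core: ground truth M = X @ Q @ Y.T
X = rng.standard_normal((n, d1))
Y = rng.standard_normal((m, d2))
Q_true = rng.standard_normal((d1, d2))
M = X @ Q_true @ Y.T

# Uniform sampling of entries with i.i.d. Gaussian noise
rows = rng.integers(0, n, size=n_obs)
cols = rng.integers(0, m, size=n_obs)
obs = M[rows, cols] + sigma * rng.standard_normal(n_obs)

# Each observation is linear in Q: obs_k = <Q, X[i_k] outer Y[j_k]> + noise,
# so estimating Q is a (d1 * d2)-dimensional least-squares regression.
features = np.einsum('ka,kb->kab', X[rows], Y[cols]).reshape(n_obs, d1 * d2)
q_hat, *_ = np.linalg.lstsq(features, obs, rcond=None)
M_hat = X @ q_hat.reshape(d1, d2) @ Y.T

# Generalization error over all entries under the absolute loss. It scales
# with sigma and vanishes as sigma -> 0 (exact recovery), which is the
# behavior property (1) of the abstract's bounds refers to.
print("mean absolute error:", np.abs(M_hat - M).mean())
```

Rerunning the sketch with sigma = 0 drives the mean absolute error to numerical zero, while increasing sigma scales the error roughly proportionally, which illustrates why bounds that scale like the noise standard deviation are the natural target in this regime.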

Published

2023-06-26

How to Cite

Ledent, A., Alves, R., Lei, Y., Guermeur, Y., & Kloft, M. (2023). Generalization Bounds for Inductive Matrix Completion in Low-Noise Settings. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8447-8455. https://doi.org/10.1609/aaai.v37i7.26018

Section

AAAI Technical Track on Machine Learning II