KernelMatmul: Scaling Gaussian Processes to Large Time Series

Authors

  • Tilman Hoffbauer (Chair for AI Methodology, RWTH Aachen University, Germany)
  • Holger H. Hoos (Chair for AI Methodology, RWTH Aachen University, Germany; Leiden Institute of Advanced Computer Science, Leiden University, The Netherlands; University of British Columbia, Canada)
  • Jakob Bossek (Chair for Machine Learning and Optimisation, Paderborn University, Germany)

DOI:

https://doi.org/10.1609/aaai.v39i16.33893

Abstract

Time series forecasting requires reliable uncertainty estimates. Gaussian process regression provides a powerful framework for modelling such uncertainty in a probabilistic fashion. However, its application to large time series is challenging due to its cubic time complexity and quadratic memory requirement. In this work, we present KernelMatmul, a novel method that accelerates Gaussian process inference and thus facilitates scaling of Gaussian process regression to large, irregularly sampled and multi-output time series. Leveraging conjugate gradients in combination with a sparsity approximation, KernelMatmul achieves time and memory complexity linear in the number of samples. We thoroughly benchmark our new method against multiple baselines to demonstrate its benefits and limitations, both in efficiency and accuracy.
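The core idea the abstract describes, solving the GP linear system with conjugate gradients while exploiting sparsity so that each matrix-vector product costs linear rather than quadratic time, can be illustrated with a minimal sketch. This is not the paper's implementation (KernelMatmul is a dedicated GPU kernel); the function names, the banded-RBF truncation, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def sparse_rbf_matvec(t, v, lengthscale, cutoff):
    # Kernel matrix-vector product without forming K densely:
    # each point interacts only with neighbours within `cutoff`
    # (illustrative sparsity approximation; `t` must be sorted).
    out = np.zeros(len(t))
    for i in range(len(t)):
        lo = np.searchsorted(t, t[i] - cutoff)
        hi = np.searchsorted(t, t[i] + cutoff, side="right")
        k = np.exp(-0.5 * ((t[i] - t[lo:hi]) / lengthscale) ** 2)
        out[i] = k @ v[lo:hi]
    return out

def cg_solve(matvec, b, tol=1e-8, max_iter=1000):
    # Plain conjugate gradients: accesses the matrix only through
    # matrix-vector products, so a fast matvec makes the solve fast.
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 500))            # irregularly sampled inputs
y = np.sin(0.3 * t) + 0.1 * rng.standard_normal(len(t))
noise = 0.1

# Solve (K + noise * I) alpha = y, the central GP inference system,
# using only sparse matvecs -- linear time and memory per CG iteration.
mv = lambda v: sparse_rbf_matvec(t, v, lengthscale=2.0, cutoff=8.0) + noise * v
alpha = cg_solve(mv, y)
```

With a fixed interaction cutoff, each matvec touches O(n) kernel entries instead of n², which is where the linear scaling claimed in the abstract comes from; the added noise term keeps the system positive definite despite the truncation.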

Published

2025-04-11

How to Cite

Hoffbauer, T., Hoos, H. H., & Bossek, J. (2025). KernelMatmul: Scaling Gaussian Processes to Large Time Series. Proceedings of the AAAI Conference on Artificial Intelligence, 39(16), 17223–17230. https://doi.org/10.1609/aaai.v39i16.33893

Section

AAAI Technical Track on Machine Learning II