TY  - JOUR
AU  - Tompkins, Anthony
AU  - Ramos, Fabio
PY  - 2018/04/29
Y2  - 2024/03/28
TI  - Fourier Feature Approximations for Periodic Kernels in Time-Series Modelling
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 32
IS  - 1
SE  - AAAI Technical Track: Machine Learning
DO  - 10.1609/aaai.v32i1.11696
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/11696
SP  -
AB  - Gaussian Processes (GPs) provide an extremely powerful mechanism to model a variety of problems but incur an O(N³) complexity in the number of data samples. Common approximation methods rely on what are often termed inducing points but still typically incur an O(NM²) complexity in the data and corresponding inducing points. Using Random Fourier Feature (RFF) maps, we overcome this by transforming the problem into a Bayesian Linear Regression formulation upon which we apply a Bayesian Variational treatment that also allows learning the corresponding kernel hyperparameters, likelihood and noise parameters. In this paper we introduce an alternative method using Fourier series to obtain spectral representations of common kernels, in particular for periodic warpings, which surprisingly have a convergent, non-random form using special functions, requiring fewer spectral features to approximate their corresponding kernel to high accuracy. Using this, we can fuse the Random Fourier Feature spectral representations of common kernels with their periodic counterparts to show how they can more effectively and expressively learn patterns in time-series for both interpolation and extrapolation. This method combines robustness, scalability and, equally importantly, interpretability through a symbolic declarative grammar that is both functionally and humanly intuitive, a property that is crucial for explainable decision making. Using probabilistic programming and Variational Inference we are able to efficiently optimise over these rich functional representations. We show significantly improved Gram matrix approximation errors, and also demonstrate the method on several time-series problems, comparing against other commonly used approaches such as recurrent neural networks.
ER  -