Longitudinal Deep Kernel Gaussian Process Regression

Authors

  • Junjie Liang Pennsylvania State University
  • Yanting Wu Pennsylvania State University
  • Dongkuan Xu Pennsylvania State University
  • Vasant G Honavar Pennsylvania State University

Keywords

Kernel Methods, Representation Learning

Abstract

Gaussian processes offer an attractive framework for predictive modeling from longitudinal data, i.e., irregularly sampled, sparse observations from a set of individuals over time. However, such methods have two key shortcomings: (i) they rely on ad hoc heuristics or expensive trial and error to choose effective kernels, and (ii) they fail to handle multilevel correlation structure in the data. We introduce Longitudinal deep kernel Gaussian process regression (L-DKGPR) to overcome these limitations by fully automating the discovery of complex multilevel correlation structure from longitudinal data. Specifically, L-DKGPR eliminates the need for ad hoc heuristics or trial and error using a novel adaptation of deep kernel learning that combines the expressive power of deep neural networks with the flexibility of non-parametric kernel methods. L-DKGPR effectively learns the multilevel correlation with a novel additive kernel that simultaneously accommodates both time-varying and time-invariant effects. We derive an efficient algorithm to train L-DKGPR using latent space inducing points and variational inference. Results of extensive experiments on several benchmark data sets demonstrate that L-DKGPR significantly outperforms state-of-the-art longitudinal data analysis (LDA) methods.
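The abstract's core ingredients, a deep kernel (a neural feature map feeding a base kernel) plus an additive decomposition into time-varying and time-invariant components, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the MLP architecture, the individual-level kernel weight, and the exact-GP solve below are all placeholder assumptions (the paper instead trains with latent space inducing points and variational inference to scale).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP feature map (the deep part of the deep kernel);
# the paper's actual architecture and training procedure are not shown here.
W1, b1 = rng.normal(size=(2, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 4)), rng.normal(size=4)

def deep_features(X):
    """Embed (time, covariate) inputs with a tanh MLP."""
    return np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2)

def rbf(A, B, ls=1.0):
    """RBF kernel between two sets of feature vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def additive_kernel(Xa, ida, Xb, idb):
    # Time-varying effect: RBF over learned deep features.
    K_tv = rbf(deep_features(Xa), deep_features(Xb))
    # Time-invariant (individual-level) effect: covariance shared only
    # within an individual; the 0.5 weight is an arbitrary choice.
    K_ti = (ida[:, None] == idb[None, :]).astype(float)
    return K_tv + 0.5 * K_ti

# Toy longitudinal data: 3 individuals observed at irregular times.
ids = np.array([0, 0, 0, 1, 1, 2, 2, 2, 2])
t = rng.uniform(0, 5, size=ids.size)
x = rng.normal(size=ids.size)                  # one time-varying covariate
X = np.column_stack([t, x])
y = np.sin(t) + 0.3 * ids + 0.05 * rng.normal(size=ids.size)

# Exact GP regression posterior mean at the training points.
noise = 0.1
K = additive_kernel(X, ids, X, ids) + noise * np.eye(ids.size)
alpha = np.linalg.solve(K, y)
y_hat = additive_kernel(X, ids, X, ids) @ alpha
```

The additive structure means the block kernel on individual IDs captures a persistent per-individual offset, while the deep RBF component tracks dynamics over time, which is the multilevel correlation the abstract refers to.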

Published

2021-05-18

How to Cite

Liang, J., Wu, Y., Xu, D., & Honavar, V. G. (2021). Longitudinal Deep Kernel Gaussian Process Regression. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 8556-8564. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17038

Section

AAAI Technical Track on Machine Learning III