The Kernel Kalman Rule — Efficient Nonparametric Inference with Recursive Least Squares

Authors

  • Gregor Gebhardt, Technische Universität Darmstadt
  • Andras Kupcsik, National University of Singapore
  • Gerhard Neumann, University of Lincoln

DOI:

https://doi.org/10.1609/aaai.v31i1.11051

Keywords:

kernel methods, nonparametric inference, RKHS, filtering, model learning, probabilistic reasoning

Abstract

Nonparametric inference techniques provide promising tools for probabilistic reasoning in high-dimensional nonlinear systems. Most of these techniques embed distributions into reproducing kernel Hilbert spaces (RKHS) and rely on the kernel Bayes' rule (KBR) to manipulate the embeddings. However, the computational demands of the KBR scale poorly with the number of samples, and the KBR often suffers from numerical instabilities. In this paper, we present the kernel Kalman rule (KKR) as an alternative to the KBR. The derivation of the KKR is based on recursive least squares, inspired by the derivation of the Kalman innovation update. We apply the KKR to filtering tasks where we use RKHS embeddings to represent the belief state, resulting in the kernel Kalman filter (KKF). We show on a nonlinear state estimation task with high-dimensional observations that our approach provides significantly improved estimation accuracy at significantly reduced computational cost.
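
To make the summary above more concrete, the following is a minimal, illustrative Python sketch of a kernel-based filter that performs a Kalman-style innovation update on RKHS embedding weights. It is not the authors' exact KKR/KKF formulation: the names (KernelKalmanFilterSketch, rbf_kernel), the transition operator T, and all regularization and noise constants are assumptions introduced only for illustration; consult the paper for the actual equations.

    import numpy as np

    def rbf_kernel(A, B, bandwidth=1.0):
        # Gaussian (RBF) kernel matrix between row-wise sample sets A (n x d) and B (m x d).
        d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
        return np.exp(-d2 / (2.0 * bandwidth**2))

    class KernelKalmanFilterSketch:
        # Belief: a weight vector m over the training states (mean-embedding coefficients)
        # and a covariance matrix S in the same finite subspace.
        # T is an assumed, pre-learned linear transition operator on the weights
        # (e.g. obtained by regularized regression between consecutive embedded states).
        def __init__(self, X_train, Y_train, T, bandwidth=1.0, reg=1e-3, obs_noise=1e-2):
            self.X, self.Y = X_train, Y_train              # training states / observations
            self.bw, self.reg, self.R = bandwidth, reg, obs_noise
            self.Gy = rbf_kernel(Y_train, Y_train, bandwidth)  # observation Gram matrix
            self.T = T
            n = X_train.shape[0]
            self.m = np.full(n, 1.0 / n)                   # uniform prior over training states
            self.S = np.eye(n) / n

        def predict(self, process_noise=1e-3):
            # Propagate the belief with the linear transition operator.
            self.m = self.T @ self.m
            self.S = self.T @ self.S @ self.T.T + process_noise * np.eye(len(self.m))

        def update(self, y):
            # Kalman-style innovation update computed in the observation kernel space,
            # using ordinary least-squares algebra instead of the kernel Bayes' rule.
            ky = rbf_kernel(self.Y, np.atleast_2d(y), self.bw).ravel()  # embed observation
            G = self.Gy
            n = len(self.m)
            innovation_cov = G @ self.S @ G + self.R * G + self.reg * np.eye(n)
            gain = self.S @ G @ np.linalg.solve(innovation_cov, np.eye(n))
            self.m = self.m + gain @ (ky - G @ self.m)
            self.S = self.S - gain @ G @ self.S

        def state_estimate(self):
            # Crude preimage of the mean embedding: weighted average of training states.
            return self.X.T @ self.m

In this sketch the belief is a weight vector over the training samples rather than an explicit Gaussian, which is what lets the Kalman-style gain and covariance update be computed entirely with Gram matrices of the training data.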

Published

2017-02-12

How to Cite

Gebhardt, G., Kupcsik, A., & Neumann, G. (2017). The Kernel Kalman Rule — Efficient Nonparametric Inference with Recursive Least Squares. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11051

Issue

Vol. 31 No. 1 (2017)

Section

AAAI Technical Track: Reasoning under Uncertainty