Particle Filter Recurrent Neural Networks

Authors

  • Xiao Ma, National University of Singapore
  • Peter Karkus, National University of Singapore
  • David Hsu, National University of Singapore
  • Wee Sun Lee, National University of Singapore

DOI:

https://doi.org/10.1609/aaai.v34i04.5952

Abstract

Recurrent neural networks (RNNs) have been extraordinarily successful for prediction with sequential data. To tackle highly variable and multi-modal real-world data, we introduce Particle Filter Recurrent Neural Networks (PF-RNNs), a new RNN family that explicitly models uncertainty in its internal structure: while an RNN relies on a long, deterministic latent state vector, a PF-RNN maintains a latent state distribution, approximated as a set of particles. For effective learning, we provide a fully differentiable particle filter algorithm that updates the PF-RNN latent state distribution according to Bayes' rule. Experiments demonstrate that the proposed PF-RNNs outperform the corresponding standard gated RNNs on a synthetic robot localization dataset and 10 real-world sequence prediction datasets for text classification, stock price prediction, etc.
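The abstract's core idea, maintaining the latent state as a set of weighted particles that is updated with a Bayes-rule step, can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' differentiable implementation (the paper trains through this update with gradient descent); the transition and likelihood functions, shapes, and names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

K, D = 8, 4                               # number of particles, latent dimension
particles = rng.normal(size=(K, D))       # latent state distribution as K particles
log_weights = np.full(K, -np.log(K))      # uniform initial weights, in log space

def transition(particles, x):
    """Stochastic transition: a stand-in for the PF-RNN's gated update."""
    noise = rng.normal(scale=0.1, size=particles.shape)
    return np.tanh(particles + x + noise)

def observation_loglik(particles, obs):
    """Illustrative Gaussian log-likelihood of obs under each particle."""
    return -0.5 * np.sum((particles[:, : obs.shape[0]] - obs) ** 2, axis=1)

# One filtering step: predict each particle forward, then reweight by the
# observation likelihood (the Bayes-rule update the abstract refers to).
x = rng.normal(size=D)
obs = rng.normal(size=2)
particles = transition(particles, x)
log_weights = log_weights + observation_loglik(particles, obs)
log_weights -= np.logaddexp.reduce(log_weights)   # renormalize weights

# A point prediction is the weighted mean of the particles.
belief = np.sum(np.exp(log_weights)[:, None] * particles, axis=0)
```

In the paper's setting the update is made fully differentiable so the whole filter can be trained end to end; the hard resampling step of a classical particle filter (omitted here) is the main obstacle that their algorithm addresses.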

Published

2020-04-03

How to Cite

Ma, X., Karkus, P., Hsu, D., & Lee, W. S. (2020). Particle Filter Recurrent Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5101-5108. https://doi.org/10.1609/aaai.v34i04.5952

Section

AAAI Technical Track: Machine Learning