Variational BOLT: Approximate Learning in Factorial Hidden Markov Models With Application to Energy Disaggregation

Authors

  • Henning Lange, Carnegie Mellon University
  • Mario Berges, Carnegie Mellon University

DOI

https://doi.org/10.1609/aaai.v32i1.11342

Keywords

energy disaggregation, factorial hidden markov models, variational inference, neural networks

Abstract

The learning problem for Factorial Hidden Markov Models with discrete and multivariate latent variables remains a challenge. Inference of the latent variables required for the E-step of Expectation-Maximization (EM) algorithms is usually computationally intractable. In this paper we propose a variational learning algorithm mimicking the Baum-Welch algorithm. By approximating the filtering distribution with a variational distribution parameterized by a recurrent neural network, the computational complexity of the learning problem as a function of the number of hidden states can be reduced from the quadratic time required by traditional algorithms such as Baum-Welch to quasilinear time, whilst making minimal independence assumptions. We evaluate the performance of the resulting algorithm, which we call Variational BOLT, in the context of unsupervised end-to-end energy disaggregation. We conduct experiments on the publicly available REDD dataset and show competitive results when compared with a supervised inference approach and state-of-the-art results in an unsupervised setting.
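The core idea in the abstract — replacing intractable exact filtering in a factorial HMM with a recurrent network that emits a factorized posterior over the hidden chains — can be sketched as follows. This is a minimal illustration, not the authors' architecture: the RNN weights (`Wx`, `Wh`, `Wo`), the single-layer tanh recurrence, and the per-chain Bernoulli factorization are all assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 5    # number of hidden Markov chains (e.g. appliances)
H = 16   # RNN hidden-state size
T = 50   # length of the observed sequence

# Hypothetical RNN parameters (randomly initialized for illustration)
Wx = rng.normal(scale=0.1, size=(H, 1))
Wh = rng.normal(scale=0.1, size=(H, H))
Wo = rng.normal(scale=0.1, size=(N, H))

def variational_filter(y):
    """Approximate filtering distribution q(x_t | y_1..y_t).

    The RNN reads the aggregate signal y and, at each step, outputs
    independent per-chain Bernoulli on/off probabilities.  The cost is
    linear in the number of chains N, whereas exact FHMM filtering over
    the 2^N joint states scales like O(2^{2N}) per step.
    """
    h = np.zeros(H)
    qs = []
    for y_t in y:
        h = np.tanh(Wx @ np.array([y_t]) + Wh @ h)   # recurrent update
        q_t = 1.0 / (1.0 + np.exp(-(Wo @ h)))        # sigmoid -> probabilities
        qs.append(q_t)
    return np.array(qs)                              # shape (T, N)

y = rng.normal(size=T)       # placeholder aggregate power signal
q = variational_filter(y)
assert q.shape == (T, N) and np.all((q > 0) & (q < 1))
```

In the paper's setting, such a `q` would serve as the variational surrogate for the E-step of a Baum-Welch-style procedure, with the RNN parameters trained jointly with the model.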


Published

2018-04-25

How to Cite

Lange, H., & Berges, M. (2018). Variational BOLT: Approximate Learning in Factorial Hidden Markov Models With Application to Energy Disaggregation. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11342

Section

Computational Sustainability and Artificial Intelligence