Tensor-Variate Restricted Boltzmann Machines

Authors

  • Tu Nguyen, Deakin University
  • Truyen Tran, Deakin University and Curtin University
  • Dinh Phung, Deakin University
  • Svetha Venkatesh, Deakin University

DOI:

https://doi.org/10.1609/aaai.v29i1.9553

Keywords:

tensor, rbm, restricted boltzmann machine, tvrbm, multiplicative interaction, eeg

Abstract

Restricted Boltzmann Machines (RBMs) are an important class of latent variable models for representing vector data. An under-explored area is multimode data, where each data point is a matrix or a tensor. Applying standard RBMs to such data would require vectorizing the matrices or tensors, resulting in unnecessarily high dimensionality and, at the same time, destroying the inherent higher-order interaction structures. This paper introduces Tensor-variate Restricted Boltzmann Machines (TvRBMs), which generalize RBMs to capture the multiplicative interaction between data modes and the latent variables. TvRBMs are highly compact in that the number of free parameters grows only linearly with the number of modes. We demonstrate the capacity of TvRBMs on three real-world applications: handwritten digit classification, face recognition, and EEG-based alcoholic diagnosis. The features learnt by the model are more discriminative than those of rival methods, resulting in better classification performance.
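The compactness claim in the abstract can be illustrated with a rough parameter count. The sketch below is a hypothetical illustration, not the authors' exact parameterization: it assumes a multiplicative, factored weight tensor (one factor matrix per data mode plus one for the hidden layer, with F latent factors), and compares it to a standard RBM on the flattened tensor. The function names and the choice of sizes are this sketch's own.

```python
from math import prod

def factored_param_count(dims, K, F):
    # Hypothetical multiplicative factoring: one F-column factor matrix
    # per data mode (F * D_k each), plus one for the K hidden units.
    # Total grows with the SUM of the mode sizes.
    return F * sum(dims) + F * K

def vectorized_param_count(dims, K):
    # Standard RBM on the vectorized tensor: a dense K x prod(dims)
    # weight matrix, growing with the PRODUCT of the mode sizes.
    return K * prod(dims)

dims = (64, 64, 3)   # e.g. a small colour image treated as a 3-mode tensor
K, F = 500, 100      # illustrative hidden-unit and factor counts
print(factored_param_count(dims, K, F))   # 63,100 parameters
print(vectorized_param_count(dims, K))    # 6,144,000 parameters
```

Adding a fourth mode of size D to the data adds only F * D parameters to the factored count, whereas the vectorized count is multiplied by D, which is the "linear versus multiplicative" contrast the abstract describes.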

Published

2015-02-21

How to Cite

Nguyen, T., Tran, T., Phung, D., & Venkatesh, S. (2015). Tensor-Variate Restricted Boltzmann Machines. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9553

Issue

Vol. 29 No. 1 (2015)
Section

Main Track: Novel Machine Learning Algorithms