Recurrent Attentional Topic Model

Authors

  • Shuangyin Li Hong Kong University of Science and Technology
  • Yu Zhang Hong Kong University of Science and Technology
  • Rong Pan Sun Yat-sen University
  • Mingzhi Mao Sun Yat-sen University
  • Yang Yang iPIN, Shenzhen

DOI:

https://doi.org/10.1609/aaai.v31i1.10972

Keywords:

Recurrent, Attention, Topic Model

Abstract

It is natural to treat a document as a sequence of sentences. Within a document, the topic distribution of a sentence depends both on its own content and on the topics of the preceding sentences, which typically influence it with different weights. Most existing Bayesian document models do not take these points into consideration. To fill this gap, we propose a Recurrent Attentional Topic Model (RATM) for document embedding. The RATM not only takes advantage of the sequential order among sentences but also uses an attention mechanism to model the relations among successive sentences. Within RATM, we propose a Recurrent Attentional Bayesian Process (RABP) to handle the sequences. Based on the RABP, RATM fully utilizes the sequential information of the sentences in a document. Experiments on two corpora show that our model outperforms state-of-the-art methods on document modeling and classification.
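The core idea described above, that a sentence's topic prior is an attention-weighted mixture of the preceding sentences' topic distributions, can be illustrated with a minimal sketch. This is not the paper's RABP; the relevance scores, the softmax weighting, and all function names here are illustrative assumptions:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of relevance scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attended_prior(prev_topic_dists, relevance_scores):
    """Combine preceding sentences' topic distributions into a prior
    for the current sentence, using attention weights derived from
    per-sentence relevance scores (an illustrative assumption)."""
    weights = softmax(relevance_scores)
    num_topics = len(prev_topic_dists[0])
    return [
        sum(w * dist[k] for w, dist in zip(weights, prev_topic_dists))
        for k in range(num_topics)
    ]

# Two preceding sentences over 3 topics; the second one is scored
# as more relevant, so it dominates the resulting prior.
prev = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
prior = attended_prior(prev, [0.0, 1.0])
```

The resulting `prior` remains a valid distribution (it sums to one), and topics from the more relevant preceding sentence receive greater mass.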

Published

2017-02-12

How to Cite

Li, S., Zhang, Y., Pan, R., Mao, M., & Yang, Y. (2017). Recurrent Attentional Topic Model. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10972