Continuous-Time Attention for Sequential Learning

Authors

  • Jen-Tzung Chien National Chiao Tung University
  • Yi-Hsiang Chen National Chiao Tung University

DOI:

https://doi.org/10.1609/aaai.v35i8.16875

Keywords:

Time-Series/Data Streams, Representation Learning, Interpretability & Analysis of NLP Models, Text Classification & Sentiment Analysis

Abstract

The attention mechanism is crucial for sequential learning, where a wide range of applications have been successfully developed. This mechanism is basically trained to spotlight the regions of interest in the hidden states of sequence data. Most attention methods compute the attention score by relating a query to a sequence in which a discrete-time state trajectory is represented. Such discrete-time attention cannot directly attend to a continuous-time trajectory, which is represented via a neural differential equation (NDE) combined with a recurrent neural network. This paper presents a new continuous-time attention method for sequential learning that is tightly integrated with the NDE to construct an attentive continuous-time state machine. Continuous-time attention is performed at all times over the hidden states for different kinds of irregular time signals. The missing information in sequence data due to sampling loss, especially in the presence of long sequences, can be seamlessly compensated and attended to in representation learning. Experiments on irregular sequence samples from human activities, dialogue sentences and medical features show the merits of the proposed continuous-time attention for activity recognition, sentiment classification and mortality prediction, respectively.
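To make the idea concrete, the following is a minimal illustrative sketch of attending a query over a hidden-state trajectory produced by a neural ODE, assuming PyTorch and the torchdiffeq package. The class and parameter names (ODEFunc, ContinuousAttention, t_grid) are hypothetical, and the sketch simply solves the trajectory on a dense time grid and applies softmax attention over it; the paper's attentive continuous-time state machine integrates attention with the NDE more tightly than shown here.

```python
# Illustrative sketch only: attention over an ODE-solved hidden trajectory.
# Assumes torchdiffeq (https://github.com/rtqichen/torchdiffeq) is installed.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class ODEFunc(nn.Module):
    """Parameterizes the dynamics dh/dt = f(h) of the latent state."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, t, h):
        return self.net(h)


class ContinuousAttention(nn.Module):
    """Attends a query over hidden states evaluated on a dense time grid."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.func = ODEFunc(hidden_dim)
        self.query_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h0, t_grid, q):
        # Solve the ODE from the initial state h0 to obtain the trajectory
        # h(t) at every grid point: shape (T, batch, hidden_dim).
        traj = odeint(self.func, h0, t_grid)
        # Score each state along the trajectory against the projected query.
        scores = torch.einsum('bd,tbd->tb', self.query_proj(q), traj)
        weights = torch.softmax(scores, dim=0)  # normalize over time
        # Attention-weighted context vector: shape (batch, hidden_dim).
        return torch.einsum('tb,tbd->bd', weights, traj)


# Usage: a batch of 4 sequences, 32-dim states, 50 grid points on [0, 1].
model = ContinuousAttention(hidden_dim=32)
h0 = torch.randn(4, 32)
q = torch.randn(4, 32)
context = model(h0, torch.linspace(0.0, 1.0, 50), q)
print(context.shape)  # torch.Size([4, 32])
```

Because the trajectory is defined at every point of the time grid rather than only at observation times, attention weights can be placed between irregular samples, which is how the missing information described above can be compensated in the learned representation.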

Published

2021-05-18

How to Cite

Chien, J.-T., & Chen, Y.-H. (2021). Continuous-Time Attention for Sequential Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 7116-7124. https://doi.org/10.1609/aaai.v35i8.16875

Section

AAAI Technical Track on Machine Learning I