Memory-Gated Recurrent Networks

Authors

  • Yaquan Zhang (National University of Singapore, Department of Mathematics and Risk Management Institute)
  • Qi Wu (City University of Hong Kong)
  • Nanbo Peng (JD Digits)
  • Min Dai (National University of Singapore, Department of Mathematics, Risk Management Institute, and Chongqing & Suzhou Research Institutes)
  • Jing Zhang (City University of Hong Kong)
  • Hu Wang (JD Digits)

DOI:

https://doi.org/10.1609/aaai.v35i12.17308

Keywords:

Time-Series/Data Streams

Abstract

The essence of multivariate sequential learning is extracting the dependencies in data. Such data sets, for example hourly medical records in intensive care units and multi-frequency phonetic time series, often exhibit not only strong serial dependencies within individual components (the "marginal" memory) but also non-negligible memories in the cross-sectional dependencies (the "joint" memory). Because the joint distribution underlying the data-generating process evolves in a complex, multivariate way, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), whose gates explicitly regulate the two distinct types of memory: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that the proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
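
The two-memory idea described in the abstract lends itself to a short sketch. Below is a minimal, hypothetical PyTorch illustration: one recurrent cell per input component tracks marginal memory, while a shared cell over the full cross-section tracks joint memory. The class name MarginalJointRNN, the choice of GRU cells, the state sizes, and the final concatenation are all assumptions made for illustration only; the authors' actual mGRN gating is specified in the paper itself.

```python
# Hypothetical sketch of separate "marginal" and "joint" memories,
# NOT the published mGRN implementation.
import torch
import torch.nn as nn


class MarginalJointRNN(nn.Module):
    def __init__(self, n_components: int, marginal_size: int, joint_size: int):
        super().__init__()
        # One GRU cell per univariate component: captures serial
        # dependencies within that component (marginal memory).
        self.marginal_cells = nn.ModuleList(
            [nn.GRUCell(1, marginal_size) for _ in range(n_components)]
        )
        # One GRU cell over the full cross-section: captures memory in the
        # cross-sectional dependencies (joint memory).
        self.joint_cell = nn.GRUCell(n_components, joint_size)
        self.n_components = n_components
        self.marginal_size = marginal_size
        self.joint_size = joint_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, time, n_components).
        batch, time, _ = x.shape
        h_marginal = [
            x.new_zeros(batch, self.marginal_size)
            for _ in range(self.n_components)
        ]
        h_joint = x.new_zeros(batch, self.joint_size)
        for t in range(time):
            # Each marginal cell sees only its own component's value.
            for i, cell in enumerate(self.marginal_cells):
                h_marginal[i] = cell(x[:, t, i : i + 1], h_marginal[i])
            # The joint cell sees the whole cross-section at time t.
            h_joint = self.joint_cell(x[:, t, :], h_joint)
        # Concatenate marginal and joint states as the sequence summary.
        return torch.cat(h_marginal + [h_joint], dim=-1)


if __name__ == "__main__":
    model = MarginalJointRNN(n_components=3, marginal_size=8, joint_size=16)
    out = model(torch.randn(4, 20, 3))  # batch of 4, 20 time steps
    print(out.shape)  # torch.Size([4, 40]) -> 3 * 8 + 16
```

Keeping the two state groups separate means each gate only ever mixes information of one kind, which is the property the abstract attributes to mGRN's explicit gating.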

Published

2021-05-18

How to Cite

Zhang, Y., Wu, Q., Peng, N., Dai, M., Zhang, J., & Wang, H. (2021). Memory-Gated Recurrent Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10956-10963. https://doi.org/10.1609/aaai.v35i12.17308

Issue

Vol. 35 No. 12 (2021)

Section

AAAI Technical Track on Machine Learning V