TY - JOUR
AU - Zhang, Yaquan
AU - Wu, Qi
AU - Peng, Nanbo
AU - Dai, Min
AU - Zhang, Jing
AU - Wang, Hu
PY - 2021/05/18
Y2 - 2024/03/29
TI - Memory-Gated Recurrent Networks
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 35
IS - 12
SE - AAAI Technical Track on Machine Learning V
DO - 10.1609/aaai.v35i12.17308
UR - https://ojs.aaai.org/index.php/AAAI/article/view/17308
SP - 10956
EP - 10963
AB - The essence of multivariate sequential learning is all about how to extract dependencies in data. These data sets, such as hourly medical records in intensive care units and multi-frequency phonetic time series, often exhibit not only strong serial dependencies in the individual components (the "marginal" memory) but also non-negligible memories in the cross-sectional dependencies (the "joint" memory). Because of the multivariate complexity in the evolution of the joint distribution that underlies the data-generating process, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), with gates explicitly regulating two distinct types of memories: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that our proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
ER -