Memory-Gated Recurrent Networks
Abstract
The essence of multivariate sequential learning lies in how to extract dependencies in data. These data sets, such as hourly medical records in intensive care units and multi-frequency phonetic time series, often exhibit not only strong serial dependencies in the individual components (the "marginal" memory) but also non-negligible memories in the cross-sectional dependencies (the "joint" memory). Because of the multivariate complexity in the evolution of the joint distribution that underlies the data-generating process, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), with gates explicitly regulating two distinct types of memories: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that our proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
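The abstract's central idea, separately gated "marginal" memory per input component and a "joint" memory over all components, can be illustrated with a minimal sketch. This is not the paper's actual mGRN formulation (its equations are not reproduced here); the class name, weight shapes, and GRU-style gating below are illustrative assumptions only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MarginalJointCell:
    """Hypothetical sketch of a recurrent cell with two memory tracks:
    one GRU-style hidden state per input component ("marginal" memory)
    and one shared hidden state over all components ("joint" memory).
    The exact gating equations of mGRN differ; this is illustrative."""

    def __init__(self, n_components, marginal_dim, joint_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.K, self.dm, self.dj = n_components, marginal_dim, joint_dim
        # Per-component gate/candidate weights; each sees [x_k, h_k] only.
        self.Wm = rng.normal(0, 0.1, (n_components, 3 * marginal_dim, 1 + marginal_dim))
        # Joint weights; input is [all marginal states, h_joint].
        self.Wj = rng.normal(0, 0.1, (3 * joint_dim, n_components * marginal_dim + joint_dim))

    @staticmethod
    def _gru_step(W, inp, h):
        # Simplified GRU-style update: update gate z, reset gate r, candidate c.
        z_r_c = W @ inp
        d = h.shape[0]
        z = sigmoid(z_r_c[:d])
        r = sigmoid(z_r_c[d:2 * d])
        c = np.tanh(z_r_c[2 * d:] * r)  # reset applied post-projection (simplification)
        return (1 - z) * h + z * c

    def step(self, x, h_marg, h_joint):
        # Marginal track: component k updates from its own input x[k] alone,
        # so each hidden state can only encode that component's serial memory.
        new_marg = np.stack([
            self._gru_step(self.Wm[k], np.concatenate(([x[k]], h_marg[k])), h_marg[k])
            for k in range(self.K)
        ])
        # Joint track: a separately gated state driven by all marginal states,
        # capturing cross-sectional (joint) memory.
        joint_inp = np.concatenate((new_marg.ravel(), h_joint))
        new_joint = self._gru_step(self.Wj, joint_inp, h_joint)
        return new_marg, new_joint

# One step on a 3-variable observation.
cell = MarginalJointCell(n_components=3, marginal_dim=4, joint_dim=5)
h_m = np.zeros((3, 4))
h_j = np.zeros(5)
h_m, h_j = cell.step(np.array([0.5, -1.0, 2.0]), h_m, h_j)
print(h_m.shape, h_j.shape)  # (3, 4) (5,)
```

The separation mirrors the abstract's motivation: because the two memory types can decay at different rates, giving each its own gated state lets the network regulate them independently rather than entangling both in a single hidden vector.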
How to Cite
Zhang, Y., Wu, Q., Peng, N., Dai, M., Zhang, J., & Wang, H. (2021). Memory-Gated Recurrent Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10956-10963. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17308
AAAI Technical Track on Machine Learning V