Liquid Time-constant Networks

Authors

  • Ramin Hasani, Massachusetts Institute of Technology (MIT) and Technische Universität Wien (TU Wien)
  • Mathias Lechner, Institute of Science and Technology Austria (IST Austria)
  • Alexander Amini, Massachusetts Institute of Technology (MIT)
  • Daniela Rus, Massachusetts Institute of Technology (MIT)
  • Radu Grosu, Technische Universität Wien (TU Wien)

DOI:

https://doi.org/10.1609/aaai.v35i9.16936

Keywords:

(Deep) Neural Network Algorithms, Representation Learning, Time-Series/Data Streams

Abstract

We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics through implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time constants coupled to their hidden state, with outputs computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and achieve improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach, deriving bounds on their dynamics and quantifying their expressive power via the trajectory-length measure in a latent trajectory space. We then conduct a series of time-series prediction experiments to demonstrate the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.
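To make the dynamics described above concrete, the sketch below implements one step of the fused semi-implicit Euler solver the paper uses to integrate an LTC cell. This is an illustrative NumPy sketch under stated assumptions, not the authors' released code: the function names, parameter shapes, and the sigmoid form of the gate f are choices made here for the example.

```python
# Illustrative sketch (not the authors' code) of one solver step for a
# liquid time-constant (LTC) cell with hidden state x and input I:
#   dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
# The fused semi-implicit Euler step discretizes this as
#   x_{t+dt} = (x_t + dt * f * A) / (1 + dt * (1/tau + f))
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, I, dt, tau, W_rec, W_in, mu, A):
    """Advance the hidden state x(t) -> x(t + dt).

    x: hidden state (n,); I: input (m,); dt: solver step size;
    tau: per-neuron time constants (n,); A: bias vector (n,);
    W_rec (n, n), W_in (n, m), mu (n,): parameters of the gate f.
    """
    # Nonlinear interlinked gate f(x, I; theta); a sigmoid of an
    # affine map is assumed here as one simple choice.
    f = sigmoid(W_rec @ x + W_in @ I + mu)
    # Fused semi-implicit Euler update: the effective (liquid) time
    # constant 1/(1/tau + f) varies with the state and input.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy usage: unroll the cell over a random input sequence.
rng = np.random.default_rng(0)
n, m, dt = 8, 3, 0.1
x = np.zeros(n)
theta = dict(tau=np.ones(n),
             W_rec=0.1 * rng.standard_normal((n, n)),
             W_in=0.1 * rng.standard_normal((n, m)),
             mu=np.zeros(n), A=np.ones(n))
for _ in range(50):
    x = ltc_step(x, rng.standard_normal(m), dt, **theta)
```

Note how the update couples the time constant to the hidden state: because f depends on x and I, each neuron's effective decay rate changes over time, which is the "liquid" behavior the abstract refers to.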

Published

2021-05-18

How to Cite

Hasani, R., Lechner, M., Amini, A., Rus, D., & Grosu, R. (2021). Liquid Time-constant Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(9), 7657-7666. https://doi.org/10.1609/aaai.v35i9.16936
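For reference managers, the citation above corresponds to the following BibTeX entry (the entry key is illustrative):

```bibtex
@article{hasani2021liquid,
  title   = {Liquid Time-constant Networks},
  author  = {Hasani, Ramin and Lechner, Mathias and Amini, Alexander and Rus, Daniela and Grosu, Radu},
  journal = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume  = {35},
  number  = {9},
  pages   = {7657--7666},
  year    = {2021},
  doi     = {10.1609/aaai.v35i9.16936}
}
```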

Issue

Vol. 35 No. 9 (2021)

Section

AAAI Technical Track on Machine Learning II