PAC-Bayes Generalisation Bounds for Dynamical Systems including Stable RNNs

Authors

  • Deividas Eringis Department of Electronic Systems, Aalborg University
  • John Leth Department of Electronic Systems, Aalborg University
  • Zheng-Hua Tan Department of Electronic Systems, Aalborg University
  • Rafael Wisniewski Department of Electronic Systems, Aalborg University
  • Mihály Petreczky Univ. Lille, CNRS, Centrale Lille, UMR 9189 CRIStAL

DOI:

https://doi.org/10.1609/aaai.v38i11.29076

Keywords:

ML: Learning Theory, ML: Bayesian Learning, ML: Deep Learning Theory, ML: Time-Series/Data Streams

Abstract

In this paper, we derive a PAC-Bayes bound on the generalisation gap in a supervised time-series setting for a special class of discrete-time non-linear dynamical systems. This class includes stable recurrent neural networks (RNNs), and the motivation for this work was its application to RNNs. To achieve these results, we impose stability constraints on the allowed models, where stability is understood in the sense of dynamical systems; for RNNs, these stability conditions can be expressed as conditions on the weights. We assume that the processes involved are essentially bounded and that the loss functions are Lipschitz. The proposed bound on the generalisation gap depends on the mixing coefficient of the data distribution and on the essential supremum of the data, and it converges to zero as the dataset size increases. In this paper, we 1) formalise the learning problem, 2) derive a PAC-Bayesian error bound for such systems, 3) discuss various consequences of this error bound, and 4) present an illustrative example, with a discussion of computing the proposed bound. Unlike other available bounds, the derived bound holds for non-i.i.d. data (time-series), and it does not grow with the number of steps of the RNN.
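The abstract notes that, for RNNs, the required stability conditions can be expressed as conditions on the weights. The paper's exact conditions are not reproduced on this page; the sketch below only illustrates one standard sufficient condition of this kind, namely that the recurrent weight matrix W is a contraction (spectral norm below 1) under a 1-Lipschitz activation such as tanh, in which case the effect of the initial state fades geometrically.

```python
import numpy as np

# Hedged sketch: a common sufficient stability condition for the RNN
# state map x_{t+1} = tanh(W x_t + U u_t) is ||W||_2 * L < 1, where
# L is the Lipschitz constant of the activation (L = 1 for tanh).
# The precise weight conditions used in the paper may differ.

rng = np.random.default_rng(0)
n = 5

# Scale a random recurrent matrix so its spectral norm is below 1.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
U = np.eye(n)

def step(x, u):
    """One step of the RNN state update."""
    return np.tanh(W @ x + U @ u)

# Two trajectories driven by the SAME input sequence but started from
# different initial states contract toward each other, illustrating
# stability in the dynamical-systems sense.
x, y = rng.standard_normal(n), rng.standard_normal(n)
gaps = []
for t in range(50):
    u = rng.standard_normal(n)
    x, y = step(x, u), step(y, u)
    gaps.append(np.linalg.norm(x - y))

print(f"gap after 1 step: {gaps[0]:.4f}, after 50 steps: {gaps[-1]:.2e}")
```

Since the gap between the two trajectories shrinks by a factor of at least ||W||_2 = 0.9 per step, the initial condition is forgotten exponentially fast; this kind of forgetting is what allows generalisation bounds not to grow with the number of RNN steps.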

Published

2024-03-24

How to Cite

Eringis, D., Leth, J., Tan, Z.-H., Wisniewski, R., & Petreczky, M. (2024). PAC-Bayes Generalisation Bounds for Dynamical Systems including Stable RNNs. Proceedings of the AAAI Conference on Artificial Intelligence, 38(11), 11901-11909. https://doi.org/10.1609/aaai.v38i11.29076

Section

AAAI Technical Track on Machine Learning II