Avoiding Kernel Fixed Points: Computing with ELU and GELU Infinite Networks

Authors

  • Russell Tsuchida (Commonwealth Scientific and Industrial Research Organisation; The University of Queensland)
  • Tim Pearce (University of Cambridge)
  • Chris van der Heide (The University of Queensland)
  • Fred Roosta (The University of Queensland; International Computer Science Institute)
  • Marcus Gallagher (The University of Queensland)

DOI:

https://doi.org/10.1609/aaai.v35i11.17197

Keywords:

Bayesian Learning, (Deep) Neural Network Learning Theory, Kernel Methods

Abstract

Analysing and computing with Gaussian processes arising from infinitely wide neural networks has recently seen a resurgence in popularity. Despite this, explicit covariance functions remain unknown for networks with many of the activation functions used in modern architectures. Furthermore, while the kernels of deep networks can be computed iteratively, theoretical understanding of these deep kernels is lacking, particularly with respect to fixed-point dynamics. Firstly, we derive the covariance functions of multi-layer perceptrons (MLPs) with exponential linear units (ELU) and Gaussian error linear units (GELU) and evaluate the performance of the limiting Gaussian processes on some benchmarks. Secondly, and more generally, we analyse the fixed-point dynamics of iterated kernels corresponding to a broad range of activation functions. We find that, unlike some previously studied neural network kernels, these new kernels exhibit non-trivial fixed-point dynamics which are mirrored in finite-width neural networks. The fixed-point behaviour present in some networks explains a mechanism for implicit regularisation in overparameterised deep models. Our results relate to both the static i.i.d.-parameter conjugate kernel and the dynamic neural tangent kernel constructions.
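To make the notion of iterated-kernel fixed points concrete, the following is a minimal sketch, not the paper's ELU/GELU derivation. It uses the well-known closed-form ReLU kernel (the degree-1 arc-cosine kernel of Cho and Saul, 2009) and iterates the induced map on normalised input correlations, illustrating the "trivial" fixed-point behaviour that the abstract contrasts with the ELU and GELU kernels derived in the paper. The function name and the example correlation value are illustrative choices, not from the source.

```python
import numpy as np

def relu_corr_map(rho):
    """One layer of the iterated kernel map for an infinitely wide ReLU MLP,
    i.e. the normalised degree-1 arc-cosine kernel (Cho & Saul, 2009),
    expressed on correlations rho in [-1, 1]."""
    theta = np.arccos(np.clip(rho, -1.0, 1.0))
    return (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

# Correlation between two distinct inputs at the first layer (illustrative value).
rho = 0.3
for layer in range(1, 31):
    rho = relu_corr_map(rho)
    print(f"depth {layer:2d}: rho = {rho:.6f}")
# The iterates approach the trivial fixed point rho = 1: with depth, all
# input pairs become perfectly correlated and the kernel loses information.
```

Under this map, rho = 1 is a fixed point (theta = 0 gives back 1), and iterating from any starting correlation drives rho towards it. Non-trivial fixed-point dynamics, as studied in the paper for ELU and GELU kernels, would instead settle at some rho strictly less than 1.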

Published

2021-05-18

How to Cite

Tsuchida, R., Pearce, T., van der Heide, C., Roosta, F., & Gallagher, M. (2021). Avoiding Kernel Fixed Points: Computing with ELU and GELU Infinite Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9967-9977. https://doi.org/10.1609/aaai.v35i11.17197

Section

AAAI Technical Track on Machine Learning IV