Machines of Finite Depth: Towards a Formalization of Neural Networks

Authors

  • Pietro Vertechi, Independent Researcher
  • Mattia G. Bergomi, Independent Researcher

DOI:

https://doi.org/10.1609/aaai.v37i8.26199

Keywords:

ML: Deep Learning Theory, ML: Optimization, ML: Deep Neural Architectures, ML: Deep Neural Network Algorithms, ML: Learning Theory, ML: Auto ML and Hyperparameter Tuning, ML: Transparent, Interpretable, Explainable ML

Abstract

We provide a unifying framework where artificial neural networks and their architectures can be formally described as particular cases of a general mathematical construction---machines of finite depth. Unlike neural networks, machines have a precise definition, from which several properties follow naturally. Machines of finite depth are modular (they can be combined), efficiently computable, and differentiable. The backward pass of a machine is again a machine and can be computed without overhead using the same procedure as the forward pass. We prove this statement theoretically and practically via a unified implementation that generalizes several classical architectures---dense, convolutional, and recurrent neural networks with a rich shortcut structure---and their respective backpropagation rules.
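The abstract's central claim is that machines compose (the composition of machines is again a machine) and that the backward pass is itself a machine computable by the same procedure as the forward pass. The paper's actual formalism is not reproduced here, but the idea can be illustrated with a minimal toy sketch; all names (`Dense`, `Chain`) and the interface are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Hypothetical sketch, not the paper's construction: each "machine" exposes a
# forward map and a backward map. The backward map is again a linear machine
# (a vector-Jacobian product), and composition of machines is a machine.

class Dense:
    """Affine machine: x -> W x + b."""
    def __init__(self, w, b):
        self.w, self.b = w, b

    def forward(self, x):
        return self.w @ x + self.b

    def backward(self, x, grad_out):
        # The backward pass is itself a linear map applied by the same
        # mechanism as the forward pass: y -> W^T y.
        grad_w = np.outer(grad_out, x)   # gradient w.r.t. weights
        grad_b = grad_out                # gradient w.r.t. bias
        grad_x = self.w.T @ grad_out     # gradient propagated to the input
        return grad_x, (grad_w, grad_b)

class Chain:
    """Sequential composition of machines; the composite is again a machine."""
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        # Store intermediate activations so the backward machine can reuse them.
        acts = [x]
        for layer in self.layers:
            acts.append(layer.forward(acts[-1]))
        return acts

    def backward(self, acts, grad_out):
        # Traverse the layers in reverse, chaining vector-Jacobian products.
        grads = []
        for layer, x in zip(reversed(self.layers), reversed(acts[:-1])):
            grad_out, gparams = layer.backward(x, grad_out)
            grads.append(gparams)
        return grad_out, grads[::-1]

rng = np.random.default_rng(0)
net = Chain(Dense(rng.normal(size=(3, 4)), np.zeros(3)),
            Dense(rng.normal(size=(2, 3)), np.zeros(2)))
acts = net.forward(rng.normal(size=4))
grad_in, grads = net.backward(acts, np.ones(2))
```

In this toy version the symmetry the abstract alludes to is visible: `backward` traverses the same composite structure as `forward`, just with the transposed linear maps, so no separate machinery is needed for differentiation.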

Published

2023-06-26

How to Cite

Vertechi, P., & Bergomi, M. G. (2023). Machines of Finite Depth: Towards a Formalization of Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 10061-10068. https://doi.org/10.1609/aaai.v37i8.26199

Section

AAAI Technical Track on Machine Learning III