AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks

Authors

  • Garrett Bingham, The University of Texas at Austin; Cognizant AI Labs
  • Risto Miikkulainen, The University of Texas at Austin; Cognizant AI Labs

DOI:

https://doi.org/10.1609/aaai.v37i6.25836

Keywords:

ML: Deep Neural Network Algorithms, ML: Applications, ML: Auto ML and Hyperparameter Tuning, ML: Deep Learning Theory, ML: Evolutionary Learning, ML: Optimization

Abstract

Neural networks require careful weight initialization to prevent signals from exploding or vanishing. Existing initialization schemes solve this problem in specific cases by assuming that the network has a certain activation function or topology. It is difficult to derive such weight initialization strategies, and modern architectures therefore often use these same initialization schemes even though their assumptions do not hold. This paper introduces AutoInit, a weight initialization algorithm that automatically adapts to different neural network architectures. By analytically tracking the mean and variance of signals as they propagate through the network, AutoInit appropriately scales the weights at each layer to avoid exploding or vanishing signals. Experiments demonstrate that AutoInit improves the performance of convolutional, residual, and transformer networks across a range of activation function, dropout, weight decay, learning rate, and normalizer settings, and does so more reliably than data-dependent initialization methods. This flexibility allows AutoInit to initialize models for everything from small tabular tasks to large datasets such as ImageNet. Such generality turns out to be particularly useful in neural architecture search and in activation function discovery. In these settings, AutoInit initializes each candidate appropriately, making performance evaluations more accurate. AutoInit thus serves as an automatic configuration tool that makes the design of new neural network architectures more robust. The AutoInit package provides a wrapper around TensorFlow models and is available at https://github.com/cognizant-ai-labs/autoinit.
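As a rough illustration of the idea described in the abstract (not the authors' released implementation, whose API is not shown here), the sketch below analytically tracks the mean and variance of activations through a plain Dense-plus-ReLU stack and scales each layer's Gaussian weight initializer so that pre-activation variance stays near one. The layer sizes and helper names are hypothetical.

import math
import tensorflow as tf

def dense_weight_std(fan_in, in_mean, in_var):
    """Weight std that makes the pre-activation variance ~1 for zero-mean Gaussian weights.

    With W ~ N(0, sigma^2) and independent inputs, Var(Wx) = fan_in * sigma^2 * E[x^2],
    where E[x^2] = in_var + in_mean^2.
    """
    second_moment = in_var + in_mean ** 2
    return math.sqrt(1.0 / (fan_in * second_moment))

def relu_moments(var):
    """Analytic mean/variance of ReLU applied to a zero-mean Gaussian pre-activation.

    For z ~ N(0, v): E[relu(z)] = sqrt(v / (2*pi)), Var[relu(z)] = v/2 - v/(2*pi).
    The zero-mean assumption holds here because weights are zero-mean and biases are zero.
    """
    return math.sqrt(var / (2 * math.pi)), var * (0.5 - 1.0 / (2 * math.pi))

def build_signal_preserving_mlp(layer_sizes, input_dim):
    mean, var = 0.0, 1.0          # assume standardized network inputs
    fan_in = input_dim
    layers = []
    for units in layer_sizes:
        std = dense_weight_std(fan_in, mean, var)
        layers.append(tf.keras.layers.Dense(
            units,
            activation="relu",
            kernel_initializer=tf.keras.initializers.RandomNormal(stddev=std),
            bias_initializer="zeros",
        ))
        # Propagate the moments analytically: pre-activation variance is ~1 by
        # construction, then apply the ReLU moment map before the next layer.
        mean, var = relu_moments(1.0)
        fan_in = units
    return tf.keras.Sequential([tf.keras.Input(shape=(input_dim,))] + layers)

model = build_signal_preserving_mlp([256, 256, 128], input_dim=64)

Per the abstract, the released autoinit package instead wraps an existing TensorFlow model, so in practice one applies it to an already-built architecture rather than hand-deriving layer moments as above; see the linked repository for the actual interface.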

Published

2023-06-26

How to Cite

Bingham, G., & Miikkulainen, R. (2023). AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 6823-6833. https://doi.org/10.1609/aaai.v37i6.25836

Issue

Vol. 37 No. 6 (2023)

Section

AAAI Technical Track on Machine Learning I