Zero Stability Well Predicts Performance of Convolutional Neural Networks

Authors

  • Liangming Chen — Chongqing Key Laboratory of Big Data and Intelligent Computing, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences; Chongqing School, University of Chinese Academy of Sciences; School of Information Science and Engineering, Lanzhou University
  • Long Jin — Chongqing Key Laboratory of Big Data and Intelligent Computing, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences; School of Information Science and Engineering, Lanzhou University
  • Mingsheng Shang — Chongqing Key Laboratory of Big Data and Intelligent Computing, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v36i6.20576

Keywords:

Machine Learning (ML), Computer Vision (CV)

Abstract

The question of what kind of convolutional neural network (CNN) structure performs well is fascinating. In this work, we take one more step toward the answer by connecting zero stability and model performance. Specifically, we find that if a discrete solver of an ordinary differential equation is zero stable, the CNN corresponding to that solver performs well. We first interpret zero stability in the context of deep learning and then investigate the performance of existing first- and second-order CNNs under different zero-stability conditions. Based on these preliminary observations, we provide a higher-order discretization to construct CNNs and propose a zero-stable network (ZeroSNet). To guarantee the zero stability of ZeroSNet, we first deduce a structure that meets the consistency conditions and then give a zero-stable region for a training-free parameter. By analyzing the roots of a characteristic equation, we theoretically obtain the optimal coefficients of the feature maps. Empirically, we present our results from three aspects: we provide extensive empirical evidence across different depths and datasets showing that the moduli of the characteristic equation's roots are key to the performance of CNNs that require historical features; our experiments show that ZeroSNet, which is based on a higher-order discretization, outperforms existing CNNs; and ZeroSNets show better robustness against noise on the input. The source code is available at https://github.com/logichen/ZeroSNet.
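The zero-stability criterion the abstract invokes is the classical root condition for linear multistep methods: every root of the characteristic polynomial ρ(z) must have modulus at most 1, and roots on the unit circle must be simple. A minimal sketch of this check (an illustration only, not the paper's code; `is_zero_stable` is a hypothetical helper):

```python
import numpy as np

def is_zero_stable(rho_coeffs, tol=1e-9):
    """Root condition for a linear multistep method.

    rho_coeffs: coefficients of the characteristic polynomial rho(z),
    highest degree first (np.roots convention). The method is zero
    stable iff all roots satisfy |z| <= 1 and any root with |z| = 1
    is simple (not repeated).
    """
    roots = np.roots(rho_coeffs)
    for i, r in enumerate(roots):
        if abs(r) > 1 + tol:
            return False  # root outside the unit disk
        if abs(abs(r) - 1) <= tol:
            # a root on the unit circle must not be repeated
            others = np.delete(roots, i)
            if np.any(np.abs(others - r) <= tol):
                return False
    return True

# Two-step Adams-Bashforth: rho(z) = z^2 - z, roots {0, 1} -> zero stable
print(is_zero_stable([1, -1, 0]))   # True
# rho(z) = z^2 - 3z + 2, roots {1, 2} -> not zero stable
print(is_zero_stable([1, -3, 2]))   # False
```

In the paper's setting, the coefficients of the characteristic polynomial correspond to the coefficients combining historical feature maps, so the moduli of these roots govern how perturbations propagate through depth.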

Published

2022-06-28

How to Cite

Chen, L., Jin, L., & Shang, M. (2022). Zero Stability Well Predicts Performance of Convolutional Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6268-6277. https://doi.org/10.1609/aaai.v36i6.20576

Section

AAAI Technical Track on Machine Learning I