TY - JOUR
AU - Liu, Yuxiang
AU - Ge, Jidong
AU - Li, Chuanyi
AU - Gui, Jie
PY - 2021/05/18
Y2 - 2024/03/28
TI - Delving into Variance Transmission and Normalization: Shift of Average Gradient Makes the Network Collapse
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 35
IS - 3
SE - AAAI Technical Track on Computer Vision II
DO - 10.1609/aaai.v35i3.16320
UR - https://ojs.aaai.org/index.php/AAAI/article/view/16320
SP - 2216-2224
AB - Normalization operations are essential for state-of-the-art neural networks and enable us to train a network from scratch with a large learning rate (LR). We attempt to explain the real effect of Batch Normalization (BN) from the perspective of variance transmission by investigating the relationship between BN and Weights Normalization (WN). In this work, we demonstrate that the problem of the shift of the average gradient will amplify the variance of every convolutional (conv) layer. We propose Parametric Weights Standardization (PWS), a fast and robust to mini-batch size module used for conv filters, to solve the shift of the average gradient. PWS can provide the speed-up of BN. Besides, it has less computation and does not change the output of a conv layer. PWS enables the network to converge fast without normalizing the outputs. This result enhances the persuasiveness of the shift of the average gradient and explains why BN works from the perspective of variance transmission. The code and appendix will be made available on https://github.com/lyxzzz/PWSConv.
ER -