Instance Enhancement Batch Normalization: An Adaptive Regulator of Batch Noise

Authors

  • Senwei Liang, Purdue University
  • Zhongzhan Huang, Tsinghua University
  • Mingfu Liang, Northwestern University
  • Haizhao Yang, Purdue University

DOI:

https://doi.org/10.1609/aaai.v34i04.5917

Abstract

Batch Normalization (BN) (Ioffe and Szegedy 2015) normalizes the features of an input image via the statistics of a batch of images, and hence BN introduces noise into the gradient of the training loss. Previous work indicates that this noise is important for the optimization and generalization of deep neural networks, but too much noise harms network performance. In this paper, we offer a new point of view: the self-attention mechanism can help regulate the noise by enhancing instance-specific information, yielding a better regularization effect. We therefore propose an attention-based BN called Instance Enhancement Batch Normalization (IEBN), which recalibrates the information of each channel by a simple linear transformation. IEBN regulates batch noise well and stabilizes network training, improving generalization even in the presence of two kinds of noise attacks during training. Finally, IEBN outperforms BN with only a light parameter increment in image classification tasks across different network structures and benchmark datasets.
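The channel recalibration described in the abstract can be read as a per-channel gate computed from instance-specific statistics and applied multiplicatively to the BN output. Below is a minimal PyTorch-style sketch under that reading; the module name `IEBN`, the parameter names `gate_weight`/`gate_bias`, and their initialization are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal sketch: BatchNorm followed by an instance-specific,
# per-channel gate obtained from a simple linear transformation.
import torch
import torch.nn as nn

class IEBN(nn.Module):
    def __init__(self, num_channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_channels)
        # Two extra parameters per channel: a light parameter increment.
        # Initialization here is illustrative only.
        self.gate_weight = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.gate_bias = nn.Parameter(torch.ones(1, num_channels, 1, 1))

    def forward(self, x):
        # Instance-specific descriptor: global average pooling per channel.
        descriptor = x.mean(dim=(2, 3), keepdim=True)
        # Simple linear transformation of the descriptor, squashed to (0, 1).
        gate = torch.sigmoid(self.gate_weight * descriptor + self.gate_bias)
        # Recalibrate the batch-normalized features channel-wise.
        return self.bn(x) * gate
```

In this sketch the module is a drop-in replacement for `nn.BatchNorm2d`, so the instance-specific gate can be inserted wherever BN appears in a network without changing the surrounding architecture.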

Published

2020-04-03

How to Cite

Liang, S., Huang, Z., Liang, M., & Yang, H. (2020). Instance Enhancement Batch Normalization: An Adaptive Regulator of Batch Noise. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4819-4827. https://doi.org/10.1609/aaai.v34i04.5917

Section

AAAI Technical Track: Machine Learning