Deep Learning with S-Shaped Rectified Linear Activation Units

Authors

  • Xiaojie Jin, National University of Singapore
  • Chunyan Xu, Nanjing University of Science and Technology
  • Jiashi Feng, National University of Singapore
  • Yunchao Wei, National University of Singapore
  • Junjun Xiong, Beijing Samsung Telecom
  • Shuicheng Yan, National University of Singapore

DOI:

https://doi.org/10.1609/aaai.v30i1.10287

Abstract

Rectified linear activation units are important components of state-of-the-art deep convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation unit (SReLU) that can learn both convex and non-convex functions, imitating the multiple function forms suggested by two fundamental laws in psychophysics and neural sciences, namely the Weber-Fechner law and the Stevens law. Specifically, SReLU consists of three piecewise linear functions formulated by four learnable parameters, and it is learned jointly with the whole deep network through back propagation. During the training phase, to initialize SReLU in different layers, we propose a “freezing” method that degenerates SReLU into a predefined leaky rectified linear unit for the first several training epochs and then adaptively learns good initial values. SReLU can be used universally in existing deep networks with negligible additional parameters and computational cost. Experiments with two popular CNN architectures, Network in Network and GoogLeNet, on benchmarks of various scales, including CIFAR10, CIFAR100, MNIST and ImageNet, demonstrate that SReLU achieves remarkable improvements over other activation functions.
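
The abstract describes SReLU as three piecewise linear segments controlled by four learnable parameters, initialized to behave like a leaky rectified linear unit. The sketch below is a minimal NumPy illustration of one such piecewise form; the parameter names (t_l, a_l, t_r, a_r), their default values, and the “frozen” leaky-ReLU-style setting are illustrative assumptions rather than the paper's exact notation or initialization scheme.

    import numpy as np

    def srelu(x, t_r=1.0, a_r=1.0, t_l=0.0, a_l=0.01):
        """S-shaped rectified linear unit, applied elementwise.

        Three piecewise linear segments (parameter names are illustrative;
        in practice all four would be learned jointly with the network):
            f(x) = t_r + a_r * (x - t_r)   if x >= t_r
            f(x) = x                       if t_l < x < t_r
            f(x) = t_l + a_l * (x - t_l)   if x <= t_l
        """
        x = np.asarray(x, dtype=float)
        upper = t_r + a_r * (x - t_r)   # right segment: slope a_r above t_r
        lower = t_l + a_l * (x - t_l)   # left segment: slope a_l below t_l
        return np.where(x >= t_r, upper, np.where(x <= t_l, lower, x))

    # "Frozen" start: with a very large t_r and a small a_l, SReLU reduces
    # to a predefined leaky ReLU, matching the initialization idea above.
    x = np.linspace(-3.0, 3.0, 7)
    print(srelu(x, t_r=1e9, a_r=1.0, t_l=0.0, a_l=0.01))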

Published

2016-02-21

How to Cite

Jin, X., Xu, C., Feng, J., Wei, Y., Xiong, J., & Yan, S. (2016). Deep Learning with S-Shaped Rectified Linear Activation Units. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10287

Issue

Vol. 30 No. 1 (2016)

Section

Technical Papers: Machine Learning Methods