IEEE Access (Jan 2019)

A Dynamic Rectified Linear Activation Units

  • Xiaobin Hu,
  • Peifeng Niu,
  • Jianmei Wang,
  • Xinxin Zhang

DOI: https://doi.org/10.1109/ACCESS.2019.2959036
Journal volume & issue: Vol. 7, pp. 180409–180416

Abstract


Deep neural network regression models produce substantial gains in big-data prediction systems. Multilayer perceptron (MLP) neural networks have richer representational properties than single-layer feedforward neural networks, and making networks deeper, and thereby more capable, is one of the main research directions. However, the vanishing gradient is the primary problem that restricts this research, and choosing an appropriate activation function is one of the effective ways to address it. A bold idea about activation functions emerged: if the activation function differs between two adjacent training epochs, the probability that the gradient takes the same value is small. We propose a novel activation function whose shape changes dynamically during training. Our experimental results show that this activation function with “dynamic” characteristics can effectively avoid the vanishing gradient and allows multilayer perceptron networks to be made deeper.
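
The abstract does not give the paper's exact formulation, but the core idea can be sketched. Below is a minimal, hypothetical PyTorch illustration: the names `DynamicReLU`, `update_shape`, and `slope_range` are invented for this sketch, not the authors' API, and the paper's actual activation may change shape by a different rule.

```python
import torch
import torch.nn as nn

class DynamicReLU(nn.Module):
    # Hypothetical sketch: a leaky-ReLU-like unit whose negative-region
    # slope is re-drawn once per epoch, so the activation's shape (and
    # hence its gradient) differs between adjacent training epochs.
    def __init__(self, slope_range=(0.01, 0.3)):
        super().__init__()
        self.slope_range = slope_range
        self.slope = slope_range[0]  # current negative-region slope

    def update_shape(self):
        # Call once per epoch to re-sample the activation's shape.
        lo, hi = self.slope_range
        self.slope = lo + (hi - lo) * torch.rand(1).item()

    def forward(self, x):
        # Identity for positive inputs, current dynamic slope otherwise.
        return torch.where(x > 0, x, self.slope * x)

# Usage: refresh the shape at the start of every epoch.
act = DynamicReLU()
model = nn.Sequential(nn.Linear(16, 32), act, nn.Linear(32, 1))
for epoch in range(5):
    act.update_shape()
    # ... run the usual forward/backward training passes here ...
```

Because the slope is re-sampled each epoch, the gradient through the negative region differs between adjacent epochs, which matches the intuition the abstract gives for avoiding a vanishing gradient.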

Keywords