IEEE Access (Jan 2018)
An Improved ResNet Based on the Adjustable Shortcut Connections
Abstract
ResNet can reach greater depths and higher performance, but there is no good explanation of how its identity shortcut connections alleviate the gradient fading (vanishing) problem. Moreover, adopting identity mapping for the parameters of every layer is not necessarily reasonable. In this paper, we first establish a simplified ResNet that behaves like the original ResNet in principle, and derive its back propagation. Second, based on the back propagation of the simplified ResNet, we indirectly explain how identity shortcut connections mitigate gradient fading in convolutional neural networks. Third, we propose an improved ResNet with adjustable shortcut connections, and design a convex k strategy for the improved ResNet according to how the parameters change in different regions of the network. Experimental results on the CIFAR-10 data set show that the improved ResNet reaches a test accuracy of 78.63%, which is 2.85% higher than that of ResNet; on the CIFAR-100 data set, it reaches 42.53%, which is 3.66% higher than that of ResNet. More importantly, the improved ResNet adds no computation compared with the classical ResNet.
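The abstract does not give the exact formulation of the adjustable shortcut; the sketch below is a minimal, hypothetical illustration in which the identity path of a residual block is scaled by an adjustable coefficient k, so that k = 1 recovers the classical identity shortcut of ResNet. The residual branch F is a toy linear-ReLU map standing in for the block's convolutions; all names here are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def residual_block(x, weights, k):
    """One simplified residual block with an adjustable shortcut.

    Hypothetical form: y = k * x + F(x).
    k = 1 recovers the classical identity shortcut of ResNet;
    F(x) is a toy linear-ReLU map standing in for the convolutions.
    """
    fx = np.maximum(0.0, weights @ x)  # F(x): stand-in residual branch
    return k * x + fx                  # k scales the shortcut (identity) path

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
w = rng.standard_normal((4, 4))

y_identity = residual_block(x, w, k=1.0)  # classical ResNet behaviour
y_damped = residual_block(x, w, k=0.5)    # adjustable shortcut, damped identity path
```

Because the residual branch F(x) is identical in both calls, the two outputs differ only by the scaling of the shortcut path, which is what makes k directly interpretable as the strength of the identity connection.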
Keywords