Applied Sciences (Apr 2025)
Optimizing Activation Function for Parameter Reduction in CNNs on CIFAR-10 and CINIC-10
Abstract
This paper explores a simple CNN architecture for image classification. Since the introduction of the first CNN, LeNet-5, CNNs have become the dominant method for image analysis. A previously unexplored activation function is proposed that improves accuracy while reducing execution time; it also yields faster convergence of the loss function. Unlike well-known activation functions such as ReLU and Tanh, the proposed Exponential Partial Unit (EPU) does not overfit after 100 or more iterations. The CIFAR-10 dataset, a standard benchmark for this kind of research, is chosen for this study. This paper aims to present another view of CNNs, showing that effective networks can be trained with fewer parameters and reduced computational resources.
Keywords