e-Prime: Advances in Electrical Engineering, Electronics and Energy (Jun 2024)

Comparative study of integer-order and fractional-order artificial neural networks: Application for mathematical function generation

  • Manisha Premkumar Joshi,
  • Savita Bhosale,
  • Vishwesh A. Vyawahare

Journal volume & issue: Vol. 8, p. 100601

Abstract

This paper investigates the impact of fractional derivatives on the activation functions of an artificial neural network (ANN). Based on the results and analysis, a three-layer backpropagation neural network model is proposed that employs fractional and integer derivatives in the activation function together with a fractional gradient descent backpropagation learning algorithm. Specifically, three perceptrons are proposed, distinguished by where the fractional derivative is applied: the fractional derivative activation function (FDAF) perceptron, the integer derivative activation function (IDAF) with fractional derivative learning algorithm (FDLA) perceptron, and the fractional derivative learning algorithm (FDLA) perceptron. The Riemann-Liouville (RL), Grunwald-Letnikov (GL), Caputo-Fabrizio (CF), Caputo (C), and Atangana-Baleanu (AB) fractional derivatives are employed. The impact of the fractional order (FO) of these derivatives on the testing mean square error (MSE) and the training time is investigated over the range 0.1 to 0.9. FO-based and integer-order (IO)-based perceptrons are compared using these two performance metrics. The training and testing simulation results show that Caputo-Fabrizio derivative-based perceptrons outperform those based on the other fractional derivatives, and that the FO-based perceptrons achieve the lowest MSE.
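The paper does not reproduce its implementation here, but as a rough illustration of the FDAF idea (an activation function replaced by its fractional derivative), the sketch below approximates the Grunwald-Letnikov fractional derivative of a sigmoid activation with the standard truncated GL series. The step size h, the truncation length, and all function names are illustrative choices of this summary, not details taken from the paper.

    import numpy as np

    def gl_weights(alpha, n_terms):
        """Coefficients c_k = (-1)**k * binom(alpha, k) of the truncated
        Grunwald-Letnikov series, built with the stable recurrence
        c_k = c_{k-1} * (k - 1 - alpha) / k."""
        c = np.empty(n_terms + 1)
        c[0] = 1.0
        for k in range(1, n_terms + 1):
            c[k] = c[k - 1] * (k - 1 - alpha) / k
        return c

    def gl_fractional_derivative(f, x, alpha, h=1e-3, n_terms=400):
        """Short-memory approximation of D^alpha f(x) via the truncated
        GL sum (1 / h**alpha) * sum_k c_k * f(x - k*h)."""
        c = gl_weights(alpha, n_terms)
        samples = f(x - h * np.arange(n_terms + 1))
        return (c * samples).sum() / h**alpha

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    if __name__ == "__main__":
        z = 0.5
        for alpha in (0.5, 0.9, 1.0):
            d = gl_fractional_derivative(sigmoid, z, alpha)
            print(f"alpha={alpha}: D^alpha sigmoid({z}) ~= {d:.6f}")
        # Sanity check: at alpha = 1 the GL sum collapses to a backward
        # difference, so it should match sigmoid'(z) = s * (1 - s).
        s = sigmoid(z)
        print(f"classical derivative:   {s * (1.0 - s):.6f}")

The same gl_fractional_derivative routine could stand in for the fractional gradient in an FDLA-style update (w <- w - lr * D^alpha L(w)), though the paper's actual learning rule, and its handling of the RL, CF, Caputo, and AB kernels, may differ from this GL-based sketch.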

Keywords