ITM Web of Conferences (Jan 2018)
Backpropagation algorithm with fractional derivatives
Abstract
The paper presents a neural network model with a novel backpropagation rule that uses a fractional-order derivative mechanism. Using the Grünwald-Letnikov definition of the discrete approximation of the fractional derivative, the author proposes smooth modeling of the transition functions of a single neuron. On this basis, a modified backpropagation algorithm is proposed that uses the fractional derivative mechanism both for modeling the dynamics of individual neurons and for minimizing the error function. The signal flow through the neural network and the mechanism for smoothly controlling the shape of the activation functions of individual neurons are described. A model for minimizing the error function is presented that accounts for possible changes in the characteristics of individual neurons. Example learning runs for the proposed network model are shown, demonstrating convergence of the learning process for different shapes of the transition function. The proposed algorithm allows the learning process to be conducted with smooth modification of the shape of the transition function, without modifying the software implementation of the designed neural network. The proposed network model is a new tool that can be used in signal classification tasks.