IEEE Access (Jan 2020)

Differential Neural Networks (DNN)

  • Sergio Ledesma,
  • Dora-Luz Almanza-Ojeda,
  • Mario-Alberto Ibarra-Manzano,
  • Eduardo Cabal Yepez,
  • Juan Gabriel Avina-Cervantes,
  • Pascal Fallavollita

DOI
https://doi.org/10.1109/ACCESS.2020.3019307
Journal volume & issue
Vol. 8
pp. 156530 – 156538

Abstract

In this work, we propose an artificial neural network topology to estimate the derivative of a function. This topology is called a differential neural network because it allows the estimation of the derivative of any of the network outputs with respect to any of its inputs. The main advantage of a differential neural network is that it reuses the weights of a multilayer neural network; therefore, a differential neural network does not need to be trained. First, a multilayer neural network is trained to find the set of weights that minimizes an error function. Second, the weights of the trained network and its neuron activations are used to build a differential neural network. Consequently, a multilayer artificial neural network can produce a specific output and, simultaneously, estimate the derivative of any of its outputs with respect to any of its inputs. Several computer simulations were carried out to validate the performance of the proposed method. The simulation results showed that differential neural networks can estimate the derivative of a function with good accuracy. The method was developed for an artificial neural network with two layers; however, it can be extended to more than two layers. Similarly, the analysis in this study is presented for two common activation functions. Nonetheless, other activation functions can be used as long as the derivative of the activation function can be computed.
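The idea described in the abstract, computing the derivative of each output with respect to each input directly from the weights and activations of an already-trained two-layer network, can be sketched with the chain rule. The following is a minimal illustration, not the authors' implementation: it assumes a tanh hidden layer and random (untrained) weights standing in for a trained network, and checks the resulting Jacobian against finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer network; in the paper's setting these weights would come from
# training, but random weights suffice to illustrate the derivative formula.
n_in, n_hidden, n_out = 3, 5, 2
W1 = rng.standard_normal((n_hidden, n_in))
b1 = rng.standard_normal(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden))
b2 = rng.standard_normal(n_out)

def forward(x):
    """Return the network output and the hidden pre-activations."""
    z = W1 @ x + b1          # hidden pre-activations
    h = np.tanh(z)           # hidden activations
    y = W2 @ h + b2
    return y, z

def differential_network(x):
    """Jacobian dy/dx built from the same weights via the chain rule:
    dy_k/dx_i = sum_j W2[k, j] * tanh'(z_j) * W1[j, i].
    No extra training is needed: only the trained weights and the
    activation values at x are used."""
    _, z = forward(x)
    d_act = 1.0 - np.tanh(z) ** 2         # derivative of tanh
    return W2 @ (d_act[:, None] * W1)     # shape (n_out, n_in)

x = rng.standard_normal(n_in)
J = differential_network(x)

# Sanity check against central finite differences.
eps = 1e-6
J_fd = np.zeros_like(J)
for i in range(n_in):
    e = np.zeros(n_in)
    e[i] = eps
    J_fd[:, i] = (forward(x + e)[0] - forward(x - e)[0]) / (2 * eps)

print(np.max(np.abs(J - J_fd)))  # small: analytic and numeric Jacobians agree
```

With a different activation function, only the `d_act` line changes, which matches the abstract's remark that any activation with a computable derivative can be used.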

Keywords