Journal of Optimization, Differential Equations and Their Applications (Nov 2020)

Stability of Neural Ordinary Differential Equations with Power Nonlinearities

  • Vasiliy Ye. Belozyorov,
  • Danylo V. Dantsev

DOI
https://doi.org/10.15421/142005
Journal volume & issue
Vol. 28, no. 2
pp. 21 – 46

Abstract


The article presents a study of solutions of a system of ODEs with a special nonlinear part that is a continuous analogue of an arbitrary recurrent neural network (a neural ODE). As the nonlinear part of the mentioned system of differential equations, we used sums of piecewise continuous functions in which each term is a power function. (These are the activation functions.) The use of power activation functions (PAF) in neural networks is a generalization of the well-known rectified linear unit (ReLU). At present, ReLUs are commonly used to increase the depth of trained neural networks. Therefore, the introduction of PAF into neural networks significantly expands the possibilities of ReLU. Note that the purpose of introducing power activation functions is that they allow one to obtain verifiable Lyapunov stability conditions for solutions of the system of differential equations simulating the corresponding dynamic processes. In turn, Lyapunov stability is one of the guarantees of the adequacy of the neural network model for the process under study. In addition, the global stability (or at least the boundedness) of solutions of the continuous analogue implies that the learning process of the corresponding neural network will not diverge for any training sample.
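The abstract does not spell out the system studied in the paper, but the idea can be illustrated with a minimal sketch. The sketch below assumes a common recurrent neural ODE form dx/dt = -x + W φ(x) + b, with the activation φ(x) = max(0, x)^p as one plausible instance of a power activation function; p = 1 recovers the ordinary ReLU. The names `paf` and `neural_ode_rhs` are hypothetical and not taken from the paper.

```python
# Sketch: a continuous-time analogue of a recurrent network with a
# power activation function (assumed forms; not the authors' exact model).
import numpy as np
from scipy.integrate import solve_ivp

def paf(x, p=1.5):
    """Power activation function: ReLU raised to the power p (assumed form).
    p = 1 gives the ordinary ReLU."""
    return np.maximum(x, 0.0) ** p

def neural_ode_rhs(t, x, W, b, p):
    """Right-hand side of the assumed neural ODE dx/dt = -x + W @ paf(x) + b."""
    return -x + W @ paf(x, p) + b

rng = np.random.default_rng(0)
n = 4
W = rng.normal(scale=0.3, size=(n, n))  # recurrent weight matrix
b = rng.normal(scale=0.1, size=n)       # bias vector
x0 = rng.normal(size=n)                 # initial state

sol = solve_ivp(neural_ode_rhs, (0.0, 20.0), x0, args=(W, b, 1.5))
print("state at t = 20:", sol.y[:, -1])
```

A bounded trajectory in such a simulation is consistent with the boundedness property the abstract connects to non-divergent training; the verifiable Lyapunov stability conditions themselves are developed in the paper.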

Keywords