Journal of Optimization, Differential Equations and Their Applications (Nov 2020)
Stability of Neural Ordinary Differential Equations with Power Nonlinearities
Abstract
The article presents a study of the solutions of an ODE system with a special nonlinear part, which is a continuous analogue of an arbitrary recurrent neural network (neural ODEs). As the nonlinear part of this system of differential equations, we used sums of piecewise continuous functions in which each term is a power function; these serve as the activation functions. The use of power activation functions (PAFs) in neural networks generalizes the well-known rectified linear units (ReLU). At present, ReLU is commonly used to increase the trainable depth of a neural network, so introducing PAFs into neural networks significantly extends the capabilities of ReLU. Note that the purpose of introducing power activation functions is that they allow one to obtain verifiable Lyapunov stability conditions for the solutions of the system of differential equations simulating the corresponding dynamic processes. In turn, Lyapunov stability is one of the guarantees of the adequacy of the neural network model for the process under study. In addition, the global stability (or at least the boundedness) of the solutions of the continuous analogue implies that the learning process of the corresponding neural network will not diverge for any training sample.
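To make the generalization of ReLU concrete, the following is a minimal illustrative sketch; the abstract does not state the exact functional form, so the notation below (coefficients c_k, exponents \alpha_k, decay rates a_i, weights w_{ij}, inputs I_i) is assumed rather than taken from the paper. A power activation function may be written as a sum of one-sided power terms, and a continuous (Hopfield-type) analogue of a recurrent network is then an ODE system driven by such activations:

\[
  \sigma(x) \;=\; \sum_{k=1}^{m} c_k \,\bigl(\max\{0,\,x\}\bigr)^{\alpha_k},
  \qquad c_k \in \mathbb{R},\ \alpha_k > 0,
\]
\[
  \dot{x}_i(t) \;=\; -a_i x_i(t) \;+\; \sum_{j=1}^{n} w_{ij}\,\sigma_j\bigl(x_j(t)\bigr) \;+\; I_i,
  \qquad i = 1,\dots,n.
\]

In this sketch ReLU is recovered as the single-term special case m = 1, c_1 = 1, \alpha_1 = 1, since \(\operatorname{ReLU}(x) = \max\{0, x\} = (\max\{0, x\})^{1}\).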
Keywords