IEEE Access (Jan 2022)
Activation Function Modulation in Generative Triangular Recurrent Neural Networks
Abstract
Autonomous generation of time series is challenging because the network must capture short-term features while tracking long-term time dependencies. This paper introduces modulation of the activation-function slopes of upper-lower triangular recurrent neural networks (ULTRNNs) for dynamic variation of memory through a secondary recurrent network with its own independent states. A zigzag propagation algorithm is proposed for weight updates that accounts for the dynamic interaction between the states of the ULTRNN and those of the secondary network. A novel training method is also proposed that distributes the eigenvalues of the closed-loop system around the unit circle in the complex z-plane, ensuring that the network behaves as a nonlinear oscillator whose output neither collapses nor saturates but continues to emulate the target. Examples encompassing the Lorenz series, Santa Fe laser data, kolam patterns, electrocardiogram (ECG) signals, stock pricing data, and smart grid data demonstrate that the proposed approach is highly effective for generative modeling of complex periodic, chaotic, and nonstationary time series. The qualitative and quantitative performance of the ULTRNN with the proposed activation-function modulation technique is comparable to that of state-of-the-art techniques, including feedforward networks and generative adversarial networks, but with far fewer trainable parameters and shorter computation times.
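The unit-circle eigenvalue condition mentioned above can be illustrated with a minimal sketch, not taken from the paper: a linear recurrent weight matrix built from 2x2 rotation blocks has all eigenvalues of magnitude one, so the linearized dynamics neither decay (collapse) nor grow (saturate). The function names and the block-diagonal construction here are illustrative assumptions, not the authors' training method.

```python
import numpy as np

def rotation_block(theta):
    # 2x2 rotation block; its eigenvalues are exp(+/- i*theta),
    # which lie exactly on the unit circle in the complex z-plane.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def unit_circle_recurrent_matrix(num_modes, seed=0):
    # Illustrative (not the paper's) construction: a block-diagonal
    # recurrent matrix whose spectrum sits on |z| = 1, so iterated
    # states oscillate rather than vanish or blow up.
    rng = np.random.default_rng(seed)
    thetas = rng.uniform(0.0, np.pi, num_modes)  # spread phases around the circle
    W = np.zeros((2 * num_modes, 2 * num_modes))
    for k, theta in enumerate(thetas):
        W[2 * k:2 * k + 2, 2 * k:2 * k + 2] = rotation_block(theta)
    return W

W = unit_circle_recurrent_matrix(4)
eigs = np.linalg.eigvals(W)
print(np.allclose(np.abs(eigs), 1.0))  # all eigenvalue magnitudes equal 1
```

In the nonlinear ULTRNN setting, the activation-function slopes modulate the effective local linearization, which is why the paper's training method targets this spectral placement for the closed-loop system rather than for a fixed weight matrix.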
Keywords