CAAI Transactions on Intelligence Technology (Dec 2023)

Numerical‐discrete‐scheme‐incorporated recurrent neural network for tasks in natural language processing

  • Mei Liu,
  • Wendi Luo,
  • Zangtai Cai,
  • Xiujuan Du,
  • Jiliang Zhang,
  • Shuai Li

DOI: https://doi.org/10.1049/cit2.12172
Journal volume & issue: Vol. 8, No. 4, pp. 1415–1424

Abstract

A variety of neural networks have been presented to deal with issues in deep learning in recent decades. Despite the prominent success of neural networks, there is still little theoretical guidance for designing an efficient neural network model, and verifying the performance of a model requires excessive resources. Previous studies have demonstrated that many existing models can be regarded as different numerical discretisations of differential equations. This connection sheds light on designing an effective recurrent neural network (RNN) by resorting to numerical analysis. The simple RNN can be regarded as a discretisation of the forward Euler scheme. Considering the limited solution accuracy of the forward Euler method, a Taylor‐type discrete scheme with lower truncation error is presented, and a Taylor‐type RNN (T‐RNN) is designed under its guidance. Extensive experiments are conducted to evaluate its performance on statistical language modelling and emotion analysis tasks. The noticeable gains obtained by the T‐RNN demonstrate its superiority and the feasibility of designing neural network models with numerical methods.
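To make the Euler/Taylor connection concrete, the sketch below contrasts a forward-Euler-style hidden-state update with a Taylor-type step whose second-order term is estimated from past hidden states. This is a minimal illustration under stated assumptions, not the paper's exact T‐RNN cell: the tanh dynamics, the residual form, the backward-difference estimate of the second derivative, and all parameter names are assumptions made for demonstration.

```python
import numpy as np

# Illustrative sketch of the Euler/Taylor view of an RNN update (assumed
# formulation, not the paper's exact T-RNN cell).

def f(h, x, W, U, b):
    # Assumed continuous-time hidden dynamics dh/dt = f(h, x): a tanh cell
    # written in residual form, so its fixed point matches a simple RNN update.
    return np.tanh(W @ h + U @ x + b) - h

def euler_step(h, x, tau, W, U, b):
    # Forward Euler: h_{k+1} = h_k + tau * f(h_k, x_k).
    # Local truncation error O(tau^2); this recovers a simple (residual) RNN cell.
    return h + tau * f(h, x, W, U, b)

def taylor_step(h, h1, h2, x, tau, W, U, b):
    # Taylor-type step: h_{k+1} = h_k + tau * h' + 0.5 * tau^2 * h''.
    # h'' is estimated by the backward difference (h_k - 2*h_{k-1} + h_{k-2}) / tau^2,
    # so the tau^2 factors cancel; local truncation error drops to O(tau^3).
    return h + tau * f(h, x, W, U, b) + 0.5 * (h - 2.0 * h1 + h2)

# Tiny usage example with random parameters and inputs (hypothetical sizes).
rng = np.random.default_rng(0)
d_h, d_x, tau = 8, 4, 0.1
W = rng.normal(scale=0.1, size=(d_h, d_h))
U = rng.normal(scale=0.1, size=(d_h, d_x))
b = np.zeros(d_h)

h2 = h1 = h = np.zeros(d_h)           # bootstrap the two-step history
for x in rng.normal(size=(20, d_x)):  # roll the Taylor-type recurrence
    h, h1, h2 = taylor_step(h, h1, h2, x, tau, W, U, b), h, h1
print(h)
```

The design point the sketch illustrates is that the higher-order correction costs only two extra stored hidden states per step, since the backward-difference estimate of the second derivative reuses values the recurrence already computes.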

Keywords