IEEE Access (Jan 2022)

An Interpretation of Long Short-Term Memory Recurrent Neural Network for Approximating Roots of Polynomials

  • Madiha Bukhsh,
  • Muhammad Saqib Ali,
  • Muhammad Usman Ashraf,
  • Khalid Alsubhi,
  • Weiqiu Chen

DOI
https://doi.org/10.1109/ACCESS.2022.3157306
Journal volume & issue
Vol. 10
pp. 28194–28205

Abstract

This paper presents a flexible method for interpreting the Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) in terms of the relational structure between the roots and the coefficients of a polynomial. A database is first developed from randomly selected inputs based on the degree of the univariate polynomial, and it is then used to approximate the polynomial roots through the proposed LSTM-RNN model. Furthermore, an adaptive learning optimization algorithm is employed to update the network weights iteratively during training. The method thereby exploits adaptive learning-rate strategies, which assign an individual learning rate to each parameter, to effectively prevent the weights from fluctuating over a wide range. Finally, several experiments are performed, showing that the proposed LSTM-RNN model can serve as an alternative approach for computing an approximation of each root of a given polynomial. The results are also compared with a conventional feedforward artificial neural network model and clearly demonstrate the superiority of the proposed LSTM-RNN model for root approximation in terms of accuracy, mean squared error, and convergence speed.
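
As a concrete illustration of the workflow the abstract describes, the hedged sketch below generates a synthetic dataset of polynomial coefficients and their roots and trains a small LSTM with an adaptive learning-rate optimizer (Adam) to predict the roots from the coefficients. The framework (TensorFlow/Keras), the fixed degree, the restriction to real roots, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' code): train an LSTM to map the
# coefficients of a degree-n univariate polynomial to its (real) roots.
import numpy as np
import tensorflow as tf

degree, n_samples = 4, 20000  # assumed values for illustration

# Build the dataset: sample real roots, expand them to monic coefficients with np.poly.
roots = np.random.uniform(-1.0, 1.0, size=(n_samples, degree))
coeffs = np.array([np.poly(r) for r in roots])   # shape (n_samples, degree + 1)
targets = np.sort(roots, axis=1)                 # fix an ordering of the roots

# Treat the coefficient vector as a length-(degree + 1) sequence of scalars for the LSTM.
x = coeffs[..., np.newaxis]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(degree + 1, 1)),
    tf.keras.layers.Dense(degree)                # one output per root
])

# Adam is one adaptive learning-rate optimizer of the kind the abstract refers to.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(x, targets, epochs=20, batch_size=128, validation_split=0.1)
```

Sorting the roots gives the network a consistent target ordering; other choices (e.g., predicting elementary symmetric functions or handling complex roots as real/imaginary pairs) are possible and not specified by the abstract.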

Keywords