Measurement: Sensors (Jun 2023)

Analysis of optimization algorithms for stability and convergence for natural language processing using deep learning algorithms

  • Ch Gangadhar,
  • Madiajajagn Moutteyan,
  • Rajeev Ratna Vallabhuni,
  • Vinodh P. Vijayan,
  • Neetu Sharma,
  • Robert Theivadas

Journal volume & issue
Vol. 27
p. 100784

Abstract


A boom in applying deep learning (DL) models over the past several years has advanced the discipline of natural language processing (NLP). This survey first briefly introduces the theoretical foundations of artificial intelligence and NLP, then sorts through much recent research and compiles many pertinent contributions, and finally introduces optimization theory and techniques for neural network training. First, we classify and discuss the various facets of NLP applications profiting from deep learning. Second, we review generic language-modelling methods used in pre-training neural networks, such as BERT, RoBERTa, ALBERT and DeBERTa. Third, we compare the different language models on GLUE, MNLI, and SQuAD in terms of accuracy and efficiency to identify the best optimization.
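As a minimal illustration of the kind of optimizer comparison the survey concerns (the objective, hyperparameters, and function names below are illustrative assumptions, not taken from the paper), the following sketch contrasts plain SGD with Adam on a simple quadratic objective, where their convergence and stability behaviour differ visibly:

```python
# Illustrative sketch only: compare SGD and Adam on f(x) = x^2,
# whose gradient is 2x. The paper itself evaluates optimizers on
# NLP models; this toy example just shows the update rules.

def sgd(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def adam(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    """Adam: bias-corrected first/second moment estimates scale the step."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g          # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
        m_hat = m / (1 - beta1 ** t)             # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
    return x

grad = lambda x: 2.0 * x  # gradient of f(x) = x^2

# SGD contracts geometrically toward the optimum at 0; Adam takes
# near-constant-size steps and tends to oscillate around the optimum
# with amplitude on the order of lr, illustrating the stability issue.
print(sgd(grad, 5.0))
print(adam(grad, 5.0))
```

With a fixed learning rate, SGD on this quadratic converges monotonically, while Adam's sign-like normalized steps keep a roughly constant magnitude near the optimum; this is one concrete form of the stability-versus-convergence trade-off the survey analyzes.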
