IEEE Access (Jan 2024)

MPNet-GRUs: Sentiment Analysis With Masked and Permuted Pre-Training for Language Understanding and Gated Recurrent Units

  • Nicole Kai Ning Loh,
  • Chin Poo Lee,
  • Thian Song Ong,
  • Kian Ming Lim

DOI
https://doi.org/10.1109/ACCESS.2024.3394930
Journal volume & issue
Vol. 12
pp. 74069–74080

Abstract

Sentiment analysis, a pivotal task in natural language processing, aims to discern the opinions and emotions expressed in text. However, existing methods for sentiment analysis face challenges such as data scarcity, complex language patterns, and long-range dependencies. In this paper, we propose MPNet-GRUs, a hybrid deep learning model that integrates three key components: MPNet, BiGRU, and GRU. MPNet, a transformer-based pre-trained language model, enhances language understanding through masked and permuted pre-training. BiGRU and GRU, both recurrent neural networks, capture long-range dependencies bidirectionally and unidirectionally, respectively. By combining the strengths of these models, MPNet-GRUs aims to provide a more effective and efficient solution for sentiment analysis. Evaluation on three benchmark datasets reveals the superior performance of MPNet-GRUs: 94.71% for IMDb, 86.27% for Twitter US Airline Sentiment, and 88.17% for Sentiment140, demonstrating its potential to advance sentiment analysis.
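The hybrid design described in the abstract can be sketched in PyTorch. This is an illustrative reconstruction, not the authors' implementation: the pretrained MPNet encoder is replaced by a plain embedding layer so the sketch runs standalone, and all layer sizes and the two-class output are assumptions.

```python
# Sketch of an MPNet -> BiGRU -> GRU -> classifier pipeline for sentiment
# analysis. The real model would use the pretrained MPNet transformer as
# the encoder; an nn.Embedding stands in for it here. Hidden sizes are
# illustrative, not taken from the paper.
import torch
import torch.nn as nn

class MPNetGRUs(nn.Module):
    def __init__(self, vocab_size=30527, hidden=768, gru_hidden=128, classes=2):
        super().__init__()
        # Stand-in for the pretrained MPNet encoder (assumption).
        self.encoder = nn.Embedding(vocab_size, hidden)
        # BiGRU captures long-range context in both directions.
        self.bigru = nn.GRU(hidden, gru_hidden, batch_first=True,
                            bidirectional=True)
        # Unidirectional GRU further summarizes the bidirectional features.
        self.gru = nn.GRU(2 * gru_hidden, gru_hidden, batch_first=True)
        self.classifier = nn.Linear(gru_hidden, classes)

    def forward(self, input_ids):
        x = self.encoder(input_ids)      # (batch, seq, hidden)
        x, _ = self.bigru(x)             # (batch, seq, 2 * gru_hidden)
        _, h = self.gru(x)               # h: (1, batch, gru_hidden)
        return self.classifier(h[-1])    # (batch, classes) sentiment logits

model = MPNetGRUs()
logits = model(torch.randint(0, 30527, (4, 16)))  # 4 sequences of length 16
```

In this sketch the final hidden state of the unidirectional GRU feeds the linear classifier; pooling strategies and dropout, which such models typically use, are omitted for brevity.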

Keywords