Sensors (May 2022)

Transfer Learning for Sentiment Analysis Using BERT Based Supervised Fine-Tuning

  • Nusrat Jahan Prottasha,
  • Abdullah As Sami,
  • Md Kowsher,
  • Saydul Akbar Murad,
  • Anupam Kumar Bairagi,
  • Mehedi Masud,
  • Mohammed Baz

DOI
https://doi.org/10.3390/s22114157
Journal volume & issue
Vol. 22, no. 11
p. 4157

Abstract

The growth of the Internet has expanded the amount of data expressed by users across multiple platforms. The availability of these diverse worldviews and individual emotions enables sentiment analysis. However, sentiment analysis becomes even more challenging due to the scarcity of standardized labeled data in the Bangla NLP domain. Most existing Bangla research has relied on deep learning models built on context-independent word embeddings, such as Word2Vec, GloVe, and fastText, in which each word has a fixed representation irrespective of its context. Meanwhile, context-based pre-trained language models such as BERT have recently revolutionized the state of natural language processing. In this work, we leveraged BERT's transfer-learning ability in a deep integrated CNN-BiLSTM model to enhance decision-making performance in sentiment analysis. In addition, we applied transfer learning to classical machine learning algorithms to compare their performance against CNN-BiLSTM. We also explored various word embedding techniques, such as Word2Vec, GloVe, and fastText, and compared their performance to the BERT transfer-learning strategy. As a result, we demonstrate state-of-the-art binary classification performance for Bangla sentiment analysis that significantly outperforms all other embeddings and algorithms considered.
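The BERT-to-CNN-BiLSTM pipeline the abstract describes can be sketched roughly as follows. This is a minimal PyTorch illustration, not the authors' implementation: the layer sizes, kernel width, and filter counts are illustrative assumptions, and a random tensor stands in for BERT's contextual embeddings so the sketch stays self-contained.

```python
import torch
import torch.nn as nn

class CNNBiLSTMHead(nn.Module):
    """Classification head fed by contextual embeddings (e.g., from BERT).

    Hypothetical sizes: 768-dim inputs (BERT-base), 100 conv filters,
    one BiLSTM layer, and a binary sentiment output.
    """
    def __init__(self, embed_dim=768, conv_channels=100, lstm_hidden=64):
        super().__init__()
        # 1-D convolution over the token axis extracts local n-gram features
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        # BiLSTM captures long-range dependencies over the conv feature maps
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Two logits: positive / negative sentiment
        self.fc = nn.Linear(2 * lstm_hidden, 2)

    def forward(self, embeddings):            # (batch, seq_len, embed_dim)
        x = embeddings.transpose(1, 2)        # Conv1d expects (batch, dim, seq)
        x = torch.relu(self.conv(x)).transpose(1, 2)
        _, (h, _) = self.bilstm(x)            # final hidden states, both directions
        h = torch.cat([h[0], h[1]], dim=-1)   # (batch, 2 * lstm_hidden)
        return self.fc(h)

# A random tensor stands in for frozen BERT output: batch of 4, 32 tokens
bert_out = torch.randn(4, 32, 768)
logits = CNNBiLSTMHead()(bert_out)
print(logits.shape)  # torch.Size([4, 2])
```

In practice the embeddings would come from a pre-trained BERT encoder (frozen or fine-tuned), which is where the transfer learning enters; the head above only illustrates how convolutional and recurrent layers can be stacked on top of those contextual representations.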

Keywords