IEEE Access (Jan 2020)

Sarcasm Detection Using Multi-Head Attention Based Bidirectional LSTM

  • Avinash Kumar,
  • Vishnu Teja Narapareddy,
  • Veerubhotla Aditya Srikanth,
  • Aruna Malapati,
  • Lalita Bhanu Murthy Neti

DOI
https://doi.org/10.1109/ACCESS.2019.2963630
Journal volume & issue
Vol. 8
pp. 6388–6397

Abstract

Sarcasm is often used on social media to express a negative opinion through positive or intensified positive words. This intentional ambiguity makes sarcasm detection an important task in sentiment analysis. Sarcasm detection is treated as a binary classification problem for which both feature-rich traditional models and deep learning models have been successfully built to predict sarcastic comments. In previous research, models have been built using lexical, semantic, and pragmatic features. We extract the most significant of these features and build a feature-rich SVM that outperforms those models. In this paper, we introduce a multi-head attention-based bidirectional long short-term memory (MHA-BiLSTM) network to detect sarcastic comments in a given corpus. The experimental results reveal that the multi-head attention mechanism enhances the performance of the BiLSTM, and that it outperforms feature-rich SVM models.
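The attention step named in the abstract can be sketched as follows: multi-head self-attention is applied over the per-token outputs of a BiLSTM, the attended representations are pooled into a sentence vector, and a binary sarcasm score is produced. This is a minimal NumPy sketch under illustrative assumptions; the shapes, the number of heads, the random stand-in for BiLSTM outputs, and the pooling/classifier choices are not the authors' exact configuration.

```python
# Hedged sketch of multi-head attention over BiLSTM outputs.
# H below is a random stand-in for the BiLSTM's per-token hidden states;
# all dimensions and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

T, d = 10, 8          # sequence length, BiLSTM output size (2 * hidden units)
n_heads, d_k = 2, 4   # assumed number of attention heads and per-head size

H = rng.normal(size=(T, d))  # stand-in for BiLSTM outputs, one row per token

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(H):
    """Scaled dot-product self-attention; concatenates per-head outputs."""
    heads = []
    for _ in range(n_heads):
        Wq, Wk, Wv = (rng.normal(size=(d, d_k)) for _ in range(3))
        Q, K, V = H @ Wq, H @ Wk, H @ Wv
        A = softmax(Q @ K.T / np.sqrt(d_k))   # (T, T) attention weights
        heads.append(A @ V)                   # (T, d_k) attended values
    return np.concatenate(heads, axis=-1)     # (T, n_heads * d_k)

Z = multi_head_attention(H)
sentence = Z.mean(axis=0)                     # pool tokens into one vector
w = rng.normal(size=sentence.shape)           # toy linear classifier weights
p_sarcastic = 1.0 / (1.0 + np.exp(-(sentence @ w)))  # sigmoid score in (0, 1)

print(Z.shape, 0.0 < float(p_sarcastic) < 1.0)
```

Each head attends over the whole comment independently, so different heads can focus on different cue words (e.g. intensified positive words versus negative context) before their summaries are concatenated and classified.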

Keywords