IEEE Access (Jan 2024)

Residual Relation-Aware Attention Deep Graph-Recurrent Model for Emotion Recognition in Conversation

  • Anh-Quang Duong,
  • Ngoc-Huynh Ho,
  • Sudarshan Pant,
  • Seungwon Kim,
  • Soo-Hyung Kim,
  • Hyung-Jeong Yang

DOI: https://doi.org/10.1109/ACCESS.2023.3348518
Journal volume & issue: Vol. 12, pp. 2349–2360

Abstract

This work addresses Emotion Recognition in Conversation (ERC), the task of classifying the underlying emotions expressed in spoken exchanges. We represent each conversation as a fully connected directed acyclic graph whose edges encode inter-locutor and intra-locutor relations, capturing the intricate dependencies between speakers. On this representation, we propose a novel mechanism, Residual Relation-Aware Attention (RRAA) with positional encoding, which enriches the context of speaker relations for improved emotion recognition. The mechanism is designed to yield a thorough understanding of the connections between speakers, making the recognition framework more expressive and context-aware. We further employ gated recurrent units (GRUs) to regulate the transmission of conversational context across all layers of the graph, ensuring a flexible and responsive representation of the evolving emotional dynamics of the discourse. Evaluations on the IEMOCAP, MELD, and EmoryNLP datasets show that our model outperforms state-of-the-art approaches, reaching F1 scores of 69.1%, 63.82%, and 39.85%, respectively. Overall, this work strengthens the modeling of speaker interactions through a fully connected graph while providing a more concise and efficient ERC framework.
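The core idea sketched in the abstract — a directed acyclic graph over utterances in which each node attends to its predecessors through relation-aware projections, with a GRU gating how much aggregated context flows into the node state — can be illustrated roughly as follows. This is a minimal, single-head NumPy sketch under assumed shapes and simplifications (no positional encoding, one relation matrix per intra/inter-speaker edge type, hypothetical function names); the paper's actual layer is considerably more elaborate.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gru_cell(h_prev, x, Wz, Uz, Wr, Ur, Wh, Uh):
    # Standard GRU update: the update gate z controls how much
    # new context x replaces the previous state h_prev.
    z = 1.0 / (1.0 + np.exp(-(Wz @ x + Uz @ h_prev)))
    r = 1.0 / (1.0 + np.exp(-(Wr @ x + Ur @ h_prev)))
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))
    return (1 - z) * h_prev + z * h_tilde

def rraa_layer(H, speakers, W_rel, gru_params):
    """One hypothetical graph layer: utterance i attends over all
    earlier utterances j (DAG edges j -> i); keys are projected by a
    relation matrix chosen by edge type (intra- vs inter-speaker);
    a GRU gates the aggregated context into the node state, and a
    residual connection preserves the layer input."""
    n, d = H.shape
    H_out = H.copy()
    for i in range(1, n):
        keys, scores = [], []
        for j in range(i):
            rel = 0 if speakers[j] == speakers[i] else 1  # intra=0, inter=1
            k = W_rel[rel] @ H[j]                 # relation-aware projection
            keys.append(k)
            scores.append(H[i] @ k / np.sqrt(d))  # scaled dot-product score
        attn = softmax(np.array(scores))
        ctx = sum(a * k for a, k in zip(attn, keys))  # aggregated context
        h_new = gru_cell(H[i], ctx, *gru_params)      # gated context flow
        H_out[i] = H[i] + h_new                       # residual connection
    return H_out

# Toy usage with random parameters: 3 utterances, 4-dim features.
rng = np.random.default_rng(0)
n, d = 3, 4
H = rng.standard_normal((n, d))
W_rel = rng.standard_normal((2, d, d))
gru_params = tuple(rng.standard_normal((d, d)) for _ in range(6))
out = rraa_layer(H, speakers=[0, 1, 0], W_rel=W_rel, gru_params=gru_params)
```

In this sketch, stacking several such layers would let context propagate along progressively longer paths of the DAG, with the GRU deciding at each layer how much of the incoming relational context to absorb.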

Keywords