Cognitive Computation and Systems (Sep 2024)

Emotion classification with multi‐modal physiological signals using multi‐attention‐based neural network

  • Chengsheng Zou,
  • Zhen Deng,
  • Bingwei He,
  • Maosong Yan,
  • Jie Wu,
  • Zhaoju Zhu

DOI
https://doi.org/10.1049/ccs2.12107
Journal volume & issue
Vol. 6, no. 1-3
pp. 1–11

Abstract

The ability to effectively classify human emotion states is critically important for human‐computer and human‐robot interaction. However, emotion classification with physiological signals remains a challenging problem due to the diversity of emotion expression and the characteristic differences among signal modalities. A novel learning‐based network architecture is presented that exploits four modalities of physiological signals, electrocardiogram, electrodermal activity, electromyography, and blood volume pulse, to classify emotion states. It features two kinds of attention modules, feature‐level and semantic‐level, which drive the network to focus on information‐rich features by mimicking the human attention mechanism. The feature‐level attention module encodes the rich information within each physiological signal, while the semantic‐level attention module captures the semantic dependencies among modalities. The performance of the designed network is evaluated on the open‐source Wearable Stress and Affect Detection (WESAD) dataset. The developed emotion classification system achieves an accuracy of 83.88%. The results demonstrate that the proposed network can effectively process four‐modal physiological signals and achieve high emotion classification accuracy.
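The abstract describes the architecture only at a high level. The sketch below illustrates the general idea of combining feature‐level (within‐signal) and semantic‐level (cross‐modality) attention in PyTorch; the module names, the single 1‐D convolutional encoder per modality, the feature dimension, and the three‐class output are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class FeatureLevelAttention(nn.Module):
    """Within-signal attention: weights the time steps of one modality's feature map."""
    def __init__(self, feat_dim):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, x):                          # x: (batch, time, feat_dim)
        w = torch.softmax(self.score(x), dim=1)    # attention weights over time
        return (w * x).sum(dim=1)                  # (batch, feat_dim)

class SemanticLevelAttention(nn.Module):
    """Cross-modality attention: weights the four per-modality embeddings."""
    def __init__(self, feat_dim):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, m):                          # m: (batch, n_modalities, feat_dim)
        w = torch.softmax(self.score(m), dim=1)    # attention weights over modalities
        return (w * m).sum(dim=1)                  # (batch, feat_dim)

class MultiAttentionNet(nn.Module):
    """Sketch of the overall flow: encode each of the four physiological signals
    (ECG, EDA, EMG, BVP), apply feature-level attention per modality, then fuse
    the modality embeddings with semantic-level attention before classification.
    Encoder and dimensions here are placeholder assumptions."""
    def __init__(self, n_modalities=4, feat_dim=64, n_classes=3):
        super().__init__()
        self.encoders = nn.ModuleList(
            nn.Conv1d(1, feat_dim, kernel_size=7, padding=3)
            for _ in range(n_modalities))
        self.feat_attn = nn.ModuleList(
            FeatureLevelAttention(feat_dim) for _ in range(n_modalities))
        self.sem_attn = SemanticLevelAttention(feat_dim)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, signals):                    # signals: list of 4 (batch, 1, time) tensors
        embs = []
        for enc, attn, s in zip(self.encoders, self.feat_attn, signals):
            h = enc(s).transpose(1, 2)             # (batch, time, feat_dim)
            embs.append(attn(h))                   # (batch, feat_dim)
        fused = self.sem_attn(torch.stack(embs, dim=1))
        return self.classifier(fused)              # emotion-class logits
```

The per‐modality branches keep each signal's temporal structure separate until the semantic‐level stage, which is one plausible reading of how the two attention levels divide the work between within‐signal and cross‐modality information.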

Keywords