PeerJ Computer Science (Mar 2024)

Emotion detection from handwriting and drawing samples using an attention-based transformer model

  • Zohaib Ahmad Khan,
  • Yuanqing Xia,
  • Khursheed Aurangzeb,
  • Fiza Khaliq,
  • Mahmood Alam,
  • Javed Ali Khan,
  • Muhammad Shahid Anwar

DOI
https://doi.org/10.7717/peerj-cs.1887
Journal volume & issue
Vol. 10
p. e1887

Abstract

Emotion detection (ED) involves identifying and understanding an individual’s emotional state through cues such as facial expressions, voice tone, physiological changes, and behavioral patterns. In this context, behavioral analysis observes actions and behaviors to interpret emotion. This work specifically uses behavioral measures, namely drawing and handwriting, to determine a person’s emotional state, treating these actions as physical functions that integrate motor and cognitive processes. The study proposes an attention-based transformer model as a novel approach to identifying emotions from handwriting and drawing samples, thereby extending ED into the domains of fine motor skills and artistic expression. The raw data consist of a set of points corresponding to the handwriting or drawing strokes. Each stroke point is fed to the attention-based transformer model, which embeds it into a high-dimensional vector space. Using self-attention, the model integrates the most salient components and patterns of the input sequence to predict the emotional state of the person who produced the sample. A distinct advantage of the proposed approach over conventional recurrent neural networks (RNNs) is its superior ability to capture long-range dependencies, which makes it particularly well suited to accurately identifying emotions from handwriting and drawing samples and marks a notable advance in emotion detection. The proposed method achieved state-of-the-art results of 92.64% on the benchmark EMOTHAW (Emotion Recognition via Handwriting and Drawing) dataset.
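As a rough illustration of the pipeline described in the abstract, the PyTorch sketch below embeds per-point stroke features, applies a self-attention transformer encoder, and classifies the writer's emotional state. The choice of input features (e.g., pen x, y, pressure, azimuth, altitude), the model dimensions, the mean-pooling step, and the number of emotion classes are assumptions made here for illustration; the abstract does not specify these details.

    import torch
    import torch.nn as nn

    class StrokeEmotionTransformer(nn.Module):
        """Minimal sketch: embed stroke points, apply self-attention,
        classify emotion. Hyperparameters are illustrative assumptions."""

        def __init__(self, in_features=5, d_model=128, n_heads=4,
                     n_layers=2, n_classes=3):
            super().__init__()
            # Project raw stroke-point features into the model dimension.
            self.embed = nn.Linear(in_features, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=n_layers)
            self.classifier = nn.Linear(d_model, n_classes)

        def forward(self, points, padding_mask=None):
            # points: (batch, seq_len, in_features)
            h = self.embed(points)
            h = self.encoder(h, src_key_padding_mask=padding_mask)
            # Mean-pool over the stroke sequence before classification.
            h = h.mean(dim=1)
            return self.classifier(h)

    # Usage with a dummy batch: 2 samples of 200 stroke points each.
    model = StrokeEmotionTransformer()
    logits = model(torch.randn(2, 200, 5))
    print(logits.shape)  # torch.Size([2, 3])

In this sketch the transformer encoder's self-attention lets every stroke point attend to every other point, which is how long-range dependencies across the handwriting or drawing sample can be captured without the sequential bottleneck of an RNN.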

Keywords