Journal of Electrical Systems and Information Technology (Apr 2023)

From face detection to emotion recognition on the framework of Raspberry pi and galvanic skin response sensor for visual and physiological biosignals

  • Varsha Kiran Patil,
  • Vijaya R. Pawar,
  • Shreiya Randive,
  • Rutika Rajesh Bankar,
  • Dhanashree Yende,
  • Aditya Kiran Patil

DOI: https://doi.org/10.1186/s43067-023-00085-2
Journal volume & issue: Vol. 10, no. 1, pp. 1–27

Abstract

Facial and physiological sensor-based methods are two popular approaches to emotion recognition. The proposed research is the first of its kind in real-time emotion recognition that combines skin conductance signals with the visual-based facial emotion recognition (FER) method on a Raspberry Pi. This research includes stepwise documentation of the method for automatic real-time face detection and FER on portable hardware. Further, the proposed work comprises experimentation on video induction and habituation methods with FER and the galvanic skin response (GSR) method. The GSR data are recorded as skin conductance and represent the subject's emotional arousal, complementing facial emotion recognition on the portable device. The article provides a stepwise implementation of the following methods: (a) skin conductance representation from the GSR sensor for arousal; (b) gathering visual inputs for identifying the human face; (c) FER from the camera module; and (d) experimentation on the proposed framework. The key feature of this article is the comprehensive documentation of the stepwise implementation and experimentation, including the video induction and habituation experiments. An illuminating aspect of the proposed method is the survey of GSR trademarks and the conduct of psychological experiments. This study is useful for emotional computing systems and potential applications such as lie detectors, human–machine interfaces, devices for gathering user-experience feedback, intruder identification, and portable, scalable devices for experimentation. We termed our approaches "sensovisual" (sensors + visual) and "Emosense" (emotion sensing).
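
The sketch below illustrates, in rough form, the kind of "sensovisual" pipeline the abstract describes: face detection from a camera feed combined with GSR sampling on a Raspberry Pi. It uses OpenCV Haar-cascade face detection and reads the GSR sensor through an MCP3008 ADC via gpiozero; the article does not specify these libraries, the wiring, the ADC channel, or the emotion model, so all of those (including the `classify_emotion` stub) are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of a combined face-detection + GSR loop on a Raspberry Pi.
# Assumptions: GSR sensor wired to MCP3008 channel 0, any camera visible to
# OpenCV, and a trained FER model to replace the classify_emotion() stub.
import time
import cv2
from gpiozero import MCP3008

gsr = MCP3008(channel=0)                      # analog skin-conductance input (assumed wiring)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                     # Pi camera or USB camera

def classify_emotion(face_roi):
    """Placeholder for a trained FER classifier (not specified in the paper)."""
    return "neutral"

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        arousal = gsr.value                   # normalized 0..1 skin-conductance reading
        for (x, y, w, h) in faces:
            emotion = classify_emotion(gray[y:y + h, x:x + w])
            print(f"face at ({x},{y}) emotion={emotion} GSR={arousal:.3f}")
        time.sleep(0.1)                       # crude pacing of the acquisition loop
finally:
    cap.release()
```

In this arrangement the camera stream supplies the visual biosignal (steps b and c of the abstract) while the ADC supplies the physiological one (step a); fusing or logging the two streams for the induction and habituation experiments (step d) would happen where the print statement sits.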

Keywords