IEEE Access (Jan 2021)

A Human-Robot Interaction System Calculating Visual Focus of Human’s Attention Level

  • Partha Chakraborty,
  • Sabbir Ahmed,
  • Mohammad Abu Yousuf,
  • Akm Azad,
  • Salem A. Alyami,
  • Mohammad Ali Moni

DOI
https://doi.org/10.1109/ACCESS.2021.3091642
Journal volume & issue
Vol. 9, pp. 93409–93421

Abstract

Attention is the mental awareness of a human on a particular object or piece of information. The level of attention indicates how intense the focus on an object or an instance is. In this study, several types of human attention levels were observed. After introducing image segmentation and detection techniques for facial features, eyeball movement and gaze estimation were measured. Eye movements were assessed using video data, and a total of 10,197 data instances were manually labelled for attention level. Artificial Neural Network (ANN) and Recurrent Neural Network-Long Short-Term Memory (RNN-LSTM) based deep learning (DL) architectures were then proposed for analysing the data. Next, the trained DL model was embedded into a robotic system capable of detecting various facial features, ultimately leading to the calculation of visual attention during reading, browsing, and writing. The system checks the attention level of participants and can also detect whether a participant is present. Based on a certain level of visual focus of attention (VFOA), the system interacts with the person, raises awareness, and establishes verbal or visual communication. The proposed ML techniques achieved approximately 99.24% validation accuracy and 99.43% test accuracy. A comparative study also shows that, because the dataset volume is limited, the ANN is more suitable for attention-level calculation than the RNN-LSTM. We hope the implemented robotic structure demonstrates the real-world applicability of the proposed method.
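To make the classification step of the pipeline concrete, the sketch below shows how per-frame gaze and eye-movement features could be mapped to discrete attention levels with a small feed-forward ANN. It is a minimal illustration only: the feature layout, layer sizes, class count, and training settings are assumptions for demonstration, not the architecture or hyperparameters reported in the paper.

```python
# Illustrative ANN attention-level classifier (not the authors' published model).
import numpy as np
import tensorflow as tf

NUM_FEATURES = 6   # assumed: e.g. gaze angles, eye-aspect ratios, head-pose cues
NUM_CLASSES = 4    # assumed number of attention levels

def build_ann(num_features: int = NUM_FEATURES,
              num_classes: int = NUM_CLASSES) -> tf.keras.Model:
    """Small fully connected network mapping per-frame features to attention levels."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(num_features,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Synthetic stand-in for the manually labelled feature instances.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, NUM_FEATURES)).astype("float32")
    y = rng.integers(0, NUM_CLASSES, size=1000)

    model = build_ann()
    model.fit(X, y, validation_split=0.2, epochs=5, batch_size=32, verbose=0)
    print("training-set accuracy:", model.evaluate(X, y, verbose=0)[1])
```

An RNN-LSTM variant would instead consume short sequences of these per-frame features; as the abstract notes, with a limited dataset volume the simpler ANN proved more suitable.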

Keywords