Sensors (Jul 2023)

Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality

  • Johannes Schirm,
  • Andrés Roberto Gómez-Vargas,
  • Monica Perusquía-Hernández,
  • Richard T. Skarbez,
  • Naoya Isoyama,
  • Hideaki Uchiyama,
  • Kiyoshi Kiyokawa

DOI: https://doi.org/10.3390/s23156667
Journal volume & issue: Vol. 23, no. 15, p. 6667

Abstract

Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which has recently become more standard in VR and is often used for content-dependent analyses. This research explores content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We generated mental load independently of the visuals, using auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to capture the phenomenon of “staring into the distance” without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native- and two foreign-language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size on average increased by 0.512 SDs (0.118 mm), with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset, as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results are in agreement with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that hardware and algorithms will be developed in the future to further increase tracking stability.
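The abstract does not spell out how focus offset is computed, but a plausible reading is: estimate the vergence point of the two eyes' gaze rays, then compare its depth to the depth of the surface the gaze actually hits; a positive offset means the eyes converge behind the surface, i.e., "looking through" it. The sketch below is a minimal illustration under that assumption, not the paper's actual implementation; all function names and parameters are hypothetical, and real eye-tracking data would need filtering and outlier handling.

```python
import numpy as np

def vergence_point(p_l, d_l, p_r, d_r):
    """Midpoint of closest approach between the two eyes' gaze rays.

    p_l, p_r: eye positions (3-vectors); d_l, d_r: unit gaze directions.
    Returns None when the rays are near-parallel (no reliable vergence).
    """
    w0 = p_l - p_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    s = (b * e - c * d) / denom   # parameter along the left ray
    t = (a * e - b * d) / denom   # parameter along the right ray
    return 0.5 * ((p_l + s * d_l) + (p_r + t * d_r))

def focus_offset(p_l, d_l, p_r, d_r, surface_depth, origin, view_dir):
    """Signed offset (m) of the vergence depth behind the gazed surface.

    surface_depth: distance along view_dir to the surface the gaze ray hits.
    """
    v = vergence_point(p_l, d_l, p_r, d_r)
    if v is None:
        return None
    vergence_depth = (v - origin) @ view_dir
    return vergence_depth - surface_depth
```

For example, with an interpupillary distance of 6.4 cm and both gaze rays converging 2 m away while the gazed surface sits at 1.5 m, the sketch reports a focus offset of about 0.5 m, matching the intuition of focusing "behind" the surface.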

Keywords