IEEE Access (Jan 2023)

Evaluating User Interactions in Wearable Extended Reality: Modeling, Online Remote Survey, and In-Lab Experimental Methods

  • Yalda Ghasemi,
  • Heejin Jeong,
  • Kyeong-Beom Park,
  • Sung Ho Choi,
  • Jae Yeol Lee

DOI: https://doi.org/10.1109/ACCESS.2023.3298598
Journal volume & issue: Vol. 11, pp. 77856–77872

Abstract

The benefits of interaction techniques in extended reality (XR) have been investigated over the years. However, less attention has been given to comparing different methods of evaluating XR interactions. This study aims to (1) implement multimodal gaze-based and gesture-based interactions in a wearable XR environment and (2) evaluate the user interactions comprehensively using three methods: GOMS (Goals, Operators, Methods, and Selection rules)-based task analysis (modeling), a video-based online remote survey, and an in-lab user experiment. We used a pair of smart glasses (Microsoft HoloLens 2) to implement three interaction modalities: two multimodal gaze-based interactions and a unimodal hand-based interaction. The first evaluation analyzed the task for each of the three interaction modes to understand human behavior in XR and predict task completion times using a GOMS model. For the second evaluation, an online survey was administered using a series of first-person point-of-view videos in which a user performs a task with each of the three interactions. A total of 118 adults participated in this remote evaluation and compared the perceived workload of the interactions. The third evaluation was an in-lab experiment with 24 human subjects, aimed at providing in-lab evidence for the first two evaluations and comparing their results. The results of the GOMS analysis showed that the task completion time was shorter in the gaze-based modes than in the baseline. Moreover, the gaze-based interactions outperformed the baseline in terms of physical demand and effort in the remote evaluation, whereas the baseline mode was preferred for mental demand and frustration. In addition, comparing the results of the GOMS and remote survey methods with the in-lab experiment for each interaction mode showed that the values were most consistent for the baseline mode, since most users are generally familiar with hand interaction and this prior knowledge helps them perceive the interaction as closer to reality. This study paves the way for a deeper analysis of different evaluation methods for scrutinizing gaze-based and gesture-based interactions in wearable XR environments.
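
As a rough illustration of how a GOMS-style model predicts task completion time, the prediction is essentially a sum of the durations of the elementary operators a user executes. The sketch below is a minimal, assumed example: the operator set, durations, and operator sequences are placeholders in the spirit of keystroke-level modeling, not the operators or values used in the paper.

    # Illustrative GOMS/KLM-style prediction: task time as a sum of operator times.
    # Operator labels and durations below are assumed placeholders, not values from the paper.
    OPERATOR_TIMES = {
        "M": 1.35,  # mental preparation (s), a common keystroke-level estimate
        "P": 1.10,  # pointing at / aiming toward a target (s)
        "G": 0.30,  # assumed gaze-dwell confirmation (s)
        "H": 0.40,  # homing the hand before a gesture (s)
    }

    def predict_task_time(operators):
        """Sum the durations of the operators executed for a task."""
        return sum(OPERATOR_TIMES[op] for op in operators)

    # Example: compare a hypothetical gaze-based selection with a hand-gesture selection.
    print(predict_task_time(["M", "P", "G"]))  # gaze-based selection: 2.75 s
    print(predict_task_time(["M", "H", "P"]))  # hand-based selection: 2.85 s

Under such a model, differences between interaction modes come entirely from which operators each mode requires and how long each operator is assumed to take.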

Keywords