IEEE Access (Jan 2024)

Continuous Prediction of Pointing Targets With Motion and Eye-Tracking in Virtual Reality

  • Choongho Chung
  • Sung-Hee Lee

DOI
https://doi.org/10.1109/ACCESS.2024.3350788
Journal volume & issue
Vol. 12
pp. 5933–5946

Abstract


We present a study on continuously predicting the direction to a pointing target in virtual environments using motion and eye-tracking data throughout the pointing process. We first collect time-series motion and eye-tracking data in a cursorless, single-target pointing task. Analyzing fixation points from the different sensors and observing velocity profiles over the course of pointing yields insights into how to best configure features for predicting the target angles. Following this analysis, we train a recurrent neural network that operates on sliding-window inputs to predict the target direction continuously from start to finish. Each input window contains historical data from past frames up to the current frame, capturing temporal changes in the features. From this input, our model can predict the direction of the target at any given time during pointing. Our findings demonstrate that incorporating eye-tracking data into the prediction model improves the maximum achievable accuracy by a factor of 2.5 compared to baselines without eye-tracking inputs. The results suggest that combining features from the eye tracker and joint motion contributes to higher prediction performance, as well as faster stabilization of the output values in the starting phase of pointing.

Keywords