IEEE Access (Jan 2020)

ISeeColor: Method for Advanced Visual Analytics of Eye Tracking Data

  • Karen Panetta,
  • Qianwen Wan,
  • Srijith Rajeev,
  • Aleksandra Kaszowska,
  • Aaron L. Gardony,
  • Kevin Naranjo,
  • Holly A. Taylor,
  • Sos Agaian

DOI
https://doi.org/10.1109/ACCESS.2020.2980901
Journal volume & issue
Vol. 8
pp. 52278 – 52287

Abstract

Recent advances in head-mounted eye-tracking technology have allowed researchers to monitor eye movements during locomotion in real-world environments, increasing the ecological validity of research on human gaze behavior. While collecting eye-tracking data is becoming more accessible, visual analytics of eye-tracking data remains difficult and time-consuming, so there is a significant need for efficient visualization and analysis tools for large-scale eye-tracking data. This work develops a first-of-its-kind eye-tracking data visualization and analysis system that automatically recognizes independent objects within the field of vision using deep-learning-based semantic segmentation. The system recolors fixated objects-of-interest by integrating gaze fixation information with semantic maps, allowing researchers to automatically infer which objects users view, and for how long, in dynamic contexts. The contributions are 1) a data visualization and analysis system that combines deep-learning technology with eye-tracking data to automatically recognize objects-of-interest in head-mounted eye-tracking video recordings, and 2) a graphical user interface that presents object-of-interest annotations alongside eye-tracking data. The architecture is tested with an outdoor case study, administered by a team of research psychologists, in which users walked around the Tufts University campus as part of a navigation study.
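The core idea the abstract describes, looking up each gaze fixation in a per-frame semantic segmentation map to identify the fixated object, accumulate dwell time, and recolor that object, can be sketched roughly as follows. The class labels, array shapes, and function names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical class labels for a semantic segmentation map
# (assumed for illustration; the paper's label set may differ).
LABELS = {0: "background", 1: "building", 2: "path", 3: "tree"}

def fixated_object(seg_map, gaze_xy):
    """Return the class label of the pixel under a gaze fixation point."""
    x, y = gaze_xy
    return LABELS[int(seg_map[y, x])]

def accumulate_dwell(seg_map, fixations):
    """Sum fixation durations (seconds) per fixated object class.

    `fixations` is a list of ((x, y), duration) pairs.
    """
    dwell = {}
    for (x, y), duration in fixations:
        label = fixated_object(seg_map, (x, y))
        dwell[label] = dwell.get(label, 0.0) + duration
    return dwell

def recolor(frame, seg_map, target_class, color=(255, 0, 0)):
    """Overlay a highlight color on all pixels of the fixated class."""
    out = frame.copy()
    out[seg_map == target_class] = color
    return out

# Toy 4x4 segmentation map: left half "building" (1), right half "tree" (3).
seg = np.array([[1, 1, 3, 3]] * 4)
fixations = [((0, 0), 0.2), ((3, 2), 0.5), ((1, 1), 0.3)]
print(accumulate_dwell(seg, fixations))
```

In the real system the segmentation map would come from a deep segmentation network run on each scene-camera frame, and the gaze coordinates from the head-mounted tracker after calibration; here both are stand-in arrays.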
