IEEE Access (Jan 2021)

Real-Time Eye Tracking for Bare and Sunglasses-Wearing Faces for Augmented Reality 3D Head-Up Displays

  • Dongwoo Kang,
  • Lin Ma

DOI
https://doi.org/10.1109/ACCESS.2021.3110644
Journal volume & issue
Vol. 9
pp. 125508–125522

Abstract

Eye pupil tracking is important for augmented reality (AR) three-dimensional (3D) head-up displays (HUDs). Accurate, fast eye tracking remains challenging under real driving conditions that occlude the eyes, such as wearing sunglasses. In this paper, we propose a system for commercial use that handles such practical driving conditions. Our system classifies human faces into bare faces and sunglasses-wearing faces, which are treated differently. For bare faces, our eye tracker regresses the pupil area in a coarse-to-fine manner based on a revised Supervised Descent Method (SDM) eye-nose alignment. For sunglasses-wearing faces, because the eyes are occluded, our eye tracker uses whole-face alignment with a revised Practical Facial Landmark Detector (PFLD) for pupil-center tracking. Furthermore, we propose a structural-inference-based re-weight network that predicts eye positions from non-occluded regions, such as the nose and mouth. The proposed re-weight sub-network revises the importance of different feature-map positions and predicts the occluded eye positions from the non-occluded parts. The tracker is made robust by a tracker-checker and kept to a small model size. Experiments show that our method achieves high accuracy and speed: approximately 1.5 mm and 6.5 mm error for bare and sunglasses-wearing faces, respectively, in under 10 ms on a 2.0 GHz CPU. The evaluation dataset was captured both indoors and outdoors to cover multiple sunlight conditions. Combined with an AR 3D HUD, the proposed method shows promising results for commercialization with low-crosstalk 3D images.
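The two-branch pipeline described in the abstract can be sketched as a simple dispatch: classify the face, then route bare faces to the coarse-to-fine SDM-based tracker and sunglasses faces to the PFLD-plus-re-weight tracker. The sketch below is a minimal illustration; all names (`track_eyes`, `EyeResult`, the classifier and tracker callables) are hypothetical and not the authors' actual API.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Illustrative sketch of the paper's two-branch eye-tracking dispatch.
# All names here are assumptions for illustration, not the authors' code.

@dataclass
class EyeResult:
    left: Tuple[float, float]   # (x, y) pupil center in image coordinates
    right: Tuple[float, float]

def track_eyes(frame,
               face_classifier: Callable,     # -> "bare" or "sunglasses"
               bare_tracker: Callable,        # coarse-to-fine SDM eye-nose branch
               sunglasses_tracker: Callable   # PFLD + re-weight network branch
               ) -> EyeResult:
    """Route a frame to the branch matching the detected face type."""
    if face_classifier(frame) == "bare":
        # Bare face: regress the pupil area coarse-to-fine from
        # a revised SDM-based eye-nose alignment.
        return bare_tracker(frame)
    # Sunglasses face: eyes are occluded, so infer pupil centers from
    # non-occluded regions (nose, mouth) via whole-face alignment
    # and the structural re-weight sub-network.
    return sunglasses_tracker(frame)
```

A per-frame tracker-checker (as mentioned in the abstract) would wrap this call to validate the output before it is passed on to the HUD renderer.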

Keywords